org.scalatest.exceptions.TestFailedException: spark-submit returned with exit code 1. Command line: './bin/spark-submit' '--name' 'prepare testing tables' '--master' 'local[2]' '--conf' 'spark.ui.enabled=false' '--conf' 'spark.master.rest.enabled=false' '--conf' 'spark.sql.warehouse.dir=/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/target/tmp/warehouse-f58ea99f-b5f3-40ee-942d-afe643b3a43c' '--conf' 'spark.sql.test.version.index=1' '--driver-java-options' '-Dderby.system.home=/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/target/tmp/warehouse-f58ea99f-b5f3-40ee-942d-afe643b3a43c' '/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/target/tmp/test4459602032925096236.py'
2019-02-25 22:10:55.501 - stdout> 2019-02-25 22:10:55 WARN NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2019-02-25 22:10:56.251 - stdout> 2019-02-25 22:10:56 INFO SparkContext:54 - Running Spark version 2.4.0
2019-02-25 22:10:56.275 - stdout> 2019-02-25 22:10:56 INFO SparkContext:54 - Submitted application: prepare testing tables
2019-02-25 22:10:56.332 - stdout> 2019-02-25 22:10:56 INFO SecurityManager:54 - Changing view acls to: jenkins
2019-02-25 22:10:56.332 - stdout> 2019-02-25 22:10:56 INFO SecurityManager:54 - Changing modify acls to: jenkins
2019-02-25 22:10:56.332 - stdout> 2019-02-25 22:10:56 INFO SecurityManager:54 - Changing view acls groups to:
2019-02-25 22:10:56.333 - stdout> 2019-02-25 22:10:56 INFO SecurityManager:54 - Changing modify acls groups to:
2019-02-25 22:10:56.333 - stdout> 2019-02-25 22:10:56 INFO SecurityManager:54 - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(jenkins); groups with view permissions: Set(); users with modify permissions: Set(jenkins); groups with modify permissions: Set()
2019-02-25 22:10:56.627 - stdout> 2019-02-25 22:10:56 INFO Utils:54 - Successfully started service 'sparkDriver' on port 37777.
2019-02-25 22:10:56.652 - stdout> 2019-02-25 22:10:56 INFO SparkEnv:54 - Registering MapOutputTracker
2019-02-25 22:10:56.67 - stdout> 2019-02-25 22:10:56 INFO SparkEnv:54 - Registering BlockManagerMaster
2019-02-25 22:10:56.673 - stdout> 2019-02-25 22:10:56 INFO BlockManagerMasterEndpoint:54 - Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
2019-02-25 22:10:56.674 - stdout> 2019-02-25 22:10:56 INFO BlockManagerMasterEndpoint:54 - BlockManagerMasterEndpoint up
2019-02-25 22:10:56.682 - stdout> 2019-02-25 22:10:56 INFO DiskBlockManager:54 - Created local directory at /tmp/blockmgr-e8441f5f-4c53-40a9-abac-d85bb66abef5
2019-02-25 22:10:56.699 - stdout> 2019-02-25 22:10:56 INFO MemoryStore:54 - MemoryStore started with capacity 366.3 MB
2019-02-25 22:10:56.714 - stdout> 2019-02-25 22:10:56 INFO SparkEnv:54 - Registering OutputCommitCoordinator
2019-02-25 22:10:56.791 - stdout> 2019-02-25 22:10:56 INFO Executor:54 - Starting executor ID driver on host localhost
2019-02-25 22:10:56.858 - stdout> 2019-02-25 22:10:56 INFO Utils:54 - Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 44398.
2019-02-25 22:10:56.858 - stdout> 2019-02-25 22:10:56 INFO NettyBlockTransferService:54 - Server created on amp-jenkins-worker-05.amp:44398
2019-02-25 22:10:56.86 - stdout> 2019-02-25 22:10:56 INFO BlockManager:54 - Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
2019-02-25 22:10:56.884 - stdout> 2019-02-25 22:10:56 INFO BlockManagerMaster:54 - Registering BlockManager BlockManagerId(driver, amp-jenkins-worker-05.amp, 44398, None)
2019-02-25 22:10:56.887 - stdout> 2019-02-25 22:10:56 INFO BlockManagerMasterEndpoint:54 - Registering block manager amp-jenkins-worker-05.amp:44398 with 366.3 MB RAM, BlockManagerId(driver, amp-jenkins-worker-05.amp, 44398, None)
2019-02-25 22:10:56.891 - stdout> 2019-02-25 22:10:56 INFO BlockManagerMaster:54 - Registered BlockManager BlockManagerId(driver, amp-jenkins-worker-05.amp, 44398, None)
2019-02-25 22:10:56.892 - stdout> 2019-02-25 22:10:56 INFO BlockManager:54 - Initialized BlockManager: BlockManagerId(driver, amp-jenkins-worker-05.amp, 44398, None)
2019-02-25 22:10:57.051 - stdout> 2019-02-25 22:10:57 INFO log:192 - Logging initialized @2580ms
2019-02-25 22:10:57.203 - stdout> 2019-02-25 22:10:57 INFO SharedState:54 - Setting hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir ('/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/target/tmp/warehouse-f58ea99f-b5f3-40ee-942d-afe643b3a43c').
2019-02-25 22:10:57.204 - stdout> 2019-02-25 22:10:57 INFO SharedState:54 - Warehouse path is '/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/target/tmp/warehouse-f58ea99f-b5f3-40ee-942d-afe643b3a43c'.
2019-02-25 22:10:57.716 - stdout> 2019-02-25 22:10:57 INFO StateStoreCoordinatorRef:54 - Registered StateStoreCoordinator endpoint
2019-02-25 22:10:59.768 - stdout> 2019-02-25 22:10:59 INFO HiveUtils:54 - Initializing HiveMetastoreConnection version 1.2.1 using Spark classes.
2019-02-25 22:11:00.389 - stdout> 2019-02-25 22:11:00 INFO HiveMetaStore:589 - 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
2019-02-25 22:11:00.412 - stdout> 2019-02-25 22:11:00 INFO ObjectStore:289 - ObjectStore, initialize called
2019-02-25 22:11:00.524 - stdout> 2019-02-25 22:11:00 INFO Persistence:77 - Property hive.metastore.integral.jdo.pushdown unknown - will be ignored
2019-02-25 22:11:00.525 - stdout> 2019-02-25 22:11:00 INFO Persistence:77 - Property datanucleus.cache.level2 unknown - will be ignored
2019-02-25 22:11:06.342 - stdout> 2019-02-25 22:11:06 INFO ObjectStore:370 - Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
2019-02-25 22:11:06.437 - stdout> 2019-02-25 22:11:06 ERROR MetaData:115 - Error opening the Meta-Data file "jar:file:/tmp/test-spark/spark-2.4.0/jars/hive-metastore-1.2.1.spark2.jar!/package.jdo"
2019-02-25 22:11:06.443 - stdout> 2019-02-25 22:11:06 WARN MetaStoreDirectSql:188 - Database initialization failed; direct SQL is disabled
2019-02-25 22:11:06.443 - stdout> javax.jdo.JDOException: Error opening the Meta-Data file "jar:file:/tmp/test-spark/spark-2.4.0/jars/hive-metastore-1.2.1.spark2.jar!/package.jdo"
2019-02-25 22:11:06.443 - stdout> at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:600)
2019-02-25 22:11:06.443 - stdout> at org.datanucleus.api.jdo.JDOQuery.execute(JDOQuery.java:230)
2019-02-25 22:11:06.443 - stdout> at org.apache.hadoop.hive.metastore.MetaStoreDirectSql.ensureDbInit(MetaStoreDirectSql.java:183)
2019-02-25 22:11:06.443 - stdout> at org.apache.hadoop.hive.metastore.MetaStoreDirectSql.<init>(MetaStoreDirectSql.java:137)
2019-02-25 22:11:06.443 - stdout> at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:295)
2019-02-25 22:11:06.443 - stdout> at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)
2019-02-25 22:11:06.443 - stdout> at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
2019-02-25 22:11:06.443 - stdout> at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
2019-02-25 22:11:06.443 - stdout> at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)
2019-02-25 22:11:06.443 - stdout> at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)
2019-02-25 22:11:06.443 - stdout> at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593)
2019-02-25 22:11:06.443 - stdout> at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571)
2019-02-25 22:11:06.443 - stdout> at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:620)
2019-02-25 22:11:06.443 - stdout> at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
2019-02-25 22:11:06.443 - stdout> at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
2019-02-25 22:11:06.443 - stdout> at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
2019-02-25 22:11:06.443 - stdout> at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
2019-02-25 22:11:06.443 - stdout> at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
2019-02-25 22:11:06.443 - stdout> at
org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74) 2019-02-25 22:11:06.443 - stdout> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) 2019-02-25 22:11:06.443 - stdout> at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) 2019-02-25 22:11:06.443 - stdout> at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) 2019-02-25 22:11:06.443 - stdout> at java.lang.reflect.Constructor.newInstance(Constructor.java:423) 2019-02-25 22:11:06.443 - stdout> at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521) 2019-02-25 22:11:06.443 - stdout> at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86) 2019-02-25 22:11:06.443 - stdout> at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132) 2019-02-25 22:11:06.443 - stdout> at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104) 2019-02-25 22:11:06.443 - stdout> at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005) 2019-02-25 22:11:06.443 - stdout> at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024) 2019-02-25 22:11:06.444 - stdout> at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1234) 2019-02-25 22:11:06.444 - stdout> at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:174) 2019-02-25 22:11:06.444 - stdout> at org.apache.hadoop.hive.ql.metadata.Hive.<clinit>(Hive.java:166) 2019-02-25 22:11:06.444 - stdout> at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503) 2019-02-25 22:11:06.444 - stdout> at org.apache.spark.sql.hive.client.HiveClientImpl.newState(HiveClientImpl.scala:183) 2019-02-25 22:11:06.444 - stdout> at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:117) 2019-02-25 22:11:06.444 - stdout> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) 2019-02-25 22:11:06.444 - stdout> at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) 2019-02-25 22:11:06.444 - stdout> at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) 2019-02-25 22:11:06.444 - stdout> at java.lang.reflect.Constructor.newInstance(Constructor.java:423) 2019-02-25 22:11:06.444 - stdout> at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:272) 2019-02-25 22:11:06.444 - stdout> at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:384) 2019-02-25 22:11:06.444 - stdout> at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:286) 2019-02-25 22:11:06.444 - stdout> at org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:66) 2019-02-25 22:11:06.444 - stdout> at org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:65) 2019-02-25 22:11:06.444 - stdout> at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply$mcZ$sp(HiveExternalCatalog.scala:215) 2019-02-25 22:11:06.444 - stdout> at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:215) 2019-02-25 22:11:06.444 - stdout> at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:215) 2019-02-25 
22:11:06.444 - stdout> at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:97) 2019-02-25 22:11:06.444 - stdout> at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:214) 2019-02-25 22:11:06.444 - stdout> at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:114) 2019-02-25 22:11:06.444 - stdout> at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:102) 2019-02-25 22:11:06.444 - stdout> at org.apache.spark.sql.hive.HiveSessionStateBuilder.org$apache$spark$sql$hive$HiveSessionStateBuilder$$externalCatalog(HiveSessionStateBuilder.scala:39) 2019-02-25 22:11:06.444 - stdout> at org.apache.spark.sql.hive.HiveSessionStateBuilder$$anonfun$1.apply(HiveSessionStateBuilder.scala:54) 2019-02-25 22:11:06.444 - stdout> at org.apache.spark.sql.hive.HiveSessionStateBuilder$$anonfun$1.apply(HiveSessionStateBuilder.scala:54) 2019-02-25 22:11:06.444 - stdout> at org.apache.spark.sql.catalyst.catalog.SessionCatalog.externalCatalog$lzycompute(SessionCatalog.scala:90) 2019-02-25 22:11:06.444 - stdout> at org.apache.spark.sql.catalyst.catalog.SessionCatalog.externalCatalog(SessionCatalog.scala:90) 2019-02-25 22:11:06.444 - stdout> at org.apache.spark.sql.catalyst.catalog.SessionCatalog.tableExists(SessionCatalog.scala:415) 2019-02-25 22:11:06.444 - stdout> at org.apache.spark.sql.execution.command.CreateDataSourceTableAsSelectCommand.run(createDataSourceTables.scala:154) 2019-02-25 22:11:06.444 - stdout> at org.apache.spark.sql.execution.command.DataWritingCommandExec.sideEffectResult$lzycompute(commands.scala:104) 2019-02-25 22:11:06.444 - stdout> at org.apache.spark.sql.execution.command.DataWritingCommandExec.sideEffectResult(commands.scala:102) 2019-02-25 22:11:06.444 - stdout> at org.apache.spark.sql.execution.command.DataWritingCommandExec.executeCollect(commands.scala:115) 2019-02-25 22:11:06.444 - stdout> at org.apache.spark.sql.Dataset$$anonfun$6.apply(Dataset.scala:195) 2019-02-25 22:11:06.444 - stdout> at org.apache.spark.sql.Dataset$$anonfun$6.apply(Dataset.scala:195) 2019-02-25 22:11:06.444 - stdout> at org.apache.spark.sql.Dataset$$anonfun$53.apply(Dataset.scala:3365) 2019-02-25 22:11:06.444 - stdout> at org.apache.spark.sql.execution.SQLExecution$$anonfun$withNewExecutionId$1.apply(SQLExecution.scala:78) 2019-02-25 22:11:06.444 - stdout> at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:125) 2019-02-25 22:11:06.444 - stdout> at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:73) 2019-02-25 22:11:06.444 - stdout> at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3364) 2019-02-25 22:11:06.444 - stdout> at org.apache.spark.sql.Dataset.<init>(Dataset.scala:195) 2019-02-25 22:11:06.444 - stdout> at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:80) 2019-02-25 22:11:06.444 - stdout> at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:642) 2019-02-25 22:11:06.444 - stdout> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 2019-02-25 22:11:06.444 - stdout> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 2019-02-25 22:11:06.444 - stdout> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 2019-02-25 22:11:06.444 - stdout> at java.lang.reflect.Method.invoke(Method.java:498) 2019-02-25 22:11:06.444 - stdout> at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244) 2019-02-25 22:11:06.444 - stdout> 
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357) 2019-02-25 22:11:06.444 - stdout> at py4j.Gateway.invoke(Gateway.java:282) 2019-02-25 22:11:06.444 - stdout> at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132) 2019-02-25 22:11:06.444 - stdout> at py4j.commands.CallCommand.execute(CallCommand.java:79) 2019-02-25 22:11:06.444 - stdout> at py4j.GatewayConnection.run(GatewayConnection.java:238) 2019-02-25 22:11:06.444 - stdout> at java.lang.Thread.run(Thread.java:748) 2019-02-25 22:11:06.444 - stdout> NestedThrowablesStackTrace: 2019-02-25 22:11:06.444 - stdout> Error opening the Meta-Data file "jar:file:/tmp/test-spark/spark-2.4.0/jars/hive-metastore-1.2.1.spark2.jar!/package.jdo" 2019-02-25 22:11:06.444 - stdout> org.datanucleus.exceptions.NucleusException: Error opening the Meta-Data file "jar:file:/tmp/test-spark/spark-2.4.0/jars/hive-metastore-1.2.1.spark2.jar!/package.jdo" 2019-02-25 22:11:06.444 - stdout> at org.datanucleus.metadata.xml.MetaDataParser.parseMetaDataURL(MetaDataParser.java:145) 2019-02-25 22:11:06.444 - stdout> at org.datanucleus.api.jdo.metadata.JDOMetaDataManager.parseFile(JDOMetaDataManager.java:240) 2019-02-25 22:11:06.445 - stdout> at org.datanucleus.api.jdo.metadata.JDOMetaDataManager.loadXMLMetaDataForClass(JDOMetaDataManager.java:773) 2019-02-25 22:11:06.445 - stdout> at org.datanucleus.api.jdo.metadata.JDOMetaDataManager.getMetaDataForClassInternal(JDOMetaDataManager.java:383) 2019-02-25 22:11:06.445 - stdout> at org.datanucleus.metadata.MetaDataManager.getMetaDataForClass(MetaDataManager.java:1570) 2019-02-25 22:11:06.445 - stdout> at org.datanucleus.query.compiler.JavaQueryCompiler.getType(JavaQueryCompiler.java:960) 2019-02-25 22:11:06.445 - stdout> at org.datanucleus.query.compiler.JavaQueryCompiler.getType(JavaQueryCompiler.java:952) 2019-02-25 22:11:06.445 - stdout> at org.datanucleus.query.expression.PrimaryExpression.bind(PrimaryExpression.java:129) 2019-02-25 22:11:06.445 - stdout> at org.datanucleus.query.expression.DyadicExpression.bind(DyadicExpression.java:87) 2019-02-25 22:11:06.445 - stdout> at org.datanucleus.query.compiler.JavaQueryCompiler.compileFilter(JavaQueryCompiler.java:481) 2019-02-25 22:11:06.445 - stdout> at org.datanucleus.query.compiler.JDOQLCompiler.compile(JDOQLCompiler.java:113) 2019-02-25 22:11:06.445 - stdout> at org.datanucleus.store.query.AbstractJDOQLQuery.compileInternal(AbstractJDOQLQuery.java:367) 2019-02-25 22:11:06.445 - stdout> at org.datanucleus.store.rdbms.query.JDOQLQuery.compileInternal(JDOQLQuery.java:240) 2019-02-25 22:11:06.445 - stdout> at org.datanucleus.store.query.Query.executeQuery(Query.java:1744) 2019-02-25 22:11:06.445 - stdout> at org.datanucleus.store.query.Query.executeWithArray(Query.java:1672) 2019-02-25 22:11:06.445 - stdout> at org.datanucleus.store.query.Query.execute(Query.java:1654) 2019-02-25 22:11:06.445 - stdout> at org.datanucleus.api.jdo.JDOQuery.execute(JDOQuery.java:221) 2019-02-25 22:11:06.445 - stdout> at org.apache.hadoop.hive.metastore.MetaStoreDirectSql.ensureDbInit(MetaStoreDirectSql.java:183) 2019-02-25 22:11:06.445 - stdout> at org.apache.hadoop.hive.metastore.MetaStoreDirectSql.<init>(MetaStoreDirectSql.java:137) 2019-02-25 22:11:06.445 - stdout> at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:295) 2019-02-25 22:11:06.445 - stdout> at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258) 2019-02-25 22:11:06.445 - stdout> at 
org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76) 2019-02-25 22:11:06.445 - stdout> at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136) 2019-02-25 22:11:06.445 - stdout> at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57) 2019-02-25 22:11:06.445 - stdout> at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66) 2019-02-25 22:11:06.445 - stdout> at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593) 2019-02-25 22:11:06.445 - stdout> at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571) 2019-02-25 22:11:06.445 - stdout> at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:620) 2019-02-25 22:11:06.445 - stdout> at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461) 2019-02-25 22:11:06.445 - stdout> at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66) 2019-02-25 22:11:06.445 - stdout> at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72) 2019-02-25 22:11:06.445 - stdout> at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762) 2019-02-25 22:11:06.445 - stdout> at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199) 2019-02-25 22:11:06.445 - stdout> at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74) 2019-02-25 22:11:06.445 - stdout> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) 2019-02-25 22:11:06.445 - stdout> at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) 2019-02-25 22:11:06.445 - stdout> at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) 2019-02-25 22:11:06.445 - stdout> at java.lang.reflect.Constructor.newInstance(Constructor.java:423) 2019-02-25 22:11:06.445 - stdout> at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521) 2019-02-25 22:11:06.445 - stdout> at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86) 2019-02-25 22:11:06.445 - stdout> at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132) 2019-02-25 22:11:06.445 - stdout> at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104) 2019-02-25 22:11:06.445 - stdout> at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005) 2019-02-25 22:11:06.445 - stdout> at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024) 2019-02-25 22:11:06.445 - stdout> at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1234) 2019-02-25 22:11:06.445 - stdout> at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:174) 2019-02-25 22:11:06.445 - stdout> at org.apache.hadoop.hive.ql.metadata.Hive.<clinit>(Hive.java:166) 2019-02-25 22:11:06.445 - stdout> at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503) 2019-02-25 22:11:06.445 - stdout> at org.apache.spark.sql.hive.client.HiveClientImpl.newState(HiveClientImpl.scala:183) 2019-02-25 22:11:06.445 - stdout> at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:117) 2019-02-25 22:11:06.445 - stdout> at 
sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) 2019-02-25 22:11:06.445 - stdout> at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) 2019-02-25 22:11:06.445 - stdout> at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) 2019-02-25 22:11:06.445 - stdout> at java.lang.reflect.Constructor.newInstance(Constructor.java:423) 2019-02-25 22:11:06.445 - stdout> at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:272) 2019-02-25 22:11:06.445 - stdout> at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:384) 2019-02-25 22:11:06.446 - stdout> at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:286) 2019-02-25 22:11:06.446 - stdout> at org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:66) 2019-02-25 22:11:06.446 - stdout> at org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:65) 2019-02-25 22:11:06.446 - stdout> at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply$mcZ$sp(HiveExternalCatalog.scala:215) 2019-02-25 22:11:06.446 - stdout> at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:215) 2019-02-25 22:11:06.446 - stdout> at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:215) 2019-02-25 22:11:06.446 - stdout> at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:97) 2019-02-25 22:11:06.446 - stdout> at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:214) 2019-02-25 22:11:06.446 - stdout> at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:114) 2019-02-25 22:11:06.446 - stdout> at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:102) 2019-02-25 22:11:06.446 - stdout> at org.apache.spark.sql.hive.HiveSessionStateBuilder.org$apache$spark$sql$hive$HiveSessionStateBuilder$$externalCatalog(HiveSessionStateBuilder.scala:39) 2019-02-25 22:11:06.446 - stdout> at org.apache.spark.sql.hive.HiveSessionStateBuilder$$anonfun$1.apply(HiveSessionStateBuilder.scala:54) 2019-02-25 22:11:06.446 - stdout> at org.apache.spark.sql.hive.HiveSessionStateBuilder$$anonfun$1.apply(HiveSessionStateBuilder.scala:54) 2019-02-25 22:11:06.446 - stdout> at org.apache.spark.sql.catalyst.catalog.SessionCatalog.externalCatalog$lzycompute(SessionCatalog.scala:90) 2019-02-25 22:11:06.446 - stdout> at org.apache.spark.sql.catalyst.catalog.SessionCatalog.externalCatalog(SessionCatalog.scala:90) 2019-02-25 22:11:06.446 - stdout> at org.apache.spark.sql.catalyst.catalog.SessionCatalog.tableExists(SessionCatalog.scala:415) 2019-02-25 22:11:06.446 - stdout> at org.apache.spark.sql.execution.command.CreateDataSourceTableAsSelectCommand.run(createDataSourceTables.scala:154) 2019-02-25 22:11:06.446 - stdout> at org.apache.spark.sql.execution.command.DataWritingCommandExec.sideEffectResult$lzycompute(commands.scala:104) 2019-02-25 22:11:06.446 - stdout> at org.apache.spark.sql.execution.command.DataWritingCommandExec.sideEffectResult(commands.scala:102) 2019-02-25 22:11:06.446 - stdout> at org.apache.spark.sql.execution.command.DataWritingCommandExec.executeCollect(commands.scala:115) 2019-02-25 22:11:06.446 - stdout> at org.apache.spark.sql.Dataset$$anonfun$6.apply(Dataset.scala:195) 2019-02-25 
22:11:06.446 - stdout> at org.apache.spark.sql.Dataset$$anonfun$6.apply(Dataset.scala:195) 2019-02-25 22:11:06.446 - stdout> at org.apache.spark.sql.Dataset$$anonfun$53.apply(Dataset.scala:3365) 2019-02-25 22:11:06.446 - stdout> at org.apache.spark.sql.execution.SQLExecution$$anonfun$withNewExecutionId$1.apply(SQLExecution.scala:78) 2019-02-25 22:11:06.446 - stdout> at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:125) 2019-02-25 22:11:06.446 - stdout> at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:73) 2019-02-25 22:11:06.446 - stdout> at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3364) 2019-02-25 22:11:06.446 - stdout> at org.apache.spark.sql.Dataset.<init>(Dataset.scala:195) 2019-02-25 22:11:06.446 - stdout> at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:80) 2019-02-25 22:11:06.446 - stdout> at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:642) 2019-02-25 22:11:06.446 - stdout> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 2019-02-25 22:11:06.446 - stdout> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 2019-02-25 22:11:06.446 - stdout> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 2019-02-25 22:11:06.446 - stdout> at java.lang.reflect.Method.invoke(Method.java:498) 2019-02-25 22:11:06.446 - stdout> at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244) 2019-02-25 22:11:06.446 - stdout> at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357) 2019-02-25 22:11:06.446 - stdout> at py4j.Gateway.invoke(Gateway.java:282) 2019-02-25 22:11:06.446 - stdout> at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132) 2019-02-25 22:11:06.446 - stdout> at py4j.commands.CallCommand.execute(CallCommand.java:79) 2019-02-25 22:11:06.446 - stdout> at py4j.GatewayConnection.run(GatewayConnection.java:238) 2019-02-25 22:11:06.446 - stdout> at java.lang.Thread.run(Thread.java:748) 2019-02-25 22:11:06.447 - stdout> 2019-02-25 22:11:06 INFO ObjectStore:272 - Initialized ObjectStore 2019-02-25 22:11:06.517 - stdout> 2019-02-25 22:11:06 ERROR MetaData:115 - Error opening the Meta-Data file "jar:file:/tmp/test-spark/spark-2.4.0/jars/hive-metastore-1.2.1.spark2.jar!/package.jdo" 2019-02-25 22:11:06.519 - stdout> 2019-02-25 22:11:06 WARN HiveMetaStore:622 - Retrying creating default database after error: Error opening the Meta-Data file "jar:file:/tmp/test-spark/spark-2.4.0/jars/hive-metastore-1.2.1.spark2.jar!/package.jdo" 2019-02-25 22:11:06.519 - stdout> javax.jdo.JDOException: Error opening the Meta-Data file "jar:file:/tmp/test-spark/spark-2.4.0/jars/hive-metastore-1.2.1.spark2.jar!/package.jdo" 2019-02-25 22:11:06.519 - stdout> at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:600) 2019-02-25 22:11:06.519 - stdout> at org.datanucleus.api.jdo.JDOQuery.execute(JDOQuery.java:230) 2019-02-25 22:11:06.519 - stdout> at org.apache.hadoop.hive.metastore.ObjectStore.getMSchemaVersion(ObjectStore.java:6721) 2019-02-25 22:11:06.519 - stdout> at org.apache.hadoop.hive.metastore.ObjectStore.getMetaStoreSchemaVersion(ObjectStore.java:6703) 2019-02-25 22:11:06.519 - stdout> at org.apache.hadoop.hive.metastore.ObjectStore.checkSchema(ObjectStore.java:6661) 2019-02-25 22:11:06.519 - stdout> at org.apache.hadoop.hive.metastore.ObjectStore.verifySchema(ObjectStore.java:6645) 2019-02-25 22:11:06.519 - stdout> at 
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 2019-02-25 22:11:06.519 - stdout> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 2019-02-25 22:11:06.519 - stdout> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 2019-02-25 22:11:06.519 - stdout> at java.lang.reflect.Method.invoke(Method.java:498) 2019-02-25 22:11:06.519 - stdout> at org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:114) 2019-02-25 22:11:06.519 - stdout> at com.sun.proxy.$Proxy19.verifySchema(Unknown Source) 2019-02-25 22:11:06.52 - stdout> at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:572) 2019-02-25 22:11:06.52 - stdout> at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:620) 2019-02-25 22:11:06.52 - stdout> at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461) 2019-02-25 22:11:06.52 - stdout> at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66) 2019-02-25 22:11:06.52 - stdout> at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72) 2019-02-25 22:11:06.52 - stdout> at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762) 2019-02-25 22:11:06.52 - stdout> at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199) 2019-02-25 22:11:06.52 - stdout> at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74) 2019-02-25 22:11:06.52 - stdout> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) 2019-02-25 22:11:06.52 - stdout> at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) 2019-02-25 22:11:06.52 - stdout> at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) 2019-02-25 22:11:06.52 - stdout> at java.lang.reflect.Constructor.newInstance(Constructor.java:423) 2019-02-25 22:11:06.52 - stdout> at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521) 2019-02-25 22:11:06.52 - stdout> at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86) 2019-02-25 22:11:06.52 - stdout> at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132) 2019-02-25 22:11:06.52 - stdout> at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104) 2019-02-25 22:11:06.52 - stdout> at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005) 2019-02-25 22:11:06.52 - stdout> at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024) 2019-02-25 22:11:06.52 - stdout> at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1234) 2019-02-25 22:11:06.52 - stdout> at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:174) 2019-02-25 22:11:06.52 - stdout> at org.apache.hadoop.hive.ql.metadata.Hive.<clinit>(Hive.java:166) 2019-02-25 22:11:06.52 - stdout> at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503) 2019-02-25 22:11:06.52 - stdout> at org.apache.spark.sql.hive.client.HiveClientImpl.newState(HiveClientImpl.scala:183) 2019-02-25 22:11:06.52 - stdout> at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:117) 2019-02-25 22:11:06.52 - stdout> at 
sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) 2019-02-25 22:11:06.52 - stdout> at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) 2019-02-25 22:11:06.52 - stdout> at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) 2019-02-25 22:11:06.52 - stdout> at java.lang.reflect.Constructor.newInstance(Constructor.java:423) 2019-02-25 22:11:06.52 - stdout> at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:272) 2019-02-25 22:11:06.52 - stdout> at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:384) 2019-02-25 22:11:06.52 - stdout> at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:286) 2019-02-25 22:11:06.52 - stdout> at org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:66) 2019-02-25 22:11:06.52 - stdout> at org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:65) 2019-02-25 22:11:06.52 - stdout> at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply$mcZ$sp(HiveExternalCatalog.scala:215) 2019-02-25 22:11:06.52 - stdout> at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:215) 2019-02-25 22:11:06.52 - stdout> at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:215) 2019-02-25 22:11:06.52 - stdout> at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:97) 2019-02-25 22:11:06.52 - stdout> at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:214) 2019-02-25 22:11:06.52 - stdout> at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:114) 2019-02-25 22:11:06.52 - stdout> at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:102) 2019-02-25 22:11:06.52 - stdout> at org.apache.spark.sql.hive.HiveSessionStateBuilder.org$apache$spark$sql$hive$HiveSessionStateBuilder$$externalCatalog(HiveSessionStateBuilder.scala:39) 2019-02-25 22:11:06.52 - stdout> at org.apache.spark.sql.hive.HiveSessionStateBuilder$$anonfun$1.apply(HiveSessionStateBuilder.scala:54) 2019-02-25 22:11:06.52 - stdout> at org.apache.spark.sql.hive.HiveSessionStateBuilder$$anonfun$1.apply(HiveSessionStateBuilder.scala:54) 2019-02-25 22:11:06.52 - stdout> at org.apache.spark.sql.catalyst.catalog.SessionCatalog.externalCatalog$lzycompute(SessionCatalog.scala:90) 2019-02-25 22:11:06.52 - stdout> at org.apache.spark.sql.catalyst.catalog.SessionCatalog.externalCatalog(SessionCatalog.scala:90) 2019-02-25 22:11:06.52 - stdout> at org.apache.spark.sql.catalyst.catalog.SessionCatalog.tableExists(SessionCatalog.scala:415) 2019-02-25 22:11:06.52 - stdout> at org.apache.spark.sql.execution.command.CreateDataSourceTableAsSelectCommand.run(createDataSourceTables.scala:154) 2019-02-25 22:11:06.52 - stdout> at org.apache.spark.sql.execution.command.DataWritingCommandExec.sideEffectResult$lzycompute(commands.scala:104) 2019-02-25 22:11:06.52 - stdout> at org.apache.spark.sql.execution.command.DataWritingCommandExec.sideEffectResult(commands.scala:102) 2019-02-25 22:11:06.52 - stdout> at org.apache.spark.sql.execution.command.DataWritingCommandExec.executeCollect(commands.scala:115) 2019-02-25 22:11:06.52 - stdout> at org.apache.spark.sql.Dataset$$anonfun$6.apply(Dataset.scala:195) 2019-02-25 22:11:06.52 - stdout> at 
org.apache.spark.sql.Dataset$$anonfun$6.apply(Dataset.scala:195) 2019-02-25 22:11:06.52 - stdout> at org.apache.spark.sql.Dataset$$anonfun$53.apply(Dataset.scala:3365) 2019-02-25 22:11:06.52 - stdout> at org.apache.spark.sql.execution.SQLExecution$$anonfun$withNewExecutionId$1.apply(SQLExecution.scala:78) 2019-02-25 22:11:06.52 - stdout> at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:125) 2019-02-25 22:11:06.52 - stdout> at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:73) 2019-02-25 22:11:06.52 - stdout> at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3364) 2019-02-25 22:11:06.52 - stdout> at org.apache.spark.sql.Dataset.<init>(Dataset.scala:195) 2019-02-25 22:11:06.52 - stdout> at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:80) 2019-02-25 22:11:06.52 - stdout> at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:642) 2019-02-25 22:11:06.52 - stdout> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 2019-02-25 22:11:06.52 - stdout> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 2019-02-25 22:11:06.52 - stdout> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 2019-02-25 22:11:06.52 - stdout> at java.lang.reflect.Method.invoke(Method.java:498) 2019-02-25 22:11:06.52 - stdout> at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244) 2019-02-25 22:11:06.52 - stdout> at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357) 2019-02-25 22:11:06.52 - stdout> at py4j.Gateway.invoke(Gateway.java:282) 2019-02-25 22:11:06.52 - stdout> at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132) 2019-02-25 22:11:06.52 - stdout> at py4j.commands.CallCommand.execute(CallCommand.java:79) 2019-02-25 22:11:06.52 - stdout> at py4j.GatewayConnection.run(GatewayConnection.java:238) 2019-02-25 22:11:06.52 - stdout> at java.lang.Thread.run(Thread.java:748) 2019-02-25 22:11:06.52 - stdout> NestedThrowablesStackTrace: 2019-02-25 22:11:06.52 - stdout> Error opening the Meta-Data file "jar:file:/tmp/test-spark/spark-2.4.0/jars/hive-metastore-1.2.1.spark2.jar!/package.jdo" 2019-02-25 22:11:06.52 - stdout> org.datanucleus.exceptions.NucleusException: Error opening the Meta-Data file "jar:file:/tmp/test-spark/spark-2.4.0/jars/hive-metastore-1.2.1.spark2.jar!/package.jdo" 2019-02-25 22:11:06.52 - stdout> at org.datanucleus.metadata.xml.MetaDataParser.parseMetaDataURL(MetaDataParser.java:145) 2019-02-25 22:11:06.52 - stdout> at org.datanucleus.api.jdo.metadata.JDOMetaDataManager.parseFile(JDOMetaDataManager.java:240) 2019-02-25 22:11:06.52 - stdout> at org.datanucleus.api.jdo.metadata.JDOMetaDataManager.loadXMLMetaDataForClass(JDOMetaDataManager.java:773) 2019-02-25 22:11:06.521 - stdout> at org.datanucleus.api.jdo.metadata.JDOMetaDataManager.getMetaDataForClassInternal(JDOMetaDataManager.java:383) 2019-02-25 22:11:06.521 - stdout> at org.datanucleus.metadata.MetaDataManager.getMetaDataForClass(MetaDataManager.java:1570) 2019-02-25 22:11:06.521 - stdout> at org.datanucleus.ExecutionContextImpl.hasPersistenceInformationForClass(ExecutionContextImpl.java:5768) 2019-02-25 22:11:06.521 - stdout> at org.datanucleus.store.rdbms.query.JDOQLQuery.compileInternal(JDOQLQuery.java:258) 2019-02-25 22:11:06.521 - stdout> at org.datanucleus.store.query.Query.executeQuery(Query.java:1744) 2019-02-25 22:11:06.521 - stdout> at org.datanucleus.store.query.Query.executeWithArray(Query.java:1672) 2019-02-25 22:11:06.521 - 
stdout> at org.datanucleus.store.query.Query.execute(Query.java:1654) 2019-02-25 22:11:06.521 - stdout> at org.datanucleus.api.jdo.JDOQuery.execute(JDOQuery.java:221) 2019-02-25 22:11:06.521 - stdout> at org.apache.hadoop.hive.metastore.ObjectStore.getMSchemaVersion(ObjectStore.java:6721) 2019-02-25 22:11:06.521 - stdout> at org.apache.hadoop.hive.metastore.ObjectStore.getMetaStoreSchemaVersion(ObjectStore.java:6703) 2019-02-25 22:11:06.521 - stdout> at org.apache.hadoop.hive.metastore.ObjectStore.checkSchema(ObjectStore.java:6661) 2019-02-25 22:11:06.521 - stdout> at org.apache.hadoop.hive.metastore.ObjectStore.verifySchema(ObjectStore.java:6645) 2019-02-25 22:11:06.521 - stdout> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 2019-02-25 22:11:06.521 - stdout> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 2019-02-25 22:11:06.521 - stdout> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 2019-02-25 22:11:06.521 - stdout> at java.lang.reflect.Method.invoke(Method.java:498) 2019-02-25 22:11:06.521 - stdout> at org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:114) 2019-02-25 22:11:06.521 - stdout> at com.sun.proxy.$Proxy19.verifySchema(Unknown Source) 2019-02-25 22:11:06.521 - stdout> at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:572) 2019-02-25 22:11:06.521 - stdout> at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:620) 2019-02-25 22:11:06.521 - stdout> at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461) 2019-02-25 22:11:06.521 - stdout> at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66) 2019-02-25 22:11:06.521 - stdout> at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72) 2019-02-25 22:11:06.521 - stdout> at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762) 2019-02-25 22:11:06.521 - stdout> at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199) 2019-02-25 22:11:06.521 - stdout> at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74) 2019-02-25 22:11:06.521 - stdout> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) 2019-02-25 22:11:06.521 - stdout> at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) 2019-02-25 22:11:06.521 - stdout> at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) 2019-02-25 22:11:06.521 - stdout> at java.lang.reflect.Constructor.newInstance(Constructor.java:423) 2019-02-25 22:11:06.521 - stdout> at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521) 2019-02-25 22:11:06.521 - stdout> at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86) 2019-02-25 22:11:06.521 - stdout> at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132) 2019-02-25 22:11:06.521 - stdout> at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104) 2019-02-25 22:11:06.521 - stdout> at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005) 2019-02-25 22:11:06.521 - stdout> at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024) 2019-02-25 
22:11:06.521 - stdout> at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1234) 2019-02-25 22:11:06.521 - stdout> at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:174) 2019-02-25 22:11:06.521 - stdout> at org.apache.hadoop.hive.ql.metadata.Hive.<clinit>(Hive.java:166) 2019-02-25 22:11:06.521 - stdout> at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503) 2019-02-25 22:11:06.521 - stdout> at org.apache.spark.sql.hive.client.HiveClientImpl.newState(HiveClientImpl.scala:183) 2019-02-25 22:11:06.521 - stdout> at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:117) 2019-02-25 22:11:06.521 - stdout> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) 2019-02-25 22:11:06.521 - stdout> at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) 2019-02-25 22:11:06.521 - stdout> at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) 2019-02-25 22:11:06.521 - stdout> at java.lang.reflect.Constructor.newInstance(Constructor.java:423) 2019-02-25 22:11:06.521 - stdout> at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:272) 2019-02-25 22:11:06.521 - stdout> at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:384) 2019-02-25 22:11:06.521 - stdout> at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:286) 2019-02-25 22:11:06.521 - stdout> at org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:66) 2019-02-25 22:11:06.521 - stdout> at org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:65) 2019-02-25 22:11:06.521 - stdout> at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply$mcZ$sp(HiveExternalCatalog.scala:215) 2019-02-25 22:11:06.521 - stdout> at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:215) 2019-02-25 22:11:06.521 - stdout> at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:215) 2019-02-25 22:11:06.521 - stdout> at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:97) 2019-02-25 22:11:06.521 - stdout> at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:214) 2019-02-25 22:11:06.521 - stdout> at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:114) 2019-02-25 22:11:06.521 - stdout> at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:102) 2019-02-25 22:11:06.521 - stdout> at org.apache.spark.sql.hive.HiveSessionStateBuilder.org$apache$spark$sql$hive$HiveSessionStateBuilder$$externalCatalog(HiveSessionStateBuilder.scala:39) 2019-02-25 22:11:06.521 - stdout> at org.apache.spark.sql.hive.HiveSessionStateBuilder$$anonfun$1.apply(HiveSessionStateBuilder.scala:54) 2019-02-25 22:11:06.521 - stdout> at org.apache.spark.sql.hive.HiveSessionStateBuilder$$anonfun$1.apply(HiveSessionStateBuilder.scala:54) 2019-02-25 22:11:06.521 - stdout> at org.apache.spark.sql.catalyst.catalog.SessionCatalog.externalCatalog$lzycompute(SessionCatalog.scala:90) 2019-02-25 22:11:06.521 - stdout> at org.apache.spark.sql.catalyst.catalog.SessionCatalog.externalCatalog(SessionCatalog.scala:90) 2019-02-25 22:11:06.522 - stdout> at 
org.apache.spark.sql.catalyst.catalog.SessionCatalog.tableExists(SessionCatalog.scala:415) 2019-02-25 22:11:06.522 - stdout> at org.apache.spark.sql.execution.command.CreateDataSourceTableAsSelectCommand.run(createDataSourceTables.scala:154) 2019-02-25 22:11:06.522 - stdout> at org.apache.spark.sql.execution.command.DataWritingCommandExec.sideEffectResult$lzycompute(commands.scala:104) 2019-02-25 22:11:06.522 - stdout> at org.apache.spark.sql.execution.command.DataWritingCommandExec.sideEffectResult(commands.scala:102) 2019-02-25 22:11:06.522 - stdout> at org.apache.spark.sql.execution.command.DataWritingCommandExec.executeCollect(commands.scala:115) 2019-02-25 22:11:06.522 - stdout> at org.apache.spark.sql.Dataset$$anonfun$6.apply(Dataset.scala:195) 2019-02-25 22:11:06.522 - stdout> at org.apache.spark.sql.Dataset$$anonfun$6.apply(Dataset.scala:195) 2019-02-25 22:11:06.522 - stdout> at org.apache.spark.sql.Dataset$$anonfun$53.apply(Dataset.scala:3365) 2019-02-25 22:11:06.522 - stdout> at org.apache.spark.sql.execution.SQLExecution$$anonfun$withNewExecutionId$1.apply(SQLExecution.scala:78) 2019-02-25 22:11:06.522 - stdout> at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:125) 2019-02-25 22:11:06.522 - stdout> at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:73) 2019-02-25 22:11:06.522 - stdout> at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3364) 2019-02-25 22:11:06.522 - stdout> at org.apache.spark.sql.Dataset.<init>(Dataset.scala:195) 2019-02-25 22:11:06.522 - stdout> at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:80) 2019-02-25 22:11:06.522 - stdout> at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:642) 2019-02-25 22:11:06.522 - stdout> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 2019-02-25 22:11:06.522 - stdout> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 2019-02-25 22:11:06.522 - stdout> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 2019-02-25 22:11:06.522 - stdout> at java.lang.reflect.Method.invoke(Method.java:498) 2019-02-25 22:11:06.522 - stdout> at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244) 2019-02-25 22:11:06.522 - stdout> at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357) 2019-02-25 22:11:06.522 - stdout> at py4j.Gateway.invoke(Gateway.java:282) 2019-02-25 22:11:06.522 - stdout> at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132) 2019-02-25 22:11:06.522 - stdout> at py4j.commands.CallCommand.execute(CallCommand.java:79) 2019-02-25 22:11:06.522 - stdout> at py4j.GatewayConnection.run(GatewayConnection.java:238) 2019-02-25 22:11:06.522 - stdout> at java.lang.Thread.run(Thread.java:748) 2019-02-25 22:11:06.522 - stdout> 2019-02-25 22:11:06 INFO HiveMetaStore:589 - 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore 2019-02-25 22:11:06.523 - stdout> 2019-02-25 22:11:06 INFO ObjectStore:289 - ObjectStore, initialize called 2019-02-25 22:11:06.529 - stdout> 2019-02-25 22:11:06 ERROR MetaData:115 - Error opening the Meta-Data file "jar:file:/tmp/test-spark/spark-2.4.0/jars/hive-metastore-1.2.1.spark2.jar!/package.jdo" 2019-02-25 22:11:06.531 - stdout> 2019-02-25 22:11:06 WARN MetaStoreDirectSql:188 - Database initialization failed; direct SQL is disabled 2019-02-25 22:11:06.531 - stdout> javax.jdo.JDOException: Error opening the Meta-Data file 
"jar:file:/tmp/test-spark/spark-2.4.0/jars/hive-metastore-1.2.1.spark2.jar!/package.jdo" 2019-02-25 22:11:06.531 - stdout> at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:600) 2019-02-25 22:11:06.531 - stdout> at org.datanucleus.api.jdo.JDOQuery.execute(JDOQuery.java:230) 2019-02-25 22:11:06.531 - stdout> at org.apache.hadoop.hive.metastore.MetaStoreDirectSql.ensureDbInit(MetaStoreDirectSql.java:183) 2019-02-25 22:11:06.531 - stdout> at org.apache.hadoop.hive.metastore.MetaStoreDirectSql.<init>(MetaStoreDirectSql.java:137) 2019-02-25 22:11:06.531 - stdout> at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:295) 2019-02-25 22:11:06.531 - stdout> at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258) 2019-02-25 22:11:06.531 - stdout> at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76) 2019-02-25 22:11:06.531 - stdout> at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136) 2019-02-25 22:11:06.531 - stdout> at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57) 2019-02-25 22:11:06.531 - stdout> at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66) 2019-02-25 22:11:06.531 - stdout> at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593) 2019-02-25 22:11:06.531 - stdout> at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571) 2019-02-25 22:11:06.531 - stdout> at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:624) 2019-02-25 22:11:06.531 - stdout> at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461) 2019-02-25 22:11:06.531 - stdout> at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66) 2019-02-25 22:11:06.531 - stdout> at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72) 2019-02-25 22:11:06.531 - stdout> at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762) 2019-02-25 22:11:06.531 - stdout> at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199) 2019-02-25 22:11:06.531 - stdout> at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74) 2019-02-25 22:11:06.531 - stdout> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) 2019-02-25 22:11:06.531 - stdout> at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) 2019-02-25 22:11:06.531 - stdout> at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) 2019-02-25 22:11:06.531 - stdout> at java.lang.reflect.Constructor.newInstance(Constructor.java:423) 2019-02-25 22:11:06.531 - stdout> at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521) 2019-02-25 22:11:06.531 - stdout> at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86) 2019-02-25 22:11:06.531 - stdout> at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132) 2019-02-25 22:11:06.531 - stdout> at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104) 2019-02-25 22:11:06.531 - stdout> at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005) 
2019-02-25 22:11:06.531 - stdout> at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024) 2019-02-25 22:11:06.531 - stdout> at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1234) 2019-02-25 22:11:06.531 - stdout> at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:174) 2019-02-25 22:11:06.531 - stdout> at org.apache.hadoop.hive.ql.metadata.Hive.<clinit>(Hive.java:166) 2019-02-25 22:11:06.531 - stdout> at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503) 2019-02-25 22:11:06.531 - stdout> at org.apache.spark.sql.hive.client.HiveClientImpl.newState(HiveClientImpl.scala:183) 2019-02-25 22:11:06.531 - stdout> at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:117) 2019-02-25 22:11:06.531 - stdout> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) 2019-02-25 22:11:06.531 - stdout> at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) 2019-02-25 22:11:06.531 - stdout> at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) 2019-02-25 22:11:06.531 - stdout> at java.lang.reflect.Constructor.newInstance(Constructor.java:423) 2019-02-25 22:11:06.531 - stdout> at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:272) 2019-02-25 22:11:06.531 - stdout> at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:384) 2019-02-25 22:11:06.531 - stdout> at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:286) 2019-02-25 22:11:06.531 - stdout> at org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:66) 2019-02-25 22:11:06.531 - stdout> at org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:65) 2019-02-25 22:11:06.531 - stdout> at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply$mcZ$sp(HiveExternalCatalog.scala:215) 2019-02-25 22:11:06.531 - stdout> at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:215) 2019-02-25 22:11:06.532 - stdout> at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:215) 2019-02-25 22:11:06.532 - stdout> at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:97) 2019-02-25 22:11:06.532 - stdout> at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:214) 2019-02-25 22:11:06.532 - stdout> at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:114) 2019-02-25 22:11:06.532 - stdout> at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:102) 2019-02-25 22:11:06.532 - stdout> at org.apache.spark.sql.hive.HiveSessionStateBuilder.org$apache$spark$sql$hive$HiveSessionStateBuilder$$externalCatalog(HiveSessionStateBuilder.scala:39) 2019-02-25 22:11:06.532 - stdout> at org.apache.spark.sql.hive.HiveSessionStateBuilder$$anonfun$1.apply(HiveSessionStateBuilder.scala:54) 2019-02-25 22:11:06.532 - stdout> at org.apache.spark.sql.hive.HiveSessionStateBuilder$$anonfun$1.apply(HiveSessionStateBuilder.scala:54) 2019-02-25 22:11:06.532 - stdout> at org.apache.spark.sql.catalyst.catalog.SessionCatalog.externalCatalog$lzycompute(SessionCatalog.scala:90) 2019-02-25 22:11:06.532 - stdout> at org.apache.spark.sql.catalyst.catalog.SessionCatalog.externalCatalog(SessionCatalog.scala:90) 
2019-02-25 22:11:06.532 - stdout> at org.apache.spark.sql.catalyst.catalog.SessionCatalog.tableExists(SessionCatalog.scala:415) 2019-02-25 22:11:06.532 - stdout> at org.apache.spark.sql.execution.command.CreateDataSourceTableAsSelectCommand.run(createDataSourceTables.scala:154) 2019-02-25 22:11:06.532 - stdout> at org.apache.spark.sql.execution.command.DataWritingCommandExec.sideEffectResult$lzycompute(commands.scala:104) 2019-02-25 22:11:06.532 - stdout> at org.apache.spark.sql.execution.command.DataWritingCommandExec.sideEffectResult(commands.scala:102) 2019-02-25 22:11:06.532 - stdout> at org.apache.spark.sql.execution.command.DataWritingCommandExec.executeCollect(commands.scala:115) 2019-02-25 22:11:06.532 - stdout> at org.apache.spark.sql.Dataset$$anonfun$6.apply(Dataset.scala:195) 2019-02-25 22:11:06.532 - stdout> at org.apache.spark.sql.Dataset$$anonfun$6.apply(Dataset.scala:195) 2019-02-25 22:11:06.532 - stdout> at org.apache.spark.sql.Dataset$$anonfun$53.apply(Dataset.scala:3365) 2019-02-25 22:11:06.532 - stdout> at org.apache.spark.sql.execution.SQLExecution$$anonfun$withNewExecutionId$1.apply(SQLExecution.scala:78) 2019-02-25 22:11:06.532 - stdout> at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:125) 2019-02-25 22:11:06.532 - stdout> at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:73) 2019-02-25 22:11:06.532 - stdout> at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3364) 2019-02-25 22:11:06.532 - stdout> at org.apache.spark.sql.Dataset.<init>(Dataset.scala:195) 2019-02-25 22:11:06.532 - stdout> at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:80) 2019-02-25 22:11:06.532 - stdout> at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:642) 2019-02-25 22:11:06.532 - stdout> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 2019-02-25 22:11:06.532 - stdout> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 2019-02-25 22:11:06.532 - stdout> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 2019-02-25 22:11:06.532 - stdout> at java.lang.reflect.Method.invoke(Method.java:498) 2019-02-25 22:11:06.532 - stdout> at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244) 2019-02-25 22:11:06.532 - stdout> at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357) 2019-02-25 22:11:06.532 - stdout> at py4j.Gateway.invoke(Gateway.java:282) 2019-02-25 22:11:06.532 - stdout> at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132) 2019-02-25 22:11:06.532 - stdout> at py4j.commands.CallCommand.execute(CallCommand.java:79) 2019-02-25 22:11:06.532 - stdout> at py4j.GatewayConnection.run(GatewayConnection.java:238) 2019-02-25 22:11:06.532 - stdout> at java.lang.Thread.run(Thread.java:748) 2019-02-25 22:11:06.532 - stdout> NestedThrowablesStackTrace: 2019-02-25 22:11:06.532 - stdout> Error opening the Meta-Data file "jar:file:/tmp/test-spark/spark-2.4.0/jars/hive-metastore-1.2.1.spark2.jar!/package.jdo" 2019-02-25 22:11:06.532 - stdout> org.datanucleus.exceptions.NucleusException: Error opening the Meta-Data file "jar:file:/tmp/test-spark/spark-2.4.0/jars/hive-metastore-1.2.1.spark2.jar!/package.jdo" 2019-02-25 22:11:06.532 - stdout> at org.datanucleus.metadata.xml.MetaDataParser.parseMetaDataURL(MetaDataParser.java:145) 2019-02-25 22:11:06.532 - stdout> at org.datanucleus.api.jdo.metadata.JDOMetaDataManager.parseFile(JDOMetaDataManager.java:240) 2019-02-25 22:11:06.532 - stdout> at 
org.datanucleus.api.jdo.metadata.JDOMetaDataManager.loadXMLMetaDataForClass(JDOMetaDataManager.java:773) 2019-02-25 22:11:06.532 - stdout> at org.datanucleus.api.jdo.metadata.JDOMetaDataManager.getMetaDataForClassInternal(JDOMetaDataManager.java:383) 2019-02-25 22:11:06.532 - stdout> at org.datanucleus.metadata.MetaDataManager.getMetaDataForClass(MetaDataManager.java:1570) 2019-02-25 22:11:06.532 - stdout> at org.datanucleus.query.compiler.JavaQueryCompiler.getType(JavaQueryCompiler.java:960) 2019-02-25 22:11:06.532 - stdout> at org.datanucleus.query.compiler.JavaQueryCompiler.getType(JavaQueryCompiler.java:952) 2019-02-25 22:11:06.532 - stdout> at org.datanucleus.query.expression.PrimaryExpression.bind(PrimaryExpression.java:129) 2019-02-25 22:11:06.532 - stdout> at org.datanucleus.query.expression.DyadicExpression.bind(DyadicExpression.java:87) 2019-02-25 22:11:06.532 - stdout> at org.datanucleus.query.compiler.JavaQueryCompiler.compileFilter(JavaQueryCompiler.java:481) 2019-02-25 22:11:06.532 - stdout> at org.datanucleus.query.compiler.JDOQLCompiler.compile(JDOQLCompiler.java:113) 2019-02-25 22:11:06.532 - stdout> at org.datanucleus.store.query.AbstractJDOQLQuery.compileInternal(AbstractJDOQLQuery.java:367) 2019-02-25 22:11:06.532 - stdout> at org.datanucleus.store.rdbms.query.JDOQLQuery.compileInternal(JDOQLQuery.java:240) 2019-02-25 22:11:06.532 - stdout> at org.datanucleus.store.query.Query.executeQuery(Query.java:1744) 2019-02-25 22:11:06.532 - stdout> at org.datanucleus.store.query.Query.executeWithArray(Query.java:1672) 2019-02-25 22:11:06.532 - stdout> at org.datanucleus.store.query.Query.execute(Query.java:1654) 2019-02-25 22:11:06.532 - stdout> at org.datanucleus.api.jdo.JDOQuery.execute(JDOQuery.java:221) 2019-02-25 22:11:06.532 - stdout> at org.apache.hadoop.hive.metastore.MetaStoreDirectSql.ensureDbInit(MetaStoreDirectSql.java:183) 2019-02-25 22:11:06.533 - stdout> at org.apache.hadoop.hive.metastore.MetaStoreDirectSql.<init>(MetaStoreDirectSql.java:137) 2019-02-25 22:11:06.533 - stdout> at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:295) 2019-02-25 22:11:06.533 - stdout> at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258) 2019-02-25 22:11:06.533 - stdout> at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76) 2019-02-25 22:11:06.533 - stdout> at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136) 2019-02-25 22:11:06.533 - stdout> at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57) 2019-02-25 22:11:06.533 - stdout> at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66) 2019-02-25 22:11:06.533 - stdout> at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593) 2019-02-25 22:11:06.533 - stdout> at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571) 2019-02-25 22:11:06.533 - stdout> at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:624) 2019-02-25 22:11:06.533 - stdout> at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461) 2019-02-25 22:11:06.533 - stdout> at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66) 2019-02-25 22:11:06.533 - stdout> at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandle

sbt.ForkMain$ForkError: org.scalatest.exceptions.TestFailedException: spark-submit returned with exit code 1.
Command line: './bin/spark-submit' '--name' 'prepare testing tables' '--master' 'local[2]' '--conf' 'spark.ui.enabled=false' '--conf' 'spark.master.rest.enabled=false' '--conf' 'spark.sql.warehouse.dir=/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/target/tmp/warehouse-f58ea99f-b5f3-40ee-942d-afe643b3a43c' '--conf' 'spark.sql.test.version.index=1' '--driver-java-options' '-Dderby.system.home=/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/target/tmp/warehouse-f58ea99f-b5f3-40ee-942d-afe643b3a43c' '/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/target/tmp/test4459602032925096236.py'

2019-02-25 22:10:55.501 - stdout> 2019-02-25 22:10:55 WARN  NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2019-02-25 22:10:56.251 - stdout> 2019-02-25 22:10:56 INFO  SparkContext:54 - Running Spark version 2.4.0
2019-02-25 22:10:56.275 - stdout> 2019-02-25 22:10:56 INFO  SparkContext:54 - Submitted application: prepare testing tables
2019-02-25 22:10:56.332 - stdout> 2019-02-25 22:10:56 INFO  SecurityManager:54 - Changing view acls to: jenkins
2019-02-25 22:10:56.332 - stdout> 2019-02-25 22:10:56 INFO  SecurityManager:54 - Changing modify acls to: jenkins
2019-02-25 22:10:56.332 - stdout> 2019-02-25 22:10:56 INFO  SecurityManager:54 - Changing view acls groups to: 
2019-02-25 22:10:56.333 - stdout> 2019-02-25 22:10:56 INFO  SecurityManager:54 - Changing modify acls groups to: 
2019-02-25 22:10:56.333 - stdout> 2019-02-25 22:10:56 INFO  SecurityManager:54 - SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(jenkins); groups with view permissions: Set(); users  with modify permissions: Set(jenkins); groups with modify permissions: Set()
2019-02-25 22:10:56.627 - stdout> 2019-02-25 22:10:56 INFO  Utils:54 - Successfully started service 'sparkDriver' on port 37777.
2019-02-25 22:10:56.652 - stdout> 2019-02-25 22:10:56 INFO  SparkEnv:54 - Registering MapOutputTracker
2019-02-25 22:10:56.67 - stdout> 2019-02-25 22:10:56 INFO  SparkEnv:54 - Registering BlockManagerMaster
2019-02-25 22:10:56.673 - stdout> 2019-02-25 22:10:56 INFO  BlockManagerMasterEndpoint:54 - Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
2019-02-25 22:10:56.674 - stdout> 2019-02-25 22:10:56 INFO  BlockManagerMasterEndpoint:54 - BlockManagerMasterEndpoint up
2019-02-25 22:10:56.682 - stdout> 2019-02-25 22:10:56 INFO  DiskBlockManager:54 - Created local directory at /tmp/blockmgr-e8441f5f-4c53-40a9-abac-d85bb66abef5
2019-02-25 22:10:56.699 - stdout> 2019-02-25 22:10:56 INFO  MemoryStore:54 - MemoryStore started with capacity 366.3 MB
2019-02-25 22:10:56.714 - stdout> 2019-02-25 22:10:56 INFO  SparkEnv:54 - Registering OutputCommitCoordinator
2019-02-25 22:10:56.791 - stdout> 2019-02-25 22:10:56 INFO  Executor:54 - Starting executor ID driver on host localhost
2019-02-25 22:10:56.858 - stdout> 2019-02-25 22:10:56 INFO  Utils:54 - Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 44398.
2019-02-25 22:10:56.858 - stdout> 2019-02-25 22:10:56 INFO  NettyBlockTransferService:54 - Server created on amp-jenkins-worker-05.amp:44398
2019-02-25 22:10:56.86 - stdout> 2019-02-25 22:10:56 INFO  BlockManager:54 - Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
2019-02-25 22:10:56.884 - stdout> 2019-02-25 22:10:56 INFO  BlockManagerMaster:54 - Registering BlockManager BlockManagerId(driver, amp-jenkins-worker-05.amp, 44398, None)
2019-02-25 22:10:56.887 - stdout> 2019-02-25 22:10:56 INFO  BlockManagerMasterEndpoint:54 - Registering block manager amp-jenkins-worker-05.amp:44398 with 366.3 MB RAM, BlockManagerId(driver, amp-jenkins-worker-05.amp, 44398, None)
2019-02-25 22:10:56.891 - stdout> 2019-02-25 22:10:56 INFO  BlockManagerMaster:54 - Registered BlockManager BlockManagerId(driver, amp-jenkins-worker-05.amp, 44398, None)
2019-02-25 22:10:56.892 - stdout> 2019-02-25 22:10:56 INFO  BlockManager:54 - Initialized BlockManager: BlockManagerId(driver, amp-jenkins-worker-05.amp, 44398, None)
2019-02-25 22:10:57.051 - stdout> 2019-02-25 22:10:57 INFO  log:192 - Logging initialized @2580ms
2019-02-25 22:10:57.203 - stdout> 2019-02-25 22:10:57 INFO  SharedState:54 - Setting hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir ('/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/target/tmp/warehouse-f58ea99f-b5f3-40ee-942d-afe643b3a43c').
2019-02-25 22:10:57.204 - stdout> 2019-02-25 22:10:57 INFO  SharedState:54 - Warehouse path is '/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/target/tmp/warehouse-f58ea99f-b5f3-40ee-942d-afe643b3a43c'.
2019-02-25 22:10:57.716 - stdout> 2019-02-25 22:10:57 INFO  StateStoreCoordinatorRef:54 - Registered StateStoreCoordinator endpoint
2019-02-25 22:10:59.768 - stdout> 2019-02-25 22:10:59 INFO  HiveUtils:54 - Initializing HiveMetastoreConnection version 1.2.1 using Spark classes.
2019-02-25 22:11:00.389 - stdout> 2019-02-25 22:11:00 INFO  HiveMetaStore:589 - 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
2019-02-25 22:11:00.412 - stdout> 2019-02-25 22:11:00 INFO  ObjectStore:289 - ObjectStore, initialize called
2019-02-25 22:11:00.524 - stdout> 2019-02-25 22:11:00 INFO  Persistence:77 - Property hive.metastore.integral.jdo.pushdown unknown - will be ignored
2019-02-25 22:11:00.525 - stdout> 2019-02-25 22:11:00 INFO  Persistence:77 - Property datanucleus.cache.level2 unknown - will be ignored
2019-02-25 22:11:06.342 - stdout> 2019-02-25 22:11:06 INFO  ObjectStore:370 - Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
2019-02-25 22:11:06.437 - stdout> 2019-02-25 22:11:06 ERROR MetaData:115 - Error opening the Meta-Data file "jar:file:/tmp/test-spark/spark-2.4.0/jars/hive-metastore-1.2.1.spark2.jar!/package.jdo"
2019-02-25 22:11:06.443 - stdout> 2019-02-25 22:11:06 WARN  MetaStoreDirectSql:188 - Database initialization failed; direct SQL is disabled
2019-02-25 22:11:06.443 - stdout> javax.jdo.JDOException: Error opening the Meta-Data file "jar:file:/tmp/test-spark/spark-2.4.0/jars/hive-metastore-1.2.1.spark2.jar!/package.jdo"
2019-02-25 22:11:06.443 - stdout> 	at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:600)
2019-02-25 22:11:06.443 - stdout> 	at org.datanucleus.api.jdo.JDOQuery.execute(JDOQuery.java:230)
2019-02-25 22:11:06.443 - stdout> 	at org.apache.hadoop.hive.metastore.MetaStoreDirectSql.ensureDbInit(MetaStoreDirectSql.java:183)
2019-02-25 22:11:06.443 - stdout> 	at org.apache.hadoop.hive.metastore.MetaStoreDirectSql.<init>(MetaStoreDirectSql.java:137)
2019-02-25 22:11:06.443 - stdout> 	at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:295)
2019-02-25 22:11:06.443 - stdout> 	at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)
2019-02-25 22:11:06.443 - stdout> 	at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
2019-02-25 22:11:06.443 - stdout> 	at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
2019-02-25 22:11:06.443 - stdout> 	at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)
2019-02-25 22:11:06.443 - stdout> 	at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)
2019-02-25 22:11:06.443 - stdout> 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593)
2019-02-25 22:11:06.443 - stdout> 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571)
2019-02-25 22:11:06.443 - stdout> 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:620)
2019-02-25 22:11:06.443 - stdout> 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
2019-02-25 22:11:06.443 - stdout> 	at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
2019-02-25 22:11:06.443 - stdout> 	at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
2019-02-25 22:11:06.443 - stdout> 	at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
2019-02-25 22:11:06.443 - stdout> 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
2019-02-25 22:11:06.443 - stdout> 	at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
2019-02-25 22:11:06.443 - stdout> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
2019-02-25 22:11:06.443 - stdout> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
2019-02-25 22:11:06.443 - stdout> 	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
2019-02-25 22:11:06.443 - stdout> 	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
2019-02-25 22:11:06.443 - stdout> 	at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
2019-02-25 22:11:06.443 - stdout> 	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
2019-02-25 22:11:06.443 - stdout> 	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
2019-02-25 22:11:06.443 - stdout> 	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
2019-02-25 22:11:06.443 - stdout> 	at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
2019-02-25 22:11:06.443 - stdout> 	at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
2019-02-25 22:11:06.444 - stdout> 	at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1234)
2019-02-25 22:11:06.444 - stdout> 	at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:174)
2019-02-25 22:11:06.444 - stdout> 	at org.apache.hadoop.hive.ql.metadata.Hive.<clinit>(Hive.java:166)
2019-02-25 22:11:06.444 - stdout> 	at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
2019-02-25 22:11:06.444 - stdout> 	at org.apache.spark.sql.hive.client.HiveClientImpl.newState(HiveClientImpl.scala:183)
2019-02-25 22:11:06.444 - stdout> 	at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:117)
2019-02-25 22:11:06.444 - stdout> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
2019-02-25 22:11:06.444 - stdout> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
2019-02-25 22:11:06.444 - stdout> 	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
2019-02-25 22:11:06.444 - stdout> 	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
2019-02-25 22:11:06.444 - stdout> 	at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:272)
2019-02-25 22:11:06.444 - stdout> 	at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:384)
2019-02-25 22:11:06.444 - stdout> 	at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:286)
2019-02-25 22:11:06.444 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:66)
2019-02-25 22:11:06.444 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:65)
2019-02-25 22:11:06.444 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply$mcZ$sp(HiveExternalCatalog.scala:215)
2019-02-25 22:11:06.444 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:215)
2019-02-25 22:11:06.444 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:215)
2019-02-25 22:11:06.444 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:97)
2019-02-25 22:11:06.444 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:214)
2019-02-25 22:11:06.444 - stdout> 	at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:114)
2019-02-25 22:11:06.444 - stdout> 	at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:102)
2019-02-25 22:11:06.444 - stdout> 	at org.apache.spark.sql.hive.HiveSessionStateBuilder.org$apache$spark$sql$hive$HiveSessionStateBuilder$$externalCatalog(HiveSessionStateBuilder.scala:39)
2019-02-25 22:11:06.444 - stdout> 	at org.apache.spark.sql.hive.HiveSessionStateBuilder$$anonfun$1.apply(HiveSessionStateBuilder.scala:54)
2019-02-25 22:11:06.444 - stdout> 	at org.apache.spark.sql.hive.HiveSessionStateBuilder$$anonfun$1.apply(HiveSessionStateBuilder.scala:54)
2019-02-25 22:11:06.444 - stdout> 	at org.apache.spark.sql.catalyst.catalog.SessionCatalog.externalCatalog$lzycompute(SessionCatalog.scala:90)
2019-02-25 22:11:06.444 - stdout> 	at org.apache.spark.sql.catalyst.catalog.SessionCatalog.externalCatalog(SessionCatalog.scala:90)
2019-02-25 22:11:06.444 - stdout> 	at org.apache.spark.sql.catalyst.catalog.SessionCatalog.tableExists(SessionCatalog.scala:415)
2019-02-25 22:11:06.444 - stdout> 	at org.apache.spark.sql.execution.command.CreateDataSourceTableAsSelectCommand.run(createDataSourceTables.scala:154)
2019-02-25 22:11:06.444 - stdout> 	at org.apache.spark.sql.execution.command.DataWritingCommandExec.sideEffectResult$lzycompute(commands.scala:104)
2019-02-25 22:11:06.444 - stdout> 	at org.apache.spark.sql.execution.command.DataWritingCommandExec.sideEffectResult(commands.scala:102)
2019-02-25 22:11:06.444 - stdout> 	at org.apache.spark.sql.execution.command.DataWritingCommandExec.executeCollect(commands.scala:115)
2019-02-25 22:11:06.444 - stdout> 	at org.apache.spark.sql.Dataset$$anonfun$6.apply(Dataset.scala:195)
2019-02-25 22:11:06.444 - stdout> 	at org.apache.spark.sql.Dataset$$anonfun$6.apply(Dataset.scala:195)
2019-02-25 22:11:06.444 - stdout> 	at org.apache.spark.sql.Dataset$$anonfun$53.apply(Dataset.scala:3365)
2019-02-25 22:11:06.444 - stdout> 	at org.apache.spark.sql.execution.SQLExecution$$anonfun$withNewExecutionId$1.apply(SQLExecution.scala:78)
2019-02-25 22:11:06.444 - stdout> 	at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:125)
2019-02-25 22:11:06.444 - stdout> 	at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:73)
2019-02-25 22:11:06.444 - stdout> 	at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3364)
2019-02-25 22:11:06.444 - stdout> 	at org.apache.spark.sql.Dataset.<init>(Dataset.scala:195)
2019-02-25 22:11:06.444 - stdout> 	at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:80)
2019-02-25 22:11:06.444 - stdout> 	at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:642)
2019-02-25 22:11:06.444 - stdout> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
2019-02-25 22:11:06.444 - stdout> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
2019-02-25 22:11:06.444 - stdout> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2019-02-25 22:11:06.444 - stdout> 	at java.lang.reflect.Method.invoke(Method.java:498)
2019-02-25 22:11:06.444 - stdout> 	at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
2019-02-25 22:11:06.444 - stdout> 	at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
2019-02-25 22:11:06.444 - stdout> 	at py4j.Gateway.invoke(Gateway.java:282)
2019-02-25 22:11:06.444 - stdout> 	at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
2019-02-25 22:11:06.444 - stdout> 	at py4j.commands.CallCommand.execute(CallCommand.java:79)
2019-02-25 22:11:06.444 - stdout> 	at py4j.GatewayConnection.run(GatewayConnection.java:238)
2019-02-25 22:11:06.444 - stdout> 	at java.lang.Thread.run(Thread.java:748)
2019-02-25 22:11:06.444 - stdout> NestedThrowablesStackTrace:
2019-02-25 22:11:06.444 - stdout> Error opening the Meta-Data file "jar:file:/tmp/test-spark/spark-2.4.0/jars/hive-metastore-1.2.1.spark2.jar!/package.jdo"
2019-02-25 22:11:06.444 - stdout> org.datanucleus.exceptions.NucleusException: Error opening the Meta-Data file "jar:file:/tmp/test-spark/spark-2.4.0/jars/hive-metastore-1.2.1.spark2.jar!/package.jdo"
2019-02-25 22:11:06.444 - stdout> 	at org.datanucleus.metadata.xml.MetaDataParser.parseMetaDataURL(MetaDataParser.java:145)
2019-02-25 22:11:06.444 - stdout> 	at org.datanucleus.api.jdo.metadata.JDOMetaDataManager.parseFile(JDOMetaDataManager.java:240)
2019-02-25 22:11:06.445 - stdout> 	at org.datanucleus.api.jdo.metadata.JDOMetaDataManager.loadXMLMetaDataForClass(JDOMetaDataManager.java:773)
2019-02-25 22:11:06.445 - stdout> 	at org.datanucleus.api.jdo.metadata.JDOMetaDataManager.getMetaDataForClassInternal(JDOMetaDataManager.java:383)
2019-02-25 22:11:06.445 - stdout> 	at org.datanucleus.metadata.MetaDataManager.getMetaDataForClass(MetaDataManager.java:1570)
2019-02-25 22:11:06.445 - stdout> 	at org.datanucleus.query.compiler.JavaQueryCompiler.getType(JavaQueryCompiler.java:960)
2019-02-25 22:11:06.445 - stdout> 	at org.datanucleus.query.compiler.JavaQueryCompiler.getType(JavaQueryCompiler.java:952)
2019-02-25 22:11:06.445 - stdout> 	at org.datanucleus.query.expression.PrimaryExpression.bind(PrimaryExpression.java:129)
2019-02-25 22:11:06.445 - stdout> 	at org.datanucleus.query.expression.DyadicExpression.bind(DyadicExpression.java:87)
2019-02-25 22:11:06.445 - stdout> 	at org.datanucleus.query.compiler.JavaQueryCompiler.compileFilter(JavaQueryCompiler.java:481)
2019-02-25 22:11:06.445 - stdout> 	at org.datanucleus.query.compiler.JDOQLCompiler.compile(JDOQLCompiler.java:113)
2019-02-25 22:11:06.445 - stdout> 	at org.datanucleus.store.query.AbstractJDOQLQuery.compileInternal(AbstractJDOQLQuery.java:367)
2019-02-25 22:11:06.445 - stdout> 	at org.datanucleus.store.rdbms.query.JDOQLQuery.compileInternal(JDOQLQuery.java:240)
2019-02-25 22:11:06.445 - stdout> 	at org.datanucleus.store.query.Query.executeQuery(Query.java:1744)
2019-02-25 22:11:06.445 - stdout> 	at org.datanucleus.store.query.Query.executeWithArray(Query.java:1672)
2019-02-25 22:11:06.445 - stdout> 	at org.datanucleus.store.query.Query.execute(Query.java:1654)
2019-02-25 22:11:06.445 - stdout> 	at org.datanucleus.api.jdo.JDOQuery.execute(JDOQuery.java:221)
2019-02-25 22:11:06.445 - stdout> 	at org.apache.hadoop.hive.metastore.MetaStoreDirectSql.ensureDbInit(MetaStoreDirectSql.java:183)
2019-02-25 22:11:06.445 - stdout> 	at org.apache.hadoop.hive.metastore.MetaStoreDirectSql.<init>(MetaStoreDirectSql.java:137)
2019-02-25 22:11:06.445 - stdout> 	at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:295)
2019-02-25 22:11:06.445 - stdout> 	at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)
2019-02-25 22:11:06.445 - stdout> 	at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
2019-02-25 22:11:06.445 - stdout> 	at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
2019-02-25 22:11:06.445 - stdout> 	at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)
2019-02-25 22:11:06.445 - stdout> 	at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)
2019-02-25 22:11:06.445 - stdout> 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593)
2019-02-25 22:11:06.445 - stdout> 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571)
2019-02-25 22:11:06.445 - stdout> 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:620)
2019-02-25 22:11:06.445 - stdout> 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
2019-02-25 22:11:06.445 - stdout> 	at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
2019-02-25 22:11:06.445 - stdout> 	at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
2019-02-25 22:11:06.445 - stdout> 	at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
2019-02-25 22:11:06.445 - stdout> 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
2019-02-25 22:11:06.445 - stdout> 	at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
2019-02-25 22:11:06.445 - stdout> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
2019-02-25 22:11:06.445 - stdout> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
2019-02-25 22:11:06.445 - stdout> 	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
2019-02-25 22:11:06.445 - stdout> 	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
2019-02-25 22:11:06.445 - stdout> 	at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
2019-02-25 22:11:06.445 - stdout> 	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
2019-02-25 22:11:06.445 - stdout> 	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
2019-02-25 22:11:06.445 - stdout> 	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
2019-02-25 22:11:06.445 - stdout> 	at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
2019-02-25 22:11:06.445 - stdout> 	at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
2019-02-25 22:11:06.445 - stdout> 	at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1234)
2019-02-25 22:11:06.445 - stdout> 	at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:174)
2019-02-25 22:11:06.445 - stdout> 	at org.apache.hadoop.hive.ql.metadata.Hive.<clinit>(Hive.java:166)
2019-02-25 22:11:06.445 - stdout> 	at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
2019-02-25 22:11:06.445 - stdout> 	at org.apache.spark.sql.hive.client.HiveClientImpl.newState(HiveClientImpl.scala:183)
2019-02-25 22:11:06.445 - stdout> 	at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:117)
2019-02-25 22:11:06.445 - stdout> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
2019-02-25 22:11:06.445 - stdout> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
2019-02-25 22:11:06.445 - stdout> 	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
2019-02-25 22:11:06.445 - stdout> 	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
2019-02-25 22:11:06.445 - stdout> 	at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:272)
2019-02-25 22:11:06.445 - stdout> 	at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:384)
2019-02-25 22:11:06.446 - stdout> 	at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:286)
2019-02-25 22:11:06.446 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:66)
2019-02-25 22:11:06.446 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:65)
2019-02-25 22:11:06.446 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply$mcZ$sp(HiveExternalCatalog.scala:215)
2019-02-25 22:11:06.446 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:215)
2019-02-25 22:11:06.446 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:215)
2019-02-25 22:11:06.446 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:97)
2019-02-25 22:11:06.446 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:214)
2019-02-25 22:11:06.446 - stdout> 	at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:114)
2019-02-25 22:11:06.446 - stdout> 	at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:102)
2019-02-25 22:11:06.446 - stdout> 	at org.apache.spark.sql.hive.HiveSessionStateBuilder.org$apache$spark$sql$hive$HiveSessionStateBuilder$$externalCatalog(HiveSessionStateBuilder.scala:39)
2019-02-25 22:11:06.446 - stdout> 	at org.apache.spark.sql.hive.HiveSessionStateBuilder$$anonfun$1.apply(HiveSessionStateBuilder.scala:54)
2019-02-25 22:11:06.446 - stdout> 	at org.apache.spark.sql.hive.HiveSessionStateBuilder$$anonfun$1.apply(HiveSessionStateBuilder.scala:54)
2019-02-25 22:11:06.446 - stdout> 	at org.apache.spark.sql.catalyst.catalog.SessionCatalog.externalCatalog$lzycompute(SessionCatalog.scala:90)
2019-02-25 22:11:06.446 - stdout> 	at org.apache.spark.sql.catalyst.catalog.SessionCatalog.externalCatalog(SessionCatalog.scala:90)
2019-02-25 22:11:06.446 - stdout> 	at org.apache.spark.sql.catalyst.catalog.SessionCatalog.tableExists(SessionCatalog.scala:415)
2019-02-25 22:11:06.446 - stdout> 	at org.apache.spark.sql.execution.command.CreateDataSourceTableAsSelectCommand.run(createDataSourceTables.scala:154)
2019-02-25 22:11:06.446 - stdout> 	at org.apache.spark.sql.execution.command.DataWritingCommandExec.sideEffectResult$lzycompute(commands.scala:104)
2019-02-25 22:11:06.446 - stdout> 	at org.apache.spark.sql.execution.command.DataWritingCommandExec.sideEffectResult(commands.scala:102)
2019-02-25 22:11:06.446 - stdout> 	at org.apache.spark.sql.execution.command.DataWritingCommandExec.executeCollect(commands.scala:115)
2019-02-25 22:11:06.446 - stdout> 	at org.apache.spark.sql.Dataset$$anonfun$6.apply(Dataset.scala:195)
2019-02-25 22:11:06.446 - stdout> 	at org.apache.spark.sql.Dataset$$anonfun$6.apply(Dataset.scala:195)
2019-02-25 22:11:06.446 - stdout> 	at org.apache.spark.sql.Dataset$$anonfun$53.apply(Dataset.scala:3365)
2019-02-25 22:11:06.446 - stdout> 	at org.apache.spark.sql.execution.SQLExecution$$anonfun$withNewExecutionId$1.apply(SQLExecution.scala:78)
2019-02-25 22:11:06.446 - stdout> 	at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:125)
2019-02-25 22:11:06.446 - stdout> 	at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:73)
2019-02-25 22:11:06.446 - stdout> 	at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3364)
2019-02-25 22:11:06.446 - stdout> 	at org.apache.spark.sql.Dataset.<init>(Dataset.scala:195)
2019-02-25 22:11:06.446 - stdout> 	at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:80)
2019-02-25 22:11:06.446 - stdout> 	at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:642)
2019-02-25 22:11:06.446 - stdout> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
2019-02-25 22:11:06.446 - stdout> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
2019-02-25 22:11:06.446 - stdout> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2019-02-25 22:11:06.446 - stdout> 	at java.lang.reflect.Method.invoke(Method.java:498)
2019-02-25 22:11:06.446 - stdout> 	at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
2019-02-25 22:11:06.446 - stdout> 	at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
2019-02-25 22:11:06.446 - stdout> 	at py4j.Gateway.invoke(Gateway.java:282)
2019-02-25 22:11:06.446 - stdout> 	at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
2019-02-25 22:11:06.446 - stdout> 	at py4j.commands.CallCommand.execute(CallCommand.java:79)
2019-02-25 22:11:06.446 - stdout> 	at py4j.GatewayConnection.run(GatewayConnection.java:238)
2019-02-25 22:11:06.446 - stdout> 	at java.lang.Thread.run(Thread.java:748)
2019-02-25 22:11:06.447 - stdout> 2019-02-25 22:11:06 INFO  ObjectStore:272 - Initialized ObjectStore
2019-02-25 22:11:06.517 - stdout> 2019-02-25 22:11:06 ERROR MetaData:115 - Error opening the Meta-Data file "jar:file:/tmp/test-spark/spark-2.4.0/jars/hive-metastore-1.2.1.spark2.jar!/package.jdo"
2019-02-25 22:11:06.519 - stdout> 2019-02-25 22:11:06 WARN  HiveMetaStore:622 - Retrying creating default database after error: Error opening the Meta-Data file "jar:file:/tmp/test-spark/spark-2.4.0/jars/hive-metastore-1.2.1.spark2.jar!/package.jdo"
2019-02-25 22:11:06.519 - stdout> javax.jdo.JDOException: Error opening the Meta-Data file "jar:file:/tmp/test-spark/spark-2.4.0/jars/hive-metastore-1.2.1.spark2.jar!/package.jdo"
2019-02-25 22:11:06.519 - stdout> 	at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:600)
2019-02-25 22:11:06.519 - stdout> 	at org.datanucleus.api.jdo.JDOQuery.execute(JDOQuery.java:230)
2019-02-25 22:11:06.519 - stdout> 	at org.apache.hadoop.hive.metastore.ObjectStore.getMSchemaVersion(ObjectStore.java:6721)
2019-02-25 22:11:06.519 - stdout> 	at org.apache.hadoop.hive.metastore.ObjectStore.getMetaStoreSchemaVersion(ObjectStore.java:6703)
2019-02-25 22:11:06.519 - stdout> 	at org.apache.hadoop.hive.metastore.ObjectStore.checkSchema(ObjectStore.java:6661)
2019-02-25 22:11:06.519 - stdout> 	at org.apache.hadoop.hive.metastore.ObjectStore.verifySchema(ObjectStore.java:6645)
2019-02-25 22:11:06.519 - stdout> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
2019-02-25 22:11:06.519 - stdout> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
2019-02-25 22:11:06.519 - stdout> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2019-02-25 22:11:06.519 - stdout> 	at java.lang.reflect.Method.invoke(Method.java:498)
2019-02-25 22:11:06.519 - stdout> 	at org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:114)
2019-02-25 22:11:06.519 - stdout> 	at com.sun.proxy.$Proxy19.verifySchema(Unknown Source)
2019-02-25 22:11:06.52 - stdout> 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:572)
2019-02-25 22:11:06.52 - stdout> 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:620)
2019-02-25 22:11:06.52 - stdout> 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
2019-02-25 22:11:06.52 - stdout> 	at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
2019-02-25 22:11:06.52 - stdout> 	at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
2019-02-25 22:11:06.52 - stdout> 	at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
2019-02-25 22:11:06.52 - stdout> 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
2019-02-25 22:11:06.52 - stdout> 	at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
2019-02-25 22:11:06.52 - stdout> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
2019-02-25 22:11:06.52 - stdout> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
2019-02-25 22:11:06.52 - stdout> 	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
2019-02-25 22:11:06.52 - stdout> 	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
2019-02-25 22:11:06.52 - stdout> 	at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
2019-02-25 22:11:06.52 - stdout> 	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
2019-02-25 22:11:06.52 - stdout> 	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
2019-02-25 22:11:06.52 - stdout> 	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
2019-02-25 22:11:06.52 - stdout> 	at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
2019-02-25 22:11:06.52 - stdout> 	at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
2019-02-25 22:11:06.52 - stdout> 	at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1234)
2019-02-25 22:11:06.52 - stdout> 	at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:174)
2019-02-25 22:11:06.52 - stdout> 	at org.apache.hadoop.hive.ql.metadata.Hive.<clinit>(Hive.java:166)
2019-02-25 22:11:06.52 - stdout> 	at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
2019-02-25 22:11:06.52 - stdout> 	at org.apache.spark.sql.hive.client.HiveClientImpl.newState(HiveClientImpl.scala:183)
2019-02-25 22:11:06.52 - stdout> 	at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:117)
2019-02-25 22:11:06.52 - stdout> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
2019-02-25 22:11:06.52 - stdout> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
2019-02-25 22:11:06.52 - stdout> 	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
2019-02-25 22:11:06.52 - stdout> 	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
2019-02-25 22:11:06.52 - stdout> 	at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:272)
2019-02-25 22:11:06.52 - stdout> 	at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:384)
2019-02-25 22:11:06.52 - stdout> 	at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:286)
2019-02-25 22:11:06.52 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:66)
2019-02-25 22:11:06.52 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:65)
2019-02-25 22:11:06.52 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply$mcZ$sp(HiveExternalCatalog.scala:215)
2019-02-25 22:11:06.52 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:215)
2019-02-25 22:11:06.52 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:215)
2019-02-25 22:11:06.52 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:97)
2019-02-25 22:11:06.52 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:214)
2019-02-25 22:11:06.52 - stdout> 	at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:114)
2019-02-25 22:11:06.52 - stdout> 	at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:102)
2019-02-25 22:11:06.52 - stdout> 	at org.apache.spark.sql.hive.HiveSessionStateBuilder.org$apache$spark$sql$hive$HiveSessionStateBuilder$$externalCatalog(HiveSessionStateBuilder.scala:39)
2019-02-25 22:11:06.52 - stdout> 	at org.apache.spark.sql.hive.HiveSessionStateBuilder$$anonfun$1.apply(HiveSessionStateBuilder.scala:54)
2019-02-25 22:11:06.52 - stdout> 	at org.apache.spark.sql.hive.HiveSessionStateBuilder$$anonfun$1.apply(HiveSessionStateBuilder.scala:54)
2019-02-25 22:11:06.52 - stdout> 	at org.apache.spark.sql.catalyst.catalog.SessionCatalog.externalCatalog$lzycompute(SessionCatalog.scala:90)
2019-02-25 22:11:06.52 - stdout> 	at org.apache.spark.sql.catalyst.catalog.SessionCatalog.externalCatalog(SessionCatalog.scala:90)
2019-02-25 22:11:06.52 - stdout> 	at org.apache.spark.sql.catalyst.catalog.SessionCatalog.tableExists(SessionCatalog.scala:415)
2019-02-25 22:11:06.52 - stdout> 	at org.apache.spark.sql.execution.command.CreateDataSourceTableAsSelectCommand.run(createDataSourceTables.scala:154)
2019-02-25 22:11:06.52 - stdout> 	at org.apache.spark.sql.execution.command.DataWritingCommandExec.sideEffectResult$lzycompute(commands.scala:104)
2019-02-25 22:11:06.52 - stdout> 	at org.apache.spark.sql.execution.command.DataWritingCommandExec.sideEffectResult(commands.scala:102)
2019-02-25 22:11:06.52 - stdout> 	at org.apache.spark.sql.execution.command.DataWritingCommandExec.executeCollect(commands.scala:115)
2019-02-25 22:11:06.52 - stdout> 	at org.apache.spark.sql.Dataset$$anonfun$6.apply(Dataset.scala:195)
2019-02-25 22:11:06.52 - stdout> 	at org.apache.spark.sql.Dataset$$anonfun$6.apply(Dataset.scala:195)
2019-02-25 22:11:06.52 - stdout> 	at org.apache.spark.sql.Dataset$$anonfun$53.apply(Dataset.scala:3365)
2019-02-25 22:11:06.52 - stdout> 	at org.apache.spark.sql.execution.SQLExecution$$anonfun$withNewExecutionId$1.apply(SQLExecution.scala:78)
2019-02-25 22:11:06.52 - stdout> 	at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:125)
2019-02-25 22:11:06.52 - stdout> 	at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:73)
2019-02-25 22:11:06.52 - stdout> 	at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3364)
2019-02-25 22:11:06.52 - stdout> 	at org.apache.spark.sql.Dataset.<init>(Dataset.scala:195)
2019-02-25 22:11:06.52 - stdout> 	at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:80)
2019-02-25 22:11:06.52 - stdout> 	at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:642)
2019-02-25 22:11:06.52 - stdout> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
2019-02-25 22:11:06.52 - stdout> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
2019-02-25 22:11:06.52 - stdout> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2019-02-25 22:11:06.52 - stdout> 	at java.lang.reflect.Method.invoke(Method.java:498)
2019-02-25 22:11:06.52 - stdout> 	at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
2019-02-25 22:11:06.52 - stdout> 	at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
2019-02-25 22:11:06.52 - stdout> 	at py4j.Gateway.invoke(Gateway.java:282)
2019-02-25 22:11:06.52 - stdout> 	at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
2019-02-25 22:11:06.52 - stdout> 	at py4j.commands.CallCommand.execute(CallCommand.java:79)
2019-02-25 22:11:06.52 - stdout> 	at py4j.GatewayConnection.run(GatewayConnection.java:238)
2019-02-25 22:11:06.52 - stdout> 	at java.lang.Thread.run(Thread.java:748)
2019-02-25 22:11:06.52 - stdout> NestedThrowablesStackTrace:
2019-02-25 22:11:06.52 - stdout> Error opening the Meta-Data file "jar:file:/tmp/test-spark/spark-2.4.0/jars/hive-metastore-1.2.1.spark2.jar!/package.jdo"
2019-02-25 22:11:06.52 - stdout> org.datanucleus.exceptions.NucleusException: Error opening the Meta-Data file "jar:file:/tmp/test-spark/spark-2.4.0/jars/hive-metastore-1.2.1.spark2.jar!/package.jdo"
2019-02-25 22:11:06.52 - stdout> 	at org.datanucleus.metadata.xml.MetaDataParser.parseMetaDataURL(MetaDataParser.java:145)
2019-02-25 22:11:06.52 - stdout> 	at org.datanucleus.api.jdo.metadata.JDOMetaDataManager.parseFile(JDOMetaDataManager.java:240)
2019-02-25 22:11:06.52 - stdout> 	at org.datanucleus.api.jdo.metadata.JDOMetaDataManager.loadXMLMetaDataForClass(JDOMetaDataManager.java:773)
2019-02-25 22:11:06.521 - stdout> 	at org.datanucleus.api.jdo.metadata.JDOMetaDataManager.getMetaDataForClassInternal(JDOMetaDataManager.java:383)
2019-02-25 22:11:06.521 - stdout> 	at org.datanucleus.metadata.MetaDataManager.getMetaDataForClass(MetaDataManager.java:1570)
2019-02-25 22:11:06.521 - stdout> 	at org.datanucleus.ExecutionContextImpl.hasPersistenceInformationForClass(ExecutionContextImpl.java:5768)
2019-02-25 22:11:06.521 - stdout> 	at org.datanucleus.store.rdbms.query.JDOQLQuery.compileInternal(JDOQLQuery.java:258)
2019-02-25 22:11:06.521 - stdout> 	at org.datanucleus.store.query.Query.executeQuery(Query.java:1744)
2019-02-25 22:11:06.521 - stdout> 	at org.datanucleus.store.query.Query.executeWithArray(Query.java:1672)
2019-02-25 22:11:06.521 - stdout> 	at org.datanucleus.store.query.Query.execute(Query.java:1654)
2019-02-25 22:11:06.521 - stdout> 	at org.datanucleus.api.jdo.JDOQuery.execute(JDOQuery.java:221)
2019-02-25 22:11:06.521 - stdout> 	at org.apache.hadoop.hive.metastore.ObjectStore.getMSchemaVersion(ObjectStore.java:6721)
2019-02-25 22:11:06.521 - stdout> 	at org.apache.hadoop.hive.metastore.ObjectStore.getMetaStoreSchemaVersion(ObjectStore.java:6703)
2019-02-25 22:11:06.521 - stdout> 	at org.apache.hadoop.hive.metastore.ObjectStore.checkSchema(ObjectStore.java:6661)
2019-02-25 22:11:06.521 - stdout> 	at org.apache.hadoop.hive.metastore.ObjectStore.verifySchema(ObjectStore.java:6645)
2019-02-25 22:11:06.521 - stdout> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
2019-02-25 22:11:06.521 - stdout> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
2019-02-25 22:11:06.521 - stdout> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2019-02-25 22:11:06.521 - stdout> 	at java.lang.reflect.Method.invoke(Method.java:498)
2019-02-25 22:11:06.521 - stdout> 	at org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:114)
2019-02-25 22:11:06.521 - stdout> 	at com.sun.proxy.$Proxy19.verifySchema(Unknown Source)
2019-02-25 22:11:06.521 - stdout> 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:572)
2019-02-25 22:11:06.521 - stdout> 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:620)
2019-02-25 22:11:06.521 - stdout> 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
2019-02-25 22:11:06.521 - stdout> 	at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
2019-02-25 22:11:06.521 - stdout> 	at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
2019-02-25 22:11:06.521 - stdout> 	at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
2019-02-25 22:11:06.521 - stdout> 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
2019-02-25 22:11:06.521 - stdout> 	at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
2019-02-25 22:11:06.521 - stdout> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
2019-02-25 22:11:06.521 - stdout> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
2019-02-25 22:11:06.521 - stdout> 	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
2019-02-25 22:11:06.521 - stdout> 	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
2019-02-25 22:11:06.521 - stdout> 	at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
2019-02-25 22:11:06.521 - stdout> 	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
2019-02-25 22:11:06.521 - stdout> 	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
2019-02-25 22:11:06.521 - stdout> 	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
2019-02-25 22:11:06.521 - stdout> 	at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
2019-02-25 22:11:06.521 - stdout> 	at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
2019-02-25 22:11:06.521 - stdout> 	at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1234)
2019-02-25 22:11:06.521 - stdout> 	at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:174)
2019-02-25 22:11:06.521 - stdout> 	at org.apache.hadoop.hive.ql.metadata.Hive.<clinit>(Hive.java:166)
2019-02-25 22:11:06.521 - stdout> 	at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
2019-02-25 22:11:06.521 - stdout> 	at org.apache.spark.sql.hive.client.HiveClientImpl.newState(HiveClientImpl.scala:183)
2019-02-25 22:11:06.521 - stdout> 	at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:117)
2019-02-25 22:11:06.521 - stdout> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
2019-02-25 22:11:06.521 - stdout> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
2019-02-25 22:11:06.521 - stdout> 	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
2019-02-25 22:11:06.521 - stdout> 	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
2019-02-25 22:11:06.521 - stdout> 	at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:272)
2019-02-25 22:11:06.521 - stdout> 	at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:384)
2019-02-25 22:11:06.521 - stdout> 	at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:286)
2019-02-25 22:11:06.521 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:66)
2019-02-25 22:11:06.521 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:65)
2019-02-25 22:11:06.521 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply$mcZ$sp(HiveExternalCatalog.scala:215)
2019-02-25 22:11:06.521 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:215)
2019-02-25 22:11:06.521 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:215)
2019-02-25 22:11:06.521 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:97)
2019-02-25 22:11:06.521 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:214)
2019-02-25 22:11:06.521 - stdout> 	at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:114)
2019-02-25 22:11:06.521 - stdout> 	at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:102)
2019-02-25 22:11:06.521 - stdout> 	at org.apache.spark.sql.hive.HiveSessionStateBuilder.org$apache$spark$sql$hive$HiveSessionStateBuilder$$externalCatalog(HiveSessionStateBuilder.scala:39)
2019-02-25 22:11:06.521 - stdout> 	at org.apache.spark.sql.hive.HiveSessionStateBuilder$$anonfun$1.apply(HiveSessionStateBuilder.scala:54)
2019-02-25 22:11:06.521 - stdout> 	at org.apache.spark.sql.hive.HiveSessionStateBuilder$$anonfun$1.apply(HiveSessionStateBuilder.scala:54)
2019-02-25 22:11:06.521 - stdout> 	at org.apache.spark.sql.catalyst.catalog.SessionCatalog.externalCatalog$lzycompute(SessionCatalog.scala:90)
2019-02-25 22:11:06.521 - stdout> 	at org.apache.spark.sql.catalyst.catalog.SessionCatalog.externalCatalog(SessionCatalog.scala:90)
2019-02-25 22:11:06.522 - stdout> 	at org.apache.spark.sql.catalyst.catalog.SessionCatalog.tableExists(SessionCatalog.scala:415)
2019-02-25 22:11:06.522 - stdout> 	at org.apache.spark.sql.execution.command.CreateDataSourceTableAsSelectCommand.run(createDataSourceTables.scala:154)
2019-02-25 22:11:06.522 - stdout> 	at org.apache.spark.sql.execution.command.DataWritingCommandExec.sideEffectResult$lzycompute(commands.scala:104)
2019-02-25 22:11:06.522 - stdout> 	at org.apache.spark.sql.execution.command.DataWritingCommandExec.sideEffectResult(commands.scala:102)
2019-02-25 22:11:06.522 - stdout> 	at org.apache.spark.sql.execution.command.DataWritingCommandExec.executeCollect(commands.scala:115)
2019-02-25 22:11:06.522 - stdout> 	at org.apache.spark.sql.Dataset$$anonfun$6.apply(Dataset.scala:195)
2019-02-25 22:11:06.522 - stdout> 	at org.apache.spark.sql.Dataset$$anonfun$6.apply(Dataset.scala:195)
2019-02-25 22:11:06.522 - stdout> 	at org.apache.spark.sql.Dataset$$anonfun$53.apply(Dataset.scala:3365)
2019-02-25 22:11:06.522 - stdout> 	at org.apache.spark.sql.execution.SQLExecution$$anonfun$withNewExecutionId$1.apply(SQLExecution.scala:78)
2019-02-25 22:11:06.522 - stdout> 	at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:125)
2019-02-25 22:11:06.522 - stdout> 	at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:73)
2019-02-25 22:11:06.522 - stdout> 	at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3364)
2019-02-25 22:11:06.522 - stdout> 	at org.apache.spark.sql.Dataset.<init>(Dataset.scala:195)
2019-02-25 22:11:06.522 - stdout> 	at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:80)
2019-02-25 22:11:06.522 - stdout> 	at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:642)
2019-02-25 22:11:06.522 - stdout> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
2019-02-25 22:11:06.522 - stdout> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
2019-02-25 22:11:06.522 - stdout> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2019-02-25 22:11:06.522 - stdout> 	at java.lang.reflect.Method.invoke(Method.java:498)
2019-02-25 22:11:06.522 - stdout> 	at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
2019-02-25 22:11:06.522 - stdout> 	at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
2019-02-25 22:11:06.522 - stdout> 	at py4j.Gateway.invoke(Gateway.java:282)
2019-02-25 22:11:06.522 - stdout> 	at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
2019-02-25 22:11:06.522 - stdout> 	at py4j.commands.CallCommand.execute(CallCommand.java:79)
2019-02-25 22:11:06.522 - stdout> 	at py4j.GatewayConnection.run(GatewayConnection.java:238)
2019-02-25 22:11:06.522 - stdout> 	at java.lang.Thread.run(Thread.java:748)
2019-02-25 22:11:06.522 - stdout> 2019-02-25 22:11:06 INFO  HiveMetaStore:589 - 0: Opening raw store with implementation class:org.apache.hadoop.hive.metastore.ObjectStore
2019-02-25 22:11:06.523 - stdout> 2019-02-25 22:11:06 INFO  ObjectStore:289 - ObjectStore, initialize called
2019-02-25 22:11:06.529 - stdout> 2019-02-25 22:11:06 ERROR MetaData:115 - Error opening the Meta-Data file "jar:file:/tmp/test-spark/spark-2.4.0/jars/hive-metastore-1.2.1.spark2.jar!/package.jdo"
2019-02-25 22:11:06.531 - stdout> 2019-02-25 22:11:06 WARN  MetaStoreDirectSql:188 - Database initialization failed; direct SQL is disabled
2019-02-25 22:11:06.531 - stdout> javax.jdo.JDOException: Error opening the Meta-Data file "jar:file:/tmp/test-spark/spark-2.4.0/jars/hive-metastore-1.2.1.spark2.jar!/package.jdo"
2019-02-25 22:11:06.531 - stdout> 	at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:600)
2019-02-25 22:11:06.531 - stdout> 	at org.datanucleus.api.jdo.JDOQuery.execute(JDOQuery.java:230)
2019-02-25 22:11:06.531 - stdout> 	at org.apache.hadoop.hive.metastore.MetaStoreDirectSql.ensureDbInit(MetaStoreDirectSql.java:183)
2019-02-25 22:11:06.531 - stdout> 	at org.apache.hadoop.hive.metastore.MetaStoreDirectSql.<init>(MetaStoreDirectSql.java:137)
2019-02-25 22:11:06.531 - stdout> 	at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:295)
2019-02-25 22:11:06.531 - stdout> 	at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)
2019-02-25 22:11:06.531 - stdout> 	at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
2019-02-25 22:11:06.531 - stdout> 	at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
2019-02-25 22:11:06.531 - stdout> 	at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)
2019-02-25 22:11:06.531 - stdout> 	at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)
2019-02-25 22:11:06.531 - stdout> 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593)
2019-02-25 22:11:06.531 - stdout> 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571)
2019-02-25 22:11:06.531 - stdout> 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:624)
2019-02-25 22:11:06.531 - stdout> 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
2019-02-25 22:11:06.531 - stdout> 	at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
2019-02-25 22:11:06.531 - stdout> 	at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
2019-02-25 22:11:06.531 - stdout> 	at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
2019-02-25 22:11:06.531 - stdout> 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
2019-02-25 22:11:06.531 - stdout> 	at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
2019-02-25 22:11:06.531 - stdout> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
2019-02-25 22:11:06.531 - stdout> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
2019-02-25 22:11:06.531 - stdout> 	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
2019-02-25 22:11:06.531 - stdout> 	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
2019-02-25 22:11:06.531 - stdout> 	at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
2019-02-25 22:11:06.531 - stdout> 	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
2019-02-25 22:11:06.531 - stdout> 	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
2019-02-25 22:11:06.531 - stdout> 	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
2019-02-25 22:11:06.531 - stdout> 	at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
2019-02-25 22:11:06.531 - stdout> 	at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
2019-02-25 22:11:06.531 - stdout> 	at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1234)
2019-02-25 22:11:06.531 - stdout> 	at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:174)
2019-02-25 22:11:06.531 - stdout> 	at org.apache.hadoop.hive.ql.metadata.Hive.<clinit>(Hive.java:166)
2019-02-25 22:11:06.531 - stdout> 	at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
2019-02-25 22:11:06.531 - stdout> 	at org.apache.spark.sql.hive.client.HiveClientImpl.newState(HiveClientImpl.scala:183)
2019-02-25 22:11:06.531 - stdout> 	at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:117)
2019-02-25 22:11:06.531 - stdout> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
2019-02-25 22:11:06.531 - stdout> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
2019-02-25 22:11:06.531 - stdout> 	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
2019-02-25 22:11:06.531 - stdout> 	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
2019-02-25 22:11:06.531 - stdout> 	at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:272)
2019-02-25 22:11:06.531 - stdout> 	at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:384)
2019-02-25 22:11:06.531 - stdout> 	at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:286)
2019-02-25 22:11:06.531 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:66)
2019-02-25 22:11:06.531 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:65)
2019-02-25 22:11:06.531 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply$mcZ$sp(HiveExternalCatalog.scala:215)
2019-02-25 22:11:06.531 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:215)
2019-02-25 22:11:06.532 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:215)
2019-02-25 22:11:06.532 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:97)
2019-02-25 22:11:06.532 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:214)
2019-02-25 22:11:06.532 - stdout> 	at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:114)
2019-02-25 22:11:06.532 - stdout> 	at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:102)
2019-02-25 22:11:06.532 - stdout> 	at org.apache.spark.sql.hive.HiveSessionStateBuilder.org$apache$spark$sql$hive$HiveSessionStateBuilder$$externalCatalog(HiveSessionStateBuilder.scala:39)
2019-02-25 22:11:06.532 - stdout> 	at org.apache.spark.sql.hive.HiveSessionStateBuilder$$anonfun$1.apply(HiveSessionStateBuilder.scala:54)
2019-02-25 22:11:06.532 - stdout> 	at org.apache.spark.sql.hive.HiveSessionStateBuilder$$anonfun$1.apply(HiveSessionStateBuilder.scala:54)
2019-02-25 22:11:06.532 - stdout> 	at org.apache.spark.sql.catalyst.catalog.SessionCatalog.externalCatalog$lzycompute(SessionCatalog.scala:90)
2019-02-25 22:11:06.532 - stdout> 	at org.apache.spark.sql.catalyst.catalog.SessionCatalog.externalCatalog(SessionCatalog.scala:90)
2019-02-25 22:11:06.532 - stdout> 	at org.apache.spark.sql.catalyst.catalog.SessionCatalog.tableExists(SessionCatalog.scala:415)
2019-02-25 22:11:06.532 - stdout> 	at org.apache.spark.sql.execution.command.CreateDataSourceTableAsSelectCommand.run(createDataSourceTables.scala:154)
2019-02-25 22:11:06.532 - stdout> 	at org.apache.spark.sql.execution.command.DataWritingCommandExec.sideEffectResult$lzycompute(commands.scala:104)
2019-02-25 22:11:06.532 - stdout> 	at org.apache.spark.sql.execution.command.DataWritingCommandExec.sideEffectResult(commands.scala:102)
2019-02-25 22:11:06.532 - stdout> 	at org.apache.spark.sql.execution.command.DataWritingCommandExec.executeCollect(commands.scala:115)
2019-02-25 22:11:06.532 - stdout> 	at org.apache.spark.sql.Dataset$$anonfun$6.apply(Dataset.scala:195)
2019-02-25 22:11:06.532 - stdout> 	at org.apache.spark.sql.Dataset$$anonfun$6.apply(Dataset.scala:195)
2019-02-25 22:11:06.532 - stdout> 	at org.apache.spark.sql.Dataset$$anonfun$53.apply(Dataset.scala:3365)
2019-02-25 22:11:06.532 - stdout> 	at org.apache.spark.sql.execution.SQLExecution$$anonfun$withNewExecutionId$1.apply(SQLExecution.scala:78)
2019-02-25 22:11:06.532 - stdout> 	at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:125)
2019-02-25 22:11:06.532 - stdout> 	at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:73)
2019-02-25 22:11:06.532 - stdout> 	at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3364)
2019-02-25 22:11:06.532 - stdout> 	at org.apache.spark.sql.Dataset.<init>(Dataset.scala:195)
2019-02-25 22:11:06.532 - stdout> 	at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:80)
2019-02-25 22:11:06.532 - stdout> 	at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:642)
2019-02-25 22:11:06.532 - stdout> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
2019-02-25 22:11:06.532 - stdout> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
2019-02-25 22:11:06.532 - stdout> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2019-02-25 22:11:06.532 - stdout> 	at java.lang.reflect.Method.invoke(Method.java:498)
2019-02-25 22:11:06.532 - stdout> 	at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
2019-02-25 22:11:06.532 - stdout> 	at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
2019-02-25 22:11:06.532 - stdout> 	at py4j.Gateway.invoke(Gateway.java:282)
2019-02-25 22:11:06.532 - stdout> 	at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
2019-02-25 22:11:06.532 - stdout> 	at py4j.commands.CallCommand.execute(CallCommand.java:79)
2019-02-25 22:11:06.532 - stdout> 	at py4j.GatewayConnection.run(GatewayConnection.java:238)
2019-02-25 22:11:06.532 - stdout> 	at java.lang.Thread.run(Thread.java:748)
2019-02-25 22:11:06.532 - stdout> NestedThrowablesStackTrace:
2019-02-25 22:11:06.532 - stdout> Error opening the Meta-Data file "jar:file:/tmp/test-spark/spark-2.4.0/jars/hive-metastore-1.2.1.spark2.jar!/package.jdo"
2019-02-25 22:11:06.532 - stdout> org.datanucleus.exceptions.NucleusException: Error opening the Meta-Data file "jar:file:/tmp/test-spark/spark-2.4.0/jars/hive-metastore-1.2.1.spark2.jar!/package.jdo"
2019-02-25 22:11:06.532 - stdout> 	at org.datanucleus.metadata.xml.MetaDataParser.parseMetaDataURL(MetaDataParser.java:145)
2019-02-25 22:11:06.532 - stdout> 	at org.datanucleus.api.jdo.metadata.JDOMetaDataManager.parseFile(JDOMetaDataManager.java:240)
2019-02-25 22:11:06.532 - stdout> 	at org.datanucleus.api.jdo.metadata.JDOMetaDataManager.loadXMLMetaDataForClass(JDOMetaDataManager.java:773)
2019-02-25 22:11:06.532 - stdout> 	at org.datanucleus.api.jdo.metadata.JDOMetaDataManager.getMetaDataForClassInternal(JDOMetaDataManager.java:383)
2019-02-25 22:11:06.532 - stdout> 	at org.datanucleus.metadata.MetaDataManager.getMetaDataForClass(MetaDataManager.java:1570)
2019-02-25 22:11:06.532 - stdout> 	at org.datanucleus.query.compiler.JavaQueryCompiler.getType(JavaQueryCompiler.java:960)
2019-02-25 22:11:06.532 - stdout> 	at org.datanucleus.query.compiler.JavaQueryCompiler.getType(JavaQueryCompiler.java:952)
2019-02-25 22:11:06.532 - stdout> 	at org.datanucleus.query.expression.PrimaryExpression.bind(PrimaryExpression.java:129)
2019-02-25 22:11:06.532 - stdout> 	at org.datanucleus.query.expression.DyadicExpression.bind(DyadicExpression.java:87)
2019-02-25 22:11:06.532 - stdout> 	at org.datanucleus.query.compiler.JavaQueryCompiler.compileFilter(JavaQueryCompiler.java:481)
2019-02-25 22:11:06.532 - stdout> 	at org.datanucleus.query.compiler.JDOQLCompiler.compile(JDOQLCompiler.java:113)
2019-02-25 22:11:06.532 - stdout> 	at org.datanucleus.store.query.AbstractJDOQLQuery.compileInternal(AbstractJDOQLQuery.java:367)
2019-02-25 22:11:06.532 - stdout> 	at org.datanucleus.store.rdbms.query.JDOQLQuery.compileInternal(JDOQLQuery.java:240)
2019-02-25 22:11:06.532 - stdout> 	at org.datanucleus.store.query.Query.executeQuery(Query.java:1744)
2019-02-25 22:11:06.532 - stdout> 	at org.datanucleus.store.query.Query.executeWithArray(Query.java:1672)
2019-02-25 22:11:06.532 - stdout> 	at org.datanucleus.store.query.Query.execute(Query.java:1654)
2019-02-25 22:11:06.532 - stdout> 	at org.datanucleus.api.jdo.JDOQuery.execute(JDOQuery.java:221)
2019-02-25 22:11:06.532 - stdout> 	at org.apache.hadoop.hive.metastore.MetaStoreDirectSql.ensureDbInit(MetaStoreDirectSql.java:183)
2019-02-25 22:11:06.533 - stdout> 	at org.apache.hadoop.hive.metastore.MetaStoreDirectSql.<init>(MetaStoreDirectSql.java:137)
2019-02-25 22:11:06.533 - stdout> 	at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:295)
2019-02-25 22:11:06.533 - stdout> 	at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)
2019-02-25 22:11:06.533 - stdout> 	at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
2019-02-25 22:11:06.533 - stdout> 	at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
2019-02-25 22:11:06.533 - stdout> 	at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)
2019-02-25 22:11:06.533 - stdout> 	at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)
2019-02-25 22:11:06.533 - stdout> 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593)
2019-02-25 22:11:06.533 - stdout> 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571)
2019-02-25 22:11:06.533 - stdout> 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:624)
2019-02-25 22:11:06.533 - stdout> 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
2019-02-25 22:11:06.533 - stdout> 	at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
2019-02-25 22:11:06.533 - stdout> 	at org.apache.hadoop.hive.metastore.RetryingHMSHandler.ge