org.scalatest.exceptions.TestFailedException: spark-submit returned with exit code 1. Command line: './bin/spark-submit' '--name' 'prepare testing tables' '--master' 'local[2]' '--conf' 'spark.ui.enabled=false' '--conf' 'spark.master.rest.enabled=false' '--conf' 'spark.sql.warehouse.dir=/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6-ubuntu-test/target/tmp/warehouse-03bc9880-1483-4eaf-8bec-d05c991675f3' '--conf' 'spark.sql.test.version.index=2' '--driver-java-options' '-Dderby.system.home=/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6-ubuntu-test/target/tmp/warehouse-03bc9880-1483-4eaf-8bec-d05c991675f3' '/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6-ubuntu-test/target/tmp/test5580616031949812549.py'
2018-10-09 14:44:20.513 - stdout> 2018-10-09 14:44:20 WARN Utils:66 - Your hostname, amp-jenkins-staging-worker-01 resolves to a loopback address: 127.0.1.1; using 192.168.10.31 instead (on interface eno1)
2018-10-09 14:44:20.513 - stdout> 2018-10-09 14:44:20 WARN Utils:66 - Set SPARK_LOCAL_IP if you need to bind to another address
2018-10-09 14:44:20.965 - stdout> 2018-10-09 14:44:20 WARN NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2018-10-09 14:44:21.789 - stdout> 2018-10-09 14:44:21 INFO SparkContext:54 - Running Spark version 2.3.2
2018-10-09 14:44:21.812 - stdout> 2018-10-09 14:44:21 INFO SparkContext:54 - Submitted application: prepare testing tables
2018-10-09 14:44:21.867 - stdout> 2018-10-09 14:44:21 INFO SecurityManager:54 - Changing view acls to: jenkins
2018-10-09 14:44:21.868 - stdout> 2018-10-09 14:44:21 INFO SecurityManager:54 - Changing modify acls to: jenkins
2018-10-09 14:44:21.868 - stdout> 2018-10-09 14:44:21 INFO SecurityManager:54 - Changing view acls groups to:
2018-10-09 14:44:21.868 - stdout> 2018-10-09 14:44:21 INFO SecurityManager:54 - Changing modify acls groups to:
2018-10-09 14:44:21.868 - stdout> 2018-10-09 14:44:21 INFO SecurityManager:54 - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(jenkins); groups with view permissions: Set(); users with modify permissions: Set(jenkins); groups with modify permissions: Set()
2018-10-09 14:44:22.167 - stdout> 2018-10-09 14:44:22 INFO Utils:54 - Successfully started service 'sparkDriver' on port 43960.
2018-10-09 14:44:22.191 - stdout> 2018-10-09 14:44:22 INFO SparkEnv:54 - Registering MapOutputTracker
2018-10-09 14:44:22.215 - stdout> 2018-10-09 14:44:22 INFO SparkEnv:54 - Registering BlockManagerMaster
2018-10-09 14:44:22.219 - stdout> 2018-10-09 14:44:22 INFO BlockManagerMasterEndpoint:54 - Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
2018-10-09 14:44:22.219 - stdout> 2018-10-09 14:44:22 INFO BlockManagerMasterEndpoint:54 - BlockManagerMasterEndpoint up
2018-10-09 14:44:22.231 - stdout> 2018-10-09 14:44:22 INFO DiskBlockManager:54 - Created local directory at /tmp/blockmgr-6a31b050-f5d0-462a-9dd5-ed27e7c3df4f
2018-10-09 14:44:22.254 - stdout> 2018-10-09 14:44:22 INFO MemoryStore:54 - MemoryStore started with capacity 366.3 MB
2018-10-09 14:44:22.272 - stdout> 2018-10-09 14:44:22 INFO SparkEnv:54 - Registering OutputCommitCoordinator
2018-10-09 14:44:22.577 - stdout> 2018-10-09 14:44:22 INFO SparkContext:54 - Added file file:/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6-ubuntu-test/target/tmp/test5580616031949812549.py at file:/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6-ubuntu-test/target/tmp/test5580616031949812549.py with timestamp 1539121462576
2018-10-09 14:44:22.58 - stdout> 2018-10-09 14:44:22 INFO Utils:54 - Copying /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6-ubuntu-test/target/tmp/test5580616031949812549.py to /tmp/spark-62525601-adb6-4342-a0a6-fb2705c42266/userFiles-1c2688ab-8697-4947-ada5-c5d93f20a910/test5580616031949812549.py
2018-10-09 14:44:22.649 - stdout> 2018-10-09 14:44:22 INFO Executor:54 - Starting executor ID driver on host localhost
2018-10-09 14:44:22.666 - stdout> 2018-10-09 14:44:22 INFO Utils:54 - Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 41574.
2018-10-09 14:44:22.667 - stdout> 2018-10-09 14:44:22 INFO NettyBlockTransferService:54 - Server created on 192.168.10.31:41574
2018-10-09 14:44:22.668 - stdout> 2018-10-09 14:44:22 INFO BlockManager:54 - Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
2018-10-09 14:44:22.698 - stdout> 2018-10-09 14:44:22 INFO BlockManagerMaster:54 - Registering BlockManager BlockManagerId(driver, 192.168.10.31, 41574, None)
2018-10-09 14:44:22.702 - stdout> 2018-10-09 14:44:22 INFO BlockManagerMasterEndpoint:54 - Registering block manager 192.168.10.31:41574 with 366.3 MB RAM, BlockManagerId(driver, 192.168.10.31, 41574, None)
2018-10-09 14:44:22.705 - stdout> 2018-10-09 14:44:22 INFO BlockManagerMaster:54 - Registered BlockManager BlockManagerId(driver, 192.168.10.31, 41574, None)
2018-10-09 14:44:22.705 - stdout> 2018-10-09 14:44:22 INFO BlockManager:54 - Initialized BlockManager: BlockManagerId(driver, 192.168.10.31, 41574, None)
2018-10-09 14:44:22.89 - stdout> 2018-10-09 14:44:22 INFO log:192 - Logging initialized @3133ms
2018-10-09 14:44:23.105 - stdout> 2018-10-09 14:44:23 INFO SharedState:54 - Setting hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir ('/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6-ubuntu-test/target/tmp/warehouse-03bc9880-1483-4eaf-8bec-d05c991675f3').
2018-10-09 14:44:23.105 - stdout> 2018-10-09 14:44:23 INFO SharedState:54 - Warehouse path is '/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6-ubuntu-test/target/tmp/warehouse-03bc9880-1483-4eaf-8bec-d05c991675f3'.
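For reference, the warehouse location logged by SharedState above is injected through the '--conf' 'spark.sql.warehouse.dir=...' flag on the spark-submit command line. A minimal Scala sketch of the equivalent programmatic setup (the path here is illustrative; the suite generates a fresh temp warehouse per run):

    import org.apache.spark.sql.SparkSession

    // Equivalent session setup for the submitted "prepare testing tables" job.
    // The warehouse path is a stand-in for the generated temp directory.
    val spark = SparkSession.builder()
      .appName("prepare testing tables")
      .master("local[2]")
      .config("spark.ui.enabled", "false")
      .config("spark.sql.warehouse.dir", "/tmp/warehouse-example")
      .enableHiveSupport()
      .getOrCreate()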
2018-10-09 14:44:23.677 - stdout> 2018-10-09 14:44:23 INFO StateStoreCoordinatorRef:54 - Registered StateStoreCoordinator endpoint
2018-10-09 14:44:24.072 - stdout> 2018-10-09 14:44:24 INFO HiveUtils:54 - Initializing HiveMetastoreConnection version 1.2.1 using Spark classes.
2018-10-09 14:44:24.753 - stdout> 2018-10-09 14:44:24 INFO HiveMetaStore:589 - 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
2018-10-09 14:44:24.779 - stdout> 2018-10-09 14:44:24 INFO ObjectStore:289 - ObjectStore, initialize called
2018-10-09 14:44:24.805 - stdout> 2018-10-09 14:44:24 ERROR General:115 - ResourceBundle org.datanucleus.api.jdo.Localisation for locale en_US was not found!
2018-10-09 14:44:24.82 - stdout> 2018-10-09 14:44:24 ERROR General:115 - ResourceBundle org.datanucleus.Localisation for locale en_US was not found!
2018-10-09 14:44:24.836 - stdout> 2018-10-09 14:44:24 WARN HiveMetaStore:622 - Retrying creating default database after error: Unexpected exception caught.
2018-10-09 14:44:24.836 - stdout> javax.jdo.JDOFatalInternalException: Unexpected exception caught.
2018-10-09 14:44:24.836 - stdout> at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1193)
2018-10-09 14:44:24.836 - stdout> at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
2018-10-09 14:44:24.836 - stdout> at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
2018-10-09 14:44:24.836 - stdout> at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:365)
2018-10-09 14:44:24.836 - stdout> at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:394)
2018-10-09 14:44:24.836 - stdout> at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:291)
2018-10-09 14:44:24.836 - stdout> at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)
2018-10-09 14:44:24.836 - stdout> at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
2018-10-09 14:44:24.836 - stdout> at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
2018-10-09 14:44:24.836 - stdout> at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)
2018-10-09 14:44:24.836 - stdout> at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)
2018-10-09 14:44:24.836 - stdout> at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593)
2018-10-09 14:44:24.836 - stdout> at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571)
2018-10-09 14:44:24.836 - stdout> at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:620)
2018-10-09 14:44:24.836 - stdout> at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
2018-10-09 14:44:24.836 - stdout> at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
2018-10-09 14:44:24.836 - stdout> at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
2018-10-09 14:44:24.836 - stdout> at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
2018-10-09 14:44:24.836 - stdout> at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
2018-10-09 14:44:24.836 - stdout> at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
2018-10-09 14:44:24.836 - stdout> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
2018-10-09 14:44:24.836 - stdout> at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
2018-10-09 14:44:24.836 - stdout> at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
2018-10-09 14:44:24.836 - stdout> at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
2018-10-09 14:44:24.836 - stdout> at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
2018-10-09 14:44:24.836 - stdout> at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
2018-10-09 14:44:24.836 - stdout> at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
2018-10-09 14:44:24.836 - stdout> at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
2018-10-09 14:44:24.836 - stdout> at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
2018-10-09 14:44:24.836 - stdout> at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
2018-10-09 14:44:24.836 - stdout> at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1234)
2018-10-09 14:44:24.836 - stdout> at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:174)
2018-10-09 14:44:24.836 - stdout> at org.apache.hadoop.hive.ql.metadata.Hive.<clinit>(Hive.java:166)
2018-10-09 14:44:24.836 - stdout> at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
2018-10-09 14:44:24.836 - stdout> at org.apache.spark.sql.hive.client.HiveClientImpl.newState(HiveClientImpl.scala:180)
2018-10-09 14:44:24.836 - stdout> at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:114)
2018-10-09 14:44:24.836 - stdout> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
2018-10-09 14:44:24.836 - stdout> at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
2018-10-09 14:44:24.836 - stdout> at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
2018-10-09 14:44:24.836 - stdout> at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
2018-10-09 14:44:24.836 - stdout> at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
2018-10-09 14:44:24.836 - stdout> at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:385)
2018-10-09 14:44:24.836 - stdout> at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:287)
2018-10-09 14:44:24.836 - stdout> at org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:66)
2018-10-09 14:44:24.836 - stdout> at org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:65)
2018-10-09 14:44:24.836 - stdout> at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply$mcZ$sp(HiveExternalCatalog.scala:195)
2018-10-09 14:44:24.836 - stdout> at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:195)
2018-10-09 14:44:24.836 - stdout> at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:195)
2018-10-09 14:44:24.836 - stdout> at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:97)
2018-10-09 14:44:24.836 - stdout> at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:194)
2018-10-09 14:44:24.836 - stdout> at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:114)
2018-10-09 14:44:24.836 - stdout> at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:102)
2018-10-09 14:44:24.836 - stdout> at org.apache.spark.sql.hive.HiveSessionStateBuilder.externalCatalog(HiveSessionStateBuilder.scala:39)
2018-10-09 14:44:24.836 - stdout> at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog$lzycompute(HiveSessionStateBuilder.scala:54)
2018-10-09 14:44:24.836 - stdout> at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog(HiveSessionStateBuilder.scala:52)
2018-10-09 14:44:24.836 - stdout> at org.apache.spark.sql.hive.HiveSessionStateBuilder$$anon$1.<init>(HiveSessionStateBuilder.scala:69)
2018-10-09 14:44:24.836 - stdout> at org.apache.spark.sql.hive.HiveSessionStateBuilder.analyzer(HiveSessionStateBuilder.scala:69)
2018-10-09 14:44:24.836 - stdout> at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anonfun$build$2.apply(BaseSessionStateBuilder.scala:293)
2018-10-09 14:44:24.836 - stdout> at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anonfun$build$2.apply(BaseSessionStateBuilder.scala:293)
2018-10-09 14:44:24.837 - stdout> at org.apache.spark.sql.internal.SessionState.analyzer$lzycompute(SessionState.scala:79)
2018-10-09 14:44:24.837 - stdout> at org.apache.spark.sql.internal.SessionState.analyzer(SessionState.scala:79)
2018-10-09 14:44:24.837 - stdout> at org.apache.spark.sql.execution.QueryExecution.analyzed$lzycompute(QueryExecution.scala:57)
2018-10-09 14:44:24.837 - stdout> at org.apache.spark.sql.execution.QueryExecution.analyzed(QueryExecution.scala:55)
2018-10-09 14:44:24.837 - stdout> at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:47)
2018-10-09 14:44:24.837 - stdout> at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:74)
2018-10-09 14:44:24.837 - stdout> at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:642)
2018-10-09 14:44:24.837 - stdout> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
2018-10-09 14:44:24.837 - stdout> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
2018-10-09 14:44:24.837 - stdout> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2018-10-09 14:44:24.837 - stdout> at java.lang.reflect.Method.invoke(Method.java:498)
2018-10-09 14:44:24.837 - stdout> at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
2018-10-09 14:44:24.837 - stdout> at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
2018-10-09 14:44:24.837 - stdout> at py4j.Gateway.invoke(Gateway.java:282)
2018-10-09 14:44:24.837 - stdout> at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
2018-10-09 14:44:24.837 - stdout> at py4j.commands.CallCommand.execute(CallCommand.java:79)
2018-10-09 14:44:24.837 - stdout> at py4j.GatewayConnection.run(GatewayConnection.java:238)
2018-10-09 14:44:24.837 - stdout> at java.lang.Thread.run(Thread.java:748)
2018-10-09 14:44:24.837 - stdout> NestedThrowablesStackTrace:
2018-10-09 14:44:24.837 - stdout> java.lang.reflect.InvocationTargetException
2018-10-09 14:44:24.837 - stdout> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
2018-10-09 14:44:24.837 - stdout> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
2018-10-09 14:44:24.837 - stdout> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2018-10-09 14:44:24.837 - stdout> at java.lang.reflect.Method.invoke(Method.java:498)
2018-10-09 14:44:24.837 - stdout> at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
2018-10-09 14:44:24.837 - stdout> at java.security.AccessController.doPrivileged(Native Method)
2018-10-09 14:44:24.837 - stdout> at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
2018-10-09 14:44:24.837 - stdout> at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
2018-10-09 14:44:24.837 - stdout> at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
2018-10-09 14:44:24.837 - stdout> at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
2018-10-09 14:44:24.837 - stdout> at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:365)
2018-10-09 14:44:24.837 - stdout> at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:394)
2018-10-09 14:44:24.837 - stdout> at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:291)
2018-10-09 14:44:24.837 - stdout> at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)
2018-10-09 14:44:24.837 - stdout> at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
2018-10-09 14:44:24.837 - stdout> at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
2018-10-09 14:44:24.837 - stdout> at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)
2018-10-09 14:44:24.837 - stdout> at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)
2018-10-09 14:44:24.837 - stdout> at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593)
2018-10-09 14:44:24.837 - stdout> at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571)
2018-10-09 14:44:24.837 - stdout> at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:620)
2018-10-09 14:44:24.837 - stdout> at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
2018-10-09 14:44:24.837 - stdout> at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
2018-10-09 14:44:24.837 - stdout> at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
2018-10-09 14:44:24.837 - stdout> at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
2018-10-09 14:44:24.837 - stdout> at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
2018-10-09 14:44:24.837 - stdout> at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
2018-10-09 14:44:24.837 - stdout> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
2018-10-09 14:44:24.837 - stdout> at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
2018-10-09 14:44:24.837 - stdout> at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
2018-10-09 14:44:24.837 - stdout> at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
2018-10-09 14:44:24.837 - stdout> at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
2018-10-09 14:44:24.837 - stdout> at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
2018-10-09 14:44:24.837 - stdout> at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
2018-10-09 14:44:24.837 - stdout> at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
2018-10-09 14:44:24.837 - stdout> at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
2018-10-09 14:44:24.837 - stdout> at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
2018-10-09 14:44:24.837 - stdout> at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1234)
2018-10-09 14:44:24.837 - stdout> at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:174)
2018-10-09 14:44:24.837 - stdout> at org.apache.hadoop.hive.ql.metadata.Hive.<clinit>(Hive.java:166)
2018-10-09 14:44:24.837 - stdout> at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
2018-10-09 14:44:24.837 - stdout> at org.apache.spark.sql.hive.client.HiveClientImpl.newState(HiveClientImpl.scala:180)
2018-10-09 14:44:24.837 - stdout> at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:114)
2018-10-09 14:44:24.837 - stdout> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
2018-10-09 14:44:24.837 - stdout> at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
2018-10-09 14:44:24.837 - stdout> at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
2018-10-09 14:44:24.837 - stdout> at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
2018-10-09 14:44:24.837 - stdout> at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
2018-10-09 14:44:24.837 - stdout> at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:385)
2018-10-09 14:44:24.837 - stdout> at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:287)
2018-10-09 14:44:24.837 - stdout> at org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:66)
2018-10-09 14:44:24.837 - stdout> at org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:65)
2018-10-09 14:44:24.837 - stdout> at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply$mcZ$sp(HiveExternalCatalog.scala:195)
2018-10-09 14:44:24.837 - stdout> at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:195)
2018-10-09 14:44:24.837 - stdout> at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:195)
2018-10-09 14:44:24.837 - stdout> at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:97)
2018-10-09 14:44:24.837 - stdout> at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:194)
2018-10-09 14:44:24.838 - stdout> at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:114)
2018-10-09 14:44:24.838 - stdout> at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:102)
2018-10-09 14:44:24.838 - stdout> at org.apache.spark.sql.hive.HiveSessionStateBuilder.externalCatalog(HiveSessionStateBuilder.scala:39)
2018-10-09 14:44:24.838 - stdout> at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog$lzycompute(HiveSessionStateBuilder.scala:54)
2018-10-09 14:44:24.838 - stdout> at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog(HiveSessionStateBuilder.scala:52)
2018-10-09 14:44:24.838 - stdout> at org.apache.spark.sql.hive.HiveSessionStateBuilder$$anon$1.<init>(HiveSessionStateBuilder.scala:69)
2018-10-09 14:44:24.838 - stdout> at org.apache.spark.sql.hive.HiveSessionStateBuilder.analyzer(HiveSessionStateBuilder.scala:69)
2018-10-09 14:44:24.838 - stdout> at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anonfun$build$2.apply(BaseSessionStateBuilder.scala:293)
2018-10-09 14:44:24.838 - stdout> at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anonfun$build$2.apply(BaseSessionStateBuilder.scala:293)
2018-10-09 14:44:24.838 - stdout> at org.apache.spark.sql.internal.SessionState.analyzer$lzycompute(SessionState.scala:79)
2018-10-09 14:44:24.838 - stdout> at org.apache.spark.sql.internal.SessionState.analyzer(SessionState.scala:79)
2018-10-09 14:44:24.838 - stdout> at org.apache.spark.sql.execution.QueryExecution.analyzed$lzycompute(QueryExecution.scala:57)
2018-10-09 14:44:24.838 - stdout> at org.apache.spark.sql.execution.QueryExecution.analyzed(QueryExecution.scala:55)
2018-10-09 14:44:24.838 - stdout> at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:47)
2018-10-09 14:44:24.838 - stdout> at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:74)
2018-10-09 14:44:24.838 - stdout> at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:642)
2018-10-09 14:44:24.838 - stdout> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
2018-10-09 14:44:24.838 - stdout> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
2018-10-09 14:44:24.838 - stdout> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2018-10-09 14:44:24.838 - stdout> at java.lang.reflect.Method.invoke(Method.java:498)
2018-10-09 14:44:24.838 - stdout> at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
2018-10-09 14:44:24.838 - stdout> at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
2018-10-09 14:44:24.838 - stdout> at py4j.Gateway.invoke(Gateway.java:282)
2018-10-09 14:44:24.838 - stdout> at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
2018-10-09 14:44:24.838 - stdout> at py4j.commands.CallCommand.execute(CallCommand.java:79)
2018-10-09 14:44:24.838 - stdout> at py4j.GatewayConnection.run(GatewayConnection.java:238)
2018-10-09 14:44:24.838 - stdout> at java.lang.Thread.run(Thread.java:748)
2018-10-09 14:44:24.838 - stdout> Caused by: java.lang.NullPointerException
2018-10-09 14:44:24.838 - stdout> at org.datanucleus.util.Localiser.getMessage(Localiser.java:359)
2018-10-09 14:44:24.838 - stdout> at org.datanucleus.util.Localiser.msg(Localiser.java:176)
2018-10-09 14:44:24.838 - stdout> at org.datanucleus.util.Localiser.msg(Localiser.java:259)
2018-10-09 14:44:24.838 - stdout> at org.datanucleus.plugin.NonManagedPluginRegistry.registerBundle(NonManagedPluginRegistry.java:435)
2018-10-09 14:44:24.838 - stdout> at org.datanucleus.plugin.NonManagedPluginRegistry.registerExtensions(NonManagedPluginRegistry.java:219)
2018-10-09 14:44:24.838 - stdout> at org.datanucleus.plugin.NonManagedPluginRegistry.registerExtensionPoints(NonManagedPluginRegistry.java:160)
2018-10-09 14:44:24.838 - stdout> at org.datanucleus.plugin.PluginManager.<init>(PluginManager.java:65)
2018-10-09 14:44:24.838 - stdout> at org.datanucleus.plugin.PluginManager.createPluginManager(PluginManager.java:427)
2018-10-09 14:44:24.838 - stdout> at org.datanucleus.NucleusContext.<init>(NucleusContext.java:266)
2018-10-09 14:44:24.838 - stdout> at org.datanucleus.NucleusContext.<init>(NucleusContext.java:247)
2018-10-09 14:44:24.838 - stdout> at org.datanucleus.NucleusContext.<init>(NucleusContext.java:225)
2018-10-09 14:44:24.838 - stdout> at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.<init>(JDOPersistenceManagerFactory.java:416)
2018-10-09 14:44:24.838 - stdout> at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:301)
2018-10-09 14:44:24.838 - stdout> at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
2018-10-09 14:44:24.838 - stdout> ... 84 more
2018-10-09 14:44:24.838 - stdout> 2018-10-09 14:44:24 INFO HiveMetaStore:589 - 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
2018-10-09 14:44:24.838 - stdout> 2018-10-09 14:44:24 INFO ObjectStore:289 - ObjectStore, initialize called
2018-10-09 14:44:24.853 - stdout> 2018-10-09 14:44:24 WARN Hive:168 - Failed to access metastore. This class should not accessed in runtime.
2018-10-09 14:44:24.853 - stdout> org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
2018-10-09 14:44:24.853 - stdout> at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1236)
2018-10-09 14:44:24.853 - stdout> at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:174)
2018-10-09 14:44:24.853 - stdout> at org.apache.hadoop.hive.ql.metadata.Hive.<clinit>(Hive.java:166)
2018-10-09 14:44:24.853 - stdout> at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
2018-10-09 14:44:24.853 - stdout> at org.apache.spark.sql.hive.client.HiveClientImpl.newState(HiveClientImpl.scala:180)
2018-10-09 14:44:24.853 - stdout> at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:114)
2018-10-09 14:44:24.853 - stdout> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
2018-10-09 14:44:24.853 - stdout> at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
2018-10-09 14:44:24.853 - stdout> at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
2018-10-09 14:44:24.853 - stdout> at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
2018-10-09 14:44:24.853 - stdout> at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
2018-10-09 14:44:24.853 - stdout> at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:385)
2018-10-09 14:44:24.853 - stdout> at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:287)
2018-10-09 14:44:24.853 - stdout> at org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:66)
2018-10-09 14:44:24.853 - stdout> at org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:65)
2018-10-09 14:44:24.853 - stdout> at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply$mcZ$sp(HiveExternalCatalog.scala:195)
2018-10-09 14:44:24.853 - stdout> at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:195)
2018-10-09 14:44:24.853 - stdout> at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:195)
2018-10-09 14:44:24.853 - stdout> at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:97)
2018-10-09 14:44:24.853 - stdout> at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:194)
2018-10-09 14:44:24.853 - stdout> at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:114)
2018-10-09 14:44:24.853 - stdout> at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:102)
2018-10-09 14:44:24.853 - stdout> at org.apache.spark.sql.hive.HiveSessionStateBuilder.externalCatalog(HiveSessionStateBuilder.scala:39)
2018-10-09 14:44:24.853 - stdout> at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog$lzycompute(HiveSessionStateBuilder.scala:54)
2018-10-09 14:44:24.853 - stdout> at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog(HiveSessionStateBuilder.scala:52)
2018-10-09 14:44:24.853 - stdout> at org.apache.spark.sql.hive.HiveSessionStateBuilder$$anon$1.<init>(HiveSessionStateBuilder.scala:69)
2018-10-09 14:44:24.853 - stdout> at org.apache.spark.sql.hive.HiveSessionStateBuilder.analyzer(HiveSessionStateBuilder.scala:69)
2018-10-09 14:44:24.853 - stdout> at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anonfun$build$2.apply(BaseSessionStateBuilder.scala:293)
2018-10-09 14:44:24.853 - stdout> at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anonfun$build$2.apply(BaseSessionStateBuilder.scala:293)
2018-10-09 14:44:24.853 - stdout> at org.apache.spark.sql.internal.SessionState.analyzer$lzycompute(SessionState.scala:79)
2018-10-09 14:44:24.853 - stdout> at org.apache.spark.sql.internal.SessionState.analyzer(SessionState.scala:79)
2018-10-09 14:44:24.853 - stdout> at org.apache.spark.sql.execution.QueryExecution.analyzed$lzycompute(QueryExecution.scala:57)
2018-10-09 14:44:24.853 - stdout> at org.apache.spark.sql.execution.QueryExecution.analyzed(QueryExecution.scala:55)
2018-10-09 14:44:24.853 - stdout> at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:47)
2018-10-09 14:44:24.853 - stdout> at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:74)
2018-10-09 14:44:24.853 - stdout> at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:642)
2018-10-09 14:44:24.853 - stdout> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
2018-10-09 14:44:24.853 - stdout> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
2018-10-09 14:44:24.853 - stdout> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2018-10-09 14:44:24.853 - stdout> at java.lang.reflect.Method.invoke(Method.java:498)
2018-10-09 14:44:24.853 - stdout> at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
2018-10-09 14:44:24.853 - stdout> at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
2018-10-09 14:44:24.853 - stdout> at py4j.Gateway.invoke(Gateway.java:282)
2018-10-09 14:44:24.853 - stdout> at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
2018-10-09 14:44:24.853 - stdout> at py4j.commands.CallCommand.execute(CallCommand.java:79)
2018-10-09 14:44:24.853 - stdout> at py4j.GatewayConnection.run(GatewayConnection.java:238)
2018-10-09 14:44:24.853 - stdout> at java.lang.Thread.run(Thread.java:748)
2018-10-09 14:44:24.854 - stdout> Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
2018-10-09 14:44:24.854 - stdout> at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1523)
2018-10-09 14:44:24.854 - stdout> at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
2018-10-09 14:44:24.854 - stdout> at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
2018-10-09 14:44:24.854 - stdout> at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
2018-10-09 14:44:24.854 - stdout> at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
2018-10-09 14:44:24.854 - stdout> at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
2018-10-09 14:44:24.854 - stdout> at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1234)
2018-10-09 14:44:24.854 - stdout> ... 46 more
2018-10-09 14:44:24.854 - stdout> Caused by: java.lang.reflect.InvocationTargetException
2018-10-09 14:44:24.854 - stdout> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
2018-10-09 14:44:24.854 - stdout> at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
2018-10-09 14:44:24.854 - stdout> at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
2018-10-09 14:44:24.854 - stdout> at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
2018-10-09 14:44:24.854 - stdout> at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
2018-10-09 14:44:24.854 - stdout> ... 52 more
2018-10-09 14:44:24.854 - stdout> Caused by: javax.jdo.JDOFatalInternalException: Unexpected exception caught.
2018-10-09 14:44:24.854 - stdout> NestedThrowables:
2018-10-09 14:44:24.854 - stdout> java.lang.reflect.InvocationTargetException
2018-10-09 14:44:24.854 - stdout> at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1193)
2018-10-09 14:44:24.854 - stdout> at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
2018-10-09 14:44:24.854 - stdout> at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
2018-10-09 14:44:24.854 - stdout> at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:365)
2018-10-09 14:44:24.854 - stdout> at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:394)
2018-10-09 14:44:24.854 - stdout> at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:291)
2018-10-09 14:44:24.854 - stdout> at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)
2018-10-09 14:44:24.854 - stdout> at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
2018-10-09 14:44:24.854 - stdout> at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
2018-10-09 14:44:24.854 - stdout> at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)
2018-10-09 14:44:24.854 - stdout> at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)
2018-10-09 14:44:24.854 - stdout> at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593)
2018-10-09 14:44:24.854 - stdout> at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571)
2018-10-09 14:44:24.854 - stdout> at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:624)
2018-10-09 14:44:24.854 - stdout> at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
2018-10-09 14:44:24.854 - stdout> at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
2018-10-09 14:44:24.854 - stdout> at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
2018-10-09 14:44:24.854 - stdout> at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
2018-10-09 14:44:24.854 - stdout> at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
2018-10-09 14:44:24.854 - stdout> at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
2018-10-09 14:44:24.854 - stdout> ... 57 more
2018-10-09 14:44:24.854 - stdout> Caused by: java.lang.reflect.InvocationTargetException
2018-10-09 14:44:24.854 - stdout> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
2018-10-09 14:44:24.854 - stdout> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
2018-10-09 14:44:24.854 - stdout> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2018-10-09 14:44:24.854 - stdout> at java.lang.reflect.Method.invoke(Method.java:498)
2018-10-09 14:44:24.854 - stdout> at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
2018-10-09 14:44:24.854 - stdout> at java.security.AccessController.doPrivileged(Native Method)
2018-10-09 14:44:24.854 - stdout> at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
2018-10-09 14:44:24.854 - stdout> at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
2018-10-09 14:44:24.854 - stdout> ... 76 more
2018-10-09 14:44:24.854 - stdout> Caused by: java.lang.NullPointerException
2018-10-09 14:44:24.854 - stdout> at org.datanucleus.util.Localiser.getMessage(Localiser.java:359)
2018-10-09 14:44:24.854 - stdout> at org.datanucleus.util.Localiser.msg(Localiser.java:176)
2018-10-09 14:44:24.854 - stdout> at org.datanucleus.util.Localiser.msg(Localiser.java:259)
2018-10-09 14:44:24.854 - stdout> at org.datanucleus.plugin.NonManagedPluginRegistry.registerBundle(NonManagedPluginRegistry.java:435)
2018-10-09 14:44:24.854 - stdout> at org.datanucleus.plugin.NonManagedPluginRegistry.registerExtensions(NonManagedPluginRegistry.java:219)
2018-10-09 14:44:24.854 - stdout> at org.datanucleus.plugin.NonManagedPluginRegistry.registerExtensionPoints(NonManagedPluginRegistry.java:160)
2018-10-09 14:44:24.854 - stdout> at org.datanucleus.plugin.PluginManager.<init>(PluginManager.java:65)
2018-10-09 14:44:24.854 - stdout> at org.datanucleus.plugin.PluginManager.createPluginManager(PluginManager.java:427)
2018-10-09 14:44:24.854 - stdout> at org.datanucleus.NucleusContext.<init>(NucleusContext.java:266)
2018-10-09 14:44:24.854 - stdout> at org.datanucleus.NucleusContext.<init>(NucleusContext.java:247)
2018-10-09 14:44:24.854 - stdout> at org.datanucleus.NucleusContext.<init>(NucleusContext.java:225)
2018-10-09 14:44:24.854 - stdout> at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.<init>(JDOPersistenceManagerFactory.java:416)
2018-10-09 14:44:24.854 - stdout> at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:301)
2018-10-09 14:44:24.854 - stdout> at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
2018-10-09 14:44:24.854 - stdout> ... 84 more
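The recurring "Caused by: java.lang.NullPointerException" in org.datanucleus.util.Localiser follows directly from the two ERROR lines earlier in the log: DataNucleus cannot locate its Localisation resource bundles, and the Localiser then dereferences the missing bundle while formatting a plugin-registry message. A diagnostic sketch for checking whether those bundles are visible to the driver's classloader (the resource paths are assumptions derived from the bundle names in the ERROR messages via standard ResourceBundle naming):

    // Resolve the DataNucleus localisation bundles the ERROR lines report as
    // missing; a null here reproduces the precondition for the NPE above.
    val bundles = Seq(
      "org/datanucleus/Localisation.properties",         // org.datanucleus.Localisation
      "org/datanucleus/api/jdo/Localisation.properties"  // org.datanucleus.api.jdo.Localisation
    )
    for (name <- bundles) {
      val url = getClass.getClassLoader.getResource(name)
      println(s"$name -> ${Option(url).getOrElse("NOT FOUND")}")
    }

One plausible cause of this state is a driver classpath carrying mismatched or repackaged datanucleus-core / datanucleus-api-jdo jars, so checking both bundles helps narrow down which jar is at fault.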
2018-10-09 14:44:24.854 - stdout> 2018-10-09 14:44:24 INFO HiveMetaStore:589 - 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
2018-10-09 14:44:24.855 - stdout> 2018-10-09 14:44:24 INFO ObjectStore:289 - ObjectStore, initialize called
2018-10-09 14:44:24.864 - stdout> 2018-10-09 14:44:24 WARN HiveMetaStore:622 - Retrying creating default database after error: Unexpected exception caught.
2018-10-09 14:44:24.864 - stdout> javax.jdo.JDOFatalInternalException: Unexpected exception caught.
2018-10-09 14:44:24.864 - stdout> at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1193)
2018-10-09 14:44:24.864 - stdout> at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
2018-10-09 14:44:24.864 - stdout> at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
2018-10-09 14:44:24.864 - stdout> at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:365)
2018-10-09 14:44:24.864 - stdout> at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:394)
2018-10-09 14:44:24.864 - stdout> at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:291)
2018-10-09 14:44:24.864 - stdout> at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)
2018-10-09 14:44:24.865 - stdout> at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
2018-10-09 14:44:24.865 - stdout> at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
2018-10-09 14:44:24.865 - stdout> at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)
2018-10-09 14:44:24.865 - stdout> at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)
2018-10-09 14:44:24.865 - stdout> at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593)
2018-10-09 14:44:24.865 - stdout> at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571)
2018-10-09 14:44:24.865 - stdout> at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:620)
2018-10-09 14:44:24.865 - stdout> at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
2018-10-09 14:44:24.865 - stdout> at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
2018-10-09 14:44:24.865 - stdout> at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
2018-10-09 14:44:24.865 - stdout> at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
2018-10-09 14:44:24.865 - stdout> at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
2018-10-09 14:44:24.865 - stdout> at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
2018-10-09 14:44:24.865 - stdout> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
2018-10-09 14:44:24.865 - stdout> at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
2018-10-09 14:44:24.865 - stdout> at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
2018-10-09 14:44:24.865 - stdout> at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
2018-10-09 14:44:24.865 - stdout> at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
2018-10-09 14:44:24.865 - stdout> at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
2018-10-09 14:44:24.865 - stdout> at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
2018-10-09 14:44:24.865 - stdout> at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
2018-10-09 14:44:24.865 - stdout> at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
2018-10-09 14:44:24.865 - stdout> at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
2018-10-09 14:44:24.865 - stdout> at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
2018-10-09 14:44:24.865 - stdout> at org.apache.spark.sql.hive.client.HiveClientImpl.newState(HiveClientImpl.scala:180)
2018-10-09 14:44:24.865 - stdout> at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:114)
2018-10-09 14:44:24.865 - stdout> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
2018-10-09 14:44:24.865 - stdout> at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
2018-10-09 14:44:24.865 - stdout> at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
2018-10-09 14:44:24.865 - stdout> at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
2018-10-09 14:44:24.865 - stdout> at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
2018-10-09 14:44:24.865 - stdout> at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:385)
2018-10-09 14:44:24.865 - stdout> at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:287)
2018-10-09 14:44:24.865 - stdout> at org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:66)
2018-10-09 14:44:24.865 - stdout> at org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:65)
2018-10-09 14:44:24.865 - stdout> at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply$mcZ$sp(HiveExternalCatalog.scala:195)
2018-10-09 14:44:24.865 - stdout> at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:195)
2018-10-09 14:44:24.865 - stdout> at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:195)
2018-10-09 14:44:24.865 - stdout> at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:97)
2018-10-09 14:44:24.865 - stdout> at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:194)
2018-10-09 14:44:24.865 - stdout> at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:114)
2018-10-09 14:44:24.865 - stdout> at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:102)
2018-10-09 14:44:24.865 - stdout> at org.apache.spark.sql.hive.HiveSessionStateBuilder.externalCatalog(HiveSessionStateBuilder.scala:39)
2018-10-09 14:44:24.865 - stdout> at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog$lzycompute(HiveSessionStateBuilder.scala:54)
2018-10-09 14:44:24.865 - stdout> at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog(HiveSessionStateBuilder.scala:52)
2018-10-09 14:44:24.865 - stdout> at org.apache.spark.sql.hive.HiveSessionStateBuilder$$anon$1.<init>(HiveSessionStateBuilder.scala:69)
2018-10-09 14:44:24.865 - stdout> at org.apache.spark.sql.hive.HiveSessionStateBuilder.analyzer(HiveSessionStateBuilder.scala:69)
2018-10-09 14:44:24.865 - stdout> at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anonfun$build$2.apply(BaseSessionStateBuilder.scala:293)
2018-10-09 14:44:24.865 - stdout> at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anonfun$build$2.apply(BaseSessionStateBuilder.scala:293)
2018-10-09 14:44:24.865 - stdout> at org.apache.spark.sql.internal.SessionState.analyzer$lzycompute(SessionState.scala:79)
2018-10-09 14:44:24.865 - stdout> at org.apache.spark.sql.internal.SessionState.analyzer(SessionState.scala:79)
2018-10-09 14:44:24.865 - stdout> at org.apache.spark.sql.execution.QueryExecution.analyzed$lzycompute(QueryExecution.scala:57)
2018-10-09 14:44:24.865 - stdout> at org.apache.spark.sql.execution.QueryExecution.analyzed(QueryExecution.scala:55)
2018-10-09 14:44:24.865 - stdout> at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:47)
2018-10-09 14:44:24.865 - stdout> at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:74)
2018-10-09 14:44:24.865 - stdout> at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:642)
2018-10-09 14:44:24.865 - stdout> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
2018-10-09 14:44:24.865 - stdout> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
2018-10-09 14:44:24.865 - stdout> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2018-10-09 14:44:24.865 - stdout> at java.lang.reflect.Method.invoke(Method.java:498)
2018-10-09 14:44:24.865 - stdout> at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
2018-10-09 14:44:24.865 - stdout> at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
2018-10-09 14:44:24.865 - stdout> at py4j.Gateway.invoke(Gateway.java:282)
2018-10-09 14:44:24.865 - stdout> at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
2018-10-09 14:44:24.866 - stdout> at py4j.commands.CallCommand.execute(CallCommand.java:79)
2018-10-09 14:44:24.866 - stdout> at py4j.GatewayConnection.run(GatewayConnection.java:238)
2018-10-09 14:44:24.866 - stdout> at java.lang.Thread.run(Thread.java:748)
2018-10-09 14:44:24.866 - stdout> NestedThrowablesStackTrace:
2018-10-09 14:44:24.866 - stdout> java.lang.reflect.InvocationTargetException
2018-10-09 14:44:24.866 - stdout> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
2018-10-09 14:44:24.866 - stdout> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
2018-10-09 14:44:24.866 - stdout> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2018-10-09 14:44:24.866 - stdout> at java.lang.reflect.Method.invoke(Method.java:498)
2018-10-09 14:44:24.866 - stdout> at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
2018-10-09 14:44:24.866 - stdout> at java.security.AccessController.doPrivileged(Native Method)
2018-10-09 14:44:24.866 - stdout> at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
2018-10-09 14:44:24.866 - stdout> at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
2018-10-09 14:44:24.866 - stdout> at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
2018-10-09 14:44:24.866 - stdout> at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
2018-10-09 14:44:24.866 - stdout> at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:365)
2018-10-09 14:44:24.866 - stdout> at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:394)
2018-10-09 14:44:24.866 - stdout> at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:291)
2018-10-09 14:44:24.866 - stdout> at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)
2018-10-09 14:44:24.866 - stdout> at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
2018-10-09 14:44:24.866 - stdout> at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
2018-10-09 14:44:24.866 - stdout> at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)
2018-10-09 14:44:24.866 - stdout> at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)
2018-10-09 14:44:24.866 - stdout> at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593)
2018-10-09 14:44:24.866 - stdout> at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571)
2018-10-09 14:44:24.866 - stdout> at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:620)
2018-10-09 14:44:24.866 - stdout> at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
2018-10-09 14:44:24.866 - stdout> at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
2018-10-09 14:44:24.866 - stdout> at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
2018-10-09 14:44:24.866 - stdout> at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
2018-10-09 14:44:24.866 - stdout> at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
2018-10-09 14:44:24.866 - stdout> at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
2018-10-09 14:44:24.866 - stdout> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
2018-10-09 14:44:24.866 - stdout> at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
2018-10-09 14:44:24.866 - stdout> at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
2018-10-09 14:44:24.866 - stdout> at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
2018-10-09 14:44:24.866 - stdout> at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
2018-10-09 14:44:24.866 - stdout> at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
2018-10-09 14:44:24.866 - stdout> at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
2018-10-09 14:44:24.866 - stdout> at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
2018-10-09 14:44:24.866 - stdout> at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
2018-10-09 14:44:24.866 - stdout> at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
2018-10-09 14:44:24.866 - stdout> at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
2018-10-09 14:44:24.866 - stdout> at org.apache.spark.sql.hive.client.HiveClientImpl.newState(HiveClientImpl.scala:180)
2018-10-09 14:44:24.866 - stdout> at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:114)
2018-10-09 14:44:24.866 - stdout> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
2018-10-09 14:44:24.866 - stdout> at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
2018-10-09 14:44:24.866 - stdout> at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
2018-10-09 14:44:24.866 - stdout> at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
2018-10-09 14:44:24.866 - stdout> at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
2018-10-09 14:44:24.866 - stdout> at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:385)
2018-10-09 14:44:24.866 - stdout> at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:287)
2018-10-09 14:44:24.866 - stdout> at org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:66)
2018-10-09 14:44:24.866 - stdout> at org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:65)
2018-10-09 14:44:24.866 - stdout> at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply$mcZ$sp(HiveExternalCatalog.scala:195)
2018-10-09 14:44:24.866 - stdout> at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:195)
2018-10-09 14:44:24.866 - stdout> at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:195)
2018-10-09 14:44:24.866 - stdout> at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:97)
2018-10-09 14:44:24.866 - stdout> at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:194)
2018-10-09 14:44:24.866 - stdout> at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:114)
2018-10-09 14:44:24.866 - stdout> at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:102)
2018-10-09 14:44:24.867 - stdout> at org.apache.spark.sql.hive.HiveSessionStateBuilder.externalCatalog(HiveSessionStateBuilder.scala:39)
2018-10-09 14:44:24.867 - stdout> at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog$lzycompute(HiveSessionStateBuilder.scala:54)
2018-10-09 14:44:24.867 - stdout> at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog(HiveSessionStateBuilder.scala:52)
2018-10-09 14:44:24.867 - stdout> at org.apache.spark.sql.hive.HiveSessionStateBuilder$$anon$1.<init>(HiveSessionStateBuilder.scala:69)
2018-10-09 14:44:24.867 - stdout> at org.apache.spark.sql.hive.HiveSessionStateBuilder.analyzer(HiveSessionStateBuilder.scala:69)
2018-10-09 14:44:24.867 - stdout> at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anonfun$build$2.apply(BaseSessionStateBuilder.scala:293)
2018-10-09 14:44:24.867 - stdout> at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anonfun$build$2.apply(BaseSessionStateBuilder.scala:293)
2018-10-09 14:44:24.867 - stdout> at org.apache.spark.sql.internal.SessionState.analyzer$lzycompute(SessionState.scala:79)
2018-10-09 14:44:24.867 - stdout> at org.apache.spark.sql.internal.SessionState.analyzer(SessionState.scala:79)
2018-10-09 14:44:24.867 - stdout> at org.apache.spark.sql.execution.QueryExecution.analyzed$lzycompute(QueryExecution.scala:57)
2018-10-09 14:44:24.867 - stdout> at org.apache.spark.sql.execution.QueryExecution.analyzed(QueryExecution.scala:55)
2018-10-09 14:44:24.867 - stdout> at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:47)
2018-10-09 14:44:24.867 - stdout> at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:74)
2018-10-09 14:44:24.867 - stdout> at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:642)
2018-10-09 14:44:24.867 - stdout> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
2018-10-09 14:44:24.867 - stdout> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
2018-10-09 14:44:24.867 - stdout> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2018-10-09 14:44:24.867 - stdout> at java.lang.reflect.Method.invoke(Method.java:498)
2018-10-09 14:44:24.867 - stdout> at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
2018-10-09 14:44:24.867 - stdout> at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
2018-10-09 14:44:24.867 - stdout> at py4j.Gateway.invoke(Gateway.java:282)
2018-10-09 14:44:24.867 - stdout> at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
2018-10-09 14:44:24.867 - stdout> at py4j.commands.CallCommand.execute(CallCommand.java:79)
2018-10-09 14:44:24.867 - stdout> at py4j.GatewayConnection.run(GatewayConnection.java:238)
2018-10-09 14:44:24.867 - stdout> at java.lang.Thread.run(Thread.java:748)
2018-10-09 14:44:24.867 - stdout> Caused by: java.lang.NullPointerException
2018-10-09 14:44:24.867 - stdout> at org.datanucleus.util.Localiser.getMessage(Localiser.java:359)
2018-10-09 14:44:24.867 - stdout> at org.datanucleus.util.Localiser.msg(Localiser.java:176)
2018-10-09 14:44:24.867 - stdout> at org.datanucleus.util.Localiser.msg(Localiser.java:259)
2018-10-09 14:44:24.867 - stdout> at org.datanucleus.plugin.NonManagedPluginRegistry.registerBundle(NonManagedPluginRegistry.java:435)
2018-10-09 14:44:24.867 - stdout> at org.datanucleus.plugin.NonManagedPluginRegistry.registerExtensions(NonManagedPluginRegistry.java:219)
2018-10-09 14:44:24.867 - stdout> at org.datanucleus.plugin.NonManagedPluginRegistry.registerExtensionPoints(NonManagedPluginRegistry.java:160)
2018-10-09 14:44:24.867 - stdout> at org.datanucleus.plugin.PluginManager.<init>(PluginManager.java:65)
2018-10-09 14:44:24.867 - stdout> at org.datanucleus.plugin.PluginManager.createPluginManager(PluginManager.java:427)
2018-10-09 14:44:24.867 - stdout> at org.datanucleus.NucleusContext.<init>(NucleusContext.java:266)
2018-10-09 14:44:24.867 - stdout> at org.datanucleus.NucleusContext.<init>(NucleusContext.java:247)
2018-10-09 14:44:24.867 - stdout> at org.datanucleus.NucleusContext.<init>(NucleusContext.java:225)
2018-10-09 14:44:24.867 - stdout> at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.<init>(JDOPersistenceManagerFactory.java:416)
2018-10-09 14:44:24.867 - stdout> at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:301)
2018-10-09 14:44:24.867 - stdout> at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
2018-10-09 14:44:24.867 - stdout> ... 81 more
2018-10-09 14:44:24.867 - stdout> 2018-10-09 14:44:24 INFO HiveMetaStore:589 - 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
2018-10-09 14:44:24.867 - stdout> 2018-10-09 14:44:24 INFO ObjectStore:289 - ObjectStore, initialize called
2018-10-09 14:44:24.899 - stdout> Traceback (most recent call last):
2018-10-09 14:44:24.899 - stdout> File "/tmp/test-spark/spark-2.3.2/python/lib/pyspark.zip/pyspark/sql/utils.py", line 63, in deco
2018-10-09 14:44:24.9 - stdout> File "/tmp/test-spark/spark-2.3.2/python/lib/py4j-0.10.7-src.zip/py4j/protocol.py", line 328, in get_return_value
2018-10-09 14:44:24.902 - stdout> py4j.protocol.Py4JJavaError: An error occurred while calling o31.sql.
2018-10-09 14:44:24.902 - stdout> : org.apache.spark.sql.AnalysisException: java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient;
2018-10-09 14:44:24.903 - stdout> at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:106)
2018-10-09 14:44:24.903 - stdout> at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:194)
2018-10-09 14:44:24.903 - stdout> at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:114)
2018-10-09 14:44:24.903 - stdout> at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:102)
2018-10-09 14:44:24.903 - stdout> at org.apache.spark.sql.hive.HiveSessionStateBuilder.externalCatalog(HiveSessionStateBuilder.scala:39)
2018-10-09 14:44:24.903 - stdout> at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog$lzycompute(HiveSessionStateBuilder.scala:54)
2018-10-09 14:44:24.903 - stdout> at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog(HiveSessionStateBuilder.scala:52)
2018-10-09 14:44:24.903 - stdout> at org.apache.spark.sql.hive.HiveSessionStateBuilder$$anon$1.<init>(HiveSessionStateBuilder.scala:69)
2018-10-09 14:44:24.903 - stdout> at org.apache.spark.sql.hive.HiveSessionStateBuilder.analyzer(HiveSessionStateBuilder.scala:69)
2018-10-09 14:44:24.903 - stdout> at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anonfun$build$2.apply(BaseSessionStateBuilder.scala:293)
2018-10-09 14:44:24.903 - stdout> at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anonfun$build$2.apply(BaseSessionStateBuilder.scala:293)
2018-10-09 14:44:24.903 - stdout> at org.apache.spark.sql.internal.SessionState.analyzer$lzycompute(SessionState.scala:79)
2018-10-09 14:44:24.903 - stdout> at org.apache.spark.sql.internal.SessionState.analyzer(SessionState.scala:79)
2018-10-09 14:44:24.903 - stdout> at org.apache.spark.sql.execution.QueryExecution.analyzed$lzycompute(QueryExecution.scala:57)
2018-10-09 14:44:24.903 - stdout> at org.apache.spark.

sbt.ForkMain$ForkError: org.scalatest.exceptions.TestFailedException: spark-submit returned with exit code 1.
Command line: './bin/spark-submit' '--name' 'prepare testing tables' '--master' 'local[2]' '--conf' 'spark.ui.enabled=false' '--conf' 'spark.master.rest.enabled=false' '--conf' 'spark.sql.warehouse.dir=/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6-ubuntu-test/target/tmp/warehouse-03bc9880-1483-4eaf-8bec-d05c991675f3' '--conf' 'spark.sql.test.version.index=2' '--driver-java-options' '-Dderby.system.home=/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6-ubuntu-test/target/tmp/warehouse-03bc9880-1483-4eaf-8bec-d05c991675f3' '/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6-ubuntu-test/target/tmp/test5580616031949812549.py'
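
The py4j and SparkSession.sql frames throughout the traces in this log show the test script driving Spark SQL from Python against a Hive-backed catalog. The script itself (test5580616031949812549.py) is not reproduced here, so the following is only a minimal PySpark sketch of the call path being exercised; the appName is taken from the command line above, while the SQL statement is a hypothetical placeholder:

    from pyspark.sql import SparkSession

    # With Hive support enabled, SparkSession.sql() goes through
    # HiveExternalCatalog, which lazily instantiates
    # SessionHiveMetaStoreClient -- the step that fails below with
    # "Unable to instantiate ...SessionHiveMetaStoreClient".
    spark = (SparkSession.builder
             .appName("prepare testing tables")
             .enableHiveSupport()
             .getOrCreate())

    # The first sql() call triggers analyzer -> externalCatalog ->
    # databaseExists, exactly the frames seen in the stack traces below.
    # The DDL text is a placeholder; the real script's SQL is not shown here.
    spark.sql("CREATE TABLE IF NOT EXISTS spark_test_table (id INT) USING hive")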

2018-10-09 14:44:20.513 - stdout> 2018-10-09 14:44:20 WARN  Utils:66 - Your hostname, amp-jenkins-staging-worker-01 resolves to a loopback address: 127.0.1.1; using 192.168.10.31 instead (on interface eno1)
2018-10-09 14:44:20.513 - stdout> 2018-10-09 14:44:20 WARN  Utils:66 - Set SPARK_LOCAL_IP if you need to bind to another address
2018-10-09 14:44:20.965 - stdout> 2018-10-09 14:44:20 WARN  NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2018-10-09 14:44:21.789 - stdout> 2018-10-09 14:44:21 INFO  SparkContext:54 - Running Spark version 2.3.2
2018-10-09 14:44:21.812 - stdout> 2018-10-09 14:44:21 INFO  SparkContext:54 - Submitted application: prepare testing tables
2018-10-09 14:44:21.867 - stdout> 2018-10-09 14:44:21 INFO  SecurityManager:54 - Changing view acls to: jenkins
2018-10-09 14:44:21.868 - stdout> 2018-10-09 14:44:21 INFO  SecurityManager:54 - Changing modify acls to: jenkins
2018-10-09 14:44:21.868 - stdout> 2018-10-09 14:44:21 INFO  SecurityManager:54 - Changing view acls groups to: 
2018-10-09 14:44:21.868 - stdout> 2018-10-09 14:44:21 INFO  SecurityManager:54 - Changing modify acls groups to: 
2018-10-09 14:44:21.868 - stdout> 2018-10-09 14:44:21 INFO  SecurityManager:54 - SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(jenkins); groups with view permissions: Set(); users  with modify permissions: Set(jenkins); groups with modify permissions: Set()
2018-10-09 14:44:22.167 - stdout> 2018-10-09 14:44:22 INFO  Utils:54 - Successfully started service 'sparkDriver' on port 43960.
2018-10-09 14:44:22.191 - stdout> 2018-10-09 14:44:22 INFO  SparkEnv:54 - Registering MapOutputTracker
2018-10-09 14:44:22.215 - stdout> 2018-10-09 14:44:22 INFO  SparkEnv:54 - Registering BlockManagerMaster
2018-10-09 14:44:22.219 - stdout> 2018-10-09 14:44:22 INFO  BlockManagerMasterEndpoint:54 - Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
2018-10-09 14:44:22.219 - stdout> 2018-10-09 14:44:22 INFO  BlockManagerMasterEndpoint:54 - BlockManagerMasterEndpoint up
2018-10-09 14:44:22.231 - stdout> 2018-10-09 14:44:22 INFO  DiskBlockManager:54 - Created local directory at /tmp/blockmgr-6a31b050-f5d0-462a-9dd5-ed27e7c3df4f
2018-10-09 14:44:22.254 - stdout> 2018-10-09 14:44:22 INFO  MemoryStore:54 - MemoryStore started with capacity 366.3 MB
2018-10-09 14:44:22.272 - stdout> 2018-10-09 14:44:22 INFO  SparkEnv:54 - Registering OutputCommitCoordinator
2018-10-09 14:44:22.577 - stdout> 2018-10-09 14:44:22 INFO  SparkContext:54 - Added file file:/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6-ubuntu-test/target/tmp/test5580616031949812549.py at file:/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6-ubuntu-test/target/tmp/test5580616031949812549.py with timestamp 1539121462576
2018-10-09 14:44:22.58 - stdout> 2018-10-09 14:44:22 INFO  Utils:54 - Copying /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6-ubuntu-test/target/tmp/test5580616031949812549.py to /tmp/spark-62525601-adb6-4342-a0a6-fb2705c42266/userFiles-1c2688ab-8697-4947-ada5-c5d93f20a910/test5580616031949812549.py
2018-10-09 14:44:22.649 - stdout> 2018-10-09 14:44:22 INFO  Executor:54 - Starting executor ID driver on host localhost
2018-10-09 14:44:22.666 - stdout> 2018-10-09 14:44:22 INFO  Utils:54 - Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 41574.
2018-10-09 14:44:22.667 - stdout> 2018-10-09 14:44:22 INFO  NettyBlockTransferService:54 - Server created on 192.168.10.31:41574
2018-10-09 14:44:22.668 - stdout> 2018-10-09 14:44:22 INFO  BlockManager:54 - Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
2018-10-09 14:44:22.698 - stdout> 2018-10-09 14:44:22 INFO  BlockManagerMaster:54 - Registering BlockManager BlockManagerId(driver, 192.168.10.31, 41574, None)
2018-10-09 14:44:22.702 - stdout> 2018-10-09 14:44:22 INFO  BlockManagerMasterEndpoint:54 - Registering block manager 192.168.10.31:41574 with 366.3 MB RAM, BlockManagerId(driver, 192.168.10.31, 41574, None)
2018-10-09 14:44:22.705 - stdout> 2018-10-09 14:44:22 INFO  BlockManagerMaster:54 - Registered BlockManager BlockManagerId(driver, 192.168.10.31, 41574, None)
2018-10-09 14:44:22.705 - stdout> 2018-10-09 14:44:22 INFO  BlockManager:54 - Initialized BlockManager: BlockManagerId(driver, 192.168.10.31, 41574, None)
2018-10-09 14:44:22.89 - stdout> 2018-10-09 14:44:22 INFO  log:192 - Logging initialized @3133ms
2018-10-09 14:44:23.105 - stdout> 2018-10-09 14:44:23 INFO  SharedState:54 - Setting hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir ('/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6-ubuntu-test/target/tmp/warehouse-03bc9880-1483-4eaf-8bec-d05c991675f3').
2018-10-09 14:44:23.105 - stdout> 2018-10-09 14:44:23 INFO  SharedState:54 - Warehouse path is '/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6-ubuntu-test/target/tmp/warehouse-03bc9880-1483-4eaf-8bec-d05c991675f3'.
2018-10-09 14:44:23.677 - stdout> 2018-10-09 14:44:23 INFO  StateStoreCoordinatorRef:54 - Registered StateStoreCoordinator endpoint
2018-10-09 14:44:24.072 - stdout> 2018-10-09 14:44:24 INFO  HiveUtils:54 - Initializing HiveMetastoreConnection version 1.2.1 using Spark classes.
2018-10-09 14:44:24.753 - stdout> 2018-10-09 14:44:24 INFO  HiveMetaStore:589 - 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
2018-10-09 14:44:24.779 - stdout> 2018-10-09 14:44:24 INFO  ObjectStore:289 - ObjectStore, initialize called
2018-10-09 14:44:24.805 - stdout> 2018-10-09 14:44:24 ERROR General:115 - ResourceBundle org.datanucleus.api.jdo.Localisation for locale en_US was not found!
2018-10-09 14:44:24.82 - stdout> 2018-10-09 14:44:24 ERROR General:115 - ResourceBundle org.datanucleus.Localisation for locale en_US was not found!
2018-10-09 14:44:24.836 - stdout> 2018-10-09 14:44:24 WARN  HiveMetaStore:622 - Retrying creating default database after error: Unexpected exception caught.
2018-10-09 14:44:24.836 - stdout> javax.jdo.JDOFatalInternalException: Unexpected exception caught.
2018-10-09 14:44:24.836 - stdout> 	at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1193)
2018-10-09 14:44:24.836 - stdout> 	at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
2018-10-09 14:44:24.836 - stdout> 	at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
2018-10-09 14:44:24.836 - stdout> 	at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:365)
2018-10-09 14:44:24.836 - stdout> 	at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:394)
2018-10-09 14:44:24.836 - stdout> 	at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:291)
2018-10-09 14:44:24.836 - stdout> 	at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)
2018-10-09 14:44:24.836 - stdout> 	at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
2018-10-09 14:44:24.836 - stdout> 	at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
2018-10-09 14:44:24.836 - stdout> 	at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)
2018-10-09 14:44:24.836 - stdout> 	at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)
2018-10-09 14:44:24.836 - stdout> 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593)
2018-10-09 14:44:24.836 - stdout> 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571)
2018-10-09 14:44:24.836 - stdout> 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:620)
2018-10-09 14:44:24.836 - stdout> 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
2018-10-09 14:44:24.836 - stdout> 	at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
2018-10-09 14:44:24.836 - stdout> 	at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
2018-10-09 14:44:24.836 - stdout> 	at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
2018-10-09 14:44:24.836 - stdout> 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
2018-10-09 14:44:24.836 - stdout> 	at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
2018-10-09 14:44:24.836 - stdout> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
2018-10-09 14:44:24.836 - stdout> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
2018-10-09 14:44:24.836 - stdout> 	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
2018-10-09 14:44:24.836 - stdout> 	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
2018-10-09 14:44:24.836 - stdout> 	at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
2018-10-09 14:44:24.836 - stdout> 	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
2018-10-09 14:44:24.836 - stdout> 	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
2018-10-09 14:44:24.836 - stdout> 	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
2018-10-09 14:44:24.836 - stdout> 	at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
2018-10-09 14:44:24.836 - stdout> 	at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
2018-10-09 14:44:24.836 - stdout> 	at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1234)
2018-10-09 14:44:24.836 - stdout> 	at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:174)
2018-10-09 14:44:24.836 - stdout> 	at org.apache.hadoop.hive.ql.metadata.Hive.<clinit>(Hive.java:166)
2018-10-09 14:44:24.836 - stdout> 	at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
2018-10-09 14:44:24.836 - stdout> 	at org.apache.spark.sql.hive.client.HiveClientImpl.newState(HiveClientImpl.scala:180)
2018-10-09 14:44:24.836 - stdout> 	at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:114)
2018-10-09 14:44:24.836 - stdout> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
2018-10-09 14:44:24.836 - stdout> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
2018-10-09 14:44:24.836 - stdout> 	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
2018-10-09 14:44:24.836 - stdout> 	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
2018-10-09 14:44:24.836 - stdout> 	at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
2018-10-09 14:44:24.836 - stdout> 	at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:385)
2018-10-09 14:44:24.836 - stdout> 	at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:287)
2018-10-09 14:44:24.836 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:66)
2018-10-09 14:44:24.836 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:65)
2018-10-09 14:44:24.836 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply$mcZ$sp(HiveExternalCatalog.scala:195)
2018-10-09 14:44:24.836 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:195)
2018-10-09 14:44:24.836 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:195)
2018-10-09 14:44:24.836 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:97)
2018-10-09 14:44:24.836 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:194)
2018-10-09 14:44:24.836 - stdout> 	at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:114)
2018-10-09 14:44:24.836 - stdout> 	at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:102)
2018-10-09 14:44:24.836 - stdout> 	at org.apache.spark.sql.hive.HiveSessionStateBuilder.externalCatalog(HiveSessionStateBuilder.scala:39)
2018-10-09 14:44:24.836 - stdout> 	at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog$lzycompute(HiveSessionStateBuilder.scala:54)
2018-10-09 14:44:24.836 - stdout> 	at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog(HiveSessionStateBuilder.scala:52)
2018-10-09 14:44:24.836 - stdout> 	at org.apache.spark.sql.hive.HiveSessionStateBuilder$$anon$1.<init>(HiveSessionStateBuilder.scala:69)
2018-10-09 14:44:24.836 - stdout> 	at org.apache.spark.sql.hive.HiveSessionStateBuilder.analyzer(HiveSessionStateBuilder.scala:69)
2018-10-09 14:44:24.836 - stdout> 	at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anonfun$build$2.apply(BaseSessionStateBuilder.scala:293)
2018-10-09 14:44:24.836 - stdout> 	at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anonfun$build$2.apply(BaseSessionStateBuilder.scala:293)
2018-10-09 14:44:24.837 - stdout> 	at org.apache.spark.sql.internal.SessionState.analyzer$lzycompute(SessionState.scala:79)
2018-10-09 14:44:24.837 - stdout> 	at org.apache.spark.sql.internal.SessionState.analyzer(SessionState.scala:79)
2018-10-09 14:44:24.837 - stdout> 	at org.apache.spark.sql.execution.QueryExecution.analyzed$lzycompute(QueryExecution.scala:57)
2018-10-09 14:44:24.837 - stdout> 	at org.apache.spark.sql.execution.QueryExecution.analyzed(QueryExecution.scala:55)
2018-10-09 14:44:24.837 - stdout> 	at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:47)
2018-10-09 14:44:24.837 - stdout> 	at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:74)
2018-10-09 14:44:24.837 - stdout> 	at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:642)
2018-10-09 14:44:24.837 - stdout> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
2018-10-09 14:44:24.837 - stdout> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
2018-10-09 14:44:24.837 - stdout> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2018-10-09 14:44:24.837 - stdout> 	at java.lang.reflect.Method.invoke(Method.java:498)
2018-10-09 14:44:24.837 - stdout> 	at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
2018-10-09 14:44:24.837 - stdout> 	at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
2018-10-09 14:44:24.837 - stdout> 	at py4j.Gateway.invoke(Gateway.java:282)
2018-10-09 14:44:24.837 - stdout> 	at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
2018-10-09 14:44:24.837 - stdout> 	at py4j.commands.CallCommand.execute(CallCommand.java:79)
2018-10-09 14:44:24.837 - stdout> 	at py4j.GatewayConnection.run(GatewayConnection.java:238)
2018-10-09 14:44:24.837 - stdout> 	at java.lang.Thread.run(Thread.java:748)
2018-10-09 14:44:24.837 - stdout> NestedThrowablesStackTrace:
2018-10-09 14:44:24.837 - stdout> java.lang.reflect.InvocationTargetException
2018-10-09 14:44:24.837 - stdout> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
2018-10-09 14:44:24.837 - stdout> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
2018-10-09 14:44:24.837 - stdout> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2018-10-09 14:44:24.837 - stdout> 	at java.lang.reflect.Method.invoke(Method.java:498)
2018-10-09 14:44:24.837 - stdout> 	at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
2018-10-09 14:44:24.837 - stdout> 	at java.security.AccessController.doPrivileged(Native Method)
2018-10-09 14:44:24.837 - stdout> 	at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
2018-10-09 14:44:24.837 - stdout> 	at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
2018-10-09 14:44:24.837 - stdout> 	at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
2018-10-09 14:44:24.837 - stdout> 	at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
2018-10-09 14:44:24.837 - stdout> 	at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:365)
2018-10-09 14:44:24.837 - stdout> 	at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:394)
2018-10-09 14:44:24.837 - stdout> 	at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:291)
2018-10-09 14:44:24.837 - stdout> 	at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)
2018-10-09 14:44:24.837 - stdout> 	at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
2018-10-09 14:44:24.837 - stdout> 	at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
2018-10-09 14:44:24.837 - stdout> 	at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)
2018-10-09 14:44:24.837 - stdout> 	at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)
2018-10-09 14:44:24.837 - stdout> 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593)
2018-10-09 14:44:24.837 - stdout> 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571)
2018-10-09 14:44:24.837 - stdout> 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:620)
2018-10-09 14:44:24.837 - stdout> 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
2018-10-09 14:44:24.837 - stdout> 	at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
2018-10-09 14:44:24.837 - stdout> 	at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
2018-10-09 14:44:24.837 - stdout> 	at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
2018-10-09 14:44:24.837 - stdout> 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
2018-10-09 14:44:24.837 - stdout> 	at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
2018-10-09 14:44:24.837 - stdout> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
2018-10-09 14:44:24.837 - stdout> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
2018-10-09 14:44:24.837 - stdout> 	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
2018-10-09 14:44:24.837 - stdout> 	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
2018-10-09 14:44:24.837 - stdout> 	at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
2018-10-09 14:44:24.837 - stdout> 	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
2018-10-09 14:44:24.837 - stdout> 	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
2018-10-09 14:44:24.837 - stdout> 	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
2018-10-09 14:44:24.837 - stdout> 	at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
2018-10-09 14:44:24.837 - stdout> 	at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
2018-10-09 14:44:24.837 - stdout> 	at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1234)
2018-10-09 14:44:24.837 - stdout> 	at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:174)
2018-10-09 14:44:24.837 - stdout> 	at org.apache.hadoop.hive.ql.metadata.Hive.<clinit>(Hive.java:166)
2018-10-09 14:44:24.837 - stdout> 	at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
2018-10-09 14:44:24.837 - stdout> 	at org.apache.spark.sql.hive.client.HiveClientImpl.newState(HiveClientImpl.scala:180)
2018-10-09 14:44:24.837 - stdout> 	at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:114)
2018-10-09 14:44:24.837 - stdout> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
2018-10-09 14:44:24.837 - stdout> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
2018-10-09 14:44:24.837 - stdout> 	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
2018-10-09 14:44:24.837 - stdout> 	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
2018-10-09 14:44:24.837 - stdout> 	at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
2018-10-09 14:44:24.837 - stdout> 	at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:385)
2018-10-09 14:44:24.837 - stdout> 	at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:287)
2018-10-09 14:44:24.837 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:66)
2018-10-09 14:44:24.837 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:65)
2018-10-09 14:44:24.837 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply$mcZ$sp(HiveExternalCatalog.scala:195)
2018-10-09 14:44:24.837 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:195)
2018-10-09 14:44:24.837 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:195)
2018-10-09 14:44:24.837 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:97)
2018-10-09 14:44:24.837 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:194)
2018-10-09 14:44:24.838 - stdout> 	at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:114)
2018-10-09 14:44:24.838 - stdout> 	at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:102)
2018-10-09 14:44:24.838 - stdout> 	at org.apache.spark.sql.hive.HiveSessionStateBuilder.externalCatalog(HiveSessionStateBuilder.scala:39)
2018-10-09 14:44:24.838 - stdout> 	at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog$lzycompute(HiveSessionStateBuilder.scala:54)
2018-10-09 14:44:24.838 - stdout> 	at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog(HiveSessionStateBuilder.scala:52)
2018-10-09 14:44:24.838 - stdout> 	at org.apache.spark.sql.hive.HiveSessionStateBuilder$$anon$1.<init>(HiveSessionStateBuilder.scala:69)
2018-10-09 14:44:24.838 - stdout> 	at org.apache.spark.sql.hive.HiveSessionStateBuilder.analyzer(HiveSessionStateBuilder.scala:69)
2018-10-09 14:44:24.838 - stdout> 	at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anonfun$build$2.apply(BaseSessionStateBuilder.scala:293)
2018-10-09 14:44:24.838 - stdout> 	at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anonfun$build$2.apply(BaseSessionStateBuilder.scala:293)
2018-10-09 14:44:24.838 - stdout> 	at org.apache.spark.sql.internal.SessionState.analyzer$lzycompute(SessionState.scala:79)
2018-10-09 14:44:24.838 - stdout> 	at org.apache.spark.sql.internal.SessionState.analyzer(SessionState.scala:79)
2018-10-09 14:44:24.838 - stdout> 	at org.apache.spark.sql.execution.QueryExecution.analyzed$lzycompute(QueryExecution.scala:57)
2018-10-09 14:44:24.838 - stdout> 	at org.apache.spark.sql.execution.QueryExecution.analyzed(QueryExecution.scala:55)
2018-10-09 14:44:24.838 - stdout> 	at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:47)
2018-10-09 14:44:24.838 - stdout> 	at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:74)
2018-10-09 14:44:24.838 - stdout> 	at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:642)
2018-10-09 14:44:24.838 - stdout> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
2018-10-09 14:44:24.838 - stdout> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
2018-10-09 14:44:24.838 - stdout> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2018-10-09 14:44:24.838 - stdout> 	at java.lang.reflect.Method.invoke(Method.java:498)
2018-10-09 14:44:24.838 - stdout> 	at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
2018-10-09 14:44:24.838 - stdout> 	at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
2018-10-09 14:44:24.838 - stdout> 	at py4j.Gateway.invoke(Gateway.java:282)
2018-10-09 14:44:24.838 - stdout> 	at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
2018-10-09 14:44:24.838 - stdout> 	at py4j.commands.CallCommand.execute(CallCommand.java:79)
2018-10-09 14:44:24.838 - stdout> 	at py4j.GatewayConnection.run(GatewayConnection.java:238)
2018-10-09 14:44:24.838 - stdout> 	at java.lang.Thread.run(Thread.java:748)
2018-10-09 14:44:24.838 - stdout> Caused by: java.lang.NullPointerException
2018-10-09 14:44:24.838 - stdout> 	at org.datanucleus.util.Localiser.getMessage(Localiser.java:359)
2018-10-09 14:44:24.838 - stdout> 	at org.datanucleus.util.Localiser.msg(Localiser.java:176)
2018-10-09 14:44:24.838 - stdout> 	at org.datanucleus.util.Localiser.msg(Localiser.java:259)
2018-10-09 14:44:24.838 - stdout> 	at org.datanucleus.plugin.NonManagedPluginRegistry.registerBundle(NonManagedPluginRegistry.java:435)
2018-10-09 14:44:24.838 - stdout> 	at org.datanucleus.plugin.NonManagedPluginRegistry.registerExtensions(NonManagedPluginRegistry.java:219)
2018-10-09 14:44:24.838 - stdout> 	at org.datanucleus.plugin.NonManagedPluginRegistry.registerExtensionPoints(NonManagedPluginRegistry.java:160)
2018-10-09 14:44:24.838 - stdout> 	at org.datanucleus.plugin.PluginManager.<init>(PluginManager.java:65)
2018-10-09 14:44:24.838 - stdout> 	at org.datanucleus.plugin.PluginManager.createPluginManager(PluginManager.java:427)
2018-10-09 14:44:24.838 - stdout> 	at org.datanucleus.NucleusContext.<init>(NucleusContext.java:266)
2018-10-09 14:44:24.838 - stdout> 	at org.datanucleus.NucleusContext.<init>(NucleusContext.java:247)
2018-10-09 14:44:24.838 - stdout> 	at org.datanucleus.NucleusContext.<init>(NucleusContext.java:225)
2018-10-09 14:44:24.838 - stdout> 	at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.<init>(JDOPersistenceManagerFactory.java:416)
2018-10-09 14:44:24.838 - stdout> 	at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:301)
2018-10-09 14:44:24.838 - stdout> 	at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
2018-10-09 14:44:24.838 - stdout> 	... 84 more
2018-10-09 14:44:24.838 - stdout> 2018-10-09 14:44:24 INFO  HiveMetaStore:589 - 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
2018-10-09 14:44:24.838 - stdout> 2018-10-09 14:44:24 INFO  ObjectStore:289 - ObjectStore, initialize called
2018-10-09 14:44:24.853 - stdout> 2018-10-09 14:44:24 WARN  Hive:168 - Failed to access metastore. This class should not accessed in runtime.
2018-10-09 14:44:24.853 - stdout> org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
2018-10-09 14:44:24.853 - stdout> 	at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1236)
2018-10-09 14:44:24.853 - stdout> 	at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:174)
2018-10-09 14:44:24.853 - stdout> 	at org.apache.hadoop.hive.ql.metadata.Hive.<clinit>(Hive.java:166)
2018-10-09 14:44:24.853 - stdout> 	at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
2018-10-09 14:44:24.853 - stdout> 	at org.apache.spark.sql.hive.client.HiveClientImpl.newState(HiveClientImpl.scala:180)
2018-10-09 14:44:24.853 - stdout> 	at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:114)
2018-10-09 14:44:24.853 - stdout> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
2018-10-09 14:44:24.853 - stdout> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
2018-10-09 14:44:24.853 - stdout> 	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
2018-10-09 14:44:24.853 - stdout> 	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
2018-10-09 14:44:24.853 - stdout> 	at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
2018-10-09 14:44:24.853 - stdout> 	at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:385)
2018-10-09 14:44:24.853 - stdout> 	at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:287)
2018-10-09 14:44:24.853 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:66)
2018-10-09 14:44:24.853 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:65)
2018-10-09 14:44:24.853 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply$mcZ$sp(HiveExternalCatalog.scala:195)
2018-10-09 14:44:24.853 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:195)
2018-10-09 14:44:24.853 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:195)
2018-10-09 14:44:24.853 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:97)
2018-10-09 14:44:24.853 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:194)
2018-10-09 14:44:24.853 - stdout> 	at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:114)
2018-10-09 14:44:24.853 - stdout> 	at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:102)
2018-10-09 14:44:24.853 - stdout> 	at org.apache.spark.sql.hive.HiveSessionStateBuilder.externalCatalog(HiveSessionStateBuilder.scala:39)
2018-10-09 14:44:24.853 - stdout> 	at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog$lzycompute(HiveSessionStateBuilder.scala:54)
2018-10-09 14:44:24.853 - stdout> 	at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog(HiveSessionStateBuilder.scala:52)
2018-10-09 14:44:24.853 - stdout> 	at org.apache.spark.sql.hive.HiveSessionStateBuilder$$anon$1.<init>(HiveSessionStateBuilder.scala:69)
2018-10-09 14:44:24.853 - stdout> 	at org.apache.spark.sql.hive.HiveSessionStateBuilder.analyzer(HiveSessionStateBuilder.scala:69)
2018-10-09 14:44:24.853 - stdout> 	at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anonfun$build$2.apply(BaseSessionStateBuilder.scala:293)
2018-10-09 14:44:24.853 - stdout> 	at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anonfun$build$2.apply(BaseSessionStateBuilder.scala:293)
2018-10-09 14:44:24.853 - stdout> 	at org.apache.spark.sql.internal.SessionState.analyzer$lzycompute(SessionState.scala:79)
2018-10-09 14:44:24.853 - stdout> 	at org.apache.spark.sql.internal.SessionState.analyzer(SessionState.scala:79)
2018-10-09 14:44:24.853 - stdout> 	at org.apache.spark.sql.execution.QueryExecution.analyzed$lzycompute(QueryExecution.scala:57)
2018-10-09 14:44:24.853 - stdout> 	at org.apache.spark.sql.execution.QueryExecution.analyzed(QueryExecution.scala:55)
2018-10-09 14:44:24.853 - stdout> 	at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:47)
2018-10-09 14:44:24.853 - stdout> 	at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:74)
2018-10-09 14:44:24.853 - stdout> 	at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:642)
2018-10-09 14:44:24.853 - stdout> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
2018-10-09 14:44:24.853 - stdout> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
2018-10-09 14:44:24.853 - stdout> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2018-10-09 14:44:24.853 - stdout> 	at java.lang.reflect.Method.invoke(Method.java:498)
2018-10-09 14:44:24.853 - stdout> 	at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
2018-10-09 14:44:24.853 - stdout> 	at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
2018-10-09 14:44:24.853 - stdout> 	at py4j.Gateway.invoke(Gateway.java:282)
2018-10-09 14:44:24.853 - stdout> 	at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
2018-10-09 14:44:24.853 - stdout> 	at py4j.commands.CallCommand.execute(CallCommand.java:79)
2018-10-09 14:44:24.853 - stdout> 	at py4j.GatewayConnection.run(GatewayConnection.java:238)
2018-10-09 14:44:24.853 - stdout> 	at java.lang.Thread.run(Thread.java:748)
2018-10-09 14:44:24.854 - stdout> Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
2018-10-09 14:44:24.854 - stdout> 	at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1523)
2018-10-09 14:44:24.854 - stdout> 	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
2018-10-09 14:44:24.854 - stdout> 	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
2018-10-09 14:44:24.854 - stdout> 	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
2018-10-09 14:44:24.854 - stdout> 	at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
2018-10-09 14:44:24.854 - stdout> 	at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
2018-10-09 14:44:24.854 - stdout> 	at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1234)
2018-10-09 14:44:24.854 - stdout> 	... 46 more
2018-10-09 14:44:24.854 - stdout> Caused by: java.lang.reflect.InvocationTargetException
2018-10-09 14:44:24.854 - stdout> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
2018-10-09 14:44:24.854 - stdout> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
2018-10-09 14:44:24.854 - stdout> 	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
2018-10-09 14:44:24.854 - stdout> 	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
2018-10-09 14:44:24.854 - stdout> 	at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
2018-10-09 14:44:24.854 - stdout> 	... 52 more
2018-10-09 14:44:24.854 - stdout> Caused by: javax.jdo.JDOFatalInternalException: Unexpected exception caught.
2018-10-09 14:44:24.854 - stdout> NestedThrowables:
2018-10-09 14:44:24.854 - stdout> java.lang.reflect.InvocationTargetException
2018-10-09 14:44:24.854 - stdout> 	at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1193)
2018-10-09 14:44:24.854 - stdout> 	at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
2018-10-09 14:44:24.854 - stdout> 	at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
2018-10-09 14:44:24.854 - stdout> 	at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:365)
2018-10-09 14:44:24.854 - stdout> 	at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:394)
2018-10-09 14:44:24.854 - stdout> 	at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:291)
2018-10-09 14:44:24.854 - stdout> 	at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)
2018-10-09 14:44:24.854 - stdout> 	at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
2018-10-09 14:44:24.854 - stdout> 	at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
2018-10-09 14:44:24.854 - stdout> 	at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)
2018-10-09 14:44:24.854 - stdout> 	at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)
2018-10-09 14:44:24.854 - stdout> 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593)
2018-10-09 14:44:24.854 - stdout> 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571)
2018-10-09 14:44:24.854 - stdout> 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:624)
2018-10-09 14:44:24.854 - stdout> 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
2018-10-09 14:44:24.854 - stdout> 	at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
2018-10-09 14:44:24.854 - stdout> 	at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
2018-10-09 14:44:24.854 - stdout> 	at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
2018-10-09 14:44:24.854 - stdout> 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
2018-10-09 14:44:24.854 - stdout> 	at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
2018-10-09 14:44:24.854 - stdout> 	... 57 more
2018-10-09 14:44:24.854 - stdout> Caused by: java.lang.reflect.InvocationTargetException
2018-10-09 14:44:24.854 - stdout> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
2018-10-09 14:44:24.854 - stdout> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
2018-10-09 14:44:24.854 - stdout> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2018-10-09 14:44:24.854 - stdout> 	at java.lang.reflect.Method.invoke(Method.java:498)
2018-10-09 14:44:24.854 - stdout> 	at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
2018-10-09 14:44:24.854 - stdout> 	at java.security.AccessController.doPrivileged(Native Method)
2018-10-09 14:44:24.854 - stdout> 	at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
2018-10-09 14:44:24.854 - stdout> 	at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
2018-10-09 14:44:24.854 - stdout> 	... 76 more
2018-10-09 14:44:24.854 - stdout> Caused by: java.lang.NullPointerException
2018-10-09 14:44:24.854 - stdout> 	at org.datanucleus.util.Localiser.getMessage(Localiser.java:359)
2018-10-09 14:44:24.854 - stdout> 	at org.datanucleus.util.Localiser.msg(Localiser.java:176)
2018-10-09 14:44:24.854 - stdout> 	at org.datanucleus.util.Localiser.msg(Localiser.java:259)
2018-10-09 14:44:24.854 - stdout> 	at org.datanucleus.plugin.NonManagedPluginRegistry.registerBundle(NonManagedPluginRegistry.java:435)
2018-10-09 14:44:24.854 - stdout> 	at org.datanucleus.plugin.NonManagedPluginRegistry.registerExtensions(NonManagedPluginRegistry.java:219)
2018-10-09 14:44:24.854 - stdout> 	at org.datanucleus.plugin.NonManagedPluginRegistry.registerExtensionPoints(NonManagedPluginRegistry.java:160)
2018-10-09 14:44:24.854 - stdout> 	at org.datanucleus.plugin.PluginManager.<init>(PluginManager.java:65)
2018-10-09 14:44:24.854 - stdout> 	at org.datanucleus.plugin.PluginManager.createPluginManager(PluginManager.java:427)
2018-10-09 14:44:24.854 - stdout> 	at org.datanucleus.NucleusContext.<init>(NucleusContext.java:266)
2018-10-09 14:44:24.854 - stdout> 	at org.datanucleus.NucleusContext.<init>(NucleusContext.java:247)
2018-10-09 14:44:24.854 - stdout> 	at org.datanucleus.NucleusContext.<init>(NucleusContext.java:225)
2018-10-09 14:44:24.854 - stdout> 	at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.<init>(JDOPersistenceManagerFactory.java:416)
2018-10-09 14:44:24.854 - stdout> 	at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:301)
2018-10-09 14:44:24.854 - stdout> 	at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
2018-10-09 14:44:24.854 - stdout> 	... 84 more
2018-10-09 14:44:24.854 - stdout> 2018-10-09 14:44:24 INFO  HiveMetaStore:589 - 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
2018-10-09 14:44:24.855 - stdout> 2018-10-09 14:44:24 INFO  ObjectStore:289 - ObjectStore, initialize called
2018-10-09 14:44:24.864 - stdout> 2018-10-09 14:44:24 WARN  HiveMetaStore:622 - Retrying creating default database after error: Unexpected exception caught.
2018-10-09 14:44:24.864 - stdout> javax.jdo.JDOFatalInternalException: Unexpected exception caught.
2018-10-09 14:44:24.864 - stdout> 	at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1193)
2018-10-09 14:44:24.864 - stdout> 	at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
2018-10-09 14:44:24.864 - stdout> 	at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
2018-10-09 14:44:24.864 - stdout> 	at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:365)
2018-10-09 14:44:24.864 - stdout> 	at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:394)
2018-10-09 14:44:24.864 - stdout> 	at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:291)
2018-10-09 14:44:24.864 - stdout> 	at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)
2018-10-09 14:44:24.865 - stdout> 	at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
2018-10-09 14:44:24.865 - stdout> 	at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
2018-10-09 14:44:24.865 - stdout> 	at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)
2018-10-09 14:44:24.865 - stdout> 	at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)
2018-10-09 14:44:24.865 - stdout> 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593)
2018-10-09 14:44:24.865 - stdout> 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571)
2018-10-09 14:44:24.865 - stdout> 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:620)
2018-10-09 14:44:24.865 - stdout> 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
2018-10-09 14:44:24.865 - stdout> 	at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
2018-10-09 14:44:24.865 - stdout> 	at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
2018-10-09 14:44:24.865 - stdout> 	at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
2018-10-09 14:44:24.865 - stdout> 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
2018-10-09 14:44:24.865 - stdout> 	at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
2018-10-09 14:44:24.865 - stdout> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
2018-10-09 14:44:24.865 - stdout> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
2018-10-09 14:44:24.865 - stdout> 	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
2018-10-09 14:44:24.865 - stdout> 	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
2018-10-09 14:44:24.865 - stdout> 	at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
2018-10-09 14:44:24.865 - stdout> 	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
2018-10-09 14:44:24.865 - stdout> 	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
2018-10-09 14:44:24.865 - stdout> 	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
2018-10-09 14:44:24.865 - stdout> 	at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
2018-10-09 14:44:24.865 - stdout> 	at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
2018-10-09 14:44:24.865 - stdout> 	at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
2018-10-09 14:44:24.865 - stdout> 	at org.apache.spark.sql.hive.client.HiveClientImpl.newState(HiveClientImpl.scala:180)
2018-10-09 14:44:24.865 - stdout> 	at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:114)
2018-10-09 14:44:24.865 - stdout> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
2018-10-09 14:44:24.865 - stdout> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
2018-10-09 14:44:24.865 - stdout> 	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
2018-10-09 14:44:24.865 - stdout> 	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
2018-10-09 14:44:24.865 - stdout> 	at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
2018-10-09 14:44:24.865 - stdout> 	at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:385)
2018-10-09 14:44:24.865 - stdout> 	at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:287)
2018-10-09 14:44:24.865 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:66)
2018-10-09 14:44:24.865 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:65)
2018-10-09 14:44:24.865 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply$mcZ$sp(HiveExternalCatalog.scala:195)
2018-10-09 14:44:24.865 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:195)
2018-10-09 14:44:24.865 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:195)
2018-10-09 14:44:24.865 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:97)
2018-10-09 14:44:24.865 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:194)
2018-10-09 14:44:24.865 - stdout> 	at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:114)
2018-10-09 14:44:24.865 - stdout> 	at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:102)
2018-10-09 14:44:24.865 - stdout> 	at org.apache.spark.sql.hive.HiveSessionStateBuilder.externalCatalog(HiveSessionStateBuilder.scala:39)
2018-10-09 14:44:24.865 - stdout> 	at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog$lzycompute(HiveSessionStateBuilder.scala:54)
2018-10-09 14:44:24.865 - stdout> 	at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog(HiveSessionStateBuilder.scala:52)
2018-10-09 14:44:24.865 - stdout> 	at org.apache.spark.sql.hive.HiveSessionStateBuilder$$anon$1.<init>(HiveSessionStateBuilder.scala:69)
2018-10-09 14:44:24.865 - stdout> 	at org.apache.spark.sql.hive.HiveSessionStateBuilder.analyzer(HiveSessionStateBuilder.scala:69)
2018-10-09 14:44:24.865 - stdout> 	at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anonfun$build$2.apply(BaseSessionStateBuilder.scala:293)
2018-10-09 14:44:24.865 - stdout> 	at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anonfun$build$2.apply(BaseSessionStateBuilder.scala:293)
2018-10-09 14:44:24.865 - stdout> 	at org.apache.spark.sql.internal.SessionState.analyzer$lzycompute(SessionState.scala:79)
2018-10-09 14:44:24.865 - stdout> 	at org.apache.spark.sql.internal.SessionState.analyzer(SessionState.scala:79)
2018-10-09 14:44:24.865 - stdout> 	at org.apache.spark.sql.execution.QueryExecution.analyzed$lzycompute(QueryExecution.scala:57)
2018-10-09 14:44:24.865 - stdout> 	at org.apache.spark.sql.execution.QueryExecution.analyzed(QueryExecution.scala:55)
2018-10-09 14:44:24.865 - stdout> 	at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:47)
2018-10-09 14:44:24.865 - stdout> 	at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:74)
2018-10-09 14:44:24.865 - stdout> 	at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:642)
2018-10-09 14:44:24.865 - stdout> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
2018-10-09 14:44:24.865 - stdout> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
2018-10-09 14:44:24.865 - stdout> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2018-10-09 14:44:24.865 - stdout> 	at java.lang.reflect.Method.invoke(Method.java:498)
2018-10-09 14:44:24.865 - stdout> 	at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
2018-10-09 14:44:24.865 - stdout> 	at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
2018-10-09 14:44:24.865 - stdout> 	at py4j.Gateway.invoke(Gateway.java:282)
2018-10-09 14:44:24.865 - stdout> 	at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
2018-10-09 14:44:24.866 - stdout> 	at py4j.commands.CallCommand.execute(CallCommand.java:79)
2018-10-09 14:44:24.866 - stdout> 	at py4j.GatewayConnection.run(GatewayConnection.java:238)
2018-10-09 14:44:24.866 - stdout> 	at java.lang.Thread.run(Thread.java:748)
2018-10-09 14:44:24.866 - stdout> NestedThrowablesStackTrace:
2018-10-09 14:44:24.866 - stdout> java.lang.reflect.InvocationTargetException
2018-10-09 14:44:24.866 - stdout> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
2018-10-09 14:44:24.866 - stdout> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
2018-10-09 14:44:24.866 - stdout> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2018-10-09 14:44:24.866 - stdout> 	at java.lang.reflect.Method.invoke(Method.java:498)
2018-10-09 14:44:24.866 - stdout> 	at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
2018-10-09 14:44:24.866 - stdout> 	at java.security.AccessController.doPrivileged(Native Method)
2018-10-09 14:44:24.866 - stdout> 	at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
2018-10-09 14:44:24.866 - stdout> 	at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
2018-10-09 14:44:24.866 - stdout> 	at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
2018-10-09 14:44:24.866 - stdout> 	at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
2018-10-09 14:44:24.866 - stdout> 	at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:365)
2018-10-09 14:44:24.866 - stdout> 	at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:394)
2018-10-09 14:44:24.866 - stdout> 	at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:291)
2018-10-09 14:44:24.866 - stdout> 	at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)
2018-10-09 14:44:24.866 - stdout> 	at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
2018-10-09 14:44:24.866 - stdout> 	at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
2018-10-09 14:44:24.866 - stdout> 	at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:57)
2018-10-09 14:44:24.866 - stdout> 	at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:66)
2018-10-09 14:44:24.866 - stdout> 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:593)
2018-10-09 14:44:24.866 - stdout> 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:571)
2018-10-09 14:44:24.866 - stdout> 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:620)
2018-10-09 14:44:24.866 - stdout> 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
2018-10-09 14:44:24.866 - stdout> 	at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
2018-10-09 14:44:24.866 - stdout> 	at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
2018-10-09 14:44:24.866 - stdout> 	at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
2018-10-09 14:44:24.866 - stdout> 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:199)
2018-10-09 14:44:24.866 - stdout> 	at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
2018-10-09 14:44:24.866 - stdout> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
2018-10-09 14:44:24.866 - stdout> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
2018-10-09 14:44:24.866 - stdout> 	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
2018-10-09 14:44:24.866 - stdout> 	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
2018-10-09 14:44:24.866 - stdout> 	at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
2018-10-09 14:44:24.866 - stdout> 	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
2018-10-09 14:44:24.866 - stdout> 	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
2018-10-09 14:44:24.866 - stdout> 	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
2018-10-09 14:44:24.866 - stdout> 	at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
2018-10-09 14:44:24.866 - stdout> 	at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
2018-10-09 14:44:24.866 - stdout> 	at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
2018-10-09 14:44:24.866 - stdout> 	at org.apache.spark.sql.hive.client.HiveClientImpl.newState(HiveClientImpl.scala:180)
2018-10-09 14:44:24.866 - stdout> 	at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:114)
2018-10-09 14:44:24.866 - stdout> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
2018-10-09 14:44:24.866 - stdout> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
2018-10-09 14:44:24.866 - stdout> 	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
2018-10-09 14:44:24.866 - stdout> 	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
2018-10-09 14:44:24.866 - stdout> 	at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
2018-10-09 14:44:24.866 - stdout> 	at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:385)
2018-10-09 14:44:24.866 - stdout> 	at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:287)
2018-10-09 14:44:24.866 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:66)
2018-10-09 14:44:24.866 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:65)
2018-10-09 14:44:24.866 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply$mcZ$sp(HiveExternalCatalog.scala:195)
2018-10-09 14:44:24.866 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:195)
2018-10-09 14:44:24.866 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:195)
2018-10-09 14:44:24.866 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:97)
2018-10-09 14:44:24.866 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:194)
2018-10-09 14:44:24.866 - stdout> 	at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:114)
2018-10-09 14:44:24.866 - stdout> 	at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:102)
2018-10-09 14:44:24.867 - stdout> 	at org.apache.spark.sql.hive.HiveSessionStateBuilder.externalCatalog(HiveSessionStateBuilder.scala:39)
2018-10-09 14:44:24.867 - stdout> 	at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog$lzycompute(HiveSessionStateBuilder.scala:54)
2018-10-09 14:44:24.867 - stdout> 	at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog(HiveSessionStateBuilder.scala:52)
2018-10-09 14:44:24.867 - stdout> 	at org.apache.spark.sql.hive.HiveSessionStateBuilder$$anon$1.<init>(HiveSessionStateBuilder.scala:69)
2018-10-09 14:44:24.867 - stdout> 	at org.apache.spark.sql.hive.HiveSessionStateBuilder.analyzer(HiveSessionStateBuilder.scala:69)
2018-10-09 14:44:24.867 - stdout> 	at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anonfun$build$2.apply(BaseSessionStateBuilder.scala:293)
2018-10-09 14:44:24.867 - stdout> 	at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anonfun$build$2.apply(BaseSessionStateBuilder.scala:293)
2018-10-09 14:44:24.867 - stdout> 	at org.apache.spark.sql.internal.SessionState.analyzer$lzycompute(SessionState.scala:79)
2018-10-09 14:44:24.867 - stdout> 	at org.apache.spark.sql.internal.SessionState.analyzer(SessionState.scala:79)
2018-10-09 14:44:24.867 - stdout> 	at org.apache.spark.sql.execution.QueryExecution.analyzed$lzycompute(QueryExecution.scala:57)
2018-10-09 14:44:24.867 - stdout> 	at org.apache.spark.sql.execution.QueryExecution.analyzed(QueryExecution.scala:55)
2018-10-09 14:44:24.867 - stdout> 	at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:47)
2018-10-09 14:44:24.867 - stdout> 	at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:74)
2018-10-09 14:44:24.867 - stdout> 	at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:642)
2018-10-09 14:44:24.867 - stdout> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
2018-10-09 14:44:24.867 - stdout> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
2018-10-09 14:44:24.867 - stdout> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2018-10-09 14:44:24.867 - stdout> 	at java.lang.reflect.Method.invoke(Method.java:498)
2018-10-09 14:44:24.867 - stdout> 	at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
2018-10-09 14:44:24.867 - stdout> 	at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
2018-10-09 14:44:24.867 - stdout> 	at py4j.Gateway.invoke(Gateway.java:282)
2018-10-09 14:44:24.867 - stdout> 	at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
2018-10-09 14:44:24.867 - stdout> 	at py4j.commands.CallCommand.execute(CallCommand.java:79)
2018-10-09 14:44:24.867 - stdout> 	at py4j.GatewayConnection.run(GatewayConnection.java:238)
2018-10-09 14:44:24.867 - stdout> 	at java.lang.Thread.run(Thread.java:748)
2018-10-09 14:44:24.867 - stdout> Caused by: java.lang.NullPointerException
2018-10-09 14:44:24.867 - stdout> 	at org.datanucleus.util.Localiser.getMessage(Localiser.java:359)
2018-10-09 14:44:24.867 - stdout> 	at org.datanucleus.util.Localiser.msg(Localiser.java:176)
2018-10-09 14:44:24.867 - stdout> 	at org.datanucleus.util.Localiser.msg(Localiser.java:259)
2018-10-09 14:44:24.867 - stdout> 	at org.datanucleus.plugin.NonManagedPluginRegistry.registerBundle(NonManagedPluginRegistry.java:435)
2018-10-09 14:44:24.867 - stdout> 	at org.datanucleus.plugin.NonManagedPluginRegistry.registerExtensions(NonManagedPluginRegistry.java:219)
2018-10-09 14:44:24.867 - stdout> 	at org.datanucleus.plugin.NonManagedPluginRegistry.registerExtensionPoints(NonManagedPluginRegistry.java:160)
2018-10-09 14:44:24.867 - stdout> 	at org.datanucleus.plugin.PluginManager.<init>(PluginManager.java:65)
2018-10-09 14:44:24.867 - stdout> 	at org.datanucleus.plugin.PluginManager.createPluginManager(PluginManager.java:427)
2018-10-09 14:44:24.867 - stdout> 	at org.datanucleus.NucleusContext.<init>(NucleusContext.java:266)
2018-10-09 14:44:24.867 - stdout> 	at org.datanucleus.NucleusContext.<init>(NucleusContext.java:247)
2018-10-09 14:44:24.867 - stdout> 	at org.datanucleus.NucleusContext.<init>(NucleusContext.java:225)
2018-10-09 14:44:24.867 - stdout> 	at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.<init>(JDOPersistenceManagerFactory.java:416)
2018-10-09 14:44:24.867 - stdout> 	at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:301)
2018-10-09 14:44:24.867 - stdout> 	at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
2018-10-09 14:44:24.867 - stdout> 	... 81 more
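The root cause of the failure is the NullPointerException directly above: DataNucleus dies inside Localiser.getMessage while registering plugin bundles (NonManagedPluginRegistry.registerBundle), so the metastore's JDO persistence layer never comes up at all. A commonly reported trigger for this particular NPE is an unreadable or truncated jar on the classpath, for example a corrupted download of the Spark 2.3.2 distribution that this suite unpacks under /tmp/test-spark (the path visible in the Python traceback below). The following is a hedged sketch for checking that hypothesis, not part of the original log; the jars directory is inferred from the traceback paths and the corrupted-jar explanation is an assumption:

    import glob
    import zipfile

    # Hypothetical integrity check: flag jars that cannot be opened or whose
    # members fail the CRC check. A truncated datanucleus-*.jar (or any other
    # jar) would make DataNucleus's plugin registry fail as in the trace above.
    for jar in sorted(glob.glob("/tmp/test-spark/spark-2.3.2/jars/*.jar")):
        try:
            with zipfile.ZipFile(jar) as zf:
                bad = zf.testzip()  # name of first corrupt member, or None
            if bad is not None:
                print("corrupt entry %s in %s" % (bad, jar))
        except zipfile.BadZipFile:  # Python 3 name; BadZipfile on Python 2
            print("unreadable jar: %s" % jar)

If every jar verifies, the next suspects are duplicate DataNucleus jars on the classpath or a missing plugin manifest inside them, both of which can drive Localiser into the same null-message path.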
2018-10-09 14:44:24.867 - stdout> 2018-10-09 14:44:24 INFO  HiveMetaStore:589 - 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
2018-10-09 14:44:24.867 - stdout> 2018-10-09 14:44:24 INFO  ObjectStore:289 - ObjectStore, initialize called
2018-10-09 14:44:24.899 - stdout> Traceback (most recent call last):
2018-10-09 14:44:24.899 - stdout>   File "/tmp/test-spark/spark-2.3.2/python/lib/pyspark.zip/pyspark/sql/utils.py", line 63, in deco
2018-10-09 14:44:24.9 - stdout>   File "/tmp/test-spark/spark-2.3.2/python/lib/py4j-0.10.7-src.zip/py4j/protocol.py", line 328, in get_return_value
2018-10-09 14:44:24.902 - stdout> py4j.protocol.Py4JJavaError: An error occurred while calling o31.sql.
2018-10-09 14:44:24.902 - stdout> : org.apache.spark.sql.AnalysisException: java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient;
2018-10-09 14:44:24.903 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:106)
2018-10-09 14:44:24.903 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:194)
2018-10-09 14:44:24.903 - stdout> 	at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:114)
2018-10-09 14:44:24.903 - stdout> 	at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:102)
2018-10-09 14:44:24.903 - stdout> 	at org.apache.spark.sql.hive.HiveSessionStateBuilder.externalCatalog(HiveSessionStateBuilder.scala:39)
2018-10-09 14:44:24.903 - stdout> 	at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog$lzycompute(HiveSessionStateBuilder.scala:54)
2018-10-09 14:44:24.903 - stdout> 	at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog(HiveSessionStateBuilder.scala:52)
2018-10-09 14:44:24.903 - stdout> 	at org.apache.spark.sql.hive.HiveSessionStateBuilder$$anon$1.<init>(HiveSessionStateBuilder.scala:69)
2018-10-09 14:44:24.903 - stdout> 	at org.apache.spark.sql.hive.HiveSessionStateBuilder.analyzer(HiveSessionStateBuilder.scala:69)
2018-10-09 14:44:24.903 - stdout> 	at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anonfun$build$2.apply(BaseSessionStateBuilder.scala:293)
2018-10-09 14:44:24.903 - stdout> 	at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anonfun$build$2.apply(BaseSessionStateBuilder.scala:293)
2018-10-09 14:44:24.903 - stdout> 	at org.apache.spark.sql.internal.SessionState.analyzer$lzycompute(SessionState.scala:79)
2018-10-09 14:44:24.903 - stdout> 	at org.apache.spark.sql.internal.SessionState.analyzer(SessionState.scala:79)
2018-10-09 14:44:24.903 - stdout> 	at org.apache.spark.sql.execution.QueryExecution.analyzed$lzycompute(QueryExecution.scala:57)
2018-10-09 14:44:24.903 - stdout> 	at org.apache.spark.sql.execution.QueryExecution.analyzed(QueryExecution.scala:55)
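The log excerpt is cut off mid-trace at this point. The Python traceback above is just the py4j-side surface of the same Java exception: the first spark.sql() call on a Hive-enabled session lazily builds SessionState, which builds HiveExternalCatalog, HiveClientImpl and finally SessionHiveMetaStoreClient, and the DataNucleus failure shown earlier is what ultimately surfaces as the AnalysisException ("Unable to instantiate ... SessionHiveMetaStoreClient"). Below is a minimal sketch of that call path, assuming a Hive-enabled PySpark session; it is not the original test5580616031949812549.py, whose contents are not shown in the log:

    from pyspark.sql import SparkSession

    # Any first SQL statement on a Hive-enabled session forces the lazy
    # initialization chain that fails in the stack traces above.
    spark = (SparkSession.builder
             .enableHiveSupport()       # use HiveExternalCatalog
             .getOrCreate())
    spark.sql("SHOW DATABASES").show()  # triggers catalog/metastore creation

Nothing in this sketch is specific to the failing script; it only exercises the initialization path named in the stack frames (SparkSession.sql, QueryExecution.assertAnalyzed, HiveSessionStateBuilder, HiveExternalCatalog.databaseExists).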