sbt.ForkMain$ForkError: org.scalatest.exceptions.TestFailedException: spark-submit returned with exit code 1.
Command line: './bin/spark-submit' '--name' 'prepare testing tables' '--master' 'local[2]' '--conf' 'spark.ui.enabled=false' '--conf' 'spark.master.rest.enabled=false' '--conf' 'spark.sql.warehouse.dir=/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6-ubuntu-test/target/tmp/warehouse-87f4bf74-454a-4bac-89b5-5a0de5089d62' '--conf' 'spark.sql.test.version.index=2' '--driver-java-options' '-Dderby.system.home=/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6-ubuntu-test/target/tmp/warehouse-87f4bf74-454a-4bac-89b5-5a0de5089d62' '/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6-ubuntu-test/target/tmp/test6492818842578739735.py'

2018-09-26 02:23:24.166 - stdout> 2018-09-26 02:23:24 WARN  Utils:66 - Your hostname, amp-jenkins-staging-worker-02 resolves to a loopback address: 127.0.1.1; using 192.168.10.32 instead (on interface eno1)
2018-09-26 02:23:24.167 - stdout> 2018-09-26 02:23:24 WARN  Utils:66 - Set SPARK_LOCAL_IP if you need to bind to another address
2018-09-26 02:23:24.679 - stdout> 2018-09-26 02:23:24 WARN  NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2018-09-26 02:23:25.421 - stdout> 2018-09-26 02:23:25 INFO  SparkContext:54 - Running Spark version 2.3.1
2018-09-26 02:23:25.446 - stdout> 2018-09-26 02:23:25 INFO  SparkContext:54 - Submitted application: prepare testing tables
2018-09-26 02:23:25.517 - stdout> 2018-09-26 02:23:25 INFO  SecurityManager:54 - Changing view acls to: jenkins
2018-09-26 02:23:25.518 - stdout> 2018-09-26 02:23:25 INFO  SecurityManager:54 - Changing modify acls to: jenkins
2018-09-26 02:23:25.518 - stdout> 2018-09-26 02:23:25 INFO  SecurityManager:54 - Changing view acls groups to: 
2018-09-26 02:23:25.518 - stdout> 2018-09-26 02:23:25 INFO  SecurityManager:54 - Changing modify acls groups to: 
2018-09-26 02:23:25.519 - stdout> 2018-09-26 02:23:25 INFO  SecurityManager:54 - SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(jenkins); groups with view permissions: Set(); users  with modify permissions: Set(jenkins); groups with modify permissions: Set()
2018-09-26 02:23:25.878 - stdout> 2018-09-26 02:23:25 INFO  Utils:54 - Successfully started service 'sparkDriver' on port 44562.
2018-09-26 02:23:25.912 - stdout> 2018-09-26 02:23:25 INFO  SparkEnv:54 - Registering MapOutputTracker
2018-09-26 02:23:25.942 - stdout> 2018-09-26 02:23:25 INFO  SparkEnv:54 - Registering BlockManagerMaster
2018-09-26 02:23:25.946 - stdout> 2018-09-26 02:23:25 INFO  BlockManagerMasterEndpoint:54 - Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
2018-09-26 02:23:25.947 - stdout> 2018-09-26 02:23:25 INFO  BlockManagerMasterEndpoint:54 - BlockManagerMasterEndpoint up
2018-09-26 02:23:25.962 - stdout> 2018-09-26 02:23:25 INFO  DiskBlockManager:54 - Created local directory at /tmp/blockmgr-de8208a8-18ba-481f-ad88-878072a58327
2018-09-26 02:23:25.987 - stdout> 2018-09-26 02:23:25 INFO  MemoryStore:54 - MemoryStore started with capacity 366.3 MB
2018-09-26 02:23:26.002 - stdout> 2018-09-26 02:23:26 INFO  SparkEnv:54 - Registering OutputCommitCoordinator
2018-09-26 02:23:26.368 - stdout> 2018-09-26 02:23:26 INFO  SparkContext:54 - Added file file:/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6-ubuntu-test/target/tmp/test6492818842578739735.py at file:/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6-ubuntu-test/target/tmp/test6492818842578739735.py with timestamp 1537953806367
2018-09-26 02:23:26.371 - stdout> 2018-09-26 02:23:26 INFO  Utils:54 - Copying /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6-ubuntu-test/target/tmp/test6492818842578739735.py to /tmp/spark-48cad304-208f-4743-aec3-88d403947b93/userFiles-1eb5cb0d-60a2-4b89-b8a5-1232c1753722/test6492818842578739735.py
2018-09-26 02:23:26.451 - stdout> 2018-09-26 02:23:26 INFO  Executor:54 - Starting executor ID driver on host localhost
2018-09-26 02:23:26.488 - stdout> 2018-09-26 02:23:26 INFO  Utils:54 - Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 42789.
2018-09-26 02:23:26.489 - stdout> 2018-09-26 02:23:26 INFO  NettyBlockTransferService:54 - Server created on 192.168.10.32:42789
2018-09-26 02:23:26.492 - stdout> 2018-09-26 02:23:26 INFO  BlockManager:54 - Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
2018-09-26 02:23:26.543 - stdout> 2018-09-26 02:23:26 INFO  BlockManagerMaster:54 - Registering BlockManager BlockManagerId(driver, 192.168.10.32, 42789, None)
2018-09-26 02:23:26.548 - stdout> 2018-09-26 02:23:26 INFO  BlockManagerMasterEndpoint:54 - Registering block manager 192.168.10.32:42789 with 366.3 MB RAM, BlockManagerId(driver, 192.168.10.32, 42789, None)
2018-09-26 02:23:26.555 - stdout> 2018-09-26 02:23:26 INFO  BlockManagerMaster:54 - Registered BlockManager BlockManagerId(driver, 192.168.10.32, 42789, None)
2018-09-26 02:23:26.556 - stdout> 2018-09-26 02:23:26 INFO  BlockManager:54 - Initialized BlockManager: BlockManagerId(driver, 192.168.10.32, 42789, None)
2018-09-26 02:23:26.754 - stdout> 2018-09-26 02:23:26 INFO  log:192 - Logging initialized @3488ms
2018-09-26 02:23:26.944 - stdout> 2018-09-26 02:23:26 INFO  SharedState:54 - Setting hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir ('/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6-ubuntu-test/target/tmp/warehouse-87f4bf74-454a-4bac-89b5-5a0de5089d62').
2018-09-26 02:23:26.944 - stdout> 2018-09-26 02:23:26 INFO  SharedState:54 - Warehouse path is '/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6-ubuntu-test/target/tmp/warehouse-87f4bf74-454a-4bac-89b5-5a0de5089d62'.
2018-09-26 02:23:27.492 - stdout> 2018-09-26 02:23:27 INFO  StateStoreCoordinatorRef:54 - Registered StateStoreCoordinator endpoint
2018-09-26 02:23:27.901 - stdout> 2018-09-26 02:23:27 INFO  HiveUtils:54 - Initializing HiveMetastoreConnection version 1.2.1 using Spark classes.
2018-09-26 02:23:27.938 - stdout> Traceback (most recent call last):
2018-09-26 02:23:27.939 - stdout>   File "/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6-ubuntu-test/target/tmp/test6492818842578739735.py", line 8, in <module>
2018-09-26 02:23:27.939 - stdout>     spark.sql("create table data_source_tbl_{} using json as select 1 i".format(version_index))
2018-09-26 02:23:27.939 - stdout>   File "/tmp/test-spark/spark-2.3.1/python/lib/pyspark.zip/pyspark/sql/session.py", line 710, in sql
2018-09-26 02:23:27.939 - stdout>   File "/tmp/test-spark/spark-2.3.1/python/lib/py4j-0.10.7-src.zip/py4j/java_gateway.py", line 1257, in __call__
2018-09-26 02:23:27.939 - stdout>   File "/tmp/test-spark/spark-2.3.1/python/lib/pyspark.zip/pyspark/sql/utils.py", line 63, in deco
2018-09-26 02:23:27.939 - stdout>   File "/tmp/test-spark/spark-2.3.1/python/lib/py4j-0.10.7-src.zip/py4j/protocol.py", line 328, in get_return_value
2018-09-26 02:23:27.941 - stdout> py4j.protocol.Py4JJavaError: An error occurred while calling o27.sql.
2018-09-26 02:23:27.941 - stdout> : java.lang.NullPointerException
2018-09-26 02:23:27.941 - stdout> 	at org.apache.commons.io.IOUtils.copyLarge(IOUtils.java:1792)
2018-09-26 02:23:27.941 - stdout> 	at org.apache.commons.io.IOUtils.copyLarge(IOUtils.java:1769)
2018-09-26 02:23:27.941 - stdout> 	at org.apache.commons.io.IOUtils.copy(IOUtils.java:1744)
2018-09-26 02:23:27.941 - stdout> 	at org.apache.commons.io.IOUtils.toByteArray(IOUtils.java:462)
2018-09-26 02:23:27.941 - stdout> 	at org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1.doLoadClass(IsolatedClientLoader.scala:216)
2018-09-26 02:23:27.941 - stdout> 	at org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1.loadClass(IsolatedClientLoader.scala:210)
2018-09-26 02:23:27.941 - stdout> 	at java.lang.ClassLoader.loadClass(ClassLoader.java:411)
2018-09-26 02:23:27.941 - stdout> 	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
2018-09-26 02:23:27.941 - stdout> 	at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:262)
2018-09-26 02:23:27.941 - stdout> 	at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:385)
2018-09-26 02:23:27.941 - stdout> 	at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:287)
2018-09-26 02:23:27.941 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:66)
2018-09-26 02:23:27.941 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:65)
2018-09-26 02:23:27.941 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply$mcZ$sp(HiveExternalCatalog.scala:195)
2018-09-26 02:23:27.941 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:195)
2018-09-26 02:23:27.941 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:195)
2018-09-26 02:23:27.941 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:97)
2018-09-26 02:23:27.941 - stdout> 	at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:194)
2018-09-26 02:23:27.941 - stdout> 	at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:114)
2018-09-26 02:23:27.941 - stdout> 	at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:102)
2018-09-26 02:23:27.941 - stdout> 	at org.apache.spark.sql.hive.HiveSessionStateBuilder.externalCatalog(HiveSessionStateBuilder.scala:39)
2018-09-26 02:23:27.941 - stdout> 	at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog$lzycompute(HiveSessionStateBuilder.scala:54)
2018-09-26 02:23:27.942 - stdout> 	at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog(HiveSessionStateBuilder.scala:52)
2018-09-26 02:23:27.942 - stdout> 	at org.apache.spark.sql.hive.HiveSessionStateBuilder$$anon$1.<init>(HiveSessionStateBuilder.scala:69)
2018-09-26 02:23:27.942 - stdout> 	at org.apache.spark.sql.hive.HiveSessionStateBuilder.analyzer(HiveSessionStateBuilder.scala:69)
2018-09-26 02:23:27.942 - stdout> 	at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anonfun$build$2.apply(BaseSessionStateBuilder.scala:293)
2018-09-26 02:23:27.942 - stdout> 	at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anonfun$build$2.apply(BaseSessionStateBuilder.scala:293)
2018-09-26 02:23:27.942 - stdout> 	at org.apache.spark.sql.internal.SessionState.analyzer$lzycompute(SessionState.scala:79)
2018-09-26 02:23:27.942 - stdout> 	at org.apache.spark.sql.internal.SessionState.analyzer(SessionState.scala:79)
2018-09-26 02:23:27.942 - stdout> 	at org.apache.spark.sql.execution.QueryExecution.analyzed$lzycompute(QueryExecution.scala:57)
2018-09-26 02:23:27.942 - stdout> 	at org.apache.spark.sql.execution.QueryExecution.analyzed(QueryExecution.scala:55)
2018-09-26 02:23:27.942 - stdout> 	at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:47)
2018-09-26 02:23:27.942 - stdout> 	at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:74)
2018-09-26 02:23:27.942 - stdout> 	at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:641)
2018-09-26 02:23:27.942 - stdout> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
2018-09-26 02:23:27.942 - stdout> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
2018-09-26 02:23:27.942 - stdout> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2018-09-26 02:23:27.942 - stdout> 	at java.lang.reflect.Method.invoke(Method.java:498)
2018-09-26 02:23:27.942 - stdout> 	at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
2018-09-26 02:23:27.942 - stdout> 	at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
2018-09-26 02:23:27.942 - stdout> 	at py4j.Gateway.invoke(Gateway.java:282)
2018-09-26 02:23:27.942 - stdout> 	at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
2018-09-26 02:23:27.942 - stdout> 	at py4j.commands.CallCommand.execute(CallCommand.java:79)
2018-09-26 02:23:27.942 - stdout> 	at py4j.GatewayConnection.run(GatewayConnection.java:238)
2018-09-26 02:23:27.942 - stdout> 	at java.lang.Thread.run(Thread.java:748)
2018-09-26 02:23:27.942 - stdout> 
2018-09-26 02:23:27.991 - stdout> 2018-09-26 02:23:27 INFO  SparkContext:54 - Invoking stop() from shutdown hook
2018-09-26 02:23:28.004 - stdout> 2018-09-26 02:23:28 INFO  MapOutputTrackerMasterEndpoint:54 - MapOutputTrackerMasterEndpoint stopped!
2018-09-26 02:23:28.013 - stdout> 2018-09-26 02:23:28 INFO  MemoryStore:54 - MemoryStore cleared
2018-09-26 02:23:28.014 - stdout> 2018-09-26 02:23:28 INFO  BlockManager:54 - BlockManager stopped
2018-09-26 02:23:28.021 - stdout> 2018-09-26 02:23:28 INFO  BlockManagerMaster:54 - BlockManagerMaster stopped
2018-09-26 02:23:28.025 - stdout> 2018-09-26 02:23:28 INFO  OutputCommitCoordinator$OutputCommitCoordinatorEndpoint:54 - OutputCommitCoordinator stopped!
2018-09-26 02:23:28.028 - stdout> 2018-09-26 02:23:28 INFO  SparkContext:54 - Successfully stopped SparkContext
2018-09-26 02:23:28.029 - stdout> 2018-09-26 02:23:28 INFO  ShutdownHookManager:54 - Shutdown hook called
2018-09-26 02:23:28.029 - stdout> 2018-09-26 02:23:28 INFO  ShutdownHookManager:54 - Deleting directory /tmp/spark-b47563f8-59f5-46e4-9f6a-776b8eddb379
2018-09-26 02:23:28.03 - stdout> 2018-09-26 02:23:28 INFO  ShutdownHookManager:54 - Deleting directory /tmp/spark-48cad304-208f-4743-aec3-88d403947b93
2018-09-26 02:23:28.03 - stdout> 2018-09-26 02:23:28 INFO  ShutdownHookManager:54 - Deleting directory /tmp/spark-48cad304-208f-4743-aec3-88d403947b93/pyspark-fe8f28d1-9a03-4495-b384-308149d06063
           
	at org.scalatest.Assertions$class.newAssertionFailedException(Assertions.scala:528)
	at org.scalatest.FunSuite.newAssertionFailedException(FunSuite.scala:1560)
	at org.scalatest.Assertions$class.fail(Assertions.scala:1089)
	at org.scalatest.FunSuite.fail(FunSuite.scala:1560)
	at org.apache.spark.sql.hive.SparkSubmitTestUtils$class.runSparkSubmit(SparkSubmitTestUtils.scala:94)
	at org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite.runSparkSubmit(HiveExternalCatalogVersionsSuite.scala:43)
	at org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite$$anonfun$beforeAll$1.apply(HiveExternalCatalogVersionsSuite.scala:187)
	at org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite$$anonfun$beforeAll$1.apply(HiveExternalCatalogVersionsSuite.scala:172)
	at scala.collection.immutable.List.foreach(List.scala:392)
	at org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite.beforeAll(HiveExternalCatalogVersionsSuite.scala:172)
	at org.scalatest.BeforeAndAfterAll$class.liftedTree1$1(BeforeAndAfterAll.scala:212)
	at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:210)
	at org.apache.spark.SparkFunSuite.run(SparkFunSuite.scala:52)
	at org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:314)
	at org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:480)
	at sbt.ForkMain$Run$2.call(ForkMain.java:296)
	at sbt.ForkMain$Run$2.call(ForkMain.java:286)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)