sbt.ForkMain$ForkError: org.scalatest.exceptions.TestFailedException: spark-submit returned with exit code 1.
Command line: './bin/spark-submit' '--name' 'prepare testing tables' '--master' 'local[2]' '--conf' 'spark.ui.enabled=false' '--conf' 'spark.master.rest.enabled=false' '--conf' 'spark.sql.warehouse.dir=/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/target/tmp/warehouse-631c240f-07df-497a-bfe4-d716ff61c957' '--conf' 'spark.sql.test.version.index=2' '--driver-java-options' '-Dderby.system.home=/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/target/tmp/warehouse-631c240f-07df-497a-bfe4-d716ff61c957' '/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/target/tmp/test2376953934368432345.py'

2018-11-21 04:39:01.004 - stdout> 2018-11-21 04:39:00 WARN  NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2018-11-21 04:39:02.099 - stdout> 2018-11-21 04:39:02 INFO  SparkContext:54 - Running Spark version 2.4.0
2018-11-21 04:39:02.126 - stdout> 2018-11-21 04:39:02 INFO  SparkContext:54 - Submitted application: prepare testing tables
2018-11-21 04:39:02.197 - stdout> 2018-11-21 04:39:02 INFO  SecurityManager:54 - Changing view acls to: jenkins
2018-11-21 04:39:02.198 - stdout> 2018-11-21 04:39:02 INFO  SecurityManager:54 - Changing modify acls to: jenkins
2018-11-21 04:39:02.198 - stdout> 2018-11-21 04:39:02 INFO  SecurityManager:54 - Changing view acls groups to: 
2018-11-21 04:39:02.198 - stdout> 2018-11-21 04:39:02 INFO  SecurityManager:54 - Changing modify acls groups to: 
2018-11-21 04:39:02.199 - stdout> 2018-11-21 04:39:02 INFO  SecurityManager:54 - SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(jenkins); groups with view permissions: Set(); users  with modify permissions: Set(jenkins); groups with modify permissions: Set()
2018-11-21 04:39:02.565 - stdout> 2018-11-21 04:39:02 INFO  Utils:54 - Successfully started service 'sparkDriver' on port 45060.
2018-11-21 04:39:02.589 - stdout> 2018-11-21 04:39:02 INFO  SparkEnv:54 - Registering MapOutputTracker
2018-11-21 04:39:02.608 - stdout> 2018-11-21 04:39:02 INFO  SparkEnv:54 - Registering BlockManagerMaster
2018-11-21 04:39:02.611 - stdout> 2018-11-21 04:39:02 INFO  BlockManagerMasterEndpoint:54 - Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
2018-11-21 04:39:02.611 - stdout> 2018-11-21 04:39:02 INFO  BlockManagerMasterEndpoint:54 - BlockManagerMasterEndpoint up
2018-11-21 04:39:02.623 - stdout> 2018-11-21 04:39:02 INFO  DiskBlockManager:54 - Created local directory at /tmp/blockmgr-bfff6a8d-b719-47b9-b499-8ae63bc2559b
2018-11-21 04:39:02.662 - stdout> 2018-11-21 04:39:02 INFO  MemoryStore:54 - MemoryStore started with capacity 366.3 MB
2018-11-21 04:39:02.697 - stdout> 2018-11-21 04:39:02 INFO  SparkEnv:54 - Registering OutputCommitCoordinator
2018-11-21 04:39:02.805 - stdout> 2018-11-21 04:39:02 INFO  Executor:54 - Starting executor ID driver on host localhost
2018-11-21 04:39:02.884 - stdout> 2018-11-21 04:39:02 INFO  Utils:54 - Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 42937.
2018-11-21 04:39:02.885 - stdout> 2018-11-21 04:39:02 INFO  NettyBlockTransferService:54 - Server created on amp-jenkins-worker-02.amp:42937
2018-11-21 04:39:02.887 - stdout> 2018-11-21 04:39:02 INFO  BlockManager:54 - Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
2018-11-21 04:39:02.924 - stdout> 2018-11-21 04:39:02 INFO  BlockManagerMaster:54 - Registering BlockManager BlockManagerId(driver, amp-jenkins-worker-02.amp, 42937, None)
2018-11-21 04:39:02.929 - stdout> 2018-11-21 04:39:02 INFO  BlockManagerMasterEndpoint:54 - Registering block manager amp-jenkins-worker-02.amp:42937 with 366.3 MB RAM, BlockManagerId(driver, amp-jenkins-worker-02.amp, 42937, None)
2018-11-21 04:39:02.93 - stdout> 2018-11-21 04:39:02 INFO  BlockManagerMaster:54 - Registered BlockManager BlockManagerId(driver, amp-jenkins-worker-02.amp, 42937, None)
2018-11-21 04:39:02.931 - stdout> 2018-11-21 04:39:02 INFO  BlockManager:54 - Initialized BlockManager: BlockManagerId(driver, amp-jenkins-worker-02.amp, 42937, None)
2018-11-21 04:39:03.118 - stdout> 2018-11-21 04:39:03 INFO  log:192 - Logging initialized @3724ms
2018-11-21 04:39:03.38 - stdout> 2018-11-21 04:39:03 INFO  SharedState:54 - Setting hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir ('/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/target/tmp/warehouse-631c240f-07df-497a-bfe4-d716ff61c957').
2018-11-21 04:39:03.381 - stdout> 2018-11-21 04:39:03 INFO  SharedState:54 - Warehouse path is '/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/target/tmp/warehouse-631c240f-07df-497a-bfe4-d716ff61c957'.
2018-11-21 04:39:04.192 - stdout> 2018-11-21 04:39:04 INFO  StateStoreCoordinatorRef:54 - Registered StateStoreCoordinator endpoint
2018-11-21 04:39:05.952 - stdout> Traceback (most recent call last):
2018-11-21 04:39:05.952 - stdout>   File "/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/target/tmp/test2376953934368432345.py", line 8, in <module>
2018-11-21 04:39:05.953 - stdout>     spark.sql("create table data_source_tbl_{} using json as select 1 i".format(version_index))
2018-11-21 04:39:05.953 - stdout>   File "/tmp/test-spark/spark-2.4.0/python/lib/pyspark.zip/pyspark/sql/session.py", line 767, in sql
2018-11-21 04:39:05.953 - stdout>   File "/tmp/test-spark/spark-2.4.0/python/lib/py4j-0.10.7-src.zip/py4j/java_gateway.py", line 1257, in __call__
2018-11-21 04:39:05.953 - stdout>   File "/tmp/test-spark/spark-2.4.0/python/lib/pyspark.zip/pyspark/sql/utils.py", line 63, in deco
2018-11-21 04:39:05.954 - stdout>   File "/tmp/test-spark/spark-2.4.0/python/lib/py4j-0.10.7-src.zip/py4j/protocol.py", line 328, in get_return_value
2018-11-21 04:39:05.957 - stdout> py4j.protocol.Py4JJavaError: An error occurred while calling o27.sql.
2018-11-21 04:39:05.957 - stdout> : java.lang.ExceptionInInitializerError
2018-11-21 04:39:05.957 - stdout> 	at org.apache.spark.SparkContext.withScope(SparkContext.scala:699)
2018-11-21 04:39:05.957 - stdout> 	at org.apache.spark.SparkContext.parallelize(SparkContext.scala:716)
2018-11-21 04:39:05.957 - stdout> 	at org.apache.spark.sql.execution.SparkStrategies.singleRowRdd$lzycompute(SparkStrategies.scala:470)
2018-11-21 04:39:05.957 - stdout> 	at org.apache.spark.sql.execution.SparkStrategies.singleRowRdd(SparkStrategies.scala:470)
2018-11-21 04:39:05.957 - stdout> 	at org.apache.spark.sql.execution.SparkStrategies$BasicOperators$.apply(SparkStrategies.scala:627)
2018-11-21 04:39:05.957 - stdout> 	at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$1.apply(QueryPlanner.scala:63)
2018-11-21 04:39:05.957 - stdout> 	at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$1.apply(QueryPlanner.scala:63)
2018-11-21 04:39:05.957 - stdout> 	at scala.collection.Iterator$$anon$12.nextCur(Iterator.scala:435)
2018-11-21 04:39:05.957 - stdout> 	at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:441)
2018-11-21 04:39:05.957 - stdout> 	at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:440)
2018-11-21 04:39:05.957 - stdout> 	at org.apache.spark.sql.catalyst.planning.QueryPlanner.plan(QueryPlanner.scala:93)
2018-11-21 04:39:05.957 - stdout> 	at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$2$$anonfun$apply$2.apply(QueryPlanner.scala:78)
2018-11-21 04:39:05.957 - stdout> 	at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$2$$anonfun$apply$2.apply(QueryPlanner.scala:75)
2018-11-21 04:39:05.957 - stdout> 	at scala.collection.TraversableOnce$$anonfun$foldLeft$1.apply(TraversableOnce.scala:157)
2018-11-21 04:39:05.957 - stdout> 	at scala.collection.TraversableOnce$$anonfun$foldLeft$1.apply(TraversableOnce.scala:157)
2018-11-21 04:39:05.957 - stdout> 	at scala.collection.Iterator$class.foreach(Iterator.scala:891)
2018-11-21 04:39:05.957 - stdout> 	at scala.collection.AbstractIterator.foreach(Iterator.scala:1334)
2018-11-21 04:39:05.957 - stdout> 	at scala.collection.TraversableOnce$class.foldLeft(TraversableOnce.scala:157)
2018-11-21 04:39:05.957 - stdout> 	at scala.collection.AbstractIterator.foldLeft(Iterator.scala:1334)
2018-11-21 04:39:05.957 - stdout> 	at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$2.apply(QueryPlanner.scala:75)
2018-11-21 04:39:05.957 - stdout> 	at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$2.apply(QueryPlanner.scala:67)
2018-11-21 04:39:05.957 - stdout> 	at scala.collection.Iterator$$anon$12.nextCur(Iterator.scala:435)
2018-11-21 04:39:05.957 - stdout> 	at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:441)
2018-11-21 04:39:05.957 - stdout> 	at org.apache.spark.sql.catalyst.planning.QueryPlanner.plan(QueryPlanner.scala:93)
2018-11-21 04:39:05.957 - stdout> 	at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$2$$anonfun$apply$2.apply(QueryPlanner.scala:78)
2018-11-21 04:39:05.957 - stdout> 	at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$2$$anonfun$apply$2.apply(QueryPlanner.scala:75)
2018-11-21 04:39:05.957 - stdout> 	at scala.collection.TraversableOnce$$anonfun$foldLeft$1.apply(TraversableOnce.scala:157)
2018-11-21 04:39:05.957 - stdout> 	at scala.collection.TraversableOnce$$anonfun$foldLeft$1.apply(TraversableOnce.scala:157)
2018-11-21 04:39:05.957 - stdout> 	at scala.collection.Iterator$class.foreach(Iterator.scala:891)
2018-11-21 04:39:05.957 - stdout> 	at scala.collection.AbstractIterator.foreach(Iterator.scala:1334)
2018-11-21 04:39:05.957 - stdout> 	at scala.collection.TraversableOnce$class.foldLeft(TraversableOnce.scala:157)
2018-11-21 04:39:05.957 - stdout> 	at scala.collection.AbstractIterator.foldLeft(Iterator.scala:1334)
2018-11-21 04:39:05.957 - stdout> 	at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$2.apply(QueryPlanner.scala:75)
2018-11-21 04:39:05.957 - stdout> 	at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$2.apply(QueryPlanner.scala:67)
2018-11-21 04:39:05.957 - stdout> 	at scala.collection.Iterator$$anon$12.nextCur(Iterator.scala:435)
2018-11-21 04:39:05.957 - stdout> 	at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:441)
2018-11-21 04:39:05.957 - stdout> 	at org.apache.spark.sql.catalyst.planning.QueryPlanner.plan(QueryPlanner.scala:93)
2018-11-21 04:39:05.957 - stdout> 	at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$2$$anonfun$apply$2.apply(QueryPlanner.scala:78)
2018-11-21 04:39:05.957 - stdout> 	at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$2$$anonfun$apply$2.apply(QueryPlanner.scala:75)
2018-11-21 04:39:05.957 - stdout> 	at scala.collection.TraversableOnce$$anonfun$foldLeft$1.apply(TraversableOnce.scala:157)
2018-11-21 04:39:05.957 - stdout> 	at scala.collection.TraversableOnce$$anonfun$foldLeft$1.apply(TraversableOnce.scala:157)
2018-11-21 04:39:05.957 - stdout> 	at scala.collection.Iterator$class.foreach(Iterator.scala:891)
2018-11-21 04:39:05.957 - stdout> 	at scala.collection.AbstractIterator.foreach(Iterator.scala:1334)
2018-11-21 04:39:05.957 - stdout> 	at scala.collection.TraversableOnce$class.foldLeft(TraversableOnce.scala:157)
2018-11-21 04:39:05.957 - stdout> 	at scala.collection.AbstractIterator.foldLeft(Iterator.scala:1334)
2018-11-21 04:39:05.957 - stdout> 	at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$2.apply(QueryPlanner.scala:75)
2018-11-21 04:39:05.957 - stdout> 	at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$2.apply(QueryPlanner.scala:67)
2018-11-21 04:39:05.957 - stdout> 	at scala.collection.Iterator$$anon$12.nextCur(Iterator.scala:435)
2018-11-21 04:39:05.957 - stdout> 	at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:441)
2018-11-21 04:39:05.957 - stdout> 	at org.apache.spark.sql.catalyst.planning.QueryPlanner.plan(QueryPlanner.scala:93)
2018-11-21 04:39:05.957 - stdout> 	at org.apache.spark.sql.execution.QueryExecution.sparkPlan$lzycompute(QueryExecution.scala:72)
2018-11-21 04:39:05.958 - stdout> 	at org.apache.spark.sql.execution.QueryExecution.sparkPlan(QueryExecution.scala:68)
2018-11-21 04:39:05.958 - stdout> 	at org.apache.spark.sql.execution.QueryExecution.executedPlan$lzycompute(QueryExecution.scala:77)
2018-11-21 04:39:05.958 - stdout> 	at org.apache.spark.sql.execution.QueryExecution.executedPlan(QueryExecution.scala:77)
2018-11-21 04:39:05.958 - stdout> 	at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3360)
2018-11-21 04:39:05.958 - stdout> 	at org.apache.spark.sql.Dataset.<init>(Dataset.scala:195)
2018-11-21 04:39:05.958 - stdout> 	at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:80)
2018-11-21 04:39:05.958 - stdout> 	at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:642)
2018-11-21 04:39:05.958 - stdout> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
2018-11-21 04:39:05.958 - stdout> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
2018-11-21 04:39:05.958 - stdout> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2018-11-21 04:39:05.958 - stdout> 	at java.lang.reflect.Method.invoke(Method.java:497)
2018-11-21 04:39:05.958 - stdout> 	at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
2018-11-21 04:39:05.958 - stdout> 	at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
2018-11-21 04:39:05.958 - stdout> 	at py4j.Gateway.invoke(Gateway.java:282)
2018-11-21 04:39:05.958 - stdout> 	at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
2018-11-21 04:39:05.958 - stdout> 	at py4j.commands.CallCommand.execute(CallCommand.java:79)
2018-11-21 04:39:05.958 - stdout> 	at py4j.GatewayConnection.run(GatewayConnection.java:238)
2018-11-21 04:39:05.958 - stdout> 	at java.lang.Thread.run(Thread.java:745)
2018-11-21 04:39:05.958 - stdout> Caused by: java.util.NoSuchElementException: key not found: groupId
2018-11-21 04:39:05.958 - stdout> 	at scala.collection.MapLike$class.default(MapLike.scala:228)
2018-11-21 04:39:05.958 - stdout> 	at scala.collection.AbstractMap.default(Map.scala:59)
2018-11-21 04:39:05.958 - stdout> 	at scala.collection.MapLike$class.apply(MapLike.scala:141)
2018-11-21 04:39:05.958 - stdout> 	at scala.collection.AbstractMap.apply(Map.scala:59)
2018-11-21 04:39:05.958 - stdout> 	at com.fasterxml.jackson.module.scala.JacksonModule$.version$lzycompute(JacksonModule.scala:27)
2018-11-21 04:39:05.958 - stdout> 	at com.fasterxml.jackson.module.scala.JacksonModule$.version(JacksonModule.scala:26)
2018-11-21 04:39:05.958 - stdout> 	at com.fasterxml.jackson.module.scala.JacksonModule$class.version(JacksonModule.scala:49)
2018-11-21 04:39:05.958 - stdout> 	at com.fasterxml.jackson.module.scala.DefaultScalaModule.version(DefaultScalaModule.scala:19)
2018-11-21 04:39:05.958 - stdout> 	at com.fasterxml.jackson.databind.ObjectMapper.registerModule(ObjectMapper.java:710)
2018-11-21 04:39:05.958 - stdout> 	at org.apache.spark.rdd.RDDOperationScope$.<init>(RDDOperationScope.scala:82)
2018-11-21 04:39:05.958 - stdout> 	at org.apache.spark.rdd.RDDOperationScope$.<clinit>(RDDOperationScope.scala)
2018-11-21 04:39:05.958 - stdout> 	... 69 more
2018-11-21 04:39:05.958 - stdout> 
2018-11-21 04:39:06.011 - stdout> 2018-11-21 04:39:06 INFO  SparkContext:54 - Invoking stop() from shutdown hook
2018-11-21 04:39:06.02 - stdout> 2018-11-21 04:39:06 INFO  MapOutputTrackerMasterEndpoint:54 - MapOutputTrackerMasterEndpoint stopped!
2018-11-21 04:39:06.028 - stdout> 2018-11-21 04:39:06 INFO  MemoryStore:54 - MemoryStore cleared
2018-11-21 04:39:06.029 - stdout> 2018-11-21 04:39:06 INFO  BlockManager:54 - BlockManager stopped
2018-11-21 04:39:06.034 - stdout> 2018-11-21 04:39:06 INFO  BlockManagerMaster:54 - BlockManagerMaster stopped
2018-11-21 04:39:06.037 - stdout> 2018-11-21 04:39:06 INFO  OutputCommitCoordinator$OutputCommitCoordinatorEndpoint:54 - OutputCommitCoordinator stopped!
2018-11-21 04:39:06.041 - stdout> 2018-11-21 04:39:06 INFO  SparkContext:54 - Successfully stopped SparkContext
2018-11-21 04:39:06.042 - stdout> 2018-11-21 04:39:06 INFO  ShutdownHookManager:54 - Shutdown hook called
2018-11-21 04:39:06.043 - stdout> 2018-11-21 04:39:06 INFO  ShutdownHookManager:54 - Deleting directory /tmp/spark-007d965b-8707-477b-b943-15802f55a8b8
2018-11-21 04:39:06.045 - stdout> 2018-11-21 04:39:06 INFO  ShutdownHookManager:54 - Deleting directory /tmp/spark-007d965b-8707-477b-b943-15802f55a8b8/pyspark-c9728e7c-d2fe-493d-8916-05978d52ff9d
2018-11-21 04:39:06.047 - stdout> 2018-11-21 04:39:06 INFO  ShutdownHookManager:54 - Deleting directory /tmp/spark-63dffbb1-3b2c-465b-82c7-6aeeeb045769
           
	at org.scalatest.Assertions.newAssertionFailedException(Assertions.scala:528)
	at org.scalatest.Assertions.newAssertionFailedException$(Assertions.scala:527)
	at org.scalatest.FunSuite.newAssertionFailedException(FunSuite.scala:1560)
	at org.scalatest.Assertions.fail(Assertions.scala:1089)
	at org.scalatest.Assertions.fail$(Assertions.scala:1085)
	at org.scalatest.FunSuite.fail(FunSuite.scala:1560)
	at org.apache.spark.sql.hive.SparkSubmitTestUtils.runSparkSubmit(SparkSubmitTestUtils.scala:94)
	at org.apache.spark.sql.hive.SparkSubmitTestUtils.runSparkSubmit$(SparkSubmitTestUtils.scala:41)
	at org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite.runSparkSubmit(HiveExternalCatalogVersionsSuite.scala:43)
	at org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite.$anonfun$beforeAll$1(HiveExternalCatalogVersionsSuite.scala:187)
	at org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite.$anonfun$beforeAll$1$adapted(HiveExternalCatalogVersionsSuite.scala:172)
	at scala.collection.immutable.List.foreach(List.scala:388)
	at org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite.beforeAll(HiveExternalCatalogVersionsSuite.scala:172)
	at org.scalatest.BeforeAndAfterAll.liftedTree1$1(BeforeAndAfterAll.scala:212)
	at org.scalatest.BeforeAndAfterAll.run(BeforeAndAfterAll.scala:210)
	at org.scalatest.BeforeAndAfterAll.run$(BeforeAndAfterAll.scala:208)
	at org.apache.spark.SparkFunSuite.run(SparkFunSuite.scala:52)
	at org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:314)
	at org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:507)
	at sbt.ForkMain$Run$2.call(ForkMain.java:296)
	at sbt.ForkMain$Run$2.call(ForkMain.java:286)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)