sbt.ForkMain$ForkError: org.scalatest.exceptions.TestFailedException: spark-submit returned with exit code 1.
Command line: './bin/spark-submit' '--name' 'prepare testing tables' '--master' 'local[2]' '--conf' 'spark.ui.enabled=false' '--conf' 'spark.master.rest.enabled=false' '--conf' 'spark.sql.hive.metastore.version=1.2.1' '--conf' 'spark.sql.hive.metastore.jars=maven' '--conf' 'spark.sql.warehouse.dir=/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/sql/hive/target/tmp/org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite/warehouse-882d9c38-a3e5-47b9-8738-ffd0863fd850' '--conf' 'spark.sql.test.version.index=1' '--driver-java-options' '-Dderby.system.home=/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/sql/hive/target/tmp/org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite/warehouse-882d9c38-a3e5-47b9-8738-ffd0863fd850' '/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/sql/hive/target/tmp/org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite/test4418009254945157569.py'

2019-07-18 05:14:07.613 - stderr> 19/07/18 05:14:07 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2019-07-18 05:14:08.398 - stderr> Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
2019-07-18 05:14:08.401 - stderr> 19/07/18 05:14:08 INFO SparkContext: Running Spark version 2.4.3
2019-07-18 05:14:08.433 - stderr> 19/07/18 05:14:08 INFO SparkContext: Submitted application: prepare testing tables
2019-07-18 05:14:08.494 - stderr> 19/07/18 05:14:08 INFO SecurityManager: Changing view acls to: jenkins
2019-07-18 05:14:08.494 - stderr> 19/07/18 05:14:08 INFO SecurityManager: Changing modify acls to: jenkins
2019-07-18 05:14:08.494 - stderr> 19/07/18 05:14:08 INFO SecurityManager: Changing view acls groups to: 
2019-07-18 05:14:08.494 - stderr> 19/07/18 05:14:08 INFO SecurityManager: Changing modify acls groups to: 
2019-07-18 05:14:08.494 - stderr> 19/07/18 05:14:08 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(jenkins); groups with view permissions: Set(); users  with modify permissions: Set(jenkins); groups with modify permissions: Set()
2019-07-18 05:14:08.806 - stderr> 19/07/18 05:14:08 INFO Utils: Successfully started service 'sparkDriver' on port 44021.
2019-07-18 05:14:08.837 - stderr> 19/07/18 05:14:08 INFO SparkEnv: Registering MapOutputTracker
2019-07-18 05:14:08.855 - stderr> 19/07/18 05:14:08 INFO SparkEnv: Registering BlockManagerMaster
2019-07-18 05:14:08.857 - stderr> 19/07/18 05:14:08 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
2019-07-18 05:14:08.858 - stderr> 19/07/18 05:14:08 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
2019-07-18 05:14:08.867 - stderr> 19/07/18 05:14:08 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-3080cd65-f2e0-4635-a040-5b783bc88d7c
2019-07-18 05:14:08.896 - stderr> 19/07/18 05:14:08 INFO MemoryStore: MemoryStore started with capacity 366.3 MB
2019-07-18 05:14:08.911 - stderr> 19/07/18 05:14:08 INFO SparkEnv: Registering OutputCommitCoordinator
2019-07-18 05:14:09.008 - stderr> 19/07/18 05:14:09 INFO Executor: Starting executor ID driver on host localhost
2019-07-18 05:14:09.085 - stderr> 19/07/18 05:14:09 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 38942.
2019-07-18 05:14:09.085 - stderr> 19/07/18 05:14:09 INFO NettyBlockTransferService: Server created on amp-jenkins-worker-06.amp:38942
2019-07-18 05:14:09.087 - stderr> 19/07/18 05:14:09 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
2019-07-18 05:14:09.117 - stderr> 19/07/18 05:14:09 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, amp-jenkins-worker-06.amp, 38942, None)
2019-07-18 05:14:09.119 - stderr> 19/07/18 05:14:09 INFO BlockManagerMasterEndpoint: Registering block manager amp-jenkins-worker-06.amp:38942 with 366.3 MB RAM, BlockManagerId(driver, amp-jenkins-worker-06.amp, 38942, None)
2019-07-18 05:14:09.122 - stderr> 19/07/18 05:14:09 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, amp-jenkins-worker-06.amp, 38942, None)
2019-07-18 05:14:09.122 - stderr> 19/07/18 05:14:09 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, amp-jenkins-worker-06.amp, 38942, None)
2019-07-18 05:14:09.502 - stderr> 19/07/18 05:14:09 INFO SharedState: Setting hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir ('/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/sql/hive/target/tmp/org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite/warehouse-882d9c38-a3e5-47b9-8738-ffd0863fd850').
2019-07-18 05:14:09.502 - stderr> 19/07/18 05:14:09 INFO SharedState: Warehouse path is '/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/sql/hive/target/tmp/org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite/warehouse-882d9c38-a3e5-47b9-8738-ffd0863fd850'.
2019-07-18 05:14:10.138 - stderr> 19/07/18 05:14:10 INFO StateStoreCoordinatorRef: Registered StateStoreCoordinator endpoint
2019-07-18 05:14:12.191 - stdout> Traceback (most recent call last):
2019-07-18 05:14:12.192 - stdout>   File "/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/sql/hive/target/tmp/org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite/test4418009254945157569.py", line 8, in <module>
2019-07-18 05:14:12.192 - stdout>     spark.sql("create table data_source_tbl_{} using json as select 1 i".format(version_index))
2019-07-18 05:14:12.192 - stdout>   File "/tmp/test-spark/spark-2.4.3/python/lib/pyspark.zip/pyspark/sql/session.py", line 767, in sql
2019-07-18 05:14:12.192 - stdout>   File "/tmp/test-spark/spark-2.4.3/python/lib/py4j-0.10.7-src.zip/py4j/java_gateway.py", line 1257, in __call__
2019-07-18 05:14:12.192 - stdout>   File "/tmp/test-spark/spark-2.4.3/python/lib/pyspark.zip/pyspark/sql/utils.py", line 63, in deco
2019-07-18 05:14:12.192 - stdout>   File "/tmp/test-spark/spark-2.4.3/python/lib/py4j-0.10.7-src.zip/py4j/protocol.py", line 328, in get_return_value
2019-07-18 05:14:12.194 - stdout> py4j.protocol.Py4JJavaError: An error occurred while calling o29.sql.
2019-07-18 05:14:12.194 - stdout> : java.lang.ExceptionInInitializerError
2019-07-18 05:14:12.194 - stdout> 	at org.apache.spark.SparkContext.withScope(SparkContext.scala:699)
2019-07-18 05:14:12.194 - stdout> 	at org.apache.spark.SparkContext.parallelize(SparkContext.scala:716)
2019-07-18 05:14:12.194 - stdout> 	at org.apache.spark.sql.execution.SparkStrategies.singleRowRdd$lzycompute(SparkStrategies.scala:470)
2019-07-18 05:14:12.194 - stdout> 	at org.apache.spark.sql.execution.SparkStrategies.singleRowRdd(SparkStrategies.scala:470)
2019-07-18 05:14:12.194 - stdout> 	at org.apache.spark.sql.execution.SparkStrategies$BasicOperators$.apply(SparkStrategies.scala:627)
2019-07-18 05:14:12.194 - stdout> 	at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$1.apply(QueryPlanner.scala:63)
2019-07-18 05:14:12.195 - stdout> 	at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$1.apply(QueryPlanner.scala:63)
2019-07-18 05:14:12.195 - stdout> 	at scala.collection.Iterator$$anon$12.nextCur(Iterator.scala:435)
2019-07-18 05:14:12.195 - stdout> 	at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:441)
2019-07-18 05:14:12.195 - stdout> 	at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:440)
2019-07-18 05:14:12.195 - stdout> 	at org.apache.spark.sql.catalyst.planning.QueryPlanner.plan(QueryPlanner.scala:93)
2019-07-18 05:14:12.195 - stdout> 	at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$2$$anonfun$apply$2.apply(QueryPlanner.scala:78)
2019-07-18 05:14:12.195 - stdout> 	at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$2$$anonfun$apply$2.apply(QueryPlanner.scala:75)
2019-07-18 05:14:12.195 - stdout> 	at scala.collection.TraversableOnce$$anonfun$foldLeft$1.apply(TraversableOnce.scala:157)
2019-07-18 05:14:12.195 - stdout> 	at scala.collection.TraversableOnce$$anonfun$foldLeft$1.apply(TraversableOnce.scala:157)
2019-07-18 05:14:12.195 - stdout> 	at scala.collection.Iterator$class.foreach(Iterator.scala:891)
2019-07-18 05:14:12.195 - stdout> 	at scala.collection.AbstractIterator.foreach(Iterator.scala:1334)
2019-07-18 05:14:12.195 - stdout> 	at scala.collection.TraversableOnce$class.foldLeft(TraversableOnce.scala:157)
2019-07-18 05:14:12.195 - stdout> 	at scala.collection.AbstractIterator.foldLeft(Iterator.scala:1334)
2019-07-18 05:14:12.195 - stdout> 	at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$2.apply(QueryPlanner.scala:75)
2019-07-18 05:14:12.195 - stdout> 	at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$2.apply(QueryPlanner.scala:67)
2019-07-18 05:14:12.195 - stdout> 	at scala.collection.Iterator$$anon$12.nextCur(Iterator.scala:435)
2019-07-18 05:14:12.195 - stdout> 	at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:441)
2019-07-18 05:14:12.195 - stdout> 	at org.apache.spark.sql.catalyst.planning.QueryPlanner.plan(QueryPlanner.scala:93)
2019-07-18 05:14:12.195 - stdout> 	at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$2$$anonfun$apply$2.apply(QueryPlanner.scala:78)
2019-07-18 05:14:12.195 - stdout> 	at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$2$$anonfun$apply$2.apply(QueryPlanner.scala:75)
2019-07-18 05:14:12.195 - stdout> 	at scala.collection.TraversableOnce$$anonfun$foldLeft$1.apply(TraversableOnce.scala:157)
2019-07-18 05:14:12.195 - stdout> 	at scala.collection.TraversableOnce$$anonfun$foldLeft$1.apply(TraversableOnce.scala:157)
2019-07-18 05:14:12.195 - stdout> 	at scala.collection.Iterator$class.foreach(Iterator.scala:891)
2019-07-18 05:14:12.195 - stdout> 	at scala.collection.AbstractIterator.foreach(Iterator.scala:1334)
2019-07-18 05:14:12.195 - stdout> 	at scala.collection.TraversableOnce$class.foldLeft(TraversableOnce.scala:157)
2019-07-18 05:14:12.195 - stdout> 	at scala.collection.AbstractIterator.foldLeft(Iterator.scala:1334)
2019-07-18 05:14:12.195 - stdout> 	at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$2.apply(QueryPlanner.scala:75)
2019-07-18 05:14:12.195 - stdout> 	at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$2.apply(QueryPlanner.scala:67)
2019-07-18 05:14:12.195 - stdout> 	at scala.collection.Iterator$$anon$12.nextCur(Iterator.scala:435)
2019-07-18 05:14:12.195 - stdout> 	at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:441)
2019-07-18 05:14:12.195 - stdout> 	at org.apache.spark.sql.catalyst.planning.QueryPlanner.plan(QueryPlanner.scala:93)
2019-07-18 05:14:12.195 - stdout> 	at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$2$$anonfun$apply$2.apply(QueryPlanner.scala:78)
2019-07-18 05:14:12.195 - stdout> 	at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$2$$anonfun$apply$2.apply(QueryPlanner.scala:75)
2019-07-18 05:14:12.195 - stdout> 	at scala.collection.TraversableOnce$$anonfun$foldLeft$1.apply(TraversableOnce.scala:157)
2019-07-18 05:14:12.195 - stdout> 	at scala.collection.TraversableOnce$$anonfun$foldLeft$1.apply(TraversableOnce.scala:157)
2019-07-18 05:14:12.195 - stdout> 	at scala.collection.Iterator$class.foreach(Iterator.scala:891)
2019-07-18 05:14:12.195 - stdout> 	at scala.collection.AbstractIterator.foreach(Iterator.scala:1334)
2019-07-18 05:14:12.195 - stdout> 	at scala.collection.TraversableOnce$class.foldLeft(TraversableOnce.scala:157)
2019-07-18 05:14:12.195 - stdout> 	at scala.collection.AbstractIterator.foldLeft(Iterator.scala:1334)
2019-07-18 05:14:12.195 - stdout> 	at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$2.apply(QueryPlanner.scala:75)
2019-07-18 05:14:12.195 - stdout> 	at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$2.apply(QueryPlanner.scala:67)
2019-07-18 05:14:12.195 - stdout> 	at scala.collection.Iterator$$anon$12.nextCur(Iterator.scala:435)
2019-07-18 05:14:12.195 - stdout> 	at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:441)
2019-07-18 05:14:12.195 - stdout> 	at org.apache.spark.sql.catalyst.planning.QueryPlanner.plan(QueryPlanner.scala:93)
2019-07-18 05:14:12.195 - stdout> 	at org.apache.spark.sql.execution.QueryExecution.sparkPlan$lzycompute(QueryExecution.scala:72)
2019-07-18 05:14:12.195 - stdout> 	at org.apache.spark.sql.execution.QueryExecution.sparkPlan(QueryExecution.scala:68)
2019-07-18 05:14:12.195 - stdout> 	at org.apache.spark.sql.execution.QueryExecution.executedPlan$lzycompute(QueryExecution.scala:77)
2019-07-18 05:14:12.195 - stdout> 	at org.apache.spark.sql.execution.QueryExecution.executedPlan(QueryExecution.scala:77)
2019-07-18 05:14:12.195 - stdout> 	at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3359)
2019-07-18 05:14:12.195 - stdout> 	at org.apache.spark.sql.Dataset.<init>(Dataset.scala:194)
2019-07-18 05:14:12.195 - stdout> 	at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:79)
2019-07-18 05:14:12.195 - stdout> 	at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:642)
2019-07-18 05:14:12.195 - stdout> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
2019-07-18 05:14:12.195 - stdout> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
2019-07-18 05:14:12.196 - stdout> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2019-07-18 05:14:12.196 - stdout> 	at java.lang.reflect.Method.invoke(Method.java:498)
2019-07-18 05:14:12.196 - stdout> 	at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
2019-07-18 05:14:12.196 - stdout> 	at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
2019-07-18 05:14:12.196 - stdout> 	at py4j.Gateway.invoke(Gateway.java:282)
2019-07-18 05:14:12.196 - stdout> 	at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
2019-07-18 05:14:12.196 - stdout> 	at py4j.commands.CallCommand.execute(CallCommand.java:79)
2019-07-18 05:14:12.196 - stdout> 	at py4j.GatewayConnection.run(GatewayConnection.java:238)
2019-07-18 05:14:12.196 - stdout> 	at java.lang.Thread.run(Thread.java:748)
2019-07-18 05:14:12.196 - stdout> Caused by: java.util.NoSuchElementException: key not found: groupId
2019-07-18 05:14:12.196 - stdout> 	at scala.collection.MapLike$class.default(MapLike.scala:228)
2019-07-18 05:14:12.196 - stdout> 	at scala.collection.AbstractMap.default(Map.scala:59)
2019-07-18 05:14:12.196 - stdout> 	at scala.collection.MapLike$class.apply(MapLike.scala:141)
2019-07-18 05:14:12.196 - stdout> 	at scala.collection.AbstractMap.apply(Map.scala:59)
2019-07-18 05:14:12.196 - stdout> 	at com.fasterxml.jackson.module.scala.JacksonModule$.version$lzycompute(JacksonModule.scala:27)
2019-07-18 05:14:12.196 - stdout> 	at com.fasterxml.jackson.module.scala.JacksonModule$.version(JacksonModule.scala:26)
2019-07-18 05:14:12.196 - stdout> 	at com.fasterxml.jackson.module.scala.JacksonModule$class.version(JacksonModule.scala:49)
2019-07-18 05:14:12.196 - stdout> 	at com.fasterxml.jackson.module.scala.DefaultScalaModule.version(DefaultScalaModule.scala:19)
2019-07-18 05:14:12.196 - stdout> 	at com.fasterxml.jackson.databind.ObjectMapper.registerModule(ObjectMapper.java:710)
2019-07-18 05:14:12.196 - stdout> 	at org.apache.spark.rdd.RDDOperationScope$.<init>(RDDOperationScope.scala:82)
2019-07-18 05:14:12.196 - stdout> 	at org.apache.spark.rdd.RDDOperationScope$.<clinit>(RDDOperationScope.scala)
2019-07-18 05:14:12.196 - stdout> 	... 69 more
2019-07-18 05:14:12.196 - stdout> 
2019-07-18 05:14:12.243 - stderr> 19/07/18 05:14:12 INFO SparkContext: Invoking stop() from shutdown hook
2019-07-18 05:14:12.254 - stderr> 19/07/18 05:14:12 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
2019-07-18 05:14:12.263 - stderr> 19/07/18 05:14:12 INFO MemoryStore: MemoryStore cleared
2019-07-18 05:14:12.263 - stderr> 19/07/18 05:14:12 INFO BlockManager: BlockManager stopped
2019-07-18 05:14:12.269 - stderr> 19/07/18 05:14:12 INFO BlockManagerMaster: BlockManagerMaster stopped
2019-07-18 05:14:12.271 - stderr> 19/07/18 05:14:12 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
2019-07-18 05:14:12.276 - stderr> 19/07/18 05:14:12 INFO SparkContext: Successfully stopped SparkContext
2019-07-18 05:14:12.276 - stderr> 19/07/18 05:14:12 INFO ShutdownHookManager: Shutdown hook called
2019-07-18 05:14:12.277 - stderr> 19/07/18 05:14:12 INFO ShutdownHookManager: Deleting directory /tmp/spark-eea0ff7a-8cc1-418b-a261-94ea22f2c5c5
2019-07-18 05:14:12.28 - stderr> 19/07/18 05:14:12 INFO ShutdownHookManager: Deleting directory /tmp/spark-eea0ff7a-8cc1-418b-a261-94ea22f2c5c5/pyspark-660d54f6-554b-408c-a5c0-8213cf5a22c5
2019-07-18 05:14:12.283 - stderr> 19/07/18 05:14:12 INFO ShutdownHookManager: Deleting directory /tmp/spark-5d30178b-f76f-43a4-87e6-f06918a2355a
           
	at org.scalatest.Assertions.newAssertionFailedException(Assertions.scala:528)
	at org.scalatest.Assertions.newAssertionFailedException$(Assertions.scala:527)
	at org.scalatest.FunSuite.newAssertionFailedException(FunSuite.scala:1560)
	at org.scalatest.Assertions.fail(Assertions.scala:1089)
	at org.scalatest.Assertions.fail$(Assertions.scala:1085)
	at org.scalatest.FunSuite.fail(FunSuite.scala:1560)
	at org.apache.spark.sql.hive.SparkSubmitTestUtils.runSparkSubmit(SparkSubmitTestUtils.scala:94)
	at org.apache.spark.sql.hive.SparkSubmitTestUtils.runSparkSubmit$(SparkSubmitTestUtils.scala:41)
	at org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite.runSparkSubmit(HiveExternalCatalogVersionsSuite.scala:47)
	at org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite.$anonfun$beforeAll$1(HiveExternalCatalogVersionsSuite.scala:198)
	at org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite.$anonfun$beforeAll$1$adapted(HiveExternalCatalogVersionsSuite.scala:181)
	at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
	at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
	at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
	at org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite.beforeAll(HiveExternalCatalogVersionsSuite.scala:181)
	at org.scalatest.BeforeAndAfterAll.liftedTree1$1(BeforeAndAfterAll.scala:212)
	at org.scalatest.BeforeAndAfterAll.run(BeforeAndAfterAll.scala:210)
	at org.scalatest.BeforeAndAfterAll.run$(BeforeAndAfterAll.scala:208)
	at org.apache.spark.SparkFunSuite.run(SparkFunSuite.scala:56)
	at org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:314)
	at org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:507)
	at sbt.ForkMain$Run$2.call(ForkMain.java:296)
	at sbt.ForkMain$Run$2.call(ForkMain.java:286)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)