sbt.ForkMain$ForkError: org.scalatest.exceptions.TestFailedException: spark-submit returned with exit code 1.
Command line: './bin/spark-submit' '--name' 'prepare testing tables' '--master' 'local[2]' '--conf' 'spark.ui.enabled=false' '--conf' 'spark.master.rest.enabled=false' '--conf' 'spark.sql.warehouse.dir=/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/target/tmp/warehouse-f4c5d725-dd70-4a76-9d89-789970d43cb9' '--conf' 'spark.sql.test.version.index=0' '--driver-java-options' '-Dderby.system.home=/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/target/tmp/warehouse-f4c5d725-dd70-4a76-9d89-789970d43cb9' '/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/target/tmp/test3654544805004184229.py'

2018-08-30 17:59:38.502 - stderr> SLF4J: Class path contains multiple SLF4J bindings.
2018-08-30 17:59:38.502 - stderr> SLF4J: Found binding in [jar:file:/tmp/test-spark/spark-2.1.3/jars/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]
2018-08-30 17:59:38.502 - stderr> SLF4J: Found binding in [jar:file:/home/sparkivy/per-executor-caches/1/.ivy2/cache/org.slf4j/slf4j-log4j12/jars/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]
2018-08-30 17:59:38.502 - stderr> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
2018-08-30 17:59:38.502 - stderr> SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
2018-08-30 17:59:38.804 - stdout> 17:59:38.804 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2018-08-30 17:59:41.401 - stdout> 17:59:41.401 WARN DataNucleus.General: Plugin (Bundle) "org.datanucleus.store.rdbms" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/home/sparkivy/per-executor-caches/1/.ivy2/cache/org.datanucleus/datanucleus-rdbms/jars/datanucleus-rdbms-3.2.9.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/tmp/test-spark/spark-2.1.3/jars/datanucleus-rdbms-3.2.9.jar."
2018-08-30 17:59:41.426 - stdout> 17:59:41.426 WARN DataNucleus.General: Plugin (Bundle) "org.datanucleus" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/tmp/test-spark/spark-2.1.3/jars/datanucleus-core-3.2.10.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/home/sparkivy/per-executor-caches/1/.ivy2/cache/org.datanucleus/datanucleus-core/jars/datanucleus-core-3.2.10.jar."
2018-08-30 17:59:41.431 - stdout> 17:59:41.431 WARN DataNucleus.General: Plugin (Bundle) "org.datanucleus.api.jdo" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/tmp/test-spark/spark-2.1.3/jars/datanucleus-api-jdo-3.2.6.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/home/sparkivy/per-executor-caches/1/.ivy2/cache/org.datanucleus/datanucleus-api-jdo/jars/datanucleus-api-jdo-3.2.6.jar."
2018-08-30 17:59:57.985 - stdout> 17:59:57.985 WARN org.apache.hadoop.hive.metastore.ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.2.0
2018-08-30 17:59:58.339 - stdout> 17:59:58.339 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database default, returning NoSuchObjectException
2018-08-30 18:00:00.262 - stdout> 18:00:00.262 WARN org.apache.hadoop.hive.metastore.ObjectStore: Failed to get database global_temp, returning NoSuchObjectException
2018-08-30 18:00:02.144 - stdout> Traceback (most recent call last):
2018-08-30 18:00:02.145 - stdout>   File "/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/target/tmp/test3654544805004184229.py", line 8, in <module>
2018-08-30 18:00:02.145 - stdout>     spark.sql("create table data_source_tbl_{} using json as select 1 i".format(version_index))
2018-08-30 18:00:02.145 - stdout>   File "/tmp/test-spark/spark-2.1.3/python/lib/pyspark.zip/pyspark/sql/session.py", line 545, in sql
2018-08-30 18:00:02.145 - stdout>   File "/tmp/test-spark/spark-2.1.3/python/lib/py4j-0.10.7-src.zip/py4j/java_gateway.py", line 1257, in __call__
2018-08-30 18:00:02.145 - stdout>   File "/tmp/test-spark/spark-2.1.3/python/lib/pyspark.zip/pyspark/sql/utils.py", line 63, in deco
2018-08-30 18:00:02.145 - stdout>   File "/tmp/test-spark/spark-2.1.3/python/lib/py4j-0.10.7-src.zip/py4j/protocol.py", line 328, in get_return_value
2018-08-30 18:00:02.146 - stdout> py4j.protocol.Py4JJavaError: An error occurred while calling o27.sql.
2018-08-30 18:00:02.146 - stdout> : java.lang.ExceptionInInitializerError
2018-08-30 18:00:02.146 - stdout> 	at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:132)
2018-08-30 18:00:02.146 - stdout> 	at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:113)
2018-08-30 18:00:02.146 - stdout> 	at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:92)
2018-08-30 18:00:02.146 - stdout> 	at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:92)
2018-08-30 18:00:02.147 - stdout> 	at org.apache.spark.sql.Dataset.<init>(Dataset.scala:185)
2018-08-30 18:00:02.147 - stdout> 	at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:64)
2018-08-30 18:00:02.147 - stdout> 	at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:600)
2018-08-30 18:00:02.147 - stdout> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
2018-08-30 18:00:02.147 - stdout> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
2018-08-30 18:00:02.147 - stdout> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2018-08-30 18:00:02.147 - stdout> 	at java.lang.reflect.Method.invoke(Method.java:497)
2018-08-30 18:00:02.147 - stdout> 	at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
2018-08-30 18:00:02.147 - stdout> 	at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
2018-08-30 18:00:02.147 - stdout> 	at py4j.Gateway.invoke(Gateway.java:282)
2018-08-30 18:00:02.147 - stdout> 	at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
2018-08-30 18:00:02.147 - stdout> 	at py4j.commands.CallCommand.execute(CallCommand.java:79)
2018-08-30 18:00:02.147 - stdout> 	at py4j.GatewayConnection.run(GatewayConnection.java:238)
2018-08-30 18:00:02.147 - stdout> 	at java.lang.Thread.run(Thread.java:745)
2018-08-30 18:00:02.147 - stdout> Caused by: java.util.NoSuchElementException: key not found: groupId
2018-08-30 18:00:02.147 - stdout> 	at scala.collection.MapLike$class.default(MapLike.scala:228)
2018-08-30 18:00:02.147 - stdout> 	at scala.collection.AbstractMap.default(Map.scala:59)
2018-08-30 18:00:02.147 - stdout> 	at scala.collection.MapLike$class.apply(MapLike.scala:141)
2018-08-30 18:00:02.147 - stdout> 	at scala.collection.AbstractMap.apply(Map.scala:59)
2018-08-30 18:00:02.147 - stdout> 	at com.fasterxml.jackson.module.scala.JacksonModule$.version$lzycompute(JacksonModule.scala:27)
2018-08-30 18:00:02.147 - stdout> 	at com.fasterxml.jackson.module.scala.JacksonModule$.version(JacksonModule.scala:26)
2018-08-30 18:00:02.147 - stdout> 	at com.fasterxml.jackson.module.scala.JacksonModule$class.version(JacksonModule.scala:49)
2018-08-30 18:00:02.147 - stdout> 	at com.fasterxml.jackson.module.scala.DefaultScalaModule.version(DefaultScalaModule.scala:19)
2018-08-30 18:00:02.147 - stdout> 	at com.fasterxml.jackson.databind.ObjectMapper.registerModule(ObjectMapper.java:710)
2018-08-30 18:00:02.147 - stdout> 	at org.apache.spark.rdd.RDDOperationScope$.<init>(RDDOperationScope.scala:82)
2018-08-30 18:00:02.147 - stdout> 	at org.apache.spark.rdd.RDDOperationScope$.<clinit>(RDDOperationScope.scala)
2018-08-30 18:00:02.147 - stdout> 	... 18 more
2018-08-30 18:00:02.147 - stdout> 
	at org.scalatest.Assertions$class.newAssertionFailedException(Assertions.scala:528)
	at org.scalatest.FunSuite.newAssertionFailedException(FunSuite.scala:1560)
	at org.scalatest.Assertions$class.fail(Assertions.scala:1089)
	at org.scalatest.FunSuite.fail(FunSuite.scala:1560)
	at org.apache.spark.sql.hive.SparkSubmitTestUtils$class.runSparkSubmit(SparkSubmitTestUtils.scala:84)
	at org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite.runSparkSubmit(HiveExternalCatalogVersionsSuite.scala:43)
	at org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite$$anonfun$beforeAll$1.apply(HiveExternalCatalogVersionsSuite.scala:184)
	at org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite$$anonfun$beforeAll$1.apply(HiveExternalCatalogVersionsSuite.scala:169)
	at scala.collection.immutable.List.foreach(List.scala:392)
	at org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite.beforeAll(HiveExternalCatalogVersionsSuite.scala:169)
	at org.scalatest.BeforeAndAfterAll$class.liftedTree1$1(BeforeAndAfterAll.scala:212)
	at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:210)
	at org.apache.spark.SparkFunSuite.run(SparkFunSuite.scala:52)
	at org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:314)
	at org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:480)
	at sbt.ForkMain$Run$2.call(ForkMain.java:296)
	at sbt.ForkMain$Run$2.call(ForkMain.java:286)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
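
Triage note: the root cause in the captured output, "java.util.NoSuchElementException: key not found: groupId", is the exception scala.collection.MapLike#apply throws for an absent key; the trace shows it surfacing from com.fasterxml.jackson.module.scala.JacksonModule.version, which appears to look up build metadata such as groupId. Below is a minimal, hypothetical sketch of that lookup pattern producing the same exception; the object and map names are illustrative only and are not Spark or Jackson code.

// Hypothetical reproduction of the failing lookup pattern.
object GroupIdLookupSketch {
  def main(args: Array[String]): Unit = {
    // Stand-in for the build metadata the Jackson Scala module reads from its
    // jar; the "groupId" entry is deliberately missing here.
    val buildProps: Map[String, String] = Map("version" -> "2.6.5")

    // Map.apply delegates to MapLike.default for absent keys and throws
    // java.util.NoSuchElementException: key not found: groupId
    val groupId = buildProps("groupId")
    println(groupId)
  }
}

The earlier warnings in the same run (duplicate SLF4J bindings and DataNucleus plugins found under both /tmp/test-spark/spark-2.1.3/jars and the per-executor ivy cache) indicate two overlapping jar sources on the classpath, which may explain why the Jackson Scala module picked up metadata missing the expected key.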