sbt.ForkMain$ForkError: org.scalatest.exceptions.TestFailedException: spark-submit returned with exit code 1.
Command line: './bin/spark-submit' '--name' 'prepare testing tables' '--master' 'local[2]' '--conf' 'spark.ui.enabled=false' '--conf' 'spark.master.rest.enabled=false' '--conf' 'spark.sql.warehouse.dir=/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/sql/hive/target/tmp/org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite/warehouse-2ad7cd7e-d8cb-455b-95ef-ff2d9a604f3a' '--conf' 'spark.sql.test.version.index=0' '--driver-java-options' '-Dderby.system.home=/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/sql/hive/target/tmp/org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite/warehouse-2ad7cd7e-d8cb-455b-95ef-ff2d9a604f3a' '/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/sql/hive/target/tmp/org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite/test5924024119323664615.py'

2019-04-19 12:29:13.999 - stdout> 2019-04-19 12:29:13 WARN  NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2019-04-19 12:29:14.6 - stdout> 2019-04-19 12:29:14 INFO  SparkContext:54 - Running Spark version 2.3.3
2019-04-19 12:29:14.619 - stdout> 2019-04-19 12:29:14 INFO  SparkContext:54 - Submitted application: prepare testing tables
2019-04-19 12:29:14.672 - stdout> 2019-04-19 12:29:14 INFO  SecurityManager:54 - Changing view acls to: jenkins
2019-04-19 12:29:14.673 - stdout> 2019-04-19 12:29:14 INFO  SecurityManager:54 - Changing modify acls to: jenkins
2019-04-19 12:29:14.673 - stdout> 2019-04-19 12:29:14 INFO  SecurityManager:54 - Changing view acls groups to: 
2019-04-19 12:29:14.674 - stdout> 2019-04-19 12:29:14 INFO  SecurityManager:54 - Changing modify acls groups to: 
2019-04-19 12:29:14.674 - stdout> 2019-04-19 12:29:14 INFO  SecurityManager:54 - SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(jenkins); groups with view permissions: Set(); users  with modify permissions: Set(jenkins); groups with modify permissions: Set()
2019-04-19 12:29:14.892 - stdout> 2019-04-19 12:29:14 INFO  Utils:54 - Successfully started service 'sparkDriver' on port 36986.
2019-04-19 12:29:14.92 - stdout> 2019-04-19 12:29:14 INFO  SparkEnv:54 - Registering MapOutputTracker
2019-04-19 12:29:14.943 - stdout> 2019-04-19 12:29:14 INFO  SparkEnv:54 - Registering BlockManagerMaster
2019-04-19 12:29:14.946 - stdout> 2019-04-19 12:29:14 INFO  BlockManagerMasterEndpoint:54 - Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
2019-04-19 12:29:14.947 - stdout> 2019-04-19 12:29:14 INFO  BlockManagerMasterEndpoint:54 - BlockManagerMasterEndpoint up
2019-04-19 12:29:14.957 - stdout> 2019-04-19 12:29:14 INFO  DiskBlockManager:54 - Created local directory at /tmp/blockmgr-83dc744d-9b43-47de-8e1a-17b020f6c5f5
2019-04-19 12:29:14.977 - stdout> 2019-04-19 12:29:14 INFO  MemoryStore:54 - MemoryStore started with capacity 366.3 MB
2019-04-19 12:29:14.994 - stdout> 2019-04-19 12:29:14 INFO  SparkEnv:54 - Registering OutputCommitCoordinator
2019-04-19 12:29:15.301 - stdout> 2019-04-19 12:29:15 INFO  SparkContext:54 - Added file file:/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/sql/hive/target/tmp/org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite/test5924024119323664615.py at file:/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/sql/hive/target/tmp/org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite/test5924024119323664615.py with timestamp 1555702155300
2019-04-19 12:29:15.303 - stdout> 2019-04-19 12:29:15 INFO  Utils:54 - Copying /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/sql/hive/target/tmp/org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite/test5924024119323664615.py to /tmp/spark-7b6a10b7-370e-4dd7-8a5d-591bcb610267/userFiles-4745b442-3fef-40ae-bb92-dfa244fb1fa9/test5924024119323664615.py
2019-04-19 12:29:15.369 - stdout> 2019-04-19 12:29:15 INFO  Executor:54 - Starting executor ID driver on host localhost
2019-04-19 12:29:15.39 - stdout> 2019-04-19 12:29:15 INFO  Utils:54 - Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 46335.
2019-04-19 12:29:15.391 - stdout> 2019-04-19 12:29:15 INFO  NettyBlockTransferService:54 - Server created on amp-jenkins-worker-04.amp:46335
2019-04-19 12:29:15.392 - stdout> 2019-04-19 12:29:15 INFO  BlockManager:54 - Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
2019-04-19 12:29:15.419 - stdout> 2019-04-19 12:29:15 INFO  BlockManagerMaster:54 - Registering BlockManager BlockManagerId(driver, amp-jenkins-worker-04.amp, 46335, None)
2019-04-19 12:29:15.424 - stdout> 2019-04-19 12:29:15 INFO  BlockManagerMasterEndpoint:54 - Registering block manager amp-jenkins-worker-04.amp:46335 with 366.3 MB RAM, BlockManagerId(driver, amp-jenkins-worker-04.amp, 46335, None)
2019-04-19 12:29:15.428 - stdout> 2019-04-19 12:29:15 INFO  BlockManagerMaster:54 - Registered BlockManager BlockManagerId(driver, amp-jenkins-worker-04.amp, 46335, None)
2019-04-19 12:29:15.428 - stdout> 2019-04-19 12:29:15 INFO  BlockManager:54 - Initialized BlockManager: BlockManagerId(driver, amp-jenkins-worker-04.amp, 46335, None)
2019-04-19 12:29:15.619 - stdout> 2019-04-19 12:29:15 INFO  log:192 - Logging initialized @2749ms
2019-04-19 12:29:15.771 - stdout> 2019-04-19 12:29:15 INFO  SharedState:54 - Setting hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir ('/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/sql/hive/target/tmp/org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite/warehouse-2ad7cd7e-d8cb-455b-95ef-ff2d9a604f3a').
2019-04-19 12:29:15.772 - stdout> 2019-04-19 12:29:15 INFO  SharedState:54 - Warehouse path is '/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/sql/hive/target/tmp/org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite/warehouse-2ad7cd7e-d8cb-455b-95ef-ff2d9a604f3a'.
2019-04-19 12:29:16.239 - stdout> 2019-04-19 12:29:16 INFO  StateStoreCoordinatorRef:54 - Registered StateStoreCoordinator endpoint
2019-04-19 12:29:16.583 - stdout> 2019-04-19 12:29:16 INFO  HiveUtils:54 - Initializing HiveMetastoreConnection version 1.2.1 using Spark classes.
2019-04-19 12:29:17.221 - stdout> 2019-04-19 12:29:17 INFO  HiveMetaStore:589 - 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
2019-04-19 12:29:17.248 - stdout> 2019-04-19 12:29:17 INFO  ObjectStore:289 - ObjectStore, initialize called
2019-04-19 12:29:17.389 - stdout> 2019-04-19 12:29:17 INFO  Persistence:77 - Property hive.metastore.integral.jdo.pushdown unknown - will be ignored
2019-04-19 12:29:17.389 - stdout> 2019-04-19 12:29:17 INFO  Persistence:77 - Property datanucleus.cache.level2 unknown - will be ignored
2019-04-19 12:29:27.836 - stdout> 2019-04-19 12:29:27 INFO  ObjectStore:370 - Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
2019-04-19 12:29:29.499 - stdout> 2019-04-19 12:29:29 INFO  Datastore:77 - The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
2019-04-19 12:29:29.499 - stdout> 2019-04-19 12:29:29 INFO  Datastore:77 - The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
2019-04-19 12:29:39.812 - stdout> 2019-04-19 12:29:39 INFO  Datastore:77 - The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
2019-04-19 12:29:39.812 - stdout> 2019-04-19 12:29:39 INFO  Datastore:77 - The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
2019-04-19 12:29:42.985 - stdout> 2019-04-19 12:29:42 INFO  MetaStoreDirectSql:139 - Using direct SQL, underlying DB is DERBY
2019-04-19 12:29:42.988 - stdout> 2019-04-19 12:29:42 INFO  ObjectStore:272 - Initialized ObjectStore
2019-04-19 12:29:43.365 - stdout> 2019-04-19 12:29:43 WARN  ObjectStore:6666 - Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.2.0
2019-04-19 12:29:43.692 - stdout> 2019-04-19 12:29:43 WARN  ObjectStore:568 - Failed to get database default, returning NoSuchObjectException
2019-04-19 12:29:44.217 - stdout> 2019-04-19 12:29:44 INFO  HiveMetaStore:663 - Added admin role in metastore
2019-04-19 12:29:44.233 - stdout> 2019-04-19 12:29:44 INFO  HiveMetaStore:672 - Added public role in metastore
2019-04-19 12:29:44.816 - stdout> 2019-04-19 12:29:44 INFO  HiveMetaStore:712 - No user is added in admin role, since config is empty
2019-04-19 12:29:44.947 - stdout> 2019-04-19 12:29:44 INFO  HiveMetaStore:746 - 0: get_all_databases
2019-04-19 12:29:44.948 - stdout> 2019-04-19 12:29:44 INFO  audit:371 - ugi=jenkins	ip=unknown-ip-addr	cmd=get_all_databases	
2019-04-19 12:29:44.963 - stdout> 2019-04-19 12:29:44 INFO  HiveMetaStore:746 - 0: get_functions: db=default pat=*
2019-04-19 12:29:44.963 - stdout> 2019-04-19 12:29:44 INFO  audit:371 - ugi=jenkins	ip=unknown-ip-addr	cmd=get_functions: db=default pat=*	
2019-04-19 12:29:44.965 - stdout> 2019-04-19 12:29:44 INFO  Datastore:77 - The class "org.apache.hadoop.hive.metastore.model.MResourceUri" is tagged as "embedded-only" so does not have its own datastore table.
2019-04-19 12:29:46.26 - stdout> 2019-04-19 12:29:46 INFO  SessionState:641 - Created local directory: /tmp/4bc68a3b-73bd-4a2a-8ab5-344458265efc_resources
2019-04-19 12:29:46.264 - stdout> 2019-04-19 12:29:46 INFO  SessionState:641 - Created HDFS directory: /tmp/hive/jenkins/4bc68a3b-73bd-4a2a-8ab5-344458265efc
2019-04-19 12:29:46.267 - stdout> 2019-04-19 12:29:46 INFO  SessionState:641 - Created local directory: /tmp/jenkins/4bc68a3b-73bd-4a2a-8ab5-344458265efc
2019-04-19 12:29:46.27 - stdout> 2019-04-19 12:29:46 INFO  SessionState:641 - Created HDFS directory: /tmp/hive/jenkins/4bc68a3b-73bd-4a2a-8ab5-344458265efc/_tmp_space.db
2019-04-19 12:29:46.273 - stdout> 2019-04-19 12:29:46 INFO  HiveClientImpl:54 - Warehouse location for Hive client (version 1.2.2) is /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/sql/hive/target/tmp/org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite/warehouse-2ad7cd7e-d8cb-455b-95ef-ff2d9a604f3a
2019-04-19 12:29:46.285 - stdout> 2019-04-19 12:29:46 INFO  HiveMetaStore:746 - 0: get_database: default
2019-04-19 12:29:46.285 - stdout> 2019-04-19 12:29:46 INFO  audit:371 - ugi=jenkins	ip=unknown-ip-addr	cmd=get_database: default	
2019-04-19 12:29:46.304 - stdout> 2019-04-19 12:29:46 INFO  HiveMetaStore:746 - 0: get_database: global_temp
2019-04-19 12:29:46.304 - stdout> 2019-04-19 12:29:46 INFO  audit:371 - ugi=jenkins	ip=unknown-ip-addr	cmd=get_database: global_temp	
2019-04-19 12:29:46.306 - stdout> 2019-04-19 12:29:46 WARN  ObjectStore:568 - Failed to get database global_temp, returning NoSuchObjectException
2019-04-19 12:29:47.69 - stdout> Traceback (most recent call last):
2019-04-19 12:29:47.69 - stdout>   File "/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/sql/hive/target/tmp/org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite/test5924024119323664615.py", line 8, in <module>
2019-04-19 12:29:47.69 - stdout>     spark.sql("create table data_source_tbl_{} using json as select 1 i".format(version_index))
2019-04-19 12:29:47.69 - stdout>   File "/tmp/test-spark/spark-2.3.3/python/lib/pyspark.zip/pyspark/sql/session.py", line 710, in sql
2019-04-19 12:29:47.69 - stdout>   File "/tmp/test-spark/spark-2.3.3/python/lib/py4j-0.10.7-src.zip/py4j/java_gateway.py", line 1257, in __call__
2019-04-19 12:29:47.691 - stdout>   File "/tmp/test-spark/spark-2.3.3/python/lib/pyspark.zip/pyspark/sql/utils.py", line 63, in deco
2019-04-19 12:29:47.691 - stdout>   File "/tmp/test-spark/spark-2.3.3/python/lib/py4j-0.10.7-src.zip/py4j/protocol.py", line 328, in get_return_value
2019-04-19 12:29:47.692 - stdout> py4j.protocol.Py4JJavaError: An error occurred while calling o27.sql.
2019-04-19 12:29:47.692 - stdout> : java.lang.ExceptionInInitializerError
2019-04-19 12:29:47.692 - stdout> 	at org.apache.spark.SparkContext.withScope(SparkContext.scala:693)
2019-04-19 12:29:47.693 - stdout> 	at org.apache.spark.SparkContext.parallelize(SparkContext.scala:710)
2019-04-19 12:29:47.693 - stdout> 	at org.apache.spark.sql.execution.SparkStrategies.singleRowRdd$lzycompute(SparkStrategies.scala:399)
2019-04-19 12:29:47.693 - stdout> 	at org.apache.spark.sql.execution.SparkStrategies.singleRowRdd(SparkStrategies.scala:399)
2019-04-19 12:29:47.693 - stdout> 	at org.apache.spark.sql.execution.SparkStrategies$BasicOperators$.apply(SparkStrategies.scala:536)
2019-04-19 12:29:47.693 - stdout> 	at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$1.apply(QueryPlanner.scala:63)
2019-04-19 12:29:47.693 - stdout> 	at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$1.apply(QueryPlanner.scala:63)
2019-04-19 12:29:47.693 - stdout> 	at scala.collection.Iterator$$anon$12.nextCur(Iterator.scala:434)
2019-04-19 12:29:47.693 - stdout> 	at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:440)
2019-04-19 12:29:47.693 - stdout> 	at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:439)
2019-04-19 12:29:47.693 - stdout> 	at org.apache.spark.sql.catalyst.planning.QueryPlanner.plan(QueryPlanner.scala:93)
2019-04-19 12:29:47.693 - stdout> 	at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$2$$anonfun$apply$2.apply(QueryPlanner.scala:78)
2019-04-19 12:29:47.693 - stdout> 	at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$2$$anonfun$apply$2.apply(QueryPlanner.scala:75)
2019-04-19 12:29:47.693 - stdout> 	at scala.collection.TraversableOnce$$anonfun$foldLeft$1.apply(TraversableOnce.scala:157)
2019-04-19 12:29:47.693 - stdout> 	at scala.collection.TraversableOnce$$anonfun$foldLeft$1.apply(TraversableOnce.scala:157)
2019-04-19 12:29:47.693 - stdout> 	at scala.collection.Iterator$class.foreach(Iterator.scala:893)
2019-04-19 12:29:47.693 - stdout> 	at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
2019-04-19 12:29:47.693 - stdout> 	at scala.collection.TraversableOnce$class.foldLeft(TraversableOnce.scala:157)
2019-04-19 12:29:47.693 - stdout> 	at scala.collection.AbstractIterator.foldLeft(Iterator.scala:1336)
2019-04-19 12:29:47.694 - stdout> 	at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$2.apply(QueryPlanner.scala:75)
2019-04-19 12:29:47.694 - stdout> 	at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$2.apply(QueryPlanner.scala:67)
2019-04-19 12:29:47.694 - stdout> 	at scala.collection.Iterator$$anon$12.nextCur(Iterator.scala:434)
2019-04-19 12:29:47.694 - stdout> 	at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:440)
2019-04-19 12:29:47.694 - stdout> 	at org.apache.spark.sql.catalyst.planning.QueryPlanner.plan(QueryPlanner.scala:93)
2019-04-19 12:29:47.694 - stdout> 	at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$2$$anonfun$apply$2.apply(QueryPlanner.scala:78)
2019-04-19 12:29:47.694 - stdout> 	at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$2$$anonfun$apply$2.apply(QueryPlanner.scala:75)
2019-04-19 12:29:47.694 - stdout> 	at scala.collection.TraversableOnce$$anonfun$foldLeft$1.apply(TraversableOnce.scala:157)
2019-04-19 12:29:47.694 - stdout> 	at scala.collection.TraversableOnce$$anonfun$foldLeft$1.apply(TraversableOnce.scala:157)
2019-04-19 12:29:47.694 - stdout> 	at scala.collection.Iterator$class.foreach(Iterator.scala:893)
2019-04-19 12:29:47.694 - stdout> 	at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
2019-04-19 12:29:47.694 - stdout> 	at scala.collection.TraversableOnce$class.foldLeft(TraversableOnce.scala:157)
2019-04-19 12:29:47.694 - stdout> 	at scala.collection.AbstractIterator.foldLeft(Iterator.scala:1336)
2019-04-19 12:29:47.694 - stdout> 	at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$2.apply(QueryPlanner.scala:75)
2019-04-19 12:29:47.694 - stdout> 	at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$2.apply(QueryPlanner.scala:67)
2019-04-19 12:29:47.694 - stdout> 	at scala.collection.Iterator$$anon$12.nextCur(Iterator.scala:434)
2019-04-19 12:29:47.694 - stdout> 	at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:440)
2019-04-19 12:29:47.694 - stdout> 	at org.apache.spark.sql.catalyst.planning.QueryPlanner.plan(QueryPlanner.scala:93)
2019-04-19 12:29:47.694 - stdout> 	at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$2$$anonfun$apply$2.apply(QueryPlanner.scala:78)
2019-04-19 12:29:47.694 - stdout> 	at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$2$$anonfun$apply$2.apply(QueryPlanner.scala:75)
2019-04-19 12:29:47.694 - stdout> 	at scala.collection.TraversableOnce$$anonfun$foldLeft$1.apply(TraversableOnce.scala:157)
2019-04-19 12:29:47.695 - stdout> 	at scala.collection.TraversableOnce$$anonfun$foldLeft$1.apply(TraversableOnce.scala:157)
2019-04-19 12:29:47.695 - stdout> 	at scala.collection.Iterator$class.foreach(Iterator.scala:893)
2019-04-19 12:29:47.695 - stdout> 	at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
2019-04-19 12:29:47.695 - stdout> 	at scala.collection.TraversableOnce$class.foldLeft(TraversableOnce.scala:157)
2019-04-19 12:29:47.695 - stdout> 	at scala.collection.AbstractIterator.foldLeft(Iterator.scala:1336)
2019-04-19 12:29:47.695 - stdout> 	at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$2.apply(QueryPlanner.scala:75)
2019-04-19 12:29:47.695 - stdout> 	at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$2.apply(QueryPlanner.scala:67)
2019-04-19 12:29:47.695 - stdout> 	at scala.collection.Iterator$$anon$12.nextCur(Iterator.scala:434)
2019-04-19 12:29:47.695 - stdout> 	at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:440)
2019-04-19 12:29:47.695 - stdout> 	at org.apache.spark.sql.catalyst.planning.QueryPlanner.plan(QueryPlanner.scala:93)
2019-04-19 12:29:47.695 - stdout> 	at org.apache.spark.sql.execution.QueryExecution.sparkPlan$lzycompute(QueryExecution.scala:72)
2019-04-19 12:29:47.695 - stdout> 	at org.apache.spark.sql.execution.QueryExecution.sparkPlan(QueryExecution.scala:68)
2019-04-19 12:29:47.695 - stdout> 	at org.apache.spark.sql.execution.QueryExecution.executedPlan$lzycompute(QueryExecution.scala:77)
2019-04-19 12:29:47.695 - stdout> 	at org.apache.spark.sql.execution.QueryExecution.executedPlan(QueryExecution.scala:77)
2019-04-19 12:29:47.695 - stdout> 	at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3254)
2019-04-19 12:29:47.695 - stdout> 	at org.apache.spark.sql.Dataset.<init>(Dataset.scala:190)
2019-04-19 12:29:47.695 - stdout> 	at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:75)
2019-04-19 12:29:47.696 - stdout> 	at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:642)
2019-04-19 12:29:47.696 - stdout> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
2019-04-19 12:29:47.696 - stdout> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
2019-04-19 12:29:47.696 - stdout> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2019-04-19 12:29:47.696 - stdout> 	at java.lang.reflect.Method.invoke(Method.java:498)
2019-04-19 12:29:47.696 - stdout> 	at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
2019-04-19 12:29:47.696 - stdout> 	at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
2019-04-19 12:29:47.696 - stdout> 	at py4j.Gateway.invoke(Gateway.java:282)
2019-04-19 12:29:47.696 - stdout> 	at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
2019-04-19 12:29:47.696 - stdout> 	at py4j.commands.CallCommand.execute(CallCommand.java:79)
2019-04-19 12:29:47.696 - stdout> 	at py4j.GatewayConnection.run(GatewayConnection.java:238)
2019-04-19 12:29:47.696 - stdout> 	at java.lang.Thread.run(Thread.java:748)
2019-04-19 12:29:47.696 - stdout> Caused by: java.util.NoSuchElementException: key not found: groupId
2019-04-19 12:29:47.696 - stdout> 	at scala.collection.MapLike$class.default(MapLike.scala:228)
2019-04-19 12:29:47.696 - stdout> 	at scala.collection.AbstractMap.default(Map.scala:59)
2019-04-19 12:29:47.696 - stdout> 	at scala.collection.MapLike$class.apply(MapLike.scala:141)
2019-04-19 12:29:47.696 - stdout> 	at scala.collection.AbstractMap.apply(Map.scala:59)
2019-04-19 12:29:47.696 - stdout> 	at com.fasterxml.jackson.module.scala.JacksonModule$.version$lzycompute(JacksonModule.scala:27)
2019-04-19 12:29:47.696 - stdout> 	at com.fasterxml.jackson.module.scala.JacksonModule$.version(JacksonModule.scala:26)
2019-04-19 12:29:47.696 - stdout> 	at com.fasterxml.jackson.module.scala.JacksonModule$class.version(JacksonModule.scala:49)
2019-04-19 12:29:47.696 - stdout> 	at com.fasterxml.jackson.module.scala.DefaultScalaModule.version(DefaultScalaModule.scala:19)
2019-04-19 12:29:47.696 - stdout> 	at com.fasterxml.jackson.databind.ObjectMapper.registerModule(ObjectMapper.java:710)
2019-04-19 12:29:47.696 - stdout> 	at org.apache.spark.rdd.RDDOperationScope$.<init>(RDDOperationScope.scala:82)
2019-04-19 12:29:47.696 - stdout> 	at org.apache.spark.rdd.RDDOperationScope$.<clinit>(RDDOperationScope.scala)
2019-04-19 12:29:47.696 - stdout> 	... 69 more
2019-04-19 12:29:47.697 - stdout> 
2019-04-19 12:29:47.739 - stdout> 2019-04-19 12:29:47 INFO  SparkContext:54 - Invoking stop() from shutdown hook
2019-04-19 12:29:47.753 - stdout> 2019-04-19 12:29:47 INFO  MapOutputTrackerMasterEndpoint:54 - MapOutputTrackerMasterEndpoint stopped!
2019-04-19 12:29:47.761 - stdout> 2019-04-19 12:29:47 INFO  MemoryStore:54 - MemoryStore cleared
2019-04-19 12:29:47.761 - stdout> 2019-04-19 12:29:47 INFO  BlockManager:54 - BlockManager stopped
2019-04-19 12:29:47.762 - stdout> 2019-04-19 12:29:47 INFO  BlockManagerMaster:54 - BlockManagerMaster stopped
2019-04-19 12:29:47.768 - stdout> 2019-04-19 12:29:47 INFO  OutputCommitCoordinator$OutputCommitCoordinatorEndpoint:54 - OutputCommitCoordinator stopped!
2019-04-19 12:29:47.77 - stdout> 2019-04-19 12:29:47 INFO  SparkContext:54 - Successfully stopped SparkContext
2019-04-19 12:29:47.77 - stdout> 2019-04-19 12:29:47 INFO  ShutdownHookManager:54 - Shutdown hook called
2019-04-19 12:29:47.771 - stdout> 2019-04-19 12:29:47 INFO  ShutdownHookManager:54 - Deleting directory /tmp/spark-7b6a10b7-370e-4dd7-8a5d-591bcb610267
2019-04-19 12:29:47.772 - stdout> 2019-04-19 12:29:47 INFO  ShutdownHookManager:54 - Deleting directory /tmp/spark-7b6a10b7-370e-4dd7-8a5d-591bcb610267/pyspark-bc4ed90c-a6ab-4109-aee9-51d218f23042
2019-04-19 12:29:47.772 - stdout> 2019-04-19 12:29:47 INFO  ShutdownHookManager:54 - Deleting directory /tmp/spark-9503b113-97b7-4215-b53f-cae37b0a6646
           
	at org.scalatest.Assertions.newAssertionFailedException(Assertions.scala:528)
	at org.scalatest.Assertions.newAssertionFailedException$(Assertions.scala:527)
	at org.scalatest.FunSuite.newAssertionFailedException(FunSuite.scala:1560)
	at org.scalatest.Assertions.fail(Assertions.scala:1089)
	at org.scalatest.Assertions.fail$(Assertions.scala:1085)
	at org.scalatest.FunSuite.fail(FunSuite.scala:1560)
	at org.apache.spark.sql.hive.SparkSubmitTestUtils.runSparkSubmit(SparkSubmitTestUtils.scala:94)
	at org.apache.spark.sql.hive.SparkSubmitTestUtils.runSparkSubmit$(SparkSubmitTestUtils.scala:41)
	at org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite.runSparkSubmit(HiveExternalCatalogVersionsSuite.scala:43)
	at org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite.$anonfun$beforeAll$1(HiveExternalCatalogVersionsSuite.scala:187)
	at org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite.$anonfun$beforeAll$1$adapted(HiveExternalCatalogVersionsSuite.scala:172)
	at scala.collection.immutable.List.foreach(List.scala:392)
	at org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite.beforeAll(HiveExternalCatalogVersionsSuite.scala:172)
	at org.scalatest.BeforeAndAfterAll.liftedTree1$1(BeforeAndAfterAll.scala:212)
	at org.scalatest.BeforeAndAfterAll.run(BeforeAndAfterAll.scala:210)
	at org.scalatest.BeforeAndAfterAll.run$(BeforeAndAfterAll.scala:208)
	at org.apache.spark.SparkFunSuite.run(SparkFunSuite.scala:54)
	at org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:314)
	at org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:507)
	at sbt.ForkMain$Run$2.call(ForkMain.java:296)
	at sbt.ForkMain$Run$2.call(ForkMain.java:286)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)