sbt.ForkMain$ForkError: org.scalatest.exceptions.TestFailedException: spark-submit returned with exit code 1.
Command line: './bin/spark-submit' '--name' 'prepare testing tables' '--master' 'local[2]' '--conf' 'spark.ui.enabled=false' '--conf' 'spark.master.rest.enabled=false' '--conf' 'spark.sql.warehouse.dir=/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/target/tmp/warehouse-ad1af88b-8861-4716-8a40-f74c8009c384' '--conf' 'spark.sql.test.version.index=2' '--driver-java-options' '-Dderby.system.home=/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/target/tmp/warehouse-ad1af88b-8861-4716-8a40-f74c8009c384' '/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/target/tmp/test4902275085138136343.py'

2018-08-10 09:18:05.197 - stderr> SLF4J: Class path contains multiple SLF4J bindings.
2018-08-10 09:18:05.197 - stderr> SLF4J: Found binding in [jar:file:/tmp/test-spark/spark-2.3.1/jars/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]
2018-08-10 09:18:05.197 - stderr> SLF4J: Found binding in [jar:file:/home/sparkivy/per-executor-caches/4/.ivy2/cache/org.slf4j/slf4j-log4j12/jars/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]
2018-08-10 09:18:05.198 - stderr> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
2018-08-10 09:18:05.198 - stderr> SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
2018-08-10 09:18:05.822 - stdout> 2018-08-10 09:18:05 WARN  NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2018-08-10 09:18:06.664 - stdout> 2018-08-10 09:18:06 INFO  SparkContext:54 - Running Spark version 2.3.1
2018-08-10 09:18:07.134 - stdout> 2018-08-10 09:18:07 INFO  SparkContext:54 - Submitted application: prepare testing tables
2018-08-10 09:18:07.2 - stdout> 2018-08-10 09:18:07 INFO  SecurityManager:54 - Changing view acls to: jenkins
2018-08-10 09:18:07.2 - stdout> 2018-08-10 09:18:07 INFO  SecurityManager:54 - Changing modify acls to: jenkins
2018-08-10 09:18:07.201 - stdout> 2018-08-10 09:18:07 INFO  SecurityManager:54 - Changing view acls groups to: 
2018-08-10 09:18:07.201 - stdout> 2018-08-10 09:18:07 INFO  SecurityManager:54 - Changing modify acls groups to: 
2018-08-10 09:18:07.202 - stdout> 2018-08-10 09:18:07 INFO  SecurityManager:54 - SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(jenkins); groups with view permissions: Set(); users  with modify permissions: Set(jenkins); groups with modify permissions: Set()
2018-08-10 09:18:08.764 - stdout> 2018-08-10 09:18:08 INFO  Utils:54 - Successfully started service 'sparkDriver' on port 44194.
2018-08-10 09:18:08.789 - stdout> 2018-08-10 09:18:08 INFO  SparkEnv:54 - Registering MapOutputTracker
2018-08-10 09:18:08.808 - stdout> 2018-08-10 09:18:08 INFO  SparkEnv:54 - Registering BlockManagerMaster
2018-08-10 09:18:08.81 - stdout> 2018-08-10 09:18:08 INFO  BlockManagerMasterEndpoint:54 - Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
2018-08-10 09:18:08.811 - stdout> 2018-08-10 09:18:08 INFO  BlockManagerMasterEndpoint:54 - BlockManagerMasterEndpoint up
2018-08-10 09:18:08.819 - stdout> 2018-08-10 09:18:08 INFO  DiskBlockManager:54 - Created local directory at /tmp/blockmgr-4cf94193-ad4a-45a4-8505-5b27e36988a3
2018-08-10 09:18:08.835 - stdout> 2018-08-10 09:18:08 INFO  MemoryStore:54 - MemoryStore started with capacity 366.3 MB
2018-08-10 09:18:08.85 - stdout> 2018-08-10 09:18:08 INFO  SparkEnv:54 - Registering OutputCommitCoordinator
2018-08-10 09:18:09.255 - stdout> 2018-08-10 09:18:09 INFO  SparkContext:54 - Added file file:/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/target/tmp/test4902275085138136343.py at file:/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/target/tmp/test4902275085138136343.py with timestamp 1533917889254
2018-08-10 09:18:09.256 - stdout> 2018-08-10 09:18:09 INFO  Utils:54 - Copying /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/target/tmp/test4902275085138136343.py to /tmp/spark-6fbfb887-6931-4fd5-89b3-371542707aac/userFiles-0b11a1b4-c163-4c60-8410-0d5bd8a1e2ec/test4902275085138136343.py
2018-08-10 09:18:09.26 - stdout> Traceback (most recent call last):
2018-08-10 09:18:09.26 - stdout>   File "/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7/target/tmp/test4902275085138136343.py", line 5, in <module>
2018-08-10 09:18:09.26 - stdout>     spark = SparkSession.builder.enableHiveSupport().getOrCreate()
2018-08-10 09:18:09.26 - stdout>   File "/tmp/test-spark/spark-2.3.1/python/lib/pyspark.zip/pyspark/sql/session.py", line 173, in getOrCreate
2018-08-10 09:18:09.261 - stdout>   File "/tmp/test-spark/spark-2.3.1/python/lib/pyspark.zip/pyspark/context.py", line 343, in getOrCreate
2018-08-10 09:18:09.261 - stdout>   File "/tmp/test-spark/spark-2.3.1/python/lib/pyspark.zip/pyspark/context.py", line 118, in __init__
2018-08-10 09:18:09.261 - stdout>   File "/tmp/test-spark/spark-2.3.1/python/lib/pyspark.zip/pyspark/context.py", line 180, in _do_init
2018-08-10 09:18:09.261 - stdout>   File "/tmp/test-spark/spark-2.3.1/python/lib/pyspark.zip/pyspark/context.py", line 282, in _initialize_context
2018-08-10 09:18:09.261 - stdout>   File "/tmp/test-spark/spark-2.3.1/python/lib/py4j-0.10.7-src.zip/py4j/java_gateway.py", line 1525, in __call__
2018-08-10 09:18:09.261 - stdout>   File "/tmp/test-spark/spark-2.3.1/python/lib/py4j-0.10.7-src.zip/py4j/protocol.py", line 328, in get_return_value
2018-08-10 09:18:09.263 - stdout> py4j.protocol.Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
2018-08-10 09:18:09.264 - stdout> : java.lang.ExceptionInInitializerError
2018-08-10 09:18:09.264 - stdout> 	at java.nio.file.FileSystems.getDefault(FileSystems.java:176)
2018-08-10 09:18:09.264 - stdout> 	at java.io.File.toPath(File.java:2234)
2018-08-10 09:18:09.264 - stdout> 	at org.apache.spark.util.Utils$.org$apache$spark$util$Utils$$copyRecursive(Utils.scala:635)
2018-08-10 09:18:09.264 - stdout> 	at org.apache.spark.util.Utils$.copyFile(Utils.scala:606)
2018-08-10 09:18:09.264 - stdout> 	at org.apache.spark.util.Utils$.doFetchFile(Utils.scala:691)
2018-08-10 09:18:09.264 - stdout> 	at org.apache.spark.util.Utils$.fetchFile(Utils.scala:488)
2018-08-10 09:18:09.264 - stdout> 	at org.apache.spark.SparkContext.addFile(SparkContext.scala:1553)
2018-08-10 09:18:09.264 - stdout> 	at org.apache.spark.SparkContext.addFile(SparkContext.scala:1499)
2018-08-10 09:18:09.264 - stdout> 	at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:461)
2018-08-10 09:18:09.264 - stdout> 	at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:461)
2018-08-10 09:18:09.264 - stdout> 	at scala.collection.immutable.List.foreach(List.scala:381)
2018-08-10 09:18:09.264 - stdout> 	at org.apache.spark.SparkContext.<init>(SparkContext.scala:461)
2018-08-10 09:18:09.264 - stdout> 	at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
2018-08-10 09:18:09.264 - stdout> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
2018-08-10 09:18:09.264 - stdout> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
2018-08-10 09:18:09.264 - stdout> 	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
2018-08-10 09:18:09.264 - stdout> 	at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
2018-08-10 09:18:09.264 - stdout> 	at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
2018-08-10 09:18:09.264 - stdout> 	at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
2018-08-10 09:18:09.264 - stdout> 	at py4j.Gateway.invoke(Gateway.java:238)
2018-08-10 09:18:09.264 - stdout> 	at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
2018-08-10 09:18:09.264 - stdout> 	at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
2018-08-10 09:18:09.264 - stdout> 	at py4j.GatewayConnection.run(GatewayConnection.java:238)
2018-08-10 09:18:09.264 - stdout> 	at java.lang.Thread.run(Thread.java:745)
2018-08-10 09:18:09.264 - stdout> Caused by: java.security.PrivilegedActionException: sun.nio.fs.UnixException: No such file or directory
2018-08-10 09:18:09.264 - stdout> 	at java.security.AccessController.doPrivileged(Native Method)
2018-08-10 09:18:09.264 - stdout> 	at java.nio.file.FileSystems$DefaultFileSystemHolder.defaultFileSystem(FileSystems.java:96)
2018-08-10 09:18:09.264 - stdout> 	at java.nio.file.FileSystems$DefaultFileSystemHolder.<clinit>(FileSystems.java:90)
2018-08-10 09:18:09.264 - stdout> 	... 24 more
2018-08-10 09:18:09.264 - stdout> Caused by: sun.nio.fs.UnixException: No such file or directory
2018-08-10 09:18:09.264 - stdout> 	at sun.nio.fs.UnixNativeDispatcher.getcwd(Native Method)
2018-08-10 09:18:09.264 - stdout> 	at sun.nio.fs.UnixFileSystem.<init>(UnixFileSystem.java:67)
2018-08-10 09:18:09.264 - stdout> 	at sun.nio.fs.LinuxFileSystem.<init>(LinuxFileSystem.java:39)
2018-08-10 09:18:09.264 - stdout> 	at sun.nio.fs.LinuxFileSystemProvider.newFileSystem(LinuxFileSystemProvider.java:46)
2018-08-10 09:18:09.264 - stdout> 	at sun.nio.fs.LinuxFileSystemProvider.newFileSystem(LinuxFileSystemProvider.java:39)
2018-08-10 09:18:09.264 - stdout> 	at sun.nio.fs.UnixFileSystemProvider.<init>(UnixFileSystemProvider.java:56)
2018-08-10 09:18:09.264 - stdout> 	at sun.nio.fs.LinuxFileSystemProvider.<init>(LinuxFileSystemProvider.java:41)
2018-08-10 09:18:09.264 - stdout> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
2018-08-10 09:18:09.264 - stdout> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
2018-08-10 09:18:09.264 - stdout> 	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
2018-08-10 09:18:09.264 - stdout> 	at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
2018-08-10 09:18:09.264 - stdout> 	at java.lang.Class.newInstance(Class.java:442)
2018-08-10 09:18:09.264 - stdout> 	at sun.nio.fs.DefaultFileSystemProvider.createProvider(DefaultFileSystemProvider.java:48)
2018-08-10 09:18:09.264 - stdout> 	at sun.nio.fs.DefaultFileSystemProvider.create(DefaultFileSystemProvider.java:63)
2018-08-10 09:18:09.264 - stdout> 	at java.nio.file.FileSystems$DefaultFileSystemHolder.getDefaultProvider(FileSystems.java:108)
2018-08-10 09:18:09.264 - stdout> 	at java.nio.file.FileSystems$DefaultFileSystemHolder.access$000(FileSystems.java:89)
2018-08-10 09:18:09.264 - stdout> 	at java.nio.file.FileSystems$DefaultFileSystemHolder$1.run(FileSystems.java:98)
2018-08-10 09:18:09.264 - stdout> 	at java.nio.file.FileSystems$DefaultFileSystemHolder$1.run(FileSystems.java:96)
2018-08-10 09:18:09.264 - stdout> 	... 27 more
2018-08-10 09:18:09.264 - stdout> 
2018-08-10 09:18:09.304 - stdout> 2018-08-10 09:18:09 INFO  DiskBlockManager:54 - Shutdown hook called
2018-08-10 09:18:09.311 - stdout> 2018-08-10 09:18:09 ERROR Utils:91 - Uncaught exception in thread Thread-1
2018-08-10 09:18:09.311 - stdout> java.lang.NoClassDefFoundError: Could not initialize class java.nio.file.FileSystems$DefaultFileSystemHolder
2018-08-10 09:18:09.311 - stdout> 	at java.nio.file.FileSystems.getDefault(FileSystems.java:176)
2018-08-10 09:18:09.311 - stdout> 	at java.nio.file.Paths.get(Paths.java:138)
2018-08-10 09:18:09.311 - stdout> 	at org.apache.spark.util.Utils$.isSymlink(Utils.scala:1084)
2018-08-10 09:18:09.311 - stdout> 	at org.apache.spark.util.Utils$.deleteRecursively(Utils.scala:1052)
2018-08-10 09:18:09.312 - stdout> 	at org.apache.spark.storage.DiskBlockManager$$anonfun$org$apache$spark$storage$DiskBlockManager$$doStop$1.apply(DiskBlockManager.scala:178)
2018-08-10 09:18:09.312 - stdout> 	at org.apache.spark.storage.DiskBlockManager$$anonfun$org$apache$spark$storage$DiskBlockManager$$doStop$1.apply(DiskBlockManager.scala:174)
2018-08-10 09:18:09.312 - stdout> 	at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
2018-08-10 09:18:09.312 - stdout> 	at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:186)
2018-08-10 09:18:09.312 - stdout> 	at org.apache.spark.storage.DiskBlockManager.org$apache$spark$storage$DiskBlockManager$$doStop(DiskBlockManager.scala:174)
2018-08-10 09:18:09.312 - stdout> 	at org.apache.spark.storage.DiskBlockManager$$anonfun$addShutdownHook$1.apply$mcV$sp(DiskBlockManager.scala:156)
2018-08-10 09:18:09.312 - stdout> 	at org.apache.spark.util.SparkShutdownHook.run(ShutdownHookManager.scala:216)
2018-08-10 09:18:09.312 - stdout> 	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ShutdownHookManager.scala:188)
2018-08-10 09:18:09.312 - stdout> 	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:188)
2018-08-10 09:18:09.312 - stdout> 	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:188)
2018-08-10 09:18:09.312 - stdout> 	at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1991)
2018-08-10 09:18:09.312 - stdout> 	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply$mcV$sp(ShutdownHookManager.scala:188)
2018-08-10 09:18:09.312 - stdout> 	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:188)
2018-08-10 09:18:09.312 - stdout> 	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:188)
2018-08-10 09:18:09.312 - stdout> 	at scala.util.Try$.apply(Try.scala:192)
2018-08-10 09:18:09.312 - stdout> 	at org.apache.spark.util.SparkShutdownHookManager.runAll(ShutdownHookManager.scala:188)
2018-08-10 09:18:09.312 - stdout> 	at org.apache.spark.util.SparkShutdownHookManager$$anon$2.run(ShutdownHookManager.scala:178)
2018-08-10 09:18:09.312 - stdout> 	at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:54)
2018-08-10 09:18:09.312 - stdout> 2018-08-10 09:18:09 WARN  ShutdownHookManager:56 - ShutdownHook '$anon$2' failed, java.lang.NoClassDefFoundError: Could not initialize class java.nio.file.FileSystems$DefaultFileSystemHolder
2018-08-10 09:18:09.312 - stdout> java.lang.NoClassDefFoundError: Could not initialize class java.nio.file.FileSystems$DefaultFileSystemHolder
2018-08-10 09:18:09.312 - stdout> 	at java.nio.file.FileSystems.getDefault(FileSystems.java:176)
2018-08-10 09:18:09.312 - stdout> 	at java.nio.file.Paths.get(Paths.java:138)
2018-08-10 09:18:09.312 - stdout> 	at org.apache.spark.util.Utils$.isSymlink(Utils.scala:1084)
2018-08-10 09:18:09.312 - stdout> 	at org.apache.spark.util.Utils$.deleteRecursively(Utils.scala:1052)
2018-08-10 09:18:09.312 - stdout> 	at org.apache.spark.storage.DiskBlockManager$$anonfun$org$apache$spark$storage$DiskBlockManager$$doStop$1.apply(DiskBlockManager.scala:178)
2018-08-10 09:18:09.312 - stdout> 	at org.apache.spark.storage.DiskBlockManager$$anonfun$org$apache$spark$storage$DiskBlockManager$$doStop$1.apply(DiskBlockManager.scala:174)
2018-08-10 09:18:09.312 - stdout> 	at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
2018-08-10 09:18:09.312 - stdout> 	at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:186)
2018-08-10 09:18:09.313 - stdout> 	at org.apache.spark.storage.DiskBlockManager.org$apache$spark$storage$DiskBlockManager$$doStop(DiskBlockManager.scala:174)
2018-08-10 09:18:09.313 - stdout> 	at org.apache.spark.storage.DiskBlockManager$$anonfun$addShutdownHook$1.apply$mcV$sp(DiskBlockManager.scala:156)
2018-08-10 09:18:09.313 - stdout> 	at org.apache.spark.util.SparkShutdownHook.run(ShutdownHookManager.scala:216)
2018-08-10 09:18:09.313 - stdout> 	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ShutdownHookManager.scala:188)
2018-08-10 09:18:09.313 - stdout> 	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:188)
2018-08-10 09:18:09.313 - stdout> 	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:188)
2018-08-10 09:18:09.313 - stdout> 	at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1991)
2018-08-10 09:18:09.313 - stdout> 	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply$mcV$sp(ShutdownHookManager.scala:188)
2018-08-10 09:18:09.313 - stdout> 	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:188)
2018-08-10 09:18:09.313 - stdout> 	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:188)
2018-08-10 09:18:09.313 - stdout> 	at scala.util.Try$.apply(Try.scala:192)
2018-08-10 09:18:09.313 - stdout> 	at org.apache.spark.util.SparkShutdownHookManager.runAll(ShutdownHookManager.scala:188)
2018-08-10 09:18:09.313 - stdout> 	at org.apache.spark.util.SparkShutdownHookManager$$anon$2.run(ShutdownHookManager.scala:178)
2018-08-10 09:18:09.313 - stdout> 	at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:54)
	at org.scalatest.Assertions$class.newAssertionFailedException(Assertions.scala:528)
	at org.scalatest.FunSuite.newAssertionFailedException(FunSuite.scala:1560)
	at org.scalatest.Assertions$class.fail(Assertions.scala:1089)
	at org.scalatest.FunSuite.fail(FunSuite.scala:1560)
	at org.apache.spark.sql.hive.SparkSubmitTestUtils$class.runSparkSubmit(SparkSubmitTestUtils.scala:84)
	at org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite.runSparkSubmit(HiveExternalCatalogVersionsSuite.scala:43)
	at org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite$$anonfun$beforeAll$1.apply(HiveExternalCatalogVersionsSuite.scala:184)
	at org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite$$anonfun$beforeAll$1.apply(HiveExternalCatalogVersionsSuite.scala:169)
	at scala.collection.immutable.List.foreach(List.scala:392)
	at org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite.beforeAll(HiveExternalCatalogVersionsSuite.scala:169)
	at org.scalatest.BeforeAndAfterAll$class.liftedTree1$1(BeforeAndAfterAll.scala:212)
	at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:210)
	at org.apache.spark.SparkFunSuite.run(SparkFunSuite.scala:52)
	at org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:314)
	at org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:480)
	at sbt.ForkMain$Run$2.call(ForkMain.java:296)
	at sbt.ForkMain$Run$2.call(ForkMain.java:286)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)