sbt.ForkMain$ForkError: org.scalatest.exceptions.TestFailedException: spark-submit returned with exit code 1.
Command line: './bin/spark-submit' '--name' 'prepare testing tables' '--master' 'local[2]' '--conf' 'spark.ui.enabled=false' '--conf' 'spark.master.rest.enabled=false' '--conf' 'spark.sql.warehouse.dir=/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/target/tmp/warehouse-ea7f75df-6203-406f-bf29-5453283436e2' '--conf' 'spark.sql.test.version.index=3' '--driver-java-options' '-Dderby.system.home=/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/target/tmp/warehouse-ea7f75df-6203-406f-bf29-5453283436e2' '/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/target/tmp/test5767309165081074921.py'

2018-04-24 04:43:05.478 - stderr> SLF4J: Class path contains multiple SLF4J bindings.
2018-04-24 04:43:05.478 - stderr> SLF4J: Found binding in [jar:file:/tmp/test-spark/spark-2.2.1/jars/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]
2018-04-24 04:43:05.478 - stderr> SLF4J: Found binding in [jar:file:/home/sparkivy/per-executor-caches/1/.ivy2/cache/org.slf4j/slf4j-log4j12/jars/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]
2018-04-24 04:43:05.478 - stderr> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
2018-04-24 04:43:05.483 - stderr> SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
2018-04-24 04:43:05.797 - stdout> Traceback (most recent call last):
2018-04-24 04:43:05.797 - stdout>   File "/home/jenkins/workspace/spark-master-test-sbt-hadoop-2.6/target/tmp/test5767309165081074921.py", line 2, in <module>
2018-04-24 04:43:05.797 - stdout>     from pyspark.sql import SparkSession
2018-04-24 04:43:05.797 - stdout> ImportError: No module named 'pyspark'
	at org.scalatest.Assertions$class.newAssertionFailedException(Assertions.scala:528)
	at org.scalatest.FunSuite.newAssertionFailedException(FunSuite.scala:1560)
	at org.scalatest.Assertions$class.fail(Assertions.scala:1089)
	at org.scalatest.FunSuite.fail(FunSuite.scala:1560)
	at org.apache.spark.sql.hive.SparkSubmitTestUtils$class.runSparkSubmit(SparkSubmitTestUtils.scala:84)
	at org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite.runSparkSubmit(HiveExternalCatalogVersionsSuite.scala:43)
	at org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite$$anonfun$beforeAll$1.apply(HiveExternalCatalogVersionsSuite.scala:176)
	at org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite$$anonfun$beforeAll$1.apply(HiveExternalCatalogVersionsSuite.scala:161)
	at scala.collection.immutable.List.foreach(List.scala:381)
	at org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite.beforeAll(HiveExternalCatalogVersionsSuite.scala:161)
	at org.scalatest.BeforeAndAfterAll$class.liftedTree1$1(BeforeAndAfterAll.scala:212)
	at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:210)
	at org.apache.spark.SparkFunSuite.run(SparkFunSuite.scala:52)
	at org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:314)
	at org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:480)
	at sbt.ForkMain$Run$2.call(ForkMain.java:296)
	at sbt.ForkMain$Run$2.call(ForkMain.java:286)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)