Details for Spark-Master-SBT » hadoop2.2,spark-test build #4145

Duration: 140 minutes
Start time: 2015-12-01 20:27:01
Commit: 34e7093c1131162b3aa05b65a19a633a0b5b633e
Executor: amp-jenkins-worker-08
Status: FAILURE

Failed tests

org.apache.spark.streaming.CheckpointSuite: basic rdd checkpoints + dstream graph checkpoint recovery 14 ms
org.apache.spark.streaming.CheckpointSuite: recovery of conf through checkpoints 1 ms
org.apache.spark.streaming.CheckpointSuite: [host|port] from checkpoint 1 ms
org.apache.spark.streaming.CheckpointSuite: recovery with map and reduceByKey operations 1 ms
org.apache.spark.streaming.CheckpointSuite: recovery with invertible reduceByKeyAndWindow operation 2 ms
org.apache.spark.streaming.CheckpointSuite: recovery with saveAsHadoopFiles operation 1 ms
org.apache.spark.streaming.CheckpointSuite: recovery with saveAsNewAPIHadoopFiles operation 2 ms
org.apache.spark.streaming.CheckpointSuite: recovery with saveAsHadoopFile inside transform operation 1 ms
org.apache.spark.streaming.CheckpointSuite: recovery with updateStateByKey operation 2 ms
org.apache.spark.streaming.CheckpointSuite: recovery maintains rate controller 1 ms
org.apache.spark.streaming.CheckpointSuite: recovery with file input stream 2 ms
org.apache.spark.streaming.DStreamClosureSuite: (It is not a test) 1 ms
org.apache.spark.streaming.JavaAPISuite: testStreamingContextTransform 1 ms
org.apache.spark.streaming.JavaAPISuite: testFlatMapValues 0 ms
org.apache.spark.streaming.JavaAPISuite: testReduceByWindowWithInverse 1 ms
org.apache.spark.streaming.JavaAPISuite: testMapPartitions 1 ms
org.apache.spark.streaming.JavaAPISuite: testPairFilter 1 ms
org.apache.spark.streaming.JavaAPISuite: testRepartitionFewerPartitions 0 ms
org.apache.spark.streaming.JavaAPISuite: testCombineByKey 0 ms
org.apache.spark.streaming.JavaAPISuite: testContextGetOrCreate 0 ms
org.apache.spark.streaming.JavaAPISuite: testWindowWithSlideDuration 1 ms
org.apache.spark.streaming.JavaAPISuite: testQueueStream 1 ms
org.apache.spark.streaming.JavaAPISuite: testCountByValue 1 ms
org.apache.spark.streaming.JavaAPISuite: testMap 0 ms
org.apache.spark.streaming.JavaAPISuite: testPairToNormalRDDTransform 0 ms
org.apache.spark.streaming.JavaAPISuite: testPairReduceByKey 0 ms
org.apache.spark.streaming.JavaAPISuite: testCount 2 ms
org.apache.spark.streaming.JavaAPISuite: testCheckpointMasterRecovery 1 ms
org.apache.spark.streaming.JavaAPISuite: testPairMap 1 ms
org.apache.spark.streaming.JavaAPISuite: testUnion 1 ms
org.apache.spark.streaming.JavaAPISuite: testFlatMap 0 ms
org.apache.spark.streaming.JavaAPISuite: testReduceByKeyAndWindowWithInverse 0 ms
org.apache.spark.streaming.JavaAPISuite: testGlom 0 ms
org.apache.spark.streaming.JavaAPISuite: testJoin 0 ms
org.apache.spark.streaming.JavaAPISuite: testPairFlatMap 1 ms
org.apache.spark.streaming.JavaAPISuite: testPairToPairFlatMapWithChangingTypes 1 ms
org.apache.spark.streaming.JavaAPISuite: testPairMapPartitions 1 ms
org.apache.spark.streaming.JavaAPISuite: testRepartitionMorePartitions 1 ms
org.apache.spark.streaming.JavaAPISuite: testReduceByWindowWithoutInverse 0 ms
org.apache.spark.streaming.JavaAPISuite: testLeftOuterJoin 0 ms
org.apache.spark.streaming.JavaAPISuite: testVariousTransform 0 ms
org.apache.spark.streaming.JavaAPISuite: testTransformWith 0 ms
org.apache.spark.streaming.JavaAPISuite: testVariousTransformWith 1 ms
org.apache.spark.streaming.JavaAPISuite: testTextFileStream 1 ms
org.apache.spark.streaming.JavaAPISuite: testPairGroupByKey 0 ms
org.apache.spark.streaming.JavaAPISuite: testCoGroup 0 ms
org.apache.spark.streaming.JavaAPISuite: testInitialization 0 ms
org.apache.spark.streaming.JavaAPISuite: testSocketString 0 ms
org.apache.spark.streaming.JavaAPISuite: testGroupByKeyAndWindow 1 ms
org.apache.spark.streaming.JavaAPISuite: testReduceByKeyAndWindow 1 ms
org.apache.spark.streaming.JavaAPISuite: testForeachRDD 1 ms
org.apache.spark.streaming.JavaAPISuite: testFileStream 0 ms
org.apache.spark.streaming.JavaAPISuite: testPairTransform 0 ms
org.apache.spark.streaming.JavaAPISuite: testFilter 0 ms
org.apache.spark.streaming.JavaAPISuite: testPairMap2 1 ms
org.apache.spark.streaming.JavaAPISuite: testMapValues 1 ms
org.apache.spark.streaming.JavaAPISuite: testReduce 1 ms
org.apache.spark.streaming.JavaAPISuite: testUpdateStateByKey 0 ms
org.apache.spark.streaming.JavaAPISuite: testTransform 0 ms
org.apache.spark.streaming.JavaAPISuite: testWindow 0 ms
org.apache.spark.streaming.JavaAPISuite: testCountByValueAndWindow 0 ms
org.apache.spark.streaming.JavaAPISuite: testRawSocketStream 1 ms
org.apache.spark.streaming.JavaAPISuite: testSocketTextStream 0 ms
org.apache.spark.streaming.JavaAPISuite: testUpdateStateByKeyWithInitial 0 ms
org.apache.spark.streaming.JavaAPISuite: testContextState 1 ms
org.apache.spark.streaming.JavaReceiverAPISuite: testReceiver 5 ms
org.apache.spark.streaming.JavaTrackStateByKeySuite: testBasicFunction 1 ms
org.apache.spark.streaming.ReceiverInputDStreamSuite: Without WAL enabled: createBlockRDD creates empty BlockRDD when no block info 1 ms
org.apache.spark.streaming.ReceiverInputDStreamSuite: Without WAL enabled: createBlockRDD creates correct BlockRDD with block info 0 ms
org.apache.spark.streaming.ReceiverInputDStreamSuite: Without WAL enabled: createBlockRDD filters non-existent blocks before creating BlockRDD 1 ms
org.apache.spark.streaming.ReceiverInputDStreamSuite: With WAL enabled: createBlockRDD creates empty WALBackedBlockRDD when no block info 1 ms
org.apache.spark.streaming.ReceiverInputDStreamSuite: With WAL enabled: createBlockRDD creates correct WALBackedBlockRDD with all block info having WAL info 0 ms
org.apache.spark.streaming.ReceiverInputDStreamSuite: With WAL enabled: createBlockRDD creates BlockRDD when some block info dont have WAL info 0 ms
org.apache.spark.streaming.StreamingContextSuite: from checkpoint 94 ms
org.apache.spark.streaming.StreamingContextSuite: checkPoint from conf 165 ms
org.apache.spark.streaming.StreamingContextSuite: start and stop state check 2 ms
org.apache.spark.streaming.StreamingContextSuite: start with non-seriazable DStream checkpoints 1 ms
org.apache.spark.streaming.StreamingContextSuite: start failure should stop internal components 0 ms
org.apache.spark.streaming.StreamingContextSuite: start should set job group and description of streaming jobs correctly 1 ms
org.apache.spark.streaming.StreamingContextSuite: start multiple times 0 ms
org.apache.spark.streaming.StreamingContextSuite: stop multiple times 1 ms
org.apache.spark.streaming.StreamingContextSuite: stop before start 1 ms
org.apache.spark.streaming.StreamingContextSuite: start after stop 1 ms
org.apache.spark.streaming.StreamingContextSuite: stop only streaming context 1 ms
org.apache.spark.streaming.StreamingContextSuite: stop(stopSparkContext=true) after stop(stopSparkContext=false) 1 ms
org.apache.spark.streaming.StreamingContextSuite: stop gracefully 1 ms
org.apache.spark.streaming.StreamingContextSuite: stop gracefully even if a receiver misses StopReceiver 1 ms
org.apache.spark.streaming.StreamingContextSuite: stop slow receiver gracefully 10 ms
org.apache.spark.streaming.StreamingContextSuite: registering and de-registering of streamingSource 0 ms
org.apache.spark.streaming.StreamingContextSuite: awaitTermination 1 ms
org.apache.spark.streaming.StreamingContextSuite: awaitTermination after stop 1 ms
org.apache.spark.streaming.StreamingContextSuite: awaitTermination with error in task 1 ms
org.apache.spark.streaming.StreamingContextSuite: awaitTermination with error in job generation 1 ms
org.apache.spark.streaming.StreamingContextSuite: awaitTerminationOrTimeout 1 ms
org.apache.spark.streaming.StreamingContextSuite: getOrCreate 3 ms
org.apache.spark.streaming.StreamingContextSuite: getActive and getActiveOrCreate 1 ms
org.apache.spark.streaming.StreamingContextSuite: getActiveOrCreate with checkpoint 11 ms
org.apache.spark.streaming.StreamingContextSuite: multiple streaming contexts 0 ms
org.apache.spark.streaming.StreamingContextSuite: DStream and generated RDD creation sites 2 ms
org.apache.spark.streaming.StreamingContextSuite: throw exception on using active or stopped context 0 ms
org.apache.spark.streaming.StreamingContextSuite: queueStream doesn't support checkpointing 2 ms
org.apache.spark.streaming.StreamingContextSuite: Creating an InputDStream but not using it should not crash 1 ms
org.apache.spark.streaming.WindowOperationsSuite: window - basic window 1 ms
org.apache.spark.streaming.WindowOperationsSuite: window - tumbling window 0 ms
org.apache.spark.streaming.WindowOperationsSuite: window - larger window 2 ms
org.apache.spark.streaming.WindowOperationsSuite: window - non-overlapping window 0 ms
org.apache.spark.streaming.WindowOperationsSuite: window - persistence level 1 ms
org.apache.spark.streaming.WindowOperationsSuite: reduceByKeyAndWindow - basic reduction 1 ms
org.apache.spark.streaming.WindowOperationsSuite: reduceByKeyAndWindow - key already in window and new value added into window 1 ms
org.apache.spark.streaming.WindowOperationsSuite: reduceByKeyAndWindow - new key added into window 0 ms
org.apache.spark.streaming.WindowOperationsSuite: reduceByKeyAndWindow - key removed from window 0 ms
org.apache.spark.streaming.WindowOperationsSuite: reduceByKeyAndWindow - larger slide time 1 ms
org.apache.spark.streaming.WindowOperationsSuite: reduceByKeyAndWindow - big test 0 ms
org.apache.spark.streaming.WindowOperationsSuite: reduceByKeyAndWindow with inverse function - basic reduction 1 ms
org.apache.spark.streaming.WindowOperationsSuite: reduceByKeyAndWindow with inverse function - key already in window and new value added into window 0 ms
org.apache.spark.streaming.WindowOperationsSuite: reduceByKeyAndWindow with inverse function - new key added into window 1 ms
org.apache.spark.streaming.WindowOperationsSuite: reduceByKeyAndWindow with inverse function - key removed from window 0 ms
org.apache.spark.streaming.WindowOperationsSuite: reduceByKeyAndWindow with inverse function - larger slide time 0 ms
org.apache.spark.streaming.WindowOperationsSuite: reduceByKeyAndWindow with inverse function - big test 1 ms
org.apache.spark.streaming.WindowOperationsSuite: reduceByKeyAndWindow with inverse and filter functions - big test 2 ms
org.apache.spark.streaming.WindowOperationsSuite: groupByKeyAndWindow 1 ms
org.apache.spark.streaming.WindowOperationsSuite: countByWindow 1 ms
org.apache.spark.streaming.WindowOperationsSuite: countByValueAndWindow 1 ms
org.apache.spark.streaming.rdd.TrackStateRDDSuite: (It is not a test) 1 ms
org.apache.spark.streaming.rdd.WriteAheadLogBackedBlockRDDSuite: (It is not a test) 1 ms
org.apache.spark.streaming.ui.StreamingJobProgressListenerSuite: onBatchSubmitted, onBatchStarted, onBatchCompleted, onReceiverStarted, onReceiverError, onReceiverStopped 1 ms
org.apache.spark.streaming.ui.StreamingJobProgressListenerSuite: Remove the old completed batches when exceeding the limit 0 ms
org.apache.spark.streaming.ui.StreamingJobProgressListenerSuite: out-of-order onJobStart and onBatchXXX 0 ms
org.apache.spark.streaming.ui.StreamingJobProgressListenerSuite: detect memory leak 1 ms

Test time report

(Interactive visualization of per-test durations; not reproducible in this text export.)