Details for spark-master-test-sbt-hadoop-3.2-ubuntu-testing build #1

Duration: 111 minutes
Start time: 2019-07-08 19:36:58
Commit: e11a55827e7475aab77e8a4ea0baed7c14059908
Executor: research-jenkins-worker-09
Status: FAILURE

Failed tests

org.apache.spark.sql.execution.streaming.state.StateStoreSuite: get, put, remove, commit, and all data iterator (57 ms)
org.apache.spark.sql.execution.streaming.state.StateStoreSuite: removing while iterating (72 ms)
org.apache.spark.sql.execution.streaming.state.StateStoreSuite: abort (42 ms)
org.apache.spark.sql.execution.streaming.state.StateStoreSuite: getStore with invalid versions (44 ms)
org.apache.spark.sql.execution.streaming.state.StateStoreSuite: two concurrent StateStores - one for read-only and one for read-write (45 ms)
org.apache.spark.sql.execution.streaming.state.StateStoreSuite: retaining only two latest versions when MAX_BATCHES_TO_RETAIN_IN_MEMORY set to 2 (34 ms)
org.apache.spark.sql.execution.streaming.state.StateStoreSuite: failure after committing with MAX_BATCHES_TO_RETAIN_IN_MEMORY set to 1 (45 ms)
org.apache.spark.sql.execution.streaming.state.StateStoreSuite: no cache data with MAX_BATCHES_TO_RETAIN_IN_MEMORY set to 0 (50 ms)
org.apache.spark.sql.execution.streaming.state.StateStoreSuite: snapshotting (39 ms)
org.apache.spark.sql.execution.streaming.state.StateStoreSuite: cleaning (32 ms)
org.apache.spark.sql.execution.streaming.state.StateStoreSuite: SPARK-19677: Committing a delta file atop an existing one should not fail on HDFS (40 ms)
org.apache.spark.sql.execution.streaming.state.StateStoreSuite: corrupted file handling (37 ms)
org.apache.spark.sql.execution.streaming.state.StateStoreSuite: reports memory usage (42 ms)
org.apache.spark.sql.execution.streaming.state.StateStoreSuite: reports memory usage on current version (50 ms)
org.apache.spark.sql.execution.streaming.state.StateStoreSuite: get (4 ms)
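
To investigate locally, a failing suite like this can usually be re-run on its own with sbt's testOnly task. The command below is a sketch; the -Phadoop-3.2 profile flag is an assumption based on the build name, not something stated in this report.

    # Re-run only StateStoreSuite (hadoop-3.2 profile flag assumed from the build name)
    ./build/sbt -Phadoop-3.2 "testOnly org.apache.spark.sql.execution.streaming.state.StateStoreSuite"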

Test time report

(Interactive visualization of test durations, showing the combined time of the tests grouped under each node.)