Details for spark-master-test-sbt-hadoop-3.2-ubuntu-testing build #6


Duration: 184 minutes
Start time: 2019-07-25 17:03:46
Commit: 89fd2b5efc2e22ee2aa1be5228448f53eff404c8
Executor: research-jenkins-worker-09
Status: FAILURE

Failed tests

org.apache.spark.sql.execution.streaming.state.StateStoreSuite: get, put, remove, commit, and all data iterator (40 ms)
org.apache.spark.sql.execution.streaming.state.StateStoreSuite: removing while iterating (33 ms)
org.apache.spark.sql.execution.streaming.state.StateStoreSuite: abort (23 ms)
org.apache.spark.sql.execution.streaming.state.StateStoreSuite: getStore with invalid versions (21 ms)
org.apache.spark.sql.execution.streaming.state.StateStoreSuite: two concurrent StateStores - one for read-only and one for read-write (20 ms)
org.apache.spark.sql.execution.streaming.state.StateStoreSuite: retaining only two latest versions when MAX_BATCHES_TO_RETAIN_IN_MEMORY set to 2 (23 ms)
org.apache.spark.sql.execution.streaming.state.StateStoreSuite: failure after committing with MAX_BATCHES_TO_RETAIN_IN_MEMORY set to 1 (22 ms)
org.apache.spark.sql.execution.streaming.state.StateStoreSuite: no cache data with MAX_BATCHES_TO_RETAIN_IN_MEMORY set to 0 (23 ms)
org.apache.spark.sql.execution.streaming.state.StateStoreSuite: snapshotting (22 ms)
org.apache.spark.sql.execution.streaming.state.StateStoreSuite: cleaning (21 ms)
org.apache.spark.sql.execution.streaming.state.StateStoreSuite: SPARK-19677: Committing a delta file atop an existing one should not fail on HDFS (36 ms)
org.apache.spark.sql.execution.streaming.state.StateStoreSuite: corrupted file handling (24 ms)
org.apache.spark.sql.execution.streaming.state.StateStoreSuite: reports memory usage (23 ms)
org.apache.spark.sql.execution.streaming.state.StateStoreSuite: reports memory usage on current version (25 ms)
org.apache.spark.sql.execution.streaming.state.StateStoreSuite: get (5 ms)
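
All fifteen failures are in a single suite, org.apache.spark.sql.execution.streaming.state.StateStoreSuite, which lives in the sql/core module. To investigate locally, that one suite can be run in isolation using the sbt testOnly pattern from Spark's developer documentation; a minimal sketch, assuming a checkout of commit 89fd2b5efc2e22ee2aa1be5228448f53eff404c8 above:

build/sbt "sql/testOnly org.apache.spark.sql.execution.streaming.state.StateStoreSuite"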

Test time report

[Interactive visualization of combined per-node test durations; viewable only on the Jenkins build page.]