Details for spark-master-test-sbt-hadoop-3.2-ubuntu-testing build #7


Duration: 170 minutes
Start time: 2019-07-25 21:07:28
Commit: dbd0a2aa3705eaf642ca7945bbf1908479a8e951
Executor: research-jenkins-worker-08
Status: FAILURE

Failed tests

org.apache.spark.sql.execution.streaming.state.StateStoreSuite: get, put, remove, commit, and all data iterator (30 ms)
org.apache.spark.sql.execution.streaming.state.StateStoreSuite: removing while iterating (22 ms)
org.apache.spark.sql.execution.streaming.state.StateStoreSuite: abort (16 ms)
org.apache.spark.sql.execution.streaming.state.StateStoreSuite: getStore with invalid versions (17 ms)
org.apache.spark.sql.execution.streaming.state.StateStoreSuite: two concurrent StateStores - one for read-only and one for read-write (20 ms)
org.apache.spark.sql.execution.streaming.state.StateStoreSuite: retaining only two latest versions when MAX_BATCHES_TO_RETAIN_IN_MEMORY set to 2 (16 ms)
org.apache.spark.sql.execution.streaming.state.StateStoreSuite: failure after committing with MAX_BATCHES_TO_RETAIN_IN_MEMORY set to 1 (16 ms)
org.apache.spark.sql.execution.streaming.state.StateStoreSuite: no cache data with MAX_BATCHES_TO_RETAIN_IN_MEMORY set to 0 (15 ms)
org.apache.spark.sql.execution.streaming.state.StateStoreSuite: snapshotting (16 ms)
org.apache.spark.sql.execution.streaming.state.StateStoreSuite: cleaning (16 ms)
org.apache.spark.sql.execution.streaming.state.StateStoreSuite: SPARK-19677: Committing a delta file atop an existing one should not fail on HDFS (26 ms)
org.apache.spark.sql.execution.streaming.state.StateStoreSuite: corrupted file handling (17 ms)
org.apache.spark.sql.execution.streaming.state.StateStoreSuite: reports memory usage (17 ms)
org.apache.spark.sql.execution.streaming.state.StateStoreSuite: reports memory usage on current version (17 ms)
org.apache.spark.sql.execution.streaming.state.StateStoreSuite: get (4 ms)
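
All failures come from the single StateStoreSuite, so a quick local check is to re-run just that suite. Assuming a standard Spark source checkout at the commit above (the "sql" project name is the usual sbt project for the sql/core module, not something stated in this report), an invocation along these lines should reproduce it:

    ./build/sbt "sql/testOnly org.apache.spark.sql.execution.streaming.state.StateStoreSuite"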

Test time report

(Interactive visualization of per-test durations; available on the Jenkins build page.)