Details for spark-master-test-sbt-hadoop-3.2-ubuntu-testing build #9

Duration:   136 minutes
Start time: 2019-08-26 19:50:35
Commit:     84d4f945969e199a5d3fb658864e494b88d15f3c
Executor:   research-jenkins-worker-07
Status:     FAILURE
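
To reproduce the run locally, the commit under test can be checked out directly. A minimal sketch, assuming a local clone of apache/spark (the GitHub mirror URL is standard, not taken from this report):

    git clone https://github.com/apache/spark.git
    cd spark
    git checkout 84d4f945969e199a5d3fb658864e494b88d15f3c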

Failed tests

org.apache.spark.sql.execution.streaming.state.StateStoreSuite: get, put, remove, commit, and all data iterator (30 ms)
org.apache.spark.sql.execution.streaming.state.StateStoreSuite: removing while iterating (23 ms)
org.apache.spark.sql.execution.streaming.state.StateStoreSuite: abort (14 ms)
org.apache.spark.sql.execution.streaming.state.StateStoreSuite: getStore with invalid versions (15 ms)
org.apache.spark.sql.execution.streaming.state.StateStoreSuite: two concurrent StateStores - one for read-only and one for read-write (13 ms)
org.apache.spark.sql.execution.streaming.state.StateStoreSuite: retaining only two latest versions when MAX_BATCHES_TO_RETAIN_IN_MEMORY set to 2 (14 ms)
org.apache.spark.sql.execution.streaming.state.StateStoreSuite: failure after committing with MAX_BATCHES_TO_RETAIN_IN_MEMORY set to 1 (14 ms)
org.apache.spark.sql.execution.streaming.state.StateStoreSuite: no cache data with MAX_BATCHES_TO_RETAIN_IN_MEMORY set to 0 (14 ms)
org.apache.spark.sql.execution.streaming.state.StateStoreSuite: snapshotting (17 ms)
org.apache.spark.sql.execution.streaming.state.StateStoreSuite: cleaning (14 ms)
org.apache.spark.sql.execution.streaming.state.StateStoreSuite: SPARK-19677: Committing a delta file atop an existing one should not fail on HDFS (25 ms)
org.apache.spark.sql.execution.streaming.state.StateStoreSuite: corrupted file handling (17 ms)
org.apache.spark.sql.execution.streaming.state.StateStoreSuite: reports memory usage (14 ms)
org.apache.spark.sql.execution.streaming.state.StateStoreSuite: reports memory usage on current version (18 ms)
org.apache.spark.sql.execution.streaming.state.StateStoreSuite: get (4 ms)
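
All fifteen failures are in a single suite, org.apache.spark.sql.execution.streaming.state.StateStoreSuite, so it can be re-run in isolation from the checkout above. A minimal sketch; the -Phadoop-3.2 profile is an assumption inferred from the job name, and the exact flags the Jenkins job passes may differ:

    # Re-run only the failing suite; -Phadoop-3.2 is assumed from the job name
    build/sbt -Phadoop-3.2 "sql/testOnly org.apache.spark.sql.execution.streaming.state.StateStoreSuite"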

Test time report

(Interactive visualization of test durations by node; available on the Jenkins build page.)