Query memory [id = 498238ae-c24e-4620-91fb-3f6c0d76914a, runId = 363d7010-3bba-474d-976e-94044318d3a0] terminated with exception: null
org.apache.spark.sql.streaming.StreamingQueryException: null
=== Streaming Query ===
Identifier: memory [id = 498238ae-c24e-4620-91fb-3f6c0d76914a, runId = 363d7010-3bba-474d-976e-94044318d3a0]
Current Committed Offsets: {KafkaSource[SubscribePattern[failOnDataLoss.*]]: {"failOnDataLoss-0":{"0":12}}}
Current Available Offsets: {}
Current State: RECONFIGURING
Thread State: RUNNABLE
Logical Plan:
SerializeFromObject [input[0, int, false] AS value#2789]
+- MapElements org.apache.spark.sql.kafka010.KafkaSourceStressForDontFailOnDataLossSuite$$Lambda$5059/0x0000000801956840@55da9ccf, class scala.Tuple2, [StructField(_1,StringType,true), StructField(_2,StringType,true)], obj#2788: int
   +- DeserializeToObject newInstance(class scala.Tuple2), obj#2787: scala.Tuple2
      +- Project [cast(key#2763 as string) AS key#2777, cast(value#2764 as string) AS value#2778]
         +- StreamingDataSourceV2Relation [key#2763, value#2764, topic#2765, partition#2766, offset#2767L, timestamp#2768, timestampType#2769], org.apache.spark.sql.kafka010.KafkaSourceProvider$KafkaScan@6ea97ef8, KafkaSource[SubscribePattern[failOnDataLoss.*]]
at org.apache.spark.sql.execution.streaming.StreamExecution.org$apache$spark$sql$execution$streaming$StreamExecution$$runStream(StreamExecution.scala:351)
at org.apache.spark.sql.execution.streaming.StreamExecution$$anon$1.run(StreamExecution.scala:242)
Cause: java.lang.InterruptedException
at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer.tryAcquireSharedNanos(AbstractQueuedSynchronizer.java:1367)
at scala.concurrent.impl.Promise$DefaultPromise.tryAwait(Promise.scala:248)
at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:258)
at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:263)
at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:220)
at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75)
at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:92)
at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:76)
at org.apache.spark.sql.execution.streaming.continuous.ContinuousExecution.runContinuous(ContinuousExecution.scala:271)
at org.apache.spark.sql.execution.streaming.continuous.ContinuousExecution.runActivatedStream(ContinuousExecution.scala:109)
at org.apache.spark.sql.execution.streaming.StreamExecution.org$apache$spark$sql$execution$streaming$StreamExecution$$runStream(StreamExecution.scala:330)
at org.apache.spark.sql.execution.streaming.StreamExecution$$anon$1.run(StreamExecution.scala:242)
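For reference, the logical plan above corresponds to a Kafka source subscribed by the pattern failOnDataLoss.*, with key/value cast to String, mapped to an Int, and written to a memory sink under continuous processing (ContinuousExecution appears in the stack trace). The sketch below is a minimal reconstruction of such a query, not the suite's actual code; the broker address, trigger interval, and the failOnDataLoss=false option are assumptions added for illustration and do not appear in the log.

    // Minimal sketch (assumptions marked); produces a plan equivalent to the one logged above.
    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.streaming.Trigger

    object FailOnDataLossSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder().appName("failOnDataLoss-sketch").getOrCreate()
        import spark.implicits._

        val query = spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "localhost:9092")        // assumed broker; not in the log
          .option("subscribePattern", "failOnDataLoss.*")              // KafkaSource[SubscribePattern[failOnDataLoss.*]]
          .option("failOnDataLoss", "false")                           // assumed, per the suite name
          .load()
          .selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)")  // Project [cast(key ...), cast(value ...)]
          .as[(String, String)]                                        // DeserializeToObject newInstance(class scala.Tuple2)
          .map { case (_, value) => value.toInt }                      // MapElements ..., obj: int
          .writeStream
          .format("memory")
          .queryName("memory")                                         // "Query memory" in the log header
          .trigger(Trigger.Continuous("1 second"))                     // continuous mode, as in the stack trace; interval assumed
          .start()

        // awaitTermination() would rethrow the StreamingQueryException shown above.
        query.awaitTermination()
      }
    }

This is consistent with the trace itself: the InterruptedException surfaces from RpcEndpointRef.askSync inside ContinuousExecution.runContinuous while the query is in the RECONFIGURING state.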