Query memory [id = f91ec5e5-748c-4c4e-99c6-22c1e97b4274, runId = a108e709-07e1-46b6-bcfd-143159b9fae9] terminated with exception: null


      org.apache.spark.sql.streaming.StreamingQueryException: null
=== Streaming Query ===
Identifier: memory [id = f91ec5e5-748c-4c4e-99c6-22c1e97b4274, runId = a108e709-07e1-46b6-bcfd-143159b9fae9]
Current Committed Offsets: {}
Current Available Offsets: {}

Current State: RECONFIGURING
Thread State: RUNNABLE

Logical Plan:
SerializeFromObject [input[0, int, false] AS value#2765]
+- MapElements org.apache.spark.sql.kafka010.KafkaSourceStressForDontFailOnDataLossSuite$$Lambda$5030/0x000000080194a040@74d53976, class scala.Tuple2, [StructField(_1,StringType,true), StructField(_2,StringType,true)], obj#2764: int
   +- DeserializeToObject newInstance(class scala.Tuple2), obj#2763: scala.Tuple2
      +- Project [cast(key#2739 as string) AS key#2753, cast(value#2740 as string) AS value#2754]
         +- StreamingDataSourceV2Relation [key#2739, value#2740, topic#2741, partition#2742, offset#2743L, timestamp#2744, timestampType#2745], org.apache.spark.sql.kafka010.KafkaSourceProvider$KafkaScan@26b8aa7c, KafkaSource[SubscribePattern[failOnDataLoss.*]]

      at org.apache.spark.sql.execution.streaming.StreamExecution.org$apache$spark$sql$execution$streaming$StreamExecution$$runStream(StreamExecution.scala:351)
      at org.apache.spark.sql.execution.streaming.StreamExecution$$anon$1.run(StreamExecution.scala:242)
      Cause: java.lang.InterruptedException
      at java.base/java.util.concurrent.locks.AbstractQueuedSynchronizer.tryAcquireSharedNanos(AbstractQueuedSynchronizer.java:1367)
      at scala.concurrent.impl.Promise$DefaultPromise.tryAwait(Promise.scala:248)
      at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:258)
      at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:263)
      at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:220)
      at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75)
      at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:92)
      at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:76)
      at org.apache.spark.sql.execution.streaming.continuous.ContinuousExecution.runContinuous(ContinuousExecution.scala:271)
      at org.apache.spark.sql.execution.streaming.continuous.ContinuousExecution.runActivatedStream(ContinuousExecution.scala:109)
      at org.apache.spark.sql.execution.streaming.StreamExecution.org$apache$spark$sql$execution$streaming$StreamExecution$$runStream(StreamExecution.scala:330)
      at org.apache.spark.sql.execution.streaming.StreamExecution$$anon$1.run(StreamExecution.scala:242)
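
For context, the logical plan in the report above corresponds to a continuous-mode Kafka streaming query written roughly as follows. This is a minimal sketch only, not the code of the failing suite: the bootstrap server address, the trigger interval, and the body of the map are assumptions; only the subscribe pattern, the key/value casts, the tuple-to-int map, and the "memory" sink name are taken from the plan and identifier shown in the log.

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.streaming.Trigger

    val spark = SparkSession.builder().appName("kafka-continuous-sketch").getOrCreate()
    import spark.implicits._

    val query = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "localhost:9092") // assumed address
      .option("subscribePattern", "failOnDataLoss.*")       // pattern from KafkaSource[SubscribePattern[...]]
      .option("failOnDataLoss", "false")
      .load()
      // Project [cast(key as string), cast(value as string)] in the plan
      .selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)")
      // DeserializeToObject newInstance(class scala.Tuple2)
      .as[(String, String)]
      // MapElements ... obj: int (the actual mapping body is not visible in the log)
      .map(_ => 1)
      .writeStream
      .format("memory")
      .queryName("memory")                                  // matches Identifier: memory
      .trigger(Trigger.Continuous("1 second"))              // ContinuousExecution appears in the stack
      .start()

    query.awaitTermination()

The InterruptedException under ContinuousExecution.runContinuous, combined with Current State: RECONFIGURING, indicates the query thread was interrupted while blocked on an RPC askSync during a reconfiguration of the continuous run, which StreamExecution then surfaces as a StreamingQueryException with a null message.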