Query [id = ccba92a4-7385-4f0a-b702-42e12dda7905, runId = 0a762f78-15eb-4fe5-8e02-edce08615dce] terminated with exception: Timeout of 3000ms expired before the position for partition failOnDataLoss-14-0 could be determined


      org.apache.spark.sql.streaming.StreamingQueryException: Timeout of 3000ms expired before the position for partition failOnDataLoss-14-0 could be determined
=== Streaming Query ===
Identifier: [id = ccba92a4-7385-4f0a-b702-42e12dda7905, runId = 0a762f78-15eb-4fe5-8e02-edce08615dce]
Current Committed Offsets: {KafkaV2[SubscribePattern[failOnDataLoss.*]]: {"failOnDataLoss-1":{"0":10},"failOnDataLoss-0":{"0":27}}}
Current Available Offsets: {KafkaV2[SubscribePattern[failOnDataLoss.*]]: {"failOnDataLoss-1":{"0":10},"failOnDataLoss-0":{"0":27}}}

Current State: ACTIVE
Thread State: RUNNABLE

Logical Plan:
SerializeFromObject [input[0, int, false] AS value#18506]
+- MapElements org.apache.spark.sql.kafka010.KafkaSourceStressForDontFailOnDataLossSuite$$Lambda$4984/842103351@46bb6d60, class scala.Tuple2, [StructField(_1,StringType,true), StructField(_2,StringType,true)], obj#18505: int
   +- DeserializeToObject newInstance(class scala.Tuple2), obj#18504: scala.Tuple2
      +- Project [cast(key#18480 as string) AS key#18494, cast(value#18481 as string) AS value#18495]
         +- StreamingDataSourceV2Relation [key#18480, value#18481, topic#18482, partition#18483, offset#18484L, timestamp#18485, timestampType#18486], class org.apache.spark.sql.kafka010.KafkaSourceProvider$KafkaScan, KafkaV2[SubscribePattern[failOnDataLoss.*]]

      at org.apache.spark.sql.execution.streaming.StreamExecution.org$apache$spark$sql$execution$streaming$StreamExecution$$runStream(StreamExecution.scala:307)
      at org.apache.spark.sql.execution.streaming.StreamExecution$$anon$1.run(StreamExecution.scala:198)
      Cause: org.apache.kafka.common.errors.TimeoutException: Timeout of 3000ms expired before the position for partition failOnDataLoss-14-0 could be determined
      at
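Reading the log: the query subscribes with the pattern `failOnDataLoss.*`, yet the committed and available offsets only cover topics `failOnDataLoss-0` and `failOnDataLoss-1`, while the partition that timed out belongs to `failOnDataLoss-14`. So the pattern matched a topic created after the query started, and the Kafka consumer could not determine that new partition's starting position within 3000 ms. As a minimal sketch, in plain Scala with no Spark dependency, of how such a subscribe pattern selects topics (the topic names are taken from the offsets above; `other-topic` is an invented non-matching name):

```scala
// Sketch: how a SubscribePattern like the one in the log selects topics.
// Any topic whose full name matches the regex is picked up, including
// topics created after the query started, such as failOnDataLoss-14.
object SubscribePatternSketch {
  private val pattern = "failOnDataLoss.*".r

  // Return the subset of known topic names the pattern subscribes to.
  def matchedTopics(available: Seq[String]): Seq[String] =
    available.filter(t => pattern.pattern.matcher(t).matches())

  def main(args: Array[String]): Unit = {
    val topics = Seq("failOnDataLoss-0", "failOnDataLoss-1",
                     "failOnDataLoss-14", "other-topic")
    // prints: failOnDataLoss-0,failOnDataLoss-1,failOnDataLoss-14
    println(matchedTopics(topics).mkString(","))
  }
}
```

Note that `failOnDataLoss=false` (as the suite name suggests) only suppresses data-loss errors; it does not affect this position-lookup timeout. If the timeout were caused by genuinely slow brokers rather than a deliberately low test setting, the usual lever is to raise the consumer-side timeout, which Spark's Kafka source lets you pass through with the `kafka.` option prefix (e.g. `kafka.default.api.timeout.ms`).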