Kafka-Spark streaming stuck in an infinite loop with no progress

I am building a streaming application with Kafka and Spark, and I keep getting an endless stream of QueryProgressEvent updates. Each one says "Progress", but nothing actually happens: no input rows are detected and nothing gets processed. Please help.

    // Write each micro-batch to Cassandra via foreachBatch, triggering every minute
    val outputQuery = outputDF.writeStream
      .foreachBatch(writeToCassandra _)
      .outputMode("update")
      .option("checkpointLocation", "chk-point-dir")
      .trigger(Trigger.ProcessingTime("1 minute"))
      .start()
    logger.info("Waiting for Query")
    outputQuery.awaitTermination()
  }

  // Called once per micro-batch with that batch's DataFrame and its id
  def writeToCassandra(outputDF: DataFrame, batchID: Long): Unit = {
    outputDF.write
      .format("org.apache.spark.sql.cassandra")
      .option("keyspace", "spark_db")
      .option("table", "users")
      .mode("append")
      .save()
    outputDF.show()
  }

Hopefully someone will have some insight, but I should warn: this isn’t a great place to ask this question. This forum is focused on the Scala language, and that’s almost certainly not your problem – this is probably an issue with Spark. (Or Cassandra, or your Kafka configuration, or the Kafka-Spark library that you’re using.)
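
That said, one thing worth double-checking, since progress events with zero input rows usually just mean the source has nothing new to read: your snippet doesn't show how the Kafka source is defined, and with the spark-sql-kafka-0-10 source the `startingOffsets` option defaults to `"latest"` for a fresh streaming query, so a query started after the records were produced will sit idle until something new lands on the topic. Below is a minimal, self-contained sketch of a Kafka read with earliest offsets and a console sink for debugging; the broker address and topic name are placeholders, not anything from your setup:

    import org.apache.spark.sql.SparkSession

    object KafkaSourceCheck {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("KafkaSourceCheck")
          .master("local[*]")
          .getOrCreate()

        // Placeholder broker and topic; adjust to your environment.
        val inputDF = spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "localhost:9092")
          .option("subscribe", "users")
          // Default is "latest"; "earliest" makes a brand-new query (no checkpoint yet)
          // read everything already on the topic instead of waiting for new records.
          .option("startingOffsets", "earliest")
          .load()

        // Kafka delivers key/value as binary columns; cast to strings to inspect them.
        val query = inputDF
          .selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)")
          .writeStream
          .format("console")
          .outputMode("append")
          .start()

        query.awaitTermination()
      }
    }

If a debug query like this prints rows, the source is fine and the problem is somewhere in the Cassandra write path; if it still prints nothing, look at the Kafka side (topic name, broker address, and whether anything is actually being produced). Also note that `startingOffsets` only applies when the query starts with no existing checkpoint, so an old `chk-point-dir` can pin the query to stale offsets.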