Apache Spark 3.3.1 runtime exception

I am getting a runtime exception “Exception in User Class: org.apache.spark.SparkException : Job aborted due to stage failure: Task 0 in stage 34.0 failed 4 times, most recent failure: Lost task 0.3 in stage 34.0 (TID 46) (10.67.233.121 executor 1): java.lang.RuntimeException: Duplicate map key was found, please check the input data. If you want to remove the duplicated keys, you can set spark.sql.mapKeyDedupPolicy to LAST_WIN so that the key inserted at last takes precedence.”

I have verified that there are no duplicate records for the key. I have set spark.sql.mapKeyDedupPolicy to LAST_WIN as the message suggests, but I am still getting the same error.
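For context, my understanding of this failure mode is that Spark raises the error whenever a map value is materialized with duplicate keys under the default EXCEPTION policy. A minimal sketch that reproduces it (hypothetical example, assuming an active session named spark):

// Under the default policy (EXCEPTION), building a map with duplicate
// keys throws "Duplicate map key was found".
spark.sql("SELECT map_from_arrays(array('a', 'a'), array(1, 2))").show()

// With LAST_WIN, the value inserted last takes precedence instead.
spark.sql("SET spark.sql.mapKeyDedupPolicy = LAST_WIN")
spark.sql("SELECT map_from_arrays(array('a', 'a'), array(1, 2))").show()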

Please see the code snippet below:

// Iterate over the DataFrame's partitions on the executors and log
// each row, processing in batches of UPDATE_BATCH_SIZE.
sourceDF.rdd.foreachPartition(partition => {
  partition.grouped(UPDATE_BATCH_SIZE).foreach(batch => {
    batch.foreach(row => {
      logger.info(s"row : $row")
    })
  })
})

I tried updating the policy as shown below:

val sparkSession: SparkSession = {
  val spark: SparkContext = SparkContext.getOrCreate()
  val conf: SparkConf = spark.getConf
  // Set the dedup policy on the SparkConf before building the session
  // (straight quotes; curly quotes will not compile).
  conf.set("spark.sql.mapKeyDedupPolicy", "LAST_WIN")

  SparkSession.builder().config(conf).getOrCreate()
}
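For reference, spark.sql.mapKeyDedupPolicy is a runtime SQL configuration, so I believe it can also be set directly on an already-running session rather than through the SparkConf. A minimal sketch of that alternative (assuming the sparkSession built above):

// Set the policy on the session's runtime conf; runtime SQL configs
// like this one can be changed after the session has been created.
sparkSession.conf.set("spark.sql.mapKeyDedupPolicy", "LAST_WIN")

// Equivalent SQL form:
sparkSession.sql("SET spark.sql.mapKeyDedupPolicy = LAST_WIN")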