```scala
val abc = Map(
  "connection.uri" -> mongoConnectionUri,
  "database" -> mongoDbname,
  "collection" -> mongoCollection)

df.write.format("mongodb").options(abc).mode("append").save()
```
**Error while writing a DataFrame to MongoDB.** I am able to print both `df` and `abc`. I have installed the latest mongo-spark-connector 10.0.2 jar. I am getting an error like:
```
NoClassDefFoundError: com/mongodb/WriteConcern
	at com.mongodb.spark.sql.connector.config.WriteConfig.createWriteConcern(WriteConfig.java:212)
	at com.mongodb.spark.sql.connector.config.WriteConfig.<init>(WriteConfig.java:166)
	at com.mongodb.spark.sql.connector.config.MongoConfig.writeConfig(MongoConfig.java:80)
	at com.mongodb.spark.sql.connector.config.MongoConfig.toWriteConfig(MongoConfig.java:226)
	…
```
Keep in mind, this isn’t a great forum for asking Spark questions – this is mainly for the Scala language, not the Spark platform. So you may be able to find better advice elsewhere.
That said, `NoClassDefFoundError`s tend to be due to dependency evictions. Odds are that you have two different things depending on different (and incompatible) versions of the MongoDB library. You probably need to track down the incompatibility and make them consistent.
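One way to investigate, sketched below under the assumption you use an sbt build (the version numbers are illustrative, not verified): `com.mongodb.WriteConcern` lives in the MongoDB Java driver, not in the connector jar itself, so installing only the connector jar by hand can leave the driver off the classpath entirely; and if the driver *is* present twice at different versions, sbt's eviction reporting can show which one won.

```scala
// build.sbt sketch -- coordinates and versions are illustrative assumptions
libraryDependencies ++= Seq(
  // Spark provided by the cluster at runtime
  "org.apache.spark" %% "spark-sql" % "3.2.1" % Provided,
  // The 10.x connector; letting sbt resolve it pulls in the matching
  // MongoDB Java driver (which provides com.mongodb.WriteConcern)
  // as a transitive dependency, instead of a lone hand-installed jar.
  "org.mongodb.spark" %% "mongo-spark-connector" % "10.0.2"
)
```

Then `sbt evicted` (or `sbt dependencyTree`, with the dependency-graph plugin on older sbt versions) will list conflicting versions so you can pin one consistently. If you submit with `spark-submit`, passing `--packages org.mongodb.spark:mongo-spark-connector_2.12:10.0.2` similarly resolves the driver transitively rather than relying on a single manually copied jar.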