
Tag: pyspark

How to resolve (java.lang.ClassNotFoundException: com.mongodb.spark.sql.DefaultSource.DefaultSource) in PySpark when using PyCharm

With PyCharm I’m getting this error: java.lang.ClassNotFoundException: com.mongodb.spark.sql.DefaultSource.DefaultSource. How can I resolve this issue? I tried passing the jars to Spark, and I also tried setting the classpath of the jars in .bash_profile. I had many jars in my_jars but still couldn’t get it to work; I keep getting the same error. Answer: Provide comma-separated jar files instead of a directory path in spark.jars. Alternatively you can …
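A minimal sketch of what the accepted answer describes: pass each jar file explicitly as a comma-separated list in spark.jars rather than pointing at a directory. The jar paths, connector version, and MongoDB URI below are assumptions for illustration; substitute the files actually present on your machine.

```python
from pyspark.sql import SparkSession

# Hypothetical local jar paths -- list each jar file explicitly,
# comma-separated, instead of passing the my_jars directory itself.
jars = ",".join([
    "/path/to/my_jars/mongo-spark-connector_2.12-3.0.1.jar",
    "/path/to/my_jars/mongodb-driver-sync-4.0.5.jar",
    "/path/to/my_jars/mongodb-driver-core-4.0.5.jar",
    "/path/to/my_jars/bson-4.0.5.jar",
])

spark = (
    SparkSession.builder
    .appName("mongo-example")
    .config("spark.jars", jars)  # comma-separated jar files, not a directory path
    # Alternative: let Spark pull the connector from Maven instead of local jars
    # .config("spark.jars.packages",
    #         "org.mongodb.spark:mongo-spark-connector_2.12:3.0.1")
    .getOrCreate()
)

# Read from MongoDB through the connector; the URI is an assumed example.
df = (
    spark.read.format("com.mongodb.spark.sql.DefaultSource")
    .option("uri", "mongodb://127.0.0.1/test.myCollection")
    .load()
)
df.printSchema()
```

The commented spark.jars.packages line is the usual alternative when you would rather have Spark resolve the connector and its dependencies from Maven Central than manage local jar files by hand.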
