
Add Jar To Pyspark When Using Notebook

I'm trying the MongoDB Hadoop integration with Spark, but I can't figure out how to make the jars accessible to an IPython notebook. Here's what I'm trying to do: # set up parameters fo

Solution 1:

This issue looks very similar; please let me know if it helps: https://issues.apache.org/jira/browse/SPARK-5185
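For reference, here is a minimal sketch of one common workaround, assuming a Spark 1.x-era PySpark install and a locally downloaded copy of the MongoDB Hadoop connector jar (the path and version below are hypothetical): set PYSPARK_SUBMIT_ARGS at the top of the notebook, before the SparkContext is created, so the jar lands on both the driver and executor classpaths.

    import os

    # Hypothetical path and version; point this at your local copy of the connector jar.
    mongo_jar = "/path/to/mongo-hadoop-core-1.5.2.jar"

    # Must be set before the SparkContext exists. --jars ships the jar to executors,
    # --driver-class-path puts it on the driver's classpath; "pyspark-shell" is required
    # at the end for PySpark to pick these arguments up.
    os.environ["PYSPARK_SUBMIT_ARGS"] = (
        "--jars {0} --driver-class-path {0} pyspark-shell".format(mongo_jar)
    )

    from pyspark import SparkConf, SparkContext

    conf = SparkConf().setAppName("mongo-hadoop-notebook")
    sc = SparkContext(conf=conf)

Once the jar is on the classpath, the MongoDB collection can be read through sc.newAPIHadoopRDD with com.mongodb.hadoop.MongoInputFormat, as described in the mongo-hadoop documentation.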

