Hi Stephen,

It should be enough to include

> --jars /path/to/file.jar

in the command-line call to either pyspark or spark-submit, as in

> spark-submit --master local --jars /path/to/file.jar myfile.py

You can check the bottom of the Web UI's "Environment" tab to make sure the jar ends up on your classpath. Let me know if you still see errors related to this.

— Jeremy

-------------------------
jeremyfreeman.net
@thefreemanlab

On Dec 29, 2014, at 7:55 PM, Stephen Boesch <java...@gmail.com> wrote:

> What is the recommended way to do this? We have some native database
> client libraries for which we are adding pyspark bindings.
>
> The pyspark invokes spark-submit. Do we add our libraries to
> the SPARK_SUBMIT_LIBRARY_PATH?
>
> This issue relates back to an error we have been seeing "Py4JError: Trying
> to call a package" - the suspicion being that the third-party libraries may
> not be available on the JVM side.
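To make the suggestion above concrete, here is a minimal sketch of both invocation styles; the jar and script paths (/path/to/file.jar, myfile.py) are placeholders, not real files:

```shell
# Put an extra jar on the driver and executor classpaths when
# running a pyspark application with spark-submit
spark-submit --master local --jars /path/to/file.jar myfile.py

# The same flag works for an interactive pyspark shell
pyspark --master local --jars /path/to/file.jar

# Multiple jars are passed as a single comma-separated list
spark-submit --master local \
  --jars /path/to/client.jar,/path/to/deps.jar \
  myfile.py
```

After the application starts, the jars should appear near the bottom of the "Environment" tab in the Web UI; if they are missing there, calls into those packages from the Python side will fail with the Py4JError you mentioned.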