Re: How to add kafka streaming jars when initialising the sparkcontext in python

2016-02-15 Thread Jorge Machado
Hi David, just package everything into one jar with Maven and deploy that. You don't need to do it like this… use Maven, for example. Also check whether your cluster already has these libraries loaded. If you are using CDH, for example, you can just import the classes, because they are already on the classpath.
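The "one jar" approach Jorge describes is usually done with the Maven shade plugin, which bundles your code and its dependencies into a single uber-jar at `package` time. A minimal sketch of the plugin configuration, assuming a standard Maven project layout (the plugin version shown is illustrative):

```
<!-- pom.xml fragment: build an uber-jar containing all dependencies.
     Plugin version is an assumption; use whatever your build standardizes on. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>2.4.3</version>
  <executions>
    <execution>
      <!-- Run the shade goal when "mvn package" executes -->
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```

With this in place, `mvn package` produces a single jar you can hand to `spark-submit` without a separate `--jars` list. Dependencies your cluster already provides (e.g. under CDH) should be marked `<scope>provided</scope>` so they are not bundled twice.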

How to add kafka streaming jars when initialising the sparkcontext in python

2016-02-15 Thread David Kennedy
I have no problems when submitting the task using spark-submit: the --jars option with the list of required jars works, and I see the jars being added in the output: 16/02/10 11:14:24 INFO spark.SparkContext: Added JAR file:/usr/lib/spark/extras/lib/spark-streaming-kafka.jar at
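When a SparkContext is created directly from a Python script (rather than via spark-submit), one common workaround is to set PYSPARK_SUBMIT_ARGS in the environment before pyspark is imported; pyspark reads this variable at import time and passes the arguments, including --jars, to the JVM it launches. A minimal sketch, using the jar path from the log line above (the trailing "pyspark-shell" token is required):

```python
import os

# Path taken from the spark-submit log in this thread; adjust for your install.
jar = "/usr/lib/spark/extras/lib/spark-streaming-kafka.jar"

# Must be set BEFORE "from pyspark import SparkContext" is executed,
# because pyspark reads it when the module is first imported.
os.environ["PYSPARK_SUBMIT_ARGS"] = "--jars {} pyspark-shell".format(jar)

# from pyspark import SparkContext  # pyspark now launches the JVM with the jar
# sc = SparkContext(appName="kafka-stream")
```

The pyspark import is left commented out here since it only works on a machine with Spark installed; the point is the ordering of the environment assignment relative to the import.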

How to add kafka streaming jars when initialising the sparkcontext in python

2016-02-10 Thread David Kennedy
I have no problems when submitting the task using spark-submit: the --jars option with the list of required jars works, and I see the jars being added in the output: 16/02/10 11:14:24 INFO spark.SparkContext: Added JAR file:/usr/lib/spark/extras/lib/spark-streaming-kafka.jar at
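One detail worth noting when building the spark-submit invocation programmatically: --jars takes a comma-separated list (not colon-separated like a classpath). A small sketch that assembles the command line, using the jar path from the log line above (the script name is hypothetical):

```python
# Build a spark-submit command for a PySpark streaming job.
# Jar path comes from the log line in this thread; the script name is illustrative.
jars = ["/usr/lib/spark/extras/lib/spark-streaming-kafka.jar"]

# --jars expects its entries joined with commas, not colons.
cmd = ["spark-submit", "--jars", ",".join(jars), "my_streaming_job.py"]
```

The resulting list can be passed to subprocess.run(cmd) on a machine where spark-submit is on the PATH.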