Re: Standalone Cluster: ClassNotFound org.apache.kafka.common.serialization.ByteArrayDeserializer

2017-12-28 Thread Shixiong(Ryan) Zhu
The cluster mode doesn't upload jars to the driver node. This is a known issue: https://issues.apache.org/jira/browse/SPARK-4160

Re: Standalone Cluster: ClassNotFound org.apache.kafka.common.serialization.ByteArrayDeserializer

2017-12-27 Thread Geoff Von Allmen
I’ve tried it both ways. The uber jar gives me the following: Caused by: java.lang.ClassNotFoundException: Failed to find data source: kafka. Please find packages at http://spark.apache.org/third-party-projects.html If I only do minimal packaging and add
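One common cause of the "Failed to find data source: kafka" error with an uber jar is that the assembly merge strategy discards the META-INF/services files Spark uses to discover data sources. A minimal build.sbt sketch, assuming sbt-assembly is in use (the merge rules are illustrative, not the poster's actual build):

    // Keep service registrations so Spark can discover the kafka source provider
    assemblyMergeStrategy in assembly := {
      case PathList("META-INF", "services", xs @ _*) => MergeStrategy.concat
      case PathList("META-INF", xs @ _*)             => MergeStrategy.discard
      case _                                         => MergeStrategy.first
    }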

Re: Standalone Cluster: ClassNotFound org.apache.kafka.common.serialization.ByteArrayDeserializer

2017-12-27 Thread Eyal Zituny
Hi, it seems that you're missing the kafka-clients jar (and probably some other dependencies as well). How did you package your application jar? Does it include all the required dependencies (as an uber jar)? If it's not an uber jar, you need to pass via the driver-class-path and the
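For reference, a sketch of the sbt dependencies that bring in kafka-clients transitively; the Spark version shown is only an example and should match the cluster:

    // build.sbt: spark-sql-kafka-0-10 pulls in org.apache.kafka:kafka-clients transitively
    val sparkVersion = "2.2.1"  // example only; match your cluster's Spark version
    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-sql"            % sparkVersion % "provided",
      "org.apache.spark" %% "spark-sql-kafka-0-10" % sparkVersion
    )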

Standalone Cluster: ClassNotFound org.apache.kafka.common.serialization.ByteArrayDeserializer

2017-12-26 Thread Geoff Von Allmen
I am trying to deploy a standalone cluster but am running into ClassNotFound errors. I have tried a myriad of approaches, from packaging all dependencies into a single JAR to using the --packages and --driver-class-path options. I’ve got a master node started, a slave node
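For context, this is the kind of code that exercises the failing class loading; a minimal Structured Streaming sketch (broker address, topic, and object name are placeholders). The kafka source configures org.apache.kafka.common.serialization.ByteArrayDeserializer internally, so kafka-clients must be visible to both the driver and the executors:

    import org.apache.spark.sql.SparkSession

    object KafkaReadSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder.appName("KafkaReadSketch").getOrCreate()

        // Reading from Kafka makes Spark instantiate ByteArrayDeserializer from kafka-clients
        val frames = spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")  // placeholder broker
          .option("subscribe", "events")                     // placeholder topic
          .load()

        val query = frames.selectExpr("CAST(value AS STRING)")
          .writeStream
          .format("console")
          .start()

        query.awaitTermination()
      }
    }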