Github user tdas commented on the pull request:
https://github.com/apache/spark/pull/3798#issuecomment-69695353
Can you confirm the following:
1. In the SBT/Maven app you use for testing, are you compiling against your
development Spark version? That is, the dev version is locally published and
you are compiling your app against Spark version 1.3.0-SNAPSHOT?
2. Do you have the spark-streaming dependency in "provided" scope or the
default "compile" scope? And are you then creating an uber jar of the app?
3. Are you submitting the app through spark-submit to the same development
Spark version you compiled against?
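
For reference, point 2 above could be checked against a build.sbt along these
lines. This is only a sketch of the setup being asked about, not the
reporter's actual build file; the artifact names are the standard Spark
modules, and the version is the 1.3.0-SNAPSHOT mentioned above:

```scala
// build.sbt — hypothetical sketch of the setup point 2 asks about:
// compile against the locally published dev build, with Spark itself in
// "provided" scope so spark-submit supplies it and it stays out of the
// uber jar produced by sbt-assembly.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"      % "1.3.0-SNAPSHOT" % "provided",
  "org.apache.spark" %% "spark-streaming" % "1.3.0-SNAPSHOT" % "provided",
  // the Kafka connector is typically left in the default "compile" scope
  // so it is bundled into the uber jar
  "org.apache.spark" %% "spark-streaming-kafka" % "1.3.0-SNAPSHOT"
)
```

With the "provided" entries, the classes loaded at runtime come from the
cluster's Spark build, which is why a mismatch between the compile-time
snapshot and the deployed snapshot can surface as the kind of
incompatibility error discussed here.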
On Mon, Jan 12, 2015 at 2:13 PM, Cody Koeninger <[email protected]>
wrote:
> Yeah, this is on a local development version, after assembly / publish
> local.
>
> Here's a gist of the exception and the diff that causes it (using
> KafkaRDDPartition instead of a tuple)
>
> https://gist.github.com/koeninger/561a61482cd1b5b3600c
>
> Reply to this email directly or view it on GitHub
> <https://github.com/apache/spark/pull/3798#issuecomment-69656800>.
>