[Streaming] Akka-based receiver with messages defined in uploaded jar

2014-08-05 Thread Anton Brazhnyk
Greetings. I modified the ActorWordCount example a little: it uses a simple case class as the Streaming message instead of a primitive string. I also changed the launch code to not use the run-example script; instead it sets the Spark master in the code and attaches the jar (setJars(...)) with all the classes
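The launch-code change described above can be sketched roughly as follows. This is a minimal sketch, not the poster's actual code; the master URL, jar path, app name, and the WordMessage case class are illustrative placeholders:

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

// Hypothetical case-class message, standing in for the plain String
// that the stock ActorWordCount example uses.
case class WordMessage(word: String, count: Int)

object CustomActorWordCount {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("CustomActorWordCount")
      .setMaster("spark://master-host:7077")          // master set in code, not via run-example
      .setJars(Seq("/path/to/app-with-messages.jar")) // ship the jar containing WordMessage
    val ssc = new StreamingContext(conf, Seconds(2))
    // ... create the actor-based receiver stream of WordMessage here ...
    ssc.start()
    ssc.awaitTermination()
  }
}
```

The point of setJars(...) is that the executors must be able to deserialize WordMessage, so the jar defining it has to reach their classpath.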

RE: [Streaming] Akka-based receiver with messages defined in uploaded jar

2014-08-29 Thread Anton Brazhnyk
Just checked it with 1.0.2. Still the same exception. [...] Sorry for the delay

RE: [Streaming] Akka-based receiver with messages defined in uploaded jar

2014-09-02 Thread Anton Brazhnyk
at either Spark or application code? [...] Can you try adding the JAR

RE: how to setup steady state stream partitions

2014-09-10 Thread Anton Brazhnyk
Just a guess: updateStateByKey has overloaded variants that take a partitioner as a parameter. Can it help? [...]
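A sketch of what that overload looks like in use, assuming Spark Streaming's pair-DStream API; the state type (running Int counts) and the partition count of 8 are illustrative, not from the thread:

```scala
import org.apache.spark.HashPartitioner
import org.apache.spark.streaming.dstream.DStream

// Assumes a `words: DStream[(String, Int)]` already exists.
// Passing an explicit Partitioner to updateStateByKey keeps the
// state RDD's partitioning stable across batches, so each key's
// state stays on the same partition.
def runningCounts(words: DStream[(String, Int)]): DStream[(String, Int)] = {
  val updateFunc = (newValues: Seq[Int], state: Option[Int]) =>
    Some(newValues.sum + state.getOrElse(0))
  words.updateStateByKey(updateFunc, new HashPartitioner(8))
}
```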

SPARK-2243 Support multiple SparkContexts in the same JVM

2014-12-17 Thread Anton Brazhnyk
Greetings. The first comment on the issue says the reason for not supporting multiple contexts is: "There are numerous assumptions in the code base that use a shared cache or thread-local variables or some global identifiers which prevent us from using multiple SparkContexts." May it be worked

RE: spark-core in a servlet

2015-02-18 Thread Anton Brazhnyk
Check the dependencies. It looks like you have a conflict around servlet-api jars. Maven's dependency:tree, some exclusions, and some luck :) could help. [...]
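As a concrete illustration of that advice: running `mvn dependency:tree` shows which artifact drags in a conflicting servlet-api, and the offender can then be excluded from the Spark dependency. The exact groupId/artifactId below are examples of a typical culprit from that era, not taken from the original thread; trust what dependency:tree actually prints:

```xml
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.2.1</version>
  <exclusions>
    <!-- Exclude the transitive servlet-api that clashes with the container's -->
    <exclusion>
      <groupId>org.eclipse.jetty.orbit</groupId>
      <artifactId>javax.servlet</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```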

Spark's Guava pieces cause exceptions in non-trivial deployments

2015-05-14 Thread Anton Brazhnyk
Greetings. I have a relatively complex application in which Spark, Jetty, and Guava (16) do not fit together. The exception happens when some components try to use a mix of Guava classes (including Spark's repackaged pieces) that are loaded by different classloaders: java.lang.LinkageError: loader constraint

RE: Spark's Guava pieces cause exceptions in non-trivial deployments

2015-05-14 Thread Anton Brazhnyk
The problem is with 1.3.1: it has the Function class (mentioned in the exception) in spark-network-common_2.10-1.3.1.jar. Our current resolution is actually a backport to 1.2.2, which works fine. [...]

RE: Spark's Guava pieces cause exceptions in non-trivial deployments

2015-05-15 Thread Anton Brazhnyk
That’s why I proposed to put them into a separate Maven artifact, where they could simply be excluded in the build of any app that depends on Spark. [...]
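Under that proposal, an application could drop Spark's bundled Guava pieces with a standard Maven exclusion. The artifact name below is hypothetical, since no such separate artifact existed at the time; it only illustrates the shape of the fix being suggested:

```xml
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.3.1</version>
  <exclusions>
    <!-- Hypothetical artifact holding Spark's relocated Guava classes;
         excluding it would let the app supply its own Guava 16. -->
    <exclusion>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-guava-shaded</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```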