Sorry for the long silence. We were able to:
1. Pass parameters from Vaadin (Java framework) to spark-jobserver using the
HttpURLConnection POST method.
2. Receive filtered (based on the passed parameters) RDD results from
spark-jobserver using the HttpURLConnection GET method.
3. Finally, show the results on the Vaadin UI.
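For anyone following the thread, a minimal Java sketch of step 2 (fetching a job result with an HttpURLConnection GET). The host/port and the job id are assumptions for illustration; GET /jobs/&lt;jobId&gt; is the spark-jobserver endpoint that returns the job status and, once finished, its result as JSON.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class JobResultClient {
    // Assumed spark-jobserver host/port; adjust for your deployment.
    static final String BASE_URL = "http://localhost:8090";

    // Build the result URL for a given job id.
    static String resultUrl(String jobId) {
        return BASE_URL + "/jobs/" + jobId;
    }

    // Fetch the JSON body with a plain HttpURLConnection GET.
    static String fetchResult(String jobId) throws IOException {
        HttpURLConnection conn =
                (HttpURLConnection) new URL(resultUrl(jobId)).openConnection();
        conn.setRequestMethod("GET");
        StringBuilder body = new StringBuilder();
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream(), StandardCharsets.UTF_8))) {
            String line;
            while ((line = in.readLine()) != null) {
                body.append(line);
            }
        }
        return body.toString();
    }

    public static void main(String[] args) {
        // No live server here, so just show the URL the GET would hit.
        System.out.println(resultUrl("abc123"));
    }
}
```

From there the JSON body can be parsed and handed to the Vaadin table/grid for display.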
Thanks Vasu. Let me get back to you once I am done with trials.
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/How-to-pass-parameters-to-a-spark-jobserver-Scala-class-tp21671p21732.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
How to modify the above code to pass parameters (as we do with
*curl -d ...*) during a job run?
Hope I make sense.
Sasi
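The `curl -d` analogue in plain Java is a POST with a written request body — a sketch only, where the host/port, appName, and classPath values are hypothetical examples:

```java
import java.io.IOException;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class JobSubmitClient {
    static final String BASE_URL = "http://localhost:8090"; // assumed host/port

    // Mirrors `curl -d "input=..." '<host>/jobs?appName=...&classPath=...'`.
    static String submitUrl(String appName, String classPath) {
        return BASE_URL + "/jobs?appName=" + appName + "&classPath=" + classPath;
    }

    // Write the body exactly as `curl -d` would.
    static int postJob(String appName, String classPath, String body)
            throws IOException {
        HttpURLConnection conn = (HttpURLConnection)
                new URL(submitUrl(appName, classPath)).openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true); // enables writing a request body
        try (OutputStream os = conn.getOutputStream()) {
            os.write(body.getBytes(StandardCharsets.UTF_8));
        }
        return conn.getResponseCode();
    }

    public static void main(String[] args) {
        // No live server here, so just show the submission URL.
        System.out.println(submitUrl("demo", "com.example.FilterJob"));
    }
}
```

The string passed as `body` (e.g. `"input.string = a b c"`) plays the role of the `-d` payload.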
Thank you Abhishek. The code works.
Dear All,
For our requirement, we need to define a SparkContext with a SparkConf that
has the Cassandra connection details. This SparkContext needs to be shared
across subsequent runJob calls and throughout the application.
So, how do we define a SparkContext with a Cassandra connection for
spark-jobserver?
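A minimal sketch of one way to do it, assuming the spark-cassandra-connector is on the classpath; `spark.cassandra.connection.host` is the connector's standard property, while the master URL and app name below are illustrative:

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class CassandraContextFactory {
    // Build one SparkContext with the Cassandra host baked into SparkConf,
    // then reuse it for every subsequent runJob call.
    public static JavaSparkContext create() {
        SparkConf conf = new SparkConf(true)
                .set("spark.cassandra.connection.host", "127.0.0.1")
                .setMaster("spark://localhost:7077") // illustrative master URL
                .setAppName("jobserver cassandra demo");
        return new JavaSparkContext(conf);
    }
}
```

Keeping the created context in a single shared field (rather than building a new one per job) is what lets later runJob calls reuse the same Cassandra-aware context.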
Thank you Abhishek. That works.
--
How to remove submitted JARs from spark-jobserver?
---
We were able to resolve *SparkException: Job aborted due to stage failure: All
masters are unresponsive! Giving up* as well. Spark-jobserver is working fine
now, and we need to experiment more.
Thank you, guys.
Boris,
Yes, as you mentioned, we are creating a new SparkContext for our job. The
reason is to define the Apache Cassandra connection using SparkConf. We
hope this should also work.
For uploading the JAR, we followed:
(1) Package the JAR using the *sbt package* command.
(2) Use *curl --data-binary
@target/s
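The upload in step (2) can also be done straight from Java, which may suit the Vaadin side better than shelling out to curl. A sketch, with assumed host/port and a hypothetical app name; POST /jars/&lt;appName&gt; is the endpoint that `curl --data-binary @target/<jar> <host>/jars/<appName>` talks to:

```java
import java.io.IOException;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.file.Files;
import java.nio.file.Paths;

public class JarUploader {
    static final String BASE_URL = "http://localhost:8090"; // assumed host/port

    // The jar-upload endpoint for a given app name.
    static String uploadUrl(String appName) {
        return BASE_URL + "/jars/" + appName;
    }

    // POST the jar bytes as the raw request body, like `curl --data-binary`.
    static int upload(String appName, String jarPath) throws IOException {
        byte[] jar = Files.readAllBytes(Paths.get(jarPath));
        HttpURLConnection conn =
                (HttpURLConnection) new URL(uploadUrl(appName)).openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        conn.setRequestProperty("Content-Type", "application/java-archive");
        try (OutputStream os = conn.getOutputStream()) {
            os.write(jar);
        }
        return conn.getResponseCode();
    }

    public static void main(String[] args) {
        // No live server here, so just show the upload URL.
        System.out.println(uploadUrl("demo"));
    }
}
```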
Thank you Pankaj. We are able to create the uber JAR (very good for binding all
dependency JARs together) and run it on spark-jobserver. One step further
than where we were.
However, we are now facing *SparkException: Job aborted due to stage failure: All
masters are unresponsive! Giving up*. We may need to rai
Boris,
Thank you for your suggestion. I used the following code and am still facing
the same issue -
val conf = new SparkConf(true)
  .set("spark.cassandra.connection.host", "127.0.0.1")
  .setAppName("jobserver test demo")
  .set
We read about the EXTRA_JAR environment variable.
How do we set the EXTRA_JAR environment variable for the spark-jobserver SBT
build when running on Windows Server 2012?
Will setting the EXTRA_JAR environment variable help resolve the above
class-not-found exception?
Your suggestions would be appreciated.
Sasi
Thanks Abhishek. We are good now with an answer to try.
Thanks Abhishek. We understand your point and will try using REST URL.
However, one concern: we currently have around 100,000 (1 lakh) rows in our
Cassandra table. Will the REST URL response withstand that response size?
The reason is that we have a Vaadin (Java framework) application which displays
data from a Spark RDD, which in turn gets its data from Cassandra. As we know,
we need to use Maven for building against the Spark API in Java.
We tested spark-jobserver using SBT and were able to run it. However, for our
requirement, we nee
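The Maven point above boils down to a couple of pom.xml dependencies. A sketch; the spark-cassandra-connector coordinates match the 1.1.0-alpha3 jar mentioned later in this thread, while the spark-core version is an era-appropriate assumption — pick the Spark/Scala versions that match your cluster:

```xml
<!-- Illustrative coordinates; align versions with your cluster. -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.1.0</version>
</dependency>
<dependency>
  <groupId>com.datastax.spark</groupId>
  <artifactId>spark-cassandra-connector_2.10</artifactId>
  <version>1.1.0-alpha3</version>
</dependency>
```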
Does my question make sense, or does it require some elaboration?
Sasi
(for Apache Spark - Java programming).
We would be glad for your suggestions.
Sasi
Adding to my message.
On Tue, Oct 28, 2014 at 3:22 PM, Sasi [via Apache Spark User List] <
ml-node+s1001560n17471...@n3.nabble.com> wrote:
> Thank you Akhil. You are correct it's about overlapped "thrift" libraries.
> We have taken reference from
> http://mail-archives.
following order -
a) cassandra-driver-core-2.1.0.jar
b) cassandra-thrift-2.1.0.jar
c) libthrift-0.9.0.jar
d) spark-cassandra-connector_2.10-1.1.0-alpha3.jar
It resolved our issue.
Sasi
e) cassandra-clientutil-2.1.0.jar
f) cassandra-driver-core-2.1.0.jar
Did we miss any JAR file? Or is this the right way to connect Spark with
Cassandra? Any guidance would be appreciated.
Sasi