Regarding Spark Cassandra Metrics

2022-01-31 Thread Yogesh Kumar Garg
Hi all, I am developing a Spark application where I am loading data into Cassandra using the Spark Cassandra connector. I have created a FAT jar with all the dependencies and submitted it using spark-submit. I am able to load the data successfully into Cassandra, but I

Re: spark cassandra questiom

2020-11-23 Thread Sonal Goyal
Yes, it should be good to use Spark for this use case in my opinion. You can look into using the Cassandra Spark connector for persisting your updated data into Cassandra. Cheers, Sonal Nube Technologies Join me at Data Con LA Oct 23 | Big Data Conference Europe. Nov 24 |

spark cassandra questiom

2020-11-10 Thread adfel70
I am very new to both Spark and Spark Structured Streaming. I have to write an application that receives very large CSV files in an HDFS folder. The app must take each file and, for each row, read some rows from the Cassandra database (not many rows will be returned for each row in the CSV).
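A minimal batch sketch of the per-row lookup using the connector's joinWithCassandraTable, which does point lookups against the Cassandra partition key instead of scanning the whole table (wiring the same lookup into Structured Streaming is a separate step). It assumes a spark-shell or app with the connector on the classpath, a hypothetical keyspace "ks" and table "lookup" whose partition key is a single text column "id", and an illustrative HDFS path.

  import com.datastax.spark.connector._

  // one lookup key per CSV row; the key must be wrapped in a tuple (or case class)
  val keys = sc.textFile("hdfs:///data/incoming/*.csv")
    .map(_.split(",")(0))
    .map(Tuple1(_))

  // fetches only the Cassandra rows matching each key, rather than a full table scan
  val joined = keys.joinWithCassandraTable("ks", "lookup")
  joined.take(10).foreach(println)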

Re: spark-cassandra-connector_2.1 caused java.lang.NoClassDefFoundError under Spark 2.4.2?

2019-05-09 Thread Russell Spitzer
> On Mon, May 6, 2019, 7:59 PM Richard Xin wrote: >> org.apache.spark >> spark-core_2.12 >> 2.4.0 >> compile >> org.apache.spark >> spark-sql_2.12 >> 2.4.0

Re: spark-cassandra-connector_2.1 caused java.lang.NoClassDefFoundError under Spark 2.4.2?

2019-05-06 Thread Russell Spitzer
>> Spark is shown at 2.12, the connector only has a 2.11 release >> On Mon, May 6, 2019, 7:59 PM Richard Xin wrote: >> org.apache.spark >> spark-core_2.12 >> 2.4.0 >> compile
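The root cause named above is the Scala binary version: _2.12 Spark artifacts mixed with a _2.11 connector build. A minimal sbt sketch of one self-consistent combination (everything on Scala 2.11); the exact version numbers are illustrative, not a recommendation.

  scalaVersion := "2.11.12"

  libraryDependencies ++= Seq(
    "org.apache.spark"   %% "spark-core"                % "2.4.0" % "provided",
    "org.apache.spark"   %% "spark-sql"                 % "2.4.0" % "provided",
    "com.datastax.spark" %% "spark-cassandra-connector" % "2.4.1"
  )

The same rule applies to the Maven build quoted in this thread: the suffix after the underscore (_2.11 vs _2.12) must match across spark-core, spark-sql, the connector, and the Spark runtime that spark-submit launches.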

Re: spark-cassandra-connector_2.1 caused java.lang.NoClassDefFoundError under Spark 2.4.2?

2019-05-06 Thread Russell Spitzer
e.spark > spark-core_2.12 > 2.4.0 > compile > > > org.apache.spark > spark-sql_2.12 > 2.4.0 > > > com.datastax.spark > spark-cassandra-connector_2.11 > 2.4.1 > > > > > I run spark-submit I got following

Re: spark-cassandra-connector_2.1 caused java.lang.NoClassDefFoundError under Spark 2.4.2?

2019-05-06 Thread Richard Xin
org.apache.spark spark-sql_2.12 2.4.0 com.datastax.spark spark-cassandra-connector_2.11 2.4.1 When I run spark-submit I get the following exceptions on Spark 2.4.2; it works fine when running spark-submit under Spark 2.4.0 with exactly the same command-line call. Any idea how do i

Re: spark-cassandra-connector_2.1 caused java.lang.NoClassDefFoundError under Spark 2.4.2?

2019-05-06 Thread Russell Spitzer
spark-sql_2.12 > 2.4.0 > > > com.datastax.spark > spark-cassandra-connector_2.11 > 2.4.1 > > > > > I run spark-submit I got following exceptions on Spark 2.4.2, it works > fine when running spark-submit under Spark 2.4.0 with exact the same > command-line call,

spark-cassandra-connector_2.1 caused java.lang.NoClassDefFoundError under Spark 2.4.2?

2019-05-06 Thread Richard Xin
org.apache.spark spark-core_2.12 2.4.0 compile org.apache.spark spark-sql_2.12 2.4.0 com.datastax.spark spark-cassandra-connector_2.11 2.4.1 When I run spark-submit I get the following exceptions on Spark 2.4.2; it works fine when running spark-submit under

read parallel processing spark-cassandra

2018-02-13 Thread sujeet jog
Folks, I have a time series table with each record having 350 columns. The primary key is ((date, bucket), objectid, timestamp). The objective is to read 1 day's worth of data, which comes to around 12k partitions; each partition has around 25MB of data. I see only 1 task active during the read operati
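When a read collapses into a single task, the two things usually checked first are the input split size (how many Spark partitions the connector carves the token ranges into) and whether the where() restriction leaves anything to split. A hedged sketch with placeholder keyspace/table names and an illustrative predicate; the split-size property is spark.cassandra.input.split.size_in_mb in older connector releases and spark.cassandra.input.split.sizeInMB in newer ones, so check reference.md for your version.

  import org.apache.spark.{SparkConf, SparkContext}
  import com.datastax.spark.connector._

  val conf = new SparkConf()
    .setAppName("read-parallelism")
    .set("spark.cassandra.connection.host", "127.0.0.1")     // assumption: local node
    .set("spark.cassandra.input.split.size_in_mb", "32")     // smaller splits => more read tasks
  val sc = new SparkContext(conf)

  val day = sc.cassandraTable("ks", "timeseries")
    .where("date = ? AND bucket IN ?", "2018-02-13", (0 until 32).toList)

  println(s"read partitions = ${day.getNumPartitions}")

If the day is addressed as many explicit (date, bucket) pairs, turning those pairs into an RDD and calling joinWithCassandraTable also spreads the lookups across tasks.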

Re: Testing Spark-Cassandra

2018-01-17 Thread Guillermo Ortiz
link > to install it, and then follow this > <https://github.com/koeninger/spark-cassandra-example> project, but you > will have to adapt the necessary libraries to use spark 2.0.x version. > > Good luck, i would like to see any blog post using this combination. > >

Re: Testing Spark-Cassandra

2018-01-17 Thread Alonso Isidoro Roman
Yes, you can use docker to build your own cassandra ring. Depending your SO, instructions may change, so, please, follow this <https://yurisubach.com/2016/03/24/cassandra-docker-test-cluster/> link to install it, and then follow this <https://github.com/koeninger/spark-cassandra-example

Testing Spark-Cassandra

2018-01-17 Thread Guillermo Ortiz
Hello, I'm using Spark 2.0 and Cassandra. Is there any utility to make unit testing easy, or what would be the best way to do it? A library? Cassandra with Docker?
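A minimal ScalaTest sketch along the Docker line suggested in the replies: it assumes a Cassandra node is already running on 127.0.0.1:9042 (for example started with Docker) and that a keyspace test_ks with table kv(key text PRIMARY KEY, value int) has been created beforehand. Names are placeholders, not a recommendation.

  import org.apache.spark.{SparkConf, SparkContext}
  import org.scalatest.{BeforeAndAfterAll, FunSuite}
  import com.datastax.spark.connector._

  class CassandraRoundTripSpec extends FunSuite with BeforeAndAfterAll {
    private var sc: SparkContext = _

    override def beforeAll(): Unit = {
      val conf = new SparkConf()
        .setMaster("local[*]")
        .setAppName("cassandra-round-trip-test")
        .set("spark.cassandra.connection.host", "127.0.0.1")
      sc = new SparkContext(conf)
    }

    override def afterAll(): Unit = sc.stop()

    test("rows written through the connector can be read back") {
      sc.parallelize(Seq(("a", 1), ("b", 2)))
        .saveToCassandra("test_ks", "kv", SomeColumns("key", "value"))
      assert(sc.cassandraTable("test_ks", "kv").count() >= 2)
    }
  }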

Lucene Index with Spark Cassandra

2017-12-17 Thread Junaid Nasir
Hi everyone, I am trying to run Lucene with Spark, but Spark SQL returns zero results, whereas the same query run using cqlsh returns the correct rows. It is the same issue as https://github.com/Stratio/cassandra-lucene-index/issues/79 I can see in the Spark logs that Lucene is working, but as mentioned in the l

Re: Apache Spark or Spark-Cassandra-Connector doesnt look like it is reading multiple partitions in parallel.

2016-11-26 Thread kant kodali
d it using spark.read.json . > > Cheers, > Anastasios > > > > On Sat, Nov 26, 2016 at 9:34 AM, kant kodali wrote: > >> up vote >> down votefavorite >> <http://stackoverflow.com/questions/40797231/apache-spark-or-spark-cassandra-connector-doesnt-look-like-

Re: Apache Spark or Spark-Cassandra-Connector doesnt look like it is reading multiple partitions in parallel.

2016-11-26 Thread Anastasios Zouzias
guess you might want to first store rdd as a text file on HDFS and then read it using spark.read.json . Cheers, Anastasios On Sat, Nov 26, 2016 at 9:34 AM, kant kodali wrote: > up vote > down votefavorite > <http://stackoverflow.com/questions/40797231/apache-spark-or-spark-cassand

Apache Spark or Spark-Cassandra-Connector doesnt look like it is reading multiple partitions in parallel.

2016-11-26 Thread kant kodali
Apache Spark or Spark-Cassandra-Connector doesn't look like it is reading multiple partitions in parallel (also asked at <http://stackoverflow.com/questions/40797231/apache-spark-or-spark-cassandra-connector-doesnt-look-like-it-is-reading-multipl?noredirect=1#>). Here is my code

Re: Python - Spark Cassandra Connector on DC/OS

2016-11-01 Thread Andrew Holway
Sorry: Spark 2.0.0 On Tue, Nov 1, 2016 at 10:04 AM, Andrew Holway < andrew.hol...@otternetworks.de> wrote: > Hello, > > I've been getting pretty serious with DC/OS which I guess could be > described as a somewhat polished distribution of Mesos. I'm not sure how > relevant DC/OS is to this problem

Python - Spark Cassandra Connector on DC/OS

2016-11-01 Thread Andrew Holway
Hello, I've been getting pretty serious with DC/OS, which I guess could be described as a somewhat polished distribution of Mesos. I'm not sure how relevant DC/OS is to this problem. I am using this pyspark program to test the cassandra connection: http://bit.ly/2eWAfxm (github) I can see that the df

Re: unresolved dependency: datastax#spark-cassandra-connector;2.0.0-s_2.11-M3-20-g75719df: not found

2016-09-21 Thread Kevin Mellott
3 You can verify the available versions by searching Maven at http://search.maven.org. Thanks, Kevin On Wed, Sep 21, 2016 at 3:38 AM, muhammet pakyürek wrote: > while i run the spark-shell as below > > spark-shell --jars '/home/ktuser/spark-cassandra- > connector/target/scala

unresolved dependency: datastax#spark-cassandra-connector;2.0.0-s_2.11-M3-20-g75719df: not found

2016-09-21 Thread muhammet pakyürek
while i run the spark-shell as below spark-shell --jars '/home/ktuser/spark-cassandra-connector/target/scala-2.11/root_2.11-2.0.0-M3-20-g75719df.jar' --packages datastax:spark-cassandra-connector:2.0.0-s_2.11-M3-20-g75719df --conf spark.cassandra.connection.host=localhost i get

cassandra 3.7 is compatible with datastax Spark Cassandra Connector 2.0?

2016-09-19 Thread muhammet pakyürek

Re: clear steps for installation of spark, cassandra and cassandra connector to run on spyder 2.3.7 using python 3.5 and anaconda 2.4 ipython 4.0

2016-09-06 Thread ayan guha
ammet pakyürek" wrote: > > > could u send me documents and links to satisfy all above requirements of > installation > of spark, cassandra and cassandra connector to run on spyder 2.3.7 using > python 3.5 and anaconda 2.4 ipython 4.0 > > > -- > >

clear steps for installation of spark, cassandra and cassandra connector to run on spyder 2.3.7 using python 3.5 and anaconda 2.4 ipython 4.0

2016-09-06 Thread muhammet pakyürek
Could you send me documents and links to satisfy all the above requirements for installation of Spark, Cassandra and the Cassandra connector to run on Spyder 2.3.7 using Python 3.5 and Anaconda 2.4, IPython 4.0

Re: spark cassandra issue

2016-09-04 Thread Selvam Raman
ep 4, 2016 at 10:04 PM, Russell Spitzer wrote: > This would also be a better question for the SCC user list :) > https://groups.google.com/a/lists.datastax.com/forum/#!forum/spark-connector-user > > On Sun, Sep 4, 2016 at 9:31 AM Russell Spitzer > wrote: > >> https://g

Re: spark cassandra issue

2016-09-04 Thread Russell Spitzer
This would also be a better question for the SCC user list :) https://groups.google.com/a/lists.datastax.com/forum/#!forum/spark-connector-user On Sun, Sep 4, 2016 at 9:31 AM Russell Spitzer wrote: > > https://github.com/datastax/spark-cassandra-connector/blob/v1.3.1/doc/14_data_frames.m

Re: spark cassandra issue

2016-09-04 Thread Russell Spitzer
https://github.com/datastax/spark-cassandra-connector/blob/v1.3.1/doc/14_data_frames.md In Spark 1.3 it was illegal to use "table" as a key in Spark SQL so in that version of Spark the connector needed to use the option "c_table" val df = sqlContext
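Completing the truncated snippet above, per the linked 14_data_frames.md for connector 1.3.x (keyspace and table names are placeholders); on later Spark/connector versions the option key is simply "table":

  val df = sqlContext.read
    .format("org.apache.spark.sql.cassandra")
    .options(Map("c_table" -> "my_table", "keyspace" -> "my_keyspace"))
    .load()
  df.show()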

Re: spark cassandra issue

2016-09-04 Thread Mich Talebzadeh
("keyspace name", "table name") > > > On Sun, Sep 4, 2016 at 8:48 PM, Mich Talebzadeh > wrote: > >> Hi Selvan. >> >> I don't deal with Cassandra but have you tried other options as described >> here >> >> https://git

Re: spark cassandra issue

2016-09-04 Thread Selvam Raman
wrote: > Hi Selvan. > > I don't deal with Cassandra but have you tried other options as described > here > > https://github.com/datastax/spark-cassandra-connector/ > blob/master/doc/2_loading.md > > To get a Spark RDD that represents a Cassandra table, call the > cassa

Re: spark cassandra issue

2016-09-04 Thread Mich Talebzadeh
Hi Selvan. I don't deal with Cassandra but have you tried other options as described here https://github.com/datastax/spark-cassandra-connector/blob/master/doc/2_loading.md To get a Spark RDD that represents a Cassandra table, call the cassandraTable method on the SparkContext object. i
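The call described in the linked 2_loading.md, as a short spark-shell sketch (keyspace, table, and column names are placeholders):

  import com.datastax.spark.connector._

  val rdd = sc.cassandraTable("my_keyspace", "my_table")   // RDD[CassandraRow]
  rdd.take(5).foreach(println)
  val firstValue = rdd.first.getString("some_column")      // typed getters on CassandraRow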

Re: spark cassandra issue

2016-09-04 Thread Selvam Raman
its very urgent. please help me guys. On Sun, Sep 4, 2016 at 8:05 PM, Selvam Raman wrote: > Please help me to solve the issue. > > spark-shell --packages com.datastax.spark:spark-cassandra-connector_2.10:1.3.0 > --conf spark.cassandra.connection.host=** > > val df = sqlContext.read. > |

spark cassandra issue

2016-09-04 Thread Selvam Raman
Please help me to solve the issue. spark-shell --packages com.datastax.spark:spark-cassandra-connector_2.10:1.3.0 --conf spark.cassandra.connection.host=** val df = sqlContext.read. | format("org.apache.spark.sql.cassandra"). | options(Map( "table" -> "", "keyspace" -> "***")).

Spark-Cassandra connector

2016-06-21 Thread Joaquin Alzola
Hi list, I am trying to install the Spark-Cassandra connector through Maven or sbt but neither works. Both of them try to connect to the Internet (to which I do not have a connection) to download certain files. Is there a way to install the files manually? I downloaded from the Maven repository

Re: pyspark spark-cassandra-connector java.io.IOException: Failed to open native connection to Cassandra at {192.168.1.126}:9042

2016-03-09 Thread Andy Davidson
"9043") sqlContext.setConf("host","localhost") sqlContext.setConf("port","9043") Thanks Andy From: Saurabh Bajaj Date: Tuesday, March 8, 2016 at 9:13 PM To: Andrew Davidson Cc: Ted Yu , "user @spark" Subject: Re: pyspark spark-cassand

Re: pyspark spark-cassandra-connector java.io.IOException: Failed to open native connection to Cassandra at {192.168.1.126}:9042

2016-03-08 Thread Ted Yu
> On Tue, Mar 8, 2016 at 6:25 PM, Andy Davidson < > a...@santacruzintegration.com> wrote: > >> Hi Ted >> >> I believe by default cassandra listens on 9042 >> >> From: Ted Yu >> Date: Tuesday, March 8, 2016 at 6:11 PM >> To: Andrew Davidson >> C

Re: pyspark spark-cassandra-connector java.io.IOException: Failed to open native connection to Cassandra at {192.168.1.126}:9042

2016-03-08 Thread Saurabh Bajaj
Hi Andy, I believe you need to set the host and port settings separately spark.cassandra.connection.host spark.cassandra.connection.port https://github.com/datastax/spark-cassandra-connector/blob/master/doc/reference.md#cassandra-connection-parameters Looking at the logs, it seems your port
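The two settings named above, shown here as a Scala sketch (the same keys can be passed from pyspark via --conf on spark-submit); the localhost:9043 endpoint mirrors the ssh tunnel described in this thread:

  import org.apache.spark.{SparkConf, SparkContext}

  val conf = new SparkConf()
    .setAppName("cassandra-over-tunnel")
    .set("spark.cassandra.connection.host", "localhost")
    .set("spark.cassandra.connection.port", "9043")   // non-default port goes in its own key
  val sc = new SparkContext(conf)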

Re: pyspark spark-cassandra-connector java.io.IOException: Failed to open native connection to Cassandra at {192.168.1.126}:9042

2016-03-08 Thread Andy Davidson
Hi Ted I believe by default cassandra listens on 9042 From: Ted Yu Date: Tuesday, March 8, 2016 at 6:11 PM To: Andrew Davidson Cc: "user @spark" Subject: Re: pyspark spark-cassandra-connector java.io.IOException: Failed to open native connection to Cassandra at {192.168.1

Re: pyspark spark-cassandra-connector java.io.IOException: Failed to open native connection to Cassandra at {192.168.1.126}:9042

2016-03-08 Thread Ted Yu
Have you contacted spark-cassandra-connector related mailing list ? I wonder where the port 9042 came from. Cheers On Tue, Mar 8, 2016 at 6:02 PM, Andy Davidson wrote: > > I am using spark-1.6.0-bin-hadoop2.6. I am trying to write a python > notebook that reads a data frame from

pyspark spark-cassandra-connector java.io.IOException: Failed to open native connection to Cassandra at {192.168.1.126}:9042

2016-03-08 Thread Andy Davidson
I am using spark-1.6.0-bin-hadoop2.6. I am trying to write a python notebook that reads a data frame from Cassandra. I connect to Cassandra using an ssh tunnel running on port 9043. CQLSH works, however I cannot figure out how to configure my notebook. I have tried various hacks; any idea what I a

Re: metrics not reported by spark-cassandra-connector

2016-02-23 Thread Sa Xiao
Hi Yin, Thanks for your reply. I didn't realize there is a specific mailing list for spark-Cassandra-connector. I will ask there. Thanks! -Sa On Tuesday, February 23, 2016, Yin Yang wrote: > Hi, Sa: > Have you asked on spark-cassandra-connector mailing list ? > > Seems you

Re: metrics not reported by spark-cassandra-connector

2016-02-23 Thread Yin Yang
Hi, Sa: Have you asked on spark-cassandra-connector mailing list ? Seems you would get better response there. Cheers

metrics not reported by spark-cassandra-connector

2016-02-23 Thread Sa Xiao
Hi there, I am trying to enable the metrics collection by spark-cassandra-connector, following the instruction here: https://github.com/datastax/spark-cassandra-connector/blob/master/doc/11_metrics.md However, I was not able to see any metrics reported. I'm using spark-cassandra-connector

RE: spark-cassandra-connector BulkOutputWriter

2016-02-09 Thread Mohammed Guller
Alex – I suggest posting this question on the Spark Cassandra Connector mailing list. The SCC developers are pretty responsive. Mohammed Author: Big Data Analytics with Spark<http://www.amazon.com/Big-Data-Analytics-Spark-Practitioners/dp/1484209656/> From: Alexandr Dzhagriev [mail

spark-cassandra-connector BulkOutputWriter

2016-02-09 Thread Alexandr Dzhagriev
Hello all, I looked through the cassandra spark integration ( https://github.com/datastax/spark-cassandra-connector) and couldn't find any usages of the BulkOutputWriter ( http://www.datastax.com/dev/blog/bulk-loading) - an awesome tool for creating local sstables, which could be later upl

Spark Cassandra Atomic Inserts

2016-02-04 Thread Flaherty, Frank
Cassandra provides "BEGIN BATCH" and "APPLY BATCH" to perform atomic execution of multiple statements as below: BEGIN BATCH INSERT INTO "user_status_updates" ("username", "id", "body") VALUES( 'dave', 16e2f240-2afa-11e4-8069-5f98e903bf02, 'dave update 4' ); INSERT INTO "hom

RE: spark-cassandra

2016-02-03 Thread Mohammed Guller
Another thing to check is what version of the Spark-Cassandra-Connector the Spark Job server is passing to the workers. It looks like when you use Spark-submit, you are sending the correct SCC jar, but the Spark Job server may be using a different one. Mohammed Author: Big Data Analytics with

Re: spark-cassandra

2016-02-03 Thread Gerard Maas
; mrajaf...@gmail.com> wrote: > Hi, > > I am using Spark Jobserver to submit the jobs. I am using spark-cassandra > connector to connect to Cassandra. I am getting below exception through > spak jobserver. > > If I submit the job through *Spark-Submit *command it is working fine

spark-cassandra

2016-02-03 Thread Madabhattula Rajesh Kumar
Hi, I am using Spark Jobserver to submit the jobs and the spark-cassandra connector to connect to Cassandra. I am getting the below exception through Spark Jobserver. If I submit the job through the *Spark-Submit* command it works fine. Please let me know how to solve this issue. Exception in

Re: Spark Cassandra clusters

2016-01-24 Thread vivek.meghanathan
Thanks mohammed and Ted. I will try out the options and let you all know the progress. Also had posted in spark Cassandra connector community, got similar response. Regards Vivek On Sat, Jan 23, 2016 at 11:37 am, Mohammed Guller mailto:moham...@glassbeam.com>> wrote: Vivek, By d

RE: Spark Cassandra clusters

2016-01-22 Thread Mohammed Guller
Spark<http://www.amazon.com/Big-Data-Analytics-Spark-Practitioners/dp/1484209656/> From: Ted Yu [mailto:yuzhih...@gmail.com] Sent: Friday, January 22, 2016 6:37 PM To: vivek.meghanat...@wipro.com Cc: user Subject: Re: Spark Cassandra clusters I am not Cassandra developer :-) Can you us

Re: Spark Cassandra clusters

2016-01-22 Thread Ted Yu
k.meghanat...@wipro.com [mailto:vivek.meghanat...@wipro.com] > *Sent:* Friday, January 22, 2016 5:38 PM > *To:* user@spark.apache.org > *Subject:* Spark Cassandra clusters > > > > Hi All, > What is the right spark Cassandra cluster setup - having Cassandra cluster > and spa

RE: Spark Cassandra clusters

2016-01-22 Thread Mohammed Guller
] Sent: Friday, January 22, 2016 5:38 PM To: user@spark.apache.org Subject: Spark Cassandra clusters Hi All, What is the right spark Cassandra cluster setup - having Cassandra cluster and spark cluster in different nodes or they should be on same nodes. We are having them in different nod

Re: Spark Cassandra clusters

2016-01-22 Thread Ted Yu
ards >> Vivek >> On Sat, Jan 23, 2016 at 7:13 am, Ted Yu wrote: >> >> Can you give us a bit more information ? >> >> How much memory does each node have ? >> What's the current heap allocation for Cassandra process and executor ? >> Spark / Cass

Re: Spark Cassandra clusters

2016-01-22 Thread vivek.meghanathan
nodes. Regards Vivek On Sat, Jan 23, 2016 at 7:13 am, Ted Yu mailto:yuzhih...@gmail.com>> wrote: Can you give us a bit more information ? How much memory does each node have ? What's the current heap allocation for Cassandra process and executor ? Spark / Cassandra release you are usin

Re: Spark Cassandra clusters

2016-01-22 Thread Ted Yu
does each node have ? > What's the current heap allocation for Cassandra process and executor ? > Spark / Cassandra release you are using > > Thanks > > On Fri, Jan 22, 2016 at 5:37 PM, wrote: > >> Hi All, >> What is the right spark Cassandra cluster setup -

Re: Spark Cassandra clusters

2016-01-22 Thread vivek.meghanathan
ave ? What's the current heap allocation for Cassandra process and executor ? Spark / Cassandra release you are using Thanks On Fri, Jan 22, 2016 at 5:37 PM, mailto:vivek.meghanat...@wipro.com>> wrote: Hi All, What is the right spark Cassandra cluster setup - having Cassandra cluster and

Re: Spark Cassandra clusters

2016-01-22 Thread vivek.meghanathan
Thanks. We are using spark - Cassandra connector aligned for spark 1.3. Regards Vivek On Sat, Jan 23, 2016 at 7:27 am, Durgesh Verma mailto:dv21...@gmail.com>> wrote: This may be useful, you can try connectors. https://academy.datastax.com/demos/getting-started-apache-spark-and-cas

Re: Spark Cassandra clusters

2016-01-22 Thread vivek.meghanathan
n 23, 2016 at 7:13 am, Ted Yu mailto:yuzhih...@gmail.com>> wrote: Can you give us a bit more information ? How much memory does each node have ? What's the current heap allocation for Cassandra process and executor ? Spark / Cassandra release you are using Thanks On Fri, Jan 22,

Re: Spark Cassandra clusters

2016-01-22 Thread Durgesh Verma
> What is the right spark Cassandra cluster setup - having Cassandra cluster > and spark cluster in different nodes or they should be on same nodes. > We are having them in different nodes and performance test shows very bad > result for the spark streaming jobs. > Please let us

Re: Spark Cassandra clusters

2016-01-22 Thread Ted Yu
Can you give us a bit more information ? How much memory does each node have ? What's the current heap allocation for Cassandra process and executor ? Spark / Cassandra release you are using Thanks On Fri, Jan 22, 2016 at 5:37 PM, wrote: > Hi All, > What is the right spark Cassan

Spark Cassandra clusters

2016-01-22 Thread vivek.meghanathan
Hi All, What is the right Spark-Cassandra cluster setup: having the Cassandra cluster and the Spark cluster on different nodes, or should they be on the same nodes? We have them on different nodes, and performance tests show very bad results for the Spark streaming jobs. Please let us know. Regards

Re: Spark Cassandra Java Connector: records missing despite consistency=ALL

2016-01-21 Thread Dennis Birkholz
tency. I would really appreciate if someone could give me a hint how to fix this problem, thanks! Greets, Dennis P.s.: some information about our setup: Cassandra 2.1.12 in a two Node configuration with replication factor=2 Spark 1.5.1 Cassandra Java Driver 2.2.0-rc

Re: Spark Cassandra Java Connector: records missing despite consistency=ALL

2016-01-19 Thread Femi Anthony
, > Dennis > > P.s.: > some information about our setup: > Cassandra 2.1.12 in a two Node configuration with replication factor=2 > Spark 1.5.1 > Cassandra Java Driver 2.2.0-rc3 > Spark Cassandra Java Connector 2.10-1.5.0-M2 > > ---

Spark Cassandra Java Connector: records missing despite consistency=ALL

2016-01-13 Thread Dennis Birkholz
ld really appreciate if someone could give me a hint how to fix this problem, thanks! Greets, Dennis P.s.: some information about our setup: Cassandra 2.1.12 in a two Node configuration with replication factor=2 Spark 1.5.1 Cassandra Java Driver 2.2.0-rc3 Spark Cassandra Java Con

RE: Spark 1.5.2 compatible spark-cassandra-connector

2015-12-29 Thread vivek.meghanathan
Thank you mwy and Sun for your response. Yes, basic things are working for me using this connector (the guava issue was encountered earlier, but with proper exclusion of the old version we have resolved it). The current issue is a strange one: we have a kafka-spark-cassandra streaming job in Spark. The

Re: Spark 1.5.2 compatible spark-cassandra-connector

2015-12-29 Thread mwy
2.10-1.5.0-M3 & spark 1.5.2 work for me. The jar is built by sbt-assembly. Just for reference. From: "fightf...@163.com" Date: Wednesday, December 30, 2015 at 10:22 To: "vivek.meghanat...@wipro.com" , user Subject: Re: Spark 1.5.2 compatible spark-cassandra-connector Hi,

Re: Spark 1.5.2 compatible spark-cassandra-connector

2015-12-29 Thread fightf...@163.com
Hi, Vivek M I have tried the 1.5.x spark-cassandra connector and indeed encountered some classpath issues, mainly for the guava dependency. I believe that can be solved by some Maven config, but I have not tried that yet. Best, Sun. fightf...@163.com From: vivek.meghanat...@wipro.com Date

Spark 1.5.2 compatible spark-cassandra-connector

2015-12-29 Thread vivek.meghanathan
All, What is the compatible spark-cassandra-connector for Spark 1.5.2? I can only find the latest connector version, spark-cassandra-connector_2.10-1.5.0-M3, which has a dependency on Spark 1.5.1. Can we use the same for 1.5.2? Are there any classpath issues that need to be handled or any jars that need to be

Re: error in spark cassandra connector

2015-12-24 Thread Ted Yu
Mind providing a bit more detail? Release of Spark, version of Cassandra connector, how the job was submitted, complete stack trace. Thanks On Thu, Dec 24, 2015 at 2:06 AM, Vijay Kandiboyina wrote: > java.lang.NoClassDefFoundError: > com/datastax/spark/connector/rdd/CassandraTableScanRDD > >

error in spark cassandra connector

2015-12-24 Thread Vijay Kandiboyina
java.lang.NoClassDefFoundError: com/datastax/spark/connector/rdd/CassandraTableScanRDD

Spark- Cassandra Connector Error

2015-11-25 Thread ahlusar
ackoverflow.com/questions/33896937/spark-connector-error-warn-nettyutil-found-nettys-native-epoll-transport-but Thank you for your help and for your support. -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Spark-Cassandra-Connector-Error-tp25483.html Sent

Spark Cassandra Filtering

2015-09-16 Thread Ashish Soni
Hi, How can I pass a dynamic value into the filter function below instead of hardcoding it? I have an existing RDD and I would like to use data from it for the filter, so instead of doing .where("name=?","Anna") I want to do .where("name=?",someobject.value). Please help. JavaRDD rdd3 = javaFunctions(sc
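The bind parameter is just a runtime value, so nothing has to be hardcoded. A Scala sketch with placeholder names (the Java CassandraJavaUtil.javaFunctions API used above accepts the bind values the same way, as extra arguments after the CQL fragment):

  import com.datastax.spark.connector._

  val wantedName = sys.props.getOrElse("person.name", "Anna")   // any value computed at runtime
  val rows = sc.cassandraTable("my_keyspace", "people")
    .where("name = ?", wantedName)
  rows.collect().foreach(println)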

Re: Is there any way to connect cassandra without spark-cassandra connector?

2015-08-27 Thread Hafiz Mujadid
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Is-there-any-way-to-connect-cassandra-without-spark-cassandra-connector-tp24472p24482.html Sent from the Apache Spark User List mailing list archive at

Re: Spark-Cassandra-connector

2015-08-21 Thread Ted Yu
> 2. If the above answer is YES, is there a way to create a connectionPool > for each executor, so that multiple tasks can dump data to cassandra in > parallel? > > Regards, > Samya > > > > -- > View this message in context: > http://apache-spark-user-list.100
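A sketch of the usual answer to the connection-pool question: the connector's CassandraConnector caches sessions per executor JVM, so tasks on the same executor already share connections, and withSessionDo borrows from that cache. Keyspace, table, and columns below are placeholders (assume ks.kv(key text PRIMARY KEY, value int)).

  import com.datastax.spark.connector.cql.CassandraConnector

  val connector = CassandraConnector(sc.getConf)     // serializable; safe to use inside closures
  val data = sc.parallelize(Seq(("a", 1), ("b", 2)))

  data.foreachPartition { rows =>
    connector.withSessionDo { session =>             // shared, pooled session on this executor
      val stmt = session.prepare("INSERT INTO ks.kv (key, value) VALUES (?, ?)")
      rows.foreach { case (k, v) => session.execute(stmt.bind(k, Int.box(v))) }
    }
  }

For plain writes, saveToCassandra already handles batching and connection reuse; the manual session route is mainly for custom CQL.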

Spark-Cassandra-connector

2015-08-20 Thread Samya
list.1001560.n3.nabble.com/Spark-Cassandra-connector-tp24378.html Sent from the Apache Spark User List mailing list archive at Nabble.com.

RE: Evaluating spark + Cassandra for our use cases

2015-08-18 Thread Benjamin Ross
echnology? Ben From: Jörn Franke [jornfra...@gmail.com] Sent: Tuesday, August 18, 2015 4:14 PM To: Benjamin Ross; user@spark.apache.org Cc: Ron Gonzalez Subject: Re: Evaluating spark + Cassandra for our use cases Hi, First you need to make your SLA clear. It does n

Re: Evaluating spark + Cassandra for our use cases

2015-08-18 Thread Jörn Franke
to use DataStax enterprise (which is > unappealing from a cost standpoint) because it’s the only thing that > provides the hive spark thrift server to Cassandra. > > > > The two top contenders for our solution are Spark+Cassandra and Druid. > > > > Neither of these

Evaluating spark + Cassandra for our use cases

2015-08-18 Thread Benjamin Ross
enterprise (which is unappealing from a cost standpoint) because it's the only thing that provides the hive spark thrift server to Cassandra. The two top contenders for our solution are Spark+Cassandra and Druid. Neither of these solutions work perfectly out of the box: - Druid w

Re: Spark Cassandra Connector issue

2015-08-11 Thread satish chandra j
Regards, Satish Chandra On Tue, Aug 11, 2015 at 9:23 AM, satish chandra j wrote: > HI All, > I have tried Commands as mentioned below but still it is errors > > dse spark-submit --master spark://10.246.43.15:7077 --class HelloWorld > --jars /home/missingmerch/ > postgresql

Re: Spark Cassandra Connector issue

2015-08-10 Thread satish chandra j
Hi All, I have tried the commands as mentioned below but it still errors out: dse spark-submit --master spark://10.246.43.15:7077 --class HelloWorld --jars /home/missingmerch/postgresql-9.4-1201.jdbc41.jar,/home/missingmerch/dse.jar,/home/missingmerch/spark-cassandra-connector-java_2.10-1.1.1.jar

Re: Spark Cassandra Connector issue

2015-08-10 Thread Dean Wampler
l-9.4-1201.jdbc41.jar,/home/missingmerch/dse.jar,/home/missingmerch/spark- cassandra-connector-java_2.10-1.1.1.jar /home/missingmerch/etl-0.0. 1-SNAPSHOT.jar I also removed the extra "//". Or put "file:" in front of them so they are proper URLs. Note the "snapshot" jar

Re: Spark Cassandra Connector issue

2015-08-10 Thread satish chandra j
Hi, Thanks for quick input, now I am getting class not found error *Command:* dse spark-submit --master spark://10.246.43.15:7077 --class HelloWorld --jars ///home/missingmerch/postgresql-9.4-1201.jdbc41.jar ///home/missingmerch/dse.jar ///home/missingmerch/spark-cassandra-connector-java_2.10

Re: Spark Cassandra Connector issue

2015-08-10 Thread Dean Wampler
Add the other Cassandra dependencies (dse.jar, spark-cassandra-connect-java_2.10) to your --jars argument on the command line. Dean Wampler, Ph.D. Author: Programming Scala, 2nd Edition <http://shop.oreilly.com/product/0636920033073.do> (O'Reilly) Typesafe <http://typesafe.com> @

Spark Cassandra Connector issue

2015-08-10 Thread satish chandra j
Hi All, Please help me fix a Spark Cassandra Connector issue; details below. *Command:* dse spark-submit --master spark://10.246.43.15:7077 --class HelloWorld --jars ///home/missingmerch/postgresql-9.4-1201.jdbc41.jar ///home/missingmerch/etl-0.0.1-SNAPSHOT.jar *Error:* WARN 2015

Spark-Cassandra connector DataFrame

2015-07-28 Thread simon wang
Hi, I would like to get recommendations on using the Spark-Cassandra connector DataFrame feature. I was trying to save a DataFrame containing 8 million rows to Cassandra through the Spark-Cassandra connector. Based on the Spark log, this single action took about 60 minutes to complete. I think it
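For reference, a hedged sketch of the DataFrame save path together with the write-side options usually reviewed first when throughput looks low; df stands for the 8-million-row DataFrame from the question, keyspace/table names are placeholders, and defaults vary by connector release (see the connector's reference.md):

  import org.apache.spark.sql.SaveMode

  df.write
    .format("org.apache.spark.sql.cassandra")
    .options(Map("table" -> "my_table", "keyspace" -> "my_keyspace"))
    .mode(SaveMode.Append)
    .save()

Options such as spark.cassandra.output.concurrent.writes and spark.cassandra.output.batch.size.rows, plus the number of Spark partitions actually doing the writing, are the usual levers to examine.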

Spark - Cassandra (timestamp question)

2015-07-26 Thread Ivan Babic
Hi, I am using Spark to load data form Cassandra. One of the fields in C* table is timestamp. When queried in C* it looks like this "2015-06-01 02:56:07-0700" After loading data in to Spark DataFrame (using sqlContext) and printing it from there, I lose the last field (4-digit time zone) and than

Spark Cassandra

2015-05-28 Thread lucas
Key/Value? I hope it is clear. Thank you very much for your help. Best Regards, -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Spark-Cassandra-tp23065.html Sent from the Apache Spark User List mailing list archive at Nabble.com. --

Re: Spark Cassandra connector number of Tasks

2015-05-10 Thread vijaypawnarkar
Looking for help with this. Thank you! -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Spark-Cassandra-connector-number-of-Tasks-tp22820p22839.html Sent from the Apache Spark User List mailing list archive at Nabble.com

Spark Cassandra connector number of Tasks

2015-05-08 Thread vijaypawnarkar
I am using the Spark Cassandra connector to work with a table with 3 million records, using the .where() API to work with only certain rows in this table. The where clause filters the data to 1 rows. CassandraJavaUtil.javaFunctions(sparkContext) .cassandraTable(KEY_SPACE, MY_TABLE

Re: Spark Cassandra Connector

2015-04-19 Thread Ted Yu
1.2.0-rc3 can be found here: http://mvnrepository.com/artifact/com.datastax.spark/spark-cassandra-connector_2.10 Can you use maven to build your project ? Cheers > On Apr 18, 2015, at 9:02 PM, DStrip wrote: > > Hello, > > I am facing some difficulties on installing the

Spark Cassandra Connector

2015-04-18 Thread DStrip
pendencies libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "1.2.0-rc3" in order to create the previous jar version of the connector and not the default one (i.e. spark-cassandra-connector-assembly-1.3.0-SNAPSHOT.jar ) I am really new on workin

Spark+Cassandra and how to visualize results

2015-02-26 Thread Jan Algermissen
Hi, I am planning to process an event stream in the following way: - write the raw stream through spark streaming to cassandra for later analytics use cases - ‘fork of’ the stream and do some stream analysis and make that information available to build dashboards. Since I am having ElasticSear

Hive, Spark, Cassandra, Tableau, BI, etc.

2015-02-17 Thread Ashic Mahtab
Hi, I've seen a few articles where they use CqlStorageHandler to create Hive tables referencing Cassandra data using the Thrift server. Is there a secret to getting this to work? I've basically got Spark built with Hive, and a Cassandra cluster. Is there a way to get the Hive server to talk to Cassand

Python connector for spark-cassandra

2015-01-20 Thread Nishant Sinha
Hello everyone, Is there a Python connector for Spark and Cassandra as there is one for Java? I found a Java connector by DataStax on github: https://github.com/datastax/spark-cassandra-connector I am looking for something similar in Python. Thanks

RE: Adding Spark Cassandra dependency breaks Spark Streaming?

2014-12-06 Thread Ashic Mahtab
Hi, Just checked: cassandra connector 1.1.0-beta1 runs fine. The issue seems to be the combination of 1.1.0 for spark streaming and the 1.1.0 cassandra connector (final). Regards, Ashic. Date: Sat, 6 Dec 2014 13:52:20 -0500 Subject: Re: Adding Spark Cassandra dependency breaks Spark Streaming? From: jayunit100.apa

RE: Adding Spark Cassandra dependency breaks Spark Streaming?

2014-12-06 Thread Ashic Mahtab
ading my clusters to 1.1.1 anyway, but I am intrigued...I'm fairly new to sbt, scala and the jvm in general. Any idea how having spark streaming 1.1.0 and spark cassandra connector 1.1.0 together would cause classes in spark streaming to go missing? Here's the full sbt file if any

RE: Adding Spark Cassandra dependency breaks Spark Streaming?

2014-12-05 Thread Ashic Mahtab
Getting this on the home machine as well. Not referencing the spark cassandra connector in libraryDependencies compiles. I've recently updated IntelliJ to 14. Could that be causing an issue? From: as...@live.com To: yuzhih...@gmail.com CC: user@spark.apache.org Subject: RE: Adding

RE: Adding Spark Cassandra dependency breaks Spark Streaming?

2014-12-05 Thread Ashic Mahtab
";> 4.0.0 untitled100 untiled100 1.0-SNAPSHOT org.apache.spark spark-core_2.10 1.1.0 org.apache.spark spark-streaming_2.10 1.1.0 com.data

Re: Adding Spark Cassandra dependency breaks Spark Streaming?

2014-12-05 Thread Ted Yu
Can you try with maven ? diff --git a/streaming/pom.xml b/streaming/pom.xml index b8b8f2e..6cc8102 100644 --- a/streaming/pom.xml +++ b/streaming/pom.xml @@ -68,6 +68,11 @@ junit-interface test + + com.datastax.spark + spark-cassandra-connector_2.10 + 1.1.0

Adding Spark Cassandra dependency breaks Spark Streaming?

2014-12-05 Thread Ashic Mahtab
file: import sbt.Keys._ import sbt._ name := "untitled99" version := "1.0" scalaVersion := "2.10.4" val spark = "org.apache.spark" %% "spark-core" % "1.1.0" val sparkStreaming = "org.apache.spark" %% "spark-streaming" %
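Reconstructing that sbt fragment with the combination reported working further up in this thread (the connector's 1.1.0-beta1 alongside Spark and Spark Streaming 1.1.0); treat the exact versions as illustrative:

  name := "untitled99"
  version := "1.0"
  scalaVersion := "2.10.4"

  libraryDependencies ++= Seq(
    "org.apache.spark"   %% "spark-core"                % "1.1.0",
    "org.apache.spark"   %% "spark-streaming"           % "1.1.0",
    "com.datastax.spark" %% "spark-cassandra-connector" % "1.1.0-beta1"
  )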
