Re: Count(*) is not working

2017-02-16 Thread Selvam Raman
I am not getting a count as the result. Instead I keep getting warning after
warning like the one below.

Read 100 live rows and 1423 tombstone cells for query SELECT * FROM
keysace.table WHERE token(id) > token(test:ODP0144-0883E-022R-002/047-052)
LIMIT 100 (see tombstone_warn_threshold)

On Thu, Feb 16, 2017 at 12:37 PM, Jan Kesten <j...@dafuer.de> wrote:

> Hi,
>
> did you finally get a result?
>
> Those messages are simply warnings telling you that C* had to read many
> tombstones while processing your query - rows that are deleted but not
> yet garbage collected/compacted. The warning also explains why things
> might be much slower than expected: for every 100 live rows the count
> read, C* had to read roughly 14-15 times as many already-deleted rows.
>
> Apart from that, count(*) is almost always slow - and there is a default
> limit of 10,000 rows in a result.
>
> Do you really need the actual live count? To get an idea you can always
> look at nodetool cfstats (but those numbers also include deleted rows).
>
>
> On 2017-02-16 at 13:18, Selvam Raman wrote:
>
> Hi,
>
> I want to know the total record count in a table.
>
> I ran the query below:
>    select count(*) from tablename;
>
> and I got the output below:
>
> Read 100 live rows and 1423 tombstone cells for query SELECT * FROM
> keysace.table WHERE token(id) > token(test:ODP0144-0883E-022R-002/047-052)
> LIMIT 100 (see tombstone_warn_threshold)
>
> Read 100 live rows and 1435 tombstone cells for query SELECT * FROM
> keysace.table WHERE token(id) > token(test:2565-AMK-2) LIMIT 100 (see
> tombstone_warn_threshold)
>
> Read 96 live rows and 1385 tombstone cells for query SELECT * FROM
> keysace.table WHERE token(id) > token(test:-2220-UV033/04) LIMIT 100 (see
> tombstone_warn_threshold).
>
>
>
>
> Can you please help me get the total count of the table?
>
> --
> Selvam Raman
> "லஞ்சம் தவிர்த்து நெஞ்சம் நிமிர்த்து"
>
>


-- 
Selvam Raman
"லஞ்சம் தவிர்த்து நெஞ்சம் நிமிர்த்து"


Re: Count(*) is not working

2017-02-16 Thread Selvam Raman
I am using Cassandra 3.9.

Primary Key:
id text;

On Thu, Feb 16, 2017 at 12:25 PM, Cogumelos Maravilha <
cogumelosmaravi...@sapo.pt> wrote:

> C* version please and partition key.
>
> On 02/16/2017 12:18 PM, Selvam Raman wrote:
>
> Hi,
>
> I want to know the total record count in a table.
>
> I ran the query below:
>    select count(*) from tablename;
>
> and I got the output below:
>
> Read 100 live rows and 1423 tombstone cells for query SELECT * FROM
> keysace.table WHERE token(id) > token(test:ODP0144-0883E-022R-002/047-052)
> LIMIT 100 (see tombstone_warn_threshold)
>
> Read 100 live rows and 1435 tombstone cells for query SELECT * FROM
> keysace.table WHERE token(id) > token(test:2565-AMK-2) LIMIT 100 (see
> tombstone_warn_threshold)
>
> Read 96 live rows and 1385 tombstone cells for query SELECT * FROM
> keysace.table WHERE token(id) > token(test:-2220-UV033/04) LIMIT 100 (see
> tombstone_warn_threshold).
>
>
>
>
> Can you please help me get the total count of the table?
>
> --
> Selvam Raman
> "லஞ்சம் தவிர்த்து நெஞ்சம் நிமிர்த்து"
>
>
>


-- 
Selvam Raman
"லஞ்சம் தவிர்த்து நெஞ்சம் நிமிர்த்து"


Count(*) is not working

2017-02-16 Thread Selvam Raman
Hi,

I want to know the total record count in a table.

I ran the query below:
   select count(*) from tablename;

and I got the output below:

Read 100 live rows and 1423 tombstone cells for query SELECT * FROM
keysace.table WHERE token(id) > token(test:ODP0144-0883E-022R-002/047-052)
LIMIT 100 (see tombstone_warn_threshold)

Read 100 live rows and 1435 tombstone cells for query SELECT * FROM
keysace.table WHERE token(id) > token(test:2565-AMK-2) LIMIT 100 (see
tombstone_warn_threshold)

Read 96 live rows and 1385 tombstone cells for query SELECT * FROM
keysace.table WHERE token(id) > token(test:-2220-UV033/04) LIMIT 100 (see
tombstone_warn_threshold).




Can you please help me get the total count of the table?

-- 
Selvam Raman
"லஞ்சம் தவிர்த்து நெஞ்சம் நிமிர்த்து"


Cassandra: maximum size of collection list type

2016-12-01 Thread Selvam Raman
Hi,

What is the maximum size that can be stored in a collection list (in a row)
in Cassandra?

-- 
Selvam Raman
"லஞ்சம் தவிர்த்து நெஞ்சம் நிமிர்த்து"


Re: Cassandra: maximum size of collection list type

2016-12-01 Thread Selvam Raman
I am getting the error below when I have a huge list (more than 300,000
elements, i.e. 3 lakh).

"Cassandra timeout during write query at consistency LOCAL_ONE (1 replica
were required but only 0 acknowledged the write)"

2016-12-01 16:20 GMT+00:00 Selvam Raman <sel...@gmail.com>:

> Hi,
>
> What is the maximum size that can be stored in a collection list (in a
> row) in Cassandra?
>
> --
> Selvam Raman
> "லஞ்சம் தவிர்த்து நெஞ்சம் நிமிர்த்து"
>



-- 
Selvam Raman
"லஞ்சம் தவிர்த்து நெஞ்சம் நிமிர்த்து"


Java Collections.emptyList inserted as null object in cassandra

2016-11-29 Thread Selvam Raman
Field type in Cassandra: List

I am trying to insert Collections.emptyList() from Spark into a Cassandra
list field. In Cassandra it is stored as a null object.

How can I avoid null values here?

-- 
Selvam Raman
"லஞ்சம் தவிர்த்து நெஞ்சம் நிமிர்த்து"


Cassandra timestamp to spark Date field

2016-09-05 Thread Selvam Raman
Hi All,

As per the DataStax documentation, the Cassandra timestamp type maps to the
Spark types Long, java.util.Date, java.sql.Date, and org.joda.time.DateTime.

Please help me with your input.

I have a Cassandra table with 30 fields; 3 of them are timestamps.

I read the Cassandra table using sc.cassandraTable:

[com.datastax.spark.connector.rdd.CassandraTableScanRDD[com.datastax.spark.connector.CassandraRow]
= CassandraTableScanRDD[9] at RDD at CassandraRDD.scala:15]

Then I converted it to an RDD of Rows:

val exis_repair_fact = sqlContext.createDataFrame(rddrepfact.map(r =>
  org.apache.spark.sql.Row.fromSeq(r.columnValues)), schema)

In the schema fields I declared the timestamp as

StructField("shipped_datetime", DateType),


When I try to show the result, it throws: java.util.Date cannot be cast to
java.sql.Date.


How can I solve this issue?


First I converted the CassandraTableScanRDD to



-- 
Selvam Raman
"லஞ்சம் தவிர்த்து நெஞ்சம் நிமிர்த்து"


Reading cassandra table using Spark

2016-09-04 Thread Selvam Raman
Hi,

I am trying to read a Cassandra table as a DataFrame but hit the issue below:

spark-shell --packages com.datastax.spark:spark-cassandra-connector_2.10:1.3.0
--conf spark.cassandra.connection.host=**

val df = sqlContext.read.
 | format("org.apache.spark.sql.cassandra").
 | options(Map( "table" -> "", "keyspace" -> "***")).
 | load()
java.util.NoSuchElementException: key not found: c_table
at scala.collection.MapLike$class.default(MapLike.scala:228)
at org.apache.spark.sql.execution.datasources.CaseInsensitiveMap.default(ddl.scala:151)
at scala.collection.MapLike$class.apply(MapLike.scala:141)
at org.apache.spark.sql.execution.datasources.CaseInsensitiveMap.apply(ddl.scala:151)
at org.apache.spark.sql.cassandra.DefaultSource$.TableRefAndOptions(DefaultSource.scala:120)
at org.apache.spark.sql.cassandra.DefaultSource.createRelation(DefaultSource.scala:56)
at org.apache.spark.sql.execution.datasources.ResolvedDataSource$.apply(ResolvedDataSource.scala:125)


When I use sc.cassandraTable(tablename, keyspace) it works fine, but when I
run an action it throws plenty of errors. For example:

com.datastax.driver.core.exceptions.InvalidQueryException: unconfigured
columnfamily size_estimates

-- 
Selvam Raman
"லஞ்சம் தவிர்த்து நெஞ்சம் நிமிர்த்து"