Re: CQL data type compatibility between ascii and text

2018-08-10 Thread thiranjith
to change the column type from ascii to text. I have had a mixed experience with conversion between data types on different versions of Cassandra. For example, given the following table definition: CREATE TABLE changelog ( sequence int, description ascii, createdby ascii, executedon

Re: CQL data type compatibility between ascii and text

2018-08-10 Thread Y K
(Fri) 17:10 thiranjith: Hi, According to documentation at https://docs.datastax.com/en/cql/3.3/cql/cql_reference/cql_data_types_c.html#cql_data_types_c__cql_data_type_compatibility we should not be able to change the column type from ascii to text. I have had a mixed

CQL data type compatibility between ascii and text

2018-08-10 Thread thiranjith
Hi, According to the documentation at https://docs.datastax.com/en/cql/3.3/cql/cql_reference/cql_data_types_c.html#cql_data_types_c__cql_data_type_compatibility we should not be able to change the column type from ascii to text. I have had a mixed experience with conversion between data types
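For reference, a minimal sketch of the ALTER being discussed, assuming the DataStax Java driver 3.x, a local node, and a made-up keyspace name (the changelog table comes from the thread above). Whether it succeeds depends on the Cassandra version: ascii and text are byte-compatible, so older releases accept the change, while releases that dropped ALTER ... TYPE reject it, which would explain the mixed results reported here.

    import com.datastax.driver.core.Cluster;
    import com.datastax.driver.core.Session;

    public class AlterAsciiToText {
        public static void main(String[] args) {
            // Contact point and keyspace name are assumptions for illustration only.
            try (Cluster cluster = Cluster.builder().addContactPoint("127.0.0.1").build();
                 Session session = cluster.connect("my_keyspace")) {
                // ascii -> text is a compatible change on versions that still allow ALTER TYPE.
                session.execute("ALTER TABLE changelog ALTER description TYPE text;");
            }
        }
    }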

Re: [EXTERNAL] full text search on some text columns

2018-08-01 Thread Hannu Kröger
supporting Cassandra 3.11.2 (the version I currently use). Sent using Zoho Mail. Forwarded message. From: Andrzej Śliwiński

Re: [EXTERNAL] full text search on some text columns

2018-08-01 Thread Octavian Rinciog
Forwarded message. From: Andrzej Śliwiński. Date: Wed, 01 Aug 2018 08:16:06 +0430. Subject: Re: [EXTERNAL] full text search on some text columns. Forwarded message

Re: [EXTERNAL] full text search on some text columns

2018-08-01 Thread Hannu Kröger
last 8 months and not supporting Cassandra 3.11.2 (the version I currently use). Sent using Zoho Mail. Forwarded message. From: Andrzej Śliwiński. Date: Wed, 01 Aug 2018 08:

Re: Re: [EXTERNAL] full text search on some text columns

2018-07-31 Thread Ben Slater
Date: Wed, 01 Aug 2018 08:16:06 +0430. Subject: Re: [EXTERNAL] full text search on some text columns. Forwarded message: Maybe this plugin could do the job: https://github.com/Stratio/cassandra-lucene-index On Tue, 31 Jul 2018 at 2

Fwd: Re: [EXTERNAL] full text search on some text columns

2018-07-31 Thread onmstester onmstester
Subject: Re: [EXTERNAL] full text search on some text columns. Forwarded message: Maybe this plugin could do the job: https://github.com/Stratio/cassandra-lucene-index On Tue, 31 Jul 2018 at 22:37, onmstester onmstester wrote:

Re: [EXTERNAL] full text search on some text columns

2018-07-31 Thread Andrzej Śliwiński
onmstester onmstester. Sent: Tuesday, July 31, 2018 10:46 AM. To: user. Subject: [EXTERNAL] full text search on some text columns. I need to do a full text search (LIKE) on one of my clustering keys and one of my partition keys (it uses text as the data type). The input rate

RE: [EXTERNAL] full text search on some text columns

2018-07-31 Thread onmstester onmstester
From: onmstester onmstester. Sent: Tuesday, July 31, 2018 10:46 AM. To: user. Subject: [EXTERNAL] full text search on some text columns. I need to do a full text search (LIKE) on one of my clustering keys and one of my partition keys (it uses text as the data type). The input rate is high so only

Re: full text search on some text columns

2018-07-31 Thread onmstester onmstester
Thanks Jordan. There would be millions of rows per day; is SASI capable of sustaining such a rate? Sent using Zoho Mail. On Tue, 31 Jul 2018 19:47:55 +0430, Jordan West wrote: On Tue, Jul 31, 2018 at 7:45 AM, onmstester onmstester wrote: I need to do a full text search (LIKE) on one

Re: full text search on some text columns

2018-07-31 Thread DuyHai Doan
zoho.com> wrote: I need to do a full text search (LIKE) on one of my clustering keys and one of my partition keys (it uses text as the data type). For simple LIKE queries on existing columns you could give SASI ( https://docs.datastax.com/en/dse/5.1/cql/cql/cql

Re: full text search on some text columns

2018-07-31 Thread Jordan West
On Tue, Jul 31, 2018 at 7:45 AM, onmstester onmstester wrote: I need to do a full text search (LIKE) on one of my clustering keys and one of my partition keys (it uses text as the data type). For simple LIKE queries on existing columns you could give SASI ( https://docs.datastax.com/e
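As a point of reference, a hedged sketch of the SASI suggestion above: the index class and CONTAINS mode are standard Cassandra 3.x SASI options, but the keyspace, table, and column names are invented, and the session is assumed to be an already-connected DataStax Java driver Session.

    import com.datastax.driver.core.ResultSet;
    import com.datastax.driver.core.Session;

    public class SasiLikeSearch {
        // CONTAINS mode is what allows LIKE '%term%' rather than only prefix matching.
        static ResultSet search(Session session, String term) {
            session.execute(
                "CREATE CUSTOM INDEX IF NOT EXISTS logs_message_idx ON my_ks.logs (message) " +
                "USING 'org.apache.cassandra.index.sasi.SASIIndex' " +
                "WITH OPTIONS = { 'mode': 'CONTAINS' }");
            return session.execute("SELECT * FROM my_ks.logs WHERE message LIKE ?", "%" + term + "%");
        }
    }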

RE: [EXTERNAL] full text search on some text columns

2018-07-31 Thread Durity, Sean R
That sounds like a problem tailor-made for the DataStax Search (embedded SOLR) solution. I think that would be the fastest path to success. Sean Durity From: onmstester onmstester Sent: Tuesday, July 31, 2018 10:46 AM To: user Subject: [EXTERNAL] full text search on some text columns I need

full text search on some text columns

2018-07-31 Thread onmstester onmstester
I need to do a full text search (LIKE) on one of my clustering keys and one of my partition keys (it uses text as the data type). The input rate is high, so only Cassandra could handle it. Is there any open source project which helps with using Cassandra + Solr or Cassandra + Elasticsearch? Any

Re: Text or....

2018-04-04 Thread Jon Haddad
doanduy...@gmail.com wrote: Compressing client-side is better because it will save: 1) a lot of bandwidth on the network 2) a lot of Cassandra CPU because no decompression server-side 3) a lot of Cassandra HEA

Re: Text or....

2018-04-04 Thread Jeff Jirsa
Doan <doanduy...@gmail.com> wrote: Compressing client-side is better because it will save: 1) a lot of bandwidth on the network 2) a lot of Cassandra CPU because no decompression server-side 3) a lot of Cassandra HEAP because the compressed blob should be relatively

Re: Text or....

2018-04-04 Thread DuyHai Doan
Compressing client-side is better because it will save: 1) a lot of bandwidth on the network 2) a lot of Cassandra CPU because no decompression server-side 3) a lot of Cassandra HEAP because the compressed blob should be relatively small (text data compresses very well) compared to the raw size
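A rough illustration of that client-side approach, assuming the DataStax Java driver and a hypothetical docs (id text, body blob) table; GZIP stands in for whichever codec one would actually pick.

    import java.io.ByteArrayOutputStream;
    import java.nio.ByteBuffer;
    import java.nio.charset.StandardCharsets;
    import java.util.zip.GZIPOutputStream;

    import com.datastax.driver.core.PreparedStatement;
    import com.datastax.driver.core.Session;

    public class CompressedBlobWriter {
        // Compress a large text value on the client and bind it to a blob column,
        // so the server never decompresses it and the heap only sees the small blob.
        static void write(Session session, String id, String largeText) throws Exception {
            ByteArrayOutputStream bos = new ByteArrayOutputStream();
            try (GZIPOutputStream gzip = new GZIPOutputStream(bos)) {
                gzip.write(largeText.getBytes(StandardCharsets.UTF_8));
            }
            PreparedStatement ps = session.prepare("INSERT INTO docs (id, body) VALUES (?, ?)");
            session.execute(ps.bind(id, ByteBuffer.wrap(bos.toByteArray())));
        }
    }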

Re: Text or....

2018-04-04 Thread Jeronimo de A. Barros
A certain application is writing ~55,000 characters for a single row. Most of these characters are entered into one column with the "text" data type. This looks insanely large for one row. Would you suggest changing the data type from "text" to blob, or any other option that might fit this scenario? Thanks!

Re: Text or....

2018-04-04 Thread Nicolas Guyomar
On Wed, Apr 4, 2018 at 3:28 PM, DuyHai Doan <doanduy...@gmail.com> wrote: Compress it and store it as a blob. Unless you ever need to index it, but I guess even with SASI, indexing such a huge text block is not a good idea. On Wed, Apr 4, 2018 at

Re: Text or....

2018-04-04 Thread shalom sagges
DuyHai Doan <doanduy...@gmail.com> wrote: Compress it and store it as a blob. Unless you ever need to index it, but I guess even with SASI, indexing such a huge text block is not a good idea. On Wed, Apr 4, 2018 at 2:25 PM, shalom sagges <shalomsag...@gmail.c

Re: Text or....

2018-04-04 Thread DuyHai Doan
Compress it and store it as a blob. Unless you ever need to index it, but I guess even with SASI, indexing such a huge text block is not a good idea. On Wed, Apr 4, 2018 at 2:25 PM, shalom sagges <shalomsag...@gmail.com> wrote: Hi All, A certain application is writing ~55

Text or....

2018-04-04 Thread shalom sagges
Hi All, A certain application is writing ~55,000 characters for a single row. Most of these characters are entered into one column with the "text" data type. This looks insanely large for one row. Would you suggest changing the data type from "text" to blob, or any other o

Re: How to Parse raw CQL text?

2018-02-26 Thread Jon Haddad
On Mon, Feb 5, 2018 at 2:27 PM Kant Kodali <k...@peernova.com> wrote: I just did some trial and error. Looks like this would work: public class Test {

Re: How to Parse raw CQL text?

2018-02-26 Thread Hannu Kröger
ost/2018/2018-02-25-accessing-private-variables-in-jvm/ On Mon, Feb 5, 2018 at 2:27 PM Kant Kodali <k...@peernova.com> wrote: I just did some trial and error. Looks like this would work: public class Test {

Re: How to Parse raw CQL text?

2018-02-26 Thread Kant Kodali
k...@peernova.com> wrote: I just did some trial and error. Looks like this would work: public class Test { public static void main(String[] args) throws Exception { String stmt = "create table if not exists test_keyspace.my_table (

Re: How to Parse raw CQL text?

2018-02-26 Thread Ariel Weisberg
public static void main(String[] args) throws Exception { String stmt = "create table if not exists test_keyspace.my_table (field1 text, field2 int, field3 set, field4 map<ascii, text>, primary key (field1) );"

Re: How to Parse raw CQL text?

2018-02-25 Thread Jonathan Haddad
"create table if not exists test_keyspace.my_table > (field1 text, field2 int, field3 set, field4 map<ascii, text>, primary > key (field1) );"; > ANTLRStringStream stringStream = new ANTLRStringStream(stmt); > CqlLexer cqlLexer = new CqlLexer(stringStream); >

Re: How to Parse raw CQL text?

2018-02-05 Thread Kant Kodali
I just did some trial and error. Looks like this would work public class Test { public static void main(String[] args) throws Exception { String stmt = "create table if not exists test_keyspace.my_table (field1 text, field2 int, field3 set, field4 map<ascii, text>,
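The previews above truncate the actual code, so here is a hedged reconstruction of the same idea, assuming Cassandra 3.x internals on the classpath: QueryProcessor.parseStatement drives the same generated ANTLR CqlLexer/CqlParser that the thread wires up by hand, and the set<text> element type is a guess since the previews lost it.

    import org.apache.cassandra.cql3.QueryProcessor;
    import org.apache.cassandra.cql3.statements.CreateTableStatement;
    import org.apache.cassandra.cql3.statements.ParsedStatement;

    public class ParseCql {
        public static void main(String[] args) {
            String stmt = "create table if not exists test_keyspace.my_table "
                        + "(field1 text, field2 int, field3 set<text>, field4 map<ascii, text>, "
                        + "primary key (field1));";
            // Parses the raw CQL text into a statement AST without contacting a cluster.
            ParsedStatement parsed = QueryProcessor.parseStatement(stmt);
            if (parsed instanceof CreateTableStatement.RawStatement) {
                CreateTableStatement.RawStatement create = (CreateTableStatement.RawStatement) parsed;
                System.out.println("Parsed CREATE TABLE for " + create.columnFamily());
            }
        }
    }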

Re: How to Parse raw CQL text?

2018-02-05 Thread Kant Kodali
org.apache.cassandra.cql3.statements.ParsedStatement; public class Test { public static void main(String[] args) throws Exception { String stmt = "create table if not exists test_keyspace.my_table (field1 text, field2 int, field3

Re: How to Parse raw CQL text?

2018-02-05 Thread Rahul Singh
import org.apache.cassandra.cql3.CqlLexer; import org.apache.cassandra.cql3.CqlParser; import org.apache.cassandra.cql3.statements.CreateTableStatement; import org.apache.cassandra.cql3.statements.ParsedStatement; public class Test { public static void main(String[] args) throws Exception {

How to Parse raw CQL text?

2018-02-05 Thread Kant Kodali
pace.my_table (field1 text, field2 int, field3 set, field4 map<ascii, text>, primary key (field1) );"; ANTLRStringStream stringStream = new ANTLRStringStream(stmt); CqlLexer cqlLexer = new CqlLexer(stringStream); CommonTokenStream token = new CommonTokenStream(cql

Re: Golang + Cassandra + Text Search

2017-10-24 Thread Justin Cameron
https://github.com/Stratio/cassandra-lucene-index is another option - it plugs a full Lucene engine into Cassandra's custom secondary index interface. If you only need text prefix/postfix/substring matching or basic tokenization there is SASI. On Wed, 25 Oct 2017 at 03:50 Who Dadddy <qwert

Re: Golang + Cassandra + Text Search

2017-10-24 Thread Who Dadddy
one here who works with Go has specific recommendations for a simple framework to add text search on top of Cassandra? (Apologies if this is off topic; I am not quite sure what forum in the Cassandra community would be best for this type of question.) Thanks, Riley

Re: Golang + Cassandra + Text Search

2017-10-24 Thread Jon Haddad
When someone talks about full text search, I usually assume there's more required than keyword search, i.e. simple tokenization and a little stemming: * Term vectors, commonly used for a "more like this" feature * Ranking of search results * Facets * More complex tokenization like trigrams So

Re: Golang + Cassandra + Text Search

2017-10-24 Thread DuyHai Doan
There is already a full text search index in Cassandra called SASI. On Tue, Oct 24, 2017 at 6:50 AM, Ridley Submission <ridley.submission2...@gmail.com> wrote: Hi, Quick question, I am wondering if anyone here who works with Go has specific recommendations for a simpl

Golang + Cassandra + Text Search

2017-10-23 Thread Ridley Submission
Hi, Quick question: I am wondering if anyone here who works with Go has specific recommendations for a simple framework to add text search on top of Cassandra? (Apologies if this is off topic; I am not quite sure what forum in the Cassandra community would be best for this type of question

Re: Cassandra blob vs base64 text

2017-02-20 Thread Benjamin Roth
You could save space when storing your data (base64-)decoded as blobs. 2017-02-20 13:38 GMT+01:00 Oskar Kjellin <oskar.kjel...@gmail.com>: > We currently have some cases where we store base64 as a text field instead > of a blob (running version 2.0.17). > I would like to mov

Cassandra blob vs base64 text

2017-02-20 Thread Oskar Kjellin
We currently have some cases where we store base64 as a text field instead of a blob (running version 2.0.17). I would like to move these to blob, but am wondering what benefits and optimizations there are. The possible ones I can think of are (but there are probably more): * blob is stored off heap

Re: UDA can't use int or text as state_type

2016-06-27 Thread lowping
Problem solved!!! INITCOND {} should be INITCOND 0. Original message. From: lowping <lowp...@163.com> To: user <u...@cassandra.apache.org> Sent: Monday, June 27, 2016 16:03. Subject: UDA can't use int or text as state_type. Hi all, I got a problem today when I created a UDA like this. Hope you guys can help me solve

UDA can't use int or text as state_type

2016-06-27 Thread lowping
Hi all, I got a problem today when I created a UDA like this; hope you guys can help me solve it. CREATE OR REPLACE FUNCTION sum_fun(state int, type text) // if the state type is SET or MAP, this works CALLED ON NULL INPUT RETURNS int LANGUAGE java AS 'return Integer.parseInt(type)+state
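Putting the INITCOND fix from the reply above together with this definition, a hedged sketch of the working pair of statements; the keyspace and aggregate names are invented, and the statements are issued through a DataStax Java driver Session.

    import com.datastax.driver.core.Session;

    public class SumAggregate {
        // With an int state type, INITCOND must be an int literal (0), not a collection literal ({}).
        static void create(Session session) {
            session.execute(
                "CREATE OR REPLACE FUNCTION my_ks.sum_fun(state int, type text) " +
                "CALLED ON NULL INPUT RETURNS int LANGUAGE java " +
                "AS 'return state + Integer.parseInt(type);'");
            session.execute(
                "CREATE OR REPLACE AGGREGATE my_ks.sum_agg(text) " +
                "SFUNC sum_fun STYPE int INITCOND 0");
        }
    }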

Store JSON as text or UTF-8 encoded blobs?

2015-08-23 Thread Kevin Burton
Hey. I'm considering migrating my DB from using multiple columns to just 2 columns, with the second one being a JSON object. Is there going to be any real difference between TEXT and a UTF-8 encoded BLOB? I guess it would probably be easier to get tools like Spark to parse the object as JSON

text partition key Bloom filters fp is 1 always, why?

2015-05-13 Thread Anishek Agarwal
Hello, I have a text partition key for one of the CFs. The cfstats on that table seem to show that the bloom filter false positive ratio is always 1. Also, the bloom filter is using very little space. Do bloom filters not work well with text partition keys? I can assume this as it can no way

efficiently generate complete database dump in text format

2014-10-09 Thread Gaurav Bhatnagar
Hi, We have a Cassandra database column family containing 320 million rows, and each row contains about 15 columns. We want to take a monthly dump of this single column family in text format. We are planning to take the following approach to implement this functionality: 1

Re: efficiently generate complete database dump in text format

2014-10-09 Thread Paulo Ricardo Motta Gomes
Bhatnagar gbhatna...@gmail.com wrote: Hi, We have a Cassandra database column family containing 320 million rows, and each row contains about 15 columns. We want to take a monthly dump of this single column family in text format. We are planning to take the following

Re: efficiently generate complete database dump in text format

2014-10-09 Thread Daniel Chia
column family containing 320 million rows, and each row contains about 15 columns. We want to take a monthly dump of this single column family in text format. We are planning to take the following approach to implement this functionality: 1. Take a snapshot of Cassandra

Re: is lack of full text search hurting cassandra and datastax?

2014-10-03 Thread DuyHai Doan
There are some options around for full text search integration with C*. Google for Stratio Deep and Stargate; both are open source. On 3 Oct 2014 06:31, Kevin Burton bur...@spinn3r.com wrote: So right now I have plenty of quality and robust full text search systems I can use. SolrCloud

Re: is lack of full text search hurting cassandra and datastax?

2014-10-03 Thread Andres de la Peña
You can also use Stratio Cassandra https://github.com/Stratio/stratio-cassandra, which is an open source fork of Cassandra with Lucene-based full text search capabilities. -- Andrés de la Peña http://www.stratio.com/ Avenida de Europa, 26. Ática 5. 3ª Planta 28224 Pozuelo de Alarcón, Madrid

Re: is lack of full text search hurting cassandra and datastax?

2014-10-03 Thread Jack Krupansky
And meanwhile, DataStax will continue to invest in, promote, and support full text search of your Cassandra data with our tight integration of Solr in DataStax Enterprise. BTW, there is in fact very strong interest in DataStax Enterprise, and not just as "support" for raw Cassandra, so I'm

is lack of full text search hurting cassandra and datastax?

2014-10-02 Thread Kevin Burton
So right now I have plenty of quality and robust full text search systems I can use: SolrCloud, Elasticsearch. They all also have very robust UIs on top of them… Kibana, Banana, etc. And my alternative for Cassandra is… paying for a proprietary database. Which might be fine for some parties

Re: Adding large text blob causes read timeout...

2014-06-24 Thread Kevin Burton
Oh... the difference between the ONE field and the remaining 29 is massive. It's like 200ms for just the 29 columns; adding the extra one causes it to time out at 5000ms. On Mon, Jun 23, 2014 at 10:30 PM, DuyHai Doan doanduy...@gmail.com wrote: Don't forget that when you do the Select

Re: Adding large text blob causes read timeout...

2014-06-24 Thread DuyHai Doan
Yes, but the extra one also ends up multiplied by 1000. The limit in CQL3 specifies the number of logical rows, not the number of physical columns in the storage engine. On 24 Jun 2014 08:30, Kevin Burton bur...@spinn3r.com wrote: Oh... the difference between the ONE field and the remaining 29 is

Re: Can I call getBytes on a text column to get the raw (already encoded UTF8)

2014-06-24 Thread Olivier Michallat
of content, and encoding/decoding performance has really bitten us in the past. So I try to avoid transparent encoding/decoding if I can avoid it. So right now, I have a huge blob of text that's a 'text' column. Logically it *should* be text, because that's what it is... Can I just keep

Re: Can I call getBytes on a text column to get the raw (already encoded UTF8)

2014-06-24 Thread Robert Stupp
encoding/decoding if I can avoid it. So right now, I have a huge blob of text that's a 'text' column. Logically it *should* be text, because that's what it is... Can I just keep it as text so our normal tools work on it, but get it as raw UTF8 if I call getBytes? This way I can call

Re: Adding large text blob causes read timeout...

2014-06-24 Thread Jonathan Haddad
Can you do your query in the CLI after setting tracing on? On Mon, Jun 23, 2014 at 11:32 PM, DuyHai Doan doanduy...@gmail.com wrote: Yes, but the extra one also ends up multiplied by 1000. The limit in CQL3 specifies the number of logical rows, not the number of physical columns in the storage engine

Re: Can I call getBytes on a text column to get the raw (already encoded UTF8)

2014-06-24 Thread Kevin Burton
really bitten us in the past. So I try to avoid transparent encoding/decoding if I can avoid it. So right now, I have a huge blob of text that's a 'text' column. Logically it *should* be text, because that's what it is... Can I just keep it as text so our normal tools work on it, but get

Can I call getBytes on a text column to get the raw (already encoded UTF8)

2014-06-23 Thread Kevin Burton
blob of text that's a 'text' column. Logically it *should* be text, because that's what it is... Can I just keep it as text so our normal tools work on it, but get it as raw UTF8 if I call getBytes? This way I can call getBytes and then send it right over the wire as pre-encoded UTF8 data
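One way to do this with the DataStax Java driver, sketched with a hypothetical docs table and body column: Row.getBytes() insists the column really is a blob, while Row.getBytesUnsafe() skips the type check and hands back the serialized UTF-8 bytes as they are stored.

    import java.nio.ByteBuffer;

    import com.datastax.driver.core.Row;
    import com.datastax.driver.core.Session;

    public class RawTextBytes {
        // Read a text column but keep the raw, already-encoded UTF-8 bytes.
        static ByteBuffer rawUtf8(Session session, String id) {
            Row row = session.execute("SELECT body FROM docs WHERE id = ?", id).one();
            return row.getBytesUnsafe("body"); // no decode to String, no re-encode on the way out
        }
    }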

Adding large text blob causes read timeout...

2014-06-23 Thread Kevin Burton
I have a table with a schema mostly of small fields. About 30 of them. The primary key is: primary key( bucket, sequence ) … I have 100 buckets and the idea is that sequence is ever increasing. This way I can read from bucket zero, and everything after sequence N and get all the writes

Re: Can I call getBytes on a text column to get the raw (already encoded UTF8)

2014-06-23 Thread DuyHai Doan
to push LOTS of content, and encoding/decoding performance has really bitten us in the past. So I try to avoid transparent encoding/decoding if I can avoid it. So right now, I have a huge blob of text that's a 'text' column. Logically it *should* be text, because that's what it is... Can I just

Re: Adding large text blob causes read timeout...

2014-06-23 Thread DuyHai Doan
Don't forget that when you do the SELECT with the limit set to 1000, Cassandra is actually fetching 1000 * 29 physical columns (29 fields per logical row). Adding one extra big html column may be too much and cause a timeout. Try to: 1. Select only the big html column 2. Or reduce the limit
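A small illustration of those two suggestions, with invented table and column names (bucket and sequence follow the schema described earlier in the thread), using the DataStax Java driver.

    import com.datastax.driver.core.ResultSet;
    import com.datastax.driver.core.Session;

    public class HeavyColumnRead {
        // Ask for only the heavy column, and with a smaller limit, so far fewer
        // (and far larger) cells have to be assembled per request.
        static ResultSet readBigHtmlOnly(Session session, long afterSequence) {
            return session.execute(
                "SELECT big_html FROM content WHERE bucket = 0 AND sequence > ? LIMIT 100",
                afterSequence);
        }
    }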

com.datastax.driver.core.exceptions.InvalidTypeException: Invalid type for value 1 of CQL type text, expecting class java.lang.String but class [Ljava.lang.Object; provided

2013-12-07 Thread Techy Teck
I am trying to insert into a Cassandra database using the DataStax Java driver. But every time I get the below exception at the `prBatchInsert.bind` line: com.datastax.driver.core.exceptions.InvalidTypeException: Invalid type for value 1 of CQL type text, expecting class java.lang.String but class
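A hedged illustration of how this exception typically arises; the schema and variable names below are invented, not the poster's actual code. Binding a whole array where the text column expects a single String is what makes the driver report [Ljava.lang.Object; for that value.

    import java.util.Map;

    import com.datastax.driver.core.PreparedStatement;
    import com.datastax.driver.core.Session;

    public class BindExample {
        static void insert(Session session, String userId, Map<String, String> attributes) {
            PreparedStatement ps = session.prepare(
                "INSERT INTO users (user_id, first_name) VALUES (?, ?)");

            // Throws InvalidTypeException: value 1 arrives as an Object[] instead of a String.
            // session.execute(ps.bind(userId, attributes.values().toArray()));

            // Works: each bound value matches its CQL type (text -> String).
            session.execute(ps.bind(userId, attributes.get("first_name")));
        }
    }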

Re: com.datastax.driver.core.exceptions.InvalidTypeException: Invalid type for value 1 of CQL type text, expecting class java.lang.String but class [Ljava.lang.Object; provided

2013-12-07 Thread Keith Wright
Java driver. But every time I get the below exception at the `prBatchInsert.bind` line: com.datastax.driver.core.exceptions.InvalidTypeException: Invalid type for value 1 of CQL type text, expecting class java.lang.String but class [Ljava.lang.Object; provided. Below is my method which accepts

Re: com.datastax.driver.core.exceptions.InvalidTypeException: Invalid type for value 1 of CQL type text, expecting class java.lang.String but class [Ljava.lang.Object; provided

2013-12-07 Thread Dave Brosius
at the `prBatchInsert.bind` line: com.datastax.driver.core.exceptions.InvalidTypeException: Invalid type for value 1 of CQL type text, expecting class java.lang.String but class [Ljava.lang.Object; provided. Below is my method which accepts `userId` as the input and `attributes` as the `Map` which contains `key

RE: cassandra hadoop reducer writing to CQL3 - primary key - must it be text type?

2013-10-10 Thread John Lumby
From: johnlu...@hotmail.com To: user@cassandra.apache.org Subject: RE: cassandra hadoop reducer writing to CQL3 - primary key - must it be text type? Date: Wed, 9 Oct 2013 18:33:13 -0400 reduce method : public void reduce(LongWritable

RE: cassandra hadoop reducer writing to CQL3 - primary key - must it be text type?

2013-10-09 Thread John Lumby
the one difference was the datatype of the primary key of the output colfamily: WordCount has text, I had bigint. I changed mine to text: CREATE TABLE archive_recordids ( recordid text, count_num bigint, PRIMARY KEY (recordid)) and set the primary key *twice* in the reducer: keys.put(recordid

RE: cassandra hadoop reducer writing to CQL3 - primary key - must it be text type?

2013-10-09 Thread John Lumby
From: johnlu...@hotmail.com To: user@cassandra.apache.org Subject: RE: cassandra hadoop reducer writing to CQL3 - primary key - must it be text type? Date: Wed, 9 Oct 2013 09:40:06 -0400. Software versions: apache-cassandra-2.0.1, hadoop-2.1.0

cassandra hadoop reducer writing to CQL3 - primary key - must it be text type?

2013-10-08 Thread John Lumby
I have been experimenting with using Hadoop for a map/reduce operation on Cassandra, outputting to the CqlOutputFormat.class. I based my first program fairly closely on the famous WordCount example in examples/hadoop_cql3_word_count, except that I set my output colfamily to have a bigint

Keystore password in yaml is in plain text

2013-10-04 Thread Shahryar Sedghi
Hi, Is there a way to obfuscate the keystore/truststore password? Thanks, Shahryar

Problem with sstableloader from text data

2013-10-02 Thread Paolo Crosato
Hi, following the article at http://www.datastax.com/dev/blog/bulk-loading, I developed a custom builder app to serialize a text file with rows in JSON format to an sstable. I managed to get the tool running and building the tables; however, when I try to load them I get this error

Re: Text searches and free form queries

2012-10-09 Thread Oleg Dulin
It works pretty fast. Cool. Just keep an eye out for how big the Lucene token row gets. Cheers. Indeed, it may get out of hand, but for now we are OK -- for the foreseeable future, I would say. Should it get larger, I can split it up into rows -- i.e. all tokens that start with a, all

Re: Text searches and free form queries

2012-10-08 Thread aaron morton
It works pretty fast. Cool. Just keep an eye out for how big the lucene token row gets. Cheers - Aaron Morton Freelance Developer @aaronmorton http://www.thelastpickle.com On 7/10/2012, at 2:57 AM, Oleg Dulin oleg.du...@gmail.com wrote: So, what I ended up doing is this

Re: Text searches and free form queries

2012-10-06 Thread Oleg Dulin
So, what I ended up doing is this: as I write my records into the main CF, I tokenize some fields that I want to search on using Lucene and write an index into a separate CF, such that my columns are a composite of luceneToken:recordKey. I can then search my records by doing a slice for
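A hedged sketch of that scheme: tokenize a field with Lucene and store (token, record key) pairs in a separate index table, so a search becomes a slice per token. The table layout and names are assumed, and a modern Lucene plus the CQL driver stand in for whatever client the original poster used in 2012.

    import java.io.IOException;
    import java.io.StringReader;

    import org.apache.lucene.analysis.TokenStream;
    import org.apache.lucene.analysis.standard.StandardAnalyzer;
    import org.apache.lucene.analysis.tokenattributes.CharTermAttribute;

    import com.datastax.driver.core.PreparedStatement;
    import com.datastax.driver.core.Session;

    public class LuceneSideIndex {
        // Assumed index table:
        // CREATE TABLE text_index (token text, record_key text, PRIMARY KEY (token, record_key))
        static void index(Session session, String recordKey, String fieldValue) throws IOException {
            PreparedStatement ps = session.prepare(
                "INSERT INTO text_index (token, record_key) VALUES (?, ?)");
            try (StandardAnalyzer analyzer = new StandardAnalyzer();
                 TokenStream ts = analyzer.tokenStream("body", new StringReader(fieldValue))) {
                CharTermAttribute term = ts.addAttribute(CharTermAttribute.class);
                ts.reset();
                while (ts.incrementToken()) {
                    session.execute(ps.bind(term.toString(), recordKey)); // one index row per token
                }
                ts.end();
            }
        }
    }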

Re: Text searches and free form queries

2012-09-04 Thread aaron morton
AFAIK, if you want to keep it inside Cassandra, then DSE, roll your own from scratch, or start with https://github.com/tjake/Solandra. Outside of Cassandra, I've heard of people using Elasticsearch or Solr, which I *think* is now faster at updating the index. Hope that helps.

Text searches and free form queries

2012-09-03 Thread Oleg Dulin
Dear Distinguished Colleagues: I need to add full-text search and somewhat free form queries to my application. Our data is made up of items that are stored in a single column family, and we have a bunch of secondary indices for look ups. An item has header fields and data fields

Re: Text searches and free form queries

2012-09-03 Thread Andrey V. Panov
Someone did search with Lucene, but for very fresh data they build the search index in memory, so data becomes available for search without delays. On 3 September 2012 22:25, Oleg Dulin oleg.du...@gmail.com wrote: Dear Distinguished Colleagues:

Online text search with Hadoop/Brisk

2011-05-11 Thread Ben Scholl
the result set first, but they also need to support text search on one of the fields. I was thinking of simulating the SQL LIKE statement by running each query as a MapReduce job so that the text search gets distributed between nodes. I know the recommended approach is to keep a separate full-text

Re: Online text search with Hadoop/Brisk

2011-05-11 Thread Edward Capriolo
return within about 20 seconds. The queries will use indexes to narrow down the result set first, but they also need to support text search on one of the fields. I was thinking of simulating the SQL LIKE statement, by running each query as a MapReduce job so that the text search gets distributed

Re: What would be a good strategy for Storing the large text contents like blog posts in Cassandra.

2011-03-08 Thread Jean-Christophe Sirot
On 03/07/2011 10:08 PM, Aaron Morton wrote: You can fill your boots. So long as your boots have a capacity of 2 billion. Background ... http://wiki.apache.org/cassandra/LargeDataSetConsiderations http://wiki.apache.org/cassandra/CassandraLimitations

Re: What would be a good strategy for Storing the large text contents like blog posts in Cassandra.

2011-03-07 Thread Jean-Christophe Sirot
Hello, On 03/06/2011 06:35 PM, Aditya Narayan wrote: Next, I also need to store the blogComments, which I am planning to store all in another single row, 1 comment per column. Thus the entire information about a single comment, like commentBody and commentor, would be serialized (using Google

Re: What would be a good strategy for Storing the large text contents like blog posts in Cassandra.

2011-03-07 Thread Aaron Morton
You can fill your boots. So long as your boots have a capacity of 2 billion. Background ... http://wiki.apache.org/cassandra/LargeDataSetConsiderations http://wiki.apache.org/cassandra/CassandraLimitations

What would be a good strategy for Storing the large text contents like blog posts in Cassandra.

2011-03-06 Thread Aditya Narayan
What would be a good strategy to store large text content (blog posts of around 1500-3000 characters) in Cassandra? I need to store these blog posts along with their metadata like bloggerId and blogTags. I am looking to store this data in a single row, giving each attribute a single column

Re: What would be a good strategy for Storing the large text contents like blog posts in Cassandra.

2011-03-06 Thread Aaron Morton
text content (blog posts of around 1500-3000 characters) in Cassandra? I need to store these blog posts along with their metadata like bloggerId and blogTags. I am looking to store this data in a single row, giving each attribute a single column. So one blog per row. Is using a single

Re: What would be a good strategy for Storing the large text contents like blog posts in Cassandra.

2011-03-06 Thread Aditya Narayan
Aditya Narayan ady...@gmail.com wrote: What would be a good strategy to store large text content (blog posts of around 1500-3000 characters) in Cassandra? I need to store these blog posts along with their metadata like bloggerId and blogTags. I am looking to store this data in a single row

RE: How can I implement text based searching for the data/entities/items stored in Cassandra ?

2011-02-12 Thread Vivek Mishra
You can use http://code.google.com/p/kundera/ to search text; it provides a way to search by any key over Cassandra. I guess nothing built-in exists for this. Vivek. From: rajkumar@gmail.com [rajkumar@gmail.com] on behalf of Aklin_81 [asdk

RE: How can I implement text based searching for the data/entities/items stored in Cassandra ?

2011-02-12 Thread Vivek Mishra
Additionally, you can use Cassandra indexes for a specific search. From: Vivek Mishra [vivek.mis...@impetus.co.in] Sent: 12 February 2011 17:38 To: user@cassandra.apache.org Subject: RE: How can I implement text based searching for the data/entities/items

Re: How can I implement text based searching for the data/entities/items stored in Cassandra ?

2011-02-12 Thread Shaun Cutts
There are Lucandra/Solandra: https://github.com/tjake/Lucandra -- Shaun. On Feb 12, 2011, at 6:57 AM, Aklin_81 wrote: I would like to do text searches for some of the entities/items stored in the database through an AJAX-powered application... such that the user starts typing and he can get