Re: Cassandra 1.2.2 cluster + raspberry

2013-04-15 Thread murat migdisoglu
Hi Aaron,

Thank you for your support. It was indeed my mistake: the second node was
still configured to use compressed internode communication.

After I fixed it, I'm able to start my cluster.
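
For reference, the setting lives in cassandra.yaml and has to match on every
node before a restart. A minimal sketch, following the options documented in
the stock 1.2 yaml (all / dc / none):

# cassandra.yaml -- on every node, then restart
# internode_compression controls whether traffic between nodes is compressed.
#   all  : compress all internode traffic
#   dc   : only compress traffic between different datacenters
#   none : never compress, which also keeps the internode link from needing
#          the native snappy-java library (not bundled for ARM)
internode_compression: none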

Cheers



On Thu, Apr 11, 2013 at 12:40 PM, aaron morton aa...@thelastpickle.com wrote:

 I've already tried to set internode_compression: none in my yaml files.

 What version are you on?

 Have you set internode_compression to none and restarted? Can you double-check?
 The stack trace shows Cassandra deciding that the connection should be
 compressed.

 Cheers

 -
 Aaron Morton
 Freelance Cassandra Consultant
 New Zealand

 @aaronmorton
 http://www.thelastpickle.com

 On 10/04/2013, at 12:54 PM, murat migdisoglu murat.migdiso...@gmail.com
 wrote:

 Hi,

 I'm trying to set up a Cassandra cluster for some experiments on my
 Raspberry Pis, but I'm still having trouble getting my nodes to join the cluster.

 I started with two nodes (192.168.2.3 and 192.168.2.7), and when I start
 Cassandra I see the following exception on node 192.168.2.7:
 ERROR [WRITE-/192.168.2.3] 2013-04-10 02:10:24,524 CassandraDaemon.java (line 132) Exception in thread Thread[WRITE-/192.168.2.3,5,main]
 java.lang.NoClassDefFoundError: Could not initialize class org.xerial.snappy.Snappy
         at org.xerial.snappy.SnappyOutputStream.<init>(SnappyOutputStream.java:79)
         at org.xerial.snappy.SnappyOutputStream.<init>(SnappyOutputStream.java:66)
         at org.apache.cassandra.net.OutboundTcpConnection.connect(OutboundTcpConnection.java:322)
         at org.apache.cassandra.net.OutboundTcpConnection.run(OutboundTcpConnection.java:143)

 I suspect that the lack of native Snappy libraries is causing this
 exception during internode communication.
 I have not tried to compile native Snappy for ARM yet, but I wonder whether
 it is possible to use Cassandra without Snappy at all.

 I've already tried to set internode_compression: none in my yaml files.

 nodetool outputs:

 nodetool -h pi1 ring

 Datacenter: dc1
 ==========
 Replicas: 1

 Address         Rack    Status  State   Load        Owns      Token
 192.168.2.7     RAC1    Up      Normal  92.35 KB    100.00%   0

 nodetool -h pi2 ring

 Datacenter: dc1
 ==========
 Replicas: 1

 Address         Rack    Status  State   Load        Owns      Token
 192.168.2.3     RAC1    Up      Normal  92.42 KB    100.00%   85070591730234615865843651857942052864



 Kind Regards







-- 
Find a job you enjoy, and you'll never work a day in your life.
Confucius


cassandra-hadoop mapper

2012-05-31 Thread murat migdisoglu
Hi,

I'm working on some use cases to understand how the Cassandra-Hadoop
integration works.

I have a very basic scenario: a column family that keeps a session id and
some BSON data containing the username, in two separate columns. I want to
go through all rows and dump a row to a file when its username matches a
certain criterion. I don't need any Reducer or Combiner for now.

After writing the very simple Hadoop job below, I see from the logs that my
map function is called once per row. Is that normal? If so, such a search
over a big dataset would take hours if not days... Besides that, I see many
small output files being created on HDFS.

I guess I need a better understanding of exactly how the job is split into
tasks.


@Override
public void map(ByteBuffer key, SortedMap<ByteBuffer, IColumn> columns,
                Context context)
        throws IOException, InterruptedException
{
    String rowkey = ByteBufferUtil.string(key);
    // The username to match against is passed in through the job configuration.
    String ip = context.getConfiguration().get("IP");

    // sourceColumn is a field of the mapper naming the column that holds the BSON data.
    IColumn column = columns.get(sourceColumn);
    if (column == null)
        return;

    ByteBuffer byteBuffer = column.value();
    ByteBuffer bb2 = byteBuffer.duplicate();

    // fromBson deserializes the BSON column value into a DataConvertor.
    DataConvertor convertor = fromBson(byteBuffer, DataConvertor.class);
    String username = convertor.getUsername();
    BytesWritable value = new BytesWritable();
    if (username != null && username.equals(ip)) {
        // Emit the matching row; with no reducer this goes straight to the output file.
        byte[] arr = convertToByteArray(bb2);
        value.set(new BytesWritable(arr));
        Text tkey = new Text(rowkey);
        context.write(tkey, value);
    } else {
        log.info("ip not match [" + ip + "]");
    }
}
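
A map() call per input row is how Hadoop is designed to work: the framework
invokes the mapper once per record in each input split, and with no reducer
every map task writes its own part file, which is why many small files show
up on HDFS. For ColumnFamilyInputFormat the number and size of the splits is
controlled through ConfigHelper in the job driver. Below is a rough sketch of
such a driver: the keyspace, column family, column name, class names and
addresses are placeholders, and the ConfigHelper calls are from the 1.x
org.apache.cassandra.hadoop API, so double-check them against your version.

import java.util.Arrays;

import org.apache.cassandra.hadoop.ColumnFamilyInputFormat;
import org.apache.cassandra.hadoop.ConfigHelper;
import org.apache.cassandra.thrift.SlicePredicate;
import org.apache.cassandra.utils.ByteBufferUtil;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.BytesWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class SessionDumpJob {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.set("IP", args[0]);                      // value the mapper compares usernames to

        Job job = new Job(conf, "session-dump");
        job.setJarByClass(SessionDumpJob.class);
        job.setMapperClass(SessionDumpMapper.class);  // placeholder class holding the map() above
        job.setNumReduceTasks(0);                     // map-only: one output file per map task
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(BytesWritable.class);
        FileOutputFormat.setOutputPath(job, new Path(args[1]));

        // Read from Cassandra; keyspace and column family names are placeholders.
        job.setInputFormatClass(ColumnFamilyInputFormat.class);
        ConfigHelper.setInputInitialAddress(job.getConfiguration(), "192.168.2.3");
        ConfigHelper.setInputRpcPort(job.getConfiguration(), "9160");
        // Must match the partitioner the cluster actually uses.
        ConfigHelper.setInputPartitioner(job.getConfiguration(), "RandomPartitioner");
        ConfigHelper.setInputColumnFamily(job.getConfiguration(), "mykeyspace", "sessions");

        // Only fetch the column the mapper reads.
        SlicePredicate predicate = new SlicePredicate()
                .setColumn_names(Arrays.asList(ByteBufferUtil.bytes("data")));
        ConfigHelper.setInputSlicePredicate(job.getConfiguration(), predicate);

        // Rows per input split; each split becomes one map task, so this is
        // the main knob for how the scan is broken into tasks.
        ConfigHelper.setInputSplitSize(job.getConfiguration(), 65536);

        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}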

Thanks in advance
Kind Regards


-- 
Find a job you enjoy, and you'll never work a day in your life.
Confucius