Or you can use a combination of Kafka <http://kafka.apache.org/> + Phoenix <http://phoenix.incubator.apache.org/>.
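To make that suggestion concrete: one minimal sketch of the Phoenix side, assuming a consumer process reads sensor messages off a Kafka topic and writes them through Phoenix's JDBC layer. The table and column names below are placeholders for illustration, not anything from this thread:

```sql
-- Hypothetical Phoenix schema for sensor readings; all names are placeholders.
CREATE TABLE IF NOT EXISTS SENSOR_READINGS (
    SENSOR_ID  VARCHAR   NOT NULL,
    TS         TIMESTAMP NOT NULL,
    VALUE      DOUBLE,
    CONSTRAINT PK PRIMARY KEY (SENSOR_ID, TS)
);

-- Executed once per message, e.g. via a JDBC PreparedStatement
-- inside the Kafka consumer loop.
UPSERT INTO SENSOR_READINGS (SENSOR_ID, TS, VALUE) VALUES (?, ?, ?);
```

Phoenix stores the rows in HBase underneath, so the ingest path would be sensors -> Kafka -> consumer -> Phoenix/HBase, with Kafka absorbing bursts and letting you add consumers to spread the load.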
On Wed, May 7, 2014 at 8:55 PM, Azuryy Yu <[email protected]> wrote:

> Hi Alex,
>
> you can try Apache Flume.
>
> On Wed, May 7, 2014 at 10:48 AM, Alex Lee <[email protected]> wrote:
>
>> Sensors may send TCP/IP data to a server. Each sensor may send its data
>> to the server as a stream, and both the number of sensors and the data
>> rate are high.
>>
>> Firstly, how can the data arriving over TCP/IP be put into Hadoop? It
>> needs some processing and then storage in HBase. Does it have to be
>> saved to data files first and then loaded into Hadoop, or can it be done
>> more directly from the TCP/IP stream? Is there a software module that
>> can take care of this? Searching suggests Ganglia, Nagios, and Flume
>> might do it, but on closer inspection Ganglia and Nagios are more for
>> monitoring the Hadoop cluster itself, and Flume is for log files.
>>
>> Secondly, if the total network traffic from the sensors exceeds the
>> capacity of one LAN port, how can the load be shared? Is there any
>> component in Hadoop that handles this automatically?
>>
>> Any suggestions, thanks.
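For the Flume route suggested above, a single-agent config could look roughly like the sketch below, assuming the sensors can be pointed at a plain TCP listener. Every name in it (agent name, port, table, column family) is a placeholder, not something from this thread:

```
# Hypothetical Flume agent: TCP source -> memory channel -> HBase sink
agent1.sources  = tcpSrc
agent1.channels = memCh
agent1.sinks    = hbaseSink

# Simple line-oriented TCP listener; sensors connect and send one event per line
agent1.sources.tcpSrc.type     = netcat
agent1.sources.tcpSrc.bind     = 0.0.0.0
agent1.sources.tcpSrc.port     = 5140
agent1.sources.tcpSrc.channels = memCh

# In-memory buffer between source and sink
agent1.channels.memCh.type     = memory
agent1.channels.memCh.capacity = 10000

# Write events into an HBase table (table/columnFamily are placeholders)
agent1.sinks.hbaseSink.type         = hbase
agent1.sinks.hbaseSink.table        = sensor_readings
agent1.sinks.hbaseSink.columnFamily = d
agent1.sinks.hbaseSink.channel      = memCh
```

On the second question: Flume itself helps with fan-in here; you can run several agents on different hosts (or different ports) and spread the sensors across them, so no single LAN port carries all the traffic.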
