I use a data historian (sometimes called a time series database) for collecting 
and persisting billions of rows of measurement data. The data comes off of 
manufacturing equipment and represents sensors such as temperature, 
pressure, and so on. I've been wondering if I should be researching some 
type of Big Data replacement. The historian simply stores key=value data, 
primarily timestamp=value pairs: at 3:00 the temperature was 40, at 3:01 it 
was still 40, and so on. Lots of repetitive data, which historians are good at 
compressing, but they cost $$. I have to believe that commodity hardware 
costs a lot less than year-over-year software maintenance. Has anyone used 
any of the Apache Hadoop products in this scenario?
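
To make the workload concrete, here is a minimal Python sketch of the kind of 
data involved and why it compresses so well. The tag, the one-minute sample 
interval, and the simple run-length scheme are illustrative assumptions on my 
part, not the historian's actual storage format.

    # Sketch of historian-style data: one (timestamp, value) pair per sample,
    # with long runs of unchanged values. Tag, interval, and encoding are
    # illustrative assumptions, not any vendor's real format.
    from datetime import datetime, timedelta

    def run_length_encode(samples):
        """Collapse consecutive identical values into (start, value, count) runs."""
        runs = []
        for ts, value in samples:
            if runs and runs[-1][1] == value:
                start, _, count = runs[-1]
                runs[-1] = (start, value, count + 1)
            else:
                runs.append((ts, value, 1))
        return runs

    # Simulate a temperature tag sampled once a minute: at 3:00 it reads 40,
    # at 3:01 it still reads 40, and so on for an hour.
    start = datetime(2013, 1, 1, 3, 0)
    samples = [(start + timedelta(minutes=i), 40) for i in range(60)]

    runs = run_length_encode(samples)
    print(len(samples), "raw samples ->", len(runs), "run(s)")
    # 60 raw samples -> 1 run(s)

An hour of unchanged readings collapses to a single run, which is roughly the 
effect the historian's compression achieves and what any replacement would 
need to match.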

Thanks
