Dear Wiki user,

You have subscribed to a wiki page or wiki category on "Hadoop Wiki" for change notification.
The "PoweredBy" page has been changed by SteveLoughran:
https://wiki.apache.org/hadoop/PoweredBy?action=diff&rev1=438&rev2=439

Comment:
add a title with term "Apache Hadoop"; use "commercial support" as linktext for distributions and commercial support

+ = Powered by Apache Hadoop =
+ 
- This page documents an alphabetical list of institutions that are using Hadoop for educational or production uses. Companies that offer services on or based around Hadoop are listed in [[Distributions and Commercial Support]]. Please include details about your cluster hardware and size. Entries without this may be mistaken for spam references and deleted.'' ''
+ This page documents an alphabetical list of institutions that are using Apache Hadoop for educational or production uses. Companies that offer services on or based around Hadoop are listed in [[Distributions and Commercial Support|Commercial Support]]. Please include details about your cluster hardware and size. Entries without this may be mistaken for spam references and deleted.'' ''
  
  To add entries you need write permission to the wiki, which you can get by subscribing to the common-...@hadoop.apache.org mailing list and asking for permissions on the wiki account username you've registered yourself as. If you are using Apache Hadoop in production you ought to consider getting involved in the development process anyway, by filing bugs, testing beta releases, reviewing the code and turning your notes into shared documentation. Your participation in this process will ensure your needs get met.
  
@@ -70, +72 @@
   * ''[[http://atxcursions.com/|ATXcursions]] ''
    * ''Two applications that are side products/projects of a local tour company: 1. Sentiment analysis of review websites and social media data. Targeting the tourism industry. 2. Marketing tool that analyzes the most valuable/useful reviewers from sites like Tripadvisor and Yelp as well as social media. Lets marketers and business owners find community members most relevant to their businesses. ''
-    * ''Using Apache Hadoop, HDFS, Hive, and HBase.''
+    * ''Using Apache Hadoop, HDFS, Hive, and HBase.''
     * ''3 node cluster, 4 cores, 4GB RAM.''
@@ -88, +90 @@
   * ''35 Node Cluster ''
   * ''We have been running our cluster with no downtime for over 2 ½ years and have successfully handled over 75 Million files on a 64 GB Namenode with 50 TB cluster storage. ''
   * ''We are heavy MapReduce and Apache HBase users and use Apache Hadoop with Apache HBase for semi-supervised Machine Learning, AI R&D, Image Processing & Analysis, and Apache Lucene index sharding using katta. ''
- 
+ 
  * ''[[http://www.beebler.com|Beebler]] ''
   * ''14 node cluster (each node has: 2 dual core CPUs, 2TB storage, 8GB RAM) ''
   * ''We use Apache Hadoop for matching dating profiles ''
@@ -421, +423 @@
  * ''[[http://www.legolas-media.com|Legolas Media]] ''
  * ''[[http://www.linkedin.com|LinkedIn]] ''
-   * ''We have multiple grids divided up based upon purpose.
+   * ''We have multiple grids divided up based upon purpose.
    * ''Hardware: ''
     * ''~800 Westmere-based HP SL 170x, with 2x4 cores, 24GB RAM, 6x2TB SATA ''
     * ''~1900 Westmere-based SuperMicro X8DTT-H, with 2x6 cores, 24GB RAM, 6x2TB SATA ''