Dear Wiki user,

You have subscribed to a wiki page or wiki category on "Hadoop Wiki" for change 
notification.

The following page has been changed by AndriYunanto:
http://wiki.apache.org/hadoop/PoweredBy

------------------------------------------------------------------------------
   * [http://www.weblab.infosci.cornell.edu/ Cornell University Web Lab]
    * Generating web graphs on 100 nodes (dual 2.4GHz Xeon Processor, 2 GB RAM, 
72GB Hard Drive)
  
+  * [http://www.deepdyve.com Deepdyve]
+   * Elastic cluster with 5-80 nodes 
+  * We use Hadoop to create our indexes of deep web content and to provide a 
high-availability, high-bandwidth storage service for the index shards in our 
search cluster.
+ 
   * [http://search.detik.com Detikcom] - Indonesia's largest news portal
   * We use Hadoop, Pig, and HBase to analyze search logs, generate the Most 
Viewed News list, generate the top word cloud, and analyze all of our logs.
    * We currently use 9 nodes.
  
-  * [http://www.deepdyve.com Deepdyve]
-   * Elastic cluster with 5-80 nodes 
-   * We use hadoop to create our indexes of deep web content and to provide a 
high availability and high bandwidth storage service for index shards for our 
search cluster.
- 
-  
- * [http://www.enormo.com/ Enormo]
+  * [http://www.enormo.com/ Enormo]
   * 4-node cluster (32 cores, 1 TB).
   * We use Hadoop to filter and index our listings, removing exact duplicates 
and grouping similar ones.
    * We plan to use Pig very shortly to produce statistics.
