Dear Wiki user,

You have subscribed to a wiki page or wiki category on "Hadoop Wiki" for change 
notification.

The "PoweredBy" page has been changed by ZhengShao.
http://wiki.apache.org/hadoop/PoweredBy?action=diff&rev1=177&rev2=178

--------------------------------------------------

  
   * [[http://www.facebook.com/|Facebook]]
    * We use Hadoop to store copies of internal log and dimension data sources 
and use it as a source for reporting/analytics and machine learning.
-   * Currently have a 600 machine cluster with 4800 cores and about 7 PB raw 
storage.  Each (commodity) node has 8 cores and 12 TB of storage.
+   * Currently we have 2 major clusters:
+    * A 1100-machine cluster with 8800 cores and about 12 PB raw storage.
+    * A 300-machine cluster with 2400 cores and about 3 PB raw storage.
+    * Each (commodity) node has 8 cores and 12 TB of storage.
   * We are heavy users of both the streaming and the Java APIs. We have
built a higher-level data warehousing framework on top of these features,
called Hive (see http://hadoop.apache.org/hive/). We have also developed a
FUSE implementation over HDFS.
  
   * [[http://www.foxaudiencenetwork.com|FOX Audience Network]]

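The entry above mentions heavy use of Hadoop Streaming, where the mapper and
reducer are ordinary programs wired together through stdin/stdout. As a
rough sketch of that contract (the word-count logic, sample data, and the
simulated sort/shuffle step here are illustrative, not Facebook's actual
jobs), a streaming-style mapper and reducer might look like:

```python
from itertools import groupby

def mapper(lines):
    """Emit tab-separated (word, 1) pairs, one per line, the format
    Hadoop Streaming expects on a mapper's stdout."""
    for line in lines:
        for word in line.split():
            yield f"{word}\t1"

def reducer(pairs):
    """Sum counts per word. Hadoop sorts mapper output by key between the
    map and reduce phases, so equal keys arrive adjacent."""
    split = (p.split("\t", 1) for p in pairs)
    for word, group in groupby(split, key=lambda kv: kv[0]):
        yield f"{word}\t{sum(int(count) for _, count in group)}"

if __name__ == "__main__":
    # Simulated pipeline: a real job would wire these scripts together via
    # `hadoop jar hadoop-streaming.jar -mapper ... -reducer ...`.
    data = ["the quick brown fox", "the lazy dog"]
    mapped = sorted(mapper(data))  # sorted() stands in for the shuffle/sort
    for line in reducer(mapped):
        print(line)
```

In a real cluster run, the sort between the two functions is performed by
the framework; only the per-line stdin/stdout behavior belongs to the user's
scripts.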