[Hadoop Wiki] Update of "PoweredBy" by DavidTing

2017-12-07 Thread Apache Wiki
Dear Wiki user,

You have subscribed to a wiki page or wiki category on "Hadoop Wiki" for change 
notification.

The "PoweredBy" page has been changed by DavidTing:
https://wiki.apache.org/hadoop/PoweredBy?action=diff&rev1=443&rev2=444

* ''Data mining ''
* ''Machine learning ''
  
+  * ''[[https://fquotes.com/|FQuotes]] ''
+   * ''We use Hadoop for analyzing quotes, quote authors and quote topics. ''
+ 
   * ''[[http://freestylers.jp/|Freestylers]] - Image retrieval engine ''
* ''We, the Japanese company Freestylers, have used Hadoop since April 2009 to 
build the image-processing environment for our image-based product 
recommendation system, mainly on Amazon EC2. ''
* ''Our Apache Hadoop environment produces the original database for fast 
access from our web application. ''

-
To unsubscribe, e-mail: common-commits-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-commits-h...@hadoop.apache.org



[Hadoop Wiki] Update of "PoweredBy" by DavidTing

2017-01-24 Thread Apache Wiki
Dear Wiki user,

You have subscribed to a wiki page or wiki category on "Hadoop Wiki" for change 
notification.

The "PoweredBy" page has been changed by DavidTing:
https://wiki.apache.org/hadoop/PoweredBy?action=diff&rev1=439&rev2=440

 * ''Each (commodity) node has 8 cores and 12 TB of storage. ''
 * ''We are heavy users of both the streaming and the Java APIs. Using these 
features we have built a higher-level data warehousing framework called Hive 
(see http://hadoop.apache.org/hive/). We have also developed a FUSE 
implementation over HDFS. ''
  
-  * ''[[http://www.follownews.com/|FollowNews]] ''
+  * ''[[https://www.follownews.com/|FollowNews]] ''
* ''We use Hadoop for storing logs, news analysis, tag analysis. ''
  
   * ''[[http://www.foxaudiencenetwork.com|FOX Audience Network]] ''
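
One entry above mentions heavy use of both the streaming and the Java APIs. 
With Hadoop Streaming, the mapper and reducer are ordinary executables that 
read records on stdin and write tab-separated key/value pairs to stdout, so 
they can be written in any language. The following is a generic word-count 
sketch in Python, not that user's actual code; the file names are hypothetical.

    #!/usr/bin/env python
    # mapper.py -- hypothetical Hadoop Streaming mapper (generic sketch).
    # Reads raw text lines from stdin and emits "word<TAB>1" pairs; the
    # framework sorts the mapper output by key before the reducer sees it.
    import sys

    for line in sys.stdin:
        for word in line.split():
            print("%s\t%d" % (word, 1))

The matching reducer:

    #!/usr/bin/env python
    # reducer.py -- hypothetical Hadoop Streaming reducer (generic sketch).
    # Because input arrives sorted by key, all counts for one word are
    # contiguous and can be summed in a single pass.
    import sys

    current_word, current_count = None, 0
    for line in sys.stdin:
        word, count = line.rstrip("\n").split("\t", 1)
        if word == current_word:
            current_count += int(count)
        else:
            if current_word is not None:
                print("%s\t%d" % (current_word, current_count))
            current_word, current_count = word, int(count)
    if current_word is not None:
        print("%s\t%d" % (current_word, current_count))

The equivalent job written against the Java API would implement Mapper and 
Reducer classes instead; Hive, mentioned in the same entry, layers a SQL-like 
query language on top of such jobs.
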
@@ -437, +437 @@

 * ''Apache Hive, Apache Avro, Apache Kafka, and other bits and pieces... ''
* ''We use these things for discovering People You May Know and 
[[http://www.linkedin.com/careerexplorer/dashboard|other]] 
[[http://inmaps.linkedinlabs.com/|fun]] 
[[http://www.linkedin.com/skills/|facts]]. ''
  
+  * ''[[https://www.livebet.com|LiveBet]] ''
+   * ''We use Hadoop for storing logs, odds analysis, markets analysis. ''
+ 
   * ''[[http://www.lookery.com|Lookery]] ''
* ''We use Hadoop to process clickstream and demographic data in order to 
create web analytic reports. ''
* ''Our cluster runs across Amazon's EC2 infrastructure and makes use of 
the streaming module to use Python for most operations. ''
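
Lookery's entry describes using the streaming module with Python to turn 
clickstream data into web-analytics reports. Below is a minimal sketch of what 
such a mapper could look like; the log format (tab-separated timestamp, user 
id, URL), the script names, and the HDFS paths are assumptions for 
illustration, not Lookery's actual setup.

    #!/usr/bin/env python
    # clicks_mapper.py -- hypothetical Hadoop Streaming mapper (sketch only).
    # Assumes tab-separated clickstream records: timestamp<TAB>user_id<TAB>url
    # and emits "url<TAB>1" so a summing reducer can report page views per URL.
    #
    # A job wiring this script to a reducer is typically submitted with the
    # hadoop-streaming jar (jar location and paths vary by installation):
    #   hadoop jar hadoop-streaming.jar \
    #       -input /logs/clickstream/ -output /reports/pageviews/ \
    #       -mapper clicks_mapper.py -reducer clicks_reducer.py \
    #       -file clicks_mapper.py -file clicks_reducer.py
    import sys

    for line in sys.stdin:
        fields = line.rstrip("\n").split("\t")
        if len(fields) < 3:
            continue  # skip malformed records
        print("%s\t%d" % (fields[2], 1))  # key = URL, value = 1

The reducer would sum the counts per URL in the same way as the generic 
reducer sketched earlier in this digest.
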

-
To unsubscribe, e-mail: common-commits-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-commits-h...@hadoop.apache.org



[Hadoop Wiki] Update of "PoweredBy" by DavidTing

2015-08-06 Thread Apache Wiki
Dear Wiki user,

You have subscribed to a wiki page or wiki category on "Hadoop Wiki" for change 
notification.

The "PoweredBy" page has been changed by DavidTing:
https://wiki.apache.org/hadoop/PoweredBy?action=diff&rev1=436&rev2=437

 * ''Each (commodity) node has 8 cores and 12 TB of storage. ''
 * ''We are heavy users of both the streaming and the Java APIs. Using these 
features we have built a higher-level data warehousing framework called Hive 
(see http://hadoop.apache.org/hive/). We have also developed a FUSE 
implementation over HDFS. ''
  
-  * ''[[https://fnews.com/|fnews]] ''
+  * ''[[http://www.follownews.com/|FollowNews]] ''
* ''We use Hadoop for storing logs, news analysis, tag analysis. ''
  
   * ''[[http://www.foxaudiencenetwork.com|FOX Audience Network]] ''


[Hadoop Wiki] Update of "PoweredBy" by DavidTing

2015-01-27 Thread Apache Wiki
Dear Wiki user,

You have subscribed to a wiki page or wiki category on "Hadoop Wiki" for change 
notification.

The "PoweredBy" page has been changed by DavidTing:
https://wiki.apache.org/hadoop/PoweredBy?action=diff&rev1=430&rev2=431

 * ''A 300-machine cluster with 2400 cores and about 3 PB raw storage. ''
 * ''Each (commodity) node has 8 cores and 12 TB of storage. ''
 * ''We are heavy users of both the streaming and the Java APIs. Using these 
features we have built a higher-level data warehousing framework called Hive 
(see http://hadoop.apache.org/hive/). We have also developed a FUSE 
implementation over HDFS. ''
+ 
+  * ''[[https://fnews.com/|fnews]] ''
+   * ''We use Hadoop for storing logs, news analysis, tag analysis. ''
  
   * ''[[http://www.foxaudiencenetwork.com|FOX Audience Network]] ''
* ''40 machine cluster (8 cores/machine, 2TB/machine storage) ''