The "PoweredBy" page has been changed by DmitriyRyaboy.
http://wiki.apache.org/pig/PoweredBy

--------------------------------------------------

New page:
Applications and organizations using Pig include (alphabetically):

 * [[http://www.cooliris.com/|Cooliris]] - Cooliris transforms your browser 
into a lightning-fast, cinematic way to browse photos and videos, both online 
and on your hard drive.
  * We have a 15-node Hadoop cluster where each machine has 8 cores, 8 GB of 
RAM, and 3-4 TB of storage.
  * We use Hadoop for all of our analytics, and we use Pig to give PMs and 
non-engineers the freedom to query the data in an ad-hoc manner (see the 
sketch after this list).<<BR>>
 * [[http://www.dropfire.com/|DropFire]]
  * We generate Pig Latin scripts that describe structural and semantic 
conversions between data contexts
  * We use Hadoop to execute these scripts for production-level deployments
  * This eliminates the need for explicit data and schema mappings during 
database integration
 * [[http://www.linkedin.com/|LinkedIn]]
  * 3 grids of 30 Nehalem-based nodes each, with 2x4 cores, 16 GB RAM, and 
8x1 TB of storage using ZFS in a JBOD configuration.
  * We use Hadoop and Pig for discovering People You May Know and other fun 
facts.
 * [[http://www.ning.com/|Ning]]
  * We use Hadoop to store and process our log files
  * We rely on Apache Pig for reporting and analytics, on Cascading for 
machine learning, and on a proprietary [[/hadoop/JavaScript|JavaScript]] API 
for ad-hoc queries
  * We use commodity hardware, with 8 cores and 16 GB of RAM per machine
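
As a minimal sketch of the kind of ad-hoc Pig Latin query the entries above 
describe, the following counts page views per user. The input path, schema, 
and field names are hypothetical, not drawn from any of the deployments 
listed here:

{{{
-- Hypothetical tab-separated usage log: user_id, action, timestamp.
logs    = LOAD 'usage_logs' USING PigStorage('\t')
          AS (user_id:chararray, action:chararray, ts:long);
views   = FILTER logs BY action == 'view';
by_user = GROUP views BY user_id;
counts  = FOREACH by_user GENERATE group AS user_id, COUNT(views) AS n;
ranked  = ORDER counts BY n DESC;
STORE ranked INTO 'views_per_user';
}}}

A query like this runs as MapReduce jobs without any Java being written, 
which is what makes Pig approachable for PMs and other non-engineers.
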
 * [[http://twitter.com|Twitter]]
  * We use Pig extensively to process usage logs, mine tweet data, and more.
  * We maintain [[http://github.com/kevinweil/elephant-bird|Elephant 
Bird]], a set of libraries for working with Pig, LZO compression, protocol 
buffers, and more (see the sketch after this list).
  * More details are in this presentation: 
http://www.slideshare.net/kevinweil/nosql-at-twitter-nosql-eu-2010<<BR>>
 * [[http://www.yahoo.com/|Yahoo!]]
  * More than 100,000 CPUs in more than 25,000 computers running Hadoop
  * Our biggest cluster: 4000 nodes (2x4-CPU boxes with 4x1 TB disks and 
16 GB RAM)
  * Used to support research for Ad Systems and Web Search
  * Also used for scaling tests to support the development of Hadoop on 
larger clusters
  * [[http://developer.yahoo.com/blogs/hadoop|Our Blog]] - Learn more about how 
we use Hadoop.
  * More than 40% of Hadoop jobs within Yahoo! are Pig jobs.
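
For the Twitter entry above, here is a minimal sketch of reading 
LZO-compressed, JSON-per-line logs through Elephant Bird in Pig. It assumes 
Elephant Bird's LzoJsonLoader; the jar name, paths, and the 'action' field 
are hypothetical:

{{{
-- Register the Elephant Bird jar (name and path are illustrative).
REGISTER elephant-bird.jar;

-- Each input record is exposed as a single map of its JSON fields.
logs    = LOAD '/logs/usage'
          USING com.twitter.elephantbird.pig.load.LzoJsonLoader()
          AS (json:map[]);

-- Pull one field out of the map and count occurrences per action.
actions = FOREACH logs GENERATE (chararray)json#'action' AS action;
grouped = GROUP actions BY action;
counts  = FOREACH grouped GENERATE group AS action, COUNT(actions) AS n;
STORE counts INTO '/logs/action_counts';
}}}

The point of Elephant Bird here is that the LZO and JSON handling live in 
the loader, so the script itself stays ordinary Pig Latin.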
