As a starting point, here is what I currently do at the bash level to install Pig on the master node; this still needs to be converted into a proper recipe. It assumes Pig is run by the hadoop user:

# Note: "sudo echo ... >> file" does not append as root -- the redirection
# is performed by the non-root calling shell -- so use "sudo tee -a" instead.
echo 'export PIG_HOME=/usr/lib/pig' | sudo tee -a /home/hadoop/.profile
echo 'export PATH=${PIG_HOME}/bin:$PATH' | sudo tee -a /home/hadoop/.profile
echo 'export PIG_HOME=/usr/lib/pig' | sudo tee -a /usr/lib/hadoop-0.20/conf/hadoop-env.sh
CDH_REV_CAPTURE=cdh3u2
PIG_DIR=pig-0.8.1-${CDH_REV_CAPTURE}
PIG_TGZ=http://archive.cloudera.com/cdh/3/${PIG_DIR}.tar.gz
(cd /usr/lib/; wget -q -O - ${PIG_TGZ} | sudo tar xzf -)
sudo ln -s /usr/lib/${PIG_DIR} /usr/lib/pig
sudo chown -R root:root /usr/lib/${PIG_DIR}
sudo chmod -R 555 /usr/lib/${PIG_DIR}
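As a first step toward a recipe, the commands above could be wrapped in a parameterized shell function, roughly the shape an install script would take. This is only a sketch; the function names (`pig_tarball_url`, `install_pig`) are assumptions for illustration, not part of Whirr's API, and the defaults come from the snippet above.

```shell
# Hypothetical sketch of the steps above as a reusable install function.

# Build the CDH tarball URL for a given CDH revision (default cdh3u2).
function pig_tarball_url() {
  local cdh_release=${1:-cdh3u2}
  echo "http://archive.cloudera.com/cdh/3/pig-0.8.1-${cdh_release}.tar.gz"
}

function install_pig() {
  local cdh_release=${1:-cdh3u2}
  local pig_dir=pig-0.8.1-${cdh_release}

  # Fetch and unpack straight into /usr/lib (tar's -C avoids the subshell cd).
  wget -q -O - "$(pig_tarball_url ${cdh_release})" | sudo tar xzf - -C /usr/lib

  sudo ln -s /usr/lib/${pig_dir} /usr/lib/pig
  sudo chown -R root:root /usr/lib/${pig_dir}
  sudo chmod -R 555 /usr/lib/${pig_dir}

  # Append as root via tee; a bare "sudo echo ... >> file" would fail here
  # because the redirection runs in the calling (non-root) shell.
  echo 'export PIG_HOME=/usr/lib/pig' | sudo tee -a /home/hadoop/.profile > /dev/null
  echo 'export PATH=${PIG_HOME}/bin:$PATH' | sudo tee -a /home/hadoop/.profile > /dev/null
  echo 'export PIG_HOME=/usr/lib/pig' | sudo tee -a /usr/lib/hadoop-0.20/conf/hadoop-env.sh > /dev/null
}
```

Parameterizing the CDH revision also makes it easy to track a different CDH update (e.g. `install_pig cdh3u4`) without editing the script.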


On 2011-10-28 6:21, Andrei Savu wrote:

Stephen,

Hive and Pig are not yet available as Whirr services but you should be able to easily deploy them on a Hadoop cluster started by Whirr.

I encourage you to contribute back if you add them :)

Cheers,
Andrei

On Oct 28, 2011 4:01 PM, "Stephen Boesch" <[email protected]> wrote:


    anyone have these?  thx!

