Check
http://pig.apache.org/docs/r0.8.1/piglatin_ref1.html#Storing+Intermediate+Data
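That page covers where pig stages its intermediate data, which is /tmp on HDFS by default and configurable via the pig.temp.dir property. If opening /tmp up with chmod 777 isn't acceptable, a rough sketch of the alternative (the directory name and user below are illustrative, adjust to your setup):

```shell
# Create a scratch directory on HDFS and hand it to the user running pig
# (path "/pigtmp" and user "will" are examples, not from the thread):
sudo -u hdfs hadoop fs -mkdir /pigtmp
sudo -u hdfs hadoop fs -chown will /pigtmp

# Point pig at it per invocation:
pig -Dpig.temp.dir=/pigtmp myscript.pig

# ...or persistently, by adding this line to conf/pig.properties:
# pig.temp.dir=/pigtmp
```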

Daniel

On Fri, Jul 8, 2011 at 12:04 PM, William Oberman
<[email protected]> wrote:

> Sorry, to be more verbose, CDH3 actually respects permissions inside of
> HDFS, and creates special users called "hdfs" and "mapred" to keep things
> safe.  I'm guessing that by default, when I did the non-package install, I
> didn't enable permissions and/or installed everything as the same user, so
> it didn't matter.
>
> So, that makes getting the permissions right for /tmp more important, but I
> didn't think the hadoop crowd would care since it's pig that causes the
> write to that location.  But a newbie pig user might need the FYI....
>
> On Fri, Jul 8, 2011 at 3:01 PM, William Oberman <[email protected]>
> wrote:
>
> > I thought pig is the one trying to write to /tmp inside of hadoop?
> >
> > will
> >
> >
> > On Fri, Jul 8, 2011 at 3:00 PM, Dmitriy Ryaboy <[email protected]>
> > wrote:
> >
> >> Seems like a question you should ask Cloudera?
> >>
> >> On Fri, Jul 8, 2011 at 11:57 AM, William Oberman
> >> <[email protected]> wrote:
> >> > I tried out hadoop/pig in my test environment using tar.gz's.  Before
> >> > I roll out to production, I thought I'd try the cdh3 packages, as that
> >> > might be easier to maintain (since I'm not a sysadmin).  Following
> >> > cloudera's install guide worked like a charm, but I couldn't get pig
> >> > to run until I did this:
> >> >
> >> > sudo -u hdfs hadoop fs -mkdir /tmp
> >> > sudo -u hdfs hadoop fs -chmod 777 /tmp
> >> >
> >> > Maybe I missed the hows & whys of that setting in the install
> >> > guide/forums, but just wanted to give a heads up to anyone else that
> >> > gets "ERROR 6002: Unable to obtain a temporary path." and is puzzled
> >> > at why...
> >> >
> >> > will
> >>
> >
>
