I just want to capture the stdout and stderr streams into files for each job. I did some work with Hive in the past, and Hive lets you get the stdout and stderr streams for each job. I thought that's what the ExecJob interface provides, but it seems the concrete implementation isn't there yet.
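One client-side workaround, until HJob implements getSTDOut/getSTDError, is redirecting System.out and System.err to files around the batch execution. This is only a sketch: the PigServer call is shown as a placeholder comment, and since MapReduce tasks run in separate JVMs this captures only what the client process itself prints, not task-side output.

```java
import java.io.FileOutputStream;
import java.io.PrintStream;

public class CaptureStreams {
    public static void main(String[] args) throws Exception {
        // Remember the original streams so they can be restored afterwards.
        PrintStream origOut = System.out;
        PrintStream origErr = System.err;
        try (PrintStream outFile = new PrintStream(new FileOutputStream("job-stdout.log"));
             PrintStream errFile = new PrintStream(new FileOutputStream("job-stderr.log"))) {
            System.setOut(outFile);
            System.setErr(errFile);
            // pigServer.executeBatch() would run here; anything this JVM
            // writes to stdout/stderr now lands in the files above.
            System.out.println("stdout captured");
            System.err.println("stderr captured");
        } finally {
            // Restore the original streams even if execution fails.
            System.setOut(origOut);
            System.setErr(origErr);
        }
    }
}
```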
I'll look into writing a log4j appender, thanks Rohini.

On Fri, Mar 22, 2013 at 1:41 PM, Rohini Palaniswamy <[email protected]> wrote:
> Not sure what you are exactly trying to capture, but one workaround I can
> think of is writing your own log4j appender and capturing the log
> information.
>
> -Rohini
>
> On Thu, Mar 21, 2013 at 10:13 AM, Cheolsoo Park <[email protected]> wrote:
>> Hi Jeff,
>>
>> You're right that those methods in HJob.java throw an
>> UnsupportedOperationException now. I think they are simply not
>> implemented yet. Probably, we should.
>>
>> Thanks,
>> Cheolsoo
>>
>> On Wed, Mar 20, 2013 at 2:00 PM, Jeff Yuan <[email protected]> wrote:
>>> Is there an interface to get the standard output and standard error
>>> streams for a Pig execution? I'm using the Java interface and directly
>>> calling PigServer.executeBatch(), for example, and getting back
>>> List<ExecJob>. The ExecJob interface declares getSTDOut and getSTDError,
>>> but any calls to these result in an UnsupportedOperationException.
>>>
>>> Thanks,
>>> Jeff
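Before writing a custom appender, it may be enough to route Pig's log output to a file through configuration. A minimal log4j 1.x properties sketch (logger/appender names and the file path are illustrative, not from the thread):

```
# Send org.apache.pig log events to a dedicated file appender.
log4j.logger.org.apache.pig=INFO, pigfile
log4j.appender.pigfile=org.apache.log4j.FileAppender
log4j.appender.pigfile.File=pig-job.log
log4j.appender.pigfile.layout=org.apache.log4j.PatternLayout
log4j.appender.pigfile.layout.ConversionPattern=%d %p %c - %m%n
```

A custom appender (subclassing org.apache.log4j.AppenderSkeleton) would only be needed to split the captured output per job rather than into one shared file.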
