I don't think so, unfortunately.  I couldn't find them.  I remember someone
mentioning that the slides would be posted somewhere, though.
Alex

On Thu, Oct 30, 2008 at 1:16 PM, Scott Whitecross <[EMAIL PROTECTED]> wrote:

> Is the presentation online as well?  (Hard to see some of the slides in the
> video)
>
>
> On Oct 30, 2008, at 1:34 PM, Alex Loddengaard wrote:
>
>> Arun gave a great talk about debugging and tuning at the Rapleaf event.
>> Take a look:
>> <http://www.vimeo.com/2085477>
>>
>> Alex
>>
>> On Thu, Oct 30, 2008 at 6:20 AM, Malcolm Matalka <
>> [EMAIL PROTECTED]> wrote:
>>
>>> I'm not sure of the correct way, but when I need to log a job I have it
>>> print out with some unique identifier and then just do:
>>>
>>> for i in list of each box; do ssh $i 'grep -R PREFIX path/to/logs'; done > results
>>>
>>> It works well in a pinch.
>>>
>>> -----Original Message-----
>>> From: Scott Whitecross [mailto:[EMAIL PROTECTED]
>>> Sent: Wednesday, October 29, 2008 22:14
>>> To: core-user@hadoop.apache.org
>>> Subject: Debugging / Logging in Hadoop?
>>>
>>> I'm curious: what's the best method for debugging and logging in
>>> Hadoop?  I put together a small cluster today and a simple application
>>> to process log files.  While it worked well, I had trouble trying to
>>> get logging information out.  Is there any way to attach a debugger,
>>> or get log4j to write a log file?  I tried setting up a Logger in the
>>> class I used for the map/reduce, but I had no luck.
>>>
>>> Thanks.
>>>
>>>
>>>
>>>
>
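For anyone finding this thread later: the usual reason a log4j Logger in a
map/reduce class appears to do nothing is that the log lines never reach the
client console.  Each task runs in its own child JVM on a TaskTracker, and its
log4j output is captured into that attempt's log files under logs/userlogs on
the node that ran it (also reachable through the task links in the JobTracker
web UI), which is exactly why Malcolm's ssh+grep loop works.  A minimal sketch
using the old org.apache.hadoop.mapred API; the class name and the MYJOB-DEBUG
prefix are made up:

    import java.io.IOException;

    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapred.MapReduceBase;
    import org.apache.hadoop.mapred.Mapper;
    import org.apache.hadoop.mapred.OutputCollector;
    import org.apache.hadoop.mapred.Reporter;
    import org.apache.log4j.Logger;

    // Hypothetical mapper: log4j output ends up in the per-attempt "syslog"
    // file under logs/userlogs/ on whichever node ran the task.
    public class LogLineMapper extends MapReduceBase
        implements Mapper<LongWritable, Text, Text, Text> {

      private static final Logger LOG = Logger.getLogger(LogLineMapper.class);

      public void map(LongWritable key, Text value,
                      OutputCollector<Text, Text> output, Reporter reporter)
          throws IOException {
        // A unique prefix makes these lines easy to grep out of the task logs later.
        LOG.info("MYJOB-DEBUG processing offset " + key.get());
        output.collect(new Text("line"), value);
      }
    }

System.out and System.err from a task land in the same per-attempt directory
(the stdout and stderr files), so even plain println debugging ends up there
rather than on your terminal.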

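As for attaching a debugger: since tasks run in separate child JVMs, the two
options people usually mention are running the job with the local job runner
(everything in one JVM, debuggable straight from the IDE) or passing JDWP
options to the child JVMs via mapred.child.java.opts.  This is only a sketch,
not something from the thread; the driver class is hypothetical and the
property names are the 0.18-era ones:

    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapred.FileInputFormat;
    import org.apache.hadoop.mapred.FileOutputFormat;
    import org.apache.hadoop.mapred.JobClient;
    import org.apache.hadoop.mapred.JobConf;

    // Hypothetical driver showing two ways to get a debugger onto map/reduce code.
    public class DebugDriver {
      public static void main(String[] args) throws Exception {
        JobConf conf = new JobConf(DebugDriver.class);
        conf.setJobName("debug-example");
        conf.setMapperClass(LogLineMapper.class);
        conf.setOutputKeyClass(Text.class);
        conf.setOutputValueClass(Text.class);
        FileInputFormat.setInputPaths(conf, new Path(args[0]));
        FileOutputFormat.setOutputPath(conf, new Path(args[1]));

        // 1) Simplest: run everything in one JVM with the local job runner
        //    and debug it directly from the IDE.
        conf.set("mapred.job.tracker", "local");

        // 2) On a real cluster, make each child task JVM wait for a remote
        //    debugger instead; only practical when there is a single task,
        //    since every child would try to bind the same port.
        // conf.set("mapred.child.java.opts",
        //     "-agentlib:jdwp=transport=dt_socket,server=y,suspend=y,address=8000");

        JobClient.runJob(conf);
      }
    }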