As an alternative to Ted's approach, it's also useful to use the Surefire
plugin when debugging tests.

mvn test -Dmaven.surefire.debug -Dtest=TestClassName

This command makes the forked test JVM wait for a debugger to attach on
port 5005 by default, so you can attach via Eclipse's debugger. Once
attached, the test runs and you can step through it. I think the source
code needs to be built in your local environment rather than just
downloaded from a Hadoop release.
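
For example, assuming Surefire's default behaviour of suspending on port
5005 (the test class name below is just a placeholder for whichever test
you want to debug):

mvn test -Dmaven.surefire.debug -Dtest=TestBlockPlacementPolicy   # placeholder class name

The forked JVM should print something like "Listening for transport
dt_socket at address: 5005" and pause; attach from Eclipse via Debug
Configurations -> Remote Java Application (host: localhost, port: 5005).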

Thanks,
Tsuyoshi

On Sun, Nov 3, 2013 at 11:22 PM, Karim Awara <karim.aw...@kaust.edu.sa> wrote:
> Hi guys,
>
> Can I just install the HDFS project and debug it (assuming I am running a
> <put> command through the command line)? If so, which project should I
> download (the hadoop project that contains hdfs)?
>
> --
> Best Regards,
> Karim Ahmed Awara
>
>
> On Sun, Oct 6, 2013 at 5:40 PM, Ted Yu <yuzhih...@gmail.com> wrote:
>>
>> Karim:
>> If you want to debug unit tests, using Eclipse is a viable approach.
>> Here is what I did this past week debugging a certain part of hadoop
>> (JobSubmitter in particular) through an HBase unit test.
>>
>> 1. Run 'mvn install -DskipTests' to install hadoop locally.
>> 2. Open the class you want to debug and place a breakpoint at the proper
>>    location.
>> 3. Open a unit test which depends on the class above and select Debug As ->
>>    JUnit Test.
>> 4. When the breakpoint hits, associate the sources.jar file in the local
>>    maven repo with the class. In my case, the sources jar file is located
>>    under
>>    ~/.m2/repository/org/apache/hadoop/hadoop-mapreduce-client-core/2.1.2-SNAPSHOT
>>
>> You should be able to step through hadoop code as usual at this point.
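>>
>> For reference, a rough sketch of those commands (the checkout path is a
>> placeholder, and the version and module directory are from my environment
>> and will differ in yours):
>>
>> cd /path/to/hadoop-source       # a local source checkout, not a release tarball
>> mvn install -DskipTests
>> ls ~/.m2/repository/org/apache/hadoop/hadoop-mapreduce-client-core/2.1.2-SNAPSHOT/
>> # if the build produced a *-sources.jar here, that is the file to attach
>> # via Edit Source Lookup Path when Eclipse stops at the breakpoint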
>>
>> Cheers
>>
>>
>> On Sun, Oct 6, 2013 at 6:14 AM, John Lilley <john.lil...@redpoint.net>
>> wrote:
>>>
>>> Karim,
>>>
>>>
>>>
>>> I am not an experienced Hadoop programmer, but what I found was that
>>> building and debugging Hadoop under Eclipse was very difficult, and I was
>>> never able to make it work correctly.  I suggest using the well-documented
>>> command-line Maven build, installing Hadoop from that build, and running it
>>> normally.  Once you have that working, run your namenode or datanode daemon
>>> so that it waits for a remote debugger to attach before starting.  You
>>> should also get comfortable with log4j, the logging framework used by
>>> Hadoop, as those log files are often your best friend when trying to debug
>>> a collection of services.
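>>>
>>> For example, one common way (standard JVM debugging, not specific to
>>> Hadoop) to make the namenode wait for a debugger is to add a JDWP agent to
>>> its JVM options in hadoop-env.sh before starting it; the port number here
>>> is arbitrary:
>>>
>>> export HADOOP_NAMENODE_OPTS="-agentlib:jdwp=transport=dt_socket,server=y,suspend=y,address=8000 $HADOOP_NAMENODE_OPTS"
>>> # suspend=y makes the daemon block at startup until a remote debugger
>>> # attaches on port 8000; use HADOOP_DATANODE_OPTS for a datanode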
>>>
>>>
>>>
>>> john
>>>
>>>
>>>
>>> From: Karim Awara [mailto:karim.aw...@kaust.edu.sa]
>>> Sent: Sunday, October 06, 2013 5:41 AM
>>> To: user
>>> Subject: Hadoop 2.x with Eclipse
>>>
>>>
>>>
>>> Hi,
>>>
>>> I followed the instructions on how to import the hadoop source files into
>>> Eclipse (I am using hadoop 2.1 beta).
>>>
>>> Currently on my machine, I have hadoop 2.1 installed, and its source code
>>> is imported into Eclipse. What I can't grasp is how to proceed from there.
>>>
>>> I want to modify HDFS code (the block placement strategy). Building the
>>> hdfs project currently generates errors for me (unresolved types in hadoop
>>> common). And if I build it successfully, how do I test my modified code?
>>>
>>>
>>> --
>>> Best Regards,
>>> Karim Ahmed Awara
>>>
>>>
>>>
>>
>>
>
>



-- 
- Tsuyoshi
