[ https://issues.apache.org/jira/browse/HDFS-756?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Eli Collins updated HDFS-756:
-----------------------------

    Attachment: hdfs-756.patch

Patch attached. The project split means the libhdfs test needs access to both 
the hdfs and common repos. The test lives in and is run out of the hdfs repo; 
however, it starts an hdfs instance and therefore needs access to the common 
repo's bin directory. test-libhdfs.sh runs the hdfs instance out of the common 
repo (build/test/libhdfs and its sub-directories get created there) since 
hadoop-daemon.sh makes doing otherwise a pain. 

*This means the test now requires setting HADOOP_CORE_HOME.* Once HDFS-621 is 
checked in it would be nice to convert this test to run against a MiniDFS 
cluster and no longer depend on a common repo. Running one daemon per process 
via the traditional startup scripts doesn't seem to add much additional 
coverage over that (a rough sketch of what the test exercises follows). 
Reasonable?
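
For context, here's a rough sketch (not the actual hdfs_test.c) of the kind of 
sequence the test drives through the libhdfs C API; the path and message below 
are made up for illustration, and {{hdfsConnect("default", 0)}} just picks up 
whatever default filesystem the client config specifies, regardless of how the 
namenode was started:

{code}
/* Rough sketch only, not the actual hdfs_test.c: a minimal write/read
 * round trip through the libhdfs C API (hdfs.h). The path and message
 * are made up for illustration. */
#include "hdfs.h"
#include <fcntl.h>
#include <stdio.h>
#include <string.h>

int main(void) {
  /* "default" means: use whatever fs.default.name the client config specifies. */
  hdfsFS fs = hdfsConnect("default", 0);
  if (!fs) {
    fprintf(stderr, "hdfsConnect failed\n");
    return 1;
  }

  const char *path = "/tmp/libhdfs_smoke";   /* hypothetical test path */
  const char *msg  = "hello from libhdfs\n";

  /* Write a small file... */
  hdfsFile out = hdfsOpenFile(fs, path, O_WRONLY | O_CREAT, 0, 0, 0);
  if (!out) {
    fprintf(stderr, "open for write failed\n");
    return 1;
  }
  hdfsWrite(fs, out, (void *) msg, strlen(msg));
  hdfsCloseFile(fs, out);

  /* ...and read it back. */
  char buf[64];
  hdfsFile in = hdfsOpenFile(fs, path, O_RDONLY, 0, 0, 0);
  if (!in) {
    fprintf(stderr, "open for read failed\n");
    return 1;
  }
  tSize n = hdfsRead(fs, in, buf, sizeof(buf) - 1);
  buf[n > 0 ? n : 0] = '\0';
  printf("read back: %s", buf);
  hdfsCloseFile(fs, in);

  hdfsDisconnect(fs);
  return 0;
}
{code}

Nothing in that sequence depends on the daemons having been launched by 
hadoop-daemon.sh, which is why a MiniDFS cluster should give equivalent 
coverage.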

To run the test:
{{export HADOOP_CORE_HOME=<common repo dir>}}
{{ant -Dcompile.c\+\+=true -Dlibhdfs=true test-c\+\+-libhdfs}}

You may need to run {{ant clean}} in your common directory to remove stale hdfs 
jar files from the root directory, build/ivy, or the lib dirs.

> libhdfs unit tests do not run 
> ------------------------------
>
>                 Key: HDFS-756
>                 URL: https://issues.apache.org/jira/browse/HDFS-756
>             Project: Hadoop HDFS
>          Issue Type: Bug
>          Components: contrib/libhdfs
>            Reporter: dhruba borthakur
>            Assignee: Eli Collins
>            Priority: Critical
>             Fix For: 0.22.0
>
>         Attachments: hdfs-756.patch
>
>
> The libhdfs unit tests (ant test-c++-libhdfs -Dislibhdfs=1) do not run yet 
> because the scripts are in the common subproject,

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.
