Build failed in Jenkins: Hadoop-Common-0.23-Build #316

2012-07-19 Thread Apache Jenkins Server
See https://builds.apache.org/job/Hadoop-Common-0.23-Build/316/changes

Changes:

[daryn] svn merge -c 1353800 FIXES: HDFS-3516. Check content-type in 
WebHdfsFileSystem.
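
For context, HDFS-3516 is about validating the HTTP response before parsing it. A minimal sketch of such a check, assuming an HttpURLConnection-based client (the class and method names below are illustrative, not the actual patch):

import java.io.IOException;
import java.net.HttpURLConnection;

// Hypothetical helper: reject a WebHDFS response whose content-type is not
// the expected JSON media type before trying to parse the body.
class ContentTypeCheck {
  private static final String EXPECTED = "application/json";

  static void validateResponse(HttpURLConnection conn) throws IOException {
    String contentType = conn.getContentType();
    // The header may carry parameters, e.g. "application/json; charset=utf-8",
    // so compare only the media-type prefix.
    if (contentType == null || !contentType.startsWith(EXPECTED)) {
      throw new IOException("Unexpected content-type \"" + contentType
          + "\", expected " + EXPECTED);
    }
  }
}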

[szetszwo] svn merge -c 1362976 from trunk for HDFS-3577. In 
DatanodeWebHdfsMethods, use MessageBodyWriter instead of StreamingOutput, 
otherwise, it will fail to transfer large files.
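
The commit message names the JAX-RS pattern but not the code; here is a minimal, self-contained sketch of a MessageBodyWriter that streams a response without buffering it (the class name and chunk size are illustrative, not the actual HDFS-3577 patch):

import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.lang.annotation.Annotation;
import java.lang.reflect.Type;
import javax.ws.rs.core.MediaType;
import javax.ws.rs.core.MultivaluedMap;
import javax.ws.rs.ext.MessageBodyWriter;
import javax.ws.rs.ext.Provider;

// Hypothetical writer: copies an InputStream to the response in small chunks.
// Returning -1 from getSize() tells the JAX-RS runtime the length is unknown,
// so the response can be streamed instead of materialized in memory, which is
// what makes large file transfers feasible.
@Provider
public class InputStreamBodyWriter implements MessageBodyWriter<InputStream> {
  @Override
  public boolean isWriteable(Class<?> type, Type genericType,
      Annotation[] annotations, MediaType mediaType) {
    return InputStream.class.isAssignableFrom(type);
  }

  @Override
  public long getSize(InputStream in, Class<?> type, Type genericType,
      Annotation[] annotations, MediaType mediaType) {
    return -1; // unknown length; do not precompute or buffer
  }

  @Override
  public void writeTo(InputStream in, Class<?> type, Type genericType,
      Annotation[] annotations, MediaType mediaType,
      MultivaluedMap<String, Object> httpHeaders, OutputStream out)
      throws IOException {
    byte[] buf = new byte[4096];
    int n;
    while ((n = in.read(buf)) != -1) {
      out.write(buf, 0, n);
    }
    in.close();
  }
}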

--
[...truncated 19064 lines...]
Tests run: 49, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.94 sec
Running org.apache.hadoop.fs.viewfs.TestFcMainOperationsLocalFs
Tests run: 54, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.245 sec
Running org.apache.hadoop.fs.viewfs.TestViewFileSystemDelegation
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.481 sec
Running org.apache.hadoop.fs.viewfs.TestViewFsWithAuthorityLocalFs
Tests run: 42, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.166 sec
Running org.apache.hadoop.fs.viewfs.TestViewFsLocalFs
Tests run: 42, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.201 sec
Running org.apache.hadoop.fs.TestGlobPattern
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.154 sec
Running org.apache.hadoop.fs.TestS3_LocalFileContextURI
Tests run: 17, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.18 sec
Running org.apache.hadoop.fs.TestLocalFSFileContextCreateMkdir
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.759 sec
Running org.apache.hadoop.fs.TestHarFileSystem
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.35 sec
Running org.apache.hadoop.fs.TestFileSystemCaching
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.781 sec
Running org.apache.hadoop.fs.TestLocalFsFCStatistics
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.575 sec
Running org.apache.hadoop.fs.TestHardLink
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.304 sec
Running org.apache.hadoop.fs.TestCommandFormat
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.166 sec
Running org.apache.hadoop.fs.TestLocal_S3FileContextURI
Tests run: 17, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.19 sec
Running org.apache.hadoop.fs.TestLocalFileSystem
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.854 sec
Running org.apache.hadoop.fs.TestFcLocalFsPermission
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.661 sec
Running org.apache.hadoop.fs.TestListFiles
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.598 sec
Running org.apache.hadoop.fs.TestPath
Tests run: 16, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.87 sec
Running org.apache.hadoop.fs.kfs.TestKosmosFileSystem
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.619 sec
Running org.apache.hadoop.fs.TestGlobExpander
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.111 sec
Running org.apache.hadoop.fs.TestFilterFileSystem
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.65 sec
Running org.apache.hadoop.fs.TestFcLocalFsUtil
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.598 sec
Running org.apache.hadoop.fs.TestGetFileBlockLocations
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.819 sec
Running org.apache.hadoop.fs.s3.TestInMemoryS3FileSystemContract
Tests run: 29, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.854 sec
Running org.apache.hadoop.fs.s3.TestINode
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.116 sec
Running org.apache.hadoop.fs.s3.TestS3Credentials
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.205 sec
Running org.apache.hadoop.fs.s3.TestS3FileSystem
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.236 sec
Running org.apache.hadoop.fs.TestDU
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.279 sec
Running org.apache.hadoop.record.TestBuffer
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.123 sec
Running org.apache.hadoop.record.TestRecordVersioning
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.205 sec
Running org.apache.hadoop.record.TestRecordIO
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.218 sec
Running org.apache.hadoop.metrics2.source.TestJvmMetrics
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.428 sec
Running org.apache.hadoop.metrics2.util.TestSampleStat
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.133 sec
Running org.apache.hadoop.metrics2.util.TestMetricsCache
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.241 sec
Running org.apache.hadoop.metrics2.lib.TestInterns
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.29 sec
Running org.apache.hadoop.metrics2.lib.TestMetricsAnnotations
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.534 sec
Running org.apache.hadoop.metrics2.lib.TestMutableMetrics
Tests run: 2, Failures: 0, 

Powered By Hadoop Wiki Page Permissions

2012-07-19 Thread Joey Krabacher
Could someone please grant me write permissions to this page?
http://wiki.apache.org/hadoop/PoweredBy

Just following the instructions from the Hadoop wiki:
To add entries you need write permission to the wiki, which you can
get by subscribing to the common-dev@hadoop.apache.org mailing list
and asking for the wiki account you have just created to get this
permission.

Thanks,
/* Joey */


[jira] [Reopened] (HADOOP-8577) The RPC must have failed proxyUser (auth:SIMPLE) via realus...@hadoop.apache.org (auth:SIMPLE)

2012-07-19 Thread Daryn Sharp (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-8577?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Daryn Sharp reopened HADOOP-8577:
-


 The RPC must have failed proxyUser (auth:SIMPLE) via 
 realus...@hadoop.apache.org (auth:SIMPLE)
 --

 Key: HADOOP-8577
 URL: https://issues.apache.org/jira/browse/HADOOP-8577
 Project: Hadoop Common
  Issue Type: Bug
  Components: test
 Environment: Ubuntu 11
 JDK 1.7
 Maven 3.0.4
Reporter: chandrashekhar Kotekar
Priority: Minor
   Original Estimate: 12h
  Remaining Estimate: 12h

 Hi,
 I downloaded the Hadoop source code today and tried to test it with
 Maven. I performed the following steps:
 1) mvn clean
 2) mvn compile
 3) mvn test
 After the 3rd step, some tests failed. The output of the failed tests
 is as follows:
 Failed tests:
   testRealUserIPNotSpecified(org.apache.hadoop.security.TestDoAsEffectiveUser): The RPC must have failed proxyUser (auth:SIMPLE) via realus...@hadoop.apache.org (auth:SIMPLE)
   testWithDirStringAndConf(org.apache.hadoop.fs.shell.TestPathData): checking exist
   testPartialAuthority(org.apache.hadoop.fs.TestFileSystemCanonicalization): expected:<myfs://host.a.b:123> but was:<myfs://host.a:123>
   testFullAuthority(org.apache.hadoop.fs.TestFileSystemCanonicalization): expected:<null> but was:<java.lang.IllegalArgumentException: Wrong FS: myfs://host/file, expected: myfs://host.a.b>
   testShortAuthorityWithDefaultPort(org.apache.hadoop.fs.TestFileSystemCanonicalization): expected:<myfs://host.a.b:123> but was:<myfs://host:123>
   testPartialAuthorityWithDefaultPort(org.apache.hadoop.fs.TestFileSystemCanonicalization): expected:<myfs://host.a.b:123> but was:<myfs://host.a:123>
   testShortAuthority(org.apache.hadoop.fs.TestFileSystemCanonicalization): expected:<myfs://host.a.b:123> but was:<myfs://host:123>
   testIpAuthorityWithOtherPort(org.apache.hadoop.fs.TestFileSystemCanonicalization): expected:<myfs://127.0.0.1:456> but was:<myfs://localhost:456>
   testAuthorityFromDefaultFS(org.apache.hadoop.fs.TestFileSystemCanonicalization): expected:<myfs://host.a.b:123> but was:<myfs://host:123>
   testFullAuthorityWithDefaultPort(org.apache.hadoop.fs.TestFileSystemCanonicalization): expected:<null> but was:<java.lang.IllegalArgumentException: Wrong FS: myfs://host/file, expected: myfs://host.a.b:123>
   testShortAuthorityWithOtherPort(org.apache.hadoop.fs.TestFileSystemCanonicalization): expected:<myfs://host.a.b:456> but was:<myfs://host:456>
   testPartialAuthorityWithOtherPort(org.apache.hadoop.fs.TestFileSystemCanonicalization): expected:<myfs://host.a.b:456> but was:<myfs://host.a:456>
   testFullAuthorityWithOtherPort(org.apache.hadoop.fs.TestFileSystemCanonicalization): expected:<null> but was:<java.lang.IllegalArgumentException: Wrong FS: myfs://host:456/file, expected: myfs://host.a.b:456>
   testIpAuthority(org.apache.hadoop.fs.TestFileSystemCanonicalization): expected:<myfs://127.0.0.1:123> but was:<myfs://localhost:123>
   testIpAuthorityWithDefaultPort(org.apache.hadoop.fs.TestFileSystemCanonicalization): expected:<myfs://127.0.0.1:123> but was:<myfs://localhost:123>
 Tests in error:
   testUnqualifiedUriContents(org.apache.hadoop.fs.shell.TestPathData): `d1': No such file or directory
 I am a newbie to the Hadoop source code. Please help me build the
 Hadoop source code.

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators: 
https://issues.apache.org/jira/secure/ContactAdministrators!default.jspa
For more information on JIRA, see: http://www.atlassian.com/software/jira




[jira] [Resolved] (HADOOP-8577) The RPC must have failed proxyUser (auth:SIMPLE) via realus...@hadoop.apache.org (auth:SIMPLE)

2012-07-19 Thread Daryn Sharp (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-8577?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Daryn Sharp resolved HADOOP-8577.
-

Resolution: Duplicate

 The RPC must have failed proxyUser (auth:SIMPLE) via 
 realus...@hadoop.apache.org (auth:SIMPLE)
 --

 Key: HADOOP-8577
 URL: https://issues.apache.org/jira/browse/HADOOP-8577
 Project: Hadoop Common
  Issue Type: Bug
  Components: test
 Environment: Ubuntu 11
 JDK 1.7
 Maven 3.0.4
Reporter: chandrashekhar Kotekar
Priority: Minor
   Original Estimate: 12h
  Remaining Estimate: 12h

 Hi,
 I downloaded the Hadoop source code today and tried to test it with
 Maven. I performed the following steps:
 1) mvn clean
 2) mvn compile
 3) mvn test
 After the 3rd step, some tests failed. The output of the failed tests
 is as follows:
 Failed tests:
   testRealUserIPNotSpecified(org.apache.hadoop.security.TestDoAsEffectiveUser): The RPC must have failed proxyUser (auth:SIMPLE) via realus...@hadoop.apache.org (auth:SIMPLE)
   testWithDirStringAndConf(org.apache.hadoop.fs.shell.TestPathData): checking exist
   testPartialAuthority(org.apache.hadoop.fs.TestFileSystemCanonicalization): expected:<myfs://host.a.b:123> but was:<myfs://host.a:123>
   testFullAuthority(org.apache.hadoop.fs.TestFileSystemCanonicalization): expected:<null> but was:<java.lang.IllegalArgumentException: Wrong FS: myfs://host/file, expected: myfs://host.a.b>
   testShortAuthorityWithDefaultPort(org.apache.hadoop.fs.TestFileSystemCanonicalization): expected:<myfs://host.a.b:123> but was:<myfs://host:123>
   testPartialAuthorityWithDefaultPort(org.apache.hadoop.fs.TestFileSystemCanonicalization): expected:<myfs://host.a.b:123> but was:<myfs://host.a:123>
   testShortAuthority(org.apache.hadoop.fs.TestFileSystemCanonicalization): expected:<myfs://host.a.b:123> but was:<myfs://host:123>
   testIpAuthorityWithOtherPort(org.apache.hadoop.fs.TestFileSystemCanonicalization): expected:<myfs://127.0.0.1:456> but was:<myfs://localhost:456>
   testAuthorityFromDefaultFS(org.apache.hadoop.fs.TestFileSystemCanonicalization): expected:<myfs://host.a.b:123> but was:<myfs://host:123>
   testFullAuthorityWithDefaultPort(org.apache.hadoop.fs.TestFileSystemCanonicalization): expected:<null> but was:<java.lang.IllegalArgumentException: Wrong FS: myfs://host/file, expected: myfs://host.a.b:123>
   testShortAuthorityWithOtherPort(org.apache.hadoop.fs.TestFileSystemCanonicalization): expected:<myfs://host.a.b:456> but was:<myfs://host:456>
   testPartialAuthorityWithOtherPort(org.apache.hadoop.fs.TestFileSystemCanonicalization): expected:<myfs://host.a.b:456> but was:<myfs://host.a:456>
   testFullAuthorityWithOtherPort(org.apache.hadoop.fs.TestFileSystemCanonicalization): expected:<null> but was:<java.lang.IllegalArgumentException: Wrong FS: myfs://host:456/file, expected: myfs://host.a.b:456>
   testIpAuthority(org.apache.hadoop.fs.TestFileSystemCanonicalization): expected:<myfs://127.0.0.1:123> but was:<myfs://localhost:123>
   testIpAuthorityWithDefaultPort(org.apache.hadoop.fs.TestFileSystemCanonicalization): expected:<myfs://127.0.0.1:123> but was:<myfs://localhost:123>
 Tests in error:
   testUnqualifiedUriContents(org.apache.hadoop.fs.shell.TestPathData): `d1': No such file or directory
 I am a newbie to the Hadoop source code. Please help me build the
 Hadoop source code.





Re: Powered By Hadoop Wiki Page Permissions

2012-07-19 Thread Harsh J
Done.

On Thu, Jul 19, 2012 at 8:13 PM, Joey Krabacher jkrabac...@gmail.com wrote:

 Could someone please grant me write permissions to this page?
 http://wiki.apache.org/hadoop/PoweredBy

 Just following the instructions from the Hadoop wiki:
 To add entries you need write permission to the wiki, which you can
 get by subscribing to the common-dev@hadoop.apache.org mailing list
 and asking for the wiki account you have just created to get this
 permission.

 Thanks,
 /* Joey */




-- 
Harsh J


Re: Shifting to Java 7. Is it a good choice?

2012-07-19 Thread Harsh J
Here's the Apache Bigtop JIRA that's leading the Java 7 effort for all
components in the Hadoop ecosystem:
https://issues.apache.org/jira/browse/BIGTOP-458. This may interest
you.

On Wed, Jul 18, 2012 at 3:05 AM, Pavan Kulkarni pavan.babu...@gmail.com wrote:
 That was really helpful.
 @Robert: No, I am just working on a research project; I am not
 checking the code into Hadoop.
 Thanks, Radim and Robert.

 On Tue, Jul 17, 2012 at 3:49 PM, Robert Evans ev...@yahoo-inc.com wrote:

 Oracle is dropping Java 6 support by the end of the year, so there is
 likely to be a big shift to Java 7 before then. Hadoop currently
 supports Java 6 officially, so unless there is an official change of
 position you cannot use Java 7-specific APIs if you want to check your
 code into Hadoop. Hadoop should work on Java 7 today, as Radim said,
 and if you are building something on top of Hadoop that is fine, but
 dropping support for Java 6 will require some discussion on the
 mailing lists.

 --Bobby Evans

 On 7/17/12 2:35 PM, Radim Kolar h...@filez.com wrote:

 
  I have to tweak a few classes, and for this I needed a few packages
  which are only present in Java 7, like java.nio.file. So I was
  wondering: can I shift my development environment for Hadoop to
  Java 7? Would this break anything?
 OpenJDK 7 works, but NIO async file access is slower than traditional
 I/O.
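
For readers who haven't used them, here is a minimal, self-contained sketch of the kind of Java-7-only java.nio.file APIs mentioned above (the class name is illustrative):

import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;

// Files, Path, and StandardCharsets are all new in Java 7; code using them
// will not compile on Java 6, which is the compatibility concern discussed
// in this thread.
class Nio2Demo {
  public static void main(String[] args) throws IOException {
    Path dir = Files.createTempDirectory("nio2-demo");
    Path file = dir.resolve("hello.txt");
    Files.write(file, "hello, nio.2".getBytes(StandardCharsets.UTF_8));
    for (String line : Files.readAllLines(file, StandardCharsets.UTF_8)) {
      System.out.println(line);
    }
  }
}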




 --

 --With Regards
 Pavan Kulkarni



-- 
Harsh J


[jira] [Reopened] (HADOOP-8551) fs -mkdir creates parent directories without the -p option

2012-07-19 Thread Robert Joseph Evans (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-8551?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Robert Joseph Evans reopened HADOOP-8551:
-


-mkdir a
-mkdir a/b/ (Fails)
-mkdir a/b (Succeeds)

I am going to revert this until it is fixed. Thanks for catching this, John.
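
For clarity, the expected semantics (creation fails when the parent is missing unless -p is given) can be sketched against the public FileSystem API; this is illustrative, not the actual HADOOP-8551 patch:

import java.io.IOException;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

// Hypothetical helper: without -p, refuse to create a directory whose parent
// does not already exist; with -p, fall through to mkdirs(), which creates
// any missing parents.
class MkdirSemantics {
  static void mkdir(FileSystem fs, Path dir, boolean createParents)
      throws IOException {
    Path parent = dir.getParent();
    if (!createParents && parent != null && !fs.exists(parent)) {
      throw new IOException("mkdir: " + parent + ": No such file or directory");
    }
    if (!fs.mkdirs(dir)) {
      throw new IOException("mkdir: failed to create " + dir);
    }
  }
}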

 fs -mkdir creates parent directories without the -p option
 --

 Key: HADOOP-8551
 URL: https://issues.apache.org/jira/browse/HADOOP-8551
 Project: Hadoop Common
  Issue Type: Bug
  Components: fs
Affects Versions: 0.23.3, 2.1.0-alpha, 3.0.0
Reporter: Robert Joseph Evans
Assignee: Daryn Sharp
 Fix For: 0.23.3, 3.0.0, 2.2.0-alpha

 Attachments: HADOOP-8551.patch, HADOOP-8551.patch


 hadoop fs -mkdir foo/bar will work even if foo is not present. It should
 only work if -p is given and foo is not present.





Common and hdfs Jenkins jobs now running tests

2012-07-19 Thread Eli Collins
Hey gang,

The tests were disabled on the common and HDFS Jenkins jobs for some
reason. This was hiding failures in tests that are not run by
test-patch (e.g. hadoop-dist; see HDFS-3690).

I've re-enabled the tests on these jobs and filed HADOOP-8610 to get
test-patch on Hadoop to run the root projects (e.g. hadoop-tools).

Thanks,
Eli


Build failed in Jenkins: Hadoop-Common-trunk #476

2012-07-19 Thread Apache Jenkins Server
See https://builds.apache.org/job/Hadoop-Common-trunk/476/

--
[...truncated 18472 lines...]
[DEBUG]   (s) reportsDirectory = 
https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/hadoop-auth-examples/target/surefire-reports
[DEBUG]   (s) runOrder = filesystem
[DEBUG]   (s) session = org.apache.maven.execution.MavenSession@1b6a053
[DEBUG]   (s) skip = false
[DEBUG]   (s) skipTests = false
[DEBUG]   (s) systemPropertyVariables = {hadoop.log.dir=null, 
hadoop.tmp.dir=null, java.net.preferIPv4Stack=true, 
java.security.egd=file:///dev/urandom, 
java.security.krb5.conf=https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/hadoop-auth-examples/src/test/resources/krb5.conf,
 test.build.classes=null, 
test.build.data=https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/hadoop-auth-examples/target/test-dir,
 
test.build.dir=https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/hadoop-auth-examples/target/test-dir,
 test.build.webapps=null, test.cache.data=null}
[DEBUG]   (s) testClassesDirectory = 
https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/hadoop-auth-examples/target/test-classes
[DEBUG]   (s) testFailureIgnore = false
[DEBUG]   (s) testNGArtifactName = org.testng:testng
[DEBUG]   (s) testSourceDirectory = 
https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/hadoop-auth-examples/src/test/java
[DEBUG]   (s) trimStackTrace = true
[DEBUG]   (s) useFile = true
[DEBUG]   (s) useManifestOnlyJar = true
[DEBUG]   (s) useSystemClassLoader = true
[DEBUG]   (s) useUnlimitedThreads = false
[DEBUG]   (s) workingDirectory = 
https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/hadoop-auth-examples
[DEBUG] -- end configuration --
[INFO] No tests to run.
[INFO] Surefire report directory: 
https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/hadoop-auth-examples/target/surefire-reports
[DEBUG] dummy:dummy:jar:1.0 (selected for null)
[DEBUG]   org.apache.maven.surefire:surefire-booter:jar:2.12:compile (selected 
for compile)
[DEBUG] org.apache.maven.surefire:surefire-api:jar:2.12:compile (selected 
for compile)
[DEBUG] Adding to surefire booter test classpath: 
/home/jenkins/.m2/repository/org/apache/maven/surefire/surefire-booter/2.12/surefire-booter-2.12.jar
 Scope: compile
[DEBUG] Adding to surefire booter test classpath: 
/home/jenkins/.m2/repository/org/apache/maven/surefire/surefire-api/2.12/surefire-api-2.12.jar
 Scope: compile
[DEBUG] Setting system property [java.net.preferIPv4Stack]=[true]
[DEBUG] Setting system property 
[java.security.krb5.conf]=[https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/hadoop-auth-examples/src/test/resources/krb5.conf]
[DEBUG] Setting system property [tar]=[true]
[DEBUG] Setting system property 
[test.build.data]=[https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/hadoop-auth-examples/target/test-dir]
[DEBUG] Setting system property 
[user.dir]=[https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/hadoop-auth-examples]
[DEBUG] Setting system property [localRepository]=[/home/jenkins/.m2/repository]
[DEBUG] Setting system property 
[test.build.dir]=[https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/hadoop-auth-examples/target/test-dir]
[DEBUG] Setting system property 
[basedir]=[https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/hadoop-auth-examples]
[DEBUG] Setting system property [java.security.egd]=[file:///dev/urandom]
[DEBUG] Using JVM: /home/jenkins/tools/java/jdk1.6.0_26/jre/bin/java
[DEBUG] Setting environment variable 
[LD_LIBRARY_PATH]=[/home/jenkins/tools/java/jdk1.6.0_26/jre/lib/i386/server:/home/jenkins/tools/java/jdk1.6.0_26/jre/lib/i386:/home/jenkins/tools/java/jdk1.6.0_26/jre/../lib/i386:https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/hadoop-auth-examples/target/native/target/usr/local/lib]
[DEBUG] Setting environment variable [MALLOC_ARENA_MAX]=[4]
[DEBUG] dummy:dummy:jar:1.0 (selected for null)
[DEBUG]   org.apache.maven.surefire:surefire-junit3:jar:2.12:test (selected for 
test)
[DEBUG] org.apache.maven.surefire:surefire-api:jar:2.12:test (selected for 
test)
[DEBUG] Adding to surefire test classpath: 
/home/jenkins/.m2/repository/org/apache/maven/surefire/surefire-junit3/2.12/surefire-junit3-2.12.jar
 Scope: test
[DEBUG] Adding to surefire test classpath: 
/home/jenkins/.m2/repository/org/apache/maven/surefire/surefire-api/2.12/surefire-api-2.12.jar
 Scope: test
[DEBUG] test classpath classpath:
[DEBUG]   
https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/hadoop-auth-examples/target/test-classes
[DEBUG]