Build failed in Jenkins: Hadoop-Common-trunk #399

2012-05-07 Thread Apache Jenkins Server
See https://builds.apache.org/job/Hadoop-Common-trunk/399/changes

Changes:

[harsh] HADOOP-8323. Revert HADOOP-7940, cause it may cause a performance 
regression. (harsh)

--
[...truncated 45270 lines...]
[DEBUG]   (f) reactorProjects = [MavenProject: 
org.apache.hadoop:hadoop-annotations:3.0.0-SNAPSHOT @ 
https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/hadoop-annotations/pom.xml,
 MavenProject: org.apache.hadoop:hadoop-auth:3.0.0-SNAPSHOT @ 
https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/hadoop-auth/pom.xml,
 MavenProject: org.apache.hadoop:hadoop-auth-examples:3.0.0-SNAPSHOT @ 
https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/hadoop-auth-examples/pom.xml,
 MavenProject: org.apache.hadoop:hadoop-common:3.0.0-SNAPSHOT @ 
https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/hadoop-common/pom.xml,
 MavenProject: org.apache.hadoop:hadoop-common-project:3.0.0-SNAPSHOT @ 
https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/pom.xml]
[DEBUG]   (f) useDefaultExcludes = true
[DEBUG]   (f) useDefaultManifestFile = false
[DEBUG] -- end configuration --
[INFO] 
[INFO] --- maven-enforcer-plugin:1.0:enforce (dist-enforce) @ 
hadoop-common-project ---
[DEBUG] Configuring mojo 
org.apache.maven.plugins:maven-enforcer-plugin:1.0:enforce from plugin realm 
ClassRealm[plugin>org.apache.maven.plugins:maven-enforcer-plugin:1.0, parent: 
sun.misc.Launcher$AppClassLoader@126b249]
[DEBUG] Configuring mojo 
'org.apache.maven.plugins:maven-enforcer-plugin:1.0:enforce' with basic 
configurator --
[DEBUG]   (s) fail = true
[DEBUG]   (s) failFast = false
[DEBUG]   (f) ignoreCache = false
[DEBUG]   (s) project = MavenProject: 
org.apache.hadoop:hadoop-common-project:3.0.0-SNAPSHOT @ 
https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/pom.xml
[DEBUG]   (s) version = [3.0.2,)
[DEBUG]   (s) version = 1.6
[DEBUG]   (s) rules = 
[org.apache.maven.plugins.enforcer.RequireMavenVersion@fa569e, 
org.apache.maven.plugins.enforcer.RequireJavaVersion@14cc273]
[DEBUG]   (s) session = org.apache.maven.execution.MavenSession@95b8a
[DEBUG]   (s) skip = false
[DEBUG] -- end configuration --
[DEBUG] Executing rule: org.apache.maven.plugins.enforcer.RequireMavenVersion
[DEBUG] Rule org.apache.maven.plugins.enforcer.RequireMavenVersion is cacheable.
[DEBUG] Key org.apache.maven.plugins.enforcer.RequireMavenVersion -937312197 
was found in the cache
[DEBUG] The cached results are still valid. Skipping the rule: 
org.apache.maven.plugins.enforcer.RequireMavenVersion
[DEBUG] Executing rule: org.apache.maven.plugins.enforcer.RequireJavaVersion
[DEBUG] Rule org.apache.maven.plugins.enforcer.RequireJavaVersion is cacheable.
[DEBUG] Key org.apache.maven.plugins.enforcer.RequireJavaVersion 48569 was 
found in the cache
[DEBUG] The cached results are still valid. Skipping the rule: 
org.apache.maven.plugins.enforcer.RequireJavaVersion
[INFO] 
[INFO] --- maven-site-plugin:3.0:attach-descriptor (attach-descriptor) @ 
hadoop-common-project ---
[DEBUG] Configuring mojo 
org.apache.maven.plugins:maven-site-plugin:3.0:attach-descriptor from plugin 
realm ClassRealm[plugin>org.apache.maven.plugins:maven-site-plugin:3.0, parent: 
sun.misc.Launcher$AppClassLoader@126b249]
[DEBUG] Configuring mojo 
'org.apache.maven.plugins:maven-site-plugin:3.0:attach-descriptor' with basic 
configurator --
[DEBUG]   (f) basedir = 
https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project
[DEBUG]   (f) inputEncoding = UTF-8
[DEBUG]   (f) localRepository =id: local
  url: file:///home/jenkins/.m2/repository/
   layout: none

[DEBUG]   (f) outputEncoding = UTF-8
[DEBUG]   (f) pomPackagingOnly = true
[DEBUG]   (f) project = MavenProject: 
org.apache.hadoop:hadoop-common-project:3.0.0-SNAPSHOT @ 
https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/pom.xml
[DEBUG]   (f) reactorProjects = [MavenProject: 
org.apache.hadoop:hadoop-annotations:3.0.0-SNAPSHOT @ 
https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/hadoop-annotations/pom.xml,
 MavenProject: org.apache.hadoop:hadoop-auth:3.0.0-SNAPSHOT @ 
https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/hadoop-auth/pom.xml,
 MavenProject: org.apache.hadoop:hadoop-auth-examples:3.0.0-SNAPSHOT @ 
https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/hadoop-auth-examples/pom.xml,
 MavenProject: org.apache.hadoop:hadoop-common:3.0.0-SNAPSHOT @ 
https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/hadoop-common/pom.xml,
 MavenProject: org.apache.hadoop:hadoop-common-project:3.0.0-SNAPSHOT @ 
https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/pom.xml]
[DEBUG]   (f) siteDirectory = 

Build failed in Jenkins: Hadoop-Common-0.23-Build #245

2012-05-07 Thread Apache Jenkins Server
See https://builds.apache.org/job/Hadoop-Common-0.23-Build/245/

--
[...truncated 12308 lines...]
  [javadoc] Loading source files for package org.apache.hadoop.fs.local...
  [javadoc] Loading source files for package org.apache.hadoop.fs.permission...
  [javadoc] Loading source files for package org.apache.hadoop.fs.s3...
  [javadoc] Loading source files for package org.apache.hadoop.fs.s3native...
  [javadoc] Loading source files for package org.apache.hadoop.fs.shell...
  [javadoc] Loading source files for package org.apache.hadoop.fs.viewfs...
  [javadoc] Loading source files for package org.apache.hadoop.http...
  [javadoc] Loading source files for package org.apache.hadoop.http.lib...
  [javadoc] Loading source files for package org.apache.hadoop.io...
  [javadoc] Loading source files for package org.apache.hadoop.io.compress...
  [javadoc] Loading source files for package 
org.apache.hadoop.io.compress.bzip2...
  [javadoc] Loading source files for package 
org.apache.hadoop.io.compress.lz4...
  [javadoc] Loading source files for package 
org.apache.hadoop.io.compress.snappy...
  [javadoc] Loading source files for package 
org.apache.hadoop.io.compress.zlib...
  [javadoc] Loading source files for package org.apache.hadoop.io.file.tfile...
  [javadoc] Loading source files for package org.apache.hadoop.io.nativeio...
  [javadoc] Loading source files for package org.apache.hadoop.io.retry...
  [javadoc] Loading source files for package org.apache.hadoop.io.serializer...
  [javadoc] Loading source files for package 
org.apache.hadoop.io.serializer.avro...
  [javadoc] Loading source files for package org.apache.hadoop.ipc...
  [javadoc] Loading source files for package org.apache.hadoop.ipc.metrics...
  [javadoc] Loading source files for package org.apache.hadoop.jmx...
  [javadoc] Loading source files for package org.apache.hadoop.log...
  [javadoc] Loading source files for package org.apache.hadoop.log.metrics...
  [javadoc] Loading source files for package org.apache.hadoop.metrics...
  [javadoc] Loading source files for package org.apache.hadoop.metrics.file...
  [javadoc] Loading source files for package 
org.apache.hadoop.metrics.ganglia...
  [javadoc] Loading source files for package org.apache.hadoop.metrics.jvm...
  [javadoc] Loading source files for package org.apache.hadoop.metrics.spi...
  [javadoc] Loading source files for package org.apache.hadoop.metrics.util...
  [javadoc] Loading source files for package org.apache.hadoop.metrics2...
  [javadoc] Loading source files for package 
org.apache.hadoop.metrics2.annotation...
  [javadoc] Loading source files for package 
org.apache.hadoop.metrics2.filter...
  [javadoc] Loading source files for package org.apache.hadoop.metrics2.impl...
  [javadoc] Loading source files for package org.apache.hadoop.metrics2.lib...
  [javadoc] Loading source files for package org.apache.hadoop.metrics2.sink...
  [javadoc] Loading source files for package 
org.apache.hadoop.metrics2.sink.ganglia...
  [javadoc] Loading source files for package 
org.apache.hadoop.metrics2.source...
  [javadoc] Loading source files for package org.apache.hadoop.metrics2.util...
  [javadoc] Loading source files for package org.apache.hadoop.net...
  [javadoc] Loading source files for package org.apache.hadoop.record...
  [javadoc] Loading source files for package 
org.apache.hadoop.record.compiler...
  [javadoc] Loading source files for package 
org.apache.hadoop.record.compiler.ant...
  [javadoc] Loading source files for package 
org.apache.hadoop.record.compiler.generated...
  [javadoc] Loading source files for package org.apache.hadoop.record.meta...
  [javadoc] Loading source files for package org.apache.hadoop.security...
  [javadoc] Loading source files for package 
org.apache.hadoop.security.authorize...
  [javadoc] Loading source files for package org.apache.hadoop.security.token...
  [javadoc] Loading source files for package 
org.apache.hadoop.security.token.delegation...
  [javadoc] Loading source files for package org.apache.hadoop.tools...
  [javadoc] Loading source files for package org.apache.hadoop.util...
  [javadoc] Loading source files for package org.apache.hadoop.util.bloom...
  [javadoc] Loading source files for package org.apache.hadoop.util.hash...
  [javadoc] 2 errors
 [xslt] Processing 
https://builds.apache.org/job/Hadoop-Common-0.23-Build/ws/trunk/hadoop-common-project/hadoop-common/target/findbugsXml.xml
 to 
https://builds.apache.org/job/Hadoop-Common-0.23-Build/ws/trunk/hadoop-common-project/hadoop-common/target/site/findbugs.html
 [xslt] Loading stylesheet 
/home/jenkins/tools/findbugs/latest/src/xsl/default.xsl
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (pre-dist) @ hadoop-common ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] >>> maven-javadoc-plugin:2.8.1:javadoc (default) @ hadoop-common >>>
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (create-testdirs) @ 

[jira] [Created] (HADOOP-8365) Provide ability to disable working sync

2012-05-07 Thread Eli Collins (JIRA)
Eli Collins created HADOOP-8365:
---

 Summary: Provide ability to disable working sync
 Key: HADOOP-8365
 URL: https://issues.apache.org/jira/browse/HADOOP-8365
 Project: Hadoop Common
  Issue Type: Improvement
Affects Versions: 1.1.0
Reporter: Eli Collins


Per HADOOP-8230 there's a request for a flag so sync can be disabled.

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators: 
https://issues.apache.org/jira/secure/ContactAdministrators!default.jspa
For more information on JIRA, see: http://www.atlassian.com/software/jira




[jira] [Created] (HADOOP-8366) Use ProtoBuf for RpcResponseHeader

2012-05-07 Thread Sanjay Radia (JIRA)
Sanjay Radia created HADOOP-8366:


 Summary: Use ProtoBuf for RpcResponseHeader
 Key: HADOOP-8366
 URL: https://issues.apache.org/jira/browse/HADOOP-8366
 Project: Hadoop Common
  Issue Type: Improvement
Affects Versions: HA Branch (HDFS-1623)
Reporter: Sanjay Radia
Assignee: Sanjay Radia
Priority: Blocker








[jira] [Resolved] (HADOOP-7775) RPC Layer improvements to support protocol compatibility

2012-05-07 Thread Sanjay Radia (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-7775?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Sanjay Radia resolved HADOOP-7775.
--

Resolution: Fixed

All subtasks done.

 RPC Layer improvements to support protocol compatibility
 

 Key: HADOOP-7775
 URL: https://issues.apache.org/jira/browse/HADOOP-7775
 Project: Hadoop Common
  Issue Type: Improvement
Reporter: Sanjay Radia
Assignee: Sanjay Radia







[jira] [Created] (HADOOP-8367) ProtoBufRpcEngine's rpc request header does not need declaringClass name

2012-05-07 Thread Sanjay Radia (JIRA)
Sanjay Radia created HADOOP-8367:


 Summary: ProtoBufRpcEngine's rpc request header does not need 
declaringClass name
 Key: HADOOP-8367
 URL: https://issues.apache.org/jira/browse/HADOOP-8367
 Project: Hadoop Common
  Issue Type: Improvement
Affects Versions: 2.0.0
Reporter: Sanjay Radia
Assignee: Sanjay Radia








[jira] [Created] (HADOOP-8368) Use CMake rather than autotools to build native code

2012-05-07 Thread Colin Patrick McCabe (JIRA)
Colin Patrick McCabe created HADOOP-8368:


 Summary: Use CMake rather than autotools to build native code
 Key: HADOOP-8368
 URL: https://issues.apache.org/jira/browse/HADOOP-8368
 Project: Hadoop Common
  Issue Type: Improvement
Reporter: Colin Patrick McCabe
Assignee: Colin Patrick McCabe
Priority: Minor


It would be good to use cmake rather than autotools to build the native (C/C++) 
code in Hadoop.

Rationale:
1. automake depends on shell scripts, which often have problems running on 
different operating systems.  It would be extremely difficult, and perhaps 
impossible, to use autotools under Windows.  Even if it were possible, it might 
require horrible workarounds like installing cygwin.  Even on Linux variants 
like Ubuntu 12.04, there are major build issues because /bin/sh is the Dash 
shell, rather than the Bash shell as it is in other Linux versions.  It is 
currently impossible to build the native code under Ubuntu 12.04 because of 
this problem.

CMake has robust cross-platform support, including Windows.  It does not use 
shell scripts.

2. automake error messages are very confusing.  For example, "autoreconf: 
cannot empty /tmp/ar0.4849: Is a directory" or "Can't locate object method 
path via package Autom4te..." are common error messages.  In order to even 
start debugging automake problems you need to learn shell, m4, sed, and a 
bunch of other things.  With CMake, all you have to learn is the syntax of 
CMakeLists.txt, which is simple.

CMake can do all the stuff autotools can, such as making sure that required 
libraries are installed.  There is a Maven plugin for CMake as well.

3. Different versions of autotools can have very different behaviors.  For 
example, the version installed under openSUSE defaults to putting libraries in 
/usr/local/lib64, whereas the version shipped with Ubuntu 11.04 defaults to 
installing the same libraries under /usr/local/lib.  (This is why the FUSE 
build is currently broken when using OpenSUSE.)  This is another source of 
build failures and complexity.  If things go wrong, you will often get an error 
message which is incomprehensible to normal humans (see point #2).

CMake allows you to specify the minimum_required_version of CMake that a 
particular CMakeLists.txt will accept.  In addition, CMake maintains strict 
backwards compatibility between different versions.  This prevents build bugs 
due to version skew.

4. autoconf, automake, and libtool are large and rather slow.  This adds to 
build time.

For all these reasons, I think we should switch to CMake for compiling native 
(C/C++) code in Hadoop.
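
To make the proposal concrete, here is a minimal sketch of what such a build 
definition might look like. The target and source file names are hypothetical, 
chosen only for illustration; an actual CMakeLists.txt for Hadoop's native code 
would need to cover the real compression and NativeIO sources:

```cmake
# Enforce a minimum CMake version, as described in point 3 above;
# CMake refuses to configure with an older release.
cmake_minimum_required(VERSION 2.6)
project(hadoop-native C)

# Probe for a required library, replacing an autoconf check.
find_library(ZLIB_LIBRARY NAMES z)
if(NOT ZLIB_LIBRARY)
    message(FATAL_ERROR "zlib not found")
endif()

# Build a shared native library from (hypothetical) sources.
add_library(hadoop SHARED src/ZlibDecompressor.c src/NativeIO.c)
target_link_libraries(hadoop ${ZLIB_LIBRARY})
```

Running `cmake . && make` would then replace the autoreconf/configure/make 
sequence, with no generated shell scripts involved.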





Re: Scan Benchmark

2012-05-07 Thread Ravi Prakash
https://issues.apache.org/jira/browse/MAPREDUCE-3524?focusedCommentId=13170564&page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#comment-13170564

On Sat, May 5, 2012 at 6:35 AM, ASHISH ATTARDE ashish.at...@gmail.com wrote:

 Hi,

 Can anyone guide me for Benchmarking Hadoop Cluster?
 I am interested in Scan benchmark.

 Thanks
 -Ashish



Re: Unable to build native binaries

2012-05-07 Thread Trevor Robinson
I'm hitting this issue too, with this configuration:

Apache Maven 3.0.4 (r1232337; 2012-01-17 02:44:56-0600)
Maven home: /usr/local/apache-maven-3.0.4
Java version: 1.7.0_04, vendor: Oracle Corporation
Java home: /usr/lib/jvm/jdk1.7.0_04/jre
Default locale: en_US, platform encoding: ISO-8859-1
OS name: linux, version: 3.2.0-24-generic, arch: amd64, family: unix

The fix for me was to change the scope of hadoop-annotations from
"provided" to "compile":

<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-annotations</artifactId>
  <scope>compile</scope>
</dependency>

Unless anyone knows why this should be "provided" (nothing else is),
I'll file a bug with the patch.

-Trevor

On Sat, Apr 28, 2012 at 1:20 AM, ASHISH ATTARDE ashish.at...@gmail.com wrote:
 Harsh,

 I am using following configuration

 Apache Maven 3.0.4
 Java version: 1.6.0_24, vendor: Sun Microsystems Inc.
 OS name: linux, version: 3.2.0-24-generic-pae, arch: i386, family:
 unix

 -ashish


 On Sat, Apr 28, 2012 at 12:25 AM, Harsh J ha...@cloudera.com wrote:

 Hi,

 What JDK version are you using to compile?

 On Sat, Apr 28, 2012 at 7:53 AM, ASHISH ATTARDE ashish.at...@gmail.com
 wrote:
  My build for native binaries is failing with the following error. I used
  -DskipTests as a few unit tests are also broken.
 
  I am new to Hadoop development, can anyone guide/help me or direct me to
  right direction?
  My source code version is a recent one, as I checked it out from svn only
  this evening.
 
  $ mvn assembly:assembly -Pnative -DskipTests
 
  .
  .
  .
  .
  [INFO] Executed tasks
  [INFO]
  [INFO] --- maven-dependency-plugin:2.1:build-classpath (build-classpath)
 @
  hadoop-auth ---
  [INFO] Skipped writing classpath file
 
 '/home/zylops/TestArena/hadoop-trunk/hadoop-common-project/hadoop-auth/target/classes/mrapp-generated-classpath'.
   No changes found.
  [INFO]
  [INFO] >>> maven-source-plugin:2.1.2:jar (default) @ hadoop-auth >>>
  [INFO]
  [INFO] --- maven-source-plugin:2.1.2:jar (default) @ hadoop-auth ---
  [INFO]
  [INFO] --- maven-jar-plugin:2.3.1:jar (default-jar) @ hadoop-auth ---
  [INFO]
  [INFO] --- maven-site-plugin:3.0:attach-descriptor (attach-descriptor) @
  hadoop-auth ---
  [INFO]
 
  [INFO]
 
  [INFO] Forking Apache Hadoop Auth Examples 3.0.0-SNAPSHOT
  [INFO]
 
  [INFO]
  [INFO] --- maven-antrun-plugin:1.6:run (create-testdirs) @
  hadoop-auth-examples ---
  [INFO] Executing tasks
 
  main:
  [INFO] Executed tasks
  [INFO]
  [INFO] --- maven-dependency-plugin:2.1:build-classpath (build-classpath)
 @
  hadoop-auth-examples ---
  [INFO] Skipped writing classpath file
 
 '/home/zylops/TestArena/hadoop-trunk/hadoop-common-project/hadoop-auth-examples/target/classes/mrapp-generated-classpath'.
   No changes found.
  [INFO]
  [INFO] --- maven-resources-plugin:2.2:resources (default-resources) @
  hadoop-auth-examples ---
  [INFO] Using default encoding to copy filtered resources.
  [INFO]
  [INFO] --- maven-compiler-plugin:2.3.2:compile (default-compile) @
  hadoop-auth-examples ---
  [INFO] Nothing to compile - all classes are up to date
  [INFO]
  [INFO] --- maven-resources-plugin:2.2:testResources
 (default-testResources)
  @ hadoop-auth-examples ---
  [INFO] Using default encoding to copy filtered resources.
  [INFO]
  [INFO] --- maven-compiler-plugin:2.3.2:testCompile (default-testCompile)
 @
  hadoop-auth-examples ---
  [INFO] No sources to compile
  [INFO]
  [INFO] --- maven-surefire-plugin:2.12:test (default-test) @
  hadoop-auth-examples ---
  [INFO] Tests are skipped.
  [INFO]
  [INFO] --- maven-war-plugin:2.1:war (default-war) @ hadoop-auth-examples
 ---
  [INFO] Packaging webapp
  [INFO] Assembling webapp [hadoop-auth-examples] in
 
 [/home/zylops/TestArena/hadoop-trunk/hadoop-common-project/hadoop-auth-examples/target/hadoop-auth-examples-3.0.0-SNAPSHOT]
  [INFO] Processing war project
  [INFO] Copying webapp resources
 
 [/home/zylops/TestArena/hadoop-trunk/hadoop-common-project/hadoop-auth-examples/src/main/webapp]
  [INFO] Webapp assembled in [41 msecs]
  [INFO] Building war:
 
 /home/zylops/TestArena/hadoop-trunk/hadoop-common-project/hadoop-auth-examples/target/hadoop-auth-examples.war
  [INFO] WEB-INF/web.xml already added, skipping
  [INFO]
  [INFO] --- maven-site-plugin:3.0:attach-descriptor (attach-descriptor) @
  hadoop-auth-examples ---
  [INFO]
 
  [INFO]
 
  [INFO] Forking Apache Hadoop Common 3.0.0-SNAPSHOT
  [INFO]
 
  [INFO]
  [INFO] --- maven-antrun-plugin:1.6:run (create-testdirs) @ hadoop-common
 ---
  [INFO] Executing tasks
 
  main:
  [INFO] Executed tasks
  [INFO]
  [INFO] --- build-helper-maven-plugin:1.5:add-source (add-source) @
  hadoop-common ---
  [INFO] Source directory:
 
 /home/zylops/TestArena/hadoop-trunk/hadoop-common-project/hadoop-common/target/generated-sources/java
  added.
  [INFO]
  [INFO] --- build-helper-maven-plugin:1.5:add-test-source
 (add-test-source)
  @ hadoop-common ---
  [INFO] Test Source directory:
 
 

[jira] [Created] (HADOOP-8370) Native build failure: javah: class file for org.apache.hadoop.classification.InterfaceAudience not found

2012-05-07 Thread Trevor Robinson (JIRA)
Trevor Robinson created HADOOP-8370:
---

 Summary: Native build failure: javah: class file for 
org.apache.hadoop.classification.InterfaceAudience not found
 Key: HADOOP-8370
 URL: https://issues.apache.org/jira/browse/HADOOP-8370
 Project: Hadoop Common
  Issue Type: Bug
  Components: native
Affects Versions: 0.23.1
 Environment: Apache Maven 3.0.4 (r1232337; 2012-01-17 02:44:56-0600)
Maven home: /usr/local/apache-maven-3.0.4
Java version: 1.7.0_04, vendor: Oracle Corporation
Java home: /usr/lib/jvm/jdk1.7.0_04/jre
Default locale: en_US, platform encoding: ISO-8859-1
OS name: linux, version: 3.2.0-24-generic, arch: amd64, family: unix
Reporter: Trevor Robinson


[INFO] --- native-maven-plugin:1.0-alpha-7:javah (default) @ hadoop-common ---
[INFO] /bin/sh -c cd /build/hadoop-common/hadoop-common-project/hadoop-common 
&& /usr/lib/jvm/jdk1.7.0_02/bin/javah -d 
/build/hadoop-common/hadoop-common-project/hadoop-common/target/native/javah 
-classpath ... org.apache.hadoop.io.compress.zlib.ZlibDecompressor 
org.apache.hadoop.security.JniBasedUnixGroupsMapping 
org.apache.hadoop.io.nativeio.NativeIO 
org.apache.hadoop.security.JniBasedUnixGroupsNetgroupMapping 
org.apache.hadoop.io.compress.snappy.SnappyCompressor 
org.apache.hadoop.io.compress.snappy.SnappyDecompressor 
org.apache.hadoop.io.compress.lz4.Lz4Compressor 
org.apache.hadoop.io.compress.lz4.Lz4Decompressor 
org.apache.hadoop.util.NativeCrc32
Cannot find annotation method 'value()' in type 
'org.apache.hadoop.classification.InterfaceAudience.LimitedPrivate': class file 
for org.apache.hadoop.classification.InterfaceAudience not found
Cannot find annotation method 'value()' in type 
'org.apache.hadoop.classification.InterfaceAudience.LimitedPrivate'
Error: cannot access org.apache.hadoop.classification.InterfaceStability
  class file for org.apache.hadoop.classification.InterfaceStability not found

The fix for me was to change the scope of hadoop-annotations from
"provided" to "compile" in pom.xml:

<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-annotations</artifactId>
  <scope>compile</scope>
</dependency>

For some reason, it was the only dependency with scope "provided".





[jira] [Resolved] (HADOOP-8348) Server$Listener.getAddress(..) may throw NullPointerException

2012-05-07 Thread Eli Collins (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-8348?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Eli Collins resolved HADOOP-8348.
-

Resolution: Duplicate

This is a dupe of HDFS-3328.

 Server$Listener.getAddress(..) may throw NullPointerException
 -

 Key: HADOOP-8348
 URL: https://issues.apache.org/jira/browse/HADOOP-8348
 Project: Hadoop Common
  Issue Type: Bug
  Components: ipc
Reporter: Tsz Wo (Nicholas), SZE
Assignee: Eli Collins

 [Build 
 #2365|https://builds.apache.org/job/PreCommit-HDFS-Build/2365//testReport/org.apache.hadoop.hdfs/TestHFlush/testHFlushInterrupted/]:
 {noformat}
 Exception in thread "DataXceiver for client /127.0.0.1:35472 [Waiting for 
 operation #2]" java.lang.NullPointerException
   at org.apache.hadoop.ipc.Server$Listener.getAddress(Server.java:669)
   at org.apache.hadoop.ipc.Server.getListenerAddress(Server.java:1988)
   at 
 org.apache.hadoop.hdfs.server.datanode.DataNode.getIpcPort(DataNode.java:882)
   at 
 org.apache.hadoop.hdfs.server.datanode.DataNode.getDisplayName(DataNode.java:863)
   at 
 org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:177)
   at java.lang.Thread.run(Thread.java:662)
 {noformat}
