For more details, see 
https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/392/

[May 2, 2017 2:36:54 PM] (aajisaka) HADOOP-14371. License error in TestLoadBalancingKMSClientProvider.java.
[May 2, 2017 2:52:34 PM] (aajisaka) HADOOP-14367. Remove unused setting from pom.xml. Contributed by Chen
[May 2, 2017 5:51:20 PM] (wang) HADOOP-14369. NetworkTopology calls expensive toString() when logging.
[May 2, 2017 6:49:19 PM] (wang) HADOOP-14281. Fix TestKafkaMetrics#testPutMetrics. Contributed by Alison
[May 2, 2017 8:06:47 PM] (templedf) YARN-6481. Yarn top shows negative container number in FS (Contributed
[May 2, 2017 9:50:51 PM] (jlowe) HADOOP-14306. TestLocalFileSystem tests have very low timeouts.
[May 3, 2017 12:51:28 AM] (rkanter) HADOOP-14352. Make some HttpServer2 SSL properties optional (jzhuge via
[May 3, 2017 1:34:11 AM] (shv) HDFS-11717. Add unit test for HDFS-11709 StandbyCheckpointer should




-1 overall


The following subsystems voted -1:
    findbugs unit


The following subsystems voted -1 but
were configured to be filtered/ignored:
    cc checkstyle javac javadoc pylint shellcheck shelldocs whitespace


The following subsystems are considered long running:
(runtime longer than 1h 0m 0s)
    unit


Specific tests:

    FindBugs :

       module:hadoop-common-project/hadoop-minikdc
       Possible null pointer dereference in
       org.apache.hadoop.minikdc.MiniKdc.delete(File) due to return value of
       called method; dereferenced at MiniKdc.java:[line 368]
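
The minikdc finding is FindBugs' "possible null pointer dereference due to return value of called method" pattern (NP_NULL_ON_SOME_PATH_FROM_RETURN_VALUE): code dereferences a value returned by a method that can return null, commonly File.listFiles(). A minimal sketch of the flagged shape and the usual guard, using a hypothetical recursive-delete helper rather than the actual MiniKdc code:

    import java.io.File;

    public class RecursiveDelete {
      // Flagged shape: File.listFiles() returns null for a non-directory or an
      // unreadable directory, so the enhanced for loop can throw NPE.
      static void deleteUnsafe(File dir) {
        for (File child : dir.listFiles()) {
          if (child.isDirectory()) {
            deleteUnsafe(child);
          }
          child.delete();
        }
      }

      // Usual fix: capture the return value and check it before iterating.
      static void deleteSafe(File dir) {
        File[] children = dir.listFiles();
        if (children != null) {
          for (File child : children) {
            if (child.isDirectory()) {
              deleteSafe(child);
            }
            child.delete();
          }
        }
      }
    }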

    FindBugs :

       module:hadoop-common-project/hadoop-auth
       org.apache.hadoop.security.authentication.server.MultiSchemeAuthenticationHandler.authenticate(HttpServletRequest,
       HttpServletResponse) makes inefficient use of keySet iterator instead
       of entrySet iterator at MultiSchemeAuthenticationHandler.java:[line 192]
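
The hadoop-auth finding is FindBugs' wrong-map-iterator pattern (WMI_WRONG_MAP_ITERATOR): iterating map.keySet() and calling map.get(key) inside the loop performs a second lookup per key, whereas iterating entrySet() yields key and value in one pass. A generic sketch, not the actual MultiSchemeAuthenticationHandler code:

    import java.util.Map;

    public class MapIterationExample {
      // Flagged shape: one extra map lookup per iteration.
      static void printViaKeySet(Map<String, String> handlers) {
        for (String scheme : handlers.keySet()) {
          System.out.println(scheme + " -> " + handlers.get(scheme));
        }
      }

      // Preferred shape: iterate the entries once.
      static void printViaEntrySet(Map<String, String> handlers) {
        for (Map.Entry<String, String> e : handlers.entrySet()) {
          System.out.println(e.getKey() + " -> " + e.getValue());
        }
      }
    }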

    FindBugs :

       module:hadoop-common-project/hadoop-common
       org.apache.hadoop.crypto.CipherSuite.setUnknownValue(int)
       unconditionally sets the field unknownValue at CipherSuite.java:[line 44]
       org.apache.hadoop.crypto.CryptoProtocolVersion.setUnknownValue(int)
       unconditionally sets the field unknownValue at
       CryptoProtocolVersion.java:[line 67]
       Possible null pointer dereference in
       org.apache.hadoop.fs.FileUtil.fullyDeleteOnExit(File) due to return
       value of called method; dereferenced at FileUtil.java:[line 118]
       Possible null pointer dereference in
       org.apache.hadoop.fs.RawLocalFileSystem.handleEmptyDstDirectoryOnWindows(Path,
       File, Path, File) due to return value of called method; dereferenced at
       RawLocalFileSystem.java:[line 387]
       Return value of org.apache.hadoop.fs.permission.FsAction.or(FsAction)
       ignored, but method has no side effect at FTPFileSystem.java:[line 421]
       Useless condition: lazyPersist == true at this point at
       CommandWithDestination.java:[line 502]
       org.apache.hadoop.io.DoubleWritable.compareTo(DoubleWritable)
       incorrectly handles double value at DoubleWritable.java:[line 78]
       org.apache.hadoop.io.DoubleWritable$Comparator.compare(byte[], int, int,
       byte[], int, int) incorrectly handles double value at
       DoubleWritable.java:[line 97]
       org.apache.hadoop.io.FloatWritable.compareTo(FloatWritable) incorrectly
       handles float value at FloatWritable.java:[line 71]
       org.apache.hadoop.io.FloatWritable$Comparator.compare(byte[], int, int,
       byte[], int, int) incorrectly handles float value at
       FloatWritable.java:[line 89]
       Possible null pointer dereference in
       org.apache.hadoop.io.IOUtils.listDirectory(File, FilenameFilter) due to
       return value of called method; dereferenced at IOUtils.java:[line 350]
       org.apache.hadoop.io.erasurecode.ECSchema.toString() makes inefficient
       use of keySet iterator instead of entrySet iterator at
       ECSchema.java:[line 193]
       Possible bad parsing of shift operation in
       org.apache.hadoop.io.file.tfile.Utils$Version.hashCode() at
       Utils.java:[line 398]
       org.apache.hadoop.metrics2.lib.DefaultMetricsFactory.setInstance(MutableMetricsFactory)
       unconditionally sets the field mmfImpl at
       DefaultMetricsFactory.java:[line 49]
       org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.setMiniClusterMode(boolean)
       unconditionally sets the field miniClusterMode at
       DefaultMetricsSystem.java:[line 100]
       Useless object stored in variable seqOs of method
       org.apache.hadoop.security.token.delegation.ZKDelegationTokenSecretManager.addOrUpdateToken(AbstractDelegationTokenIdentifier,
       AbstractDelegationTokenSecretManager$DelegationTokenInformation,
       boolean) at ZKDelegationTokenSecretManager.java:[line 886]
       Bad comparison of nonnegative value with 0 in
       org.apache.hadoop.tracing.TraceAdmin.run(String[]) at
       TraceAdmin.java:[line 169]
       Inconsistent synchronization of
       org.apache.hadoop.util.SysInfoWindows.cpuUsage; locked 50% of time;
       unsynchronized access at SysInfoWindows.java:[line 201]
       Inconsistent synchronization of
       org.apache.hadoop.util.SysInfoWindows.numProcessors; locked 50% of time;
       unsynchronized access at SysInfoWindows.java:[line 174]
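
Among the hadoop-common findings, the repeated "incorrectly handles double/float value" warnings on DoubleWritable and FloatWritable refer to comparators written with < and == on floating-point primitives, which misorder NaN and -0.0. A hypothetical sketch of that shape and the standard Double.compare fix; the real Writable sources are not reproduced here:

    public class DoubleBox implements Comparable<DoubleBox> {
      private final double value;

      public DoubleBox(double value) {
        this.value = value;
      }

      // Flagged shape: both tests are false when either operand is NaN, so NaN
      // compares as "equal" to everything, and -0.0 is not ordered below 0.0.
      public int compareToBroken(DoubleBox other) {
        return value < other.value ? -1 : (value == other.value ? 0 : 1);
      }

      // Standard fix: Double.compare (or Float.compare) defines a total order.
      @Override
      public int compareTo(DoubleBox other) {
        return Double.compare(value, other.value);
      }
    }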

    FindBugs :

       module:hadoop-hdfs-project/hadoop-hdfs
       Possible null pointer dereference in
       org.apache.hadoop.hdfs.qjournal.server.JournalNode.getJournalsStatus()
       due to return value of called method; dereferenced at
       JournalNode.java:[line 300]
       org.apache.hadoop.hdfs.server.common.HdfsServerConstants$StartupOption.setClusterId(String)
       unconditionally sets the field clusterId at
       HdfsServerConstants.java:[line 193]
       org.apache.hadoop.hdfs.server.common.HdfsServerConstants$StartupOption.setForce(int)
       unconditionally sets the field force at
       HdfsServerConstants.java:[line 217]
       org.apache.hadoop.hdfs.server.common.HdfsServerConstants$StartupOption.setForceFormat(boolean)
       unconditionally sets the field isForceFormat at
       HdfsServerConstants.java:[line 229]
       org.apache.hadoop.hdfs.server.common.HdfsServerConstants$StartupOption.setInteractiveFormat(boolean)
       unconditionally sets the field isInteractiveFormat at
       HdfsServerConstants.java:[line 237]
       Possible null pointer dereference in
       org.apache.hadoop.hdfs.server.datanode.DataStorage.linkBlocksHelper(File,
       File, int, HardLink, boolean, File, List) due to return value of called
       method; dereferenced at DataStorage.java:[line 1333]
       Possible null pointer dereference in
       org.apache.hadoop.hdfs.server.namenode.NNStorageRetentionManager.purgeOldLegacyOIVImages(String,
       long) due to return value of called method; dereferenced at
       NNStorageRetentionManager.java:[line 258]
       Possible null pointer dereference in
       org.apache.hadoop.hdfs.server.namenode.NNUpgradeUtil$1.visitFile(Path,
       BasicFileAttributes) due to return value of called method; dereferenced
       at NNUpgradeUtil.java:[line 133]
       Useless condition: argv.length >= 1 at this point at
       DFSAdmin.java:[line 2025]
       Useless condition: numBlocks == -1 at this point at
       ImageLoaderCurrent.java:[line 727]
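
The four "unconditionally sets the field" findings on HdfsServerConstants$StartupOption are FindBugs' enum-field-setter pattern (ME_ENUM_FIELD_SETTER): a public setter on an enum mutates per-constant state, so enum constants, which look like immutable singletons, can be rewritten by any caller. A generic sketch of the shape with a hypothetical enum, not the actual StartupOption:

    public enum Mode {
      FORMAT, REGULAR;

      private String clusterId;   // mutable state shared by every user of the constant

      // Flagged shape: any code can change the field on the shared constant.
      public void setClusterId(String clusterId) {
        this.clusterId = clusterId;
      }

      public String getClusterId() {
        return clusterId;
      }
    }

A common alternative is to carry such per-invocation values in a separate options object passed alongside the enum, rather than mutating the constant itself.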

    FindBugs :

       module:hadoop-hdfs-project/hadoop-hdfs-client
       Possible exposure of partially initialized object in
       org.apache.hadoop.hdfs.DFSClient.initThreadsNumForStripedReads(int) at
       DFSClient.java:[line 2863]
       org.apache.hadoop.hdfs.server.protocol.SlowDiskReports.equals(Object)
       makes inefficient use of keySet iterator instead of entrySet iterator
       at SlowDiskReports.java:[line 105]
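
"Possible exposure of partially initialized object" is typically raised for lazy initialization where a shared field is assigned before the object is fully configured, for example double-checked locking that keeps mutating the instance after publishing it. A generic sketch under that assumption, using a hypothetical lazily built thread pool rather than the actual DFSClient code:

    import java.util.concurrent.LinkedBlockingQueue;
    import java.util.concurrent.ThreadPoolExecutor;
    import java.util.concurrent.TimeUnit;

    public class StripedReadPool {
      private static volatile ThreadPoolExecutor pool;

      // Flagged shape: other threads can observe 'pool' after the assignment
      // but before allowCoreThreadTimeOut(true) has been applied.
      public static void initUnsafe(int threads) {
        if (pool == null) {
          synchronized (StripedReadPool.class) {
            if (pool == null) {
              pool = new ThreadPoolExecutor(threads, threads, 60L,
                  TimeUnit.SECONDS, new LinkedBlockingQueue<Runnable>());
              pool.allowCoreThreadTimeOut(true);   // configured after publication
            }
          }
        }
      }

      // Safer shape: finish configuring a local instance, then publish it.
      public static void initSafe(int threads) {
        if (pool == null) {
          synchronized (StripedReadPool.class) {
            if (pool == null) {
              ThreadPoolExecutor p = new ThreadPoolExecutor(threads, threads, 60L,
                  TimeUnit.SECONDS, new LinkedBlockingQueue<Runnable>());
              p.allowCoreThreadTimeOut(true);
              pool = p;
            }
          }
        }
      }
    }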

    FindBugs :

       module:hadoop-tools/hadoop-azure
       Useless object stored in variable keysToUpdateAsFolder of method
       org.apache.hadoop.fs.azure.NativeAzureFileSystem.mkdirs(Path,
       FsPermission, boolean) at NativeAzureFileSystem.java:[line 2454]
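
"Useless object stored in variable keysToUpdateAsFolder" is FindBugs' useless-object pattern: a collection (or other object) is created and populated, but its contents are never read, so the work is dead. A generic sketch with a hypothetical method, not the actual NativeAzureFileSystem.mkdirs:

    import java.util.ArrayList;
    import java.util.List;

    public class UselessObjectExample {
      // Flagged shape: 'keysToUpdate' is filled but never read afterwards.
      static boolean mkdirs(List<String> ancestors) {
        List<String> keysToUpdate = new ArrayList<>();
        for (String ancestor : ancestors) {
          keysToUpdate.add(ancestor);   // dead work
        }
        return true;
      }
    }

The fix is either to actually use the collected values or to delete the collection together with the loop that fills it.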

    FindBugs :

       module:hadoop-tools/hadoop-gridmix
       org.apache.hadoop.mapred.gridmix.InputStriper$1.compare(Map$Entry,
       Map$Entry) incorrectly handles double value at
       InputStriper.java:[line 136]
       org.apache.hadoop.mapred.gridmix.emulators.resourceusage.TotalHeapUsageEmulatorPlugin$DefaultHeapUsageEmulator.heapSpace
       is a mutable collection which should be package protected at
       TotalHeapUsageEmulatorPlugin.java:[line 132]
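
"... is a mutable collection which should be package protected" (and the plain "is a mutable collection" variant reported for hadoop-sls below) flags a static collection field that code outside the owning class can modify. A generic sketch of the shape and two common remedies, using a hypothetical class rather than the actual TotalHeapUsageEmulatorPlugin:

    import java.util.ArrayList;
    import java.util.Collections;
    import java.util.List;

    public class HeapUsageExample {
      // Flagged shape: any class anywhere can add to or clear this list.
      public static final List<byte[]> HEAP_SPACE = new ArrayList<>();

      // Remedy 1: reduce visibility so only the owning package can touch it.
      static final List<byte[]> heapSpacePkg = new ArrayList<>();

      // Remedy 2: keep the raw collection private and expose a read-only view.
      private static final List<byte[]> heapSpace = new ArrayList<>();

      public static List<byte[]> getHeapSpace() {
        return Collections.unmodifiableList(heapSpace);
      }
    }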

    FindBugs :

       module:hadoop-tools/hadoop-rumen
       Return value of new
       org.apache.hadoop.tools.rumen.datatypes.DefaultDataType(String)
       ignored, but method has no side effect at
       MapReduceJobPropertiesParser.java:[line 211]
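
"Return value of new ... DefaultDataType(String) ignored, but method has no side effect" means an object is constructed and immediately discarded, which accomplishes nothing unless the constructor itself has side effects. A generic sketch with a hypothetical value type, not the actual rumen parser code:

    public class DiscardedResultExample {
      static final class DataType {
        private final String name;

        DataType(String name) {
          this.name = name;
        }

        String getName() {
          return name;
        }
      }

      // Flagged shape: the new object is thrown away immediately.
      static void recordBroken(String key) {
        new DataType(key);   // result ignored, nothing happens
      }

      // Fix: keep (or otherwise use) the constructed value.
      static DataType record(String key) {
        return new DataType(key);
      }
    }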

    FindBugs :

       module:hadoop-tools/hadoop-sls
       org.apache.hadoop.yarn.sls.SLSRunner.simulateInfoMap is a mutable
       collection at SLSRunner.java:[line 116]

    FindBugs :

       module:hadoop-yarn-project/hadoop-yarn/hadoop-yarn-common
       Possible null pointer dereference in
       org.apache.hadoop.yarn.logaggregation.AggregatedLogFormat$LogValue.getPendingLogFilesToUpload(File)
       due to return value of called method; method invoked at
       AggregatedLogFormat.java:[line 318]

    FindBugs :

       module:hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-nodemanager
       Useless object stored in variable removedNullContainers of method
       org.apache.hadoop.yarn.server.nodemanager.NodeStatusUpdaterImpl.removeOrTrackCompletedContainersFromContext(List)
       at NodeStatusUpdaterImpl.java:[line 644]
       org.apache.hadoop.yarn.server.nodemanager.NodeStatusUpdaterImpl.removeVeryOldStoppedContainersFromCache()
       makes inefficient use of keySet iterator instead of entrySet iterator
       at NodeStatusUpdaterImpl.java:[line 721]
       Hard coded reference to an absolute pathname in
       org.apache.hadoop.yarn.server.nodemanager.containermanager.linux.runtime.DockerLinuxContainerRuntime.launchContainer(ContainerRuntimeContext)
       at DockerLinuxContainerRuntime.java:[line 455]
       org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ContainerLocalizer.createStatus()
       makes inefficient use of keySet iterator instead of entrySet iterator
       at ContainerLocalizer.java:[line 334]
       org.apache.hadoop.yarn.server.nodemanager.containermanager.monitor.ContainerMetrics.usageMetrics
       is a mutable collection which should be package protected at
       ContainerMetrics.java:[line 134]
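
"Hard coded reference to an absolute pathname" flags a string literal such as "/some/host/path" baked into the source, which breaks on hosts laid out differently and cannot be overridden in tests. A generic sketch of the shape and the usual configuration-driven alternative; the paths and the configuration key below are hypothetical, not the actual DockerLinuxContainerRuntime code:

    import java.io.File;
    import java.util.Properties;

    public class PathConfigExample {
      // Flagged shape: an absolute path baked into the source.
      static File unsafeSocketDir() {
        return new File("/var/run/container-runtime");   // hypothetical path
      }

      // Usual alternative: read the location from configuration with a default.
      static File configuredSocketDir(Properties conf) {
        String dir = conf.getProperty("runtime.socket.dir",   // hypothetical key
            "/var/run/container-runtime");
        return new File(dir);
      }
    }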

    Failed junit tests :

       hadoop.ha.TestZKFailoverController 
       hadoop.hdfs.server.namenode.TestMetadataVersionOutput 
       hadoop.hdfs.server.namenode.TestDecommissioningStatus 
       hadoop.hdfs.server.namenode.TestStartup 
       hadoop.tools.TestHadoopArchiveLogsRunner 
       hadoop.mapred.gridmix.TestGridmixSubmission 
       hadoop.yarn.server.TestMiniYarnClusterNodeUtilization 
       hadoop.yarn.server.TestContainerManagerSecurity 
      

   cc:

       
https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/392/artifact/out/diff-compile-cc-root.txt
  [4.0K]

   javac:

       
https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/392/artifact/out/diff-compile-javac-root.txt
  [184K]

   checkstyle:

       
https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/392/artifact/out/diff-checkstyle-root.txt
  [17M]

   pylint:

       
https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/392/artifact/out/diff-patch-pylint.txt
  [20K]

   shellcheck:

       
https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/392/artifact/out/diff-patch-shellcheck.txt
  [20K]

   shelldocs:

       
https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/392/artifact/out/diff-patch-shelldocs.txt
  [12K]

   whitespace:

       
https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/392/artifact/out/whitespace-eol.txt
  [12M]
       
https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/392/artifact/out/whitespace-tabs.txt
  [1.2M]

   findbugs:

       
https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/392/artifact/out/branch-findbugs-hadoop-common-project_hadoop-minikdc-warnings.html
  [8.0K]
       
https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/392/artifact/out/branch-findbugs-hadoop-common-project_hadoop-auth-warnings.html
  [8.0K]
       
https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/392/artifact/out/branch-findbugs-hadoop-common-project_hadoop-common-warnings.html
  [28K]
       
https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/392/artifact/out/branch-findbugs-hadoop-hdfs-project_hadoop-hdfs-warnings.html
  [16K]
       
https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/392/artifact/out/branch-findbugs-hadoop-hdfs-project_hadoop-hdfs-client-warnings.html
  [8.0K]
       
https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/392/artifact/out/branch-findbugs-hadoop-tools_hadoop-azure-warnings.html
  [8.0K]
       
https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/392/artifact/out/branch-findbugs-hadoop-tools_hadoop-gridmix-warnings.html
  [8.0K]
       
https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/392/artifact/out/branch-findbugs-hadoop-tools_hadoop-rumen-warnings.html
  [8.0K]
       
https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/392/artifact/out/branch-findbugs-hadoop-tools_hadoop-sls-warnings.html
  [8.0K]
       
https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/392/artifact/out/branch-findbugs-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-common-warnings.html
  [8.0K]
       
https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/392/artifact/out/branch-findbugs-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-nodemanager-warnings.html
  [12K]

   javadoc:

       
https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/392/artifact/out/diff-javadoc-javadoc-root.txt
  [2.2M]

   unit:

       
https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/392/artifact/out/patch-unit-hadoop-common-project_hadoop-common.txt
  [140K]
       
https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/392/artifact/out/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt
  [160K]
       
https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/392/artifact/out/patch-unit-hadoop-tools_hadoop-archive-logs.txt
  [8.0K]
       
https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/392/artifact/out/patch-unit-hadoop-tools_hadoop-gridmix.txt
  [16K]
       
https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/392/artifact/out/patch-unit-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-tests.txt
  [324K]

Powered by Apache Yetus 0.5.0-SNAPSHOT   http://yetus.apache.org