Haohui Mai created HADOOP-10482:
-----------------------------------
Summary: Fix new findbugs warnings in hadoop-common
Key: HADOOP-10482
URL: https://issues.apache.org/jira/browse/HADOOP-10482
Project: Hadoop Common
Issue Type: Sub-task
Reporter: Haohui Mai
The following findbugs warnings need to be fixed. Most of the 97 reports fall into a few recurring patterns (reliance on the default encoding, \n instead of %n in format strings, non-atomic increments of volatile fields, missing switch defaults, and so on); sketches of the typical fixes follow the log.
{noformat}
[INFO] --- findbugs-maven-plugin:2.5.3:check (default-cli) @ hadoop-common ---
[INFO] BugInstance size is 97
[INFO] Error size is 0
[INFO] Total bugs: 97
[INFO] Found reliance on default encoding in
org.apache.hadoop.conf.Configuration.getConfResourceAsReader(String): new
java.io.InputStreamReader(InputStream) ["org.apache.hadoop.conf.Configuration"]
At Configuration.java:[lines 169-2642]
[INFO] Null passed for nonnull parameter of set(String, String) in
org.apache.hadoop.conf.Configuration.setPattern(String, Pattern)
["org.apache.hadoop.conf.Configuration"] At Configuration.java:[lines 169-2642]
[INFO] Format string should use %n rather than \n in
org.apache.hadoop.conf.ReconfigurationServlet.printHeader(PrintWriter, String)
["org.apache.hadoop.conf.ReconfigurationServlet"] At
ReconfigurationServlet.java:[lines 44-234]
[INFO] Format string should use %n rather than \n in
org.apache.hadoop.conf.ReconfigurationServlet.printHeader(PrintWriter, String)
["org.apache.hadoop.conf.ReconfigurationServlet"] At
ReconfigurationServlet.java:[lines 44-234]
[INFO] Found reliance on default encoding in new
org.apache.hadoop.crypto.key.KeyProvider$Metadata(byte[]): new
java.io.InputStreamReader(InputStream)
["org.apache.hadoop.crypto.key.KeyProvider$Metadata"] At
KeyProvider.java:[lines 110-204]
[INFO] Found reliance on default encoding in
org.apache.hadoop.crypto.key.KeyProvider$Metadata.serialize(): new
java.io.OutputStreamWriter(OutputStream)
["org.apache.hadoop.crypto.key.KeyProvider$Metadata"] At
KeyProvider.java:[lines 110-204]
[INFO] Redundant nullcheck of clazz, which is known to be non-null in
org.apache.hadoop.fs.FileSystem.createFileSystem(URI, Configuration)
["org.apache.hadoop.fs.FileSystem"] At FileSystem.java:[lines 89-3017]
[INFO] Unread public/protected field:
org.apache.hadoop.fs.HarFileSystem$Store.endHash
["org.apache.hadoop.fs.HarFileSystem$Store"] At HarFileSystem.java:[lines
492-500]
[INFO] Unread public/protected field:
org.apache.hadoop.fs.HarFileSystem$Store.startHash
["org.apache.hadoop.fs.HarFileSystem$Store"] At HarFileSystem.java:[lines
492-500]
[INFO] Found reliance on default encoding in
org.apache.hadoop.fs.HardLink.createHardLink(File, File): new
java.io.InputStreamReader(InputStream) ["org.apache.hadoop.fs.HardLink"] At
HardLink.java:[lines 51-546]
[INFO] Found reliance on default encoding in
org.apache.hadoop.fs.HardLink.createHardLinkMult(File, String[], File, int):
new java.io.InputStreamReader(InputStream) ["org.apache.hadoop.fs.HardLink"] At
HardLink.java:[lines 51-546]
[INFO] Found reliance on default encoding in
org.apache.hadoop.fs.HardLink.getLinkCount(File): new
java.io.InputStreamReader(InputStream) ["org.apache.hadoop.fs.HardLink"] At
HardLink.java:[lines 51-546]
[INFO] Bad attempt to compute absolute value of signed random integer in
org.apache.hadoop.fs.LocalDirAllocator$AllocatorPerContext.getLocalPathForWrite(String,
long, Configuration, boolean)
["org.apache.hadoop.fs.LocalDirAllocator$AllocatorPerContext"] At
LocalDirAllocator.java:[lines 247-549]
[INFO] Null passed for nonnull parameter of
org.apache.hadoop.conf.Configuration.set(String, String) in
org.apache.hadoop.fs.ftp.FTPFileSystem.initialize(URI, Configuration)
["org.apache.hadoop.fs.ftp.FTPFileSystem"] At FTPFileSystem.java:[lines 51-593]
[INFO] Redundant nullcheck of dirEntries, which is known to be non-null in
org.apache.hadoop.fs.ftp.FTPFileSystem.delete(FTPClient, Path, boolean)
["org.apache.hadoop.fs.ftp.FTPFileSystem"] At FTPFileSystem.java:[lines 51-593]
[INFO] Redundant nullcheck of
org.apache.hadoop.fs.ftp.FTPFileSystem.getFileStatus(FTPClient, Path), which is
known to be non-null in
org.apache.hadoop.fs.ftp.FTPFileSystem.exists(FTPClient, Path)
["org.apache.hadoop.fs.ftp.FTPFileSystem"] At FTPFileSystem.java:[lines 51-593]
[INFO] Found reliance on default encoding in
org.apache.hadoop.fs.shell.Display$AvroFileInputStream.read():
String.getBytes() ["org.apache.hadoop.fs.shell.Display$AvroFileInputStream"] At
Display.java:[lines 259-309]
[INFO] Format string should use %n rather than \n in
org.apache.hadoop.fs.shell.Display$Checksum.processPath(PathData)
["org.apache.hadoop.fs.shell.Display$Checksum"] At Display.java:[lines 169-196]
[INFO] Format string should use %n rather than \n in
org.apache.hadoop.fs.shell.Display$Checksum.processPath(PathData)
["org.apache.hadoop.fs.shell.Display$Checksum"] At Display.java:[lines 169-196]
[INFO] Found reliance on default encoding in
org.apache.hadoop.fs.shell.Display$TextRecordInputStream.read():
String.getBytes() ["org.apache.hadoop.fs.shell.Display$TextRecordInputStream"]
At Display.java:[lines 207-244]
[INFO] Call to method of static java.text.DateFormat in
org.apache.hadoop.fs.shell.Ls.processPath(PathData)
["org.apache.hadoop.fs.shell.Ls"] At Ls.java:[lines 45-207]
[INFO] org.apache.hadoop.fs.shell.Ls.dateFormat is a static field of type
java.text.DateFormat, which isn't thread safe ["org.apache.hadoop.fs.shell.Ls"]
At Ls.java:[lines 45-207]
[INFO] Call to method of static java.text.DateFormat in
org.apache.hadoop.fs.shell.Stat.processPath(PathData)
["org.apache.hadoop.fs.shell.Stat"] At Stat.java:[lines 46-128]
[INFO] org.apache.hadoop.fs.shell.Stat.timeFmt is a static field of type
java.text.DateFormat, which isn't thread safe
["org.apache.hadoop.fs.shell.Stat"] At Stat.java:[lines 46-128]
[INFO] Switch statement found in
org.apache.hadoop.fs.shell.Test.processPath(PathData) where default case is
missing ["org.apache.hadoop.fs.shell.Test"] At Test.java:[lines 33-94]
[INFO] return value of java.util.concurrent.CountDownLatch.await(long,
TimeUnit) ignored in
org.apache.hadoop.ha.ActiveStandbyElector$WatcherWithClientRef.process(WatchedEvent)
["org.apache.hadoop.ha.ActiveStandbyElector$WatcherWithClientRef"] At
ActiveStandbyElector.java:[lines 1017-1074]
[INFO] Switch statement found in
org.apache.hadoop.ha.SshFenceByTcpPort$LogAdapter.log(int, String) where
default case is missing ["org.apache.hadoop.ha.SshFenceByTcpPort$LogAdapter"]
At SshFenceByTcpPort.java:[lines 273-314]
[INFO] Found reliance on default encoding in
org.apache.hadoop.ha.StreamPumper.pump(): new
java.io.InputStreamReader(InputStream) ["org.apache.hadoop.ha.StreamPumper"] At
StreamPumper.java:[lines 32-89]
[INFO] Found reliance on default encoding in
org.apache.hadoop.http.HtmlQuoting.<static initializer for HtmlQuoting>():
String.getBytes() ["org.apache.hadoop.http.HtmlQuoting"] At
HtmlQuoting.java:[lines 27-206]
[INFO] Found reliance on default encoding in
org.apache.hadoop.http.HtmlQuoting.needsQuoting(String): String.getBytes()
["org.apache.hadoop.http.HtmlQuoting"] At HtmlQuoting.java:[lines 27-206]
[INFO] Found reliance on default encoding in
org.apache.hadoop.http.HtmlQuoting.quoteHtmlChars(String):
java.io.ByteArrayOutputStream.toString() ["org.apache.hadoop.http.HtmlQuoting"]
At HtmlQuoting.java:[lines 27-206]
[INFO] Found reliance on default encoding in
org.apache.hadoop.http.HtmlQuoting.quoteHtmlChars(String): String.getBytes()
["org.apache.hadoop.http.HtmlQuoting"] At HtmlQuoting.java:[lines 27-206]
[INFO] Found reliance on default encoding in
org.apache.hadoop.io.DefaultStringifier.toString(Object): new String(byte[])
["org.apache.hadoop.io.DefaultStringifier"] At DefaultStringifier.java:[lines
60-202]
[INFO]
org.apache.hadoop.io.LongWritable$DecreasingComparator.compare(WritableComparable,
WritableComparable) negates the return value of
org.apache.hadoop.io.WritableComparator.compare(WritableComparable,
WritableComparable) ["org.apache.hadoop.io.LongWritable$DecreasingComparator"]
At LongWritable.java:[lines 98-106]
[INFO] org.apache.hadoop.io.LongWritable$DecreasingComparator.compare(byte[],
int, int, byte[], int, int) negates the return value of
org.apache.hadoop.io.LongWritable$Comparator.compare(byte[], int, int, byte[],
int, int) ["org.apache.hadoop.io.LongWritable$DecreasingComparator"] At
LongWritable.java:[lines 98-106]
[INFO] Found reliance on default encoding in new
org.apache.hadoop.io.SequenceFile$Writer(Configuration,
SequenceFile$Writer$Option[]): String.getBytes()
["org.apache.hadoop.io.SequenceFile$Writer"] At SequenceFile.java:[lines
822-1361]
[INFO] Found reliance on default encoding in new
org.apache.hadoop.io.SequenceFile$Writer(FileSystem, Configuration, Path,
Class, Class): String.getBytes() ["org.apache.hadoop.io.SequenceFile$Writer"]
At SequenceFile.java:[lines 822-1361]
[INFO] Found reliance on default encoding in new
org.apache.hadoop.io.SequenceFile$Writer(FileSystem, Configuration, Path,
Class, Class, int, short, long, Progressable, SequenceFile$Metadata):
String.getBytes() ["org.apache.hadoop.io.SequenceFile$Writer"] At
SequenceFile.java:[lines 822-1361]
[INFO] Found reliance on default encoding in new
org.apache.hadoop.io.SequenceFile$Writer(FileSystem, Configuration, Path,
Class, Class, Progressable, SequenceFile$Metadata): String.getBytes()
["org.apache.hadoop.io.SequenceFile$Writer"] At SequenceFile.java:[lines
822-1361]
[INFO] Switch statement found in
org.apache.hadoop.io.Text.bytesToCodePoint(ByteBuffer) where default case is
missing ["org.apache.hadoop.io.Text"] At Text.java:[lines 56-672]
[INFO] Switch statement found in org.apache.hadoop.io.Text.validateUTF8(byte[],
int, int) where default case is missing ["org.apache.hadoop.io.Text"] At
Text.java:[lines 56-672]
[INFO] Found reliance on default encoding in
org.apache.hadoop.io.compress.BZip2Codec$BZip2CompressionInputStream.readStreamHeader():
new String(byte[])
["org.apache.hadoop.io.compress.BZip2Codec$BZip2CompressionInputStream"] At
BZip2Codec.java:[lines 361-539]
[INFO] Found reliance on default encoding in
org.apache.hadoop.io.compress.BZip2Codec$BZip2CompressionOutputStream.writeStreamHeader():
String.getBytes()
["org.apache.hadoop.io.compress.BZip2Codec$BZip2CompressionOutputStream"] At
BZip2Codec.java:[lines 273-335]
[INFO] Write to static field
org.apache.hadoop.io.compress.bzip2.CBZip2InputStream.skipDecompression from
instance method
org.apache.hadoop.io.compress.bzip2.CBZip2InputStream.read(byte[], int, int)
["org.apache.hadoop.io.compress.bzip2.CBZip2InputStream"] At
CBZip2InputStream.java:[lines 82-1173]
[INFO] Format string should use %n rather than \n in
org.apache.hadoop.io.file.tfile.TFile.main(String[])
["org.apache.hadoop.io.file.tfile.TFile"] At TFile.java:[lines 134-2361]
[INFO] Found reliance on default encoding in
org.apache.hadoop.io.file.tfile.TFileDumper.dumpInfo(String, PrintStream,
Configuration): new String(byte[], int, int)
["org.apache.hadoop.io.file.tfile.TFileDumper"] At TFileDumper.java:[lines
43-294]
[INFO] Format string should use %n rather than \n in
org.apache.hadoop.io.file.tfile.TFileDumper.dumpInfo(String, PrintStream,
Configuration) ["org.apache.hadoop.io.file.tfile.TFileDumper"] At
TFileDumper.java:[lines 43-294]
[INFO] Format string should use %n rather than \n in
org.apache.hadoop.io.file.tfile.TFileDumper.dumpInfo(String, PrintStream,
Configuration) ["org.apache.hadoop.io.file.tfile.TFileDumper"] At
TFileDumper.java:[lines 43-294]
[INFO] Format string should use %n rather than \n in
org.apache.hadoop.io.file.tfile.TFileDumper.dumpInfo(String, PrintStream,
Configuration) ["org.apache.hadoop.io.file.tfile.TFileDumper"] At
TFileDumper.java:[lines 43-294]
[INFO] Format string should use %n rather than \n in
org.apache.hadoop.io.file.tfile.TFileDumper.dumpInfo(String, PrintStream,
Configuration) ["org.apache.hadoop.io.file.tfile.TFileDumper"] At
TFileDumper.java:[lines 43-294]
[INFO] Found reliance on default encoding in
org.apache.hadoop.ipc.RpcConstants.<static initializer for RpcConstants>():
String.getBytes() ["org.apache.hadoop.ipc.RpcConstants"] At
RpcConstants.java:[lines 26-56]
[INFO] Found reliance on default encoding in
org.apache.hadoop.ipc.Server.<static initializer for Server>():
String.getBytes() ["org.apache.hadoop.ipc.Server"] At Server.java:[lines
133-2685]
[INFO] org.apache.hadoop.ipc.Server$Connection.close() might ignore
java.lang.Exception ["org.apache.hadoop.ipc.Server$Connection",
"java.lang.Exception"] At Server.java:[lines 1101-2053]At Exception.java:[lines
54-123]
[INFO] Found reliance on default encoding in
org.apache.hadoop.ipc.Server$Connection.setupHttpRequestOnIpcPortResponse():
String.getBytes() ["org.apache.hadoop.ipc.Server$Connection"] At
Server.java:[lines 1101-2053]
[INFO] Increment of volatile field
org.apache.hadoop.ipc.Server$Connection.rpcCount in
org.apache.hadoop.ipc.Server$Connection.decRpcCount()
["org.apache.hadoop.ipc.Server$Connection"] At Server.java:[lines 1101-2053]
[INFO] Increment of volatile field
org.apache.hadoop.ipc.Server$Connection.rpcCount in
org.apache.hadoop.ipc.Server$Connection.incRpcCount()
["org.apache.hadoop.ipc.Server$Connection"] At Server.java:[lines 1101-2053]
[INFO] HTTP parameter written to Servlet output in
org.apache.hadoop.jmx.JMXJsonServlet.doGet(HttpServletRequest,
HttpServletResponse) ["org.apache.hadoop.jmx.JMXJsonServlet"] At
JMXJsonServlet.java:[lines 120-422]
[INFO] Found reliance on default encoding in
org.apache.hadoop.log.LogLevel.process(String): new
java.io.InputStreamReader(InputStream) ["org.apache.hadoop.log.LogLevel"] At
LogLevel.java:[lines 38-86]
[INFO] Found reliance on default encoding in
org.apache.hadoop.metrics.file.FileContext.startMonitoring(): new
java.io.FileWriter(File, boolean)
["org.apache.hadoop.metrics.file.FileContext"] At FileContext.java:[lines
58-158]
[INFO] Found reliance on default encoding in
org.apache.hadoop.metrics.file.FileContext.startMonitoring(): new
java.io.PrintWriter(OutputStream)
["org.apache.hadoop.metrics.file.FileContext"] At FileContext.java:[lines
58-158]
[INFO] Found reliance on default encoding in
org.apache.hadoop.metrics.ganglia.GangliaContext.xdr_string(String):
String.getBytes() ["org.apache.hadoop.metrics.ganglia.GangliaContext"] At
GangliaContext.java:[lines 64-254]
[INFO] Redundant nullcheck of units, which is known to be non-null in
org.apache.hadoop.metrics.ganglia.GangliaContext31.emitMetric(String, String,
String) ["org.apache.hadoop.metrics.ganglia.GangliaContext31"] At
GangliaContext31.java:[lines 39-144]
[INFO] Found reliance on default encoding in
org.apache.hadoop.metrics2.impl.MetricsConfig.toString(Configuration):
java.io.ByteArrayOutputStream.toString()
["org.apache.hadoop.metrics2.impl.MetricsConfig"] At MetricsConfig.java:[lines
51-280]
[INFO] Found reliance on default encoding in
org.apache.hadoop.metrics2.impl.MetricsConfig.toString(Configuration): new
java.io.PrintStream(OutputStream)
["org.apache.hadoop.metrics2.impl.MetricsConfig"] At MetricsConfig.java:[lines
51-280]
[INFO] Increment of volatile field
org.apache.hadoop.metrics2.lib.MutableCounterInt.value in
org.apache.hadoop.metrics2.lib.MutableCounterInt.incr()
["org.apache.hadoop.metrics2.lib.MutableCounterInt"] At
MutableCounterInt.java:[lines 35-64]
[INFO] Increment of volatile field
org.apache.hadoop.metrics2.lib.MutableCounterLong.value in
org.apache.hadoop.metrics2.lib.MutableCounterLong.incr()
["org.apache.hadoop.metrics2.lib.MutableCounterLong"] At
MutableCounterLong.java:[lines 36-65]
[INFO] Increment of volatile field
org.apache.hadoop.metrics2.lib.MutableGaugeInt.value in
org.apache.hadoop.metrics2.lib.MutableGaugeInt.decr()
["org.apache.hadoop.metrics2.lib.MutableGaugeInt"] At
MutableGaugeInt.java:[lines 36-89]
[INFO] Increment of volatile field
org.apache.hadoop.metrics2.lib.MutableGaugeInt.value in
org.apache.hadoop.metrics2.lib.MutableGaugeInt.incr()
["org.apache.hadoop.metrics2.lib.MutableGaugeInt"] At
MutableGaugeInt.java:[lines 36-89]
[INFO] Increment of volatile field
org.apache.hadoop.metrics2.lib.MutableGaugeLong.value in
org.apache.hadoop.metrics2.lib.MutableGaugeLong.decr()
["org.apache.hadoop.metrics2.lib.MutableGaugeLong"] At
MutableGaugeLong.java:[lines 36-89]
[INFO] Increment of volatile field
org.apache.hadoop.metrics2.lib.MutableGaugeLong.value in
org.apache.hadoop.metrics2.lib.MutableGaugeLong.incr()
["org.apache.hadoop.metrics2.lib.MutableGaugeLong"] At
MutableGaugeLong.java:[lines 36-89]
[INFO] Found reliance on default encoding in
org.apache.hadoop.metrics2.sink.FileSink.init(SubsetConfiguration): new
java.io.FileWriter(File, boolean) ["org.apache.hadoop.metrics2.sink.FileSink"]
At FileSink.java:[lines 39-83]
[INFO] Found reliance on default encoding in
org.apache.hadoop.metrics2.sink.FileSink.init(SubsetConfiguration): new
java.io.PrintWriter(OutputStream) ["org.apache.hadoop.metrics2.sink.FileSink"]
At FileSink.java:[lines 39-83]
[INFO] Found reliance on default encoding in
org.apache.hadoop.metrics2.sink.ganglia.AbstractGangliaSink.xdr_string(String):
String.getBytes()
["org.apache.hadoop.metrics2.sink.ganglia.AbstractGangliaSink"] At
AbstractGangliaSink.java:[lines 45-289]
[INFO] Sequence of calls to java.util.concurrent.ConcurrentHashMap may not be
atomic in org.apache.hadoop.net.NetUtils.canonicalizeHost(String)
["org.apache.hadoop.net.NetUtils"] At NetUtils.java:[lines 63-905]
[INFO] Found reliance on default encoding in
org.apache.hadoop.net.TableMapping$RawTableMapping.load(): new
java.io.FileReader(String)
["org.apache.hadoop.net.TableMapping$RawTableMapping"] At
TableMapping.java:[lines 85-171]
[INFO] Found reliance on default encoding in
org.apache.hadoop.record.compiler.CGenerator.genCode(String, ArrayList,
ArrayList, String, ArrayList): new java.io.FileWriter(String)
["org.apache.hadoop.record.compiler.CGenerator"] At CGenerator.java:[lines
32-71]
[INFO] Found reliance on default encoding in
org.apache.hadoop.record.compiler.CppGenerator.genCode(String, ArrayList,
ArrayList, String, ArrayList): new java.io.FileWriter(String)
["org.apache.hadoop.record.compiler.CppGenerator"] At CppGenerator.java:[lines
32-74]
[INFO] Found reliance on default encoding in
org.apache.hadoop.record.compiler.JRecord$JavaRecord.genCode(String,
ArrayList): new java.io.FileWriter(File)
["org.apache.hadoop.record.compiler.JRecord$JavaRecord"] At JRecord.java:[lines
42-478]
[INFO] Found reliance on default encoding in
org.apache.hadoop.security.AuthenticationFilterInitializer.initFilter(FilterContainer,
Configuration): new java.io.FileReader(String)
["org.apache.hadoop.security.AuthenticationFilterInitializer"] At
AuthenticationFilterInitializer.java:[lines 46-112]
[INFO] Found reliance on default encoding in
org.apache.hadoop.security.Credentials.<static initializer for Credentials>():
String.getBytes() ["org.apache.hadoop.security.Credentials"] At
Credentials.java:[lines 58-323]
[INFO] Found reliance on default encoding in
org.apache.hadoop.security.LdapGroupsMapping.extractPassword(String): new
java.io.FileReader(String) ["org.apache.hadoop.security.LdapGroupsMapping"] At
LdapGroupsMapping.java:[lines 71-360]
[INFO] Found reliance on default encoding in
org.apache.hadoop.security.SaslRpcServer.decodeIdentifier(String):
String.getBytes() ["org.apache.hadoop.security.SaslRpcServer"] At
SaslRpcServer.java:[lines 65-214]
[INFO] Found reliance on default encoding in
org.apache.hadoop.security.SaslRpcServer.encodeIdentifier(byte[]): new
String(byte[]) ["org.apache.hadoop.security.SaslRpcServer"] At
SaslRpcServer.java:[lines 65-214]
[INFO] Found reliance on default encoding in
org.apache.hadoop.security.SaslRpcServer.encodePassword(byte[]): new
String(byte[]) ["org.apache.hadoop.security.SaslRpcServer"] At
SaslRpcServer.java:[lines 65-214]
[INFO] Switch statement found in new
org.apache.hadoop.util.ComparableVersion$StringItem(String, boolean) where
default case is missing ["org.apache.hadoop.util.ComparableVersion$StringItem"]
At ComparableVersion.java:[lines 162-257]
[INFO] Found reliance on default encoding in
org.apache.hadoop.util.HostsFileReader.readFileToSetWithFileInputStream(String,
String, InputStream, Set): new java.io.InputStreamReader(InputStream)
["org.apache.hadoop.util.HostsFileReader"] At HostsFileReader.java:[lines
41-176]
[INFO] Format string should use %n rather than \n in
org.apache.hadoop.util.NativeLibraryChecker.main(String[])
["org.apache.hadoop.util.NativeLibraryChecker"] At
NativeLibraryChecker.java:[lines 32-97]
[INFO] Format string should use %n rather than \n in
org.apache.hadoop.util.NativeLibraryChecker.main(String[])
["org.apache.hadoop.util.NativeLibraryChecker"] At
NativeLibraryChecker.java:[lines 32-97]
[INFO] Format string should use %n rather than \n in
org.apache.hadoop.util.NativeLibraryChecker.main(String[])
["org.apache.hadoop.util.NativeLibraryChecker"] At
NativeLibraryChecker.java:[lines 32-97]
[INFO] Format string should use %n rather than \n in
org.apache.hadoop.util.NativeLibraryChecker.main(String[])
["org.apache.hadoop.util.NativeLibraryChecker"] At
NativeLibraryChecker.java:[lines 32-97]
[INFO] Format string should use %n rather than \n in
org.apache.hadoop.util.NativeLibraryChecker.main(String[])
["org.apache.hadoop.util.NativeLibraryChecker"] At
NativeLibraryChecker.java:[lines 32-97]
[INFO] org.apache.hadoop.util.PrintJarMainClass.main(String[]) may fail to
close stream ["org.apache.hadoop.util.PrintJarMainClass"] At
PrintJarMainClass.java:[lines 31-54]
[INFO] Switch statement found in
org.apache.hadoop.util.PureJavaCrc32.update(byte[], int, int) where default
case is missing ["org.apache.hadoop.util.PureJavaCrc32"] At
PureJavaCrc32.java:[lines 45-118]
[INFO] Switch statement found in
org.apache.hadoop.util.PureJavaCrc32C.update(byte[], int, int) where default
case is missing ["org.apache.hadoop.util.PureJavaCrc32C"] At
PureJavaCrc32C.java:[lines 41-115]
[INFO] Found reliance on default encoding in
org.apache.hadoop.util.ReflectionUtils.logThreadInfo(Log, String, long):
java.io.ByteArrayOutputStream.toString()
["org.apache.hadoop.util.ReflectionUtils"] At ReflectionUtils.java:[lines
52-339]
[INFO] Found reliance on default encoding in
org.apache.hadoop.util.ReflectionUtils.logThreadInfo(Log, String, long): new
java.io.PrintWriter(OutputStream) ["org.apache.hadoop.util.ReflectionUtils"] At
ReflectionUtils.java:[lines 52-339]
[INFO] Found reliance on default encoding in
org.apache.hadoop.util.Shell.runCommand(): new
java.io.InputStreamReader(InputStream) ["org.apache.hadoop.util.Shell"] At
Shell.java:[lines 45-753]
{noformat}
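
The biggest group is "Found reliance on default encoding": the reader/writer and String conversions should name their charset explicitly instead of inheriting the platform default. A minimal sketch of the usual fix (method names here are illustrative; {{StandardCharsets}} assumes Java 7, Guava's {{Charsets.UTF_8}} is the Java 6 equivalent):

{code:java}
import java.io.InputStream;
import java.io.InputStreamReader;
import java.io.Reader;
import java.nio.charset.StandardCharsets;

public class ExplicitCharsetSketch {
  // Before: new InputStreamReader(in) decodes with the platform-default
  // charset, which differs across JVMs and locales.
  static Reader open(InputStream in) {
    return new InputStreamReader(in, StandardCharsets.UTF_8);
  }

  // Before: value.getBytes() encodes with the platform default.
  static byte[] encode(String value) {
    return value.getBytes(StandardCharsets.UTF_8);
  }
}
{code}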
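
The "Format string should use %n rather than \n" warnings (Display$Checksum, TFile, TFileDumper, NativeLibraryChecker, ReconfigurationServlet) are mechanical: in format strings, %n expands to the platform line separator while \n always emits a bare linefeed. Sketch:

{code:java}
import java.io.PrintStream;

public class FormatNewlineSketch {
  static void printChecksum(PrintStream out, String path, String checksum) {
    // Before: out.printf("%s\t%s\n", path, checksum);
    out.printf("%s\t%s%n", path, checksum); // %n is the portable newline
  }
}
{code}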
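
The "Increment of volatile field" warnings (Server$Connection.rpcCount, the metrics2 mutable counters and gauges) flag that {{value++}} on a volatile is a non-atomic read-modify-write, so concurrent increments can be lost. One fix, sketched with an illustrative field, is to move to the atomic classes:

{code:java}
import java.util.concurrent.atomic.AtomicLong;

public class VolatileIncrementSketch {
  // Before: private volatile long value; ... value++; // lost-update race
  private final AtomicLong value = new AtomicLong();

  public void incr() {
    value.incrementAndGet(); // atomic
  }

  public long get() {
    return value.get();
  }
}
{code}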
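
The "static field of type java.text.DateFormat, which isn't thread safe" warnings on Ls.dateFormat and Stat.timeFmt exist because a shared DateFormat can produce corrupt output under concurrent use. A per-thread instance via ThreadLocal (the pattern string below is illustrative) is the standard workaround:

{code:java}
import java.text.SimpleDateFormat;
import java.util.Date;

public class ThreadSafeDateFormatSketch {
  private static final ThreadLocal<SimpleDateFormat> FORMAT =
      new ThreadLocal<SimpleDateFormat>() {
        @Override
        protected SimpleDateFormat initialValue() {
          return new SimpleDateFormat("yyyy-MM-dd HH:mm");
        }
      };

  static String format(Date d) {
    return FORMAT.get().format(d); // each thread formats with its own copy
  }
}
{code}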
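
The "Switch statement found ... where default case is missing" warnings (Test.processPath, SshFenceByTcpPort$LogAdapter, Text, ComparableVersion$StringItem, PureJavaCrc32/PureJavaCrc32C) just want the unmatched case made explicit. Where the switch is intentionally exhaustive, a default that fails fast documents that; where falling through is fine, a commented {{default: break;}} suffices. Sketch:

{code:java}
public class SwitchDefaultSketch {
  static String describe(char flag) {
    switch (flag) {
      case 'e': return "exists";
      case 'd': return "is a directory";
      case 'z': return "has zero length";
      default:
        // Unreachable if the caller validated the flag; say so explicitly.
        throw new IllegalArgumentException("unexpected flag: " + flag);
    }
  }
}
{code}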
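
The LocalDirAllocator warning ("Bad attempt to compute absolute value of signed random integer") is a real bug: Math.abs(Integer.MIN_VALUE) is still Integer.MIN_VALUE, so {{Math.abs(r.nextInt()) % n}} can go negative. Random.nextInt(bound) avoids the edge case entirely:

{code:java}
import java.util.Random;

public class RandomDirSketch {
  static int pickDir(Random r, int numDirs) {
    // Before: int dirNum = Math.abs(r.nextInt()) % numDirs;
    // When nextInt() returns Integer.MIN_VALUE, dirNum is negative.
    return r.nextInt(numDirs); // always in [0, numDirs)
  }
}
{code}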
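
Similarly, LongWritable$DecreasingComparator negating the ascending comparator's result overflows when that result is Integer.MIN_VALUE. Swapping the arguments reverses the ordering without negation; sketched here against a plain Comparator:

{code:java}
import java.util.Comparator;

public class DecreasingComparatorSketch {
  static <T> Comparator<T> reversed(final Comparator<T> asc) {
    return new Comparator<T>() {
      @Override
      public int compare(T a, T b) {
        // Before: return -asc.compare(a, b); // wrong for Integer.MIN_VALUE
        return asc.compare(b, a);
      }
    };
  }
}
{code}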
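
The "Null passed for nonnull parameter of set(String, String)" warnings (Configuration.setPattern, FTPFileSystem.initialize) need an explicit null branch, since Configuration.set rejects null values. One plausible shape, assuming clearing the key is the desired behavior:

{code:java}
import java.util.regex.Pattern;
import org.apache.hadoop.conf.Configuration;

public class NullGuardSketch {
  static void setPattern(Configuration conf, String name, Pattern pattern) {
    if (pattern == null) {
      conf.unset(name); // drop the key instead of passing null to set()
    } else {
      conf.set(name, pattern.pattern());
    }
  }
}
{code}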
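
In ActiveStandbyElector$WatcherWithClientRef.process, the ignored return value of CountDownLatch.await(long, TimeUnit) is the only signal that the wait timed out, so it should be checked. Sketch with illustrative names:

{code:java}
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.TimeUnit;

public class AwaitResultSketch {
  static boolean waitForClientRef(CountDownLatch hasSetZooKeeper, long timeoutMs)
      throws InterruptedException {
    if (!hasSetZooKeeper.await(timeoutMs, TimeUnit.MILLISECONDS)) {
      System.err.println("timed out waiting for the ZooKeeper client reference");
      return false;
    }
    return true;
  }
}
{code}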
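
PrintJarMainClass.main "may fail to close stream": the JarFile leaks if anything throws before close(). A try/finally (or try-with-resources on Java 7) guarantees cleanup:

{code:java}
import java.io.IOException;
import java.util.jar.JarFile;
import java.util.jar.Manifest;

public class CloseStreamSketch {
  static String mainClassOf(String jarPath) throws IOException {
    JarFile jar = new JarFile(jarPath);
    try {
      Manifest mf = jar.getManifest();
      return mf == null ? null : mf.getMainAttributes().getValue("Main-Class");
    } finally {
      jar.close(); // runs even if reading the manifest throws
    }
  }
}
{code}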
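
The NetUtils.canonicalizeHost warning ("Sequence of calls to java.util.concurrent.ConcurrentHashMap may not be atomic") points at a check-then-put race: each call is thread-safe, but the sequence is not. putIfAbsent collapses it into one atomic step; sketch with an illustrative cache:

{code:java}
import java.util.concurrent.ConcurrentHashMap;

public class CanonicalizeSketch {
  private static final ConcurrentHashMap<String, String> cache =
      new ConcurrentHashMap<String, String>();

  static String canonicalize(String host) {
    String canonical = host.toLowerCase(); // stand-in for real resolution
    // Before: containsKey() + put() + get(), racy between the calls.
    String existing = cache.putIfAbsent(host, canonical);
    return existing != null ? existing : canonical;
  }
}
{code}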
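
Server$Connection.close() "might ignore java.lang.Exception": silently swallowed close failures hide shutdown bugs. Logging at debug keeps the best-effort semantics while leaving a trail (the Log interface below stands in for whatever logger the class uses):

{code:java}
import java.io.Closeable;

public class IgnoredExceptionSketch {
  interface Log { void debug(String msg, Throwable t); }

  static void closeQuietly(Closeable c, Log log) {
    try {
      c.close();
    } catch (Exception e) {
      log.debug("ignoring exception while closing connection", e);
    }
  }
}
{code}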
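
CBZip2InputStream.read writes the static skipDecompression field from an instance method, so one stream can flip the flag for every other open stream. Unless the flag is genuinely global, making it per-instance is the fix:

{code:java}
public class InstanceFlagSketch {
  // Before: private static boolean skipDecompression, mutated from
  // read(byte[], int, int) on individual instances.
  private boolean skipDecompression; // per-stream state

  public void setSkipDecompression(boolean skip) {
    this.skipDecompression = skip;
  }
}
{code}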
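
The JMXJsonServlet warning ("HTTP parameter written to Servlet output") is the one security finding: a request parameter echoed back unescaped is a reflected-XSS vector. One option is to quote it with the existing org.apache.hadoop.http.HtmlQuoting helper before writing (the parameter name below is illustrative):

{code:java}
import java.io.PrintWriter;
import javax.servlet.http.HttpServletRequest;
import org.apache.hadoop.http.HtmlQuoting;

public class QuoteParamSketch {
  static void echoQuery(HttpServletRequest request, PrintWriter out) {
    String qry = request.getParameter("qry"); // attacker-controlled
    out.print(HtmlQuoting.quoteHtmlChars(qry)); // never echo it raw
  }
}
{code}

The remaining warnings (the redundant nullchecks in FileSystem, FTPFileSystem and GangliaContext31, and the unread HarFileSystem$Store.startHash/endHash fields) are dead code and can simply be deleted.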