[jira] [Resolved] (HADOOP-10165) TestMetricsSystemImpl#testMultiThreadedPublish occasionally fails

2013-12-15 Thread Akira AJISAKA (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-10165?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Akira AJISAKA resolved HADOOP-10165.


Resolution: Duplicate

Closing this issue as a duplicate.

> TestMetricsSystemImpl#testMultiThreadedPublish occasionally fails
> -
>
> Key: HADOOP-10165
> URL: https://issues.apache.org/jira/browse/HADOOP-10165
> Project: Hadoop Common
>  Issue Type: Bug
>Reporter: Ted Yu
>Priority: Minor
>
> From 
> https://builds.apache.org/job/Hadoop-Common-trunk/982/testReport/junit/org.apache.hadoop.metrics2.impl/TestMetricsSystemImpl/testMultiThreadedPublish/
>  :
> {code}
> Error Message
> Passed
> Passed
> Metric not collected!
> Metric not collected!
> Metric not collected!
> Metric not collected!
> Metric not collected!
> Metric not collected!
> Metric not collected!
> Passed
> Stacktrace
> java.lang.AssertionError: Passed
> Passed
> Metric not collected!
> Metric not collected!
> Metric not collected!
> Metric not collected!
> Metric not collected!
> Metric not collected!
> Metric not collected!
> Passed
>   at org.junit.Assert.fail(Assert.java:93)
>   at org.junit.Assert.assertTrue(Assert.java:43)
>   at 
> org.apache.hadoop.metrics2.impl.TestMetricsSystemImpl.testMultiThreadedPublish(TestMetricsSystemImpl.java:233)
> Standard Output
> 2013-12-15 09:14:49,144 INFO  impl.MetricsConfig 
> (MetricsConfig.java:loadFirst(111)) - loaded properties from 
> hadoop-metrics2-test.properties
> 2013-12-15 09:14:49,146 INFO  impl.MetricsSystemImpl 
> (MetricsSystemImpl.java:startTimer(341)) - Scheduled snapshot period at 80 
> second(s).
> 2013-12-15 09:14:49,146 INFO  impl.MetricsSystemImpl 
> (MetricsSystemImpl.java:start(183)) - Test metrics system started
> 2013-12-15 09:14:49,147 INFO  impl.MetricsSinkAdapter 
> (MetricsSinkAdapter.java:start(190)) - Sink Collector started
> 2013-12-15 09:14:49,147 INFO  impl.MetricsSystemImpl 
> (MetricsSystemImpl.java:registerSink(275)) - Registered sink Collector
> {code}



--
This message was sent by Atlassian JIRA
(v6.1.4#6159)


[jira] [Updated] (HADOOP-10161) In the Hadoop metrics framework, the default value of dmax is 0, which means the gmond process will never delete the metric even after it has disappeared.

2013-12-15 Thread Yang He (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-10161?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang He updated HADOOP-10161:
-

Priority: Major  (was: Minor)

> In the Hadoop metrics framework, the default value of dmax is 0, which means 
> the gmond process will never delete the metric even after it has disappeared.
> ---
>
> Key: HADOOP-10161
> URL: https://issues.apache.org/jira/browse/HADOOP-10161
> Project: Hadoop Common
>  Issue Type: Improvement
>  Components: metrics
>Affects Versions: 2.2.0
>Reporter: Yang He
> Attachments: HADOOP-10161_0_20131211.patch, 
> hadoop-metrics.properties, hadoop-metrics2.properties
>
>
> The dmax property in the Ganglia configuration is the metric dead time: once 
> a metric disappears and no more values for it are emitted to gmond, gmond 
> destroys the metric in memory after 'dmax' seconds. In the Hadoop metrics 
> framework the default value is 0, which means gmond will never destroy the 
> metric even after it has disappeared, the gmetad daemon does not delete the 
> rrdtool file either, and there is no way to configure a default dmax value 
> for all metrics.
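
A minimal sketch of the per-metric workaround currently available, assuming the stock hadoop-metrics2.properties Ganglia sink sample; the metric names and values below are illustrative only. Note that there is still no single property that applies a default dmax to every metric, which is the gap this issue describes.

{code}
# Hypothetical hadoop-metrics2.properties excerpt (based on the stock sample).
*.sink.ganglia.class=org.apache.hadoop.metrics2.sink.ganglia.GangliaSink31
*.sink.ganglia.period=10
# Per-metric dead time in seconds; any metric not listed keeps the default of 0.
*.sink.ganglia.dmax=jvm.metrics.threadsBlocked=70,jvm.metrics.memHeapUsedM=40
{code}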



--
This message was sent by Atlassian JIRA
(v6.1.4#6159)


[jira] [Updated] (HADOOP-10161) In the Hadoop metrics framework, the default value of dmax is 0, which means the gmond process will never delete the metric even after it has disappeared.

2013-12-15 Thread Yang He (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-10161?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang He updated HADOOP-10161:
-

Component/s: (was: fs)
 metrics

> In the Hadoop metrics framework, the default value of dmax is 0, which means 
> the gmond process will never delete the metric even after it has disappeared.
> ---
>
> Key: HADOOP-10161
> URL: https://issues.apache.org/jira/browse/HADOOP-10161
> Project: Hadoop Common
>  Issue Type: Improvement
>  Components: metrics
>Affects Versions: 2.2.0
>Reporter: Yang He
>Priority: Minor
> Attachments: HADOOP-10161_0_20131211.patch, 
> hadoop-metrics.properties, hadoop-metrics2.properties
>
>
> The dmax property in the Ganglia configuration is the metric dead time: once 
> a metric disappears and no more values for it are emitted to gmond, gmond 
> destroys the metric in memory after 'dmax' seconds. In the Hadoop metrics 
> framework the default value is 0, which means gmond will never destroy the 
> metric even after it has disappeared, the gmetad daemon does not delete the 
> rrdtool file either, and there is no way to configure a default dmax value 
> for all metrics.



--
This message was sent by Atlassian JIRA
(v6.1.4#6159)


[jira] [Updated] (HADOOP-10161) In the Hadoop metrics framework, the default value of dmax is 0, which means the gmond process will never delete the metric even after it has disappeared.

2013-12-15 Thread Yang He (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-10161?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang He updated HADOOP-10161:
-

Affects Version/s: 2.2.0

> In the Hadoop metrics framework, the default value of dmax is 0, which means 
> the gmond process will never delete the metric even after it has disappeared.
> ---
>
> Key: HADOOP-10161
> URL: https://issues.apache.org/jira/browse/HADOOP-10161
> Project: Hadoop Common
>  Issue Type: Improvement
>  Components: fs
>Affects Versions: 2.2.0
>Reporter: Yang He
>Priority: Minor
> Attachments: HADOOP-10161_0_20131211.patch, 
> hadoop-metrics.properties, hadoop-metrics2.properties
>
>
> The dmax property in the Ganglia configuration is the metric dead time: once 
> a metric disappears and no more values for it are emitted to gmond, gmond 
> destroys the metric in memory after 'dmax' times. In the Hadoop metrics 
> framework the default value is 0, which means gmond will never destroy the 
> metric even after it has disappeared, the gmetad daemon does not delete the 
> rrdtool file either, and there is no way to configure a default dmax value 
> for all metrics.



--
This message was sent by Atlassian JIRA
(v6.1.4#6159)


[jira] [Updated] (HADOOP-10161) In the Hadoop metrics framework, the default value of dmax is 0, which means the gmond process will never delete the metric even after it has disappeared.

2013-12-15 Thread Yang He (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-10161?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang He updated HADOOP-10161:
-

Description: The dmax property in the Ganglia configuration is the metric dead 
time: once a metric disappears and no more values for it are emitted to gmond, 
gmond destroys the metric in memory after 'dmax' seconds. In the Hadoop metrics 
framework the default value is 0, which means gmond will never destroy the 
metric even after it has disappeared, the gmetad daemon does not delete the 
rrdtool file either, and there is no way to configure a default dmax value for 
all metrics.  (was: The dmax property in the Ganglia configuration is the 
metric dead time: once a metric disappears and no more values for it are 
emitted to gmond, gmond destroys the metric in memory after 'dmax' times. In 
the Hadoop metrics framework the default value is 0, which means gmond will 
never destroy the metric even after it has disappeared, the gmetad daemon does 
not delete the rrdtool file either, and there is no way to configure a default 
dmax value for all metrics.)

> In the Hadoop metrics framework, the default value of dmax is 0, which means 
> the gmond process will never delete the metric even after it has disappeared.
> ---
>
> Key: HADOOP-10161
> URL: https://issues.apache.org/jira/browse/HADOOP-10161
> Project: Hadoop Common
>  Issue Type: Improvement
>  Components: fs
>Affects Versions: 2.2.0
>Reporter: Yang He
>Priority: Minor
> Attachments: HADOOP-10161_0_20131211.patch, 
> hadoop-metrics.properties, hadoop-metrics2.properties
>
>
> The dmax property in the Ganglia configuration is the metric dead time: once 
> a metric disappears and no more values for it are emitted to gmond, gmond 
> destroys the metric in memory after 'dmax' seconds. In the Hadoop metrics 
> framework the default value is 0, which means gmond will never destroy the 
> metric even after it has disappeared, the gmetad daemon does not delete the 
> rrdtool file either, and there is no way to configure a default dmax value 
> for all metrics.



--
This message was sent by Atlassian JIRA
(v6.1.4#6159)


[jira] [Updated] (HADOOP-10167) Making the whole of hadoop-common use UTF-8 in Maven pom files / refactoring

2013-12-15 Thread Mikhail Antonov (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-10167?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Mikhail Antonov updated HADOOP-10167:
-

Description: 
While looking at BIGTOP-831, it turned out that the way Bigtop invokes the 
Maven build / site:site generation causes errors like this:

[ERROR] Exit code: 1 - 
/home/user/jenkins/workspace/BigTop-RPM/label/centos-6-x86_64-HAD-1-buildbot/bigtop-repo/build/hadoop/rpm/BUILD/hadoop-2.0.2-alpha-src/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/metrics2/source/JvmMetricsInfo.java:31:
 error: unmappable character for encoding ANSI_X3.4-1968
[ERROR] JvmMetrics("JVM related metrics etc."), // record info??

Making the whole of hadoop-common use UTF-8 fixes that and seems like a 
generally good thing to me.

Attaching a first version of the patch for review.

The original issue was observed on OpenJDK 7 (x86-64).

  was:
While looking at BIGTOP-831, it turned out that the way Bigtop invokes the 
Maven build / site:site generation causes errors like this:

[ERROR] Exit code: 1 - 
/home/user/jenkins/workspace/BigTop-RPM/label/centos-6-x86_64-HAD-1-buildbot/bigtop-repo/build/hadoop/rpm/BUILD/hadoop-2.0.2-alpha-src/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/metrics2/source/JvmMetricsInfo.java:31:
 error: unmappable character for encoding ANSI_X3.4-1968
[ERROR] JvmMetrics("JVM related metrics etc."), // record info??

Making the whole of hadoop-common use UTF-8 fixes that and seems like a 
generally good thing to me.

Attaching a first version of the patch for review.


> Making the whole of hadoop-common use UTF-8 in Maven pom files / refactoring
> 
>
> Key: HADOOP-10167
> URL: https://issues.apache.org/jira/browse/HADOOP-10167
> Project: Hadoop Common
>  Issue Type: Improvement
>  Components: build
>Affects Versions: 2.0.6-alpha
> Environment: Fedora 19 x86-64
>Reporter: Mikhail Antonov
>  Labels: build
> Attachments: HADOOP-10167-1.patch
>
>
> While looking at BIGTOP-831, it turned out that the way Bigtop invokes the 
> Maven build / site:site generation causes errors like this:
> [ERROR] Exit code: 1 - 
> /home/user/jenkins/workspace/BigTop-RPM/label/centos-6-x86_64-HAD-1-buildbot/bigtop-repo/build/hadoop/rpm/BUILD/hadoop-2.0.2-alpha-src/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/metrics2/source/JvmMetricsInfo.java:31:
>  error: unmappable character for encoding ANSI_X3.4-1968
> [ERROR] JvmMetrics("JVM related metrics etc."), // record info??
> Making the whole of hadoop-common use UTF-8 fixes that and seems like a 
> generally good thing to me.
> Attaching a first version of the patch for review.
> The original issue was observed on OpenJDK 7 (x86-64).
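
For reference, the conventional Maven-level fix is to declare the encoding properties in the project pom; this is a minimal sketch only and may not match the attached HADOOP-10167-1.patch.

{code}
<properties>
  <!-- Encoding used by maven-compiler-plugin, maven-resources-plugin, etc. -->
  <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
  <!-- Encoding used by site/report generation (site:site). -->
  <project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
</properties>
{code}

With these set, the platform encoding (ANSI_X3.4-1968 in the Bigtop build above) no longer decides how sources are read during javadoc and site generation.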



--
This message was sent by Atlassian JIRA
(v6.1.4#6159)


[jira] [Updated] (HADOOP-10167) Making the whole of hadoop-common use UTF-8 in Maven pom files / refactoring

2013-12-15 Thread Mikhail Antonov (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-10167?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Mikhail Antonov updated HADOOP-10167:
-

Labels: build  (was: )

> Making the whole of hadoop-common use UTF-8 in Maven pom files / refactoring
> 
>
> Key: HADOOP-10167
> URL: https://issues.apache.org/jira/browse/HADOOP-10167
> Project: Hadoop Common
>  Issue Type: Improvement
>  Components: build
>Affects Versions: 2.0.6-alpha
> Environment: Fedora 19 x86-64
>Reporter: Mikhail Antonov
>  Labels: build
> Attachments: HADOOP-10167-1.patch
>
>
> While looking at BIGTOP-831, it turned out that the way Bigtop invokes the 
> Maven build / site:site generation causes errors like this:
> [ERROR] Exit code: 1 - 
> /home/user/jenkins/workspace/BigTop-RPM/label/centos-6-x86_64-HAD-1-buildbot/bigtop-repo/build/hadoop/rpm/BUILD/hadoop-2.0.2-alpha-src/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/metrics2/source/JvmMetricsInfo.java:31:
>  error: unmappable character for encoding ANSI_X3.4-1968
> [ERROR] JvmMetrics("JVM related metrics etc."), // record info??
> Making the whole of hadoop-common use UTF-8 fixes that and seems like a 
> generally good thing to me.
> Attaching a first version of the patch for review.



--
This message was sent by Atlassian JIRA
(v6.1.4#6159)


[jira] [Updated] (HADOOP-10167) Making the whole of hadoop-common use UTF-8 in Maven pom files / refactoring

2013-12-15 Thread Mikhail Antonov (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-10167?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Mikhail Antonov updated HADOOP-10167:
-

Attachment: HADOOP-10167-1.patch

First version of the patch.

> Making the whole of hadoop-common use UTF-8 in Maven pom files / refactoring
> 
>
> Key: HADOOP-10167
> URL: https://issues.apache.org/jira/browse/HADOOP-10167
> Project: Hadoop Common
>  Issue Type: Improvement
>  Components: build
>Affects Versions: 2.0.6-alpha
> Environment: Fedora 19 x86-64
>Reporter: Mikhail Antonov
>  Labels: build
> Attachments: HADOOP-10167-1.patch
>
>
> While looking at BIGTOP-831, it turned out that the way Bigtop invokes the 
> Maven build / site:site generation causes errors like this:
> [ERROR] Exit code: 1 - 
> /home/user/jenkins/workspace/BigTop-RPM/label/centos-6-x86_64-HAD-1-buildbot/bigtop-repo/build/hadoop/rpm/BUILD/hadoop-2.0.2-alpha-src/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/metrics2/source/JvmMetricsInfo.java:31:
>  error: unmappable character for encoding ANSI_X3.4-1968
> [ERROR] JvmMetrics("JVM related metrics etc."), // record info??
> Making the whole of hadoop-common use UTF-8 fixes that and seems like a 
> generally good thing to me.
> Attaching a first version of the patch for review.



--
This message was sent by Atlassian JIRA
(v6.1.4#6159)


[jira] [Created] (HADOOP-10167) Making the whole of hadoop-common use UTF-8 in Maven pom files / refactoring

2013-12-15 Thread Mikhail Antonov (JIRA)
Mikhail Antonov created HADOOP-10167:


 Summary: Making the whole of hadoop-common use UTF-8 in Maven pom 
files / refactoring
 Key: HADOOP-10167
 URL: https://issues.apache.org/jira/browse/HADOOP-10167
 Project: Hadoop Common
  Issue Type: Improvement
  Components: build
Affects Versions: 2.0.6-alpha
 Environment: Fedora 19 x86-64
Reporter: Mikhail Antonov


While looking at BIGTOP-831, it turned out that the way Bigtop invokes the 
Maven build / site:site generation causes errors like this:

[ERROR] Exit code: 1 - 
/home/user/jenkins/workspace/BigTop-RPM/label/centos-6-x86_64-HAD-1-buildbot/bigtop-repo/build/hadoop/rpm/BUILD/hadoop-2.0.2-alpha-src/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/metrics2/source/JvmMetricsInfo.java:31:
 error: unmappable character for encoding ANSI_X3.4-1968
[ERROR] JvmMetrics("JVM related metrics etc."), // record info??

Making the whole of hadoop-common use UTF-8 fixes that and seems like a 
generally good thing to me.

Attaching a first version of the patch for review.



--
This message was sent by Atlassian JIRA
(v6.1.4#6159)


[jira] [Created] (HADOOP-10166) I am not able to run mvn package -Pdist,native-win -DskipTests -Dtar

2013-12-15 Thread Nalini Ranjan (JIRA)
Nalini Ranjan created HADOOP-10166:
--

 Summary: I am not able to run mvn package -Pdist,native-win 
-DskipTests -Dtar
 Key: HADOOP-10166
 URL: https://issues.apache.org/jira/browse/HADOOP-10166
 Project: Hadoop Common
  Issue Type: Bug
 Environment: Windows 8
Reporter: Nalini Ranjan


When I try to build hadoop-common, I am getting the error below.

Loading source files for package 
org.apache.hadoop.security.authentication.examples...
Constructing Javadoc information...
Standard Doclet version 1.7.0_17
Building tree for all the packages and classes...
Generating 
C:\hdfs\common\hadoop-common-project\hadoop-auth-examples\target\org\apache\hadoop\security\authentication\examples\RequestLoggerFilter.html...
Generating 
C:\hdfs\common\hadoop-common-project\hadoop-auth-examples\target\org\apache\hadoop\security\authentication\examples\WhoClient.html...
Generating 
C:\hdfs\common\hadoop-common-project\hadoop-auth-examples\target\org\apache\hadoop\security\authentication\examples\WhoServlet.html...
Generating 
C:\hdfs\common\hadoop-common-project\hadoop-auth-examples\target\org\apache\hadoop\security\authentication\examples\package-frame.html...
Generating 
C:\hdfs\common\hadoop-common-project\hadoop-auth-examples\target\org\apache\hadoop\security\authentication\examples\package-summary.html...

Generating 
C:\hdfs\common\hadoop-common-project\hadoop-auth-examples\target\org\apache\hadoop\security\authentication\examples\package-tree.html...
Generating 
C:\hdfs\common\hadoop-common-project\hadoop-auth-examples\target\constant-values.html...
Generating 
C:\hdfs\common\hadoop-common-project\hadoop-auth-examples\target\serialized-form.html...
Generating 
C:\hdfs\common\hadoop-common-project\hadoop-auth-examples\target\org\apache\hadoop\security\authentication\examples\class-use\WhoServlet.html...
Generating 
C:\hdfs\common\hadoop-common-project\hadoop-auth-examples\target\org\apache\hadoop\security\authentication\examples\class-use\WhoClient.html...
Generating 
C:\hdfs\common\hadoop-common-project\hadoop-auth-examples\target\org\apache\hadoop\security\authentication\examples\class-use\RequestLoggerFilter.html...
Generating 
C:\hdfs\common\hadoop-common-project\hadoop-auth-examples\target\org\apache\hadoop\security\authentication\examples\package-use.html...
Building index for all the packages and classes...
Generating 
C:\hdfs\common\hadoop-common-project\hadoop-auth-examples\target\overview-tree.html...
Generating 
C:\hdfs\common\hadoop-common-project\hadoop-auth-examples\target\index-all.html...
Generating 
C:\hdfs\common\hadoop-common-project\hadoop-auth-examples\target\deprecated-list.html...
Building index for all classes...
Generating 
C:\hdfs\common\hadoop-common-project\hadoop-auth-examples\target\allclasses-frame.html...
Generating 
C:\hdfs\common\hadoop-common-project\hadoop-auth-examples\target\allclasses-noframe.html...
Generating 
C:\hdfs\common\hadoop-common-project\hadoop-auth-examples\target\index.html...
Generating 
C:\hdfs\common\hadoop-common-project\hadoop-auth-examples\target\help-doc.html...
[INFO] Building jar: 
C:\hdfs\common\hadoop-common-project\hadoop-auth-examples\target\hadoop-auth-examples-3.0.0-SNAPSHOT-javadoc.jar
[INFO]
[INFO] 
[INFO] Building Apache Hadoop Common 3.0.0-SNAPSHOT
[INFO] 
[INFO]
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (enforce-os) @ hadoop-common ---
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-common ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO]
[INFO] --- hadoop-maven-plugins:3.0.0-SNAPSHOT:protoc (compile-protoc) @ 
hadoop-common ---
[INFO]
[INFO] --- hadoop-maven-plugins:3.0.0-SNAPSHOT:version-info (version-info) @ 
hadoop-common ---
[WARNING] [svn, info] failed: java.io.IOException: Cannot run program "svn": 
CreateProcess error=2, The system cannot find the file specified
[WARNING] [git, branch] failed: java.io.IOException: Cannot run program "git": 
CreateProcess error=2, The system cannot find the file specified
[INFO] SCM: NONE
[INFO] Computed MD5: 9d8075203e1bd5184e801b2d7354da4a
[INFO]
[INFO] --- maven-resources-plugin:2.2:resources (default-resources) @ 
hadoop-common ---
[INFO] Using default encoding to copy filtered resources.
[INFO]
[INFO] --- maven-compiler-plugin:2.5.1:compile (default-compile) @ 
hadoop-common ---
[INFO] Compiling 12 source files to 
C:\hdfs\common\hadoop-common-project\hadoop-common\target\classes
[INFO]
[INFO] --- native-maven-plugin:1.0-alpha-7:javah (default) @ hadoop-common ---
[INFO] cmd.exe /X /C "C:\Java\jdk1.7.0_17\bin\javah -d 
C:\hdfs\common\hadoop-common-project\hadoop-common\target\native\javah 
-classpath C:\hdfs\common\hadoop-common-project\hadoop-common\target\classes;C:\hdfs\common\hadoop-common-project\ha

[jira] [Created] (HADOOP-10165) TestMetricsSystemImpl#testMultiThreadedPublish occasionally fails

2013-12-15 Thread Ted Yu (JIRA)
Ted Yu created HADOOP-10165:
---

 Summary: TestMetricsSystemImpl#testMultiThreadedPublish 
occasionally fails
 Key: HADOOP-10165
 URL: https://issues.apache.org/jira/browse/HADOOP-10165
 Project: Hadoop Common
  Issue Type: Bug
Reporter: Ted Yu
Priority: Minor


From 
https://builds.apache.org/job/Hadoop-Common-trunk/982/testReport/junit/org.apache.hadoop.metrics2.impl/TestMetricsSystemImpl/testMultiThreadedPublish/
 :
{code}
Error Message

Passed
Passed
Metric not collected!
Metric not collected!
Metric not collected!
Metric not collected!
Metric not collected!
Metric not collected!
Metric not collected!
Passed
Stacktrace

java.lang.AssertionError: Passed
Passed
Metric not collected!
Metric not collected!
Metric not collected!
Metric not collected!
Metric not collected!
Metric not collected!
Metric not collected!
Passed
at org.junit.Assert.fail(Assert.java:93)
at org.junit.Assert.assertTrue(Assert.java:43)
at 
org.apache.hadoop.metrics2.impl.TestMetricsSystemImpl.testMultiThreadedPublish(TestMetricsSystemImpl.java:233)
Standard Output

2013-12-15 09:14:49,144 INFO  impl.MetricsConfig 
(MetricsConfig.java:loadFirst(111)) - loaded properties from 
hadoop-metrics2-test.properties
2013-12-15 09:14:49,146 INFO  impl.MetricsSystemImpl 
(MetricsSystemImpl.java:startTimer(341)) - Scheduled snapshot period at 80 
second(s).
2013-12-15 09:14:49,146 INFO  impl.MetricsSystemImpl 
(MetricsSystemImpl.java:start(183)) - Test metrics system started
2013-12-15 09:14:49,147 INFO  impl.MetricsSinkAdapter 
(MetricsSinkAdapter.java:start(190)) - Sink Collector started
2013-12-15 09:14:49,147 INFO  impl.MetricsSystemImpl 
(MetricsSystemImpl.java:registerSink(275)) - Registered sink Collector
{code}
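
For orientation, a minimal, hypothetical sketch of the pattern this test exercises: several threads registering sources against a single metrics system concurrently. This is not the actual TestMetricsSystemImpl code; the class and metric names below are made up.

{code}
// Hypothetical sketch, not TestMetricsSystemImpl itself: several threads
// publish through one shared MetricsSystem instance at the same time.
import org.apache.hadoop.metrics2.MetricsCollector;
import org.apache.hadoop.metrics2.MetricsSource;
import org.apache.hadoop.metrics2.MetricsSystem;
import org.apache.hadoop.metrics2.lib.DefaultMetricsSystem;
import org.apache.hadoop.metrics2.lib.Interns;

public class MultiThreadedPublishSketch {
  static class OneValueSource implements MetricsSource {
    private final long value;
    OneValueSource(long value) { this.value = value; }
    @Override
    public void getMetrics(MetricsCollector collector, boolean all) {
      // Each snapshot emits one record carrying a single gauge.
      collector.addRecord("sketch")
               .addGauge(Interns.info("value", "per-thread value"), value);
    }
  }

  public static void main(String[] args) throws InterruptedException {
    MetricsSystem ms = DefaultMetricsSystem.initialize("Test");
    Thread[] threads = new Thread[4];
    for (int i = 0; i < threads.length; i++) {
      final long id = i;
      // Each thread registers its own source with the shared metrics system.
      threads[i] = new Thread(() ->
          ms.register("source" + id, "thread-local source",
                      new OneValueSource(id)));
      threads[i].start();
    }
    for (Thread t : threads) {
      t.join();
    }
    DefaultMetricsSystem.shutdown();
  }
}
{code}

Whether a sink ends up seeing every record depends on when the snapshot runs relative to the registrations, which is the kind of timing the real test has to control for and the likely source of the occasional "Metric not collected!" failures above.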



--
This message was sent by Atlassian JIRA
(v6.1.4#6159)