[jira] [Updated] (HADOOP-10110) hadoop-auth has a build break due to missing dependency

2014-01-12 Thread Anonymous (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-10110?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Anonymous updated HADOOP-10110:
---

Affects Version/s: (was: 3.0.0)
   2.2.0
   Status: Patch Available  (was: Reopened)

> hadoop-auth has a build break due to missing dependency
> ---
>
> Key: HADOOP-10110
> URL: https://issues.apache.org/jira/browse/HADOOP-10110
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: build
>Affects Versions: 2.2.0, 2.0.6-alpha
>Reporter: Chuan Liu
>Assignee: Chuan Liu
>Priority: Blocker
> Fix For: 3.0.0, 2.3.0
>
> Attachments: HADOOP-10110.patch
>
>
> We have a build break in hadoop-auth when building with a cleaned Maven 
> cache. The error looks like the following. The problem exists on both 
> Windows and Linux; if you have old Jetty jars in your Maven cache, you 
> won't see the error.
> {noformat}
> [INFO] ------------------------------------------------------------------------
> [INFO] BUILD FAILURE
> [INFO] ------------------------------------------------------------------------
> [INFO] Total time: 1:29.469s
> [INFO] Finished at: Mon Nov 18 12:30:36 PST 2013
> [INFO] Final Memory: 37M/120M
> [INFO] ------------------------------------------------------------------------
> [ERROR] Failed to execute goal 
> org.apache.maven.plugins:maven-compiler-plugin:2.5.1:testCompile 
> (default-testCompile) on project hadoop-auth: Compilation failure: 
> Compilation failure:
> [ERROR] 
> /home/chuan/trunk/hadoop-common-project/hadoop-auth/src/test/java/org/apache/hadoop/security/authentication/client/AuthenticatorTestCase.java:[84,13]
>  cannot access org.mortbay.component.AbstractLifeCycle
> [ERROR] class file for org.mortbay.component.AbstractLifeCycle not found
> [ERROR] server = new Server(0);
> [ERROR] 
> /home/chuan/trunk/hadoop-common-project/hadoop-auth/src/test/java/org/apache/hadoop/security/authentication/client/AuthenticatorTestCase.java:[94,29]
>  cannot access org.mortbay.component.LifeCycle
> [ERROR] class file for org.mortbay.component.LifeCycle not found
> [ERROR] server.getConnectors()[0].setHost(host);
> [ERROR] 
> /home/chuan/trunk/hadoop-common-project/hadoop-auth/src/test/java/org/apache/hadoop/security/authentication/client/AuthenticatorTestCase.java:[96,10]
>  cannot find symbol
> [ERROR] symbol  : method start()
> [ERROR] location: class org.mortbay.jetty.Server
> [ERROR] 
> /home/chuan/trunk/hadoop-common-project/hadoop-auth/src/test/java/org/apache/hadoop/security/authentication/client/AuthenticatorTestCase.java:[102,12]
>  cannot find symbol
> [ERROR] symbol  : method stop()
> [ERROR] location: class org.mortbay.jetty.Server
> [ERROR] -> [Help 1]
> [ERROR]
> [ERROR] To see the full stack trace of the errors, re-run Maven with the -e 
> switch.
> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
> [ERROR]
> [ERROR] For more information about the errors and possible solutions, please 
> read the following articles:
> [ERROR] [Help 1] 
> http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
> [ERROR]
> [ERROR] After correcting the problems, you can resume the build with the 
> command
> [ERROR]   mvn <goals> -rf :hadoop-auth
> {noformat}
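The missing classes ({{org.mortbay.component.AbstractLifeCycle}}, {{LifeCycle}}) live in the old Jetty 6 {{jetty-util}} artifact. A fix of this general shape declares the dependency explicitly in the hadoop-auth pom instead of relying on it arriving transitively; this is a sketch only, and the version shown is illustrative rather than taken from the attached patch:

{code:xml}
<!-- sketch: declare the Jetty test dependency explicitly instead of
     relying on it arriving transitively (version is illustrative) -->
<dependency>
  <groupId>org.mortbay.jetty</groupId>
  <artifactId>jetty-util</artifactId>
  <version>6.1.26</version>
  <scope>test</scope>
</dependency>
{code}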



--
This message was sent by Atlassian JIRA
(v6.1.5#6160)


[jira] [Updated] (HADOOP-10161) Add a method to change the default value of dmax in hadoop.properties

2014-01-12 Thread Yang He (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-10161?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang He updated HADOOP-10161:
-

Affects Version/s: (was: 2.2.0)

> Add a method to change the default value of dmax in hadoop.properties
> -
>
> Key: HADOOP-10161
> URL: https://issues.apache.org/jira/browse/HADOOP-10161
> Project: Hadoop Common
>  Issue Type: Improvement
>  Components: metrics
>Reporter: Yang He
> Attachments: HADOOP-10161_2_20131219.patch
>
>
> The 'dmax' property in Ganglia is a configurable time-to-live for metrics: 
> if no new value of a metric is emitted to gmond within 'dmax' seconds, 
> gmond destroys the metric in memory. In the Hadoop metrics framework, the 
> default value of 'dmax' is 0, which means gmond never destroys a metric 
> even after it has stopped being reported, and the gmetad daemon never 
> deletes the corresponding rrdtool file.
> We need a way to configure the default value of 'dmax' for all metrics in 
> hadoop.properties.
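For illustration, such a knob in hadoop.properties could look like the following. This is a sketch of the proposed configuration, not the syntax from the attached patch; the sink prefix and the {{dmax}} property name are assumptions, though {{GangliaSink31}} is the existing Ganglia sink class:

{code}
# sketch of the proposed default-dmax knob (property name illustrative):
# expire a metric from gmond if it has not been updated for one hour
*.sink.ganglia.class=org.apache.hadoop.metrics2.sink.ganglia.GangliaSink31
*.sink.ganglia.dmax=3600
{code}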





[jira] [Commented] (HADOOP-10223) MiniKdc#main() should close the FileReader it creates

2014-01-12 Thread Hudson (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-10223?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13869196#comment-13869196
 ] 

Hudson commented on HADOOP-10223:
-

SUCCESS: Integrated in Hadoop-trunk-Commit #4991 (See 
[https://builds.apache.org/job/Hadoop-trunk-Commit/4991/])
HADOOP-10223. MiniKdc#main() should close the FileReader it creates. (Ted Yu 
via tucu) (tucu: 
http://svn.apache.org/viewcvs.cgi/?root=Apache-SVN&view=rev&rev=1557627)
* /hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt
* 
/hadoop/common/trunk/hadoop-common-project/hadoop-minikdc/src/main/java/org/apache/hadoop/minikdc/MiniKdc.java


> MiniKdc#main() should close the FileReader it creates
> -
>
> Key: HADOOP-10223
> URL: https://issues.apache.org/jira/browse/HADOOP-10223
> Project: Hadoop Common
>  Issue Type: Bug
>Reporter: Ted Yu
>Assignee: Ted Yu
>Priority: Minor
> Fix For: 2.4.0
>
> Attachments: hadoop-10223-v2.txt, hadoop-10223-v3.txt, 
> hadoop-10223.txt
>
>
> FileReader is used to read MiniKDC properties.
> This FileReader should be closed after reading.
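The standard fix pattern here is Java 7 try-with-resources, which guarantees the reader is closed even when {{load()}} throws. A minimal self-contained sketch; the class and property names below are illustrative, not the actual MiniKdc code:

```java
import java.io.FileReader;
import java.io.FileWriter;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Properties;

public class PropertiesLoader {
    // Load a properties file, closing the FileReader even on failure.
    static Properties load(String path) throws IOException {
        Properties props = new Properties();
        try (FileReader reader = new FileReader(path)) {
            props.load(reader);
        } // reader.close() runs here automatically
        return props;
    }

    public static void main(String[] args) throws IOException {
        // write a throwaway properties file and read it back
        Path conf = Files.createTempFile("minikdc-demo", ".properties");
        try (FileWriter w = new FileWriter(conf.toFile())) {
            w.write("kdc.bind.address=localhost\n");
        }
        System.out.println(load(conf.toString()).getProperty("kdc.bind.address"));
    }
}
```

Running {{main}} prints {{localhost}}; the point is that the reader is closed on every path, which is what the patch adds to {{MiniKdc#main()}}.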





[jira] [Updated] (HADOOP-10223) MiniKdc#main() should close the FileReader it creates

2014-01-12 Thread Alejandro Abdelnur (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-10223?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Alejandro Abdelnur updated HADOOP-10223:


   Resolution: Fixed
Fix Version/s: 2.4.0
 Hadoop Flags: Reviewed
   Status: Resolved  (was: Patch Available)

Thanks Ted. Committed to trunk and branch-2.

> MiniKdc#main() should close the FileReader it creates
> -
>
> Key: HADOOP-10223
> URL: https://issues.apache.org/jira/browse/HADOOP-10223
> Project: Hadoop Common
>  Issue Type: Bug
>Reporter: Ted Yu
>Assignee: Ted Yu
>Priority: Minor
> Fix For: 2.4.0
>
> Attachments: hadoop-10223-v2.txt, hadoop-10223-v3.txt, 
> hadoop-10223.txt
>
>
> FileReader is used to read MiniKDC properties.
> This FileReader should be closed after reading.





[jira] [Updated] (HADOOP-10225) Publish Maven javadoc and sources artifacts with Hadoop releases.

2014-01-12 Thread Lewis John McGibbney (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-10225?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Lewis John McGibbney updated HADOOP-10225:
--

Attachment: HADOOP-10225.patch

Patch for the hadoop-common trunk codebase.
It adds scm properties as well as a release profile to the parent pom.xml.
Using the maven-release-plugin like
{noformat}
mvn release:clean release:prepare -DautoVersionSubmodules=true
{noformat}
to check the generated artifacts, then
{noformat}
mvn release:perform
{noformat}
to push the Maven artifacts to Nexus should do the trick. I've confirmed that 
the sources and Javadoc artifacts are now generated locally.
This works perfectly for us over on Apache Gora when we do the release 
procedure.
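For context, attaching -sources and -javadoc jars is typically done with a profile along these lines in the parent pom.xml. This is a sketch of the general shape only, and the attached patch may configure it differently:

{code:xml}
<!-- sketch: a release profile that attaches -sources and -javadoc jars -->
<profile>
  <id>release</id>
  <build>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-source-plugin</artifactId>
        <executions>
          <execution>
            <id>attach-sources</id>
            <goals><goal>jar-no-fork</goal></goals>
          </execution>
        </executions>
      </plugin>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-javadoc-plugin</artifactId>
        <executions>
          <execution>
            <id>attach-javadocs</id>
            <goals><goal>jar</goal></goals>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>
</profile>
{code}

{{jar-no-fork}} is used for the source plugin so that attaching sources does not fork a second build lifecycle during the release.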

> Publish Maven javadoc and sources artifacts with Hadoop releases.
> -
>
> Key: HADOOP-10225
> URL: https://issues.apache.org/jira/browse/HADOOP-10225
> Project: Hadoop Common
>  Issue Type: Improvement
>  Components: build
>Reporter: Lewis John McGibbney
> Fix For: 3.0.0
>
> Attachments: HADOOP-10225.patch
>
>
> Right now, Maven javadoc and sources artifacts do not accompany Hadoop 
> releases in Maven Central. This means that one needs to check out the 
> source code to debug aspects of the codebase, which is not user friendly.
> The build script(s) should be amended to accommodate publication of the 
> javadoc and sources artifacts alongside the pom and jar artifacts.
> Some history on this conversation can be seen here:
> http://s.apache.org/7qR





[jira] [Commented] (HADOOP-10225) Publish Maven javadoc and sources artifacts with Hadoop releases.

2014-01-12 Thread Lewis John McGibbney (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-10225?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13869165#comment-13869165
 ] 

Lewis John McGibbney commented on HADOOP-10225:
---

I would be happy to update documentation to accommodate the new *release* 
profile introduced here. I've asked for write permissions on the common-dev list
http://www.mail-archive.com/common-dev%40hadoop.apache.org/msg11358.html 

> Publish Maven javadoc and sources artifacts with Hadoop releases.
> -
>
> Key: HADOOP-10225
> URL: https://issues.apache.org/jira/browse/HADOOP-10225
> Project: Hadoop Common
>  Issue Type: Improvement
>  Components: build
>Reporter: Lewis John McGibbney
>  Labels: hadoop, javadoc, maven, sources
> Fix For: 3.0.0
>
> Attachments: HADOOP-10225.patch
>
>
> Right now, Maven javadoc and sources artifacts do not accompany Hadoop 
> releases in Maven Central. This means that one needs to check out the 
> source code to debug aspects of the codebase, which is not user friendly.
> The build script(s) should be amended to accommodate publication of the 
> javadoc and sources artifacts alongside the pom and jar artifacts.
> Some history on this conversation can be seen here:
> http://s.apache.org/7qR





[jira] [Updated] (HADOOP-10225) Publish Maven javadoc and sources artifacts with Hadoop releases.

2014-01-12 Thread Lewis John McGibbney (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-10225?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Lewis John McGibbney updated HADOOP-10225:
--

Labels: hadoop javadoc maven sources  (was: )

> Publish Maven javadoc and sources artifacts with Hadoop releases.
> -
>
> Key: HADOOP-10225
> URL: https://issues.apache.org/jira/browse/HADOOP-10225
> Project: Hadoop Common
>  Issue Type: Improvement
>  Components: build
>Reporter: Lewis John McGibbney
>  Labels: hadoop, javadoc, maven, sources
> Fix For: 3.0.0
>
> Attachments: HADOOP-10225.patch
>
>
> Right now, Maven javadoc and sources artifacts do not accompany Hadoop 
> releases in Maven Central. This means that one needs to check out the 
> source code to debug aspects of the codebase, which is not user friendly.
> The build script(s) should be amended to accommodate publication of the 
> javadoc and sources artifacts alongside the pom and jar artifacts.
> Some history on this conversation can be seen here:
> http://s.apache.org/7qR





[jira] [Resolved] (HADOOP-10227) IPC shutdown hangs if remote connection not reachable

2014-01-12 Thread Steve Loughran (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-10227?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Steve Loughran resolved HADOOP-10227.
-

Resolution: Duplicate

Resolving as a duplicate of HADOOP-10219. Sorry - I needed to rerun the video 
I was making on filing issues, and this was the bug I had to hand.

> IPC shutdown hangs if remote connection not reachable
> -
>
> Key: HADOOP-10227
> URL: https://issues.apache.org/jira/browse/HADOOP-10227
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: ipc
>Affects Versions: 2.2.0
> Environment: OS/X Java7
>Reporter: Steve Loughran
>Priority: Minor
>
> If the remote HDFS namenode isn't reachable and an attempt is made to shut 
> down the JVM, the process will keep running.





[jira] [Commented] (HADOOP-10227) IPC shutdown hangs if remote connection not reachable

2014-01-12 Thread Steve Loughran (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-10227?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13869150#comment-13869150
 ] 

Steve Loughran commented on HADOOP-10227:
-

{code}

"Thread-0" prio=5 tid=0x7fb05a077000 nid=0x5d0f waiting on condition 
[0x000116565000]
   java.lang.Thread.State: TIMED_WAITING (sleeping)
at java.lang.Thread.sleep(Native Method)
at org.apache.hadoop.ipc.Client.stop(Client.java:1173)
at org.apache.hadoop.ipc.ClientCache.stopClient(ClientCache.java:100)
at 
org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.close(ProtobufRpcEngine.java:251)
at org.apache.hadoop.ipc.RPC.stopProxy(RPC.java:626)
at 
org.apache.hadoop.io.retry.DefaultFailoverProxyProvider.close(DefaultFailoverProxyProvider.java:57)
at 
org.apache.hadoop.io.retry.RetryInvocationHandler.close(RetryInvocationHandler.java:206)
at org.apache.hadoop.ipc.RPC.stopProxy(RPC.java:626)
at 
org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.close(ClientNamenodeProtocolTranslatorPB.java:174)
at org.apache.hadoop.ipc.RPC.stopProxy(RPC.java:621)
at 
org.apache.hadoop.hdfs.DFSClient.closeConnectionToNamenode(DFSClient.java:738)
at org.apache.hadoop.hdfs.DFSClient.close(DFSClient.java:794)
- locked <0x0007fec77980> (a org.apache.hadoop.hdfs.DFSClient)
at 
org.apache.hadoop.hdfs.DistributedFileSystem.close(DistributedFileSystem.java:847)
at org.apache.hadoop.fs.FileSystem$Cache.closeAll(FileSystem.java:2524)
- locked <0x0007fec76600> (a org.apache.hadoop.fs.FileSystem$Cache)
at 
org.apache.hadoop.fs.FileSystem$Cache$ClientFinalizer.run(FileSystem.java:2541)
- locked <0x0007fec76618> (a 
org.apache.hadoop.fs.FileSystem$Cache$ClientFinalizer)
at 
org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:54)

"SIGINT handler" daemon prio=5 tid=0x7fb0588f7000 nid=0x440f in 
Object.wait() [0x0001138bb000]
   java.lang.Thread.State: WAITING (on object monitor)
at java.lang.Object.wait(Native Method)
- waiting on <0x0007fed3d2c8> (a 
org.apache.hadoop.util.ShutdownHookManager$1)
at java.lang.Thread.join(Thread.java:1280)
- locked <0x0007fed3d2c8> (a 
org.apache.hadoop.util.ShutdownHookManager$1)
at java.lang.Thread.join(Thread.java:1354)
at 
java.lang.ApplicationShutdownHooks.runHooks(ApplicationShutdownHooks.java:106)
at 
java.lang.ApplicationShutdownHooks$1.run(ApplicationShutdownHooks.java:46)
at java.lang.Shutdown.runHooks(Shutdown.java:123)
at java.lang.Shutdown.sequence(Shutdown.java:167)
at java.lang.Shutdown.exit(Shutdown.java:212)
- locked <0x0007fed5dfe8> (a java.lang.Class for java.lang.Shutdown)
at java.lang.Runtime.exit(Runtime.java:109)
at java.lang.System.exit(System.java:962)
at org.apache.hadoop.util.ExitUtil.terminate(ExitUtil.java:133)
at 
org.apache.hadoop.yarn.service.launcher.ServiceLauncher.exit(ServiceLauncher.java:279)
at 
org.apache.hadoop.yarn.service.launcher.ServiceLauncher.interrupted(ServiceLauncher.java:266)
at 
org.apache.hadoop.yarn.service.launcher.IrqHandler.handle(IrqHandler.java:70)
at sun.misc.Signal$1.run(Signal.java:212)
at java.lang.Thread.run(Thread.java:744)

"Service Thread" daemon prio=5 tid=0x7fb059011000 nid=0x5303 runnable 
[0x]
   java.lang.Thread.State: RUNNABLE

"C2 CompilerThread1" daemon prio=5 tid=0x7fb05b002000 nid=0x5103 waiting on 
condition [0x]
   java.lang.Thread.State: RUNNABLE

"C2 CompilerThread0" daemon prio=5 tid=0x7fb05b00 nid=0x4f03 waiting on 
condition [0x]
   java.lang.Thread.State: RUNNABLE

"Signal Dispatcher" daemon prio=5 tid=0x7fb058826800 nid=0x4d03 waiting on 
condition [0x]
   java.lang.Thread.State: RUNNABLE

"Finalizer" daemon prio=5 tid=0x7fb05a04e000 nid=0x3903 in Object.wait() 
[0x000113772000]
   java.lang.Thread.State: WAITING (on object monitor)
at java.lang.Object.wait(Native Method)
- waiting on <0x0007feb00658> (a java.lang.ref.ReferenceQueue$Lock)
at java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:135)
- locked <0x0007feb00658> (a java.lang.ref.ReferenceQueue$Lock)
at java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:151)
at java.lang.ref.Finalizer$FinalizerThread.run(Finalizer.java:189)

"Reference Handler" daemon prio=5 tid=0x7fb05a04b000 nid=0x3703 in 
Object.wait() [0x00011366f000]
   java.lang.Thread.State: WAITING (on object monitor)
at java.lang.Object.wait(Native Method)
- waiting on <0x0007feb104b8> (a java.lang.ref.Reference$Lock)
at java.lang.Object.wait(Object.java:503)
at java.lang
{code}

[jira] [Created] (HADOOP-10227) IPC shutdown hangs if remote connection not reachable

2014-01-12 Thread Steve Loughran (JIRA)
Steve Loughran created HADOOP-10227:
---

 Summary: IPC shutdown hangs if remote connection not reachable
 Key: HADOOP-10227
 URL: https://issues.apache.org/jira/browse/HADOOP-10227
 Project: Hadoop Common
  Issue Type: Bug
  Components: ipc
Affects Versions: 2.2.0
 Environment: OS/X Java7
Reporter: Steve Loughran
Priority: Minor


If the remote HDFS namenode isn't reachable and an attempt is made to shut 
down the JVM, the process will keep running.





[jira] [Resolved] (HADOOP-10226) Help! My Hadoop doesn't work!

2014-01-12 Thread Steve Loughran (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-10226?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Steve Loughran resolved HADOOP-10226.
-

Resolution: Invalid

> Help! My Hadoop doesn't work!
> -
>
> Key: HADOOP-10226
> URL: https://issues.apache.org/jira/browse/HADOOP-10226
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: bin
>Affects Versions: 1.0.3
> Environment: big data borat's hadoop
>Reporter: Steve Loughran
>Priority: Critical
>
> I have installed hadoop but it it is failing
> {code}
> hadop version
> -bash: hadop: command not found
> {code}
> please help!!





[jira] [Commented] (HADOOP-10226) Help! My Hadoop doesn't work!

2014-01-12 Thread Steve Loughran (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-10226?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13869147#comment-13869147
 ] 

Steve Loughran commented on HADOOP-10226:
-

Closing as Invalid:
https://wiki.apache.org/hadoop/InvalidJiraIssues


> Help! My Hadoop doesn't work!
> -
>
> Key: HADOOP-10226
> URL: https://issues.apache.org/jira/browse/HADOOP-10226
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: bin
>Affects Versions: 1.0.3
> Environment: big data borat's hadoop
>Reporter: Steve Loughran
>Priority: Critical
>
> I have installed hadoop but it it is failing
> {code}
> hadop version
> -bash: hadop: command not found
> {code}
> please help!!





[jira] [Created] (HADOOP-10226) Help! My Hadoop doesn't work!

2014-01-12 Thread Steve Loughran (JIRA)
Steve Loughran created HADOOP-10226:
---

 Summary: Help! My Hadoop doesn't work!
 Key: HADOOP-10226
 URL: https://issues.apache.org/jira/browse/HADOOP-10226
 Project: Hadoop Common
  Issue Type: Bug
  Components: bin
Affects Versions: 1.0.3
 Environment: big data borat's hadoop
Reporter: Steve Loughran
Priority: Critical


I have installed hadoop but it it is failing
{code}
hadop version
-bash: hadop: command not found
{code}

please help!!





[jira] [Updated] (HADOOP-10225) Publish Maven javadoc and sources artifacts with Hadoop releases.

2014-01-12 Thread Lewis John McGibbney (JIRA)

 [ 
https://issues.apache.org/jira/browse/HADOOP-10225?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Lewis John McGibbney updated HADOOP-10225:
--

Fix Version/s: 3.0.0

> Publish Maven javadoc and sources artifacts with Hadoop releases.
> -
>
> Key: HADOOP-10225
> URL: https://issues.apache.org/jira/browse/HADOOP-10225
> Project: Hadoop Common
>  Issue Type: Improvement
>  Components: build
>Reporter: Lewis John McGibbney
> Fix For: 3.0.0
>
>
> Right now, Maven javadoc and sources artifacts do not accompany Hadoop 
> releases in Maven Central. This means that one needs to check out the 
> source code to debug aspects of the codebase, which is not user friendly.
> The build script(s) should be amended to accommodate publication of the 
> javadoc and sources artifacts alongside the pom and jar artifacts.
> Some history on this conversation can be seen here:
> http://s.apache.org/7qR





[jira] [Commented] (HADOOP-10225) Publish Maven javadoc and sources artifacts with Hadoop releases.

2014-01-12 Thread Lewis John McGibbney (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-10225?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13869078#comment-13869078
 ] 

Lewis John McGibbney commented on HADOOP-10225:
---

I am developing a patch for common-trunk; however, I would like some advice 
on the following:
The parent pom.xml [0] defines four profiles, namely *src*, *dist*, *sign* 
and *clover*. I am unsure which profile the Hadoop release manager uses when 
pushing releases. Can someone (a release manager, or someone familiar with 
the Maven profiles used within the release process) please confirm what the 
process actually is?
Once I know this, I can edit the Maven workflow and add the correct config.
Thank you

[0] https://svn.apache.org/repos/asf/hadoop/common/trunk/pom.xml

> Publish Maven javadoc and sources artifacts with Hadoop releases.
> -
>
> Key: HADOOP-10225
> URL: https://issues.apache.org/jira/browse/HADOOP-10225
> Project: Hadoop Common
>  Issue Type: Improvement
>  Components: build
>Reporter: Lewis John McGibbney
>
> Right now, Maven javadoc and sources artifacts do not accompany Hadoop 
> releases in Maven Central. This means that one needs to check out the 
> source code to debug aspects of the codebase, which is not user friendly.
> The build script(s) should be amended to accommodate publication of the 
> javadoc and sources artifacts alongside the pom and jar artifacts.
> Some history on this conversation can be seen here:
> http://s.apache.org/7qR





[jira] [Commented] (HADOOP-10225) Publish Maven javadoc and sources artifacts with Hadoop releases.

2014-01-12 Thread Lewis John McGibbney (JIRA)

[ 
https://issues.apache.org/jira/browse/HADOOP-10225?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13869051#comment-13869051
 ] 

Lewis John McGibbney commented on HADOOP-10225:
---

Can someone please set the 'fix versions' field, and I will produce patches 
for all of the branches this concerns?
Thanks in advance,
Lewis

> Publish Maven javadoc and sources artifacts with Hadoop releases.
> -
>
> Key: HADOOP-10225
> URL: https://issues.apache.org/jira/browse/HADOOP-10225
> Project: Hadoop Common
>  Issue Type: Improvement
>  Components: build
>Reporter: Lewis John McGibbney
>
> Right now, Maven javadoc and sources artifacts do not accompany Hadoop 
> releases in Maven Central. This means that one needs to check out the 
> source code to debug aspects of the codebase, which is not user friendly.
> The build script(s) should be amended to accommodate publication of the 
> javadoc and sources artifacts alongside the pom and jar artifacts.
> Some history on this conversation can be seen here:
> http://s.apache.org/7qR





[jira] [Created] (HADOOP-10225) Publish Maven javadoc and sources artifacts with Hadoop releases.

2014-01-12 Thread Lewis John McGibbney (JIRA)
Lewis John McGibbney created HADOOP-10225:
-

 Summary: Publish Maven javadoc and sources artifacts with Hadoop 
releases.
 Key: HADOOP-10225
 URL: https://issues.apache.org/jira/browse/HADOOP-10225
 Project: Hadoop Common
  Issue Type: Improvement
  Components: build
Reporter: Lewis John McGibbney


Right now, Maven javadoc and sources artifacts do not accompany Hadoop 
releases in Maven Central. This means that one needs to check out the source 
code to debug aspects of the codebase, which is not user friendly.

The build script(s) should be amended to accommodate publication of the 
javadoc and sources artifacts alongside the pom and jar artifacts.

Some history on this conversation can be seen here:
http://s.apache.org/7qR


