[jira] [Updated] (HADOOP-14451) Deadlock in NativeIO

2024-03-17 Thread Vinayakumar B (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-14451?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Vinayakumar B updated HADOOP-14451:
---
Description: 
* Scenario:
  1. One thread calls a static method of NativeIO, which runs the static block of 
NativeIO.
  2. A second thread calls a static method of NativeIO.POSIX, which runs the static 
block of the NativeIO.POSIX class.

Both try to lock the same object inside the native code and get into a deadlock.
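The NativeIO deadlock itself involves a lock held in native code, but the shape of the 
problem -- two threads racing through interdependent static initializers -- can be shown 
with a small, hypothetical Java example (the class and method names below are made up 
for illustration and are not the NativeIO code):

{code:java}
// Hypothetical illustration only -- not the NativeIO implementation.
// Each class's static initializer touches the other class, so two threads that
// start initialization from opposite ends can block on each other's
// class-initialization lock.
public class StaticInitDeadlockDemo {
  static class Outer {
    static { Nested.touch(); }   // Outer's <clinit> waits for Nested's initialization
    static void touch() { }
  }
  static class Nested {
    static { Outer.touch(); }    // Nested's <clinit> waits for Outer's initialization
    static void touch() { }
  }
  public static void main(String[] args) {
    new Thread(() -> Outer.touch()).start();   // thread 1 initializes Outer first
    new Thread(() -> Nested.touch()).start();  // thread 2 initializes Nested first
    // With unlucky timing neither <clinit> can finish and the JVM hangs,
    // similar in spirit to the NativeIO / NativeIO.POSIX scenario above.
  }
}
{code}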

> Deadlock in NativeIO
> 
>
> Key: HADOOP-14451
> URL: https://issues.apache.org/jira/browse/HADOOP-14451
> Project: Hadoop Common
>  Issue Type: Bug
>Affects Versions: 2.8.0, 3.0.0-alpha1
>Reporter: Ajith S
>Assignee: Vinayakumar B
>Priority: Blocker
>  Labels: pull-request-available
> Attachments: HADOOP-14451-01.patch, HADOOP-14451-02.patch, 
> HADOOP-14451-03.patch, HADOOP-14451-04.patch, Nodemanager.jstack
>
>  Time Spent: 0.5h
>  Remaining Estimate: 0h
>
> * Scenario:
>   1. One thread calls a static method of NativeIO, which runs the static block 
> of NativeIO.
>   2. A second thread calls a static method of NativeIO.POSIX, which runs the 
> static block of the NativeIO.POSIX class.
> Both try to lock the same object inside the native code and get into a deadlock.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Commented] (HADOOP-19010) NullPointerException in Hadoop Credential Check CLI Command

2023-12-27 Thread Vinayakumar B (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-19010?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17800755#comment-17800755
 ] 

Vinayakumar B commented on HADOOP-19010:


Marked as fixed.

Congratulations on your first contribution, [~anikaKelhanka].

> NullPointerException in Hadoop Credential Check CLI Command
> ---
>
> Key: HADOOP-19010
> URL: https://issues.apache.org/jira/browse/HADOOP-19010
> Project: Hadoop Common
>  Issue Type: Bug
>Affects Versions: 3.3.0
>Reporter: Anika Kelhanka
>Assignee: Anika Kelhanka
>Priority: Major
>  Labels: pull-request-available
> Fix For: 3.4.0
>
>
> *Description*: Hadoop's credential check throws a {{NullPointerException}} when 
> the alias is not found.
> {code:bash}
> hadoop credential check "fs.gs.proxy.username" -provider 
> "jceks://file/usr/lib/hive/conf/hive.jceks" {code}
> Checking aliases for CredentialProvider: 
> jceks://file/usr/lib/hive/conf/hive.jceks
> Enter alias password: 
> java.lang.NullPointerException
> at
> org.apache.hadoop.security.alias.CredentialShell$CheckCommand.execute(CredentialShell.java:369)
> at org.apache.hadoop.tools.CommandShell.run(CommandShell.java:73)
> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:82)
> at 
> org.apache.hadoop.security.alias.CredentialShell.main(CredentialShell.java:529)
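A minimal sketch of the null-guard pattern that avoids this kind of NPE, assuming the 
{{CredentialProviderFactory}} / {{CredentialProvider}} APIs shown below; this illustrates 
the guard, it is not the committed HADOOP-19010 patch.

{code:java}
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.security.alias.CredentialProvider;
import org.apache.hadoop.security.alias.CredentialProviderFactory;

public class CredentialCheckSketch {
  public static void main(String[] args) throws Exception {
    String alias = "fs.gs.proxy.username";   // alias from the report above
    Configuration conf = new Configuration();
    for (CredentialProvider provider : CredentialProviderFactory.getProviders(conf)) {
      // getCredentialEntry() returns null when the alias does not exist,
      // so the result must be checked before it is dereferenced.
      CredentialProvider.CredentialEntry entry = provider.getCredentialEntry(alias);
      if (entry == null) {
        System.err.println(alias + " not found in " + provider);
        continue;
      }
      System.out.println(alias + " resolved to a credential of length "
          + entry.getCredential().length);
    }
  }
}
{code}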



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Assigned] (HADOOP-19010) NullPointerException in Hadoop Credential Check CLI Command

2023-12-27 Thread Vinayakumar B (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-19010?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Vinayakumar B reassigned HADOOP-19010:
--

Assignee: Anika Kelhanka

> NullPointerException in Hadoop Credential Check CLI Command
> ---
>
> Key: HADOOP-19010
> URL: https://issues.apache.org/jira/browse/HADOOP-19010
> Project: Hadoop Common
>  Issue Type: Bug
>Affects Versions: 3.3.0
>Reporter: Anika Kelhanka
>Assignee: Anika Kelhanka
>Priority: Major
>  Labels: pull-request-available
> Fix For: 3.4.0
>
>
> *Description*: Hadoop's credential check throws a {{NullPointerException}} when 
> the alias is not found.
> {code:bash}
> hadoop credential check "fs.gs.proxy.username" -provider 
> "jceks://file/usr/lib/hive/conf/hive.jceks" {code}
> Checking aliases for CredentialProvider: 
> jceks://file/usr/lib/hive/conf/hive.jceks
> Enter alias password: 
> java.lang.NullPointerException
> at
> org.apache.hadoop.security.alias.CredentialShell$CheckCommand.execute(CredentialShell.java:369)
> at org.apache.hadoop.tools.CommandShell.run(CommandShell.java:73)
> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:82)
> at 
> org.apache.hadoop.security.alias.CredentialShell.main(CredentialShell.java:529)



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Resolved] (HADOOP-19010) NullPointerException in Hadoop Credential Check CLI Command

2023-12-27 Thread Vinayakumar B (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-19010?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Vinayakumar B resolved HADOOP-19010.

Fix Version/s: 3.4.0
 Hadoop Flags: Reviewed
   Resolution: Fixed

> NullPointerException in Hadoop Credential Check CLI Command
> ---
>
> Key: HADOOP-19010
> URL: https://issues.apache.org/jira/browse/HADOOP-19010
> Project: Hadoop Common
>  Issue Type: Bug
>Affects Versions: 3.3.0
>Reporter: Anika Kelhanka
>Priority: Major
>  Labels: pull-request-available
> Fix For: 3.4.0
>
>
> *Description*: Hadoop's credential check throws a {{NullPointerException}} when 
> the alias is not found.
> {code:bash}
> hadoop credential check "fs.gs.proxy.username" -provider 
> "jceks://file/usr/lib/hive/conf/hive.jceks" {code}
> Checking aliases for CredentialProvider: 
> jceks://file/usr/lib/hive/conf/hive.jceks
> Enter alias password: 
> java.lang.NullPointerException
> at
> org.apache.hadoop.security.alias.CredentialShell$CheckCommand.execute(CredentialShell.java:369)
> at org.apache.hadoop.tools.CommandShell.run(CommandShell.java:73)
> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:82)
> at 
> org.apache.hadoop.security.alias.CredentialShell.main(CredentialShell.java:529)



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Updated] (HADOOP-17505) public interface GroupMappingServiceProvider needs default impl for getGroupsSet()

2021-01-29 Thread Vinayakumar B (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-17505?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Vinayakumar B updated HADOOP-17505:
---
Status: Patch Available  (was: Open)

> public interface GroupMappingServiceProvider needs default impl for 
> getGroupsSet() 
> ---
>
> Key: HADOOP-17505
> URL: https://issues.apache.org/jira/browse/HADOOP-17505
> Project: Hadoop Common
>  Issue Type: Bug
>Reporter: Vinayakumar B
>Assignee: Vinayakumar B
>Priority: Major
>  Labels: pull-request-available
>  Time Spent: 10m
>  Remaining Estimate: 0h
>
> HADOOP-17079 added the "GroupMappingServiceProvider#getGroupsSet()" interface method.
> But since this is a public interface, the new method will break compilation of 
> existing implementations in downstream projects.
> Consider adding a default implementation in the interface to avoid such failures.
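A sketch of the kind of default method proposed above, assuming the interface's 
existing {{getGroups(String)}} method; it mirrors the intent of this Jira rather than 
the exact committed implementation.

{code:java}
import java.io.IOException;
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Set;

// Simplified stand-in for the real interface, showing only the relevant methods.
public interface GroupMappingServiceProviderSketch {
  List<String> getGroups(String user) throws IOException;

  // A default implementation keeps older downstream implementations compiling:
  // they do not have to override the new method.
  default Set<String> getGroupsSet(String user) throws IOException {
    return new LinkedHashSet<>(getGroups(user));   // preserve order, drop duplicates
  }
}
{code}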



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Created] (HADOOP-17505) public interface GroupMappingServiceProvider needs default impl for getGroupsSet()

2021-01-28 Thread Vinayakumar B (Jira)
Vinayakumar B created HADOOP-17505:
--

 Summary: public interface GroupMappingServiceProvider needs 
default impl for getGroupsSet() 
 Key: HADOOP-17505
 URL: https://issues.apache.org/jira/browse/HADOOP-17505
 Project: Hadoop Common
  Issue Type: Bug
Reporter: Vinayakumar B


HADOOP-17079 added the "GroupMappingServiceProvider#getGroupsSet()" interface method.

But since this is a public interface, the new method will break compilation of existing 
implementations in downstream projects.

Consider adding a default implementation in the interface to avoid such failures.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Assigned] (HADOOP-17505) public interface GroupMappingServiceProvider needs default impl for getGroupsSet()

2021-01-28 Thread Vinayakumar B (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-17505?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Vinayakumar B reassigned HADOOP-17505:
--

Assignee: Vinayakumar B

> public interface GroupMappingServiceProvider needs default impl for 
> getGroupsSet() 
> ---
>
> Key: HADOOP-17505
> URL: https://issues.apache.org/jira/browse/HADOOP-17505
> Project: Hadoop Common
>  Issue Type: Bug
>Reporter: Vinayakumar B
>Assignee: Vinayakumar B
>Priority: Major
>
> HADOOP-17079 added the "GroupMappingServiceProvider#getGroupsSet()" interface method.
> But since this is a public interface, the new method will break compilation of 
> existing implementations in downstream projects.
> Consider adding a default implementation in the interface to avoid such failures.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Comment Edited] (HADOOP-17306) RawLocalFileSystem's lastModifiedTime() looses milli seconds in JDK < 10.b09

2020-11-09 Thread Vinayakumar B (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-17306?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17228567#comment-17228567
 ] 

Vinayakumar B edited comment on HADOOP-17306 at 11/9/20, 1:14 PM:
--

Hi [~Jim_Brennan], thanks for pointing out the test failures.

AFAIK, the test failures are due to the test code explicitly setting the timestamp of 
the {{LocalResource}} to the value returned by {{File.lastModified()}} for the script 
file used in the tests. As mentioned in this Jira's title, {{File.lastModified()}} is 
broken and loses accuracy. I tried replacing the {{File.lastModified()}} calls with 
{{Files.getLastModifiedTime(file.toPath()).toMillis()}}, and all tests passed.

The AM sets the timestamp using the value returned by 
{{FileStatus#getModificationTime()}}, in which case it will be consistent. So I don't 
see any problem with the production code as long as 
{{FileStatus#getModificationTime()}} is used.

As Steve mentioned, relying on modification time and length may not be a good idea for 
detecting changes.

Without this fix, data could be corrupted or modified within the same second and 
without changing the length of the file; such a change would go undetected, since 
modificationTime() loses the millisecond part.


was (Author: vinayrpet):
Hi [~Jim_Brennan], thanks for pointing out the test failures.

AFAIK, the test failures are due to the test code explicitly setting the timestamp of 
the {{LocalResource}} to the value returned by {{File.lastModified()}} for the script 
file used in the tests. As mentioned in this Jira's title, {{File.lastModified()}} is 
broken and loses accuracy. I tried replacing the {{File.lastModified()}} calls with 
{{Files.getLastModifiedTime(file.toPath()).toMillis()}}, and all tests passed.

The AM sets the timestamp using the value returned by 
{{FileStatus#getModificationTime()}}, in which case it will be consistent. So I don't 
see any problem with the production code as long as 
{{FileStatus#getModificationTime()}} is used.

As Steve mentioned, relying on modification time and length may not be a good idea for 
detecting changes. There could be possibilities of corruption 

> RawLocalFileSystem's lastModifiedTime() looses milli seconds in JDK < 10.b09
> 
>
> Key: HADOOP-17306
> URL: https://issues.apache.org/jira/browse/HADOOP-17306
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: fs
>Reporter: Vinayakumar B
>Assignee: Vinayakumar B
>Priority: Major
>  Labels: pull-request-available
>  Time Spent: 1h 40m
>  Remaining Estimate: 0h
>
> RawLocalFileSystem's FileStatus uses the {{File.lastModified()}} API from the JDK.
> This API loses milliseconds due to a JDK bug:
> [https://bugs.java.com/bugdatabase/view_bug.do?bug_id=8177809]
> This bug is fixed in JDK 10 b09 onwards but still exists in JDK 8, which is still 
> used in many production environments.
> Apparently, {{Files.getLastModifiedTime()}} from Java's NIO package returns the 
> correct time.
> Use {{Files.getLastModifiedTime()}} instead of {{File.lastModified()}} as a 
> workaround.
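A minimal, standalone sketch of the workaround described above; on an affected JDK 8 
build the first value may be truncated to whole seconds while the second keeps 
millisecond precision (the file name and output format are illustrative only).

{code:java}
import java.io.File;
import java.nio.file.Files;
import java.nio.file.Path;

public class MTimePrecisionDemo {
  public static void main(String[] args) throws Exception {
    File f = File.createTempFile("mtime", ".tmp");
    Path p = f.toPath();
    long viaFile = f.lastModified();                        // may lose millis on JDK < 10 b09
    long viaNio = Files.getLastModifiedTime(p).toMillis();  // keeps millis
    System.out.println("File.lastModified()         = " + viaFile);
    System.out.println("Files.getLastModifiedTime() = " + viaNio);
    f.delete();
  }
}
{code}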



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Commented] (HADOOP-17306) RawLocalFileSystem's lastModifiedTime() looses milli seconds in JDK < 10.b09

2020-11-09 Thread Vinayakumar B (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-17306?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17228567#comment-17228567
 ] 

Vinayakumar B commented on HADOOP-17306:


Hi [~Jim_Brennan], thanks for pointing out the test failures.

AFAIK, the test failures are due to the test code explicitly setting the timestamp of 
the {{LocalResource}} to the value returned by {{File.lastModified()}} for the script 
file used in the tests. As mentioned in this Jira's title, {{File.lastModified()}} is 
broken and loses accuracy. I tried replacing the {{File.lastModified()}} calls with 
{{Files.getLastModifiedTime(file.toPath()).toMillis()}}, and all tests passed.

The AM sets the timestamp using the value returned by 
{{FileStatus#getModificationTime()}}, in which case it will be consistent. So I don't 
see any problem with the production code as long as 
{{FileStatus#getModificationTime()}} is used.

As Steve mentioned, relying on modification time and length may not be a good idea for 
detecting changes. There could be possibilities of corruption 

> RawLocalFileSystem's lastModifiedTime() looses milli seconds in JDK < 10.b09
> 
>
> Key: HADOOP-17306
> URL: https://issues.apache.org/jira/browse/HADOOP-17306
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: fs
>Reporter: Vinayakumar B
>Assignee: Vinayakumar B
>Priority: Major
>  Labels: pull-request-available
>  Time Spent: 1h 40m
>  Remaining Estimate: 0h
>
> RawLocalFileSystem's FileStatus uses the {{File.lastModified()}} API from the JDK.
> This API loses milliseconds due to a JDK bug:
> [https://bugs.java.com/bugdatabase/view_bug.do?bug_id=8177809]
> This bug is fixed in JDK 10 b09 onwards but still exists in JDK 8, which is still 
> used in many production environments.
> Apparently, {{Files.getLastModifiedTime()}} from Java's NIO package returns the 
> correct time.
> Use {{Files.getLastModifiedTime()}} instead of {{File.lastModified()}} as a 
> workaround.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Resolved] (HADOOP-17306) RawLocalFileSystem's lastModifiedTime() looses milli seconds in JDK < 10.b09

2020-10-23 Thread Vinayakumar B (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-17306?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Vinayakumar B resolved HADOOP-17306.

Fix Version/s: 3.4.0
   3.3.1
   3.2.2
 Hadoop Flags: Reviewed
   Resolution: Fixed

Merged to trunk, branch-3.3 and branch-3.2

 

Thanks [~aajisaka] and [~ayushsaxena] for reviews.

> RawLocalFileSystem's lastModifiedTime() looses milli seconds in JDK < 10.b09
> 
>
> Key: HADOOP-17306
> URL: https://issues.apache.org/jira/browse/HADOOP-17306
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: fs
>Reporter: Vinayakumar B
>Assignee: Vinayakumar B
>Priority: Major
>  Labels: pull-request-available
> Fix For: 3.2.2, 3.3.1, 3.4.0
>
>  Time Spent: 1.5h
>  Remaining Estimate: 0h
>
> RawLocalFileSystem's FileStatus uses the {{File.lastModified()}} API from the JDK.
> This API loses milliseconds due to a JDK bug:
> [https://bugs.java.com/bugdatabase/view_bug.do?bug_id=8177809]
> This bug is fixed in JDK 10 b09 onwards but still exists in JDK 8, which is still 
> used in many production environments.
> Apparently, {{Files.getLastModifiedTime()}} from Java's NIO package returns the 
> correct time.
> Use {{Files.getLastModifiedTime()}} instead of {{File.lastModified()}} as a 
> workaround.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Updated] (HADOOP-17306) RawLocalFileSystem's lastModifiedTime() looses milli seconds in JDK < 10.b09

2020-10-15 Thread Vinayakumar B (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-17306?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Vinayakumar B updated HADOOP-17306:
---
Summary: RawLocalFileSystem's lastModifiedTime() looses milli seconds in 
JDK < 10.b09  (was: RawLocalFileSystem's lastModifiedTime() looses milli 
seconds in JDK < 10 b09)

> RawLocalFileSystem's lastModifiedTime() looses milli seconds in JDK < 10.b09
> 
>
> Key: HADOOP-17306
> URL: https://issues.apache.org/jira/browse/HADOOP-17306
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: fs
>Reporter: Vinayakumar B
>Assignee: Vinayakumar B
>Priority: Major
>  Labels: pull-request-available
>  Time Spent: 10m
>  Remaining Estimate: 0h
>
> RawLocalFileSystem's FileStatus uses the {{File.lastModified()}} API from the JDK.
> This API loses milliseconds due to a JDK bug:
> [https://bugs.java.com/bugdatabase/view_bug.do?bug_id=8177809]
> This bug is fixed in JDK 10 b09 onwards but still exists in JDK 8, which is still 
> used in many production environments.
> Apparently, {{Files.getLastModifiedTime()}} from Java's NIO package returns the 
> correct time.
> Use {{Files.getLastModifiedTime()}} instead of {{File.lastModified()}} as a 
> workaround.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Work started] (HADOOP-17306) RawLocalFileSystem's lastModifiedTime() looses milli seconds in JDK < 10 b09

2020-10-15 Thread Vinayakumar B (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-17306?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Work on HADOOP-17306 started by Vinayakumar B.
--
> RawLocalFileSystem's lastModifiedTime() looses milli seconds in JDK < 10 b09
> 
>
> Key: HADOOP-17306
> URL: https://issues.apache.org/jira/browse/HADOOP-17306
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: fs
>Reporter: Vinayakumar B
>Assignee: Vinayakumar B
>Priority: Major
>  Labels: pull-request-available
>  Time Spent: 10m
>  Remaining Estimate: 0h
>
> RawLocalFileSystem's FileStatus uses the {{File.lastModified()}} API from the JDK.
> This API loses milliseconds due to a JDK bug:
> [https://bugs.java.com/bugdatabase/view_bug.do?bug_id=8177809]
> This bug is fixed in JDK 10 b09 onwards but still exists in JDK 8, which is still 
> used in many production environments.
> Apparently, {{Files.getLastModifiedTime()}} from Java's NIO package returns the 
> correct time.
> Use {{Files.getLastModifiedTime()}} instead of {{File.lastModified()}} as a 
> workaround.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Assigned] (HADOOP-17306) RawLocalFileSystem's lastModifiedTime() looses milli seconds in JDK < 10 b09

2020-10-15 Thread Vinayakumar B (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-17306?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Vinayakumar B reassigned HADOOP-17306:
--

Assignee: Vinayakumar B

> RawLocalFileSystem's lastModifiedTime() looses milli seconds in JDK < 10 b09
> 
>
> Key: HADOOP-17306
> URL: https://issues.apache.org/jira/browse/HADOOP-17306
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: fs
>Reporter: Vinayakumar B
>Assignee: Vinayakumar B
>Priority: Major
>
> RawLocalFileSystem's FileStatus uses the {{File.lastModified()}} API from the JDK.
> This API loses milliseconds due to a JDK bug:
> [https://bugs.java.com/bugdatabase/view_bug.do?bug_id=8177809]
> This bug is fixed in JDK 10 b09 onwards but still exists in JDK 8, which is still 
> used in many production environments.
> Apparently, {{Files.getLastModifiedTime()}} from Java's NIO package returns the 
> correct time.
> Use {{Files.getLastModifiedTime()}} instead of {{File.lastModified()}} as a 
> workaround.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Created] (HADOOP-17306) RawLocalFileSystem's lastModifiedTime() looses milli seconds in JDK < 10 b09

2020-10-15 Thread Vinayakumar B (Jira)
Vinayakumar B created HADOOP-17306:
--

 Summary: RawLocalFileSystem's lastModifiedTime() looses milli 
seconds in JDK < 10 b09
 Key: HADOOP-17306
 URL: https://issues.apache.org/jira/browse/HADOOP-17306
 Project: Hadoop Common
  Issue Type: Bug
  Components: fs
Reporter: Vinayakumar B


RawLocalFileSystem's FileStatus uses the {{File.lastModified()}} API from the JDK.

This API loses milliseconds due to a JDK bug:

[https://bugs.java.com/bugdatabase/view_bug.do?bug_id=8177809]

This bug is fixed in JDK 10 b09 onwards but still exists in JDK 8, which is still 
used in many production environments.

Apparently, {{Files.getLastModifiedTime()}} from Java's NIO package returns the 
correct time.

Use {{Files.getLastModifiedTime()}} instead of {{File.lastModified()}} as a 
workaround.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Resolved] (HADOOP-17278) Shade guava 29.0-jre in hadoop thirdparty

2020-09-27 Thread Vinayakumar B (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-17278?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Vinayakumar B resolved HADOOP-17278.

Fix Version/s: thirdparty-1.1.0
 Hadoop Flags: Reviewed
   Resolution: Fixed

Merged to trunk of hadoop-thirdparty

> Shade guava 29.0-jre in hadoop thirdparty
> -
>
> Key: HADOOP-17278
> URL: https://issues.apache.org/jira/browse/HADOOP-17278
> Project: Hadoop Common
>  Issue Type: Sub-task
>Reporter: Ayush Saxena
>Assignee: Ayush Saxena
>Priority: Major
>  Labels: pull-request-available
> Fix For: thirdparty-1.1.0
>
>
> Shade guava 27.0-jre in hadoop-thirdparty



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Updated] (HADOOP-17046) Support downstreams' existing Hadoop-rpc implementations using non-shaded protobuf classes.

2020-06-12 Thread Vinayakumar B (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-17046?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Vinayakumar B updated HADOOP-17046:
---
Fix Version/s: 3.3.0
 Hadoop Flags: Reviewed
   Resolution: Fixed
   Status: Resolved  (was: Patch Available)

Thanks [~aajisaka] and [~ayushtkn] for the reviews on the PR.
Merged to trunk, branch-3.3 and branch-3.3.0


> Support downstreams' existing Hadoop-rpc implementations using non-shaded 
> protobuf classes.
> ---
>
> Key: HADOOP-17046
> URL: https://issues.apache.org/jira/browse/HADOOP-17046
> Project: Hadoop Common
>  Issue Type: Improvement
>  Components: rpc-server
>Affects Versions: 3.3.0
>Reporter: Vinayakumar B
>Assignee: Vinayakumar B
>Priority: Major
> Fix For: 3.3.0
>
>
> After the upgrade/shading of protobuf to version 3.7, existing Hadoop-RPC 
> client-server implementations using ProtobufRpcEngine will not work.
> So, this Jira proposes to keep the existing ProtobufRpcEngine as-is (without 
> shading and with the protobuf-2.5.0 implementation) to support downstream 
> implementations.
> Use the new ProtobufRpcEngine2, which uses the shaded protobuf classes, within 
> Hadoop and in downstream projects that later wish to upgrade their protobufs to 3.x.
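A minimal sketch of how the two engines described above can coexist, assuming 
{{RPC.setProtocolEngine(conf, protocol, engine)}} is the per-protocol switch point; 
{{LegacyPB}} and {{ShadedPB}} are hypothetical placeholder protocol interfaces, not 
real Hadoop classes.

{code:java}
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.ipc.ProtobufRpcEngine;
import org.apache.hadoop.ipc.ProtobufRpcEngine2;
import org.apache.hadoop.ipc.RPC;

public class RpcEngineSelectionSketch {
  interface LegacyPB { }   // hypothetical protocol built against non-shaded protobuf 2.5.0
  interface ShadedPB { }   // hypothetical protocol built against the shaded protobuf 3.x

  public static void main(String[] args) {
    Configuration conf = new Configuration();
    // Downstream code compiled against com.google.protobuf keeps the old engine:
    RPC.setProtocolEngine(conf, LegacyPB.class, ProtobufRpcEngine.class);
    // Hadoop-internal (and upgraded downstream) protocols use the new engine:
    RPC.setProtocolEngine(conf, ShadedPB.class, ProtobufRpcEngine2.class);
  }
}
{code}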



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Assigned] (HADOOP-17046) Support downstreams' existing Hadoop-rpc implementations using non-shaded protobuf classes.

2020-05-18 Thread Vinayakumar B (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-17046?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Vinayakumar B reassigned HADOOP-17046:
--

Assignee: Vinayakumar B

> Support downstreams' existing Hadoop-rpc implementations using non-shaded 
> protobuf classes.
> ---
>
> Key: HADOOP-17046
> URL: https://issues.apache.org/jira/browse/HADOOP-17046
> Project: Hadoop Common
>  Issue Type: Improvement
>  Components: rpc-server
>Affects Versions: 3.3.0
>Reporter: Vinayakumar B
>Assignee: Vinayakumar B
>Priority: Major
>
> After the upgrade/shading of protobuf to version 3.7, existing Hadoop-RPC 
> client-server implementations using ProtobufRpcEngine will not work.
> So, this Jira proposes to keep the existing ProtobufRpcEngine as-is (without 
> shading and with the protobuf-2.5.0 implementation) to support downstream 
> implementations.
> Use the new ProtobufRpcEngine2, which uses the shaded protobuf classes, within 
> Hadoop and in downstream projects that later wish to upgrade their protobufs to 3.x.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Created] (HADOOP-17046) Support downstreams' existing Hadoop-rpc implementations using non-shaded protobuf classes.

2020-05-18 Thread Vinayakumar B (Jira)
Vinayakumar B created HADOOP-17046:
--

 Summary: Support downstreams' existing Hadoop-rpc implementations 
using non-shaded protobuf classes.
 Key: HADOOP-17046
 URL: https://issues.apache.org/jira/browse/HADOOP-17046
 Project: Hadoop Common
  Issue Type: Improvement
  Components: rpc-server
Affects Versions: 3.3.0
Reporter: Vinayakumar B


After the upgrade/shading of protobuf to version 3.7, existing Hadoop-RPC 
client-server implementations using ProtobufRpcEngine will not work.

So, this Jira proposes to keep the existing ProtobufRpcEngine as-is (without shading 
and with the protobuf-2.5.0 implementation) to support downstream implementations.

Use the new ProtobufRpcEngine2, which uses the shaded protobuf classes, within Hadoop 
and in downstream projects that later wish to upgrade their protobufs to 3.x.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Commented] (HADOOP-16957) NodeBase.normalize doesn't removing all trailing slashes.

2020-04-29 Thread Vinayakumar B (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-16957?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17095236#comment-17095236
 ] 

Vinayakumar B commented on HADOOP-16957:


+1

> NodeBase.normalize doesn't removing all trailing slashes.
> -
>
> Key: HADOOP-16957
> URL: https://issues.apache.org/jira/browse/HADOOP-16957
> Project: Hadoop Common
>  Issue Type: Bug
>Reporter: Ayush Saxena
>Assignee: Ayush Saxena
>Priority: Major
> Attachments: HADOOP-16957-01.patch
>
>
> As per the javadoc:
> /** Normalize a path by stripping off any trailing {@link #PATH_SEPARATOR}
> But it removes only one trailing separator.
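A hypothetical sketch (not NodeBase's actual code) of stripping every trailing 
separator, which is what the quoted javadoc promises; the method name and separator 
parameter are made up for illustration.

{code:java}
public final class PathNormalizeSketch {
  // Drop all trailing separators, but never reduce "/" itself to an empty string.
  static String stripTrailingSeparators(String path, char separator) {
    int end = path.length();
    while (end > 1 && path.charAt(end - 1) == separator) {
      end--;
    }
    return path.substring(0, end);
  }

  public static void main(String[] args) {
    System.out.println(stripTrailingSeparators("/rack1///", '/'));  // prints /rack1
  }
}
{code}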



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Resolved] (HADOOP-16985) Handle release package related issues

2020-04-15 Thread Vinayakumar B (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-16985?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Vinayakumar B resolved HADOOP-16985.

Fix Version/s: 3.3.0
 Hadoop Flags: Reviewed
   Resolution: Fixed

Committed to branch-3.3 and trunk

> Handle release package related issues
> -
>
> Key: HADOOP-16985
> URL: https://issues.apache.org/jira/browse/HADOOP-16985
> Project: Hadoop Common
>  Issue Type: Bug
>Reporter: Vinayakumar B
>Assignee: Vinayakumar B
>Priority: Major
> Fix For: 3.3.0
>
>
> The same issue mentioned in HADOOP-16919 is present in the hadoop distribution 
> generation as well.
> Handle the following comments from [~elek] in the 1.0.0-RC0 voting mail thread here: 
> [https://lists.apache.org/thread.html/r1f2e8325ecef239f0d713c683a16336e2a22431a9f6bfbde3c763816%40%3Ccommon-dev.hadoop.apache.org%3E]
> {quote}3. Yetus seems to be included in the source package. I am not sure if
>  it's intentional but I would remove the patchprocess directory from the
>  tar file.
> 7. Minor nit: I would suggest to use only the filename in the sha512
>  files (instead of having the /build/source/target prefix). It would help
>  to use `sha512 -c` command to validate the checksum.
> {quote}
>  



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Updated] (HADOOP-16985) Handle release package related issues

2020-04-14 Thread Vinayakumar B (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-16985?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Vinayakumar B updated HADOOP-16985:
---
Target Version/s: 3.3.0

Marking 3.3.0 as the target version, as this change helps to generate the package 
without the above-mentioned issues.

> Handle release package related issues
> -
>
> Key: HADOOP-16985
> URL: https://issues.apache.org/jira/browse/HADOOP-16985
> Project: Hadoop Common
>  Issue Type: Bug
>Reporter: Vinayakumar B
>Assignee: Vinayakumar B
>Priority: Major
>
> The same issue mentioned in HADOOP-16919 is present in the hadoop distribution 
> generation as well.
> Handle the following comments from [~elek] in the 1.0.0-RC0 voting mail thread here: 
> [https://lists.apache.org/thread.html/r1f2e8325ecef239f0d713c683a16336e2a22431a9f6bfbde3c763816%40%3Ccommon-dev.hadoop.apache.org%3E]
> {quote}3. Yetus seems to be included in the source package. I am not sure if
>  it's intentional but I would remove the patchprocess directory from the
>  tar file.
> 7. Minor nit: I would suggest to use only the filename in the sha512
>  files (instead of having the /build/source/target prefix). It would help
>  to use `sha512 -c` command to validate the checksum.
> {quote}
>  



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Work started] (HADOOP-16985) Handle release package related issues

2020-04-14 Thread Vinayakumar B (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-16985?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Work on HADOOP-16985 started by Vinayakumar B.
--
> Handle release package related issues
> -
>
> Key: HADOOP-16985
> URL: https://issues.apache.org/jira/browse/HADOOP-16985
> Project: Hadoop Common
>  Issue Type: Bug
>Reporter: Vinayakumar B
>Assignee: Vinayakumar B
>Priority: Major
>
> The same issue mentioned in HADOOP-16919 is present in the hadoop distribution 
> generation as well.
> Handle the following comments from [~elek] in the 1.0.0-RC0 voting mail thread here: 
> [https://lists.apache.org/thread.html/r1f2e8325ecef239f0d713c683a16336e2a22431a9f6bfbde3c763816%40%3Ccommon-dev.hadoop.apache.org%3E]
> {quote}3. Yetus seems to be included in the source package. I am not sure if
>  it's intentional but I would remove the patchprocess directory from the
>  tar file.
> 7. Minor nit: I would suggest to use only the filename in the sha512
>  files (instead of having the /build/source/target prefix). It would help
>  to use `sha512 -c` command to validate the checksum.
> {quote}
>  



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Assigned] (HADOOP-16985) Handle release package related issues

2020-04-14 Thread Vinayakumar B (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-16985?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Vinayakumar B reassigned HADOOP-16985:
--

Assignee: Vinayakumar B

> Handle release package related issues
> -
>
> Key: HADOOP-16985
> URL: https://issues.apache.org/jira/browse/HADOOP-16985
> Project: Hadoop Common
>  Issue Type: Bug
>Reporter: Vinayakumar B
>Assignee: Vinayakumar B
>Priority: Major
>
> The same issue mentioned in HADOOP-16919 is present in the hadoop distribution 
> generation as well.
> Handle the following comments from [~elek] in the 1.0.0-RC0 voting mail thread here: 
> [https://lists.apache.org/thread.html/r1f2e8325ecef239f0d713c683a16336e2a22431a9f6bfbde3c763816%40%3Ccommon-dev.hadoop.apache.org%3E]
> {quote}3. Yetus seems to be included in the source package. I am not sure if
>  it's intentional but I would remove the patchprocess directory from the
>  tar file.
> 7. Minor nit: I would suggest to use only the filename in the sha512
>  files (instead of having the /build/source/target prefix). It would help
>  to use `sha512 -c` command to validate the checksum.
> {quote}
>  



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Updated] (HADOOP-16985) Handle release package related issues

2020-04-14 Thread Vinayakumar B (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-16985?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Vinayakumar B updated HADOOP-16985:
---
Issue Type: Bug  (was: Improvement)

> Handle release package related issues
> -
>
> Key: HADOOP-16985
> URL: https://issues.apache.org/jira/browse/HADOOP-16985
> Project: Hadoop Common
>  Issue Type: Bug
>Reporter: Vinayakumar B
>Priority: Major
>
> The same issue mentioned in HADOOP-16919 is present in the hadoop distribution 
> generation as well.
> Handle the following comments from [~elek] in the 1.0.0-RC0 voting mail thread here: 
> [https://lists.apache.org/thread.html/r1f2e8325ecef239f0d713c683a16336e2a22431a9f6bfbde3c763816%40%3Ccommon-dev.hadoop.apache.org%3E]
> {quote}3. Yetus seems to be included in the source package. I am not sure if
>  it's intentional but I would remove the patchprocess directory from the
>  tar file.
> 7. Minor nit: I would suggest to use only the filename in the sha512
>  files (instead of having the /build/source/target prefix). It would help
>  to use `sha512 -c` command to validate the checksum.
> {quote}
>  



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Created] (HADOOP-16985) Handle release package related issues

2020-04-14 Thread Vinayakumar B (Jira)
Vinayakumar B created HADOOP-16985:
--

 Summary: Handle release package related issues
 Key: HADOOP-16985
 URL: https://issues.apache.org/jira/browse/HADOOP-16985
 Project: Hadoop Common
  Issue Type: Improvement
Reporter: Vinayakumar B


The same issue mentioned in HADOOP-16919 is present in the hadoop distribution 
generation as well.

Handle the following comments from [~elek] in the 1.0.0-RC0 voting mail thread here: 
[https://lists.apache.org/thread.html/r1f2e8325ecef239f0d713c683a16336e2a22431a9f6bfbde3c763816%40%3Ccommon-dev.hadoop.apache.org%3E]
{quote}3. Yetus seems to be included in the source package. I am not sure if
 it's intentional but I would remove the patchprocess directory from the
 tar file.

7. Minor nit: I would suggest to use only the filename in the sha512
 files (instead of having the /build/source/target prefix). It would help
 to use `sha512 -c` command to validate the checksum.
{quote}
 



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Commented] (HADOOP-16952) ADD .diff to gitignore

2020-04-03 Thread Vinayakumar B (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-16952?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17074305#comment-17074305
 ] 

Vinayakumar B commented on HADOOP-16952:


+1

> ADD .diff to gitignore
> --
>
> Key: HADOOP-16952
> URL: https://issues.apache.org/jira/browse/HADOOP-16952
> Project: Hadoop Common
>  Issue Type: Improvement
>Reporter: Ayush Saxena
>Assignee: Ayush Saxena
>Priority: Minor
> Attachments: HADOOP-16952-01.patch
>
>
> Add .diff to .gitignore.
> Otherwise, "git add ." stages the .diff files too.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Comment Edited] (HADOOP-16927) Update hadoop-thirdparty dependency version to 1.0.0

2020-03-20 Thread Vinayakumar B (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-16927?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17063247#comment-17063247
 ] 

Vinayakumar B edited comment on HADOOP-16927 at 3/20/20, 9:48 AM:
--

Committed to trunk. Thanks [~ayushtkn] for the review on the PR.


was (Author: vinayrpet):
Committed to trunk.

> Update hadoop-thirdparty dependency version to 1.0.0
> 
>
> Key: HADOOP-16927
> URL: https://issues.apache.org/jira/browse/HADOOP-16927
> Project: Hadoop Common
>  Issue Type: Improvement
>Reporter: Vinayakumar B
>Assignee: Vinayakumar B
>Priority: Major
>  Labels: release-blocker
> Fix For: 3.3.0
>
>
> Now that hadoop-thirdparty 1.0.0 is released, it's time to upgrade to the 
> released version in hadoop.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Resolved] (HADOOP-16927) Update hadoop-thirdparty dependency version to 1.0.0

2020-03-20 Thread Vinayakumar B (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-16927?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Vinayakumar B resolved HADOOP-16927.

Fix Version/s: 3.3.0
 Hadoop Flags: Reviewed
   Resolution: Fixed

Committed to trunk.

> Update hadoop-thirdparty dependency version to 1.0.0
> 
>
> Key: HADOOP-16927
> URL: https://issues.apache.org/jira/browse/HADOOP-16927
> Project: Hadoop Common
>  Issue Type: Improvement
>Reporter: Vinayakumar B
>Assignee: Vinayakumar B
>Priority: Major
>  Labels: release-blocker
> Fix For: 3.3.0
>
>
> Now that hadoop-thirdparty 1.0.0 is released, it's time to upgrade to the 
> released version in hadoop.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Updated] (HADOOP-16927) Update hadoop-thirdparty dependency version to 1.0.0

2020-03-19 Thread Vinayakumar B (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-16927?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Vinayakumar B updated HADOOP-16927:
---
Description: Now that hadoop-thirdparty 1.0.0 is released, it's time to upgrade 
to the released version in hadoop.

> Update hadoop-thirdparty dependency version to 1.0.0
> 
>
> Key: HADOOP-16927
> URL: https://issues.apache.org/jira/browse/HADOOP-16927
> Project: Hadoop Common
>  Issue Type: Improvement
>Reporter: Vinayakumar B
>Assignee: Vinayakumar B
>Priority: Major
>  Labels: release-blocker
>
> Now that hadoop-thirdparty 1.0.0 is released, it's time to upgrade to the 
> released version in hadoop.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Work started] (HADOOP-16927) Update hadoop-thirdparty dependency version to 1.0.0

2020-03-19 Thread Vinayakumar B (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-16927?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Work on HADOOP-16927 started by Vinayakumar B.
--
> Update hadoop-thirdparty dependency version to 1.0.0
> 
>
> Key: HADOOP-16927
> URL: https://issues.apache.org/jira/browse/HADOOP-16927
> Project: Hadoop Common
>  Issue Type: Improvement
>Reporter: Vinayakumar B
>Assignee: Vinayakumar B
>Priority: Major
>  Labels: release-blocker
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Assigned] (HADOOP-16927) Update hadoop-thirdparty dependency version to 1.0.0

2020-03-18 Thread Vinayakumar B (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-16927?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Vinayakumar B reassigned HADOOP-16927:
--

Assignee: Vinayakumar B

> Update hadoop-thirdparty dependency version to 1.0.0
> 
>
> Key: HADOOP-16927
> URL: https://issues.apache.org/jira/browse/HADOOP-16927
> Project: Hadoop Common
>  Issue Type: Improvement
>Reporter: Vinayakumar B
>Assignee: Vinayakumar B
>Priority: Major
>




--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Created] (HADOOP-16927) Update hadoop-thirdparty dependency version to 1.0.0

2020-03-18 Thread Vinayakumar B (Jira)
Vinayakumar B created HADOOP-16927:
--

 Summary: Update hadoop-thirdparty dependency version to 1.0.0
 Key: HADOOP-16927
 URL: https://issues.apache.org/jira/browse/HADOOP-16927
 Project: Hadoop Common
  Issue Type: Improvement
Reporter: Vinayakumar B






--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Resolved] (HADOOP-16919) [thirdparty] Handle release package related issues

2020-03-11 Thread Vinayakumar B (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-16919?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Vinayakumar B resolved HADOOP-16919.

Fix Version/s: thirdparty-1.0.0
 Hadoop Flags: Reviewed
   Resolution: Fixed

Merged to trunk and branch-1.0 of hadoop-thirdparty.

Thanks [~ayushtkn] for the reviews.

> [thirdparty] Handle release package related issues
> --
>
> Key: HADOOP-16919
> URL: https://issues.apache.org/jira/browse/HADOOP-16919
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: hadoop-thirdparty
>Reporter: Vinayakumar B
>Assignee: Vinayakumar B
>Priority: Major
> Fix For: thirdparty-1.0.0
>
>
> Handle the following comments from [~elek] in the 1.0.0-RC0 voting mail thread here: 
> [https://lists.apache.org/thread.html/r1f2e8325ecef239f0d713c683a16336e2a22431a9f6bfbde3c763816%40%3Ccommon-dev.hadoop.apache.org%3E]
> {quote}3. Yetus seems to be included in the source package. I am not sure if
>  it's intentional but I would remove the patchprocess directory from the
>  tar file.
> 7. Minor nit: I would suggest to use only the filename in the sha512
>  files (instead of having the /build/source/target prefix). It would help
>  to use `sha512 -c` command to validate the checksum.
> {quote}
> Also, update available artifacts in docs.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Resolved] (HADOOP-16895) [thirdparty] Revisit LICENSEs and NOTICEs

2020-03-11 Thread Vinayakumar B (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-16895?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Vinayakumar B resolved HADOOP-16895.

Fix Version/s: thirdparty-1.0.0
 Hadoop Flags: Reviewed
   Resolution: Fixed

Committed to branch-1.0 and trunk of hadoop-thirdparty.

Thanks [~aajisaka] and [~elek] for the reviews.

> [thirdparty] Revisit LICENSEs and NOTICEs
> -
>
> Key: HADOOP-16895
> URL: https://issues.apache.org/jira/browse/HADOOP-16895
> Project: Hadoop Common
>  Issue Type: Improvement
>Reporter: Vinayakumar B
>Assignee: Vinayakumar B
>Priority: Major
> Fix For: thirdparty-1.0.0
>
>
> LICENSE.txt and NOTICE.txt have many entries which are unrelated to thirdparty.
> Revisit and clean up such entries.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Assigned] (HADOOP-16919) [thirdparty] Handle release package related issues

2020-03-11 Thread Vinayakumar B (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-16919?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Vinayakumar B reassigned HADOOP-16919:
--

Assignee: Vinayakumar B

> [thirdparty] Handle release package related issues
> --
>
> Key: HADOOP-16919
> URL: https://issues.apache.org/jira/browse/HADOOP-16919
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: hadoop-thirdparty
>Reporter: Vinayakumar B
>Assignee: Vinayakumar B
>Priority: Major
>
> Handle the following comments from [~elek] in the 1.0.0-RC0 voting mail thread here: 
> [https://lists.apache.org/thread.html/r1f2e8325ecef239f0d713c683a16336e2a22431a9f6bfbde3c763816%40%3Ccommon-dev.hadoop.apache.org%3E]
> {quote}3. Yetus seems to be included in the source package. I am not sure if
>  it's intentional but I would remove the patchprocess directory from the
>  tar file.
> 7. Minor nit: I would suggest to use only the filename in the sha512
>  files (instead of having the /build/source/target prefix). It would help
>  to use `sha512 -c` command to validate the checksum.
> {quote}
> Also, update available artifacts in docs.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Created] (HADOOP-16919) [thirdparty] Handle release package related issues

2020-03-11 Thread Vinayakumar B (Jira)
Vinayakumar B created HADOOP-16919:
--

 Summary: [thirdparty] Handle release package related issues
 Key: HADOOP-16919
 URL: https://issues.apache.org/jira/browse/HADOOP-16919
 Project: Hadoop Common
  Issue Type: Bug
  Components: hadoop-thirdparty
Reporter: Vinayakumar B


Handle the following comments from [~elek] in the 1.0.0-RC0 voting mail thread here: 
[https://lists.apache.org/thread.html/r1f2e8325ecef239f0d713c683a16336e2a22431a9f6bfbde3c763816%40%3Ccommon-dev.hadoop.apache.org%3E]
{quote}3. Yetus seems to be included in the source package. I am not sure if
 it's intentional but I would remove the patchprocess directory from the
 tar file.

7. Minor nit: I would suggest to use only the filename in the sha512
 files (instead of having the /build/source/target prefix). It would help
 to use `sha512 -c` command to validate the checksum.
{quote}
Also, update available artifacts in docs.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Updated] (HADOOP-16895) [thirdparty] Revisit LICENSEs and NOTICEs

2020-02-28 Thread Vinayakumar B (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-16895?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Vinayakumar B updated HADOOP-16895:
---
Labels:   (was: Third-party)

> [thirdparty] Revisit LICENSEs and NOTICEs
> -
>
> Key: HADOOP-16895
> URL: https://issues.apache.org/jira/browse/HADOOP-16895
> Project: Hadoop Common
>  Issue Type: Improvement
>Reporter: Vinayakumar B
>Assignee: Vinayakumar B
>Priority: Major
>
> LICENSE.txt and NOTICE.txt have many entries which are unrelated to thirdparty.
> Revisit and clean up such entries.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Assigned] (HADOOP-16895) [thirdparty] Revisit LICENSEs and NOTICEs

2020-02-28 Thread Vinayakumar B (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-16895?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Vinayakumar B reassigned HADOOP-16895:
--

Target Version/s: thirdparty-1.1.0
Assignee: Vinayakumar B
  Labels: Third-party  (was: )

> [thirdparty] Revisit LICENSEs and NOTICEs
> -
>
> Key: HADOOP-16895
> URL: https://issues.apache.org/jira/browse/HADOOP-16895
> Project: Hadoop Common
>  Issue Type: Improvement
>Reporter: Vinayakumar B
>Assignee: Vinayakumar B
>Priority: Major
>  Labels: Third-party
>
> LICENSE.txt and NOTICE.txt have many entries which are unrelated to thirdparty.
> Revisit and clean up such entries.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Created] (HADOOP-16895) [thirdparty] Revisit LICENSEs and NOTICEs

2020-02-28 Thread Vinayakumar B (Jira)
Vinayakumar B created HADOOP-16895:
--

 Summary: [thirdparty] Revisit LICENSEs and NOTICEs
 Key: HADOOP-16895
 URL: https://issues.apache.org/jira/browse/HADOOP-16895
 Project: Hadoop Common
  Issue Type: Improvement
Reporter: Vinayakumar B


LICENSE.txt and NOTICE.txt have many entries which are unrelated to thirdparty.
Revisit and clean up such entries.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Resolved] (HADOOP-16596) [pb-upgrade] Use shaded protobuf classes from hadoop-thirdparty dependency

2020-02-07 Thread Vinayakumar B (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-16596?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Vinayakumar B resolved HADOOP-16596.

Fix Version/s: 3.3.0
 Hadoop Flags: Reviewed
 Release Note: All protobuf classes will be used from the 
hadoop-shaded-protobuf_3_7 artifact with the package prefix 
'org.apache.hadoop.thirdparty.protobuf' instead of 'com.google.protobuf'.
   Resolution: Fixed

Merged to trunk. Thanks everyone for reviews

> [pb-upgrade] Use shaded protobuf classes from hadoop-thirdparty dependency
> --
>
> Key: HADOOP-16596
> URL: https://issues.apache.org/jira/browse/HADOOP-16596
> Project: Hadoop Common
>  Issue Type: Sub-task
>Reporter: Vinayakumar B
>Assignee: Vinayakumar B
>Priority: Major
> Fix For: 3.3.0
>
>
> Use the shaded protobuf classes from "hadoop-thirdparty" in the hadoop codebase.
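An illustration of what the release note above means in code: protobuf runtime types 
are now referenced under the shaded package prefix instead of com.google.protobuf. The 
class below is a made-up example; only the two package prefixes come from the release 
note.

{code:java}
// Before HADOOP-16596 (non-shaded):
//   import com.google.protobuf.Message;
// After HADOOP-16596 (shaded via hadoop-thirdparty):
import org.apache.hadoop.thirdparty.protobuf.Message;

public class ShadedProtobufImportExample {
  // Any Hadoop code that previously handled com.google.protobuf.Message now
  // handles the same type relocated under the thirdparty prefix.
  static String describe(Message m) {
    return m.getClass().getName();
  }
}
{code}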



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Resolved] (HADOOP-16824) [thirdparty] port HADOOP-16754 (Fix docker failed to build yetus/hadoop) to thirdparty Dockerfile

2020-01-22 Thread Vinayakumar B (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-16824?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Vinayakumar B resolved HADOOP-16824.

Fix Version/s: thirdparty-1.0.0
 Hadoop Flags: Reviewed
   Resolution: Fixed

Merged PR.
Thanks [~aajisaka] for review.

> [thirdparty] port HADOOP-16754 (Fix docker failed to build yetus/hadoop) to 
> thirdparty Dockerfile
> -
>
> Key: HADOOP-16824
> URL: https://issues.apache.org/jira/browse/HADOOP-16824
> Project: Hadoop Common
>  Issue Type: Bug
>Reporter: Vinayakumar B
>Assignee: Vinayakumar B
>Priority: Major
> Fix For: thirdparty-1.0.0
>
>
> Port HADOOP-16754 to avoid the Docker build failure.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Assigned] (HADOOP-16824) [thirdparty] port HADOOP-16754 (Fix docker failed to build yetus/hadoop) to thirdparty Dockerfile

2020-01-21 Thread Vinayakumar B (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-16824?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Vinayakumar B reassigned HADOOP-16824:
--

Target Version/s: thirdparty-1.0.0
Assignee: Vinayakumar B

> [thirdparty] port HADOOP-16754 (Fix docker failed to build yetus/hadoop) to 
> thirdparty Dockerfile
> -
>
> Key: HADOOP-16824
> URL: https://issues.apache.org/jira/browse/HADOOP-16824
> Project: Hadoop Common
>  Issue Type: Bug
>Reporter: Vinayakumar B
>Assignee: Vinayakumar B
>Priority: Major
>
> Port HADOOP-16754 to avoid the Docker build failure.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Created] (HADOOP-16824) [thirdparty] port HADOOP-16754 (Fix docker failed to build yetus/hadoop) to thirdparty Dockerfile

2020-01-21 Thread Vinayakumar B (Jira)
Vinayakumar B created HADOOP-16824:
--

 Summary: [thirdparty] port HADOOP-16754 (Fix docker failed to 
build yetus/hadoop) to thirdparty Dockerfile
 Key: HADOOP-16824
 URL: https://issues.apache.org/jira/browse/HADOOP-16824
 Project: Hadoop Common
  Issue Type: Bug
Reporter: Vinayakumar B


Port HADOOP-16754 to avoid the Docker build failure.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Resolved] (HADOOP-16820) [thirdparty] ChangeLog and ReleaseNote are not packaged by createrelease script

2020-01-21 Thread Vinayakumar B (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-16820?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Vinayakumar B resolved HADOOP-16820.

Fix Version/s: thirdparty-1.0.0
 Hadoop Flags: Reviewed
 Assignee: Vinayakumar B
   Resolution: Fixed

Merged PR.
Thanks [~ayushtkn] for review.

> [thirdparty] ChangeLog and ReleaseNote are not packaged by createrelease 
> script
> ---
>
> Key: HADOOP-16820
> URL: https://issues.apache.org/jira/browse/HADOOP-16820
> Project: Hadoop Common
>  Issue Type: Bug
>  Components: hadoop-thirdparty
>Affects Versions: thirdparty-1.0.0
>Reporter: Vinayakumar B
>Assignee: Vinayakumar B
>Priority: Major
> Fix For: thirdparty-1.0.0
>
>
> The createrelease script is not packaging CHANGELOGS and RELEASENOTES during 
> generation of the site package for the hadoop-thirdparty module.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Resolved] (HADOOP-16821) [pb-upgrade] Use 'o.a.h.thirdparty.protobuf' shaded prefix instead of 'protobuf_3_7'

2020-01-21 Thread Vinayakumar B (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-16821?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Vinayakumar B resolved HADOOP-16821.

Fix Version/s: thirdparty-1.0.0
 Hadoop Flags: Reviewed
 Assignee: Vinayakumar B
   Resolution: Fixed

Committed to trunk.
Thanks [~ste...@apache.org] for review.

> [pb-upgrade] Use 'o.a.h.thirdparty.protobuf' shaded prefix instead of 
> 'protobuf_3_7'
> 
>
> Key: HADOOP-16821
> URL: https://issues.apache.org/jira/browse/HADOOP-16821
> Project: Hadoop Common
>  Issue Type: Sub-task
>  Components: hadoop-thirdparty
>Reporter: Vinayakumar B
>Assignee: Vinayakumar B
>Priority: Major
> Fix For: thirdparty-1.0.0
>
>
> As per the discussion 
> [here|https://github.com/apache/hadoop/pull/1635#issuecomment-576247014], 
> a versioned package name may make upgrading the library a non-trivial task: the 
> package name would need to be updated in all usages in all modules. 
> So a common package name is preferred.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Updated] (HADOOP-16821) [pb-upgrade] Use 'o.a.h.thirdparty.protobuf' shaded prefix instead of 'protobuf_3_7'

2020-01-21 Thread Vinayakumar B (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-16821?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Vinayakumar B updated HADOOP-16821:
---
Target Version/s: thirdparty-1.0.0

> [pb-upgrade] Use 'o.a.h.thirdparty.protobuf' shaded prefix instead of 
> 'protobuf_3_7'
> 
>
> Key: HADOOP-16821
> URL: https://issues.apache.org/jira/browse/HADOOP-16821
> Project: Hadoop Common
>  Issue Type: Sub-task
>  Components: hadoop-thirdparty
>Reporter: Vinayakumar B
>Priority: Major
>
> As per the discussion 
> [here|https://github.com/apache/hadoop/pull/1635#issuecomment-576247014], a 
> versioned package name may make upgrading the library a non-trivial task: the 
> package name would need to be updated in all usages in all modules. 
> So a common (unversioned) package name is preferred.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Updated] (HADOOP-16821) [pb-upgrade] Use 'o.a.h.thirdparty.protobuf' shaded prefix instead of 'protobuf_3_7'

2020-01-21 Thread Vinayakumar B (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-16821?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Vinayakumar B updated HADOOP-16821:
---
Component/s: hadoop-thirdparty

> [pb-upgrade] Use 'o.a.h.thirdparty.protobuf' shaded prefix instead of 
> 'protobuf_3_7'
> 
>
> Key: HADOOP-16821
> URL: https://issues.apache.org/jira/browse/HADOOP-16821
> Project: Hadoop Common
>  Issue Type: Sub-task
>  Components: hadoop-thirdparty
>Reporter: Vinayakumar B
>Priority: Major
>
> As per the discussion 
> [here|https://github.com/apache/hadoop/pull/1635#issuecomment-576247014], a 
> versioned package name may make upgrading the library a non-trivial task: the 
> package name would need to be updated in all usages in all modules. 
> So a common (unversioned) package name is preferred.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Created] (HADOOP-16821) [pb-upgrade] Use 'o.a.h.thirdparty.protobuf' shaded prefix instead of 'protobuf_3_7'

2020-01-21 Thread Vinayakumar B (Jira)
Vinayakumar B created HADOOP-16821:
--

 Summary: [pb-upgrade] Use 'o.a.h.thirdparty.protobuf' shaded 
prefix instead of 'protobuf_3_7'
 Key: HADOOP-16821
 URL: https://issues.apache.org/jira/browse/HADOOP-16821
 Project: Hadoop Common
  Issue Type: Sub-task
Reporter: Vinayakumar B


As per the discussion 
[here|https://github.com/apache/hadoop/pull/1635#issuecomment-576247014], a 
versioned package name may make upgrading the library a non-trivial task: the 
package name would need to be updated in all usages in all modules.

So a common (unversioned) package name is preferred.
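
To make the intent concrete, here is a minimal, hypothetical Java snippet showing 
what downstream code looks like against the unversioned shaded prefix. It assumes 
the hadoop-thirdparty shaded protobuf artifact is on the classpath and is a sketch 
of the idea, not code from the PR.

{code:java}
// Sketch only: assumes the hadoop-thirdparty shaded protobuf jar is available.
// With the unversioned prefix, this import stays stable across protobuf bumps;
// a versioned prefix such as o.a.h.thirdparty.protobuf_3_7 would force an
// import change in every consumer on each upgrade.
import org.apache.hadoop.thirdparty.protobuf.ByteString;

public class ShadedPrefixExample {
  public static void main(String[] args) {
    ByteString bytes = ByteString.copyFromUtf8("shaded protobuf in use");
    System.out.println("byte count: " + bytes.size());
  }
}
{code}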



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Created] (HADOOP-16820) [thirdparty] ChangeLog and ReleaseNote are not packaged by createrelease script

2020-01-21 Thread Vinayakumar B (Jira)
Vinayakumar B created HADOOP-16820:
--

 Summary: [thirdparty] ChangeLog and ReleaseNote are not packaged 
by createrelease script
 Key: HADOOP-16820
 URL: https://issues.apache.org/jira/browse/HADOOP-16820
 Project: Hadoop Common
  Issue Type: Bug
  Components: hadoop-thirdparty
Affects Versions: thirdparty-1.0.0
Reporter: Vinayakumar B


createrelease script is not packaging CHANGELOGS and RELEASENOTES during 
generation of site package for hadoop-thirdparty module.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Commented] (HADOOP-13363) Upgrade protobuf from 2.5.0 to something newer

2020-01-19 Thread Vinayakumar B (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-13363?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17019250#comment-17019250
 ] 

Vinayakumar B commented on HADOOP-13363:


Hi all,
Please review PR https://github.com/apache/hadoop/pull/1635 for HADOOP-16596.
This is required for the 3.3.0 release.
Thanks.

> Upgrade protobuf from 2.5.0 to something newer
> --
>
> Key: HADOOP-13363
> URL: https://issues.apache.org/jira/browse/HADOOP-13363
> Project: Hadoop Common
>  Issue Type: Improvement
>  Components: build
>Affects Versions: 3.0.0-alpha1, 3.0.0-alpha2
>Reporter: Anu Engineer
>Assignee: Vinayakumar B
>Priority: Major
>  Labels: security
> Attachments: HADOOP-13363.001.patch, HADOOP-13363.002.patch, 
> HADOOP-13363.003.patch, HADOOP-13363.004.patch, HADOOP-13363.005.patch
>
>
> Standard protobuf 2.5.0 does not work properly on many platforms (see, for 
> example, https://gist.github.com/BennettSmith/7111094). To avoid crazy 
> workarounds in the build environment, and given that 2.5.0 is slowly 
> disappearing as a standard installable package even for Linux/x86, we need to 
> either upgrade, self-bundle, or do something else.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Resolved] (HADOOP-16621) [pb-upgrade] Remove Protobuf classes from signatures of Public APIs

2020-01-16 Thread Vinayakumar B (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-16621?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Vinayakumar B resolved HADOOP-16621.

Fix Version/s: 3.3.0
 Hadoop Flags: Reviewed
 Release Note: 
The following APIs have been removed from Token.java to avoid protobuf classes 
in its public signatures:
1.   o.a.h.security.token.Token(TokenProto tokenPB)
2.   o.a.h.security.token.Token.toTokenProto()
   Resolution: Fixed

Merged PR to trunk.
Thanks [~ste...@apache.org] and [~ayushtkn] for reviews. 

> [pb-upgrade] Remove Protobuf classes from signatures of Public APIs
> ---
>
> Key: HADOOP-16621
> URL: https://issues.apache.org/jira/browse/HADOOP-16621
> Project: Hadoop Common
>  Issue Type: Sub-task
>  Components: common
>Affects Versions: 3.3.0
>Reporter: Steve Loughran
>Assignee: Vinayakumar B
>Priority: Critical
> Fix For: 3.3.0
>
>
> The move to protobuf 3.x stops Spark building because Token has a method 
> which returns a protobuf, and now it's returning some v3 types.
> If we want to isolate downstream code from protobuf changes, we need to move 
> that marshalling method out of Token and put it in a helper class.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Commented] (HADOOP-16621) [pb-upgrade] Remove Protobuf classes from signatures of Public APIs

2020-01-13 Thread Vinayakumar B (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-16621?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17014120#comment-17014120
 ] 

Vinayakumar B commented on HADOOP-16621:


I have verified the compilation of Spark (master branch) with this PR; the 
compilation was successful.

> [pb-upgrade] Remove Protobuf classes from signatures of Public APIs
> ---
>
> Key: HADOOP-16621
> URL: https://issues.apache.org/jira/browse/HADOOP-16621
> Project: Hadoop Common
>  Issue Type: Sub-task
>  Components: common
>Affects Versions: 3.3.0
>Reporter: Steve Loughran
>Assignee: Vinayakumar B
>Priority: Critical
>
> The move to protobuf 3.x stops Spark building because Token has a method 
> which returns a protobuf, and now it's returning some v3 types.
> If we want to isolate downstream code from protobuf changes, we need to move 
> that marshalling method out of Token and put it in a helper class.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Commented] (HADOOP-16621) [pb-upgrade] Remove Protobuf classes from signatures of Public APIs

2020-01-12 Thread Vinayakumar B (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-16621?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17014070#comment-17014070
 ] 

Vinayakumar B commented on HADOOP-16621:


Thanks [~ayushtkn] for confirming against the compatibility guidelines as well.
Updated the title to describe the problem.

Raised a PR for approach #1.
Please review.

> [pb-upgrade] Remove Protobuf classes from signatures of Public APIs
> ---
>
> Key: HADOOP-16621
> URL: https://issues.apache.org/jira/browse/HADOOP-16621
> Project: Hadoop Common
>  Issue Type: Sub-task
>  Components: common
>Affects Versions: 3.3.0
>Reporter: Steve Loughran
>Assignee: Vinayakumar B
>Priority: Critical
>
> The move to protobuf 3.x stops Spark building because Token has a method 
> which returns a protobuf, and now it's returning some v3 types.
> If we want to isolate downstream code from protobuf changes, we need to move 
> that marshalling method out of Token and put it in a helper class.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Updated] (HADOOP-16621) [pb-upgrade] Remove Protobuf classes from signatures of Public APIs

2020-01-12 Thread Vinayakumar B (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-16621?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Vinayakumar B updated HADOOP-16621:
---
Summary: [pb-upgrade] Remove Protobuf classes from signatures of Public 
APIs  (was: [pb-upgrade] spark-hive doesn't compile against hadoop trunk 
because of Token's marshalling)

> [pb-upgrade] Remove Protobuf classes from signatures of Public APIs
> ---
>
> Key: HADOOP-16621
> URL: https://issues.apache.org/jira/browse/HADOOP-16621
> Project: Hadoop Common
>  Issue Type: Sub-task
>  Components: common
>Affects Versions: 3.3.0
>Reporter: Steve Loughran
>Assignee: Vinayakumar B
>Priority: Critical
>
> The move to protobuf 3.x stops Spark building because Token has a method 
> which returns a protobuf, and now it's returning some v3 types.
> If we want to isolate downstream code from protobuf changes, we need to move 
> that marshalling method out of Token and put it in a helper class.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Resolved] (HADOOP-16595) [pb-upgrade] Create hadoop-thirdparty artifact to have shaded protobuf

2020-01-12 Thread Vinayakumar B (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-16595?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Vinayakumar B resolved HADOOP-16595.

Fix Version/s: thirdparty-1.0.0
 Hadoop Flags: Reviewed
   Resolution: Fixed

Merged the PR.
Thanks, everyone.

> [pb-upgrade] Create hadoop-thirdparty artifact to have shaded protobuf
> --
>
> Key: HADOOP-16595
> URL: https://issues.apache.org/jira/browse/HADOOP-16595
> Project: Hadoop Common
>  Issue Type: Sub-task
>  Components: hadoop-thirdparty
>Reporter: Vinayakumar B
>Assignee: Vinayakumar B
>Priority: Major
> Fix For: thirdparty-1.0.0
>
>
> Create a separate repo "hadoop-thirdparty" to have shaded dependencies.
> starting with protobuf-java:3.7.1



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Resolved] (HADOOP-16797) Add dockerfile for ARM builds

2020-01-12 Thread Vinayakumar B (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-16797?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Vinayakumar B resolved HADOOP-16797.

Fix Version/s: 3.3.0
 Hadoop Flags: Reviewed
   Resolution: Fixed

Merged PR to trunk. 
Thanks [~ayushtkn] and [~aajisaka] for reviews.

> Add dockerfile for ARM builds
> -
>
> Key: HADOOP-16797
> URL: https://issues.apache.org/jira/browse/HADOOP-16797
> Project: Hadoop Common
>  Issue Type: Sub-task
>Reporter: Vinayakumar B
>Assignee: Vinayakumar B
>Priority: Major
> Fix For: 3.3.0
>
>
> Similar to x86 docker image in {{dev-support/docker/Dockerfile}},
> add one more Dockerfile to support aarch64 builds.
> And support all scripts (createrelease, start-build-env.sh, etc ) to make use 
> of it in ARM platform.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Commented] (HADOOP-16603) Lack of aarch64 platform support of dependent PhantomJS

2020-01-09 Thread Vinayakumar B (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-16603?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17012486#comment-17012486
 ] 

Vinayakumar B commented on HADOOP-16603:


HADOOP-16797 takes care of installing phantomjs from a custom prebuilt binary 
in the docker image for aarch64.

So the above-mentioned maven plugin will not attempt to download it again.

> Lack of aarch64 platform support of dependent PhantomJS
> ---
>
> Key: HADOOP-16603
> URL: https://issues.apache.org/jira/browse/HADOOP-16603
> Project: Hadoop Common
>  Issue Type: Bug
>Reporter: liusheng
>Priority: Trivial
>
> Hadoop depends on the "PhantomJS-2.1.1"[1] library and imports it via 
> "phantomjs-maven-plugin:0.7", but there isn't an artifact of phantomjs for 
> aarch64 in the "com.github.klieber" group used by Hadoop[2].
> [1] 
> [https://github.com/apache/hadoop/blob/trunk/hadoop-project/pom.xml#L1703-L1707]
> [[2] 
> https://search.maven.org/artifact/com.github.klieber/phantomjs/2.1.1/N%2FA|https://search.maven.org/artifact/com.github.klieber/phantomjs/2.1.1/N%2FA]



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Commented] (HADOOP-16797) Add dockerfile for ARM builds

2020-01-09 Thread Vinayakumar B (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-16797?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17011749#comment-17011749
 ] 

Vinayakumar B commented on HADOOP-16797:


Created the Docker image for ARM and updated the PR so that the script uses 
this docker file to generate release artifacts.

Please review.

> Add dockerfile for ARM builds
> -
>
> Key: HADOOP-16797
> URL: https://issues.apache.org/jira/browse/HADOOP-16797
> Project: Hadoop Common
>  Issue Type: Sub-task
>Reporter: Vinayakumar B
>Assignee: Vinayakumar B
>Priority: Major
>
> Similar to x86 docker image in {{dev-support/docker/Dockerfile}},
> add one more Dockerfile to support aarch64 builds.
> And support all scripts (createrelease, start-build-env.sh, etc ) to make use 
> of it in ARM platform.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Work started] (HADOOP-16797) Add dockerfile for ARM builds

2020-01-09 Thread Vinayakumar B (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-16797?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Work on HADOOP-16797 started by Vinayakumar B.
--
> Add dockerfile for ARM builds
> -
>
> Key: HADOOP-16797
> URL: https://issues.apache.org/jira/browse/HADOOP-16797
> Project: Hadoop Common
>  Issue Type: Sub-task
>Reporter: Vinayakumar B
>Assignee: Vinayakumar B
>Priority: Major
>
> Similar to x86 docker image in {{dev-support/docker/Dockerfile}},
> add one more Dockerfile to support aarch64 builds.
> And support all scripts (createrelease, start-build-env.sh, etc ) to make use 
> of it in ARM platform.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Assigned] (HADOOP-16797) Add dockerfile for ARM builds

2020-01-09 Thread Vinayakumar B (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-16797?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Vinayakumar B reassigned HADOOP-16797:
--

Assignee: Vinayakumar B

> Add dockerfile for ARM builds
> -
>
> Key: HADOOP-16797
> URL: https://issues.apache.org/jira/browse/HADOOP-16797
> Project: Hadoop Common
>  Issue Type: Sub-task
>Reporter: Vinayakumar B
>Assignee: Vinayakumar B
>Priority: Major
>
> Similar to x86 docker image in {{dev-support/docker/Dockerfile}},
> add one more Dockerfile to support aarch64 builds.
> And support all scripts (createrelease, start-build-env.sh, etc ) to make use 
> of it in ARM platform.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Updated] (HADOOP-16797) Add dockerfile for ARM builds

2020-01-09 Thread Vinayakumar B (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-16797?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Vinayakumar B updated HADOOP-16797:
---
Description: 
Similar to x86 docker image in {{dev-support/docker/Dockerfile}},
add one more Dockerfile to support aarch64 builds.

And support all scripts (createrelease, start-build-env.sh, etc ) to make use 
of it in ARM platform.

  was:
Similar to x86 docker image in {{dev-support/docker/Dockerfile}},
add one more Dockerfile to support aarch64 builds.


> Add dockerfile for ARM builds
> -
>
> Key: HADOOP-16797
> URL: https://issues.apache.org/jira/browse/HADOOP-16797
> Project: Hadoop Common
>  Issue Type: Sub-task
>Reporter: Vinayakumar B
>Priority: Major
>
> Similar to x86 docker image in {{dev-support/docker/Dockerfile}},
> add one more Dockerfile to support aarch64 builds.
> And support all scripts (createrelease, start-build-env.sh, etc ) to make use 
> of it in ARM platform.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Created] (HADOOP-16797) Add dockerfile for ARM builds

2020-01-09 Thread Vinayakumar B (Jira)
Vinayakumar B created HADOOP-16797:
--

 Summary: Add dockerfile for ARM builds
 Key: HADOOP-16797
 URL: https://issues.apache.org/jira/browse/HADOOP-16797
 Project: Hadoop Common
  Issue Type: Sub-task
Reporter: Vinayakumar B


Similar to x86 docker image in {{dev-support/docker/Dockerfile}},
add one more Dockerfile to support aarch64 builds.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Commented] (HADOOP-16603) Lack of aarch64 platform support of dependent PhantomJS

2020-01-09 Thread Vinayakumar B (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-16603?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17011571#comment-17011571
 ] 

Vinayakumar B commented on HADOOP-16603:


This will not be a problem if phantomjs is already installed on the system and 
available on the system path (usually /usr/bin/phantomjs); the phantomjs plugin 
will find it there and skip searching the maven repo.

Since phantomjs binaries are available only for x86 platforms, the aarch64 
binary needs to be built from source, as already mentioned on the phantomjs 
site. I think a one-time installation of phantomjs on the compile machine 
should not be a problem.

I have tried this on an ARM machine, and it works.

> Lack of aarch64 platform support of dependent PhantomJS
> ---
>
> Key: HADOOP-16603
> URL: https://issues.apache.org/jira/browse/HADOOP-16603
> Project: Hadoop Common
>  Issue Type: Bug
>Reporter: liusheng
>Priority: Trivial
>
> Hadoop depends on the "PhantomJS-2.1.1"[1] library and imports it via 
> "phantomjs-maven-plugin:0.7", but there isn't an artifact of phantomjs for 
> aarch64 in the "com.github.klieber" group used by Hadoop[2].
> [1] 
> [https://github.com/apache/hadoop/blob/trunk/hadoop-project/pom.xml#L1703-L1707]
> [[2] 
> https://search.maven.org/artifact/com.github.klieber/phantomjs/2.1.1/N%2FA|https://search.maven.org/artifact/com.github.klieber/phantomjs/2.1.1/N%2FA]



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Commented] (HADOOP-16621) [pb-upgrade] spark-hive doesn't compile against hadoop trunk because of Token's marshalling

2020-01-07 Thread Vinayakumar B (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-16621?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17009689#comment-17009689
 ] 

Vinayakumar B commented on HADOOP-16621:


Thanks [~ste...@apache.org] for the confirmation.

Will send out a mail on the dev list about the removal of 
{{Token#toTokenProto()}} before moving forward.

> [pb-upgrade] spark-hive doesn't compile against hadoop trunk because of 
> Token's marshalling
> ---
>
> Key: HADOOP-16621
> URL: https://issues.apache.org/jira/browse/HADOOP-16621
> Project: Hadoop Common
>  Issue Type: Sub-task
>  Components: common
>Affects Versions: 3.3.0
>Reporter: Steve Loughran
>Assignee: Vinayakumar B
>Priority: Critical
>
> The move to protobuf 3.x stops Spark building because Token has a method 
> which returns a protobuf, and now it's returning some v3 types.
> If we want to isolate downstream code from protobuf changes, we need to move 
> that marshalling method out of Token and put it in a helper class.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Updated] (HADOOP-16595) [pb-upgrade] Create hadoop-thirdparty artifact to have shaded protobuf

2020-01-06 Thread Vinayakumar B (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-16595?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Vinayakumar B updated HADOOP-16595:
---
Target Version/s: thirdparty-1.0.0

> [pb-upgrade] Create hadoop-thirdparty artifact to have shaded protobuf
> --
>
> Key: HADOOP-16595
> URL: https://issues.apache.org/jira/browse/HADOOP-16595
> Project: Hadoop Common
>  Issue Type: Sub-task
>  Components: hadoop-thirdparty
>Reporter: Vinayakumar B
>Assignee: Vinayakumar B
>Priority: Major
>
> Create a separate repo "hadoop-thirdparty" to have shaded dependencies.
> starting with protobuf-java:3.7.1



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Assigned] (HADOOP-16621) [pb-upgrade] spark-hive doesn't compile against hadoop trunk because of Token's marshalling

2020-01-06 Thread Vinayakumar B (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-16621?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Vinayakumar B reassigned HADOOP-16621:
--

Assignee: Vinayakumar B

> [pb-upgrade] spark-hive doesn't compile against hadoop trunk because of 
> Token's marshalling
> ---
>
> Key: HADOOP-16621
> URL: https://issues.apache.org/jira/browse/HADOOP-16621
> Project: Hadoop Common
>  Issue Type: Sub-task
>  Components: common
>Affects Versions: 3.3.0
>Reporter: Steve Loughran
>Assignee: Vinayakumar B
>Priority: Major
>
> The move to protobuf 3.x stops Spark building because Token has a method 
> which returns a protobuf, and now it's returning some v3 types.
> If we want to isolate downstream code from protobuf changes, we need to move 
> that marshalling method out of Token and put it in a helper class.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Commented] (HADOOP-16358) Add an ARM CI for Hadoop

2020-01-06 Thread Vinayakumar B (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-16358?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17008748#comment-17008748
 ] 

Vinayakumar B commented on HADOOP-16358:


Thanks, everyone.

> Add an ARM CI for Hadoop
> 
>
> Key: HADOOP-16358
> URL: https://issues.apache.org/jira/browse/HADOOP-16358
> Project: Hadoop Common
>  Issue Type: Task
>  Components: build
>Reporter: Zhenyu Zheng
>Assignee: Zhenyu Zheng
>Priority: Major
> Fix For: 3.3.0
>
>
> Currently the CI of Hadoop is handled by Jenkins. The tests run on the x86 
> architecture, while the ARM architecture has not been considered. This leads 
> to a problem: we don't have a way to test whether a pull request will break 
> the Hadoop deployment on ARM or not.
> We should add a CI system that supports the ARM architecture. Using it, 
> Hadoop can officially support ARM releases in the future. Here I'd like to 
> introduce OpenLab to the community. [OpenLab|https://openlabtesting.org/] is 
> an open source CI system that can test any open source software on either x86 
> or ARM, and it's mainly used by GitHub projects. Some 
> [projects|https://github.com/theopenlab/openlab-zuul-jobs/blob/master/zuul.d/jobs.yaml]
>  have integrated it already, such as containerd (a graduated CNCF project; 
> the ARM build is triggered on every PR, 
> [https://github.com/containerd/containerd/pulls]), terraform, and so on.
> OpenLab uses the open source CI software 
> [Zuul|https://github.com/openstack-infra/zuul] as its CI system. Zuul is used 
> by the OpenStack community as well. Integrating with OpenLab is quite easy 
> using its GitHub app. All config info is open source as well.
> If the Apache Hadoop community is interested, we can help with the 
> integration.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Assigned] (HADOOP-16358) Add an ARM CI for Hadoop

2020-01-06 Thread Vinayakumar B (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-16358?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Vinayakumar B reassigned HADOOP-16358:
--

Assignee: Zhenyu Zheng

> Add an ARM CI for Hadoop
> 
>
> Key: HADOOP-16358
> URL: https://issues.apache.org/jira/browse/HADOOP-16358
> Project: Hadoop Common
>  Issue Type: Task
>  Components: build
>Reporter: Zhenyu Zheng
>Assignee: Zhenyu Zheng
>Priority: Major
> Fix For: 3.3.0
>
>
> Currently the CI of Hadoop is handled by Jenkins. The tests run on the x86 
> architecture, while the ARM architecture has not been considered. This leads 
> to a problem: we don't have a way to test whether a pull request will break 
> the Hadoop deployment on ARM or not.
> We should add a CI system that supports the ARM architecture. Using it, 
> Hadoop can officially support ARM releases in the future. Here I'd like to 
> introduce OpenLab to the community. [OpenLab|https://openlabtesting.org/] is 
> an open source CI system that can test any open source software on either x86 
> or ARM, and it's mainly used by GitHub projects. Some 
> [projects|https://github.com/theopenlab/openlab-zuul-jobs/blob/master/zuul.d/jobs.yaml]
>  have integrated it already, such as containerd (a graduated CNCF project; 
> the ARM build is triggered on every PR, 
> [https://github.com/containerd/containerd/pulls]), terraform, and so on.
> OpenLab uses the open source CI software 
> [Zuul|https://github.com/openstack-infra/zuul] as its CI system. Zuul is used 
> by the OpenStack community as well. Integrating with OpenLab is quite easy 
> using its GitHub app. All config info is open source as well.
> If the Apache Hadoop community is interested, we can help with the 
> integration.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Resolved] (HADOOP-16358) Add an ARM CI for Hadoop

2020-01-06 Thread Vinayakumar B (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-16358?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Vinayakumar B resolved HADOOP-16358.

Fix Version/s: 3.3.0
   Resolution: Fixed

A Jenkins job has been created to run nightly tests on aarch64:

[https://builds.apache.org/view/H-L/view/Hadoop/job/Hadoop-qbt-linux-ARM-trunk/]

> Add an ARM CI for Hadoop
> 
>
> Key: HADOOP-16358
> URL: https://issues.apache.org/jira/browse/HADOOP-16358
> Project: Hadoop Common
>  Issue Type: Task
>  Components: build
>Reporter: Zhenyu Zheng
>Priority: Major
> Fix For: 3.3.0
>
>
> Currently the CI of Hadoop is handled by Jenkins. The tests run on the x86 
> architecture, while the ARM architecture has not been considered. This leads 
> to a problem: we don't have a way to test whether a pull request will break 
> the Hadoop deployment on ARM or not.
> We should add a CI system that supports the ARM architecture. Using it, 
> Hadoop can officially support ARM releases in the future. Here I'd like to 
> introduce OpenLab to the community. [OpenLab|https://openlabtesting.org/] is 
> an open source CI system that can test any open source software on either x86 
> or ARM, and it's mainly used by GitHub projects. Some 
> [projects|https://github.com/theopenlab/openlab-zuul-jobs/blob/master/zuul.d/jobs.yaml]
>  have integrated it already, such as containerd (a graduated CNCF project; 
> the ARM build is triggered on every PR, 
> [https://github.com/containerd/containerd/pulls]), terraform, and so on.
> OpenLab uses the open source CI software 
> [Zuul|https://github.com/openstack-infra/zuul] as its CI system. Zuul is used 
> by the OpenStack community as well. Integrating with OpenLab is quite easy 
> using its GitHub app. All config info is open source as well.
> If the Apache Hadoop community is interested, we can help with the 
> integration.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Updated] (HADOOP-16358) Add an ARM CI for Hadoop

2020-01-06 Thread Vinayakumar B (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-16358?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Vinayakumar B updated HADOOP-16358:
---
Issue Type: Task  (was: Improvement)

> Add an ARM CI for Hadoop
> 
>
> Key: HADOOP-16358
> URL: https://issues.apache.org/jira/browse/HADOOP-16358
> Project: Hadoop Common
>  Issue Type: Task
>  Components: build
>Reporter: Zhenyu Zheng
>Priority: Major
>
> Currently the CI of Hadoop is handled by Jenkins. The tests run on the x86 
> architecture, while the ARM architecture has not been considered. This leads 
> to a problem: we don't have a way to test whether a pull request will break 
> the Hadoop deployment on ARM or not.
> We should add a CI system that supports the ARM architecture. Using it, 
> Hadoop can officially support ARM releases in the future. Here I'd like to 
> introduce OpenLab to the community. [OpenLab|https://openlabtesting.org/] is 
> an open source CI system that can test any open source software on either x86 
> or ARM, and it's mainly used by GitHub projects. Some 
> [projects|https://github.com/theopenlab/openlab-zuul-jobs/blob/master/zuul.d/jobs.yaml]
>  have integrated it already, such as containerd (a graduated CNCF project; 
> the ARM build is triggered on every PR, 
> [https://github.com/containerd/containerd/pulls]), terraform, and so on.
> OpenLab uses the open source CI software 
> [Zuul|https://github.com/openstack-infra/zuul] as its CI system. Zuul is used 
> by the OpenStack community as well. Integrating with OpenLab is quite easy 
> using its GitHub app. All config info is open source as well.
> If the Apache Hadoop community is interested, we can help with the 
> integration.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Updated] (HADOOP-16621) [pb-upgrade] spark-hive doesn't compile against hadoop trunk because of Token's marshalling

2020-01-06 Thread Vinayakumar B (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-16621?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Vinayakumar B updated HADOOP-16621:
---
Target Version/s: 3.3.0

> [pb-upgrade] spark-hive doesn't compile against hadoop trunk because of 
> Token's marshalling
> ---
>
> Key: HADOOP-16621
> URL: https://issues.apache.org/jira/browse/HADOOP-16621
> Project: Hadoop Common
>  Issue Type: Sub-task
>  Components: common
>Affects Versions: 3.3.0
>Reporter: Steve Loughran
>Priority: Major
>
> The move to protobuf 3.x stops Spark building because Token has a method 
> which returns a protobuf, and now it's returning some v3 types.
> If we want to isolate downstream code from protobuf changes, we need to move 
> that marshalling method out of Token and put it in a helper class.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Updated] (HADOOP-16774) TestDiskChecker and TestReadWriteDiskValidator fails when run with -Pparallel-tests

2019-12-20 Thread Vinayakumar B (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-16774?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Vinayakumar B updated HADOOP-16774:
---
Fix Version/s: 3.3.0
 Hadoop Flags: Reviewed
   Resolution: Fixed
   Status: Resolved  (was: Patch Available)

PR is merged.

> TestDiskChecker and TestReadWriteDiskValidator fails when run with 
> -Pparallel-tests
> ---
>
> Key: HADOOP-16774
> URL: https://issues.apache.org/jira/browse/HADOOP-16774
> Project: Hadoop Common
>  Issue Type: Bug
>Reporter: Vinayakumar B
>Assignee: Vinayakumar B
>Priority: Major
> Fix For: 3.3.0
>
>
> {noformat}
> $  mvn test -Pparallel-tests 
> -Dtest=TestReadWriteDiskValidator,TestDiskChecker -Pnative
>  {noformat}
> {noformat}
> [INFO] Results:
> [INFO] 
> [ERROR] Errors: 
> [ERROR]   
> TestDiskChecker.testCheckDir_normal:111->_checkDirs:158->createTempDir:153 » 
> NoSuchFile
> [ERROR]   
> TestDiskChecker.testCheckDir_normal_local:180->checkDirs:205->createTempDir:153
>  » NoSuchFile
> [ERROR]   
> TestDiskChecker.testCheckDir_notDir:116->_checkDirs:158->createTempFile:142 » 
> NoSuchFile
> [ERROR]   
> TestDiskChecker.testCheckDir_notDir_local:185->checkDirs:205->createTempFile:142
>  » NoSuchFile
> [ERROR]   
> TestDiskChecker.testCheckDir_notListable:131->_checkDirs:158->createTempDir:153
>  » NoSuchFile
> [ERROR]   
> TestDiskChecker.testCheckDir_notListable_local:200->checkDirs:205->createTempDir:153
>  » NoSuchFile
> [ERROR]   
> TestDiskChecker.testCheckDir_notReadable:121->_checkDirs:158->createTempDir:153
>  » NoSuchFile
> [ERROR]   
> TestDiskChecker.testCheckDir_notReadable_local:190->checkDirs:205->createTempDir:153
>  » NoSuchFile
> [ERROR]   
> TestDiskChecker.testCheckDir_notWritable:126->_checkDirs:158->createTempDir:153
>  » NoSuchFile
> [ERROR]   
> TestDiskChecker.testCheckDir_notWritable_local:195->checkDirs:205->createTempDir:153
>  » NoSuchFile
> [ERROR]   TestReadWriteDiskValidator.testCheckFailures:114 » NoSuchFile 
> /usr1/code/hadoo...
> [ERROR]   TestReadWriteDiskValidator.testReadWriteDiskValidator:62 » 
> DiskError Disk Chec...
> [INFO] 
> [ERROR] Tests run: 16, Failures: 0, Errors: 12, Skipped: 0
> {noformat}



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Commented] (HADOOP-16774) TestDiskChecker and TestReadWriteDiskValidator fails when run with -Pparallel-tests

2019-12-20 Thread Vinayakumar B (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-16774?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17001081#comment-17001081
 ] 

Vinayakumar B commented on HADOOP-16774:


Thanks [~ayushtkn] for the review.

> TestDiskChecker and TestReadWriteDiskValidator fails when run with 
> -Pparallel-tests
> ---
>
> Key: HADOOP-16774
> URL: https://issues.apache.org/jira/browse/HADOOP-16774
> Project: Hadoop Common
>  Issue Type: Bug
>Reporter: Vinayakumar B
>Assignee: Vinayakumar B
>Priority: Major
> Fix For: 3.3.0
>
>
> {noformat}
> $  mvn test -Pparallel-tests 
> -Dtest=TestReadWriteDiskValidator,TestDiskChecker -Pnative
>  {noformat}
> {noformat}
> [INFO] Results:
> [INFO] 
> [ERROR] Errors: 
> [ERROR]   
> TestDiskChecker.testCheckDir_normal:111->_checkDirs:158->createTempDir:153 » 
> NoSuchFile
> [ERROR]   
> TestDiskChecker.testCheckDir_normal_local:180->checkDirs:205->createTempDir:153
>  » NoSuchFile
> [ERROR]   
> TestDiskChecker.testCheckDir_notDir:116->_checkDirs:158->createTempFile:142 » 
> NoSuchFile
> [ERROR]   
> TestDiskChecker.testCheckDir_notDir_local:185->checkDirs:205->createTempFile:142
>  » NoSuchFile
> [ERROR]   
> TestDiskChecker.testCheckDir_notListable:131->_checkDirs:158->createTempDir:153
>  » NoSuchFile
> [ERROR]   
> TestDiskChecker.testCheckDir_notListable_local:200->checkDirs:205->createTempDir:153
>  » NoSuchFile
> [ERROR]   
> TestDiskChecker.testCheckDir_notReadable:121->_checkDirs:158->createTempDir:153
>  » NoSuchFile
> [ERROR]   
> TestDiskChecker.testCheckDir_notReadable_local:190->checkDirs:205->createTempDir:153
>  » NoSuchFile
> [ERROR]   
> TestDiskChecker.testCheckDir_notWritable:126->_checkDirs:158->createTempDir:153
>  » NoSuchFile
> [ERROR]   
> TestDiskChecker.testCheckDir_notWritable_local:195->checkDirs:205->createTempDir:153
>  » NoSuchFile
> [ERROR]   TestReadWriteDiskValidator.testCheckFailures:114 » NoSuchFile 
> /usr1/code/hadoo...
> [ERROR]   TestReadWriteDiskValidator.testReadWriteDiskValidator:62 » 
> DiskError Disk Chec...
> [INFO] 
> [ERROR] Tests run: 16, Failures: 0, Errors: 12, Skipped: 0
> {noformat}



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Commented] (HADOOP-16774) TestDiskChecker and TestReadWriteDiskValidator fails when run with -Pparallel-tests

2019-12-20 Thread Vinayakumar B (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-16774?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17000988#comment-17000988
 ] 

Vinayakumar B commented on HADOOP-16774:


Created the PR. Please review.

Root cause:

These tests do not create the parent dirs themselves; if the test dir is not 
available, they fail as shown above (a small sketch of this failure mode 
follows below).

With -Pparallel-tests, separate dirs need to be created for each fork thread:
 # The {{parallel-tests-createdir}} goal was not configured with the 
{{test.build.data}} property. It was creating the default dirs, but the tests 
expect a different dir.
 # {{parallel-tests-createdir}} should also be configured to run after 
{{create-log-dirs}} to avoid deletion of the test dirs. Right now, 
{{parallel-tests-createdir}} runs first and creates some dirs, and then 
{{create-log-dirs}} deletes {{target/test/data}} itself before recreating only 
{{target/test/data}}.
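
The following is a small, self-contained Java sketch of the failure mode 
described above, not the actual fix (which lives in the Maven configuration): 
if the directory named by {{test.build.data}} was never created for the fork, 
creating a temp file under it throws {{NoSuchFileException}}, matching the 
errors in the report.

{code:java}
// Illustration only: shows why a missing test.build.data directory produces
// java.nio.file.NoSuchFileException, as seen in the surefire report above.
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class TestDataDirCheck {
  public static void main(String[] args) throws IOException {
    Path dataDir = Paths.get(System.getProperty("test.build.data", "target/test/data"));
    // Without this call, the createTempFile below fails with NoSuchFileException
    // whenever the per-fork directory was never created by the build.
    Files.createDirectories(dataDir);
    Path tmp = Files.createTempFile(dataDir, "disk-checker", ".tmp");
    System.out.println("created " + tmp);
  }
}
{code}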

> TestDiskChecker and TestReadWriteDiskValidator fails when run with 
> -Pparallel-tests
> ---
>
> Key: HADOOP-16774
> URL: https://issues.apache.org/jira/browse/HADOOP-16774
> Project: Hadoop Common
>  Issue Type: Bug
>Reporter: Vinayakumar B
>Assignee: Vinayakumar B
>Priority: Major
>
> {noformat}
> $  mvn test -Pparallel-tests 
> -Dtest=TestReadWriteDiskValidator,TestDiskChecker -Pnative
>  {noformat}
> {noformat}
> [INFO] Results:
> [INFO] 
> [ERROR] Errors: 
> [ERROR]   
> TestDiskChecker.testCheckDir_normal:111->_checkDirs:158->createTempDir:153 » 
> NoSuchFile
> [ERROR]   
> TestDiskChecker.testCheckDir_normal_local:180->checkDirs:205->createTempDir:153
>  » NoSuchFile
> [ERROR]   
> TestDiskChecker.testCheckDir_notDir:116->_checkDirs:158->createTempFile:142 » 
> NoSuchFile
> [ERROR]   
> TestDiskChecker.testCheckDir_notDir_local:185->checkDirs:205->createTempFile:142
>  » NoSuchFile
> [ERROR]   
> TestDiskChecker.testCheckDir_notListable:131->_checkDirs:158->createTempDir:153
>  » NoSuchFile
> [ERROR]   
> TestDiskChecker.testCheckDir_notListable_local:200->checkDirs:205->createTempDir:153
>  » NoSuchFile
> [ERROR]   
> TestDiskChecker.testCheckDir_notReadable:121->_checkDirs:158->createTempDir:153
>  » NoSuchFile
> [ERROR]   
> TestDiskChecker.testCheckDir_notReadable_local:190->checkDirs:205->createTempDir:153
>  » NoSuchFile
> [ERROR]   
> TestDiskChecker.testCheckDir_notWritable:126->_checkDirs:158->createTempDir:153
>  » NoSuchFile
> [ERROR]   
> TestDiskChecker.testCheckDir_notWritable_local:195->checkDirs:205->createTempDir:153
>  » NoSuchFile
> [ERROR]   TestReadWriteDiskValidator.testCheckFailures:114 » NoSuchFile 
> /usr1/code/hadoo...
> [ERROR]   TestReadWriteDiskValidator.testReadWriteDiskValidator:62 » 
> DiskError Disk Chec...
> [INFO] 
> [ERROR] Tests run: 16, Failures: 0, Errors: 12, Skipped: 0
> {noformat}



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Updated] (HADOOP-16774) TestDiskChecker and TestReadWriteDiskValidator fails when run with -Pparallel-tests

2019-12-20 Thread Vinayakumar B (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-16774?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Vinayakumar B updated HADOOP-16774:
---
Status: Patch Available  (was: Open)

> TestDiskChecker and TestReadWriteDiskValidator fails when run with 
> -Pparallel-tests
> ---
>
> Key: HADOOP-16774
> URL: https://issues.apache.org/jira/browse/HADOOP-16774
> Project: Hadoop Common
>  Issue Type: Bug
>Reporter: Vinayakumar B
>Assignee: Vinayakumar B
>Priority: Major
>
> {noformat}
> $  mvn test -Pparallel-tests 
> -Dtest=TestReadWriteDiskValidator,TestDiskChecker -Pnative
>  {noformat}
> {noformat}
> [INFO] Results:
> [INFO] 
> [ERROR] Errors: 
> [ERROR]   
> TestDiskChecker.testCheckDir_normal:111->_checkDirs:158->createTempDir:153 » 
> NoSuchFile
> [ERROR]   
> TestDiskChecker.testCheckDir_normal_local:180->checkDirs:205->createTempDir:153
>  » NoSuchFile
> [ERROR]   
> TestDiskChecker.testCheckDir_notDir:116->_checkDirs:158->createTempFile:142 » 
> NoSuchFile
> [ERROR]   
> TestDiskChecker.testCheckDir_notDir_local:185->checkDirs:205->createTempFile:142
>  » NoSuchFile
> [ERROR]   
> TestDiskChecker.testCheckDir_notListable:131->_checkDirs:158->createTempDir:153
>  » NoSuchFile
> [ERROR]   
> TestDiskChecker.testCheckDir_notListable_local:200->checkDirs:205->createTempDir:153
>  » NoSuchFile
> [ERROR]   
> TestDiskChecker.testCheckDir_notReadable:121->_checkDirs:158->createTempDir:153
>  » NoSuchFile
> [ERROR]   
> TestDiskChecker.testCheckDir_notReadable_local:190->checkDirs:205->createTempDir:153
>  » NoSuchFile
> [ERROR]   
> TestDiskChecker.testCheckDir_notWritable:126->_checkDirs:158->createTempDir:153
>  » NoSuchFile
> [ERROR]   
> TestDiskChecker.testCheckDir_notWritable_local:195->checkDirs:205->createTempDir:153
>  » NoSuchFile
> [ERROR]   TestReadWriteDiskValidator.testCheckFailures:114 » NoSuchFile 
> /usr1/code/hadoo...
> [ERROR]   TestReadWriteDiskValidator.testReadWriteDiskValidator:62 » 
> DiskError Disk Chec...
> [INFO] 
> [ERROR] Tests run: 16, Failures: 0, Errors: 12, Skipped: 0
> {noformat}



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Assigned] (HADOOP-16774) TestDiskChecker and TestReadWriteDiskValidator fails when run with -Pparallel-tests

2019-12-20 Thread Vinayakumar B (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-16774?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Vinayakumar B reassigned HADOOP-16774:
--

Assignee: Vinayakumar B

> TestDiskChecker and TestReadWriteDiskValidator fails when run with 
> -Pparallel-tests
> ---
>
> Key: HADOOP-16774
> URL: https://issues.apache.org/jira/browse/HADOOP-16774
> Project: Hadoop Common
>  Issue Type: Bug
>Reporter: Vinayakumar B
>Assignee: Vinayakumar B
>Priority: Major
>
> {noformat}
> $  mvn test -Pparallel-tests 
> -Dtest=TestReadWriteDiskValidator,TestDiskChecker -Pnative
>  {noformat}
> {noformat}
> [INFO] Results:
> [INFO] 
> [ERROR] Errors: 
> [ERROR]   
> TestDiskChecker.testCheckDir_normal:111->_checkDirs:158->createTempDir:153 » 
> NoSuchFile
> [ERROR]   
> TestDiskChecker.testCheckDir_normal_local:180->checkDirs:205->createTempDir:153
>  » NoSuchFile
> [ERROR]   
> TestDiskChecker.testCheckDir_notDir:116->_checkDirs:158->createTempFile:142 » 
> NoSuchFile
> [ERROR]   
> TestDiskChecker.testCheckDir_notDir_local:185->checkDirs:205->createTempFile:142
>  » NoSuchFile
> [ERROR]   
> TestDiskChecker.testCheckDir_notListable:131->_checkDirs:158->createTempDir:153
>  » NoSuchFile
> [ERROR]   
> TestDiskChecker.testCheckDir_notListable_local:200->checkDirs:205->createTempDir:153
>  » NoSuchFile
> [ERROR]   
> TestDiskChecker.testCheckDir_notReadable:121->_checkDirs:158->createTempDir:153
>  » NoSuchFile
> [ERROR]   
> TestDiskChecker.testCheckDir_notReadable_local:190->checkDirs:205->createTempDir:153
>  » NoSuchFile
> [ERROR]   
> TestDiskChecker.testCheckDir_notWritable:126->_checkDirs:158->createTempDir:153
>  » NoSuchFile
> [ERROR]   
> TestDiskChecker.testCheckDir_notWritable_local:195->checkDirs:205->createTempDir:153
>  » NoSuchFile
> [ERROR]   TestReadWriteDiskValidator.testCheckFailures:114 » NoSuchFile 
> /usr1/code/hadoo...
> [ERROR]   TestReadWriteDiskValidator.testReadWriteDiskValidator:62 » 
> DiskError Disk Chec...
> [INFO] 
> [ERROR] Tests run: 16, Failures: 0, Errors: 12, Skipped: 0
> {noformat}



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Created] (HADOOP-16774) TestDiskChecker and TestReadWriteDiskValidator fails when run with -Pparallel-tests

2019-12-20 Thread Vinayakumar B (Jira)
Vinayakumar B created HADOOP-16774:
--

 Summary: TestDiskChecker and TestReadWriteDiskValidator fails when 
run with -Pparallel-tests
 Key: HADOOP-16774
 URL: https://issues.apache.org/jira/browse/HADOOP-16774
 Project: Hadoop Common
  Issue Type: Bug
Reporter: Vinayakumar B


{noformat}
$  mvn test -Pparallel-tests -Dtest=TestReadWriteDiskValidator,TestDiskChecker 
-Pnative
 {noformat}
{noformat}
[INFO] Results:
[INFO] 
[ERROR] Errors: 
[ERROR]   
TestDiskChecker.testCheckDir_normal:111->_checkDirs:158->createTempDir:153 » 
NoSuchFile
[ERROR]   
TestDiskChecker.testCheckDir_normal_local:180->checkDirs:205->createTempDir:153 
» NoSuchFile
[ERROR]   
TestDiskChecker.testCheckDir_notDir:116->_checkDirs:158->createTempFile:142 » 
NoSuchFile
[ERROR]   
TestDiskChecker.testCheckDir_notDir_local:185->checkDirs:205->createTempFile:142
 » NoSuchFile
[ERROR]   
TestDiskChecker.testCheckDir_notListable:131->_checkDirs:158->createTempDir:153 
» NoSuchFile
[ERROR]   
TestDiskChecker.testCheckDir_notListable_local:200->checkDirs:205->createTempDir:153
 » NoSuchFile
[ERROR]   
TestDiskChecker.testCheckDir_notReadable:121->_checkDirs:158->createTempDir:153 
» NoSuchFile
[ERROR]   
TestDiskChecker.testCheckDir_notReadable_local:190->checkDirs:205->createTempDir:153
 » NoSuchFile
[ERROR]   
TestDiskChecker.testCheckDir_notWritable:126->_checkDirs:158->createTempDir:153 
» NoSuchFile
[ERROR]   
TestDiskChecker.testCheckDir_notWritable_local:195->checkDirs:205->createTempDir:153
 » NoSuchFile
[ERROR]   TestReadWriteDiskValidator.testCheckFailures:114 » NoSuchFile 
/usr1/code/hadoo...
[ERROR]   TestReadWriteDiskValidator.testReadWriteDiskValidator:62 » DiskError 
Disk Chec...
[INFO] 
[ERROR] Tests run: 16, Failures: 0, Errors: 12, Skipped: 0

{noformat}



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Comment Edited] (HADOOP-16621) [pb-upgrade] spark-hive doesn't compile against hadoop trunk because of Token's marshalling

2019-12-18 Thread Vinayakumar B (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-16621?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16999436#comment-16999436
 ] 

Vinayakumar B edited comment on HADOOP-16621 at 12/18/19 7:03 PM:
--

Sorry for being late.

The following public APIs were introduced by HADOOP-12563 back in 2016, in 
Hadoop 3.0.0:
{code:java}
public Token(TokenProto tokenPB);

public TokenProto toTokenProto();
{code}
Ideally, no @Public API should have protobuf classes in its signature.
Right now, this is breaking binary compatibility for downstream projects due to 
the protobuf version upgrade, because the generated proto classes' superclass 
changed from {{GeneratedMessage}} (protobuf 2.5.0) to {{GeneratedMessageV3}} 
(protobuf 3.x).

So the only possible options to proceed are:
 # Remove all public methods with protobuf types in their signatures and 
replace them with helper classes that do the same job, as is done in HDFS' 
{{PBHelperClient.java}}. This will break compatibility if, by any chance, these 
methods are being used outside the hadoop-common module (and outside the Hadoop 
project overall, since the upgrade happens for all Hadoop components together).
 # Mark the methods deprecated, keep the old 'TokenProto' class generated with 
protobuf 2.5.0 committed to the repo, rename the current {{TokenProto}} to 
{{TokenProto3}} along with all its occurrences throughout the project 
(hopefully TokenProto is not used outside the Hadoop project), and skip shading 
of the 2.5.0 TokenProto. The methods and the committed TokenProto class can be 
removed later.

Approach #1 would be an easy and direct change, but again there is a 
compatibility issue if these methods are used by other projects, which is most 
unlikely.

[~ste...@apache.org] / [~vinodkv] / [~raviprak], is it okay to remove the 
above-mentioned methods and replace them with something similar to 
{{PBHelperClient#convert(Token tok)}} and 
{{PBHelperClient#convert(TokenProto tok)}}?

Approach #2 is a workaround that still keeps compatibility, but unnecessary 
(most likely unused) code will remain in the repo.

Also, #2 is possible only after HADOOP-16596 is in, to support both the 2.5.0 
and 3.x versions of protobuf together.

This change is very much mandatory to allow Spark (and others which just import 
the Token class) to compile and run successfully without needing to explicitly 
set the protobuf version to the same one as Hadoop.

Please let me know your opinions.
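
For illustration, here is a minimal sketch of the helper-class pattern proposed 
in approach #1 above. It is not the committed implementation: it assumes the 
usual Token accessors (getIdentifier/getPassword/getKind/getService), the 
TokenProto message generated from hadoop-common's security.proto, and the plain 
com.google.protobuf package (which becomes o.a.h.thirdparty.protobuf once the 
shaded artifact from HADOOP-16596 is adopted). The class and method names are 
hypothetical.

{code:java}
// Hypothetical helper: marshalling moved out of Token so that no protobuf type
// appears in a @Public API signature (the pattern used by PBHelperClient).
import com.google.protobuf.ByteString;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.security.proto.SecurityProtos.TokenProto;
import org.apache.hadoop.security.token.Token;
import org.apache.hadoop.security.token.TokenIdentifier;

public final class TokenPBHelper {
  private TokenPBHelper() {}

  /** Token -> TokenProto, standing in for the removed Token#toTokenProto(). */
  public static TokenProto convert(Token<?> token) {
    return TokenProto.newBuilder()
        .setIdentifier(ByteString.copyFrom(token.getIdentifier()))
        .setPassword(ByteString.copyFrom(token.getPassword()))
        .setKind(token.getKind().toString())
        .setService(token.getService().toString())
        .build();
  }

  /** TokenProto -> Token, standing in for the removed Token(TokenProto) constructor. */
  public static Token<TokenIdentifier> convert(TokenProto proto) {
    return new Token<>(
        proto.getIdentifier().toByteArray(),
        proto.getPassword().toByteArray(),
        new Text(proto.getKind()),
        new Text(proto.getService()));
  }
}
{code}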


was (Author: vinayrpet):
Sorry for being late.

The following public APIs were introduced by HADOOP-12563 back in 2016, in 
Hadoop 3.0.0:
{code:java}
public Token(TokenProto tokenPB);

public TokenProto toTokenProto();
{code}
Ideally, no @Public API should have protobuf classes in its signature.
Right now, this is breaking binary compatibility for downstream projects due to 
the protobuf version upgrade, because the generated proto classes' superclass 
changed from {{GeneratedMessage}} (protobuf 2.5.0) to {{GeneratedMessageV3}} 
(protobuf 3.x).

So the only possible options to proceed are:
 # Remove all public methods with protobuf types in their signatures and 
replace them with helper classes that do the same job, as is done in HDFS' 
{{PBHelperClient.java}}. This will break compatibility if, by any chance, these 
methods are being used outside the hadoop-common module (and outside the Hadoop 
project overall, since the upgrade happens for all Hadoop components together).
 # Mark the methods deprecated, keep the old 'TokenProto' class generated with 
protobuf 2.5.0 committed to the repo, rename the current {{TokenProto}} to 
{{TokenProto3}} along with all its occurrences throughout the project 
(hopefully TokenProto is not used outside the Hadoop project), and skip shading 
of the 2.5.0 TokenProto. The methods and the committed TokenProto class can be 
removed later.

Approach #1 would be an easy and direct change, but again there is a 
compatibility issue if these methods are used by other projects, which is most 
unlikely.

[~ste...@apache.org] / [~vinodkv] / [~raviprak], is it okay to remove the 
above-mentioned methods and replace them with something similar to 
{{PBHelperClient#convert(Token tok)}} and 
{{PBHelperClient#convert(TokenProto tok)}}?

Approach #2 is a workaround that still keeps compatibility, but unnecessary 
(most likely unused) code will remain in the repo.

This change is very much mandatory to allow Spark (and others which just import 
the Token class) to compile and run successfully without needing to explicitly 
set the protobuf version to the same one as Hadoop.

Please let me know your opinions.

> [pb-upgrade] spark-hive doesn't compile against hadoop trunk because of 
> Token's marshalling
> ---
>
> Key: HADOOP-16621
> URL: https://issues.apache.org/jira/browse/HADOOP-16621
> Project: Hadoop Common
>  Issue Type: Sub-task
>  Components: common
>Affects Versions: 3.3.0
>Reporter: Steve Loughran
>Priority: Major
>
> The move to protobuf 3.x stops Spark building because Token has a method 
> which returns a protobuf, and now it's returning some v3 types.
> If we want to isolate downstream code from protobuf changes, we need to move 
> that marshalling method out of Token and put it in a helper class.

[jira] [Commented] (HADOOP-16621) [pb-upgrade] spark-hive doesn't compile against hadoop trunk because of Token's marshalling

2019-12-18 Thread Vinayakumar B (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-16621?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16999436#comment-16999436
 ] 

Vinayakumar B commented on HADOOP-16621:


Sorry for being late.

The following public APIs were introduced by HADOOP-12563 back in 2016, in 
Hadoop 3.0.0:
{code:java}
public Token(TokenProto tokenPB);

public TokenProto toTokenProto();
{code}
Ideally, no @Public API should have protobuf classes in its signature.
Right now, this is breaking binary compatibility for downstream projects due to 
the protobuf version upgrade, because the generated proto classes' superclass 
changed from {{GeneratedMessage}} (protobuf 2.5.0) to {{GeneratedMessageV3}} 
(protobuf 3.x).

So the only possible options to proceed are:
 # Remove all public methods with protobuf types in their signatures and 
replace them with helper classes that do the same job, as is done in HDFS' 
{{PBHelperClient.java}}. This will break compatibility if, by any chance, these 
methods are being used outside the hadoop-common module (and outside the Hadoop 
project overall, since the upgrade happens for all Hadoop components together).
 # Mark the methods deprecated, keep the old 'TokenProto' class generated with 
protobuf 2.5.0 committed to the repo, rename the current {{TokenProto}} to 
{{TokenProto3}} along with all its occurrences throughout the project 
(hopefully TokenProto is not used outside the Hadoop project), and skip shading 
of the 2.5.0 TokenProto. The methods and the committed TokenProto class can be 
removed later.

Approach #1 would be an easy and direct change, but again there is a 
compatibility issue if these methods are used by other projects, which is most 
unlikely.

[~ste...@apache.org] / [~vinodkv] / [~raviprak], is it okay to remove the 
above-mentioned methods and replace them with something similar to 
{{PBHelperClient#convert(Token tok)}} and 
{{PBHelperClient#convert(TokenProto tok)}}?

Approach #2 is a workaround that still keeps compatibility, but unnecessary 
(most likely unused) code will remain in the repo.

This change is very much mandatory to allow Spark (and others which just import 
the Token class) to compile and run successfully without needing to explicitly 
set the protobuf version to the same one as Hadoop.

Please let me know your opinions.

> [pb-upgrade] spark-hive doesn't compile against hadoop trunk because of 
> Token's marshalling
> ---
>
> Key: HADOOP-16621
> URL: https://issues.apache.org/jira/browse/HADOOP-16621
> Project: Hadoop Common
>  Issue Type: Sub-task
>  Components: common
>Affects Versions: 3.3.0
>Reporter: Steve Loughran
>Priority: Major
>
> the move to protobuf 3.x stops spark building because Token has a method 
> which returns a protobuf, and now it's returning some v3 types.
> if we want to isolate downstream code from protobuf changes, we need to move 
> that marshalling method out of Token and put it in a helper class.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Commented] (HADOOP-16595) [pb-upgrade] Create hadoop-thirdparty artifact to have shaded protobuf

2019-12-17 Thread Vinayakumar B (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-16595?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16998870#comment-16998870
 ] 

Vinayakumar B commented on HADOOP-16595:


I have updated the PR, addressing the review comments.

[~zhangduo] / [~ayushtkn] / [~weichiu], please review the latest changes.

 

> [pb-upgrade] Create hadoop-thirdparty artifact to have shaded protobuf
> --
>
> Key: HADOOP-16595
> URL: https://issues.apache.org/jira/browse/HADOOP-16595
> Project: Hadoop Common
>  Issue Type: Sub-task
>  Components: hadoop-thirdparty
>Reporter: Vinayakumar B
>Assignee: Vinayakumar B
>Priority: Major
>
> Create a separate repo "hadoop-thirdparty" to have shaded dependencies.
> starting with protobuf-java:3.7.1



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Commented] (HADOOP-16596) [pb-upgrade] Use shaded protobuf classes from hadoop-thirdparty dependency

2019-10-09 Thread Vinayakumar B (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-16596?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16948226#comment-16948226
 ] 

Vinayakumar B commented on HADOOP-16596:


Created the PR [https://github.com/apache/hadoop/pull/1635] for the same.

Please review.

> [pb-upgrade] Use shaded protobuf classes from hadoop-thirdparty dependency
> --
>
> Key: HADOOP-16596
> URL: https://issues.apache.org/jira/browse/HADOOP-16596
> Project: Hadoop Common
>  Issue Type: Sub-task
>Reporter: Vinayakumar B
>Assignee: Vinayakumar B
>Priority: Major
>
> Use the shaded protobuf classes from "hadoop-thirdparty" in hadoop codebase.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Work started] (HADOOP-16596) [pb-upgrade] Use shaded protobuf classes from hadoop-thirdparty dependency

2019-10-09 Thread Vinayakumar B (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-16596?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Work on HADOOP-16596 started by Vinayakumar B.
--
> [pb-upgrade] Use shaded protobuf classes from hadoop-thirdparty dependency
> --
>
> Key: HADOOP-16596
> URL: https://issues.apache.org/jira/browse/HADOOP-16596
> Project: Hadoop Common
>  Issue Type: Sub-task
>Reporter: Vinayakumar B
>Assignee: Vinayakumar B
>Priority: Major
>
> Use the shaded protobuf classes from "hadoop-thirdparty" in hadoop codebase.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Commented] (HADOOP-16595) [pb-upgrade] Create hadoop-thirdparty artifact to have shaded protobuf

2019-10-09 Thread Vinayakumar B (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-16595?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16947528#comment-16947528
 ] 

Vinayakumar B commented on HADOOP-16595:


will update the PR later today.

> [pb-upgrade] Create hadoop-thirdparty artifact to have shaded protobuf
> --
>
> Key: HADOOP-16595
> URL: https://issues.apache.org/jira/browse/HADOOP-16595
> Project: Hadoop Common
>  Issue Type: Sub-task
>  Components: hadoop-thirdparty
>Reporter: Vinayakumar B
>Assignee: Vinayakumar B
>Priority: Major
>
> Create a separate repo "hadoop-thirdparty" to have shaded dependencies.
> starting with protobuf-java:3.7.1



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Commented] (HADOOP-13363) Upgrade protobuf from 2.5.0 to something newer

2019-09-30 Thread Vinayakumar B (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-13363?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16941538#comment-16941538
 ] 

Vinayakumar B commented on HADOOP-13363:


If you are specifically referring to the Spark/Hive case mentioned above, I could 
compile Spark from its root by passing {{-Dprotobuf.version=3.7.1}}.

In Spark's root pom.xml, protobuf-2.5.0 is explicitly added as a dependency to 
avoid problems with Maven auto-resolving to a different version. As per the 
comments, this dependency is only for hadoop/yarn.

 
{quote} If we are passing protobuf types around, either as inputs or outputs. 
Then we cannot shade the artifacts. And so: we cannot update protobuf 
"transparently".

{color:#172b4d}For Token, if that toProbuf method is considered private, well, 
we could move the operation out into its own class and invoke passing in a 
Token instance, but it is probably a symptom of a bigger problem -protobuf 
types in public APIs{color}
{quote}
 

IMO, Token's API change will not be problematic in normal cases.

{{TokenProto}} is a generated class whose package will not change, but its 
superclass changes from {{GeneratedMessage}} to {{GeneratedMessageV3}}.

Direct usages of {{TokenProto}} or its {{Builder}} will not be a problem. It 
can be problematic if an application embeds TokenProto in some other proto 
message whose code is generated using 2.5.0, or uses the common 
{{GeneratedMessage}} as the type for a {{TokenProto}}. But this is unlikely, as 
this feature is only provided from Hadoop 3.0.0 onwards.

Otherwise, I think it won't be a problem.
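
To make the "{{GeneratedMessage}} as the type for {{TokenProto}}" case above 
concrete, here is a small, purely illustrative sketch of the kind of downstream 
code that would break; the class and method names are made up for the example.
{code:java}
// Illustrative only: a hypothetical downstream utility compiled against Hadoop 3.0.x
// with protobuf 2.5.0, where the generated TokenProto is a subclass of GeneratedMessage.
import com.google.protobuf.GeneratedMessage;

import org.apache.hadoop.security.proto.SecurityProtos.TokenProto;
import org.apache.hadoop.security.token.Token;

public class DownstreamTokenAudit {

  // Accepts any protobuf-2.5.0-generated message.
  static void dump(GeneratedMessage msg) {
    System.out.println(msg.toString());
  }

  static void audit(Token<?> token) {
    // Fine against protobuf 2.5.0. After the upgrade, TokenProto extends
    // GeneratedMessageV3, which is not a GeneratedMessage, so this no longer
    // compiles, and already-compiled call sites fail with linkage errors at runtime.
    TokenProto proto = token.toTokenProto();
    dump(proto);
  }
}
{code}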

 

> Upgrade protobuf from 2.5.0 to something newer
> --
>
> Key: HADOOP-13363
> URL: https://issues.apache.org/jira/browse/HADOOP-13363
> Project: Hadoop Common
>  Issue Type: Improvement
>  Components: build
>Affects Versions: 3.0.0-alpha1, 3.0.0-alpha2
>Reporter: Anu Engineer
>Assignee: Vinayakumar B
>Priority: Major
>  Labels: security
> Attachments: HADOOP-13363.001.patch, HADOOP-13363.002.patch, 
> HADOOP-13363.003.patch, HADOOP-13363.004.patch, HADOOP-13363.005.patch
>
>
> Standard protobuf 2.5.0 does not work properly on many platforms.  (See, for 
> example, https://gist.github.com/BennettSmith/7111094 ).  In order for us to 
> avoid crazy work arounds in the build environment and the fact that 2.5.0 is 
> starting to slowly disappear as a standard install-able package for even 
> Linux/x86, we need to either upgrade or self bundle or something else.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Commented] (HADOOP-13363) Upgrade protobuf from 2.5.0 to something newer

2019-09-30 Thread Vinayakumar B (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-13363?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16940765#comment-16940765
 ] 

Vinayakumar B commented on HADOOP-13363:


There is no solid reason for choosing 3.7.1 instead of 3.9.1.
Maybe 3.7.1 is more thoroughly tested than 3.9.1, which is fairly new.
Anyway, this does not block anyone from upgrading directly to 3.9.1.
I have verified that current trunk compiles with 3.9.1 as well, by just passing 
-Dprotobuf.version=3.9.1.

> Upgrade protobuf from 2.5.0 to something newer
> --
>
> Key: HADOOP-13363
> URL: https://issues.apache.org/jira/browse/HADOOP-13363
> Project: Hadoop Common
>  Issue Type: Improvement
>  Components: build
>Affects Versions: 3.0.0-alpha1, 3.0.0-alpha2
>Reporter: Anu Engineer
>Assignee: Vinayakumar B
>Priority: Major
>  Labels: security
> Attachments: HADOOP-13363.001.patch, HADOOP-13363.002.patch, 
> HADOOP-13363.003.patch, HADOOP-13363.004.patch, HADOOP-13363.005.patch
>
>
> Standard protobuf 2.5.0 does not work properly on many platforms.  (See, for 
> example, https://gist.github.com/BennettSmith/7111094 ).  In order for us to 
> avoid crazy work arounds in the build environment and the fact that 2.5.0 is 
> starting to slowly disappear as a standard install-able package for even 
> Linux/x86, we need to either upgrade or self bundle or something else.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Commented] (HADOOP-13363) Upgrade protobuf from 2.5.0 to something newer

2019-09-27 Thread Vinayakumar B (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-13363?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16939645#comment-16939645
 ] 

Vinayakumar B commented on HADOOP-13363:


{quote}
<dependency>
  <groupId>com.google.protobuf</groupId>
  <artifactId>protobuf-java</artifactId>
  <version>${protobuf.version}</version>
  <scope>${hadoop.deps.scope}</scope>
</dependency>
{quote}
I found this in Spark's root pom.xml. {{$\{protobuf.version}}} is 2.5.0 there, 
and all the classpaths end up with 2.5.0 instead of 3.7.1 from trunk Hadoop.

Providing {{-Dprotobuf.version=3.7.1}} would solve it, I guess.

> Upgrade protobuf from 2.5.0 to something newer
> --
>
> Key: HADOOP-13363
> URL: https://issues.apache.org/jira/browse/HADOOP-13363
> Project: Hadoop Common
>  Issue Type: Improvement
>  Components: build
>Affects Versions: 3.0.0-alpha1, 3.0.0-alpha2
>Reporter: Anu Engineer
>Assignee: Vinayakumar B
>Priority: Major
>  Labels: security
> Attachments: HADOOP-13363.001.patch, HADOOP-13363.002.patch, 
> HADOOP-13363.003.patch, HADOOP-13363.004.patch, HADOOP-13363.005.patch
>
>
> Standard protobuf 2.5.0 does not work properly on many platforms.  (See, for 
> example, https://gist.github.com/BennettSmith/7111094 ).  In order for us to 
> avoid crazy work arounds in the build environment and the fact that 2.5.0 is 
> starting to slowly disappear as a standard install-able package for even 
> Linux/x86, we need to either upgrade or self bundle or something else.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Commented] (HADOOP-13363) Upgrade protobuf from 2.5.0 to something newer

2019-09-27 Thread Vinayakumar B (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-13363?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16939627#comment-16939627
 ] 

Vinayakumar B commented on HADOOP-13363:


Is protobuf not available in the classpath transitively?

> Upgrade protobuf from 2.5.0 to something newer
> --
>
> Key: HADOOP-13363
> URL: https://issues.apache.org/jira/browse/HADOOP-13363
> Project: Hadoop Common
>  Issue Type: Improvement
>  Components: build
>Affects Versions: 3.0.0-alpha1, 3.0.0-alpha2
>Reporter: Anu Engineer
>Assignee: Vinayakumar B
>Priority: Major
>  Labels: security
> Attachments: HADOOP-13363.001.patch, HADOOP-13363.002.patch, 
> HADOOP-13363.003.patch, HADOOP-13363.004.patch, HADOOP-13363.005.patch
>
>
> Standard protobuf 2.5.0 does not work properly on many platforms.  (See, for 
> example, https://gist.github.com/BennettSmith/7111094 ).  In order for us to 
> avoid crazy work arounds in the build environment and the fact that 2.5.0 is 
> starting to slowly disappear as a standard install-able package for even 
> Linux/x86, we need to either upgrade or self bundle or something else.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Work started] (HADOOP-16595) [pb-upgrade] Create hadoop-thirdparty artifact to have shaded protobuf

2019-09-27 Thread Vinayakumar B (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-16595?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Work on HADOOP-16595 started by Vinayakumar B.
--
> [pb-upgrade] Create hadoop-thirdparty artifact to have shaded protobuf
> --
>
> Key: HADOOP-16595
> URL: https://issues.apache.org/jira/browse/HADOOP-16595
> Project: Hadoop Common
>  Issue Type: Sub-task
>  Components: hadoop-thirdparty
>Reporter: Vinayakumar B
>Assignee: Vinayakumar B
>Priority: Major
>
> Create a separate repo "hadoop-thirdparty" to have shaded dependencies.
> starting with protobuf-java:3.7.1



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Created] (HADOOP-16596) [pb-upgrade] Use shaded protobuf classes from hadoop-thirdparty dependency

2019-09-24 Thread Vinayakumar B (Jira)
Vinayakumar B created HADOOP-16596:
--

 Summary: [pb-upgrade] Use shaded protobuf classes from 
hadoop-thirdparty dependency
 Key: HADOOP-16596
 URL: https://issues.apache.org/jira/browse/HADOOP-16596
 Project: Hadoop Common
  Issue Type: Sub-task
Reporter: Vinayakumar B


Use the shaded protobuf classes from "hadoop-thirdparty" in hadoop codebase.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Assigned] (HADOOP-16596) [pb-upgrade] Use shaded protobuf classes from hadoop-thirdparty dependency

2019-09-24 Thread Vinayakumar B (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-16596?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Vinayakumar B reassigned HADOOP-16596:
--

Assignee: Vinayakumar B

> [pb-upgrade] Use shaded protobuf classes from hadoop-thirdparty dependency
> --
>
> Key: HADOOP-16596
> URL: https://issues.apache.org/jira/browse/HADOOP-16596
> Project: Hadoop Common
>  Issue Type: Sub-task
>Reporter: Vinayakumar B
>Assignee: Vinayakumar B
>Priority: Major
>
> Use the shaded protobuf classes from "hadoop-thirdparty" in hadoop codebase.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Assigned] (HADOOP-16595) [pb-upgrade] Create hadoop-thirdparty artifact to have shaded protobuf

2019-09-24 Thread Vinayakumar B (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-16595?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Vinayakumar B reassigned HADOOP-16595:
--

Assignee: Vinayakumar B

> [pb-upgrade] Create hadoop-thirdparty artifact to have shaded protobuf
> --
>
> Key: HADOOP-16595
> URL: https://issues.apache.org/jira/browse/HADOOP-16595
> Project: Hadoop Common
>  Issue Type: Sub-task
>  Components: hadoop-thirdparty
>Reporter: Vinayakumar B
>Assignee: Vinayakumar B
>Priority: Major
>
> Create a separate repo "hadoop-thirdparty" to have shaded dependencies.
> starting with protobuf-java:3.7.1



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Created] (HADOOP-16595) [pb-upgrade] Create hadoop-thirdparty artifact to have shaded protobuf

2019-09-24 Thread Vinayakumar B (Jira)
Vinayakumar B created HADOOP-16595:
--

 Summary: [pb-upgrade] Create hadoop-thirdparty artifact to have 
shaded protobuf
 Key: HADOOP-16595
 URL: https://issues.apache.org/jira/browse/HADOOP-16595
 Project: Hadoop Common
  Issue Type: Sub-task
  Components: hadoop-thirdparty
Reporter: Vinayakumar B


Create a separate repo "hadoop-thirdparty" to have shaded dependencies.

starting with protobuf-java:3.7.1



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Resolved] (HADOOP-16561) [MAPREDUCE] use protobuf-maven-plugin to generate protobuf classes

2019-09-24 Thread Vinayakumar B (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-16561?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Vinayakumar B resolved HADOOP-16561.

Fix Version/s: 3.3.0
 Hadoop Flags: Reviewed
   Resolution: Fixed

Merged to trunk.

> [MAPREDUCE] use protobuf-maven-plugin to generate protobuf classes
> --
>
> Key: HADOOP-16561
> URL: https://issues.apache.org/jira/browse/HADOOP-16561
> Project: Hadoop Common
>  Issue Type: Sub-task
>Reporter: Vinayakumar B
>Assignee: Duo Zhang
>Priority: Major
> Fix For: 3.3.0
>
>
> Use "protoc-maven-plugin" to dynamically download protobuf executable to 
> generate protobuf classes from proto file



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Resolved] (HADOOP-16560) [YARN] use protobuf-maven-plugin to generate protobuf classes

2019-09-24 Thread Vinayakumar B (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-16560?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Vinayakumar B resolved HADOOP-16560.

Fix Version/s: 3.3.0
 Hadoop Flags: Reviewed
   Resolution: Fixed

Already merged to trunk.

> [YARN] use protobuf-maven-plugin to generate protobuf classes
> -
>
> Key: HADOOP-16560
> URL: https://issues.apache.org/jira/browse/HADOOP-16560
> Project: Hadoop Common
>  Issue Type: Sub-task
>Reporter: Vinayakumar B
>Assignee: Duo Zhang
>Priority: Major
> Fix For: 3.3.0
>
>
> Use "protoc-maven-plugin" to dynamically download protobuf executable to 
> generate protobuf classes from proto file



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Updated] (HADOOP-16558) [COMMON+HDFS] use protobuf-maven-plugin to generate protobuf classes

2019-09-23 Thread Vinayakumar B (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-16558?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Vinayakumar B updated HADOOP-16558:
---
Fix Version/s: 3.3.0
 Hadoop Flags: Reviewed
   Resolution: Fixed
   Status: Resolved  (was: Patch Available)

Merged the PR into trunk.

Thanks [~zhangduo] for the reviews.

> [COMMON+HDFS] use protobuf-maven-plugin to generate protobuf classes
> 
>
> Key: HADOOP-16558
> URL: https://issues.apache.org/jira/browse/HADOOP-16558
> Project: Hadoop Common
>  Issue Type: Sub-task
>  Components: common
>Reporter: Vinayakumar B
>Assignee: Vinayakumar B
>Priority: Major
> Fix For: 3.3.0
>
>
> Use "protoc-maven-plugin" to dynamically download protobuf executable to 
> generate protobuf classes from proto files.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Comment Edited] (HADOOP-16559) [HDFS] use protobuf-maven-plugin to generate protobuf classes

2019-09-22 Thread Vinayakumar B (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-16559?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16935388#comment-16935388
 ] 

Vinayakumar B edited comment on HADOOP-16559 at 9/22/19 6:49 PM:
-

{quote}And here, since this is not a trivial change, I suggest we revert 
HADOOP-16557 first? Now it is a pain for developers as we need protobuf 3.7.1 
to compile trunk and then 2.5.0 to compile other branches...
{quote}
There is no need to switch back and forth to build separate branches. protobuf 3.7.1 
can be installed in a separate location (ex: /opt/protobuf-3.7.1), and the "protoc.path" 
system property pointing to the absolute path of the 3.7.1 protoc (ex: 
-Dprotoc.path=/opt/protobuf-3.7.1/bin/protoc) can be used to build trunk until 
we switch over to protobuf-maven-plugin.

 

Instead of compiling protobuf from source (which is a very time-consuming task), 
you can download the applicable protoc executable from Maven Central 
[here|https://repo1.maven.org/maven2/com/google/protobuf/protoc/3.7.1/] 
directly and provide it as "protoc.path". Make sure the executable has execute 
permissions.


was (Author: vinayrpet):
{quote}And here, since this is not a trivial change, I suggest we revert 
HADOOP-16557 first? Now it is a pain for developers as we need protobuf 3.7.1 
to compile trunk and then 2.5.0 to compile other branches...
{quote}
There is no need to switch back and forth to build separate branches. protobuf 3.7.1 
can be installed in a separate location (ex: /opt/protobuf-3.7.1), and the "protoc.path" 
system property pointing to the absolute path of the 3.7.1 protoc (ex: 
-Dprotoc.path=/opt/protobuf-3.7.1/bin/protoc) can be used to build trunk until 
we switch over to protobuf-maven-plugin.

> [HDFS] use protobuf-maven-plugin to generate protobuf classes
> -
>
> Key: HADOOP-16559
> URL: https://issues.apache.org/jira/browse/HADOOP-16559
> Project: Hadoop Common
>  Issue Type: Sub-task
>Reporter: Vinayakumar B
>Priority: Major
>
> Use "protoc-maven-plugin" to dynamically download protobuf executable to 
> generate protobuf classes from proto file



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Commented] (HADOOP-16559) [HDFS] use protobuf-maven-plugin to generate protobuf classes

2019-09-22 Thread Vinayakumar B (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-16559?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16935388#comment-16935388
 ] 

Vinayakumar B commented on HADOOP-16559:


{quote}And here, since this is not a trivial change, I suggest we revert 
HADOOP-16557 first? Now it is a pain for developers as we need protobuf 3.7.1 
to compile trunk and then 2.5.0 to compile other branches...
{quote}
There is no need to switch back and forth to build separate branches. protobuf 3.7.1 
can be installed in a separate location (ex: /opt/protobuf-3.7.1), and the "protoc.path" 
system property pointing to the absolute path of the 3.7.1 protoc (ex: 
-Dprotoc.path=/opt/protobuf-3.7.1/bin/protoc) can be used to build trunk until 
we switch over to protobuf-maven-plugin.

> [HDFS] use protobuf-maven-plugin to generate protobuf classes
> -
>
> Key: HADOOP-16559
> URL: https://issues.apache.org/jira/browse/HADOOP-16559
> Project: Hadoop Common
>  Issue Type: Sub-task
>Reporter: Vinayakumar B
>Priority: Major
>
> Use "protoc-maven-plugin" to dynamically download protobuf executable to 
> generate protobuf classes from proto file



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Commented] (HADOOP-16559) [HDFS] use protobuf-maven-plugin to generate protobuf classes

2019-09-21 Thread Vinayakumar B (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-16559?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16935118#comment-16935118
 ] 

Vinayakumar B commented on HADOOP-16559:


Thanks [~zhangduo] for pitching in. 

I too have tried updating all instances of hadoop-maven-plugin in one PR. But with 
that, the precommit Jenkins result will never come, as it needs to verify all 
modules with unit tests.

It will always get timed out.

So I decided to do it in 4 patches. Since HADOOP-16558 combines both COMMON+HDFS, 
a total of 3 patches is required. YARN and MAPREDUCE together may also time out, 
so they need to be kept separate.

Moreover, by default protoc-maven-plugin adds all proto files to the jars. This 
was not the earlier behavior, so I have excluded them from getting added. Please 
check the PR for HADOOP-16558.

 

> [HDFS] use protobuf-maven-plugin to generate protobuf classes
> -
>
> Key: HADOOP-16559
> URL: https://issues.apache.org/jira/browse/HADOOP-16559
> Project: Hadoop Common
>  Issue Type: Sub-task
>Reporter: Vinayakumar B
>Assignee: Duo Zhang
>Priority: Major
>
> Use "protoc-maven-plugin" to dynamically download protobuf executable to 
> generate protobuf classes from proto file



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Commented] (HADOOP-16558) [COMMON+HDFS] use protobuf-maven-plugin to generate protobuf classes

2019-09-21 Thread Vinayakumar B (Jira)


[ 
https://issues.apache.org/jira/browse/HADOOP-16558?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16935115#comment-16935115
 ] 

Vinayakumar B commented on HADOOP-16558:


[~zhangduo], can you take a look at the PR?

This task replaces hadoop-maven-plugin for the common and hdfs modules.

> [COMMON+HDFS] use protobuf-maven-plugin to generate protobuf classes
> 
>
> Key: HADOOP-16558
> URL: https://issues.apache.org/jira/browse/HADOOP-16558
> Project: Hadoop Common
>  Issue Type: Sub-task
>  Components: common
>Reporter: Vinayakumar B
>Assignee: Vinayakumar B
>Priority: Major
>
> Use "protoc-maven-plugin" to dynamically download protobuf executable to 
> generate protobuf classes from proto files.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org



[jira] [Updated] (HADOOP-16558) [COMMON+HDFS] use protobuf-maven-plugin to generate protobuf classes

2019-09-21 Thread Vinayakumar B (Jira)


 [ 
https://issues.apache.org/jira/browse/HADOOP-16558?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Vinayakumar B updated HADOOP-16558:
---
Status: Patch Available  (was: In Progress)

> [COMMON+HDFS] use protobuf-maven-plugin to generate protobuf classes
> 
>
> Key: HADOOP-16558
> URL: https://issues.apache.org/jira/browse/HADOOP-16558
> Project: Hadoop Common
>  Issue Type: Sub-task
>  Components: common
>Reporter: Vinayakumar B
>Assignee: Vinayakumar B
>Priority: Major
>
> Use "protoc-maven-plugin" to dynamically download protobuf executable to 
> generate protobuf classes from proto files.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org


