[jira] [Created] (HADOOP-16551) The changelog*.md seems not generated when create-release

2019-09-05 Thread Zhankun Tang (Jira)
Zhankun Tang created HADOOP-16551:
-

 Summary: The changelog*.md seems not generated when create-release
 Key: HADOOP-16551
 URL: https://issues.apache.org/jira/browse/HADOOP-16551
 Project: Hadoop Common
  Issue Type: Task
Reporter: Zhankun Tang


Hi,
 When creating the Hadoop 3.1.3 release with the "create-release" script, the mvn
site build succeeded, but the script then complained about this and failed:

{code:java}
dev-support/bin/create-release --asfrelease --docker --dockercache{code}
{code:java}
$ cd /build/source
$ mv /build/source/target/hadoop-site-3.1.3.tar.gz 
/build/source/target/artifacts/hadoop-3.1.3-site.tar.gz
$ cp -p 
/build/source/hadoop-common-project/hadoop-common/src/site/markdown/release/3.1.3/CHANGES*.md
 /build/source/target/artifacts/CHANGES.md
cp: cannot stat 
'/build/source/hadoop-common-project/hadoop-common/src/site/markdown/release/3.1.3/CHANGES*.md':
 No such file or directory
{code}

And there's no 3.1.3 release site markdown folder.
{code:java}
[ztang@release-vm hadoop]$ ls 
hadoop-common-project/hadoop-common/src/site/markdown/release/3.1.3
ls: cannot access 
hadoop-common-project/hadoop-common/src/site/markdown/release/3.1.3: No such 
file or directory

{code}
I've checked HADOOP-14671 but have no idea why the changelog is not
generated.
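
My understanding (which may be wrong) is that create-release produces these files via
Apache Yetus releasedocmaker during the site build, so a manual run along the
following lines might help narrow down where generation stops. The flag names and
output location below are assumptions on my side; please check
"releasedocmaker --help" for the exact options of your Yetus version.
{code:java}
# Hedged sketch, not verified: generate the 3.1.3 changelog/release notes manually
# with Apache Yetus releasedocmaker (flag names assumed; see --help).
cd hadoop-common-project/hadoop-common/src/site/markdown/release
releasedocmaker --project HADOOP --version 3.1.3
{code}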



--
This message was sent by Atlassian Jira
(v8.3.2#803003)

-
To unsubscribe, e-mail: common-dev-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-dev-h...@hadoop.apache.org



Re: [DISCUSS] ARM/aarch64 support for Hadoop

2019-09-05 Thread Vinayakumar B
Thanks @Anu

I understand the concern; I had taken it in a different manner.

Anyway, since the protobuf upgrade looks huge and needs everyone's eyes on the
changes as early as possible, it's better to do it in trunk itself.

I was able to make a successful attempt at upgrading protobuf as per
stack's suggestion in the Jira:
https://issues.apache.org/jira/browse/HADOOP-13363?focusedCommentId=15958253&page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#comment-15958253

I have created the PR; please review. The changes look huge because all
references to "com.google.protobuf" are relocated to
"o.a.h.shaded.com.google.protobuf"; otherwise the changes are reasonable.
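
In case it helps reviewers, here is a minimal maven-shade-plugin sketch of what such
a relocation typically looks like (illustration only, not the actual PR; the shaded
prefix is spelled out from the "o.a.h.shaded" abbreviation above, versions omitted):

<!-- Illustrative sketch: relocate protobuf classes into Hadoop's shaded namespace. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <relocations>
          <relocation>
            <pattern>com.google.protobuf</pattern>
            <shadedPattern>org.apache.hadoop.shaded.com.google.protobuf</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>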

This change still keeps the current 2.5.0 dependency for downstream builds,
so essentially nothing should change for downstreams.

-Vinay


On Thu, Sep 5, 2019 at 9:56 PM Anu Engineer  wrote:

> Yes, I think that is what Sunil and I are trying to suggest; the complex
> dependencies like Protobuf, if you do it in the trunk you have a better
> change of getting it done. Otherwise, at merge point random downstream
> applications which you have never heard of will object, and Hadoop
> compatibility rules are very clear so you cannot fix it.
>
> With that said, even doing this in the trunk is complex; It might be good
> for you to host a meeting and get some feedback. I have openly said it is a
> great idea like "belling the cat", but the effort is in getting the
> community to agree and align. Solve that, most of your technical problems
> will be easier to solve.
>
> If you go into a branch, it might be that the community might forget about
> your work; and when you come in to merge you will see issues which you did
> not think about.
>
> So, Here is what would be great if you can make this happen; for ARM work,
> get a list of dependencies that needed to be upgraded; see if you can get
> the community aligned with this goal; since ARM might not be in the target
> for many users. You need to convince users that even without ARM, this is a
> great idea.
>
> If you like we can get together during one of the HDFS meetups hosted by
> Wei-chiu.
>
> Thanks
> Anu
>
>
>
> On Thu, Sep 5, 2019 at 3:19 AM Vinayakumar B 
> wrote:
>
>> Hi all,
>>
>> Thanks for the response.
>> As I see, protobuf upgrade is long pending and most awaited one.
>>
>> @Sunil
>> Protobuf upgrade looks to be a non-trivial task.
>> Thanks @Duo Zhang for the suggestion of
>> 'org.xolstice.maven.plugins:protobuf-maven-plugin'. This solves the
>> problem
>> of dependency on build environment.
>> However more problem lies in upgrade protobuf without breaking the
>> downstream builds.
>> Reason is many downstream projects directly refers server specific jars
>> and
>> they expect protobuf-2.5.0 jar to get added to classpath by transitive
>> dependency.
>>
>> There are some historical discussions and suggestions on
>> https://issues.apache.org/jira/browse/HADOOP-13363 related to protobuf
>> upgrade.
>> Recommended path for solution is, try to upgrade protobuf using shading of
>> latest protobuf for hadoop, and still keep protobuf-2.5.0 jar in
>> dependencies for downstreams.
>> I am trying out ideas on this and if it gets completed within time, may be
>> we can target trunk itself for the protobuf upgrade.
>>
>> separate branch idea is suggested for the overall ARM support including
>> protobuf upgrade and other problems mentioned in the discussion above.
>>
>> I dont expect separate branch to have a huge changes, apart from bug
>> fixes,
>> since there are no separate features specific to ARM is being planned.
>> So timely rebase of separate branch would reduce the overhead on branch
>> review/merge task.
>>
>> Still, if the solution to protobuf upgrade winds up early, without any
>> side
>> effects, I am more than happy to land it in trunk itself.
>>
>> Thanks,
>> Vinay
>> On Thu, Sep 5, 2019 at 2:27 PM Sunil Govindan  wrote:
>>
>> > Thanks Vinay for starting the thread.
>> >
>> > I agree to Anu's view point related to protobuf. And with the suggestion
>> > pointed out by Duo Zhang, if we can make use
>> > of org.xolstice.maven.plugins:protobuf-maven-plugin, our upgrade to
>> 3.0.0
>> > of protobuf will also be more easier.
>> >
>> > However i think its better to do this effort in trunk itself.
>> > In offline talks, few members were interested to start 3.3.0 release.
>> And
>> > given that happens soon, I feel its better
>> > we do this task in trunk itself as branch diverge is very much possible.
>> > And to bring to call a merge on such a big branch will be even more
>> tough
>> > task.
>> >
>> > my 2 cents.
>> >
>> > Thanks
>> > Sunil
>> >
>> > On Thu, Sep 5, 2019 at 6:04 AM 张铎(Duo Zhang) 
>> > wrote:
>> >
>> >> Suggest to use org.xolstice.maven.plugins:protobuf-maven-plugin to
>> >> generate
>> >> the protobuf code. It will download the protoc binary from the maven
>> >> central so we do not need to install protoc on the build machine any
>> more.
>> >>
Zhenyu Zheng  wrote on Wed, Sep 4, 2019

Re: [DISCUSS] ARM/aarch64 support for Hadoop

2019-09-05 Thread Anu Engineer
Yes, I think that is what Sunil and I are trying to suggest: for complex
dependencies like Protobuf, if you do it in trunk you have a better
chance of getting it done. Otherwise, at merge time random downstream
applications which you have never heard of will object, and Hadoop's
compatibility rules are very clear, so you cannot fix it.

With that said, even doing this in trunk is complex; it might be good
for you to host a meeting and get some feedback. I have openly said it is a
great idea, like "belling the cat", but the effort is in getting the
community to agree and align. Solve that, and most of your technical problems
will be easier to solve.

If you go into a branch, the community might forget about
your work, and when you come in to merge you will see issues which you did
not think about.

So, here is what would be great if you can make this happen: for the ARM work,
get a list of dependencies that need to be upgraded and see if you can get
the community aligned with this goal. Since ARM might not be a target
for many users, you need to convince users that even without ARM this is a
great idea.

If you like we can get together during one of the HDFS meetups hosted by
Wei-chiu.

Thanks
Anu



On Thu, Sep 5, 2019 at 3:19 AM Vinayakumar B 
wrote:

> Hi all,
>
> Thanks for the response.
> As I see, protobuf upgrade is long pending and most awaited one.
>
> @Sunil
> Protobuf upgrade looks to be a non-trivial task.
> Thanks @Duo Zhang for the suggestion of
> 'org.xolstice.maven.plugins:protobuf-maven-plugin'. This solves the problem
> of dependency on build environment.
> However more problem lies in upgrade protobuf without breaking the
> downstream builds.
> Reason is many downstream projects directly refers server specific jars and
> they expect protobuf-2.5.0 jar to get added to classpath by transitive
> dependency.
>
> There are some historical discussions and suggestions on
> https://issues.apache.org/jira/browse/HADOOP-13363 related to protobuf
> upgrade.
> Recommended path for solution is, try to upgrade protobuf using shading of
> latest protobuf for hadoop, and still keep protobuf-2.5.0 jar in
> dependencies for downstreams.
> I am trying out ideas on this and if it gets completed within time, may be
> we can target trunk itself for the protobuf upgrade.
>
> separate branch idea is suggested for the overall ARM support including
> protobuf upgrade and other problems mentioned in the discussion above.
>
> I dont expect separate branch to have a huge changes, apart from bug fixes,
> since there are no separate features specific to ARM is being planned.
> So timely rebase of separate branch would reduce the overhead on branch
> review/merge task.
>
> Still, if the solution to protobuf upgrade winds up early, without any side
> effects, I am more than happy to land it in trunk itself.
>
> Thanks,
> Vinay
> On Thu, Sep 5, 2019 at 2:27 PM Sunil Govindan  wrote:
>
> > Thanks Vinay for starting the thread.
> >
> > I agree to Anu's view point related to protobuf. And with the suggestion
> > pointed out by Duo Zhang, if we can make use
> > of org.xolstice.maven.plugins:protobuf-maven-plugin, our upgrade to 3.0.0
> > of protobuf will also be more easier.
> >
> > However i think its better to do this effort in trunk itself.
> > In offline talks, few members were interested to start 3.3.0 release. And
> > given that happens soon, I feel its better
> > we do this task in trunk itself as branch diverge is very much possible.
> > And to bring to call a merge on such a big branch will be even more tough
> > task.
> >
> > my 2 cents.
> >
> > Thanks
> > Sunil
> >
> > On Thu, Sep 5, 2019 at 6:04 AM 张铎(Duo Zhang) 
> > wrote:
> >
> >> Suggest to use org.xolstice.maven.plugins:protobuf-maven-plugin to
> >> generate
> >> the protobuf code. It will download the protoc binary from the maven
> >> central so we do not need to install protoc on the build machine any
> more.
> >>
> >> Zhenyu Zheng  wrote on Wed, Sep 4, 2019 at 5:27 PM:
> >>
> >> > BTW, I also noticed that the Hadoop-trunk-Commit job has been failling
> >> for
> >> > over 2 month related to the Protobuf problem .
> >> > According to the latest successful build log:
> >> >
> >>
> https://builds.apache.org/job/Hadoop-trunk-Commit/lastSuccessfulBuild/consoleFull
> >> > the
> >> > os version was ubuntu 14.04 and for the jobs that are failling now
> such
> >> > as: https://builds.apache.org/job/Hadoop-trunk-Commit/17222/console,
> >> > the os version is 18.04. I'm not very familiar with the version
> changing
> >> > for the jobs but I did a little search, according to:
> >> >
> >> >
> >>
> https://packages.ubuntu.com/search?keywords=protobuf-compiler=names
> >> > &
> >> >
> >> >
> >>
> https://packages.ubuntu.com/search?suite=default=all=any=libprotoc-dev=names
> >> > it both said that the version of libprotc-dev and protobuf-compiler
> >> > available for ubuntu 18.04 is 3.0.0
> >> >
> >> >
> >> > On Wed, Sep 4, 2019 at 4:39 PM Ayush Saxena 
> wrote:
> >> >

Apache Hadoop qbt Report: branch2+JDK7 on Linux/x86

2019-09-05 Thread Apache Jenkins Server
For more details, see 
https://builds.apache.org/job/hadoop-qbt-branch2-java7-linux-x86/435/

No changes




-1 overall


The following subsystems voted -1:
asflicense findbugs hadolint pathlen unit xml


The following subsystems voted -1 but
were configured to be filtered/ignored:
cc checkstyle javac javadoc pylint shellcheck shelldocs whitespace


The following subsystems are considered long running:
(runtime bigger than 1h  0m  0s)
unit


Specific tests:

XML :

   Parsing Error(s): 
   
hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/conf/empty-configuration.xml
 
   hadoop-tools/hadoop-azure/src/config/checkstyle-suppressions.xml 
   hadoop-yarn-project/hadoop-yarn/hadoop-yarn-ui/public/crossdomain.xml 
   
hadoop-yarn-project/hadoop-yarn/hadoop-yarn-ui/src/main/webapp/public/crossdomain.xml
 

FindBugs :

   
module:hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-timelineservice-hbase/hadoop-yarn-server-timelineservice-hbase-client
 
   Boxed value is unboxed and then immediately reboxed in 
org.apache.hadoop.yarn.server.timelineservice.storage.common.ColumnRWHelper.readResultsWithTimestamps(Result,
 byte[], byte[], KeyConverter, ValueConverter, boolean) At 
ColumnRWHelper.java:then immediately reboxed in 
org.apache.hadoop.yarn.server.timelineservice.storage.common.ColumnRWHelper.readResultsWithTimestamps(Result,
 byte[], byte[], KeyConverter, ValueConverter, boolean) At 
ColumnRWHelper.java:[line 335] 

Failed junit tests :

   hadoop.util.TestReadWriteDiskValidator 
   hadoop.hdfs.TestQuota 
   hadoop.hdfs.server.namenode.TestNameNodeHttpServerXFrame 
   hadoop.hdfs.qjournal.server.TestJournalNodeRespectsBindHostKeys 
   hadoop.hdfs.server.datanode.TestDirectoryScanner 
   hadoop.yarn.client.api.impl.TestAMRMClient 
   hadoop.registry.secure.TestSecureLogins 
   hadoop.yarn.server.timelineservice.security.TestTimelineAuthFilterForV2 
  

   cc:

   
https://builds.apache.org/job/hadoop-qbt-branch2-java7-linux-x86/435/artifact/out/diff-compile-cc-root-jdk1.7.0_95.txt
  [4.0K]

   javac:

   
https://builds.apache.org/job/hadoop-qbt-branch2-java7-linux-x86/435/artifact/out/diff-compile-javac-root-jdk1.7.0_95.txt
  [328K]

   cc:

   
https://builds.apache.org/job/hadoop-qbt-branch2-java7-linux-x86/435/artifact/out/diff-compile-cc-root-jdk1.8.0_222.txt
  [4.0K]

   javac:

   
https://builds.apache.org/job/hadoop-qbt-branch2-java7-linux-x86/435/artifact/out/diff-compile-javac-root-jdk1.8.0_222.txt
  [308K]

   checkstyle:

   
https://builds.apache.org/job/hadoop-qbt-branch2-java7-linux-x86/435/artifact/out/diff-checkstyle-root.txt
  [16M]

   hadolint:

   
https://builds.apache.org/job/hadoop-qbt-branch2-java7-linux-x86/435/artifact/out/diff-patch-hadolint.txt
  [4.0K]

   pathlen:

   
https://builds.apache.org/job/hadoop-qbt-branch2-java7-linux-x86/435/artifact/out/pathlen.txt
  [12K]

   pylint:

   
https://builds.apache.org/job/hadoop-qbt-branch2-java7-linux-x86/435/artifact/out/diff-patch-pylint.txt
  [24K]

   shellcheck:

   
https://builds.apache.org/job/hadoop-qbt-branch2-java7-linux-x86/435/artifact/out/diff-patch-shellcheck.txt
  [72K]

   shelldocs:

   
https://builds.apache.org/job/hadoop-qbt-branch2-java7-linux-x86/435/artifact/out/diff-patch-shelldocs.txt
  [8.0K]

   whitespace:

   
https://builds.apache.org/job/hadoop-qbt-branch2-java7-linux-x86/435/artifact/out/whitespace-eol.txt
  [12M]
   
https://builds.apache.org/job/hadoop-qbt-branch2-java7-linux-x86/435/artifact/out/whitespace-tabs.txt
  [1.3M]

   xml:

   
https://builds.apache.org/job/hadoop-qbt-branch2-java7-linux-x86/435/artifact/out/xml.txt
  [12K]

   findbugs:

   
https://builds.apache.org/job/hadoop-qbt-branch2-java7-linux-x86/435/artifact/out/branch-findbugs-hadoop-yarn-project_hadoop-yarn_hadoop-yarn-server_hadoop-yarn-server-timelineservice-hbase_hadoop-yarn-server-timelineservice-hbase-client-warnings.html
  [8.0K]

   javadoc:

   
https://builds.apache.org/job/hadoop-qbt-branch2-java7-linux-x86/435/artifact/out/diff-javadoc-javadoc-root-jdk1.7.0_95.txt
  [16K]
   
https://builds.apache.org/job/hadoop-qbt-branch2-java7-linux-x86/435/artifact/out/diff-javadoc-javadoc-root-jdk1.8.0_222.txt
  [1.1M]

   unit:

   
https://builds.apache.org/job/hadoop-qbt-branch2-java7-linux-x86/435/artifact/out/patch-unit-hadoop-common-project_hadoop-common.txt
  [160K]
   
https://builds.apache.org/job/hadoop-qbt-branch2-java7-linux-x86/435/artifact/out/patch-unit-hadoop-hdfs-project_hadoop-hdfs.txt
  [236K]
   
https://builds.apache.org/job/hadoop-qbt-branch2-java7-linux-x86/435/artifact/out/patch-unit-hadoop-mapreduce-project_hadoop-mapreduce-client_hadoop-mapreduce-client-jobclient.txt
  [96K]
   

Re: [VOTE] Moving Submarine to a separate Apache project proposal

2019-09-05 Thread Abhishek Modi
+1, Thanks for the proposal.

I am interested in the project. Please include me.

On Thu, Sep 5, 2019 at 10:11 AM Uma Maheswara Rao Gangumalla <
umaganguma...@gmail.com> wrote:

> +1
>
> Regards,
> Uma
>
> On Sat, Aug 31, 2019, 10:19 PM Wangda Tan  wrote:
>
> > Hi all,
> >
> > As we discussed in the previous thread [1],
> >
> > I just moved the spin-off proposal to CWIKI and completed all TODO parts.
> >
> >
> >
> https://cwiki.apache.org/confluence/display/HADOOP/Submarine+Project+Spin-Off+to+TLP+Proposal
> >
> > If you have interests to learn more about this. Please review the
> proposal
> > let me know if you have any questions/suggestions for the proposal. This
> > will be sent to board post voting passed. (And please note that the
> > previous voting thread [2] to move Submarine to a separate Github repo
> is a
> > necessary effort to move Submarine to a separate Apache project but not
> > sufficient so I sent two separate voting thread.)
> >
> > Please let me know if I missed anyone in the proposal, and reply if you'd
> > like to be included in the project.
> >
> > This voting runs for 7 days and will be concluded at Sep 7th, 11 PM PDT.
> >
> > Thanks,
> > Wangda Tan
> >
> > [1]
> >
> >
> https://lists.apache.org/thread.html/4a2210d567cbc05af92c12aa6283fd09b857ce209d537986ed800029@%3Cyarn-dev.hadoop.apache.org%3E
> > [2]
> >
> >
> https://lists.apache.org/thread.html/6e94469ca105d5a15dc63903a541bd21c7ef70b8bcff475a16b5ed73@%3Cyarn-dev.hadoop.apache.org%3E
> >
>


-- 
With Regards,
Abhishek Modi
Member of Technical Staff,
Qubole Private Ltd,
Bengaluru
Mobile: +91-9560486536


[jira] [Created] (HADOOP-16550) Wrong Spark config name on the "Launching Applications Using Docker Containers" page

2019-09-05 Thread Attila Zsolt Piros (Jira)
Attila Zsolt Piros created HADOOP-16550:
---

 Summary: Wrong Spark config name on the "Launching Applications 
Using Docker Containers" page
 Key: HADOOP-16550
 URL: https://issues.apache.org/jira/browse/HADOOP-16550
 Project: Hadoop Common
  Issue Type: Bug
  Components: documentation
Affects Versions: 3.1.2, 2.8.5, 3.0.3, 2.9.2, 3.1.1, 3.0.2, 2.8.4, 3.0.1, 
2.9.1, 3.1.0, 3.0.0, 2.8.3, 2.8.2, 2.9.0
Reporter: Attila Zsolt Piros


On the "Launching Applications Using Docker Containers" page at the "Example: 
Spark" section the Spark config for configuring the environment variables for 
the application master the config prefix are wrong:
- 
spark.yarn.{color:#DE350B}*A*{color}ppMasterEnv.YARN_CONTAINER_RUNTIME_DOCKER_IMAGE
- park.yarn.{color:#DE350B}*A*{color}ppMasterEnv.YARN_CONTAINER_RUNTIME_TYPE  

The correct ones:
- spark.yarn.appMasterEnv.YARN_CONTAINER_RUNTIME_DOCKER_IMAGE
- spark.yarn.appMasterEnv.YARN_CONTAINER_RUNTIME_TYPE

See https://spark.apache.org/docs/2.4.0/running-on-yarn.html:

{quote}
spark.yarn.appMasterEnv.[EnvironmentVariableName]
{quote}
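
For context, here is a hedged example of how these configs are typically passed to
spark-submit in cluster mode (the image name and application jar below are made up
purely for illustration):
{code:java}
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --conf spark.yarn.appMasterEnv.YARN_CONTAINER_RUNTIME_TYPE=docker \
  --conf spark.yarn.appMasterEnv.YARN_CONTAINER_RUNTIME_DOCKER_IMAGE=example/hadoop-docker:latest \
  --class org.apache.spark.examples.SparkPi \
  spark-examples.jar
{code}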




--
This message was sent by Atlassian Jira
(v8.3.2#803003)

-
To unsubscribe, e-mail: common-dev-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-dev-h...@hadoop.apache.org



Re: [DISCUSS] ARM/aarch64 support for Hadoop

2019-09-05 Thread Vinayakumar B
Hi all,

Thanks for the response.
As I see it, the protobuf upgrade is long pending and the most awaited one.

@Sunil
The protobuf upgrade looks to be a non-trivial task.
Thanks @Duo Zhang for the suggestion of
'org.xolstice.maven.plugins:protobuf-maven-plugin'. This solves the problem
of depending on the build environment.
However, the bigger problem lies in upgrading protobuf without breaking the
downstream builds.
The reason is that many downstream projects directly refer to server-specific
jars and expect the protobuf-2.5.0 jar to get added to the classpath as a
transitive dependency.

There are some historical discussions and suggestions on
https://issues.apache.org/jira/browse/HADOOP-13363 related to protobuf
upgrade.
The recommended path is to upgrade protobuf by shading the latest protobuf
for Hadoop, while still keeping the protobuf-2.5.0 jar in the dependencies
for downstreams.
I am trying out ideas on this, and if it gets completed in time, maybe
we can target trunk itself for the protobuf upgrade.
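
As a rough sketch of the "keep 2.5.0 for downstreams" part: the published Hadoop
poms would continue to declare something like the dependency below, while Hadoop's
own code builds against the shaded, relocated 3.x copy (illustration only, not the
final layout):

<!-- Illustration only: keep protobuf-java 2.5.0 on the published classpath for downstream builds. -->
<dependency>
  <groupId>com.google.protobuf</groupId>
  <artifactId>protobuf-java</artifactId>
  <version>2.5.0</version>
</dependency>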

The separate-branch idea is suggested for the overall ARM support, including
the protobuf upgrade and the other problems mentioned in the discussion above.

I don't expect the separate branch to have huge changes apart from bug fixes,
since no separate ARM-specific features are being planned.
So a timely rebase of the separate branch would reduce the overhead of the
branch review/merge task.

Still, if the protobuf upgrade solution wraps up early without any side
effects, I am more than happy to land it in trunk itself.

Thanks,
Vinay
On Thu, Sep 5, 2019 at 2:27 PM Sunil Govindan  wrote:

> Thanks Vinay for starting the thread.
>
> I agree to Anu's view point related to protobuf. And with the suggestion
> pointed out by Duo Zhang, if we can make use
> of org.xolstice.maven.plugins:protobuf-maven-plugin, our upgrade to 3.0.0
> of protobuf will also be more easier.
>
> However i think its better to do this effort in trunk itself.
> In offline talks, few members were interested to start 3.3.0 release. And
> given that happens soon, I feel its better
> we do this task in trunk itself as branch diverge is very much possible.
> And to bring to call a merge on such a big branch will be even more tough
> task.
>
> my 2 cents.
>
> Thanks
> Sunil
>
> On Thu, Sep 5, 2019 at 6:04 AM 张铎(Duo Zhang) 
> wrote:
>
>> Suggest to use org.xolstice.maven.plugins:protobuf-maven-plugin to
>> generate
>> the protobuf code. It will download the protoc binary from the maven
>> central so we do not need to install protoc on the build machine any more.
>>
>> Zhenyu Zheng  wrote on Wed, Sep 4, 2019 at 5:27 PM:
>>
>> > BTW, I also noticed that the Hadoop-trunk-Commit job has been failling
>> for
>> > over 2 month related to the Protobuf problem .
>> > According to the latest successful build log:
>> >
>> https://builds.apache.org/job/Hadoop-trunk-Commit/lastSuccessfulBuild/consoleFull
>> > the
>> > os version was ubuntu 14.04 and for the jobs that are failling now such
>> > as: https://builds.apache.org/job/Hadoop-trunk-Commit/17222/console,
>> > the os version is 18.04. I'm not very familiar with the version changing
>> > for the jobs but I did a little search, according to:
>> >
>> >
>> https://packages.ubuntu.com/search?keywords=protobuf-compiler=names
>> > &
>> >
>> >
>> https://packages.ubuntu.com/search?suite=default=all=any=libprotoc-dev=names
>> > it both said that the version of libprotc-dev and protobuf-compiler
>> > available for ubuntu 18.04 is 3.0.0
>> >
>> >
>> > On Wed, Sep 4, 2019 at 4:39 PM Ayush Saxena  wrote:
>> >
>> >> Thanx Vinay for the initiative, Makes sense to add support for
>> different
>> >> architectures.
>> >>
>> >> +1, for the branch idea.
>> >> Good Luck!!!
>> >>
>> >> -Ayush
>> >>
>> >> > On 03-Sep-2019, at 6:19 AM, 张铎(Duo Zhang) 
>> >> wrote:
>> >> >
>> >> > For HBase, we purged all the protobuf related things from the public
>> >> API,
>> >> > and then upgraded to a shaded and relocated version of protobuf. We
>> have
>> >> > created a repo for this:
>> >> >
>> >> > https://github.com/apache/hbase-thirdparty
>> >> >
>> >> > But since the hadoop dependencies still pull in the protobuf 2.5
>> jars,
>> >> our
>> >> > coprocessors are still on protobuf 2.5. Recently we have opened a
>> >> discuss
>> >> > on how to deal with the upgrading of coprocessor. Glad to see that
>> the
>> >> > hadoop community is also willing to solve the problem.
>> >> >
>> >> > Anu Engineer  wrote on Tue, Sep 3, 2019 at 1:23 AM:
>> >> >
>> >> >> +1, for the branch idea. Just FYI, Your biggest problem is proving
>> that
>> >> >> Hadoop and the downstream projects work correctly after you upgrade
>> >> core
>> >> >> components like Protobuf.
>> >> >> So while branching and working on a branch is easy, merging back
>> after
>> >> you
>> >> >> upgrade some of these core components is insanely hard. You might
>> want
>> >> to
>> >> >> make sure that community buys into upgrading these components in the
>> >> trunk.
>> >> >> That way we will get testing and downstream components will notice
>> when
>> 

Re: [DISCUSS] ARM/aarch64 support for Hadoop

2019-09-05 Thread Sunil Govindan
Thanks Vinay for starting the thread.

I agree with Anu's viewpoint related to protobuf. And with the suggestion
pointed out by Duo Zhang, if we can make use
of org.xolstice.maven.plugins:protobuf-maven-plugin, our upgrade to
protobuf 3.0.0 will also be much easier.
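
For reference, a minimal sketch of how that plugin is usually wired up, from memory
(the plugin and protoc versions are placeholders, and ${os.detected.classifier}
needs the kr.motif.maven:os-maven-plugin build extension):

<!-- Sketch only: have Maven download protoc from Maven Central instead of relying
     on a locally installed compiler. -->
<plugin>
  <groupId>org.xolstice.maven.plugins</groupId>
  <artifactId>protobuf-maven-plugin</artifactId>
  <version>0.6.1</version>
  <configuration>
    <protocArtifact>com.google.protobuf:protoc:3.7.1:exe:${os.detected.classifier}</protocArtifact>
  </configuration>
  <executions>
    <execution>
      <goals>
        <goal>compile</goal>
      </goals>
    </execution>
  </executions>
</plugin>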

However, I think it's better to do this effort in trunk itself.
In offline talks, a few members were interested in starting the 3.3.0 release.
And given that happens soon, I feel it's better
we do this task in trunk itself, as branch divergence is very much possible.
And bringing such a big branch to a merge vote will be an even tougher
task.

my 2 cents.

Thanks
Sunil

On Thu, Sep 5, 2019 at 6:04 AM 张铎(Duo Zhang)  wrote:

> Suggest to use org.xolstice.maven.plugins:protobuf-maven-plugin to generate
> the protobuf code. It will download the protoc binary from the maven
> central so we do not need to install protoc on the build machine any more.
>
> Zhenyu Zheng  wrote on Wed, Sep 4, 2019 at 5:27 PM:
>
> > BTW, I also noticed that the Hadoop-trunk-Commit job has been failling
> for
> > over 2 month related to the Protobuf problem .
> > According to the latest successful build log:
> >
> https://builds.apache.org/job/Hadoop-trunk-Commit/lastSuccessfulBuild/consoleFull
> > the
> > os version was ubuntu 14.04 and for the jobs that are failling now such
> > as: https://builds.apache.org/job/Hadoop-trunk-Commit/17222/console,
> > the os version is 18.04. I'm not very familiar with the version changing
> > for the jobs but I did a little search, according to:
> >
> >
> https://packages.ubuntu.com/search?keywords=protobuf-compiler=names
> > &
> >
> >
> https://packages.ubuntu.com/search?suite=default=all=any=libprotoc-dev=names
> > it both said that the version of libprotc-dev and protobuf-compiler
> > available for ubuntu 18.04 is 3.0.0
> >
> >
> > On Wed, Sep 4, 2019 at 4:39 PM Ayush Saxena  wrote:
> >
> >> Thanx Vinay for the initiative, Makes sense to add support for different
> >> architectures.
> >>
> >> +1, for the branch idea.
> >> Good Luck!!!
> >>
> >> -Ayush
> >>
> >> > On 03-Sep-2019, at 6:19 AM, 张铎(Duo Zhang) 
> >> wrote:
> >> >
> >> > For HBase, we purged all the protobuf related things from the public
> >> API,
> >> > and then upgraded to a shaded and relocated version of protobuf. We
> have
> >> > created a repo for this:
> >> >
> >> > https://github.com/apache/hbase-thirdparty
> >> >
> >> > But since the hadoop dependencies still pull in the protobuf 2.5 jars,
> >> our
> >> > coprocessors are still on protobuf 2.5. Recently we have opened a
> >> discuss
> >> > on how to deal with the upgrading of coprocessor. Glad to see that the
> >> > hadoop community is also willing to solve the problem.
> >> >
> >> > Anu Engineer  wrote on Tue, Sep 3, 2019 at 1:23 AM:
> >> >
> >> >> +1, for the branch idea. Just FYI, Your biggest problem is proving
> that
> >> >> Hadoop and the downstream projects work correctly after you upgrade
> >> core
> >> >> components like Protobuf.
> >> >> So while branching and working on a branch is easy, merging back
> after
> >> you
> >> >> upgrade some of these core components is insanely hard. You might
> want
> >> to
> >> >> make sure that community buys into upgrading these components in the
> >> trunk.
> >> >> That way we will get testing and downstream components will notice
> when
> >> >> things break.
> >> >>
> >> >> That said, I have lobbied for the upgrade of Protobuf for a really
> long
> >> >> time; I have argued that 2.5 is out of support and we cannot stay on
> >> that
> >> >> branch forever; or we need to take ownership of the Protobuf 2.5 code
> >> base.
> >> >> It has been rightly pointed to me that while all the arguments I make
> >> is
> >> >> correct; it is a very complicated task to upgrade Protobuf, and the
> >> worst
> >> >> part is we will not even know what breaks until downstream projects
> >> pick up
> >> >> these changes and work against us.
> >> >>
> >> >> If we work off the Hadoop version 3 — and assume that we have
> >> "shading" in
> >> >> place for all deployments; it might be possible to get there; still a
> >> >> daunting task.
> >> >>
> >> >> So best of luck with the branch approach — But please remember,
> Merging
> >> >> back will be hard, Just my 2 cents.
> >> >>
> >> >> — Anu
> >> >>
> >> >>
> >> >>
> >> >>
> >> >> On Sun, Sep 1, 2019 at 7:40 PM Zhenyu Zheng <
> zhengzhenyul...@gmail.com
> >> >
> >> >> wrote:
> >> >>
> >> >>> Hi,
> >> >>>
> >> >>> Thanks Vinaya for bring this up and thanks Sheng for the idea. A
> >> separate
> >> >>> branch with it's own ARM CI seems a really good idea.
> >> >>> By doing this we won't break any of the undergoing development in
> >> trunk
> >> >> and
> >> >>> a CI can be a very good way to show what are the
> >> >>> current problems and what have been fixed, it will also provide a
> very
> >> >> good
> >> >>> view for contributors that are intrested to working on
> >> >>> this. We can finally merge back the branch to trunk until the
> >> community
> >> >>> thinks it is good enough and 

[jira] [Created] (HADOOP-16549) Remove Unsupported SSL/TLS Properties from Docs/Properties

2019-09-05 Thread Daisuke Kobayashi (Jira)
Daisuke Kobayashi created HADOOP-16549:
--

 Summary: Remove Unsupported SSL/TLS Properties from Docs/Properties
 Key: HADOOP-16549
 URL: https://issues.apache.org/jira/browse/HADOOP-16549
 Project: Hadoop Common
  Issue Type: Improvement
  Components: documentation, security
Reporter: Daisuke Kobayashi


We should remove the following unsupported protocol versions from the docs and
core-default.xml as appropriate.

TLS v1.0
TLS v1.1
SSL v3
SSLv2Hello

ref: 
https://www.eclipse.org/jetty/documentation/9.3.27.v20190418/configuring-ssl.html
https://github.com/eclipse/jetty.project/issues/866
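
As a concrete illustration, the resulting change in core-default.xml might look
roughly like the snippet below, assuming hadoop.ssl.enabled.protocols (touched by
HADOOP-16000) is still the relevant property; the trimmed value is only a sketch:
{code:xml}
<!-- Sketch only: drop the legacy protocol versions listed above from the default. -->
<property>
  <name>hadoop.ssl.enabled.protocols</name>
  <value>TLSv1.2</value>
</property>
{code}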

[~aajisaka], I happened to notice that you left TLSv1.1 in
https://issues.apache.org/jira/browse/HADOOP-16000. Should we still keep it?



--
This message was sent by Atlassian Jira
(v8.3.2#803003)

-
To unsubscribe, e-mail: common-dev-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-dev-h...@hadoop.apache.org



Apache Hadoop qbt Report: trunk+JDK8 on Linux/x86

2019-09-05 Thread Apache Jenkins Server
For more details, see 
https://builds.apache.org/job/hadoop-qbt-trunk-java8-linux-x86/1249/

[Sep 3, 2019 8:24:32 PM] (github) HDDS-1909. Use new HA code for Non-HA in OM. 
(#1225)
[Sep 3, 2019 9:06:14 PM] (github) HDDS-2018. Handle Set DtService of token for 
OM HA. (#1371)
[Sep 4, 2019 9:39:09 AM] (surendralilhore) HDFS-14777. RBF: Set ReadOnly is 
failing for mount Table but actually
[Sep 4, 2019 9:58:59 AM] (nanda) HDDS-2077. Add maven-gpg-plugin.version to 
pom.ozone.xml. (#1396)
[Sep 4, 2019 3:22:02 PM] (xkrogen) HADOOP-16268. Allow StandbyException to be 
thrown as


[Error replacing 'FILE' - Workspace is not accessible]

-
To unsubscribe, e-mail: common-dev-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-dev-h...@hadoop.apache.org