[
https://issues.apache.org/jira/browse/HDDS-2218?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17650549#comment-17650549
]
mingchao zhao commented on HDDS-2218:
-------------------------------------
Ozone 1.3.0 has been released, and we currently have more than 600 open issues
targeted for 1.3.0. I am moving the target field to 1.4.0.
If there is anything that needs to be discussed about the Target Version, please
reach out to me via Apache email or Slack.
> Use OZONE_CLASSPATH instead of HADOOP_CLASSPATH
> -----------------------------------------------
>
> Key: HDDS-2218
> URL: https://issues.apache.org/jira/browse/HDDS-2218
> Project: Apache Ozone
> Issue Type: Task
> Components: docker
> Reporter: Marton Elek
> Assignee: Sandeep Nemuri
> Priority: Major
> Labels: TriagePending, newbie
>
> HADOOP_CLASSPATH is the standard way to add additional jar files to the
> classpath of mapreduce/spark/... jobs. If something is added to
> HADOOP_CLASSPATH, then it should be on the classpath of the classic Hadoop
> daemons.
> But the Ozone components don't need any new jar files (cloud
> connectors, libraries). I think it's safer to separate HADOOP_CLASSPATH
> from OZONE_CLASSPATH. If something is really needed on the classpath of the
> Ozone daemons, the dedicated environment variable should be used.
>
> Most probably it can be fixed in
> hadoop-hdds/common/src/main/bin/hadoop-functions.sh.
> The hadoop-ozone/dev/src/main/compose files should also be checked (some
> of them contain HADOOP_CLASSPATH).
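The separation described above could be sketched as follows. This is a minimal illustration, not the actual hadoop-functions.sh code: the `ozone_add_classpath` helper name and the jar path are hypothetical, and only the general idea (job-side jars stay in HADOOP_CLASSPATH; Ozone daemons read a dedicated OZONE_CLASSPATH) comes from the issue.

```shell
# Hypothetical sketch: an Ozone launch script honoring a dedicated
# OZONE_CLASSPATH instead of inheriting HADOOP_CLASSPATH.
# The helper name and paths are illustrative only.
ozone_add_classpath() {
  # Append an entry to OZONE_CLASSPATH unless it is already present.
  local entry="$1"
  case ":${OZONE_CLASSPATH}:" in
    *":${entry}:"*) ;;  # already on the classpath, skip
    *) OZONE_CLASSPATH="${OZONE_CLASSPATH:+${OZONE_CLASSPATH}:}${entry}" ;;
  esac
}

# Job-side additions (cloud connectors, etc.) stay in HADOOP_CLASSPATH and
# are deliberately NOT merged into the Ozone daemon classpath.
OZONE_CLASSPATH=""
ozone_add_classpath "/opt/ozone/share/ozone/lib/extra.jar"
ozone_add_classpath "/opt/ozone/share/ozone/lib/extra.jar"  # duplicate ignored
echo "${OZONE_CLASSPATH}"
```

Running the sketch prints the classpath with the jar listed once, showing that duplicates are deduplicated and that HADOOP_CLASSPATH is never consulted.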
--
This message was sent by Atlassian Jira
(v8.20.10#820010)
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]