[ https://issues.apache.org/jira/browse/OOZIE-3227?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16499906#comment-16499906 ]
Hadoop QA commented on OOZIE-3227:
----------------------------------
Testing JIRA OOZIE-3227
Cleaning local git workspace
----------------------------
{color:green}+1 PATCH_APPLIES{color}
{color:green}+1 CLEAN{color}
{color:green}+1 RAW_PATCH_ANALYSIS{color}
. {color:green}+1{color} the patch does not introduce any @author tags
. {color:green}+1{color} the patch does not introduce any tabs
. {color:green}+1{color} the patch does not introduce any trailing spaces
. {color:green}+1{color} the patch does not introduce any line longer than 132
. {color:green}+1{color} the patch adds/modifies 2 testcase(s)
{color:green}+1 RAT{color}
. {color:green}+1{color} the patch does not seem to introduce new RAT warnings
{color:green}+1 JAVADOC{color}
. {color:green}+1{color} the patch does not seem to introduce new Javadoc warning(s)
. {color:green}+1{color} the patch does not seem to introduce new Javadoc error(s)
. {color:red}ERROR{color}: the current HEAD has 2 Javadoc error(s)
{color:green}+1 COMPILE{color}
. {color:green}+1{color} HEAD compiles
. {color:green}+1{color} patch compiles
. {color:green}+1{color} the patch does not seem to introduce new javac warnings
{color:green}+1{color} There are no new bugs found in total.
. {color:green}+1{color} There are no new bugs found in [examples].
. {color:green}+1{color} There are no new bugs found in [webapp].
. {color:green}+1{color} There are no new bugs found in [core].
. {color:green}+1{color} There are no new bugs found in [tools].
. {color:green}+1{color} There are no new bugs found in [server].
. {color:green}+1{color} There are no new bugs found in [docs].
. {color:green}+1{color} There are no new bugs found in [sharelib/hive2].
. {color:green}+1{color} There are no new bugs found in [sharelib/pig].
. {color:green}+1{color} There are no new bugs found in [sharelib/streaming].
. {color:green}+1{color} There are no new bugs found in [sharelib/hive].
. {color:green}+1{color} There are no new bugs found in [sharelib/hcatalog].
. {color:green}+1{color} There are no new bugs found in [sharelib/sqoop].
. {color:green}+1{color} There are no new bugs found in [sharelib/oozie].
. {color:green}+1{color} There are no new bugs found in [sharelib/distcp].
. {color:green}+1{color} There are no new bugs found in [sharelib/spark].
. {color:green}+1{color} There are no new bugs found in [client].
{color:green}+1 BACKWARDS_COMPATIBILITY{color}
. {color:green}+1{color} the patch does not change any JPA Entity/Column/Basic/Lob/Transient annotations
. {color:green}+1{color} the patch does not modify JPA files
{color:green}+1 TESTS{color}
. Tests run: 2146
{color:green}+1 DISTRO{color}
. {color:green}+1{color} distro tarball builds with the patch
----------------------------
{color:green}*+1 Overall result, good! No -1s*{color}
The full output of the test-patch run is available at
. https://builds.apache.org/job/PreCommit-OOZIE-Build/600/
> Eliminate duplicated dependencies from distributed cache
> --------------------------------------------------------
>
> Key: OOZIE-3227
> URL: https://issues.apache.org/jira/browse/OOZIE-3227
> Project: Oozie
> Issue Type: Sub-task
> Components: core
> Affects Versions: 5.0.0, 4.3.1
> Reporter: Denes Bodo
> Assignee: Denes Bodo
> Priority: Major
> Attachments: OOZIE-3227_001.patch, OOZIE-3227_002.patch
>
>
> Hadoop 3 does not allow multiple dependencies with the same file name on the
> *mapreduce.job.cache.files* list.
> The issue occurs when the same file name is present in multiple sharelib folders
> and/or in my application's lib folder. This can be avoided, but not easily in
> every case.
> I suggest removing the duplicates from this list.
> A quick workaround in JavaActionExecutor looks like this:
> {code}
> removeDuplicatedDependencies(launcherJobConf, "mapreduce.job.cache.files");
> removeDuplicatedDependencies(launcherJobConf, "mapreduce.job.cache.archives");
> ......
>
> private void removeDuplicatedDependencies(JobConf conf, String key) {
>     // remember each base file name we have already kept, mapped to its full path
>     final Map<String, String> nameToPath = new HashMap<>();
>     final StringBuilder uniqList = new StringBuilder();
>     for (String dependency : conf.get(key).split(",")) {
>         final String[] arr = dependency.split("/");
>         final String dependencyName = arr[arr.length - 1];
>         if (nameToPath.containsKey(dependencyName)) {
>             LOG.warn(dependencyName + " [" + dependency + "] is already defined in "
>                     + key + ". Skipping...");
>         } else {
>             nameToPath.put(dependencyName, dependency);
>             uniqList.append(dependency).append(",");
>         }
>     }
>     // drop the trailing comma before writing the de-duplicated list back
>     if (uniqList.length() > 0) {
>         uniqList.setLength(uniqList.length() - 1);
>     }
>     conf.set(key, uniqList.toString());
> }
> {code}
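> For illustration only (the paths and jar name below are hypothetical), the duplicate
> case the workaround handles would look like this; after the call only the first
> guava-11.0.2.jar entry remains:
> {code}
> JobConf launcherJobConf = new JobConf();
> // the same base name appears once in the sharelib and once in the app's lib folder
> launcherJobConf.set("mapreduce.job.cache.files",
>         "hdfs:///user/oozie/share/lib/oozie/guava-11.0.2.jar,"
>         + "hdfs:///user/test/app/lib/guava-11.0.2.jar");
> removeDuplicatedDependencies(launcherJobConf, "mapreduce.job.cache.files");
> // mapreduce.job.cache.files now contains only the sharelib entry
> {code}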
> Another way is to eliminate the deprecated
> *org.apache.hadoop.filecache.DistributedCache*.
> I am going to build a deeper understanding of how we should use the distributed
> cache; all comments are welcome.
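> As a minimal sketch of that direction (the class, helper name and path below are
> hypothetical, and whether JavaActionExecutor can be switched over is exactly what
> needs the deeper investigation), cache files could be registered through the
> non-deprecated *org.apache.hadoop.mapreduce.Job* API instead of *DistributedCache*:
> {code}
> import java.io.IOException;
> import java.net.URI;
> import java.net.URISyntaxException;
>
> import org.apache.hadoop.conf.Configuration;
> import org.apache.hadoop.mapreduce.Job;
>
> public class CacheFileSketch {
>     // Registers one cache file via the mapreduce.Job API. Note that
>     // Job.getInstance() copies the passed Configuration, so the new entry
>     // ends up in the returned job.getConfiguration(), not in 'conf' itself.
>     static Configuration addCacheFile(Configuration conf, String path)
>             throws IOException, URISyntaxException {
>         final Job job = Job.getInstance(conf);
>         // the URI fragment after '#' names the symlink in the task directory;
>         // Hadoop 3 rejects two cache entries that resolve to the same name
>         job.addCacheFile(new URI(path));
>         return job.getConfiguration();
>     }
> }
> {code}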