[ https://issues.apache.org/jira/browse/HDDS-600?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16644123#comment-16644123 ]

Namit Maheshwari commented on HDDS-600:
---------------------------------------

Thanks [~hanishakoneru]. With the correct URL, the MapReduce job fails as below:
{code:java}
[root@ctr-e138-1518143905142-510793-01-000002 ~]# su - hdfs
Last login: Tue Oct 9 07:11:08 UTC 2018
-bash-4.2$ /usr/hdp/current/hadoop-client/bin/hadoop jar 
/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-examples.jar 
wordcount /tmp/mr_jobs/input/ o3://bucket1.volume1/mr_job_dir/output
18/10/09 20:24:07 INFO conf.Configuration: Removed undeclared tags:
18/10/09 20:24:08 INFO conf.Configuration: Removed undeclared tags:
18/10/09 20:24:08 INFO conf.Configuration: Removed undeclared tags:
18/10/09 20:24:08 INFO client.AHSProxy: Connecting to Application History 
server at ctr-e138-1518143905142-510793-01-000004.hwx.site/172.27.79.197:10200
18/10/09 20:24:09 INFO client.ConfiguredRMFailoverProxyProvider: Failing over 
to rm2
18/10/09 20:24:09 INFO mapreduce.JobResourceUploader: Disabling Erasure Coding 
for path: /user/hdfs/.staging/job_1539069219098_0001
18/10/09 20:24:09 INFO input.FileInputFormat: Total input files to process : 1
18/10/09 20:24:09 INFO lzo.GPLNativeCodeLoader: Loaded native gpl library
18/10/09 20:24:09 INFO lzo.LzoCodec: Successfully loaded & initialized 
native-lzo library [hadoop-lzo rev 5d6248d8d690f8456469979213ab2e9993bfa2e9]
18/10/09 20:24:10 INFO mapreduce.JobSubmitter: number of splits:1
18/10/09 20:24:10 INFO mapreduce.JobSubmitter: Submitting tokens for job: 
job_1539069219098_0001
18/10/09 20:24:10 INFO mapreduce.JobSubmitter: Executing with tokens: []
18/10/09 20:24:11 INFO conf.Configuration: Removed undeclared tags:
18/10/09 20:24:11 INFO conf.Configuration: found resource resource-types.xml at 
file:/etc/hadoop/3.0.3.0-63/0/resource-types.xml
18/10/09 20:24:11 INFO conf.Configuration: Removed undeclared tags:
18/10/09 20:24:11 INFO impl.YarnClientImpl: Submitted application 
application_1539069219098_0001
18/10/09 20:24:11 INFO mapreduce.Job: The url to track the job: 
http://ctr-e138-1518143905142-510793-01-000005.hwx.site:8088/proxy/application_1539069219098_0001/
18/10/09 20:24:11 INFO mapreduce.Job: Running job: job_1539069219098_0001
18/10/09 20:25:04 INFO mapreduce.Job: Job job_1539069219098_0001 running in 
uber mode : false
18/10/09 20:25:04 INFO mapreduce.Job: map 0% reduce 0%
18/10/09 20:25:04 INFO mapreduce.Job: Job job_1539069219098_0001 failed with 
state FAILED due to: Application application_1539069219098_0001 failed 20 times 
due to AM Container for appattempt_1539069219098_0001_000020 exited with 
exitCode: 1
Failing this attempt.Diagnostics: [2018-10-09 20:25:04.763]Exception from 
container-launch.
Container id: container_e03_1539069219098_0001_20_000003
Exit code: 1

[2018-10-09 20:25:04.765]Container exited with a non-zero exit code 1. Error 
file: prelaunch.err.
Last 4096 bytes of prelaunch.err :
Last 4096 bytes of stderr :
log4j:WARN No appenders could be found for logger 
(org.apache.hadoop.mapreduce.v2.app.MRAppMaster).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more 
info.


[2018-10-09 20:25:04.765]Container exited with a non-zero exit code 1. Error 
file: prelaunch.err.
Last 4096 bytes of prelaunch.err :
Last 4096 bytes of stderr :
log4j:WARN No appenders could be found for logger 
(org.apache.hadoop.mapreduce.v2.app.MRAppMaster).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more 
info.


For more detailed output, check the application tracking page: 
http://ctr-e138-1518143905142-510793-01-000005.hwx.site:8088/cluster/app/application_1539069219098_0001
 Then click on links to logs of each attempt.
. Failing the application.
18/10/09 20:25:05 INFO mapreduce.Job: Counters: 0
18/10/09 20:25:05 INFO conf.Configuration: Removed undeclared tags:
{code}
Yarn container logs:
{code:java}
Log Type: directory.info

Log Upload Time: Tue Oct 09 20:25:06 +0000 2018

Log Length: 20398

Showing 4096 bytes of 20398 total.

06:50 ./mr-framework/hadoop/lib/native/libsnappy.so.1
8651115 3324 -r-xr-xr-x   2 yarn     hadoop    3402313 Oct  8 06:38 
./mr-framework/hadoop/lib/native/libnativetask.so
8651058    4 drwxr-xr-x   3 yarn     hadoop       4096 Oct  8 06:32 
./mr-framework/hadoop/sbin
8651091    4 -r-xr-xr-x   1 yarn     hadoop       3898 Oct  8 06:33 
./mr-framework/hadoop/sbin/stop-dfs.sh
8651084    4 -r-xr-xr-x   1 yarn     hadoop       1756 Oct  8 06:33 
./mr-framework/hadoop/sbin/stop-secure-dns.sh
8651081    4 -r-xr-xr-x   1 yarn     hadoop       2166 Oct  8 06:32 
./mr-framework/hadoop/sbin/stop-all.sh
8651096    4 -r-xr-xr-x   1 yarn     hadoop       1779 Oct  8 06:32 
./mr-framework/hadoop/sbin/start-all.cmd
8651097    4 -r-xr-xr-x   1 yarn     hadoop       3342 Oct  8 06:38 
./mr-framework/hadoop/sbin/start-yarn.sh
8651062    4 -r-xr-xr-x   1 yarn     hadoop       1983 Oct  8 06:32 
./mr-framework/hadoop/sbin/hadoop-daemon.sh
8651078    4 -r-xr-xr-x   1 yarn     hadoop       1841 Oct  8 06:38 
./mr-framework/hadoop/sbin/mr-jobhistory-daemon.sh
8651080    4 -r-xr-xr-x   1 yarn     hadoop       1982 Oct  8 06:32 
./mr-framework/hadoop/sbin/workers.sh
8651061    4 -r-xr-xr-x   1 yarn     hadoop       3083 Oct  8 06:38 
./mr-framework/hadoop/sbin/stop-yarn.sh
8651076    4 -r-xr-xr-x   1 yarn     hadoop       1880 Oct  8 06:33 
./mr-framework/hadoop/sbin/start-balancer.sh
8651095    4 -r-xr-xr-x   1 yarn     hadoop       1401 Oct  8 06:33 
./mr-framework/hadoop/sbin/start-dfs.cmd
8651090    4 -r-xr-xr-x   1 yarn     hadoop       2221 Oct  8 06:32 
./mr-framework/hadoop/sbin/start-all.sh
8651079    4 -r-xr-xr-x   1 yarn     hadoop       1793 Oct  8 06:33 
./mr-framework/hadoop/sbin/start-secure-dns.sh
8651077    4 -r-xr-xr-x   1 yarn     hadoop       1500 Oct  8 06:32 
./mr-framework/hadoop/sbin/kms.sh
8651063    4 drwxr-xr-x   4 yarn     hadoop       4096 Oct  8 06:38 
./mr-framework/hadoop/sbin/FederationStateStore
8651067    4 drwxr-xr-x   2 yarn     hadoop       4096 Oct  8 06:38 
./mr-framework/hadoop/sbin/FederationStateStore/MySQL
8651064    4 drwxr-xr-x   2 yarn     hadoop       4096 Oct  8 06:38 
./mr-framework/hadoop/sbin/FederationStateStore/SQLServer
8651089    4 -r-xr-xr-x   1 yarn     hadoop       2522 Oct  8 06:32 
./mr-framework/hadoop/sbin/hadoop-daemons.sh
8651093    4 -r-xr-xr-x   1 yarn     hadoop       2328 Oct  8 06:38 
./mr-framework/hadoop/sbin/yarn-daemons.sh
8651085    4 -r-xr-xr-x   1 yarn     hadoop       1783 Oct  8 06:33 
./mr-framework/hadoop/sbin/stop-balancer.sh
8651082    4 -r-xr-xr-x   1 yarn     hadoop       1455 Oct  8 06:33 
./mr-framework/hadoop/sbin/stop-dfs.cmd
8651094    4 -r-xr-xr-x   1 yarn     hadoop       1542 Oct  8 06:33 
./mr-framework/hadoop/sbin/httpfs.sh
8651092    4 -r-xr-xr-x   1 yarn     hadoop       2086 Oct  8 06:33 
./mr-framework/hadoop/sbin/refresh-namenodes.sh
8651059    4 -r-xr-xr-x   1 yarn     hadoop       1571 Oct  8 06:38 
./mr-framework/hadoop/sbin/start-yarn.cmd
8651083    4 -r-xr-xr-x   1 yarn     hadoop       1642 Oct  8 06:38 
./mr-framework/hadoop/sbin/stop-yarn.cmd
8651060    4 -r-xr-xr-x   1 yarn     hadoop       1814 Oct  8 06:38 
./mr-framework/hadoop/sbin/yarn-daemon.sh
8651088    8 -r-xr-xr-x   1 yarn     hadoop       5170 Oct  8 06:33 
./mr-framework/hadoop/sbin/start-dfs.sh
8651086    4 -r-xr-xr-x   1 yarn     hadoop       1770 Oct  8 06:32 
./mr-framework/hadoop/sbin/stop-all.cmd
8651087    4 -r-xr-xr-x   1 yarn     hadoop       2756 Oct  8 06:33 
./mr-framework/hadoop/sbin/distribute-exclude.sh
8651550    4 -rw-r--r--   1 yarn     hadoop         12 Oct  9 20:25 
./.container_tokens.crc
8651548    4 drwx--x---   2 yarn     hadoop       4096 Oct  9 20:25 ./tmp
8651554    4 -rw-r--r--   1 yarn     hadoop         16 Oct  9 20:25 
./.default_container_executor_session.sh.crc
8651556    4 -rw-r--r--   1 yarn     hadoop         16 Oct  9 20:25 
./.default_container_executor.sh.crc
8523603    4 drwx------   2 yarn     hadoop       4096 Oct  9 20:24 ./job.jar
8523604  312 -r-x------   1 yarn     hadoop     316294 Oct  9 20:24 
./job.jar/job.jar
broken symlinks(find -L . -maxdepth 5 -type l -ls):
Log Type: launch_container.sh

Log Upload Time: Tue Oct 09 20:25:06 +0000 2018

Log Length: 5481

Showing 4096 bytes of 5481 total.

port 
LOG_DIRS="/grid/0/hadoop/yarn/log/application_1539069219098_0001/container_e03_1539069219098_0001_20_000003"
export USER="hdfs"
export LOGNAME="hdfs"
export HOME="/home/"
export 
PWD="/grid/0/hadoop/yarn/local/usercache/hdfs/appcache/application_1539069219098_0001/container_e03_1539069219098_0001_20_000003"
export JVM_PID="$$"
export MALLOC_ARENA_MAX="4"
export NM_AUX_SERVICE_timeline_collector=""
export 
NM_AUX_SERVICE_mapreduce_shuffle="AAA0+gAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA="
export NM_AUX_SERVICE_spark2_shuffle=""
export APP_SUBMIT_TIME_ENV="1539116651239"
export TIMELINE_FLOW_NAME_TAG="word count"
export LD_LIBRARY_PATH="$PWD:$HADOOP_COMMON_HOME/lib/native"
export TIMELINE_FLOW_VERSION_TAG="1"
export APPLICATION_WEB_PROXY_BASE="/proxy/application_1539069219098_0001"
export SHELL="/bin/bash"
export 
CLASSPATH="$PWD:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/lib/*:$PWD/mr-framework/hadoop/share/hadoop/common/*:$PWD/mr-framework/hadoop/share/hadoop/common/lib/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/lib/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/lib/*:$PWD/mr-framework/hadoop/share/hadoop/tools/lib/*:/usr/hdp/3.0.3.0-63/hadoop/lib/hadoop-lzo-0.6.0.3.0.3.0-63.jar:/etc/hadoop/conf/secure:job.jar/*:job.jar/classes/:job.jar/lib/*:$PWD/*"
export TIMELINE_FLOW_RUN_ID_TAG="1539116651241"
echo "Setting up job resources"
ln -sf 
"/grid/0/hadoop/yarn/local/usercache/hdfs/appcache/application_1539069219098_0001/filecache/11/job.jar"
 "job.jar"
mkdir -p jobSubmitDir
ln -sf 
"/grid/0/hadoop/yarn/local/usercache/hdfs/appcache/application_1539069219098_0001/filecache/12/job.split"
 "jobSubmitDir/job.split"
mkdir -p jobSubmitDir
ln -sf 
"/grid/0/hadoop/yarn/local/usercache/hdfs/appcache/application_1539069219098_0001/filecache/10/job.splitmetainfo"
 "jobSubmitDir/job.splitmetainfo"
ln -sf "/grid/0/hadoop/yarn/local/filecache/13/mapreduce.tar.gz" "mr-framework"
ln -sf 
"/grid/0/hadoop/yarn/local/usercache/hdfs/appcache/application_1539069219098_0001/filecache/13/job.xml"
 "job.xml"
echo "Copying debugging information"
# Creating copy of launch script
cp "launch_container.sh" 
"/grid/0/hadoop/yarn/log/application_1539069219098_0001/container_e03_1539069219098_0001_20_000003/launch_container.sh"
chmod 640 
"/grid/0/hadoop/yarn/log/application_1539069219098_0001/container_e03_1539069219098_0001_20_000003/launch_container.sh"
# Determining directory contents
echo "ls -l:" 
1>"/grid/0/hadoop/yarn/log/application_1539069219098_0001/container_e03_1539069219098_0001_20_000003/directory.info"
ls -l 
1>>"/grid/0/hadoop/yarn/log/application_1539069219098_0001/container_e03_1539069219098_0001_20_000003/directory.info"
echo "find -L . -maxdepth 5 -ls:" 
1>>"/grid/0/hadoop/yarn/log/application_1539069219098_0001/container_e03_1539069219098_0001_20_000003/directory.info"
find -L . -maxdepth 5 -ls 
1>>"/grid/0/hadoop/yarn/log/application_1539069219098_0001/container_e03_1539069219098_0001_20_000003/directory.info"
echo "broken symlinks(find -L . -maxdepth 5 -type l -ls):" 
1>>"/grid/0/hadoop/yarn/log/application_1539069219098_0001/container_e03_1539069219098_0001_20_000003/directory.info"
find -L . -maxdepth 5 -type l -ls 
1>>"/grid/0/hadoop/yarn/log/application_1539069219098_0001/container_e03_1539069219098_0001_20_000003/directory.info"
echo "Launching container"
exec /bin/bash -c "$JAVA_HOME/bin/java -Djava.io.tmpdir=$PWD/tmp 
-Dlog4j.configuration=container-log4j.properties 
-Dyarn.app.container.log.dir=/grid/0/hadoop/yarn/log/application_1539069219098_0001/container_e03_1539069219098_0001_20_000003
 -Dyarn.app.container.log.filesize=0 -Dhadoop.root.logger=INFO,CLA 
-Dhadoop.root.logfile=syslog -Dhdp.version=3.0.3.0-63 -Xmx3276m 
-Dhdp.version=3.0.3.0-63 org.apache.hadoop.mapreduce.v2.app.MRAppMaster 
1>/grid/0/hadoop/yarn/log/application_1539069219098_0001/container_e03_1539069219098_0001_20_000003/stdout
 
2>/grid/0/hadoop/yarn/log/application_1539069219098_0001/container_e03_1539069219098_0001_20_000003/stderr
 "
Log Type: prelaunch.err

Log Upload Time: Tue Oct 09 20:25:06 +0000 2018

Log Length: 0

Log Type: prelaunch.out

Log Upload Time: Tue Oct 09 20:25:06 +0000 2018

Log Length: 100

Setting up env variables
Setting up job resources
Copying debugging information
Launching container
Log Type: stderr

Log Upload Time: Tue Oct 09 20:25:06 +0000 2018

Log Length: 240

log4j:WARN No appenders could be found for logger 
(org.apache.hadoop.mapreduce.v2.app.MRAppMaster).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more 
info.
Log Type: stdout

Log Upload Time: Tue Oct 09 20:25:06 +0000 2018

Log Length: 0

Log Type: syslog

Log Upload Time: Tue Oct 09 20:25:06 +0000 2018

Log Length: 61449

Showing 4096 bytes of 61449 total.

361)
        at 
org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter.<init>(FileOutputCommitter.java:160)
        at 
org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter.<init>(FileOutputCommitter.java:116)
        at 
org.apache.hadoop.mapreduce.lib.output.PathOutputCommitterFactory.createFileOutputCommitter(PathOutputCommitterFactory.java:134)
        at 
org.apache.hadoop.mapreduce.lib.output.FileOutputCommitterFactory.createOutputCommitter(FileOutputCommitterFactory.java:35)
        at 
org.apache.hadoop.mapreduce.lib.output.FileOutputFormat.getOutputCommitter(FileOutputFormat.java:338)
        at 
org.apache.hadoop.mapreduce.v2.app.MRAppMaster$3.call(MRAppMaster.java:552)
        ... 11 more
Caused by: java.lang.ClassNotFoundException: Class 
org.apache.hadoop.fs.ozone.OzoneFileSystem not found
        at 
org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2500)
        at 
org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2594)
        ... 24 more
2018-10-09 20:25:04,703 ERROR [main] 
org.apache.hadoop.mapreduce.v2.app.MRAppMaster: Error starting MRAppMaster
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: 
java.lang.RuntimeException: java.lang.ClassNotFoundException: Class 
org.apache.hadoop.fs.ozone.OzoneFileSystem not found
        at 
org.apache.hadoop.mapreduce.v2.app.MRAppMaster$3.call(MRAppMaster.java:554)
        at 
org.apache.hadoop.mapreduce.v2.app.MRAppMaster$3.call(MRAppMaster.java:534)
        at 
org.apache.hadoop.mapreduce.v2.app.MRAppMaster.callWithJobClassLoader(MRAppMaster.java:1802)
        at 
org.apache.hadoop.mapreduce.v2.app.MRAppMaster.createOutputCommitter(MRAppMaster.java:534)
        at 
org.apache.hadoop.mapreduce.v2.app.MRAppMaster.serviceInit(MRAppMaster.java:311)
        at 
org.apache.hadoop.service.AbstractService.init(AbstractService.java:164)
        at 
org.apache.hadoop.mapreduce.v2.app.MRAppMaster$6.run(MRAppMaster.java:1760)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at 
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730)
        at 
org.apache.hadoop.mapreduce.v2.app.MRAppMaster.initAndStartAppMaster(MRAppMaster.java:1757)
        at 
org.apache.hadoop.mapreduce.v2.app.MRAppMaster.main(MRAppMaster.java:1691)
Caused by: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class 
org.apache.hadoop.fs.ozone.OzoneFileSystem not found
        at 
org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2596)
        at 
org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:3320)
        at 
org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:3352)
        at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:124)
        at 
org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:3403)
        at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:3371)
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:477)
        at org.apache.hadoop.fs.Path.getFileSystem(Path.java:361)
        at 
org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter.<init>(FileOutputCommitter.java:160)
        at 
org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter.<init>(FileOutputCommitter.java:116)
        at 
org.apache.hadoop.mapreduce.lib.output.PathOutputCommitterFactory.createFileOutputCommitter(PathOutputCommitterFactory.java:134)
        at 
org.apache.hadoop.mapreduce.lib.output.FileOutputCommitterFactory.createOutputCommitter(FileOutputCommitterFactory.java:35)
        at 
org.apache.hadoop.mapreduce.lib.output.FileOutputFormat.getOutputCommitter(FileOutputFormat.java:338)
        at 
org.apache.hadoop.mapreduce.v2.app.MRAppMaster$3.call(MRAppMaster.java:552)
        ... 11 more
Caused by: java.lang.ClassNotFoundException: Class 
org.apache.hadoop.fs.ozone.OzoneFileSystem not found
        at 
org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2500)
        at 
org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2594)
        ... 24 more
2018-10-09 20:25:04,705 INFO [main] org.apache.hadoop.util.ExitUtil: Exiting 
with status 1: org.apache.hadoop.yarn.exceptions.YarnRuntimeException: 
java.lang.RuntimeException: java.lang.ClassNotFoundException: Class 
org.apache.hadoop.fs.ozone.OzoneFileSystem not found
{code}
The job above fails with {{OzoneFileSystem}} not found, even though the {{hadoop classpath}} command shows the ozonefs jar on the client classpath:
{code:java}
-bash-4.2$ hadoop classpath
/usr/hdp/3.0.3.0-63/hadoop/conf:/usr/hdp/3.0.3.0-63/hadoop/lib/*:/usr/hdp/3.0.3.0-63/hadoop/.//*:/usr/hdp/3.0.3.0-63/hadoop-hdfs/./:/usr/hdp/3.0.3.0-63/hadoop-hdfs/lib/*:/usr/hdp/3.0.3.0-63/hadoop-hdfs/.//*:/usr/hdp/3.0.3.0-63/hadoop-mapreduce/lib/*:/usr/hdp/3.0.3.0-63/hadoop-mapreduce/.//*:/usr/hdp/3.0.3.0-63/hadoop-yarn/./:/usr/hdp/3.0.3.0-63/hadoop-yarn/lib/*:/usr/hdp/3.0.3.0-63/hadoop-yarn/.//*:/usr/hdp/3.0.3.0-63/tez/*:/usr/hdp/3.0.3.0-63/tez/lib/*:/usr/hdp/3.0.3.0-63/tez/conf:/tmp/ozone-0.3.0-SNAPSHOT/share/hadoop/ozoneplugin/hadoop-ozone-datanode-plugin-0.3.0-SNAPSHOT.jar:/tmp/ozone-0.3.0-SNAPSHOT/share/hadoop/ozonefs/hadoop-ozone-filesystem-0.3.0-SNAPSHOT.jar:/usr/hdp/3.0.3.0-63/tez/conf_llap:/usr/hdp/3.0.3.0-63/tez/doc:/usr/hdp/3.0.3.0-63/tez/hadoop-shim-0.9.1.3.0.3.0-63.jar:/usr/hdp/3.0.3.0-63/tez/hadoop-shim-2.8-0.9.1.3.0.3.0-63.jar:/usr/hdp/3.0.3.0-63/tez/lib:/usr/hdp/3.0.3.0-63/tez/man:/usr/hdp/3.0.3.0-63/tez/tez-api-0.9.1.3.0.3.0-63.jar:/usr/hdp/3.0.3.0-63/tez/tez-common-0.9.1.3.0.3.0-63.jar:/usr/hdp/3.0.3.0-63/tez/tez-dag-0.9.1.3.0.3.0-63.jar:/usr/hdp/3.0.3.0-63/tez/tez-examples-0.9.1.3.0.3.0-63.jar:/usr/hdp/3.0.3.0-63/tez/tez-history-parser-0.9.1.3.0.3.0-63.jar:/usr/hdp/3.0.3.0-63/tez/tez-javadoc-tools-0.9.1.3.0.3.0-63.jar:/usr/hdp/3.0.3.0-63/tez/tez-job-analyzer-0.9.1.3.0.3.0-63.jar:/usr/hdp/3.0.3.0-63/tez/tez-mapreduce-0.9.1.3.0.3.0-63.jar:/usr/hdp/3.0.3.0-63/tez/tez-protobuf-history-plugin-0.9.1.3.0.3.0-63.jar:/usr/hdp/3.0.3.0-63/tez/tez-runtime-internals-0.9.1.3.0.3.0-63.jar:/usr/hdp/3.0.3.0-63/tez/tez-runtime-library-0.9.1.3.0.3.0-63.jar:/usr/hdp/3.0.3.0-63/tez/tez-tests-0.9.1.3.0.3.0-63.jar:/usr/hdp/3.0.3.0-63/tez/tez-yarn-timeline-cache-plugin-0.9.1.3.0.3.0-63.jar:/usr/hdp/3.0.3.0-63/tez/tez-yarn-timeline-history-0.9.1.3.0.3.0-63.jar:/usr/hdp/3.0.3.0-63/tez/tez-yarn-timeline-history-with-acls-0.9.1.3.0.3.0-63.jar:/usr/hdp/3.0.3.0-63/tez/tez-yarn-timeline-history-with-fs-0.9.1.3.0.3.0-63.jar:/usr/hdp/3.0.3.0-63/tez/ui:/usr/hdp/3.0.3.0-63/tez/li
b/async-http-client-1.9.40.jar:/usr/hdp/3.0.3.0-63/tez/lib/commons-cli-1.2.jar:/usr/hdp/3.0.3.0-63/tez/lib/commons-codec-1.4.jar:/usr/hdp/3.0.3.0-63/tez/lib/commons-collections-3.2.2.jar:/usr/hdp/3.0.3.0-63/tez/lib/commons-collections4-4.1.jar:/usr/hdp/3.0.3.0-63/tez/lib/commons-io-2.4.jar:/usr/hdp/3.0.3.0-63/tez/lib/commons-lang-2.6.jar:/usr/hdp/3.0.3.0-63/tez/lib/commons-math3-3.1.1.jar:/usr/hdp/3.0.3.0-63/tez/lib/gcs-connector-1.9.0.3.0.3.0-63-shaded.jar:/usr/hdp/3.0.3.0-63/tez/lib/guava-11.0.2.jar:/usr/hdp/3.0.3.0-63/tez/lib/hadoop-aws-3.1.1.3.0.3.0-63.jar:/usr/hdp/3.0.3.0-63/tez/lib/hadoop-azure-3.1.1.3.0.3.0-63.jar:/usr/hdp/3.0.3.0-63/tez/lib/hadoop-azure-datalake-3.1.1.3.0.3.0-63.jar:/usr/hdp/3.0.3.0-63/tez/lib/hadoop-hdfs-client-3.1.1.3.0.3.0-63.jar:/usr/hdp/3.0.3.0-63/tez/lib/hadoop-mapreduce-client-common-3.1.1.3.0.3.0-63.jar:/usr/hdp/3.0.3.0-63/tez/lib/hadoop-mapreduce-client-core-3.1.1.3.0.3.0-63.jar:/usr/hdp/3.0.3.0-63/tez/lib/hadoop-yarn-server-timeline-pluginstorage-3.1.1.3.0.3.0-63.jar:/usr/hdp/3.0.3.0-63/tez/lib/jersey-client-1.19.jar:/usr/hdp/3.0.3.0-63/tez/lib/jersey-json-1.19.jar:/usr/hdp/3.0.3.0-63/tez/lib/jettison-1.3.4.jar:/usr/hdp/3.0.3.0-63/tez/lib/jetty-server-9.3.22.v20171030.jar:/usr/hdp/3.0.3.0-63/tez/lib/jetty-util-9.3.22.v20171030.jar:/usr/hdp/3.0.3.0-63/tez/lib/jsr305-3.0.0.jar:/usr/hdp/3.0.3.0-63/tez/lib/metrics-core-3.1.0.jar:/usr/hdp/3.0.3.0-63/tez/lib/protobuf-java-2.5.0.jar:/usr/hdp/3.0.3.0-63/tez/lib/RoaringBitmap-0.4.9.jar:/usr/hdp/3.0.3.0-63/tez/lib/servlet-api-2.5.jar:/usr/hdp/3.0.3.0-63/tez/lib/slf4j-api-1.7.10.jar:/usr/hdp/3.0.3.0-63/tez/lib/tez.tar.gz
{code}
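Note that {{hadoop classpath}} reflects the client JVM only. The MRAppMaster runs in a YARN container whose classpath comes from the CLASSPATH export in launch_container.sh above, and that export does not include the ozonefs jar, which would explain the {{ClassNotFoundException}}. A possible workaround (a sketch, assuming the jar path from the classpath output above is valid on all nodes) would be to ship the jar with the job, or to add it to {{mapreduce.application.classpath}}:

```shell
# Sketch of two possible workarounds; the jar path is copied from the
# `hadoop classpath` output above and may differ on other clusters.
OZONEFS_JAR=/tmp/ozone-0.3.0-SNAPSHOT/share/hadoop/ozonefs/hadoop-ozone-filesystem-0.3.0-SNAPSHOT.jar

# Option 1: ship the jar with the job via the generic -libjars option
# (the wordcount example uses GenericOptionsParser, so -libjars is honored);
# the jar is then localized into the AM and task containers.
/usr/hdp/current/hadoop-client/bin/hadoop jar \
  /usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-examples.jar \
  wordcount -libjars "$OZONEFS_JAR" \
  /tmp/mr_jobs/input/ o3://bucket1.volume1/mr_job_dir/output

# Option 2: append the jar to the classpath used for MR containers
# (cluster-wide mapred-site.xml change; the path must exist on every NodeManager):
#   mapreduce.application.classpath = ${existing value}:/tmp/ozone-0.3.0-SNAPSHOT/share/hadoop/ozonefs/*
```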
 

> Mapreduce example fails with java.lang.IllegalArgumentException: Bucket or 
> Volume name has an unsupported character
> -------------------------------------------------------------------------------------------------------------------
>
>                 Key: HDDS-600
>                 URL: https://issues.apache.org/jira/browse/HDDS-600
>             Project: Hadoop Distributed Data Store
>          Issue Type: Bug
>            Reporter: Namit Maheshwari
>            Assignee: Hanisha Koneru
>            Priority: Blocker
>
> Set up a Hadoop cluster where Ozone is also installed. Ozone can be 
> referenced via o3://xx.xx.xx.xx:9889
> {code:java}
> [root@ctr-e138-1518143905142-510793-01-000002 ~]# ozone sh bucket list 
> o3://xx.xx.xx.xx:9889/volume1/
> 2018-10-09 07:21:24,624 WARN util.NativeCodeLoader: Unable to load 
> native-hadoop library for your platform... using builtin-java classes where 
> applicable
> [ {
> "volumeName" : "volume1",
> "bucketName" : "bucket1",
> "createdOn" : "Tue, 09 Oct 2018 06:48:02 GMT",
> "acls" : [ {
> "type" : "USER",
> "name" : "root",
> "rights" : "READ_WRITE"
> }, {
> "type" : "GROUP",
> "name" : "root",
> "rights" : "READ_WRITE"
> } ],
> "versioning" : "DISABLED",
> "storageType" : "DISK"
> } ]
> [root@ctr-e138-1518143905142-510793-01-000002 ~]# ozone sh key list 
> o3://xx.xx.xx.xx:9889/volume1/bucket1
> 2018-10-09 07:21:54,500 WARN util.NativeCodeLoader: Unable to load 
> native-hadoop library for your platform... using builtin-java classes where 
> applicable
> [ {
> "version" : 0,
> "md5hash" : null,
> "createdOn" : "Tue, 09 Oct 2018 06:58:32 GMT",
> "modifiedOn" : "Tue, 09 Oct 2018 06:58:32 GMT",
> "size" : 0,
> "keyName" : "mr_job_dir"
> } ]
> [root@ctr-e138-1518143905142-510793-01-000002 ~]#{code}
> HDFS is also set up fine, as shown below:
> {code:java}
> [root@ctr-e138-1518143905142-510793-01-000002 ~]# hdfs dfs -ls 
> /tmp/mr_jobs/input/
> Found 1 items
> -rw-r--r-- 3 root hdfs 215755 2018-10-09 06:37 
> /tmp/mr_jobs/input/wordcount_input_1.txt
> [root@ctr-e138-1518143905142-510793-01-000002 ~]#{code}
> Now try to run the MapReduce example job against Ozone via the o3 scheme:
> {code:java}
> [root@ctr-e138-1518143905142-510793-01-000002 ~]# 
> /usr/hdp/current/hadoop-client/bin/hadoop jar 
> /usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-examples.jar 
> wordcount /tmp/mr_jobs/input/ 
> o3://xx.xx.xx.xx:9889/volume1/bucket1/mr_job_dir/output
> 18/10/09 07:15:38 INFO conf.Configuration: Removed undeclared tags:
> java.lang.IllegalArgumentException: Bucket or Volume name has an unsupported 
> character : :
> at 
> org.apache.hadoop.hdds.scm.client.HddsClientUtils.verifyResourceName(HddsClientUtils.java:143)
> at 
> org.apache.hadoop.ozone.client.rpc.RpcClient.getVolumeDetails(RpcClient.java:231)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at 
> org.apache.hadoop.ozone.client.OzoneClientInvocationHandler.invoke(OzoneClientInvocationHandler.java:54)
> at com.sun.proxy.$Proxy16.getVolumeDetails(Unknown Source)
> at org.apache.hadoop.ozone.client.ObjectStore.getVolume(ObjectStore.java:92)
> at 
> org.apache.hadoop.fs.ozone.OzoneFileSystem.initialize(OzoneFileSystem.java:121)
> at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:3354)
> at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:124)
> at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:3403)
> at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:3371)
> at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:477)
> at org.apache.hadoop.fs.Path.getFileSystem(Path.java:361)
> at 
> org.apache.hadoop.mapreduce.lib.output.FileOutputFormat.setOutputPath(FileOutputFormat.java:178)
> at org.apache.hadoop.examples.WordCount.main(WordCount.java:85)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at 
> org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:71)
> at org.apache.hadoop.util.ProgramDriver.run(ProgramDriver.java:144)
> at org.apache.hadoop.examples.ExampleDriver.main(ExampleDriver.java:74)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at org.apache.hadoop.util.RunJar.run(RunJar.java:318)
> at org.apache.hadoop.util.RunJar.main(RunJar.java:232)
> 18/10/09 07:15:39 INFO conf.Configuration: Removed undeclared tags:
> [root@ctr-e138-1518143905142-510793-01-000002 ~]#
> {code}



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
