Hi all,

I'm pretty new to the Hadoop environment and I'm about to run some micro-benchmarks. In particular, I'm struggling to execute NNBench against an external file system:

hadoop jar /usr/hdp/2.2.6.0-2800/hadoop-mapreduce/hadoop-mapreduce-client-jobclient-tests.jar nnbench -Dfs.defaultFS='hdfs://<external.file.system>' -operation create_write -bytesToWrite 10 -maps 2 -reduces 1 -numberOfFiles 100 -baseDir hdfs://dapsilon.daplab.ch/user/username/nnbench-`hostname -s`

which fails with:
java.lang.IllegalArgumentException: Wrong FS: hdfs://<external.file.system>/user/username/nnbench-hostname/data, expected: hdfs://<native fs>

If I omit the external FS prefix in the baseDir, NNBench simply ignores the -D option and writes the files to the native DFS. Does anyone have an idea how to solve this and run NNBench against an external DFS?

Thanks a lot, any hints are much appreciated!
Regards,
Alex