Hi All!

I want a directory to be present in the local working directory of each task,
for which I am using the following statements:

DistributedCache.addCacheArchive(new URI("/home/akhil1988/Config.zip"), conf);
DistributedCache.createSymlink(conf);
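
These calls sit in my job driver, which (trimmed down) looks roughly like the
following; the ConfigMapper class name, job name, and input/output paths here
are just placeholders, only the Config.zip part is the real setup:

import java.net.URI;

import org.apache.hadoop.filecache.DistributedCache;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.FileInputFormat;
import org.apache.hadoop.mapred.FileOutputFormat;
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;

public class ConfigJob {
  public static void main(String[] args) throws Exception {
    JobConf conf = new JobConf(ConfigJob.class);
    conf.setJobName("config-job");

    // Config.zip has already been copied to this location in HDFS
    DistributedCache.addCacheArchive(new URI("/home/akhil1988/Config.zip"), conf);
    DistributedCache.createSymlink(conf);

    conf.setMapperClass(ConfigMapper.class);
    conf.setOutputKeyClass(Text.class);
    conf.setOutputValueClass(Text.class);

    FileInputFormat.setInputPaths(conf, new Path(args[0]));
    FileOutputFormat.setOutputPath(conf, new Path(args[1]));

    JobClient.runJob(conf);
  }
}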

Here, Config is a directory which I have zipped and put at the given
location in HDFS.

I have zipped the directory because the API doc of DistributedCache
(http://hadoop.apache.org/core/docs/r0.20.0/api/index.html) says that archive
files are unzipped in the local cache directory:

DistributedCache can be used to distribute simple, read-only data/text files
and/or more complex types such as archives, jars etc. Archives (zip, tar and
tgz/tar.gz files) are un-archived at the slave nodes.

So, from my understanding of the API docs, I expect that Config.zip will be
unzipped into a Config directory, and since I have created a symlink for it, I
should be able to access the directory in the following manner from my map
function:

FileInputStream fin = new FileInputStream("Config/file1.config");

But I get a FileNotFoundException when this statement executes.
Please let me know where I am going wrong.
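
For reference, the map method in which that read happens is roughly the
following (trimmed down; the ConfigMapper name and everything other than the
FileInputStream line are placeholders):

import java.io.FileInputStream;
import java.io.IOException;

import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.MapReduceBase;
import org.apache.hadoop.mapred.Mapper;
import org.apache.hadoop.mapred.OutputCollector;
import org.apache.hadoop.mapred.Reporter;

public class ConfigMapper extends MapReduceBase
    implements Mapper<LongWritable, Text, Text, Text> {

  public void map(LongWritable key, Text value,
      OutputCollector<Text, Text> output, Reporter reporter)
      throws IOException {
    // Expecting the unzipped archive to appear as Config/ (via the symlink)
    // in the task's local working directory
    FileInputStream fin = new FileInputStream("Config/file1.config");
    // ... read the config and do the actual map work ...
    fin.close();
  }
}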

Thanks,
Akhil