I didn't fully get your idea here, but keep in mind that the file will be split into blocks. How will you extract a single file from those blocks?
Why can't you create a tar file plus one index file that keeps an index entry (offset and length) for each file, something like this?
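
Roughly what I have in mind, as a minimal sketch: a combined data file plus an index of "name offset length" entries, so a single file can be read back with one seek instead of scanning the archive. This uses plain local I/O rather than the HDFS API, and the file names and index layout are just illustrative assumptions.

import java.io.*;
import java.util.*;

public class CombinedFileIndexSketch {

    // Append every input file to the combined file and record
    // "name offset length" for each one in the index file.
    static void build(List<File> inputs, File combined, File index) throws IOException {
        try (FileOutputStream data = new FileOutputStream(combined);
             PrintWriter idx = new PrintWriter(new FileWriter(index))) {
            long offset = 0;
            byte[] buf = new byte[8192];
            for (File in : inputs) {
                long length = 0;
                try (FileInputStream fis = new FileInputStream(in)) {
                    int n;
                    while ((n = fis.read(buf)) > 0) {
                        data.write(buf, 0, n);
                        length += n;
                    }
                }
                idx.println(in.getName() + " " + offset + " " + length);
                offset += length;
            }
        }
    }

    // Look up one file in the index, then read only its bytes
    // from the combined file with a positioned read.
    static byte[] extract(String name, File combined, File index) throws IOException {
        try (BufferedReader idx = new BufferedReader(new FileReader(index))) {
            String line;
            while ((line = idx.readLine()) != null) {
                String[] parts = line.split(" ");
                if (parts[0].equals(name)) {
                    long offset = Long.parseLong(parts[1]);
                    int length = Integer.parseInt(parts[2]);
                    byte[] out = new byte[length];
                    try (RandomAccessFile raf = new RandomAccessFile(combined, "r")) {
                        raf.seek(offset);
                        raf.readFully(out);
                    }
                    return out;
                }
            }
        }
        throw new FileNotFoundException(name + " not found in index");
    }

    public static void main(String[] args) throws IOException {
        // Example: combine a.txt and b.txt, then pull b.txt back out.
        build(Arrays.asList(new File("a.txt"), new File("b.txt")),
              new File("combined.dat"), new File("combined.idx"));
        byte[] b = extract("b.txt", new File("combined.dat"), new File("combined.idx"));
        System.out.println(new String(b));
    }
}

On HDFS you would do the same thing with the FileSystem API (open/seek) or with an existing mechanism like Hadoop Archives or SequenceFiles, which already pair packed data with an index.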

________________________________
From: Sesha Kumar [sesha...@gmail.com]
Sent: Wednesday, January 18, 2012 8:24 PM
To: hdfs-user@hadoop.apache.org
Subject: Re: Data processing in DFSClient


Sorry for the delay. I'm trying to implement an IEEE paper in which a bunch of files are combined into a single file; when one of those files is requested, the datanode extracts it from the block and sends it to the DFSClient.
