I believe 0.15 had HftpFileSystem.
http://hadoop.apache.org/common/docs/r0.15.3/api/index.html

You may be able to run 0.19's distcp to copy from your 0.15 cluster (using
HFTP as the source) into the new cluster's HDFS.
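A minimal sketch of what that distcp invocation might look like, run from the 0.19 cluster. The hostnames, ports, and paths here are placeholders, not anything from this thread: HFTP is read-only and goes through the namenode's HTTP port (50070 by default), while the destination is ordinary HDFS.

```shell
# Run from the destination (0.19) cluster. HFTP is read-only, so the
# 0.19 distcp reads over HTTP from the old 0.15 namenode and writes
# into the new cluster's HDFS. old-nn, new-nn, and the paths are
# hypothetical placeholders -- substitute your own.
hadoop distcp \
  hftp://old-nn:50070/user/data \
  hdfs://new-nn:9000/user/data
```

Reading via HFTP sidesteps the HDFS wire-protocol incompatibility between releases, which is why the copy has to be driven from the newer cluster's side.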

-- Philip


On Sun, Mar 21, 2010 at 12:26 PM, Owen O'Malley <owen.omal...@gmail.com> wrote:
> I believe you need to take two jumps. I believe it is 15 -> 18 -> 20. I'd
> strongly suggest trying a practice file system first. Did we have owners and
> perms in 15? If not, you'll need to set owners and perms.
>
> -- Owen
>
> On Mar 21, 2010, at 12:23 AM, "ilayaraja" <ilayar...@rediff.co.in> wrote:
>
>> Hi,
>>
>> We've been using hadoop 15.5 in our production environment, where we have
>> about 10 TB of data stored on the dfs.
>> The files were generated as mapreduce output. We want to move our
>> environment to Amazon Elastic MapReduce (EMR), which raises the following
>> questions:
>>
>> 1. EMR supports only hadoop 19.0 and above. Can the data that were
>> generated with hadoop 15.5 be used from hadoop 19.0?
>>
>> 2. Alternatively, how can we upgrade from hadoop 15.5 to hadoop 19.0?
>> What issues should we expect while doing so?
>>
>>
>> Regards,
>> Ilayaraja
>>
>
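For the in-place upgrade path Owen describes above (two jumps, e.g. 15 -> 18 -> 20), each hop would look roughly like the following. This is a sketch under assumptions, not a tested procedure: these flags existed in this general form in the 0.18/0.19 era, but verify them against the docs for your exact releases, and every host, path, and owner shown is a placeholder.

```shell
# Sketch of one upgrade hop (repeat per jump, e.g. 0.15 -> 0.18,
# then 0.18 -> 0.20). All names are placeholders; try this on a
# practice filesystem first, as Owen suggests.

# 1. Stop the old cluster cleanly, then install the new Hadoop release.
stop-all.sh

# 2. Bring HDFS up with the new binaries in upgrade mode, which keeps
#    a rollback snapshot of the old on-disk layout.
start-dfs.sh -upgrade

# 3. Watch the upgrade, and finalize only once you are satisfied --
#    finalizing discards the rollback snapshot.
hadoop dfsadmin -upgradeProgress status
hadoop dfsadmin -finalizeUpgrade

# 4. If the 0.15 data carried no owners/permissions, set them after
#    the upgrade (produser:prodgroup and the path are hypothetical).
hadoop fs -chown -R produser:prodgroup /user/data
hadoop fs -chmod -R 755 /user/data
```

Keeping the rollback snapshot around until the upgraded cluster has been exercised is the main safety net here, which is why finalizing is deliberately a separate, explicit step.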
