I guess the problem is modifying a closed HDFS file. Closed files are
immutable.
So if you want to update data, you can use this approach: store new
changes in memory, flush them to a new file when too many changes have
accumulated in memory, and merge those files into one when you have too
many change files. That is how HBase solves this problem internally. To
avoid implementing it yourself, you can use HBase, which already
provides this.
But note that HBase is not a DFS; it is a distributed storage system,
and it is used differently.
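
To make the idea concrete, here is a rough sketch of that pattern
against the plain HDFS FileSystem API. The class name (EditBuffer), the
thresholds, and the file layout are made up for this example, and the
merge step just concatenates files; real HBase compaction sorts and
rewrites key/values, so treat this as the shape of the idea, not as
HBase code.

import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

public class EditBuffer {
  private static final int FLUSH_THRESHOLD = 1000;  // edits held in memory
  private static final int COMPACT_THRESHOLD = 10;  // edit files on disk

  private final FileSystem fs;
  private final Path editDir;
  private final List<String> pending = new ArrayList<String>();
  private int fileCount = 0;

  public EditBuffer(Configuration conf, Path editDir) throws IOException {
    this.fs = FileSystem.get(conf);
    this.editDir = editDir;
  }

  // Buffer one change in memory; flush when the buffer gets large.
  public void write(String edit) throws IOException {
    pending.add(edit);
    if (pending.size() >= FLUSH_THRESHOLD) {
      flush();
    }
  }

  // Write all buffered edits into a brand-new file; we never reopen an
  // existing one, since closed HDFS files are immutable.
  private void flush() throws IOException {
    FSDataOutputStream out =
        fs.create(new Path(editDir, "edits-" + fileCount++));
    for (String edit : pending) {
      out.writeBytes(edit + "\n");
    }
    out.close();
    pending.clear();
    FileStatus[] files = fs.listStatus(editDir);
    if (files.length >= COMPACT_THRESHOLD) {
      compact(files);
    }
  }

  // Merge the small edit files into one new file, then delete the originals.
  private void compact(FileStatus[] files) throws IOException {
    FSDataOutputStream out =
        fs.create(new Path(editDir, "merged-" + fileCount++));
    for (FileStatus f : files) {
      FSDataInputStream in = fs.open(f.getPath());
      IOUtils.copyBytes(in, out, 4096, false);
      in.close();
    }
    out.close();
    for (FileStatus f : files) {
      fs.delete(f.getPath(), false);
    }
  }
}

The point is that HDFS only ever sees new files being created and old
ones deleted; no file is ever reopened for modification.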

Best Regards,
Evgeny.


2009/6/12 Jae Joo <[email protected]>:
> I don't understand exactly what the operation means. Please advise me.
>
> Jae Joo
>
> On 6/11/09, 황명진 <[email protected]> wrote:
>> We are studying Hadoop. Recently we have had some difficulties executing Hadoop.
>>
>> I have learned that Hadoop doesn't support an open operation to modify an
>> existing file.
>>
>> I heard that in this case it can be solved by using HBase.
>>
>> If that is right, please let me know how to modify files and how to work
>> with HBase.
