Yes, you can do so. Hadoop is a distributed computing framework, and you can write to multiple files in parallel. The only thing you cannot do is update a file's content in place; instead, you delete the file and then write the whole thing again.
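The delete-then-rewrite pattern above can be sketched as follows. This is a minimal local-filesystem stand-in, not real HDFS code: plain `os.remove` and `open` take the place of an HDFS client's delete and create calls, so treat the shape of the pattern, not the API, as the point.

```python
import os
import tempfile

def rewrite_file(path, new_lines):
    """Replace a file's contents wholesale.

    HDFS files cannot be updated in place, so the pattern is:
    delete the old file, then write the whole content again.
    Local file I/O stands in here for an HDFS client's
    delete/create calls.
    """
    if os.path.exists(path):
        os.remove(path)              # delete the old file first
    with open(path, "w") as f:       # then write it from scratch
        f.write("\n".join(new_lines) + "\n")

# Example: "updating" one record means rewriting the whole file.
path = os.path.join(tempfile.mkdtemp(), "data.txt")
rewrite_file(path, ["a", "b", "c"])
rewrite_file(path, ["a", "B", "c"])  # one changed line -> full rewrite
```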
On Mon, May 12, 2014 at 8:06 AM, Stanley Shi <[email protected]> wrote:

> Yes, why not?
>
> Regards,
> Stanley Shi
>
> On Sun, May 11, 2014 at 9:57 PM, Karim Awara <[email protected]> wrote:
>
>> Hi,
>>
>> Can I open multiple files on HDFS and write data to them in parallel and
>> then close them at the end?
>>
>> --
>> Best Regards,
>> Karim Ahmed Awara

-- 
Thanks & Regards
Unmesha Sreeveni U.B
Hadoop, Bigdata Developer
Center for Cyber Security | Amrita Vishwa Vidyapeetham
http://www.unmeshasreeveni.blogspot.in/
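The pattern from the original question — open several files, write to them in parallel, and close them all at the end — can be sketched in a few lines of Python. Local file I/O stands in for HDFS output streams here (this is an illustration, not an HDFS client example); the key constraint carries over: HDFS allows only a single writer per file, but different files can be written concurrently.

```python
import os
import tempfile
from concurrent.futures import ThreadPoolExecutor

def write_one(path, lines):
    # One writer per file: HDFS permits a single writer per file,
    # but separate files can each be written in parallel.
    with open(path, "w") as f:
        for line in lines:
            f.write(line + "\n")
    return path

def parallel_write(jobs):
    # jobs maps path -> list of lines. Each file is opened, written,
    # and closed by its own thread; results are collected at the end.
    with ThreadPoolExecutor(max_workers=len(jobs)) as pool:
        futures = [pool.submit(write_one, p, data)
                   for p, data in jobs.items()]
        return [f.result() for f in futures]

outdir = tempfile.mkdtemp()
jobs = {os.path.join(outdir, "part-%04d" % i):
            ["record-%d-%d" % (i, j) for j in range(3)]
        for i in range(4)}
written = parallel_write(jobs)   # four files written concurrently
```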
