Problem appending to file with WebHDFS

2016-05-13 Thread David Robison
I am trying to use WebHDFS to write two files to Hadoop: one is an image file and the other is an index file. I am able to create both files and append to each of them twice, but when I try to append a third time I get the following error:
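
The error text itself is cut off in the excerpt above. For context, a minimal sketch of the two-step WebHDFS CREATE call that precedes those appends, written with plain java.net.HttpURLConnection; the NameNode host and port (50070 is the Hadoop 2.x default HTTP port), the /tmp/image.jpg path, and user.name=hdfs are placeholders, not values from the thread:

import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class WebHdfsCreateSketch {
    public static void main(String[] args) throws Exception {
        // Step 1: ask the NameNode where to write; do not follow the 307 redirect automatically.
        URL nn = new URL("http://namenode.example.com:50070/webhdfs/v1/tmp/image.jpg"
                + "?op=CREATE&overwrite=true&user.name=hdfs");
        HttpURLConnection nnConn = (HttpURLConnection) nn.openConnection();
        nnConn.setRequestMethod("PUT");
        nnConn.setInstanceFollowRedirects(false);
        String dataNodeUrl = nnConn.getHeaderField("Location"); // DataNode URL from the redirect
        nnConn.disconnect();

        // Step 2: PUT the file bytes to the DataNode URL returned above; expect 201 Created.
        HttpURLConnection dnConn = (HttpURLConnection) new URL(dataNodeUrl).openConnection();
        dnConn.setRequestMethod("PUT");
        dnConn.setDoOutput(true);
        try (OutputStream out = dnConn.getOutputStream()) {
            out.write("first chunk".getBytes(StandardCharsets.UTF_8)); // placeholder payload
        }
        System.out.println("CREATE returned HTTP " + dnConn.getResponseCode());
        dnConn.disconnect();
    }
}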

RE: WebHDFS op=CREATE

2016-04-18 Thread David Robison
In case anyone is curious, it does. David

-----Original Message-----
From: Namikaze Minato [mailto:lloydsen...@gmail.com]
Sent: Monday, April 18, 2016 12:20 PM
To: David Robison <david.robi...@psgglobal.net>
Cc: user@hadoop.apache.org
Subject: Re: WebHDFS op=CREATE

Try it and see. R

Question appending data using WebHDFS

2016-04-15 Thread David Robison
I am trying to use WebHDFS to append a large amount of data to a file. From the documentation I see that I first send op=APPEND to the NameNode, which replies with the location of a DataNode; I then send the data to that DataNode, again with op=APPEND. My question is, can I
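
A minimal sketch of that two-step APPEND flow in Java, again with placeholder host, port, path, and user.name rather than values from the thread; chunked streaming mode is one way to push a large payload without buffering it all in memory first:

import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class WebHdfsAppendSketch {
    static void append(byte[] chunk) throws Exception {
        // Step 1: the NameNode answers op=APPEND with a 307 redirect naming a DataNode.
        URL nn = new URL("http://namenode.example.com:50070/webhdfs/v1/tmp/image.jpg"
                + "?op=APPEND&user.name=hdfs");
        HttpURLConnection nnConn = (HttpURLConnection) nn.openConnection();
        nnConn.setRequestMethod("POST");
        nnConn.setInstanceFollowRedirects(false);
        String dataNodeUrl = nnConn.getHeaderField("Location");
        nnConn.disconnect();

        // Step 2: POST the data to the DataNode named in the Location header; expect 200 OK.
        HttpURLConnection dnConn = (HttpURLConnection) new URL(dataNodeUrl).openConnection();
        dnConn.setRequestMethod("POST");
        dnConn.setDoOutput(true);
        dnConn.setChunkedStreamingMode(64 * 1024); // stream in 64 KB chunks
        try (OutputStream out = dnConn.getOutputStream()) {
            out.write(chunk);
        }
        if (dnConn.getResponseCode() != 200) {
            throw new IllegalStateException("APPEND failed: HTTP " + dnConn.getResponseCode());
        }
        dnConn.disconnect();
    }
}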

Forcing a file to update its length

2017-08-09 Thread David Robison
I understand that, when writing to a file, I can force it to update its length on the NameNode by using the following call:

((DFSOutputStream) imageWriter.getWrappedStream()).hsync(EnumSet.of(SyncFlag.UPDATE_LENGTH));

Is there a way to force the update without having to open a
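
For reference, a sketch of the context that call usually sits in, assuming fs.defaultFS points at an HDFS cluster (the cast to DFSOutputStream only works for HDFS streams); the path and payload are placeholders:

import java.util.EnumSet;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hdfs.DFSOutputStream;
import org.apache.hadoop.hdfs.client.HdfsDataOutputStream.SyncFlag;

public class UpdateLengthSketch {
    public static void main(String[] args) throws Exception {
        FileSystem fs = FileSystem.get(new Configuration());
        try (FSDataOutputStream imageWriter = fs.create(new Path("/tmp/image.dat"))) {
            imageWriter.write(new byte[]{1, 2, 3});
            // hsync with UPDATE_LENGTH persists the written data on the DataNodes and also
            // tells the NameNode to refresh the file's visible length for other readers.
            ((DFSOutputStream) imageWriter.getWrappedStream())
                    .hsync(EnumSet.of(SyncFlag.UPDATE_LENGTH));
        }
    }
}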

RE: Forcing a file to update its length

2017-08-09 Thread David Robison
I don’t see where I can pass the UPDATE_LENGTH flag to hflush. Should it be there? David

Best Regards,
David R Robison
Senior Systems Engineer

From: Harsh J [mailto:ha...@cloudera.com]
Sent: Wednesday, August 9, 2017 3:01 PM
To: David Robison <david.r
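
The reply itself is cut off above, but as far as the public API goes, hflush() has no flag parameter; the flag-accepting call is the hsync(EnumSet<SyncFlag>) overload on the HDFS-specific stream classes. A brief illustration, assuming out is an HdfsDataOutputStream obtained from a DistributedFileSystem:

import java.io.IOException;
import java.util.EnumSet;

import org.apache.hadoop.hdfs.client.HdfsDataOutputStream;
import org.apache.hadoop.hdfs.client.HdfsDataOutputStream.SyncFlag;

public class HflushVsHsyncSketch {
    static void flushAndPublishLength(HdfsDataOutputStream out) throws IOException {
        out.hflush();                                  // takes no flags: pushes buffered data to the DataNodes
        out.hsync(EnumSet.of(SyncFlag.UPDATE_LENGTH)); // flagged overload: also updates the length on the NameNode
    }
}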