Yes, these are warnings unless they fail 3 times, in which case your dfs -put command would fail with a stack trace.

Thanks,
Lohit
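[Editor's note: the "3 times" above matches the DFSClient's block-write retry count, which is configurable. A minimal sketch, assuming the property name `dfs.client.block.write.retries` (default 3) from hdfs-default; in the 0.18 era this would go in hadoop-site.xml:]

```xml
<!-- Sketch: raise the DFSClient block-write retry count from its default of 3.
     In Hadoop 0.18 this property belongs in conf/hadoop-site.xml. -->
<property>
  <name>dfs.client.block.write.retries</name>
  <value>10</value>
</property>
```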
----- Original Message -----
From: Ryan LeCompte <[EMAIL PROTECTED]>
To: "[email protected]" <[email protected]>
Sent: Monday, September 22, 2008 5:18:01 PM
Subject: Re: NotYetReplicated exceptions when pushing large files into HDFS

I've noticed that although I get a few of these exceptions, the file is ultimately uploaded to the HDFS cluster. Does this mean that my file ended up getting there in one piece? The exceptions are just logged at the WARN level and indicate retry attempts.

Thanks,
Ryan

On Mon, Sep 22, 2008 at 11:08 AM, Ryan LeCompte <[EMAIL PROTECTED]> wrote:
> Hello all,
>
> I'd love to be able to upload very large files (e.g., 8 or 10 GB) into
> HDFS, but it seems like my only option is to chop the file up into
> smaller pieces. Otherwise, after a while I get NotYetReplicated
> exceptions while the transfer is in progress. I'm using 0.18.1. Is
> there any way I can do this? Perhaps use something else besides
> bin/hadoop -put input output?
>
> Thanks,
> Ryan
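[Editor's note: the chop-and-upload workaround Ryan describes can be sketched as below. File names and piece sizes are illustrative only (a real run would use something like `-b 1024m`), and the hadoop line is commented out since it requires a live 0.18 cluster:]

```shell
# Sketch of the chop-and-upload workaround; names and sizes are hypothetical.
head -c 3145728 /dev/zero > big.dat      # 3 MB stand-in for the real 8-10 GB file
split -b 1048576 big.dat big.dat.part-   # 1 MB pieces -> part-aa, part-ab, part-ac
ls big.dat.part-* | wc -l                # 3 pieces
# hadoop dfs -put big.dat.part-* input/  # 0.18-era CLI upload of the pieces
```

Smaller pieces mean each block finishes replicating sooner, so the client is less likely to exhaust its retries mid-transfer.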
