You didn't misunderstand -nc. You missed the -O option, which expects
the next argument to be the output file for the retrieved URL.
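To make that concrete: in the working command, the lone "-" after -qO is
the argument to -O (write to stdout), so inserting -nc between them makes
-nc the output filename instead. Since that first wget never writes to
disk anyway, -nc can only help on the inner wget, the one saving files.
A sketch along those lines — the sample index line is a guess shaped the
way the quoted sed patterns imply, not the Flashair's actual format:

```shell
#!/bin/sh
# The "-" after -qO is -O's argument (stdout).  Putting -nc there
# turns -nc into the output filename.  If -nc is wanted at all, it
# belongs on the inner wget, which is the one writing to disk:
#
#   wget -qO - http://192.168.0.1/DATALOG \
#     | grep 'DATALOG.*fname' \
#     | sed -e "s/^.*fname\"\:\"//" -e "s/\", \"fsize.*//" \
#     | while read line; do wget -nc "http://192.168.0.1/DATALOG/$line"; done

# The sed extraction itself, run on a hypothetical index line:
line='{"fname":"20141117_235932_BRP.crc", "fsize":512}'
printf '%s\n' "$line" | sed -e "s/^.*fname\"\:\"//" -e "s/\", \"fsize.*//"
```

The extraction step above prints 20141117_235932_BRP.crc, which is
exactly the $line the inner wget appends to the URL.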


On Tue, Nov 18, 2014 at 10:11 AM, John Jason Jordan <[email protected]>
wrote:

> At the Clinic Wes composed a lovely command (that I converted to a
> shell script) to download files from a Flashair SD+wifi card that lives
> in a CPAP machine upstairs in my house. The connection is a bit wobbly
> because the advertised range of the Flashair is 30 feet, and that is
> about the distance from the machine to my laptop downstairs, plus there
> is a floor in between. But it does work if I am patient. Here is the
> command:
>
> wget -qO - http://192.168.0.1/DATALOG \
>   | grep 'DATALOG.*fname' \
>   | sed -e "s/^.*fname\"\:\"//" -e "s/\", \"fsize.*//" \
>   | while read line; do wget "http://192.168.0.1/DATALOG/$line"; done
>
> The reason for the grep and filename extraction is that Toshiba, in
> its alleged wisdom, decided to bury the file names in the HTML rather
> than just use a directory structure like FTP.
>
> But there is a problem: Every night the CPAP machine writes eight more
> small files to the DATALOG folder, where the name of each file starts
> with the date, e.g.:
>
> 20141116_235932_BRP.crc                 (from Sunday night)
> 20141117_235932_BRP.crc                 (from last night)
>
> The normal behavior of wget is to re-download files already downloaded
> and append .n to the additional copies, so running the script this
> morning would result in the following files in the folder:
>
> 20141116_235932_BRP.crc                 (from Sunday night)
> 20141116_235932_BRP.crc.1       (from Sunday night)
> 20141117_235932_BRP.crc                 (from last night)
>
> After a while the folder where I store these files on my computer is
> going to get horribly cluttered. I could add a line to the script to
> delete all files ending in .1, but considering the time it takes for
> the script to run (due to the poor connection) it would be far
> preferable for wget not to download copies in the first place.
> According to the man page adding -nc (no-clobber) is supposed to do
> this. So I added -nc like this:
>
> wget -qO -nc - http://192.168.0.1/DATALOG ...
>
> When I ran the script it downloaded no new files, but created a file
> 'nc' of zero bytes. WTH? Clearly I have failed to grasp how the -nc
> option is supposed to work.
>
> Are there any wget experts here who can lead me to the light?
> _______________________________________________
> PLUG mailing list
> [email protected]
> http://lists.pdxlinux.org/mailman/listinfo/plug
>