Downloading grib files

2006-01-25 Thread Benjamin Scaplen
Hi,

I am using Wget version 1.9.1.

I am downloading weather information off the weatheroffice website
every day around the same time. Each day the weatheroffice clears the page
and reposts the updated data, but since the date is in the name, the names
all change,
e.g. CMC_reg_ABSV_ISBL_250_ps15km_2006012312_P000.grib

What I am doing is using wget to get about 500 files from the site. I want
to download the files AS SOON AS they are posted. I was wondering if there
is a way to keep retrying for a file (which wouldn't be there yet, because
the name still carries the previous day's date) until it is posted, and
then download it as soon as it appears.

I tried setting the number of download tries to infinity (-t inf), but
that only seems to apply to failures caused by connection problems; wget
doesn't retry when the file I am looking for simply isn't there. Is there
any way to keep looking for a file even if it isn't at the specified site
yet (and then, of course, download it once it is posted)?

I know this may be confusing; the last question sums it up.

please reply to [EMAIL PROTECTED],

thanks,

Ben Scaplen



Re: Downloading grib files

2006-01-25 Thread Jonathan
I think you are trying to use wget for a use case it was not intended
for... but there is a simple solution: write a script/routine that wakes
up every n seconds/minutes and makes an HTTP request to the URL you want
to check. If the page has changed, the script invokes wget to retrieve
the new page(s).
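A minimal polling sketch along these lines (the base URL here is a placeholder I made up; the filename pattern comes from Ben's example, and the 60-second interval and 12Z run hour are assumptions to adjust):

```shell
#!/bin/sh
# Poll until the day's GRIB file appears on the server, then download it.
# NOTE: BASE is a hypothetical URL; substitute the real weatheroffice path.
BASE="http://weatheroffice.example.com/grib"

# Build the expected filename for a given run tag, e.g. 2006012312.
grib_name() {
    printf 'CMC_reg_ABSV_ISBL_250_ps15km_%s_P000.grib\n' "$1"
}

fetch_when_posted() {
    run=$1
    file=$(grib_name "$run")
    # --spider asks the server whether the file exists without saving it;
    # retry every 60 seconds until it appears, then fetch it for real.
    until wget -q --spider "$BASE/$file"; do
        sleep 60
    done
    wget "$BASE/$file"
}

# Example invocation (commented out so the functions can be sourced safely):
# fetch_when_posted "$(date -u +%Y%m%d)12"
```

Run it from cron shortly before the expected posting time, once per file or once per batch as needed.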


hth

Jonathan


- Original Message - 
From: Benjamin Scaplen [EMAIL PROTECTED]

To: wget@sunsite.dk
Sent: Wednesday, January 25, 2006 7:06 AM
Subject: Downloading grib files




Re: download to one textfile

2006-01-25 Thread Mauro Tortonesi

gerd schmalfeld wrote:

Hello wgetter,

i am trying to download a webpage to one textfile, but always get an
error message.

I tried with different options, e.g.:


hi gerd,

which version of wget are you using? the code in svn seems to work fine 
for me.


BTW, you don't want to use -O and -r together; they don't interact well
(-O appends every retrieved document to a single output file).


--
Aequam memento rebus in arduis servare mentem...

Mauro Tortonesi  http://www.tortonesi.com

University of Ferrara - Dept. of Eng.http://www.ing.unife.it
GNU Wget - HTTP/FTP file retrieval tool  http://www.gnu.org/software/wget
Deep Space 6 - IPv6 for Linuxhttp://www.deepspace6.net
Ferrara Linux User Group http://www.ferrara.linux.it


Re: wget -O writes empty file on failure

2006-01-25 Thread Mauro Tortonesi

Hrvoje Niksic wrote:

Mauro Tortonesi [EMAIL PROTECTED] writes:

The semantics of -O are well-defined, but they're not what people
expect. In other words, -O breaks the principle of least surprise.


it does indeed.


in this case, the redirection command would simply write all the
downloaded data to the output without performing any
transformation. on the other hand, the output filename command could
perform more complex operations,


That seems to break the principle of least surprise as well.  If such
an option is specified, maybe Wget should simply refuse to accept more
than a single URL.


what exactly are you suggesting we do? keep the current behavior of
-O, allowing only single downloads?


--
Aequam memento rebus in arduis servare mentem...

Mauro Tortonesi  http://www.tortonesi.com

University of Ferrara - Dept. of Eng.http://www.ing.unife.it
GNU Wget - HTTP/FTP file retrieval tool  http://www.gnu.org/software/wget
Deep Space 6 - IPv6 for Linuxhttp://www.deepspace6.net
Ferrara Linux User Group http://www.ferrara.linux.it


Re: Ant task definition for wget.

2006-01-25 Thread Mauro Tortonesi

Nicolas F Rouquette wrote:

Has anyone written an Ant task definition to invoke wget from Ant files?


to the best of my knowledge, no.

--
Aequam memento rebus in arduis servare mentem...

Mauro Tortonesi  http://www.tortonesi.com

University of Ferrara - Dept. of Eng.http://www.ing.unife.it
GNU Wget - HTTP/FTP file retrieval tool  http://www.gnu.org/software/wget
Deep Space 6 - IPv6 for Linuxhttp://www.deepspace6.net
Ferrara Linux User Group http://www.ferrara.linux.it


Re: wget -O writes empty file on failure

2006-01-25 Thread Hrvoje Niksic
Mauro Tortonesi [EMAIL PROTECTED] writes:

 That seems to break the principle of least surprise as well.  If such
 an option is specified, maybe Wget should simply refuse to accept more
 than a single URL.

 what exactly are you suggesting we do? keep the current behavior
 of -O, allowing only single downloads?

Huh?  -O doesn't currently allow only single downloads -- multiple
downloads are appended to the same output file.

I'm suggesting that we should include an option named --save-to
(perhaps shortened to -s) that allows specifying an output file
independent of what is in the URL.  In that case we might want to
disallow multiple URLs, or multiple URLs could save to file.1,
file.2, etc.
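The proposed file.1, file.2 naming can already be approximated today with a small wrapper around wget (the function and file names here are illustrative, not an existing wget feature):

```shell
#!/bin/sh
# Download each URL given as an argument to a numbered file,
# mimicking the proposed file.1, file.2, ... naming scheme.

# Map an index to an output filename.
numbered_name() {
    printf 'file.%s\n' "$1"
}

save_numbered() {
    i=1
    for url in "$@"; do
        wget -q -O "$(numbered_name "$i")" "$url"
        i=$((i + 1))
    done
}

# Usage (commented out so the functions can be sourced safely):
# save_numbered http://a.example/ http://b.example/
```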




download file with latest modified date in directory

2006-01-25 Thread Nobody User

Hi,
I am new to wget and I have a basic question:
Can wget download the file with the latest modified date in a directory?
Which wget arguments do I need for this task?
For example, I want to download the file with the latest modified date from the directory
/ftp.symantec.com/AVDEFS/norton_antivirus/xdb/

Thanks

_
Express yourself instantly with MSN Messenger! Download today it's FREE! 
http://messenger.msn.click-url.com/go/onm00200471ave/direct/01/




RE: download file with latest modified date in directory

2006-01-25 Thread Willener, Pat

Hi,
I imagine you want to download the newest virus definitions daily,
is that correct?

I do something similar with the McAfee virus definitions, using a scheduled
Windows batch script that invokes wget. I do not check the date, but
compare the file names; they also increase with every released virus
definition file.

If you are interested I can send you the batch script.

Best regards,
Pat Willener
Cincom Systems (Japan) Ltd.


RE: download file with latest modified date in directory

2006-01-25 Thread Willener, Pat

Sorry, I have to apologize - what I wrote above is not quite true.
(McAfee offers only one file - the latest - for download; I do the
file comparison locally.)

Hopefully someone else has a better idea for you.

Best regards,
Pat Willener


Re: download file with latest modified date in directory

2006-01-25 Thread Steven M. Schweda
   You could do something with the --no-remove-listing option, like:

      wget --no-remove-listing \
       ftp://ftp.symantec.com/AVDEFS/norton_antivirus/xdb/fred*

or:

      wget --no-remove-listing \
       ftp://ftp.symantec.com/AVDEFS/norton_antivirus/xdb/

Then, look at the resulting .listing file (and/or index.html,
depending), extract the name (and/or URL) of the file you'd like, and
use it in a second Wget command.  The details would depend on your OS,
which I must have missed (along with the Wget version you're using).  (I
assume that a DCL example procedure would not be of much use to you, but
that's what I'd write.)
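
As a rough Unix-shell illustration of that two-step approach (the awk field positions assume a Unix-style .listing, and picking the last name relies on these definition files sorting by date; both are assumptions to verify against the real server):

```shell
#!/bin/sh
# Step 1: fetch the directory listing and keep the raw .listing file.
# Step 2: pick a filename out of it and fetch just that file.
URL="ftp://ftp.symantec.com/AVDEFS/norton_antivirus/xdb/"

# Print the last plain-file name in a Unix-style .listing.  Lines for
# regular files start with '-'; $NF is the filename.  For files whose
# names sort with their dates, the last entry is the newest; parsing
# the modification dates would be more robust.
last_file_in_listing() {
    awk '$1 ~ /^-/ { name = $NF } END { print name }' "$1"
}

# Network steps (commented out so the helper can be tested offline):
# wget -q --no-remove-listing "$URL"
# wget "$URL$(last_file_in_listing .listing)"
```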



   Steven M. Schweda               (+1) 651-699-9818
   382 South Warwick Street        [EMAIL PROTECTED]
   Saint Paul  MN  55105-2547