Hi --
I'm using wget 1.10.2.
I'm trying to mirror a web site with the following command:
wget -m http://www.example.com
After this process finished, I realized that I also needed pages from a subdomain (e.g. www2). How can I re-start the mirror process without downloading the same pages again?
- `-' saved [24275/24275]
www.zdziarski.com/index.html: No such file or directory
FINISHED --15:40:06--
Downloaded: 24,275 bytes in 1 files
Jonathan
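For what it's worth, one way to pick up the www2 pages without re-fetching the existing mirror is to span hosts and restrict the allowed domains. Since -m implies -N (timestamping), files already saved and unchanged on the server are skipped. A sketch, using the example hostnames from the question:

  # re-run the mirror, now also spanning to the www2 subdomain;
  # -m implies -N, so unchanged local files are not downloaded again
  wget -m -H -D www.example.com,www2.example.com \
      http://www.example.com/ http://www2.example.com/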
Please cc: (copy) your response(s) to my email, [EMAIL PROTECTED], as I am not subscribed to the list. Thank you.
I've repeatedly tried to download the Wikipedia database dumps, one of which is 8 gigabytes and the other 57.5 gigabytes, from a server which supports resumability. The downloads keep failing. I realize this issue has come up on the mailing list before, but I do need to have wget working properly to get the database.
Thanks,
Jonathan.
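For reference, a resumable download of that kind is usually driven with -c; the URL below is a placeholder. Note that wget builds older than 1.10 lack large-file support and fail past 2 GB, which may be the underlying problem here:

  # -c resumes a partial file; -t 0 means retry indefinitely
  # (placeholder URL; wget < 1.10 cannot handle files over 2 GB)
  wget -c -t 0 http://download.example.org/dump.xml.bz2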
___
Jonathan Abrahams
BrainMedia LLC
Director of Operations
Tel. no. 212 529 6500 x 150
Fax. no. 212 481-8188
mailto: [EMAIL PROTECTED]
website: www.BrainMediaCo.com
to get wget to do this for me, but I think many people could benefit from this.
It's much easier to throw wget into a script or other program and then have that script/program do the post-processing (why mess around with wget?).
Jonathan
because wget may not successfully download the file. Also, wget may be downloading multiple files (usually the case), so wrapping wget up in a script will not let you process each downloaded file.
K.
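One possible workaround for the multiple-files objection, sketched under the assumption that the files land in a known directory (the directory name and URL are placeholders): record a timestamp before the run, then post-process everything newer than it.

  #!/bin/sh
  # mark the start of the run, then mirror into ./mirror
  touch .wget-stamp
  wget -m -P mirror http://www.example.com/

  # post-process every file wget saved or updated during this run
  find mirror -type f -newer .wget-stamp | while read -r f; do
      echo "post-processing $f"    # replace with the real processing step
  done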
Jonathan wrote:
any thoughts on this new feature for wget: when a file has been changed, invoke wget to retrieve the new page(s).
hth
Jonathan
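Much of this already exists as timestamping: with -N, wget asks the server for the file's modification time and downloads only when the remote copy is newer. A sketch with a placeholder URL, which could be run from cron to pick up changed pages:

  # download only if the remote file is newer than the local copy
  wget -N http://www.example.com/page.html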
- Original Message -
From: Benjamin Scaplen [EMAIL PROTECTED]
To: wget@sunsite.dk
Sent: Wednesday, January 25, 2006 7:06 AM
Subject: Downloading grib files
Hi,
I am using Wget version 1.9.1.
I am downloading weather information.
Is there a work-around for this?
Thanks,
-JD
Jonathan DeGumbia
Systems Engineer, Omitron Inc.
[EMAIL PROTECTED]
301.474.1700 x611
I think you should be using a tool like linklint (www.linklint.org), not wget.
Jonathan
- Original Message -
From: Jean-Marc MOLINA [EMAIL PROTECTED]
To: wget@sunsite.dk
Sent: Wednesday, November 02, 2005 12:56 PM
Subject: Re: Getting list of files
Shahram Bakhtiari wrote:
I
easier to deal with.
Is anyone else interested in this idea? Is it feasible?
Jonathan
Linux kernel release: 2.4.20-20.8, kernel version: Mon Aug 18 14:39:22 EDT 2003, machine type: i686, processor: athlon.
Any and all ideas gratefully accepted!
Jonathan
On Tue, 22 Feb 2005 19:09:11 +, Bryan [EMAIL PROTECTED] wrote:
Automize cannot access PKI-enabled websites, as it does not support
that functionality. I was reading up about Wget, and I saw that you
can send a specific cookie back to a website. This is almost exactly
what I am looking for.
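For reference, the two usual ways of sending a cookie with wget, shown with placeholder values:

  # send a fixed cookie in the request headers
  wget --no-cookies --header "Cookie: session=abc123" https://www.example.com/page

  # or replay a browser-exported cookie file (Netscape cookies.txt format)
  wget --load-cookies cookies.txt https://www.example.com/page

Depending on what "PKI-enabled" means here, the site may also require a client certificate rather than a cookie; newer wget versions can supply one with --certificate.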
Sorry for the Dual post Steven, just realised I hadn't sent it to the list.
On Sat, 19 Feb 2005 11:26:16 +, Jonathan Share [EMAIL PROTECTED] wrote:
On Fri, 18 Feb 2005 22:43:50 -0600 (CST), Steven M. Schweda
[EMAIL PROTECTED] wrote:
In case it might be useful, I've included the -d output.
On Sat, 19 Feb 2005 15:54:42 -0500, Post, Mark K [EMAIL PROTECTED] wrote:
That does seem a bit odd. I did a wget www.exeter-airport.co.uk command
using 1.9.1, and got this result:
wget www.exeter-airport.co.uk
--15:52:05-- http://www.exeter-airport.co.uk/
=> `index.html'
On Sat, 19 Feb 2005 18:06:37 -0500, Post, Mark K [EMAIL PROTECTED] wrote:
I meant your users' problem seemed a bit odd, not the fact that my attempt
worked.
Sorry, it's getting late over here; I misread your message. It really is time I went to bed.
Thanks again anyway.
Jon
Mark Post
Hi,
I'm using wget to assist in debugging an intermittent connection
problem to a particular server. I cannot reproduce the problem myself
and the users experiencing the problem receive correct output from
nslookup so I have ruled out DNS problems.
As a second phase I got them to run wget with
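The truncated line above presumably refers to the debug flag; the usual incantation, with a placeholder URL, is:

  # -d prints debug output (headers, DNS lookups, connection attempts);
  # -o sends everything to a log file the users can mail back
  wget -d -o wget-debug.log http://server.example.com/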
On Thu, 09 Sep 2004 23:44:09 -0400, Leonid [EMAIL PROTECTED] wrote:
Jonathan,
There exists more than one LFS patch for wget. For Linux, Sun and
HP-UX one can use http://software.lpetrov.net/wget-LFS/
I shall have to try said patches, and then find a large file to download. But I am
Wget doesn't support files over 2 GB. It is a known issue that is brought up a lot. Please patch it if you're able; so far no fix has been forthcoming.
Cheers,
Jonathan
- Original Message -
From: david coornaert [EMAIL PROTECTED]
Date: Thu, 09 Sep 2004 12:41:31 +0200
Subject: 2 giga file size
to the author of this: http://software.lpetrov.net/wget-LFS/
--
Jonathan
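As a quick check before patching: large-file support was merged into wget 1.10, so the build's version number says whether the patch is needed at all.

  # 1.10 and later include large-file (>2 GB) support;
  # anything older needs an LFS patch such as the one linked above
  wget --version | head -n 1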
Subject: Stratus VOS support
Any thoughts of adding support for Stratus VOS file structures?
Jonathan D Grubb [EMAIL PROTECTED]
WebMD Envoy - Medical Real-Time
Ah, that was it. I didn't even think to check that. Now I've got 1.8.2 and it's OK.
Thanks
Jonathan Buhacoff
- Original Message -
From: Herold Heiko
To: 'Jonathan Buhacoff'; [EMAIL PROTECTED]
Sent: Monday, 17 February 2003 03:00
Subject: RE: retrieving stylesheets
be added? Do you want me to do it myself and submit a
patch to this list?
Please CC your replies to [EMAIL PROTECTED] because I'm not on the
list.
Thanks,
Jonathan Buhacoff
Hi All,
Sorry for the off-topic and newbie question, but is it possible, having downloaded a site using wget to a local directory, to then at a later stage run just the 'proxy' part of the process, reading the site contents from the local directory and not from the site, to enable
I recently successfully compiled and installed wget 1.8.1 on my box.
The new OS and architecture reads as follows: Mac OS X
(powerpc-apple-darwin5.2)
Jonathan
Hello,
I have a suggestion for the wget program. Would it be possible to have a command line option that, when invoked, would tell wget to preserve the modification date when transferring the file? The modification time would then reflect the last time the file was modified on the remote server.
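If memory serves, wget already behaves this way: the saved file's mtime is set from the server's Last-Modified (or FTP directory listing) time, and -N builds on that to skip files that have not changed. A sketch with a placeholder URL:

  # the local mtime is set from the server's timestamp;
  # -N additionally skips the download when the local copy is current
  wget -N http://www.example.com/report.pdf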