Re: gzip question

2007-12-19 Thread Steven M. Schweda
From: "Christopher Eastwood" > wget --header=3D'Accept-Encoding: gzip, deflate' http://{gzippedcontent} "Doctor, it hurts when I do this." "Don't do that." What does it do without "--header='Accept-Encoding: gzip, deflate'"? > [...] (Wget version? OS? Example with transcript?) St

RE: gzip question

2007-12-19 Thread Christopher Eastwood
wget --header='Accept-Encoding: gzip, deflate' http://{gzippedcontent} -Original Message- From: Steven M. Schweda [mailto:[EMAIL PROTECTED] Sent: Wednesday, December 19, 2007 2:57 PM To: WGET@sunsite.dk Cc: Christopher Eastwood Subject: Re: gzip question From: Christophe

gzip question

2007-12-19 Thread Christopher Eastwood
Does wget automatically decompress gzip compressed files? Is there a way to get wget NOT to decompress gzip compressed files, but to download them as the gzipped file? Thanks, Christopher

Re: gzip question

2007-12-19 Thread Steven M. Schweda
From: Christopher Eastwood > Does wget automatically decompress gzip compressed files? I don't think so. Have you any evidence that it does this? (Wget version? OS? Example with transcript?) > Is there a > way to get wget NOT to decompress gzip compressed files, but to download > them a
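
A quick way to check the behavior under discussion, assuming a server that honors the header (URL illustrative): wget of this era stores the response body verbatim, so what lands on disk is the raw gzip stream.

    # Request the compressed representation explicitly and inspect the result:
    wget --header='Accept-Encoding: gzip' -O page.gz http://example.com/page.html
    file page.gz     # should report "gzip compressed data"
    gunzip page.gz   # decompress by hand if the original is wanted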

Re: Question about spidering

2007-12-12 Thread Micah Cowan
-BEGIN PGP SIGNED MESSAGE- Hash: SHA1 Micah Cowan wrote: > Srinivasan Palaniappan wrote: >> wget -r l5 --save-headers --no-check-certificate https://www.mystie.com > ^^ > -r doesn't take an argument. Perhaps you wanted a -l before the 15? Or a - before the l5. Curse the visual
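
The corrected form of the command quoted above (hostname as posted; the only change is the missing dash in front of the l):

    wget -r -l5 --save-headers --no-check-certificate https://www.mystie.com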

Re: Question about spidering

2007-12-12 Thread Micah Cowan
-BEGIN PGP SIGNED MESSAGE- Hash: SHA1 Srinivasan Palaniappan wrote: > I am using WGET version 1.10.2, and trying to crawl through a secured > site (that we are developing for our customer) I noticed two things. > WGET is not downloading all the binaries in the website. It downloads > abo

Question about spidering

2007-12-11 Thread Srinivasan Palaniappan
Hi, I am using WGET version 1.10.2, and trying to crawl through a secured site (that we are developing for our customer) I noticed two things. WGET is not downloading all the binaries in the website. It downloads about 30% of it then skips the rest of the documents. But I don't see any log fil

Re: Content disposition question

2007-12-10 Thread Hrvoje Niksic
Micah Cowan <[EMAIL PROTECTED]> writes: >> I thought the code was refactored to determine the file name after >> the headers arrive. It certainly looks that way by the output it >> prints: >> >> {mulj}[~]$ wget www.cnn.com >> [...] >> HTTP request sent, awaiting response... 200 OK >> Length: uns

Re: Content disposition question

2007-12-10 Thread Micah Cowan
-BEGIN PGP SIGNED MESSAGE- Hash: SHA1 Hrvoje Niksic wrote: > Micah Cowan <[EMAIL PROTECTED]> writes: > >> Actually, the reason it is not enabled by default is that (1) it is >> broken in some respects that need addressing, and (2) as it is currently >> implemented, it involves a significa

Re: Content disposition question

2007-12-10 Thread Hrvoje Niksic
Micah Cowan <[EMAIL PROTECTED]> writes: > Actually, the reason it is not enabled by default is that (1) it is > broken in some respects that need addressing, and (2) as it is currently > implemented, it involves a significant amount of extra traffic, > regardless of whether the remote end actually

Re: Bugs! [Re: Question re server actions]

2007-12-05 Thread Micah Cowan
-BEGIN PGP SIGNED MESSAGE- Hash: SHA1 Micah Cowan wrote: > Alan Thomas wrote: >> Thanks. I unzipped those binaries, but I still have a problem. . . . > >> I changed the wget command to: > >> wget --recursive --level=20 --append-output=wget_log.txt -econtent_disposition=on

Re: Content disposition question

2007-12-03 Thread Micah Cowan
-BEGIN PGP SIGNED MESSAGE- Hash: SHA1 Matthias Vill wrote: > Hi, > > we know this. This was just recently discussed on the mailing list and I > agree with you. > But there are two arguments why this is not default: > a) It's a quite new feature for wget and therefore would break > compatib

Re: Content disposition question

2007-12-03 Thread Matthias Vill
Hi, we know this. This was just recently discussed on the mailing list and I agree with you. But there are two arguments why this is not default: a) It's a quite new feature for wget and therefore would break compatibility with prior versions and any "old" script would need to be rewritten. b) It's

Content disposition question

2007-12-03 Thread Vladimir Niksic
Hi! I have noticed that wget doesn't automatically use the option '--content-disposition'. So what happens is when you download something from a site that uses content disposition, the resulting file on the filesystem is not what it should be. For example, when downloading an Ubuntu torrent from
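
A sketch of how the feature was switched on per run in this era (the thread's later messages use the wgetrc spelling; URL illustrative):

    # via a wgetrc command on the command line:
    wget -e content_disposition=on http://example.com/download.torrent
    # or, in builds that expose the long option:
    wget --content-disposition http://example.com/download.torrent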

Bugs! [Re: Question re server actions]

2007-11-06 Thread Micah Cowan
-BEGIN PGP SIGNED MESSAGE- Hash: SHA256 Alan Thomas wrote: > Thanks. I unzipped those binaries, but I still have a problem. . . . > > I changed the wget command to: > > wget --recursive --level=20 --append-output=wget_log.txt -econtent_disposition=on --accept=pdf,doc,ppt

Re: Question re server actions

2007-11-06 Thread Micah Cowan
-BEGIN PGP SIGNED MESSAGE- Hash: SHA256 Alan Thomas wrote: > I admittedly do not know much about web server responses, and I > have a question about why wget did not retrieve a document. . . . > >I executed the following wget command: > > wget --recursive -

Re: wget -o question

2007-10-01 Thread Saso Tomat
Micah Cowan <[EMAIL PROTECTED]> writes: > > > Jim Wright wrote: > > My usage is counter to your assumptions below.[...] > > A change as proposed here is very simple, but > > would be VERY useful. > > Okay. Guess I'm sold, then. :D > > -- > Micah J. Cowan > Programmer, musician, typesetting enthusiast

Re: wget -o question

2007-10-01 Thread Micah Cowan
-BEGIN PGP SIGNED MESSAGE- Hash: SHA256 Jim Wright wrote: > My usage is counter to your assumptions below.[...] > A change as proposed here is very simple, but > would be VERY useful. Okay. Guess I'm sold, then. :D - -- Micah J. Cowan Programmer, musician, typesetting enthusiast, gamer..

Re: wget -o question

2007-10-01 Thread Jim Wright
My usage is counter to your assumptions below. I run every hour to connect to 1,000 instruments (1,500 in 12 months) dispersed over the entire western US and Alaska. I append log messages for all runs from a day to a single file. This is an important debugging tool for us. We have mostly VSAT an

Re: wget -o question

2007-09-30 Thread Micah Cowan
-BEGIN PGP SIGNED MESSAGE- Hash: SHA256 Steven M. Schweda wrote: >> But, since any specific transaction is unlikely to take such a long >> time, the spread of the run is easily deduced by the start and end >> times, and, in the unlikely event of multiple days, counting time >> regressions

Re: wget -o question

2007-09-30 Thread Steven M. Schweda
From: Micah Cowan > But, since any specific transaction is unlikely to take such a long > time, the spread of the run is easily deduced by the start and end > times, and, in the unlikely event of multiple days, counting time > regressions. And if the pages in books were all numbered 1, 2, 3, 4

Re: wget -o question

2007-09-30 Thread Micah Cowan
-BEGIN PGP SIGNED MESSAGE- Hash: SHA256 Steven M. Schweda wrote: > From: Micah Cowan > >>> - tms = time_str (NULL); >>> + tms = datetime_str (NULL); > >> Does anyone think there's any general usefulness for this sort of >> thing? > >I don't care much, but it seems like a f

Re: wget -o question

2007-09-30 Thread Steven M. Schweda
From: Micah Cowan > > - tms = time_str (NULL); > > + tms = datetime_str (NULL); > Does anyone think there's any general usefulness for this sort of > thing? I don't care much, but it seems like a fairly harmless change with some benefit. Of course, I use an OS where a directory lis

Re: wget -o question

2007-09-29 Thread Micah Cowan
-BEGIN PGP SIGNED MESSAGE- Hash: SHA256 Steven M. Schweda wrote: > ALP $ gdiff -u http.c;5 http.c;6 > --- http.c;5  2005-10-13 12:36:21.0 -0500 > +++ http.c;6  2007-09-30 01:10:45.0 -0500 > @@ -2177,7 +2177,7 @@ >   ++count; >   sleep_between_retrievals (count

Re: wget -o question

2007-09-29 Thread Steven M. Schweda
From: Saso Tomat > I have a question regarding the -o switch: The messages are the same, whether or not you use "-o" to send them to a file. > currently I see that log file contains timestamp ONLY. Is it possible > to tell wget to include date too? Assuming that you

wget -o question

2007-09-27 Thread Saso Tomat
Hi all, I have a question regarding the -o switch: currently I see that log file contains timestamp ONLY. Is it possible to tell wget to include date too? Thank you. Saso
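
A workaround sketch while the log carries times only: stamp each line externally (gawk's strftime; URL and file names illustrative). wget writes its log to stderr when -o is not given, so the stream can be piped.

    wget -nv http://example.com/file 2>&1 | gawk '{ print strftime("%Y-%m-%d"), $0 }' >> wget_day.log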

Re: Question

2007-08-07 Thread Micah Cowan
-BEGIN PGP SIGNED MESSAGE- Hash: SHA256 It seems to me that you can simply start a recursive, non-parent-traversing fetch (-r -np) of the page with the links, and you'll end up with the PDF files you want (plus anything else linked to on that page). If the PDF files are stored in different

Re: Question

2007-08-07 Thread Andra Isan
I have a paper proceeding and I want to follow a link of that proceeding and go to a paper link, then follow the paper link and go to an author link, and then follow the author link, which leads to all the papers that the author has written. I want to place all these pdf files (papers of one author) into

Re: Question

2007-08-07 Thread Micah Cowan
-BEGIN PGP SIGNED MESSAGE- Hash: SHA256 Andra Isan wrote: > I am wondering if there is a way that I can download pdf files and > organize them in a directory with Wget or should I write a code for that? > > If I need to write a code for that, would you please let me know if > there is an

Question

2007-08-07 Thread Andra Isan
Hi All, I am wondering if there is a way that I can download pdf files and organize them in a directory with Wget or should I write a code for that? If I need to write a code for that, would you please let me know if there is any sample code available? Thanks in advance

Re: Question about the frame

2007-07-02 Thread Micah Cowan
-BEGIN PGP SIGNED MESSAGE- Hash: SHA256 Ben Galin wrote: > > On Jun 26, 2007, at 11:50 PM, Micah Cowan wrote: > >> After running >> >> $ wget -H -k -p http://www.fdoxnews.com/ >> >> It downloaded all of the relevant files. However, the results were still >> not viewable until I edited

Re: Question about the frame

2007-07-02 Thread Ben Galin
On Jun 26, 2007, at 11:50 PM, Micah Cowan wrote: After running $ wget -H -k -p http://www.fdoxnews.com/ It downloaded all of the relevant files. However, the results were still not viewable until I edited the link in www.fdoxnews.com/index.html, replacing the "?" with "%3F" ("index.mas%3

Re: Question about the frame

2007-06-26 Thread Micah Cowan
has frames) the above command doesn't retrieve the html pages of >> > the urls in the frames. >>> Thanks Micah for your reply! > > The main url is: > http://www.fdoxnews.com/ > The frame url is : > http://searchportal.information.com/a_id=2519&domainname=referer

Re: Question about the frame

2007-06-26 Thread Micah Cowan
Mishari Al-Mishari wrote: > Hi, > I am using the following command: > wget -p url > the url has frames. > the url retrieves a page that has a set of frames. But wget doesn't > retrieve the html pages of the frames urls. Is there any bug or am I > missing something? Works fine for me. In fact, if the

Question about the frame

2007-06-26 Thread Mishari Al-Mishari
Hi, I am using the following command: wget -p url the url has frames. the url retrieves a page that has a set of frames. But wget doesn't retrieve the html pages of the frames urls. Is there any bug or am I missing something? Also the command wget -r -l 2 url (url has frames) the above command does

RE: Question on wget upload/dload usage

2007-06-18 Thread Tony Lewis
Joe Kopra wrote: > The wget statement looks like: > > wget --post-file=serverdata.mup -o postlog -O survey.html > http://www14.software.ibm.com/webapp/set2/mds/mds --post-file does not work the way you want it to; it expects a text file that contains something like this:

Question on wget upload/dload usage

2007-06-18 Thread Joe Kopra
I believe I may be using wget incorrectly. I am trying to upload .mup files to the IBM site: http://www14.software.ibm.com/webapp/set2/mds/mds The purpose of this exercise is to send the invscout output (the .mup) to IBM and get back a .html file that is a formatted report of what mic

RE: simple wget question

2007-05-13 Thread Willener, Pat
Sorry, I didn't see that Steven has already answered the question. -Original Message- From: Steven M. Schweda [mailto:[EMAIL PROTECTED] Sent: Saturday, May 12, 2007 10:05 To: WGET@sunsite.dk Cc: [EMAIL PROTECTED] Subject: Re: simple wget question From: R Kimber > What I

RE: simple wget question

2007-05-13 Thread Willener, Pat
This is something that is not supported by the http protocol. If you access the site via ftp://..., then you can use wildcards like *.pdf -Original Message- From: R Kimber [mailto:[EMAIL PROTECTED] Sent: Saturday, May 12, 2007 06:43 To: wget@sunsite.dk Subject: Re: simple wget question

Re: simple wget question

2007-05-11 Thread Steven M. Schweda
From: R Kimber > What I'm trying to download is what I might express as: > > http://www.stirling.gov.uk/*.pdf At last. > but I guess that's not possible. In general, it's not. FTP servers often support wildcards. HTTP servers do not. Generally, an HTTP server will not give you a list
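
The usual approximation, since HTTP offers no wildcards (depth illustrative): crawl the page that links the PDFs and keep only that extension.

    wget -r -l1 -np -A '*.pdf' http://www.stirling.gov.uk/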

Re: simple wget question

2007-05-11 Thread R Kimber
On Thu, 10 May 2007 16:04:41 -0500 (CDT) Steven M. Schweda wrote: > From: R Kimber > > > Yes there's a web page. I usually know what I want. > >There's a difference between knowing what you want and being able > to describe what you want so that it makes sense to someone who does > not know

Re: simple wget question

2007-05-10 Thread Steven M. Schweda
From: R Kimber > Yes there's a web page. I usually know what I want. There's a difference between knowing what you want and being able to describe what you want so that it makes sense to someone who does not know what you want. > But won't a recursive get get more than just those files? Inde

Re: simple wget question

2007-05-10 Thread R Kimber
. Not necessarily what I don't want. I did look at the man page, and came to the tentative conclusion that there wasn't a way (or at least an efficient way) of doing it, which is why I asked the question. - Richard -- Richard Kimber http://www.psr.keele.ac.uk/

Re: simple wget question

2007-05-06 Thread Steven M. Schweda
From: R Kimber > If I have a series of files such as > > http://www.stirling.gov.uk/elections07abcd.pdf > http://www.stirling.gov.uk/elections07efg.pdf > http://www.stirling.gov.uk/elections07gfead.pdf > > etc > > is there a single wget command that would download them all, or would I > need t

simple wget question

2007-05-06 Thread R Kimber
If I have a series of files such as http://www.stirling.gov.uk/elections07abcd.pdf http://www.stirling.gov.uk/elections07efg.pdf http://www.stirling.gov.uk/elections07gfead.pdf etc is there a single wget command that would download them all, or would I need to do each one separately? Thanks,

Re: Question re web link conversions

2007-03-13 Thread Alan Thomas
do with the characters in the filename, which you mentioned. Thanks, Alan - Original Message - From: "Steven M. Schweda" <[EMAIL PROTECTED]> To: Cc: <[EMAIL PROTECTED]> Sent: Tuesday, March 13, 2007 1:23 AM Subject: Re: Question re web link conversions > Fr

Re: Question re web link conversions

2007-03-12 Thread Steven M. Schweda
From: Alan Thomas As usual, "wget" without a version does not adequately describe the "wget" program you're using, "Internet Explorer" without a version does not adequately describe the Web browser you're using, and I can only assume that you're doing all this on some version or other of Window

Question re web link conversions

2007-03-12 Thread Alan Thomas
I am using the wget command below to get a page from the U.S. Patent Office. This works fine. However, when I open the resulting local file with Internet Explorer (IE), click a link in the file (go to another web site) and then click Back, it goes back to the real web address (http:...)

Re: Newbie Question - DNS Failure

2007-01-22 Thread Steven M. Schweda
From: Terry Babbey > > Built how? > Installed using swinstall How the depot contents were built probably matters more. > >Second guess: If DNS works for everyone else, I'd try building wget > > (preferably a current version, 1.10.2) from the source, and see if that > > makes any differen
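
The suggested from-source route, in its typical shape (version from the thread; install prefix illustrative):

    gunzip -c wget-1.10.2.tar.gz | tar xf -
    cd wget-1.10.2
    ./configure --prefix=/usr/local
    make && make install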

RE: Newbie Question - DNS Failure

2007-01-22 Thread Terry Babbey
> I installed wget on a HP-UX box using the depot package. Which depot package? (Anyone can make a depot package.) Depot package came from http://hpux.connect.org.uk/hppd/hpux/Gnu/wget-1.10.2/ Which wget version ("wget -V")? 1.10.2 Built how? Installed using swinstall Running on which HP-UX

Re: Newbie Question - DNS Failure

2007-01-20 Thread Steven M. Schweda
From: Terry Babbey > I installed wget on a HP-UX box using the depot package. Great. Which depot package? (Anyone can make a depot package.) Which wget version ("wget -V")? Built how? Running on which HP-UX system type? OS version? > Resolving www.lambton.on.ca... failed: host nor servi

Newbie Question - DNS Failure

2007-01-19 Thread Terry Babbey
I installed wget on a HP-UX box using the depot package. Now when I run wget it will not resolve DNS queries. wget http://192.139.190.140 works. wget http://www.lambton.on.ca fails with the following error: # wget http://www.lambt

Re: Question!

2006-11-07 Thread Assistenza Deltasys
At 2006-11-07 02:57, Yan Qing Chen wrote: Hi wget, I found a problem when I try to mirror an ftp site using wget. I use it with the "-m -b" parameters. Some files are recopied on every mirror run. How should I configure a mirror site? Thanks & Best Regards, Hi, when the modified date reported b

Question!

2006-11-06 Thread Yan Qing Chen
Hi wget, I found a problem when I try to mirror an ftp site using wget. I use it with the "-m -b" parameters. Some files are recopied on every mirror run. How should I configure a mirror site? Thanks & Best Regards, Yan Qing Chen(陈延庆) Tivoli China Development(IBM CSDL) Internet Email

Re: wget question (connect multiple times)

2006-10-18 Thread Hrvoje Niksic
"Tony Lewis" <[EMAIL PROTECTED]> writes: > A) This is the list for reporting bugs. Questions should go to > wget@sunsite.dk For what it's worth, [EMAIL PROTECTED] is simply redirected to [EMAIL PROTECTED] It is still useful to have a separate address for bug reports, for at least two reasons. O

RE: Err, which list? [was: Re: wget question (connect multiple times)]

2006-10-18 Thread Willener, Pat
: Re: wget question (connect multiple times)] Tony Lewis wrote: > A) This is the list for reporting bugs. Questions should go to > wget@sunsite.dk Err, I posted Qs to wget@sunsite.dk and they come via this list - is there a mix-up here? Perhaps why I never get any answers;) (If there's a

Err, which list? [was: Re: wget question (connect multiple times)]

2006-10-18 Thread Morgan Read
holding back on giving me some fantastic bit of info that'll make my life for ever better because this is a bug list and not a question list - please feel free to email me off-list:) M. -- Morgan Read NEW ZEALAND <mailto:mstuffATreadDOTorgDOTnz> fedora: Freedom Forever! http://fedorapro

RE: wget question (connect multiple times)

2006-10-17 Thread Doug Kaufman
On Tue, 17 Oct 2006, Tony Lewis wrote: > A) This is the list for reporting bugs. Questions should go to > wget@sunsite.dk I had always understood that "bug-wget" was just an alias for the regular wget mailing list. Has this changed recently? Doug -- Doug Kaufman Internet: [

Re: wget question (connect multiple times)

2006-10-17 Thread t u
> -Original Message- From: t u [mailto:[EMAIL PROTECTED] > Sent: Tuesday, October 17, 2006 3:50 PM To: [EMAIL PROTECTED] Subject: > wget question (connect multiple times) > > hi, I hope it is okay to drop a question here. > > I recently found that if wget downloads

RE: wget question (connect multiple times)

2006-10-17 Thread Tony Lewis
mailto:[EMAIL PROTECTED] Sent: Tuesday, October 17, 2006 3:50 PM To: [EMAIL PROTECTED] Subject: wget question (connect multiple times) -BEGIN PGP SIGNED MESSAGE- Hash: SHA1 hi, I hope it is okay to drop a question here. I recently found that if wget downloads one file, my download speed wi

wget question (connect multiple times)

2006-10-17 Thread t u
-BEGIN PGP SIGNED MESSAGE- Hash: SHA1 hi, I hope it is okay to drop a question here. I recently found that if wget downloads one file, my download speed will be Y, but if wget downloads two separate files (from the same server, doesn't matter), the download speed for each of the
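
The effect being described, reproduced directly (URLs illustrative): two wget processes, each with its own connection, usually sum to more total throughput than one.

    wget http://example.com/a.bin &
    wget http://example.com/b.bin &
    wait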

Re: Question / Suggestion for wget

2006-10-13 Thread Steven M. Schweda
From: Mitch Silverstein > If -O output file and -N are both specified [...] When "-O foo" is specified, it's not a suggestion for a file name to be used later if needed. Instead, wget opens the output file ("foo") before it does anything else. Thus, it's always a newly created file, and henc
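
A workaround consistent with this explanation, using the URL from the thread: let -N compare against the remote name, then copy.

    # -O would create its file up front and defeat the timestamp check:
    wget -N http://www.gnu.org/graphics/gnu-head-banner.png
    cp gnu-head-banner.png foo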

Question / Suggestion for wget

2006-10-13 Thread Mitch Silverstein
If -O output file and -N are both specified, it seems like there should be some mode where the tests for noclobber apply to the output file, not the filename that exists on the remote machine. So, if I run # wget -N http://www.gnu.org/graphics/gnu-head-banner.png -O foo and then # wget -N http:

RE: question with wget 1.10.2 for windows

2006-08-17 Thread Sandhu, Ranjit
Thursday, August 17, 2006 3:46 PM To: [EMAIL PROTECTED] Subject: question with wget 1.10.2 for windows Thx for the program first off. This might be a big help for me. What I’m trying to do is pull .aspx pages off of a company's website as .html files and save them locally. I also need the images and c

question with wget 1.10.2 for windows

2006-08-17 Thread Savage, Ken
Thx for the program first off. This might be a big help for me. What I’m trying to do is pull .aspx pages off of a company's website as .html files and save them locally. I also need the images and css to be converted for local use also. I can’t figure out the proper command to do this. Al
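
A recipe that matches this request, as a sketch (URL and depth illustrative): -p pulls page requisites such as images and CSS, -k rewrites links for local viewing, and -E saves the .aspx pages with an .html extension.

    wget -r -l2 -p -k -E http://www.example.com/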

Suggestion/Question

2006-02-26 Thread Markus Raab
Hallo, yesterday I came across wget and I find it a very useful program. I am mirroring a big site, more precisely a forum. Because it is a forum, under each post you have the action "quote". Because that forum has 20.000 posts it would download everything with "action=quote", so I rejected it with
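
One plausible shape for that rejection (hostname illustrative; in wget of this era the -R patterns are matched against the local file name, which includes the query string):

    wget -r -R '*action=quote*' http://forum.example.com/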

Beginners question

2006-01-02 Thread Mikael Niklasson
Hi all, I need some help/advice. My problem is the following: I want to be able to download a local copy of a website, or rather a specific page down in a web-site structure and all pages above it in a direct line, i.e.: Page \ Subpage \ Subsubpage. I wonder if it is possible to m

Re: wget output question

2005-12-01 Thread Jon Berry
Steven M. Schweda wrote: I do get the full Internet address in the download if I use -k or --convert-links, but not if I use it with -O Ah. Right you are. Looks like a bug to me. Is the developer available to confirm this? Without looking at the code, I'd say that someone is

Re: wget output question

2005-12-01 Thread Steven M. Schweda
> I do get the full Internet address in the download if I use -k or > --convert-links, but not if I use it with -O Ah. Right you are. Looks like a bug to me. Wget/1.10.2a1 (VMS Alpha V7.3-2) says this without "-O": 08:53:42 (51.00 MB/s) - `index.html' saved [2674] Converting index.html...

Re: wget output question

2005-11-30 Thread Jon Berry
Steven M. Schweda wrote: Not anything about converting relative links to absolute. I don't see an option to do this automatically. From the wget man page for --convert-links: ...if a linked file was downloaded, the link will refer to its local name; if it was not downloaded, the link wi

Re: wget output question

2005-11-30 Thread Steven M. Schweda
> 1. retrieve a single page That worked. > 2. convert the links in the retrieved page to their full, absolute > addresses. My "wget -h" output (Wget 1.10.2a1) says: -k, --convert-links make links in downloaded HTML point to local files. Wget 1.9.1e says: -k, --convert-links

wget output question

2005-11-30 Thread Jon Berry
I'm trying to use wget to do the following: 1. retrieve a single page 2. convert the links in the retrieved page to their full, absolute addresses. 3. save the page with a file name that I specify I thought this would do it: wget -k -O test.html http://www.google.com However, it doesn't c
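
A workaround for the interaction reported here: drop -O so the link conversion runs on the file wget names itself, then rename.

    wget -k http://www.google.com
    mv index.html test.html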

Wget Log Question

2005-10-20 Thread dxu
I am trying to use Wget to get all the web pages of the IP Phones. If I use default "verbose" log option, the log gives me too much unused information: wget -t 1 -i phones_104.txt -O test.txt -o log.txt If I add -nv option, the log files looks fine: 20:14:23 URL:http://10.104.110.10/NetworkConfigur

Re: A mirror question

2005-09-12 Thread Mauro Tortonesi
At 10:18 on Thursday, 1 September 2005, Pär-Ola Nilsson wrote: > Hi! > > Is it possible to get wget to delete files that have disappeared at the > remote ftp-host during --mirror? not at the moment, but we might consider adding it to 2.0. -- Aequam memento rebus in arduis servare mentem... Ma

A mirror question

2005-09-01 Thread Pär-Ola Nilsson
Hi! Is it possible to get wget to delete files that have disappeared at the remote ftp-host during --mirror? ftp-mirror is able to do this but I have to mirror a site which ftp-mirror barfs at. thanks Pär-Ola -- Pär-Ola Nilsson

Re: wget Mailing List question

2005-08-26 Thread Hrvoje Niksic
"Jonathan" <[EMAIL PROTECTED]> writes: > Would it be possible (and is anyone else interested) to have the > subject line of messages posted to this list prefixed with '[wget]'? I am against munging subject lines of mail messages. The mailing list software provides headers such as `Mailing-List'

Re: wget Mailing List question

2005-08-26 Thread Daniel Stenberg
On Fri, 26 Aug 2005, Jonathan wrote: Would it be possible (and is anyone else interested) to have the subject line of messages posted to this list prefixed with '[wget]'? Please don't. Subject real estate is precious and limited already as it is. I find subject prefixes highly disturbing.

wget Mailing List question

2005-08-26 Thread Jonathan
Would it be possible (and is anyone else interested) to have the subject line of messages posted to this list prefixed with '[wget]'? I belong to several development mailing lists that utilize this feature so that distributed messages do not get removed by spam filters, or deleted by recipient

Re: Question

2005-08-09 Thread Hrvoje Niksic
Mauro Tortonesi <[EMAIL PROTECTED]> writes: > oops, my fault. i was in a hurry and i misunderstood what > Abdurrahman was asking. what i wanted to say is that we talked about > supporting the same html file download mode of firefox, in which you > save all the related files in a directory with the

Re: Question

2005-08-09 Thread Mauro Tortonesi
On Tuesday 09 August 2005 04:37 am, Hrvoje Niksic wrote: > Mauro Tortonesi <[EMAIL PROTECTED]> writes: > > On Saturday 09 July 2005 10:34 am, Abdurrahman ÇARKACIOĞLU wrote: > >> MS Internet Explorer can save a web page as a whole. That means all the > >> images, > >> > >> Tables, can be saved as

Re: Question

2005-08-09 Thread Frank McCown
While the MHT format is not extremely popular yet, I'm betting it will continue to grow in popularity. It encapsulates an entire web page and graphics, javascripts, style sheets, etc into a single text file. This makes it much easier to email and store. See RFC 2557 for more info: http://www

Re: Question

2005-08-09 Thread Hrvoje Niksic
Mauro Tortonesi <[EMAIL PROTECTED]> writes: > On Saturday 09 July 2005 10:34 am, Abdurrahman ÇARKACIOĞLU wrote: >> MS Internet Explorer can save a web page as a whole. That means all the >> images, >> >> Tables, can be saved as a file. It is called "Web Archive, single file >> (*.mht)". >> >

Re: Question

2005-08-08 Thread Mauro Tortonesi
On Saturday 09 July 2005 10:34 am, Abdurrahman ÇARKACIOĞLU wrote: > MS Internet Explorer can save a web page as a whole. That means all the > images, > > Tables, can be saved as a file. It is called "Web Archive, single file > (*.mht)". > > Is this possible with wget? not at the moment, but it

Question

2005-07-09 Thread Abdurrahman ÇARKACIOĞLU
MS Internet Explorer can save a web page as a whole. That means all the images, Tables, can be saved as a file. It is called “Web Archive, single file (*.mht)”. Is this possible with wget?

Re: question

2005-05-20 Thread Hrvoje Niksic
Василевский Сергей <[EMAIL PROTECTED]> writes: > I use wget 1.9.1 > In IE6.0 page load OK, > but wget return (It's a bug or timeout or ...?) Thanks for the report. The reported timeout might or might not be incorrect. Wget 1.9.1 on Windows has a known bug of misrepresenting error codes (this h

question

2005-05-20 Thread Василевский Сергей
I use wget 1.9.1 In IE6.0 page load OK, but wget return (It's a bug or timeout or ...?) 16:59:59 (9.17 KB/s) - Read error at byte 31472 (Operation timed out).Retrying. --16:59:59-- http://www.nirgos.com/d.htm (try: 2) => `/p5/poisk/spider/resource/www.nirgos.com/d.htm' Connecting to www.nirgo
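
Options that may soften this failure mode, all present in wget of this vintage (values illustrative): resume with -c, retry more, and lengthen the timeout.

    wget -c --tries=5 --timeout=60 http://www.nirgos.com/d.htm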

Re: wget Question/Suggestion

2005-05-20 Thread Hrvoje Niksic
Mark Anderson <[EMAIL PROTECTED]> writes: > Is there an option, or could you add one if there isn't, to specify > that I want wget to write the downloaded html file, or whatever, to > stdout so I can pipe it into some filters in a script? Yes, use `-O -'.
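
For instance (URL and filter illustrative):

    wget -q -O - http://example.com/page.html | grep -i '<title>'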

wget Question/Suggestion

2005-05-20 Thread Mark Anderson
Is there an option, or could you add one if there isn't, to specify that I want wget to write the downloaded html file, or whatever, to stdout so I can pipe it into some filters in a script?

Re: [unclassified] Re: newbie question

2005-04-14 Thread Alan Thomas
10:12 AM Subject: [unclassified] Re: newbie question > Alan, > > You could try something like this > > wget -r -d -l1 -H -t1 -nd -N -np -A pdf > > On Wed, 13 Apr 2005, Alan Thomas wrote: > > > Date: Wed, 13 Apr 2005 16:02:40 -0400 > > From: Alan Thomas <

Re: newbie question

2005-04-14 Thread Jens Rösner
Hi! Yes, I see now, I misread Alan's original post. I thought he would not even be able to download the single .pdf. Don't know why, as he clearly said it works getting a single pdf. Sorry for the confusion! Jens > "Tony Lewis" <[EMAIL PROTECTED]> writes: > > > PS) Jens was mistaken when he

Re: newbie question

2005-04-14 Thread Hrvoje Niksic
"Tony Lewis" <[EMAIL PROTECTED]> writes: > PS) Jens was mistaken when he said that https requires you to log > into the server. Some servers may require authentication before > returning information over a secure (https) channel, but that is not > a given. That is true. HTTPS provides encrypted

Re: newbie question

2005-04-14 Thread Hrvoje Niksic
"Alan Thomas" <[EMAIL PROTECTED]> writes: > I am having trouble getting the files I want using a wildcard > specifier (-A option = accept list). The following command works fine to > get an individual file: > > wget > https://164.224.25.30/FY06.nsf/($reload)/85256F8A00606A1585256F900040A32F/$

RE: newbie question

2005-04-14 Thread Tony Lewis
Alan Thomas wrote: > I am having trouble getting the files I want using a wildcard specifier... There are no options on the command line for what you're attempting to do. Neither wget nor the server you're contacting understand "*.pdf" in a URI. In the case of wget, it is designed to read web pa
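
A hypothetical shape of what would work instead (server path shortened; the exact starting page would be the one that links the PDFs): crawl it and accept only that type.

    wget -r -l1 -np -nd -A pdf --no-check-certificate 'https://164.224.25.30/FY06.nsf/...'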

Re: newbie question

2005-04-14 Thread Jens Rösner
Hi Alan! As the URL starts with https, it is a secure server. You will need to log in to this server in order to download stuff. See the manual for info how to do that (I have no experience with it). Good luck Jens (just another user) > I am having trouble getting the files I want using a

newbie question

2005-04-13 Thread Alan Thomas
I am having trouble getting the files I want using a wildcard specifier (-A option = accept list). The following command works fine to get an individual file: wget https://164.224.25.30/FY06.nsf/($reload)/85256F8A00606A1585256F900040A32F/$FILE/160RDTEN_FY06PB.pdf However, I cannot ge

RE: --continue question

2005-02-12 Thread Leonid
Ken, > i am suspicious that data is not being flushed when i kill wget I'm afraid you misuse wget. The main beauty of wget is that it restarts automatically when the ftp connection is dropped. > i use perl to manage the progress of wget You do not need to do it. The only thing you should do is to ch
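
The approach being recommended, as a sketch (host and values illustrative; -t 0 means retry indefinitely):

    wget -c -t 0 --waitretry=30 ftp://ftp.example.com/pub/big.file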

--continue question

2005-02-10 Thread Schneider, Kenneth (MLCI)
i am using wget to retrieve files from a somewhat unstable ftp server. often i kill and restart wget with the --continue option. i use perl to manage the progress of wget and on bad days wget may be restarted 40, 50 or 60 times before the complete file is retrieved

RE: wget: question about tag

2005-02-02 Thread Tony Lewis
Normand Savard wrote: > I have a question about wget. Is it possible to download attribute > values other than the hardcoded ones? No, at least not in the existing versions of wget. I have not heard that anyone is working on such an enhancement.
