Question about timestamping

2004-02-24 Thread Tsabros Leonidas
I've been using wget for some time now to retrieve mirrors of some web sites. Recently I discovered the -N option. When I use it, wget checks whether the local files are older than the server files (same filenames), and if the server file is newer than the local file it overwrites the local one
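
A minimal sketch of the behaviour under discussion (the URL is a placeholder): -N compares the remote file's Last-Modified date against the local copy's timestamp and re-downloads only when the server copy is newer.

  # re-fetch only files that changed on the server (hypothetical URL)
  wget -r -N http://example.com/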

Re: Question about timestamping

2004-02-24 Thread OTR Comm
to the mail list.. And I am sorry for my terrible English Your English is clear enough to get your question across, so don't apologize. Murrah Boswell

RE: Question about timestamping

2004-02-24 Thread Craig Sowadski
Yes, I believe the option you are looking for is --backup-converted, or to make things easier, -K (make sure it is capital) is an alias for the same thing. Craig Sowadski
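
For context, a hedged example of how -K relates to timestamping: when -k rewrites links, -K / --backup-converted keeps a pristine .orig copy of each converted file, which wget then consults for the -N timestamp comparison on later runs (URL is a placeholder):

  # mirror with link conversion, keeping .orig backups for -N
  wget -m -k -K http://example.com/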

downloading multiple files question...

2004-02-03 Thread Ron
In the docs I've seen on wget, I see that I can use wildcards to download multiple files on ftp sites. So using *.pdf would get me all the pdfs in a directory. It seems that this isn't possible with http sites though. For work I often have to download lots of pdfs when there's new info I

Re: downloading multiple files question...

2004-02-03 Thread Jens Rösner
Hi Ron! If I understand you correctly, you could probably use the -A acclist / --accept acclist (accept = acclist in wgetrc) option. So, probably (depending on your site), the syntax should be something like: wget -r -A *.pdf URL or wget -r -A *.pdf -np URL or, if you have to recurse through multiple html
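
A spelled-out sketch of that suggestion, with the pattern quoted so the shell does not expand it before wget sees it (the URL is a placeholder):

  # recursive fetch, keep only PDFs, don't ascend to the parent directory
  wget -r -np -A "*.pdf" http://example.com/docs/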

Mirroring CGI/PHP/ASP/JSP Question

2004-01-30 Thread Noname NoLast
I know that to download a web page that is a cgi script (or php or asp or jsp, etc.), you should use the --output-document parameter. For example, if you want to download the following address: http://www.osnews.com/comment.php?news_id=5602&offset=105&rows=120 you would simply do something
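
Spelled out as a hedged example (the query string needs quoting so the shell does not interpret ? and &):

  wget -O comment.html "http://www.osnews.com/comment.php?news_id=5602&offset=105&rows=120"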

Re: Mirroring CGI/PHP/ASP/JSP Question

2004-01-30 Thread Hrvoje Niksic
Noname NoLast [EMAIL PROTECTED] writes: reasonable so that it can save them) otherwise it will give me error messages such as: Cannot write to 'http://www.osnews.com/comment.php?news_id=5602&offset=105&rows=120' I don't understand this error message. Wget should never try to write to a

RE: Syntax question ...

2004-01-23 Thread Daniel Stenberg
On Thu, 22 Jan 2004, Simons, Rick wrote: curl https://server/file -uuser:pass Virtual user user logged in. No file created locally. Chalk it up as a http server flaw? Uh, curl doesn't create any file when used like that. It outputs the downloaded data to stdout unless you use an option to

Re: Syntax question ...

2004-01-23 Thread Hrvoje Niksic
Daniel Stenberg [EMAIL PROTECTED] writes: On Thu, 22 Jan 2004, Simons, Rick wrote: curl https://server/file -uuser:pass Virtual user user logged in. [...] In my eyes, this looks like the correct output from curl. Wasn't it? I think that Rick expects to see a complete HTML page rather than
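
Daniel's truncated point presumably continues to curl's output options; a minimal sketch, with server/file and the credentials standing in for the thread's redacted originals:

  # write the response body to a local file instead of stdout
  curl -u user:pass -o file https://server/file
  # or keep the remote name
  curl -u user:pass -O https://server/file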

RE: Syntax question ...

2004-01-22 Thread Simons, Rick
] fdx_map_tilde: pw.dir server/directory/ ui.type 1 r-uri file Is that last line the issue? -Original Message- From: Hrvoje Niksic [mailto:[EMAIL PROTECTED] Sent: Thursday, January 22, 2004 9:00 AM To: Simons, Rick Cc: '[EMAIL PROTECTED]' Subject: Re: Syntax question ... Simons, Rick [EMAIL

Re: Syntax question ...

2004-01-22 Thread Hrvoje Niksic
Thanks for persisting with this. It doesn't look like a mishandled redirection -- the response headers exist and they don't request a redirection or any kind of refresh. access_log shows that 30 bytes have been transmitted. As it happens, the string Virtual user ricks logged in.\n is exactly

RE: Syntax question ...

2004-01-22 Thread Simons, Rick
? -Original Message- From: Hrvoje Niksic [mailto:[EMAIL PROTECTED] Sent: Thursday, January 22, 2004 1:10 PM To: Simons, Rick Cc: '[EMAIL PROTECTED]' Subject: Re: Syntax question ... Thanks for persisting with this. It doesn't look like a mishandled redirection -- the response headers

Syntax question ...

2004-01-21 Thread Simons, Rick
Greetings all. I've posted in the past, but never really have gotten connectivity to a https server I support using the wget application. I've looked in the manual, on the website and searched the Internet but am not getting very far. wget -V GNU Wget 1.9 wget -d -S https://server/file

Re: Syntax question ...

2004-01-21 Thread Hrvoje Niksic
Simons, Rick [EMAIL PROTECTED] writes: Greetings all. I've posted in the past, but never really have gotten connectivity to a https server I support using the wget application. I've looked in the manual, on the website and searched the Internet but am not getting very far. wget -V

RE: Syntax question ...

2004-01-21 Thread Simons, Rick
I got wget compiled with ssl support now, and have a followup question ... I'm getting the local file created but populated with a server response, not the actual contents of the remote file. See example: wget -d -S https://server/testfile --http-user=user --http-passwd=pass DEBUG output

RE: Syntax question ...

2004-01-21 Thread Post, Mark K
Well, that's what you're telling it to do with the -S option, so why are you surprised? man wget, then /-S Mark Post -Original Message- From: Simons, Rick [mailto:[EMAIL PROTECTED] Sent: Wednesday, January 21, 2004 11:09 AM To: '[EMAIL PROTECTED]' Subject: RE: Syntax question ... I

RE: Syntax question ...

2004-01-21 Thread Simons, Rick
Another followup question(s), and thanks for the continued assistance ...: -S --server-response Print the headers sent by HTTP servers and responses sent by FTP servers. I misinterpreted this switch that the file would still be downloaded, but the console would see the server messages

Re: Syntax question ...

2004-01-21 Thread Hrvoje Niksic
Simons, Rick [EMAIL PROTECTED] writes: I got wget compiled with ssl support now, and have a followup question ... I'm getting the local file created but populated with a server response, not the actual contents of the remote file. See example: wget -d -S https://server/testfile --http-user

question

2003-12-03 Thread Danny Linkov
Hello, I'd like to download recursively the content of a web directory WITHOUT AN INDEX file. The directory content is generated by the server. How can I do that? I use wget 1.8.2 for windows. Thank you.
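
If the server answers the bare directory URL with a generated HTML listing, a plain recursive fetch may already do it; a hedged sketch (the URL is illustrative, borrowed from the reply below):

  # recurse from the generated listing, without climbing to the parent
  wget -r -np http://www.somesite.com/dir/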

Re: question

2003-12-03 Thread Tony Lewis
Danny Linkov wrote: I'd like to download recursively the content of a web directory WITHOUT AN INDEX file. What shows up in your web browser if you enter the directory (such as http://www.somesite.com/dir/)? The most common responses are: * some HTML file selected by the server (often

Problem with wget 1.9 and question mark at least on windows

2003-10-23 Thread Boris New
Hi, I tried wget 1.9 for windows from Heiko Herold (http://xoomer.virgilio.it/hherold/) and the problem with the filters and the question marks remains: On the following page: http://www.wordtheque.com/owa-wt/new_wordtheque.wcom_literature.literaturea_page?lang=FR&letter=A&source=search&page=1

RE: Problem with wget 1.9 and question mark at least on windows

2003-10-23 Thread Herold Heiko
[mailto:[EMAIL PROTECTED] Sent: Thursday, October 23, 2003 12:12 PM To: Boris New Cc: [EMAIL PROTECTED] Subject: Re: Problem with wget 1.9 and question mark at least on windows Sorry about that, Wget currently applies -R and -A only to file names, not to the query part of the URL

Question about url convert

2003-10-14 Thread Sergey Vasilevsky
Does wget have any rules to convert a retrieved URL to a different stored URL? Or maybe in the future? For example: Get - site.com/index.php?PHPSESSID=123124324 Filter - /PHPSESSID=[a-z0-9]+//i Save as - site.com/index.php

Re: Question about url convert

2003-10-14 Thread Hrvoje Niksic
Sergey Vasilevsky [EMAIL PROTECTED] writes: Does wget have any rules to convert a retrieved URL to a different stored URL? Or maybe in the future? For example: Get - site.com/index.php?PHPSESSID=123124324 Filter - /PHPSESSID=[a-z0-9]+//i Save as - site.com/index.php The problem with this is that it would
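
The thread leaves this unimplemented in wget; as a hedged post-processing sketch, the session id can be stripped from the saved filenames after the fact (paths are illustrative, and collisions are possible if several session-tagged copies of one page were saved):

  # rename index.php?PHPSESSID=... back to index.php
  for f in site.com/index.php\?PHPSESSID=*; do
    mv "$f" "${f%%\?PHPSESSID=*}"
  done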

very simple wget syntax question (-d info added) ...

2003-10-13 Thread Simons, Rick
I'm having trouble with downloading a file across https using wget. I can't figure out if it is something I'm doing wrong with the wget syntax, or if the httpd server isn't working like it should. # wget -d https://filed1/InBox/FILE3 --http-user=blahuser --http-passwd=blahpw DEBUG output created

Re: very simple wget syntax question (-d info added) ...

2003-10-13 Thread Hrvoje Niksic
Simons, Rick [EMAIL PROTECTED] writes: I'm having trouble with downloading a file across https using wget. I can't figure out if it is something I'm doing wrong with the wget syntax, or if the httpd server isn't working like it should. I don't know what's going wrong here. Your Wget syntax

RE: very simple wget syntax question (-d info added) ...

2003-10-13 Thread Simons, Rick
-V GNU Wget 1.9-b5 -Original Message- From: Hrvoje Niksic [mailto:[EMAIL PROTECTED] Sent: Monday, October 13, 2003 8:44 AM To: Simons, Rick Cc: 'Tony Lewis'; '[EMAIL PROTECTED]' Subject: Re: very simple wget syntax question (-d info added) ... Simons, Rick [EMAIL PROTECTED] writes

Re: very simple wget syntax question (-d info added) ...

2003-10-13 Thread Hrvoje Niksic
Simons, Rick [EMAIL PROTECTED] writes: Using 1.9 I get a different error ... [...] using 1.9b5 # ./wget https://filed1/InBox/FILE3 --http-user=user --http-passwd=pass https://filed1/InBox/FILE3: Unsupported scheme. That just means that you haven't compiled 1.9-b5 with SSL. Did you compile

RE: very simple wget syntax question (-d info added) ...

2003-10-13 Thread Simons, Rick
? -Original Message- From: Hrvoje Niksic [mailto:[EMAIL PROTECTED] Sent: Monday, October 13, 2003 9:07 AM To: Simons, Rick Cc: '[EMAIL PROTECTED]' Subject: Re: very simple wget syntax question (-d info added) ... Simons, Rick [EMAIL PROTECTED] writes: Using 1.9 I get a different error

Re: very simple wget syntax question (-d info added) ...

2003-10-13 Thread Hrvoje Niksic
Simons, Rick [EMAIL PROTECTED] writes: I believe 1.8 was an rpm install, but I could be mistaken. You are right about the 1.9 install .. it was just a config/make/make install on the tar I nabbed. How can I determine if I have SSL includes on a RH9 box? I think you need to install the
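
Hrvoje's truncated answer presumably names the OpenSSL development package; a hedged sketch of the build being discussed, assuming the headers are present (openssl-devel on RH9) and the 1.9-era configure switch:

  # build wget with SSL so https:// URLs stop being "Unsupported scheme"
  ./configure --with-ssl
  make && make install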

very simple wget syntax question ...

2003-10-10 Thread Simons, Rick
If the following: wget https://filed1/InBox/FILE3 --http-user=username --http-passwd=password is creating a file locally called FILE3 that has a server response in it Virtual user username logged in., instead of the actual contents of FILE3 ... what i'm trying to figure out is if that is a WGET

Question

2003-08-19 Thread shef_xt
Hello My office is connected to the internet with 2 serial-line modems, and we use load balancing with the Linux ip tool... but as far as I know WGET doesn't support multithreaded download (like flashget, for instance), so if I download a file, the download stream will use only 50% of the bandwidth... Do you plan to

wget question

2003-03-19 Thread Christian Deutsch

wGet question

2003-02-22 Thread Tom Madigan
or to Logprint? Please reply at your earliest convenience. Thank you. Regards, Tom Madigan, Director Global Intelligence Data Operations Adzone Research, Inc. [EMAIL PROTECTED] PS. I received a bounce back when attempting to send this question to Hrvoje Niksic [EMAIL PROTECTED] The original

question

2003-02-16 Thread Oleg Gorchakov
Hello, I tried to copy to my local disk the manual http://www.kgraph.narod.ru/lectures/lectures.htm like wget -r -k -l 4 -nH http://www.kgraph.narod.ru/lectures/lectures.htm but 99.9% of the .gif files were not copied and their links were left as absolute links (for example

Re: question

2003-02-16 Thread Fred Holmes
From the help: -p, --page-requisites get all images, etc. needed to display HTML page. I think you need to add the -p option as well. Fred Holmes At 05:05 AM 2/16/2003, Oleg Gorchakov wrote: Hello, I tried to copy to my local disk the manual
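
Applied to the command quoted in the original post, a hedged version with -p added:

  # -p pulls in the inlined images the bare recursion missed
  wget -r -k -l 4 -p -nH http://www.kgraph.narod.ru/lectures/lectures.htm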

Beginner question

2002-12-01 Thread Menno Israël
Hello all, I found the wget project while searching the web for web downloaders and crawlers. My question is whether it is possible to let wget crawl or spider over the web (given a certain start url), follow all the unique urls that it runs into, and download only the images it finds

Re: mirroring question

2002-11-02 Thread DennisBagley
[EMAIL PROTECTED] (Max Bowsher) wrote: DennisBagley [EMAIL PROTECTED] wrote: ok - am using wget to mirror an ftp site [duh] and would like it not only to keep an up to date copy of the files [which it does beautifully] but also remove files that

mirroring question

2002-11-01 Thread DennisBagley
ok - am using wget to mirror an ftp site [duh] and would like it not only to keep an up to date copy of the files [which it does beautifully] but also to remove files that are no longer on the ftp server ?? Is this possible ??? tia den

Re: mirroring question

2002-11-01 Thread Max Bowsher
DennisBagley [EMAIL PROTECTED] wrote: ok - am using wget to mirror an ftp site [duh] and would like it not only to keep an up to date copy of the files [which it does beautifully] but also to remove files that are no longer on the ftp server ?? Is this possible ??? Use a perl script. Max.

easy timeout question

2002-10-31 Thread Farid Hamjavar
wget 1.7 on linux rh 7.2 hello, I have a very simple question. I am writing a script that utilizes wget, of course. All I want to accomplish is for wget to exit very quickly when a url is unreachable (like the test-bogus one below, for e.g.) None of the following tries (A thru C
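
A minimal fail-fast sketch along the lines being attempted: cap retries at one and shorten the timeout. Note that in wget of this vintage -T governs the read timeout, so an unreachable host may still hang in connect; later releases added separate connect timeouts. The URL is a placeholder.

  wget -t 1 -T 5 http://bogus.example.invalid/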

WGET 1.5.3 question or problem

2002-10-15 Thread Dave Wulkan
Hi, I hope this is the right place to ask for some help? I have written a (UNIX/LINUX) shell script that uses wget to get a commodity options page from www.cme.com. I have been running this script on LINUX using wget 1.8.1 with no problems. I am trying to move the script over to the people

Question, or bugreport?

2002-10-09 Thread Robert creeq Krause
Hi, I think that while downloading wget does not show the current speed but the average speed. I don't know whether this is a bug or not, but it's unfavourable because (for one reason) wget seems to calculate the remaining download time from this speed, so the ETA is not correct. I hope I could help...

WGET 1.5.3 Question?

2002-09-09 Thread Dave Wulkan
Hi, I hope this is the right place to ask for some help? I have written a (UNIX/LINUX) shell script that goes and gets a commodity options page from www.cme.com. I have been running this on LINUX using wget 1.8.1 with no problems. I am trying to move the script over to the people that host

Re: Newbie question --- using wget for AV def file update - port failure problem

2002-07-22 Thread Steve Bratsberg
that was the fix, thank you Matt Whimp Sarah Kemp matt[EMAIL PROTECTED] wrote in message news:20020720064602.0645af92.matt[EMAIL PROTECTED]... On Fri, 19 Jul 2002 11:57:38 -0400 Steve tapped the following into the keyboard: == PORT ... Master socket fd 428 bound. using port 1342. --

a little question

2002-07-16 Thread WereWolf
hi... I've got one question... when I'm downloading some file that contains spaces in its name, with the -nd or --cut-dirs options, it saves the file with %20 instead of the space character. When I'm downloading without any options, the directory names contain %20, but the file name is normal

Re: Question about wget !!

2002-06-20 Thread coredump
This is not a wget issue; I saw something like that on a capped cable modem, maybe that's how caps are implemented. No problem on the LAN. From: Abdullah al-Muhaitheef [EMAIL PROTECTED] Date: 2002/06/20 Thu AM 08:41:03 GMT To: [EMAIL PROTECTED] Subject: Question about wget !! Hi guys

question about wget flavor

2002-05-17 Thread Pavel Stepchenko
Hello wget, $ wget -v wget: missing URL Usage: wget [OPTION]... [URL]... Try `wget --help' for more options. [root@rmt-gw]1014# wget --version GNU Wget 1.8.1 script.sh: #!/bin/sh wget=/usr/local/bin/wget -t0 -nr -nc -x --timeout=20 --wait=61 --waitretry=120 $wget ftp://nonanonymous:[EMAIL

Re: question about wget flavor

2002-05-17 Thread Ian Abbott
On Fri, 17 May 2002 16:59:07 +0400, Pavel Stepchenko [EMAIL PROTECTED] wrote: #!/bin/sh wget=/usr/local/bin/wget -t0 -nr -nc -x --timeout=20 --wait=61 --waitretry=120 $wget ftp://nonanonymous:[EMAIL PROTECTED]/file1.zip sleep 60 $wget ftp://nonanonymous:[EMAIL PROTECTED]/file2.zip Why WGET
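
The script as quoted would be parsed by sh as an environment assignment followed by a command named -t0; a hedged corrected sketch, with user:pass@host standing in for the redacted credentials:

  #!/bin/sh
  # quote the assignment so the options stay part of $wget;
  # the later unquoted $wget expansion then word-splits into command + flags
  wget="/usr/local/bin/wget -t0 -nr -nc -x --timeout=20 --wait=61 --waitretry=120"
  $wget ftp://user:pass@host/file1.zip
  sleep 60
  $wget ftp://user:pass@host/file2.zip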

Re: question on printing to screen

2002-05-13 Thread Ian Abbott
On 12 May 2002 02:54:52 -0500, asher [EMAIL PROTECTED] wrote: hi, I've been trying to figure out how wget prints all over the screen without using curses, and I'm hoping someone can help. from the code, I'm pretty sure it's just printing to the C-stream stderr, but I can't for the life of me

question on printing to screen

2002-05-12 Thread asher
hi, I've been trying to figure out how wget prints all over the screen without using curses, and I'm hoping someone can help. from the code, I'm pretty sure it's just printing to the C-stream stderr, but I can't for the life of me figure out how it seeks or jumps around in the stream. any help

basic question

2002-05-03 Thread AARC . DISTRIBUTORS
I need to download multiple data files from a site requiring a username and password. Wget gets in fine and downloads a number of files in other directories, but when attempting the directory containing the data files each file is listed but not downloaded with a socket:Too many

Re: newbie question

2002-04-13 Thread Hrvoje Niksic
Newer versions of Wget check the server type and adjust the directory listing parser accordingly. If I remember correctly, NT directory listing is now supported.

newbie question

2002-04-12 Thread dbotham
Just when I thought it was safe to start downloading files, I get this: wget --mirror -v -I/ -X/report,/Software -w1 -gon ftp://x:[EMAIL PROTECTED] --11:27:41-- ftp://x:[EMAIL PROTECTED]:21/ = `64.226.243.208/.listing' Connecting to 64.226.243.208:21... connected!

WGET Offline proxy question

2002-04-04 Thread Jonathan A Ruxton
Hi All Sorry for the off topic and 'Newbie' question, but is it possible, having downloaded a site using 'wget' to a local directory, to then at a later stage run just the 'proxy' part of the process, reading the site contents from the local directory and not from the site, to enable re

Re: Question

2002-03-04 Thread Alan Eldridge
On Mon, Mar 04, 2002 at 12:36:41PM -0800, Jim Gifford wrote: I noticed a lot of posts about the php?, I have followed a lot of this It's invoking a php script and passing it parameters Normal stuff I am using the current version of wget-1.8.1 on a linux box wget -c -nv

Re: Question

2002-03-04 Thread Thomas Lussnig
wget -c -nv "http://gbarbier.free.fr/prj/dev/down.php3?file=txt1.1.2.tar.gz" Hi, this version 1.1.2 neither wget nor mozilla can fetch: Warning: Unable to access down/txt1.1.2.tar.gz in your script on line 61 Warning: ReadFile(down/txt1.1.2.tar.gz) - No such file or directory in your

Re: Question

2002-03-04 Thread Alan Eldridge
On Mon, Mar 04, 2002 at 11:48:50PM +0100, Thomas Lussnig wrote: Try txt1.1.tar.gz and it works. I think it is simple: The file DOES NOT EXIST YET. I must've screwed up with Konq and grabbed a different link I believe you're correct -- Alan Eldridge Dave's not here, man

Question about the -O option

2002-02-18 Thread Georg Prager
Hi! I tried to continue downloading a partially downloaded file with: wget -c "ftp://www.test.com/file.c" -O testfile.c But I realised that this doesn't work as I want, because according to wget's info page the -O option overwrites an existing file. I'd like to know if there is any possibility wget
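
A hedged workaround consistent with the info page's warning: resume under the remote name rather than combining -c with -O, and rename once complete:

  # -c resumes against the file's own name; rename afterwards
  wget -c "ftp://www.test.com/file.c"
  mv file.c testfile.c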

File overwriting question for wget

2002-02-05 Thread Todd
Hi, I love wget; it has been a godsend for many a project for me, thank you for creating it. My question is that I have noticed the -nc option to not overwrite a file, but how can I tell wget to specifically overwrite a file? I have tried a few things and I am unable to keep it from
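
There is no dedicated overwrite switch in this era's wget (by default an existing name gets a .1 suffix); two common workarounds, sketched with a placeholder URL:

  # clobber via an explicit output name
  wget -O page.html http://example.com/page.html
  # or remove the old copy first
  rm -f page.html && wget http://example.com/page.html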

(non-subscriber question) can wget handle this job?

2002-01-10 Thread Smith, Walter (CXO)
Folks, I've been looking into using wget to do some web data-collecting for me (instead of doing it all by hand) and I'm striking out so far. Can someone tell me if these things are do-able via wget (or even some other util), and point me at an example? The tasks I'm trying to do: 1)

Re: more of a question I guess..

2002-01-08 Thread T. Bharath
replace the first @ with %40 and check Regards Bharath Turgut Kalfaoglu wrote: Hi. I love WGET, but I have a stumper: How do you put a URL if the FTP site requires a password that has an '@' in it?? Like: wget ftp://userid:password@[EMAIL PROTECTED]:21/pub/incoming/blah.zip does not
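
Spelled out with placeholder credentials, the suggestion is standard URL percent-encoding: an @ inside the password becomes %40 so only the final @ separates the credentials from the host (the follow-ups below report that some wget versions still mis-parse this):

  wget "ftp://userid:pass%40word@ftp.example.com:21/pub/incoming/blah.zip"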

Re: more of a question I guess..

2002-01-08 Thread Vladi Belperchinov-Shabanski
doesn't work, cannot login... P! Vladi. T. Bharath wrote: replace the first @ with %40 and check Regards Bharath Turgut Kalfaoglu wrote: Hi. I love WGET, but I have a stumper: How do you put a URL if the FTP site requires a password that has an '@' in it?? Like: wget

Re: more of a question I guess..

2002-01-08 Thread Vladi Belperchinov-Shabanski
I have got the same problem and I found that my previous version of wget could handle this (the version was 1.7.1 I guess): wget ftp://user:pass@[EMAIL PROTECTED]/etc and it worked fine; the latest sources print an error `no host in the url' or similar. Perhaps --ftp-user and --ftp-pass would be good

Re: Just a Question

2001-12-29 Thread Hrvoje Niksic
Edward Manukovsky [EMAIL PROTECTED] writes: Excuse me, please, but I've got a question. I cannot set the retry timeout to 30 seconds by doing: wget -w30 -T600 -c -b -t0 -S -alist.log -iurl_list For me, Wget waits for 30 seconds between each retrieval. What version are you using?
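
A hedged reading of the options involved: -w/--wait spaces out successive retrievals, while --waitretry caps the back-off between retries of a failed download, which sounds closer to what was asked. Assuming a wget version that supports --waitretry:

  wget -T600 -c -b -t0 -S --waitretry=30 -alist.log -iurl_list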

Just a Question

2001-12-25 Thread Edward Manukovsky
Hello! Excuse me, please, but I've got a question. I cannot set the retry timeout to 30 seconds by doing: wget -w30 -T600 -c -b -t0 -S -alist.log -iurl_list This timeout is much shorter :((( Answer me, please, how can I do this. -- Best regards, Edward Manukovsky

Re: wget question for non index.html retrieval

2001-11-27 Thread Adrian Aichner
Amy Rupp [EMAIL PROTECTED] writes: I'd been archiving obituaries from: www.valleystar.com/obits The obituaries were in the default file index.html. Lately they have switched things so that I believe the output of invoking the above URL is dynamic

A question for use wget!

2001-11-08 Thread 戴勇
hi administrator; how can I limit the download speed with wget?
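
For what it's worth, a hedged answer from the manuals of roughly this period: wget's --limit-rate option throttles the transfer speed (the URL is a placeholder):

  # cap the transfer rate at 20 KB/s
  wget --limit-rate=20k http://example.com/file.iso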

Re: Mulitple-site question

2001-10-09 Thread CJ Kucera
On Thu Oct 4 04:55:53 2001, Ian Abbott wrote: On 3 Oct 2001, at 16:01, CJ Kucera wrote: The closest I've come is (and there's lots of extraneous stuff in there): wget -r -l inf -k -p --wait=1 -H

Re: Mulitple-site question

2001-10-04 Thread Ian Abbott
On 3 Oct 2001, at 16:01, CJ Kucera wrote: The closest I've come is (and there's lots of extraneous stuff in there): wget -r -l inf -k -p --wait=1 -H --domains=theonion.com,graphics.theonion.com,www.theonion.com,theonionavclub.com,www.theonionavclub.com http://www.theonion.com The domains
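
Since -D matches domain suffixes, a hedged simplification of the command quoted above covers graphics.theonion.com and the other sub-domains automatically:

  wget -r -l inf -k -p --wait=1 -H -Dtheonion.com,theonionavclub.com http://www.theonion.com/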

Mulitple-site question

2001-10-03 Thread CJ Kucera
Greetings, humans! I'd like to use wget to take a snapshot of www.theonion.com. On that site, all of the graphics are served from graphics.theonion.com, and there's a bunch of other sub-domains as well. Also, it links over to www.theonionavclub.com, which I would also like to mirror. How can I

Re: Mulitple-site question

2001-10-03 Thread CJ Kucera
I said: I'd like to use wget to take a snapshot of www.theonion.com. On that [snip!] I forgot to mention I'm using wget 1.7. Sorry 'bout that. -CJ

referer question

2001-09-13 Thread Vladi Belperchinov-Shabanski
hi! is it possible (well I mean an easy way :)) to make wget pass the referer automatically? I mean, for every url that wget tries to fetch, to pass the hostname as referer. for example: http://www.somewhere.org/path/etc/page.html then the referer should be `http://www.somewhere.org/' well this is not

Re: referer question

2001-09-13 Thread Jens Rösner
Hi Vladi! If you are using windows, you might try http://www.jensroesner.de/wgetgui/ it is a GUI for wGet written in VB 6.0. If you click on the checkbox identify as browser, wGetGUI will create a command line like you want. I use it and it works for me. Hope this helps? CU Jens Vladi wrote:

Re: referer question

2001-09-13 Thread Jan Hnila
Hello! To be able to use the referer switch, you must have a new version of wget - I'm not sure if 1.6 is enough; 1.7 certainly is enough and 1.5.3 is not. (Get more info from http://wget.sunsite.dk) The switch is --referer=URL Try to use it with the -d (debug) switch to see that

Re: referer question

2001-09-13 Thread Vladi Belperchinov-Shabanski
hi! well I think you misunderstood perhaps... I mean not to give a referer (the --referer) but to give something like --auto-referer (possibly in wgetrc even) and wget will set the referer for each url it processes to the host-part from the same url. i.e. I'll give the example once again:
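
No such --auto-referer option exists in the wget of this thread; a hedged shell sketch of the requested behaviour, assuming the URLs sit one per line in a file named url_list:

  # derive the referer from each url's host part
  while read url; do
    host=$(printf '%s\n' "$url" | sed 's|^\(http://[^/]*/\).*|\1|')
    wget --referer="$host" "$url"
  done < url_list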

Re: referer question

2001-09-13 Thread Jens Roesner
Hi wgetters! @André Guys, you don't understand what the OP wants. He needs a dynamically generated referer, something like wget --referer 'http://%h/' where, for each URL downloaded, wget would replace %h by the hostname. Well, I understood it this way. My problem was that I mainly use

Re: Q: (problem) wget on dos/win: question marks in url

2001-08-07 Thread Reto Kohli
, too) does of course not allow you to write files with question marks in their filenames! so, wget will complain Cannot write to 'mydomain.org/index.html?foo=bar', [snip] The following characters are not legal in DOS filenames, even long (VFAT) filenames: \ / : * ? " < > | (sorry

Re: Q: (problem) wget on dos/win: question marks in url

2001-08-07 Thread Doug Kaufman
On Tue, 7 Aug 2001, Reto Kohli wrote: as far as i can tell, one would have to change/rebuild wget so that all those illegal characters are translated to something else (ie underscores) before the files are written to disk. if someone did that, i would appreciate it _very_ much. maybe this

Re: [Question] What's the problem?

2001-07-10 Thread Ian Abbott
On 10 Jul 2001, at 10:16, 이재령 wrote: When we execute wget, we can get this message... Connecting to www.chosun.com:80... connected! HTTP request sent, awaiting response... 206 Partial Content What's the problem??? What command line parameters did you use? And What is the 206

Question about crawler and Etag

2001-07-10 Thread Bazuka
This question is not related to Wget - so this newsgroup is probably not the right place to post this message. However, I am posting this msg here hoping that someone here might be able to help me. I have just written a bare-bones crawler in C++. It seems to run just fine except when getting

[Question] What's the problem?

2001-07-09 Thread 이재령
When we execute "wget", we can get this message... "Connecting to www.chosun.com:80... connected! HTTP request sent, awaiting response... 206 Partial Content" What's the problem??? And what is "206 Partial Content"?? Please answer me...

Re: Domain Acceptance question

2001-07-06 Thread Mengmeng Zhang
But as home.nexgo.de has very many dirs except mine, I also tried wget -r -l0 -nh -H -Dhome.nexgo.de/bmaj.roesner http://www.audistory.com and wget -r -l0 -nh -H -Dhome.nexgo.de/bmaj* http://www.audistory.com which both did not work. The reason is that -D takes only domain names as

Re: Domain Acceptance question

2001-07-06 Thread Jens Roesner
Hi Mengmeng! Thanks very much, I (obviously) was not aware of that! I'll see how I can incorporate that (-I/-X/-D/-H) in wGetGUI. Can I do something like -H -Dhome.nexgo.de -Ibmaj.roesner http://www.AudiStory.com ? I'll just give it a try. Thanks again! Jens
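
For the record, a hedged sketch of the combination Jens proposes (-I takes directory paths, so it wants a leading slash; untested here, as it is in the thread):

  wget -r -l0 -H -Dhome.nexgo.de -I/bmaj.roesner http://www.audistory.com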

Re: Q: (problem) wget on dos/win: question marks in url

2001-07-02 Thread Malcolm Austen
On Mon, 2 Jul 2001, Doug Kaufman wrote: + On Mon, 2 Jul 2001, Malcolm Austen wrote: + + If I recall correctly, the DOS version of wget 1.5.2 handled this by + translating the ? into @. I did feed back (can't recall exactly when + now) that wget 1.6 for DOS wasn't doing this. I haven't used

Re: Q: (problem) wget on dos/win: question marks in url

2001-06-29 Thread Rick Palazola
windows, too) does of course not allow you to write files with question marks in their filenames! so, wget will complain Cannot write to 'mydomain.org/index.html?foo=bar', which is sadly true, and i will never get those wonderful pages.. (b.g. die die die! ;) -- can this be circumvented

Re: Q: (problem) wget on dos/win: question marks in url

2001-06-29 Thread Ian Abbott
files with question marks in their filenames! so, wget will complain Cannot write to 'mydomain.org/index.html?foo=bar', which is sadly true, and i will never get those wonderful pages.. (b.g. die die die! ;) Did you try putting quotes around the URL? Don't have time to test
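
Ian's suggestion, spelled out against the URL from the report: quotes keep the shell from treating ? as a metacharacter, though on DOS/Windows the filesystem's own ban on ? in filenames would remain:

  wget "http://mydomain.org/index.html?foo=bar"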

wget question

2001-06-13 Thread Emile
Hi, Sorry for bothering you but I have a small wget question: how do I fetch an html page whose location ends like this: /printpage.cgi?forum=1&topic=1. Most of the time it is something like /index.html, which doesn't give a problem. thanks, Emile

Question

2001-05-30 Thread Shellabarger Scott - sshell
Can I use Wget to retrieve files through https? If yes, could I get an example? Thanks, Scott Shellabarger Acxiom Corporation - Products Division 501.252.3089 [EMAIL PROTECTED]

Re: Question

2001-05-30 Thread Hrvoje Niksic
Shellabarger Scott - sshell [EMAIL PROTECTED] writes: Can I use Wget to retreive files through https? Yes, but you need the version 1.7, which is about to be released.

translations 1.7? (Re: Question)

2001-05-30 Thread Karl Eichwalder
On 30 May 2001, Hrvoje Niksic wrote: Yes, but you need the version 1.7, which is about to be released. If a pre-release is available I'd like to install the .pot file of this pre-release package on the Translation Robot site (http://www.iro.umontreal.ca/contrib/po/HTML/). Of course, I can

Re: translations 1.7? (Re: Question)

2001-05-30 Thread Hrvoje Niksic
Karl Eichwalder [EMAIL PROTECTED] writes: On 30 May 2001, Hrvoje Niksic wrote: Yes, but you need the version 1.7, which is about to be released. If a pre-release is available I'd like to install the .pot file of this pre-release package on the Translation Robot site

newbie question

2001-05-03 Thread Gavin Burnett
hello. i'm finding wget really useful for my dial up connection at home, and I am trying to set it up in work, as there are some web based documents I would like to store in my home directory. I have given the program the following arguments: wget --background

Re: -c question

2001-03-31 Thread Hrvoje Niksic
"Dan Harkless" [EMAIL PROTECTED] writes: Vladi Belperchinov-Shabanski [EMAIL PROTECTED] writes: Yes, it's a known bug and is documented in the current CVS version of wget.texi. With luck, the fix may be as simple as changing a = to a . It's just that no one's had a chance to look at

-c question

2001-03-06 Thread Vladi Belperchinov-Shabanski
hi! `wget -c file' starts to download the file from the beginning if the file is completely downloaded already... why?! I expect wget to do nothing in this case: I wanted it to download the file to the end (i.e. to continue, -c) and if the file is already here then there is nothing to do.

Re: -c question

2001-03-06 Thread Dan Harkless
Vladi Belperchinov-Shabanski [EMAIL PROTECTED] writes: hi! `wget -c file' starts to download the file from the beginning if the file is completely downloaded already... why?! I expect wget to do nothing in this case: I wanted it to
