Re: [EMAIL PROTECTED] FTP wildcards
Václav Krpec [EMAIL PROTECTED] writes:

> I'm having trouble using wget on [EMAIL PROTECTED] While trying to do an FTP retrieval, wget doesn't understand wildcards, for example:
>
>   $ wget ftp://ftp.fit.vutbr.cz/pub/XFree86/4.3.0/*
>   Warning: wildcards not supported in HTTP.
>   --12:54:29--  ftp://ftp.fit.vutbr.cz/pub/XFree86/4.3.0/* => `*'
>   Connecting to 192.168.35.1:3128... connected.
>   Proxy request sent, awaiting response... 404 Not Found
>   12:55:01 ERROR 404: Not Found.

You're not really doing an FTP connection; you're doing a proxy connection (and the proxy is doing FTP for you). Unset the proxy setting and FTP wildcards should work. If you cannot avoid using the proxy, replace the wildcards with something like `-r -l 1 -A PATTERN'.
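The two workarounds suggested above can be sketched as a shell session; the proxy variable names and the `*.tar.gz' accept pattern are illustrative assumptions, not taken from the thread:

```shell
# Clear the usual proxy variables so wget talks FTP directly and its
# wildcard support applies (variable names are the common convention):
unset http_proxy ftp_proxy

# Direct FTP retrieval -- quote the URL so the local shell does not
# expand the '*' against the current directory first:
wget 'ftp://ftp.fit.vutbr.cz/pub/XFree86/4.3.0/*'

# If the proxy cannot be avoided, approximate the glob with a shallow
# recursive fetch filtered by an accept pattern (pattern is an example):
wget -r -l 1 -A '*.tar.gz' ftp://ftp.fit.vutbr.cz/pub/XFree86/4.3.0/
```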
Re: wget bug with ftp/passive
don [EMAIL PROTECTED] writes:

> I did not specify the passive option, yet it appears to have been used anyway. Here's a short transcript:
>
>   [EMAIL PROTECTED] sim390]$ wget ftp://musicm.mcgill.ca/sim390/sim390dm.zip
>   --21:05:21--  ftp://musicm.mcgill.ca/sim390/sim390dm.zip => `sim390dm.zip'
>   Resolving musicm.mcgill.ca... done.
>   Connecting to musicm.mcgill.ca[132.206.120.4]:21... connected.
>   Logging in as anonymous ... Logged in!
>   ==> SYST ... done.  ==> PWD ... done.
>   ==> TYPE I ... done.  ==> CWD /sim390 ... done.
>   ==> PASV ... Cannot initiate PASV transfer.

Are you sure that something else hasn't done it for you? For example, a system-wide initialization file `/usr/local/etc/wgetrc' or `/etc/wgetrc'.
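A quick way to confirm the guess above is to look for a passive-FTP setting in those init files and, if one is found, override it for a single run. The `-e'/`--execute' switch applies a wgetrc-style command after the init files are read; the sketch below assumes a GNU wget of roughly this vintage:

```shell
# See whether a system-wide wgetrc turns passive FTP on
# (paths are the two defaults named in the reply):
grep -i passive /usr/local/etc/wgetrc /etc/wgetrc 2>/dev/null

# Override any such setting for one invocation via -e/--execute,
# which is processed after the init files:
wget -e passive_ftp=off ftp://musicm.mcgill.ca/sim390/sim390dm.zip
```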
RE: Syntax question ...
Thanks for the continued assistance. I'd like to get this nailed down to an HTTP server issue or a wget issue, so I can complete or stop wget certification of our product.

  $ wget -o output -d -S https://server/file --http-user=user --http-passwd=pass
  $ cat file
  Virtual user ricks logged in.
  $ cat output
  DEBUG output created by Wget 1.9.1 on linux-gnu.
  --09:13:48--  https://server/file => `file'
  Resolving server... ip
  Caching server => ip
  Connecting to server[ip]:443... connected.
  Created socket 4.
  Releasing 0x8122b98 (new refcount 1).
  ---request begin---
  GET /file HTTP/1.0
  User-Agent: Wget/1.9.1
  Host: host
  Accept: */*
  Connection: Keep-Alive
  Authorization: Basic cmlja3M6cmlja3MyNjI2
  ---request end---
  HTTP request sent, awaiting response... HTTP/1.1 200 OK
  Date: Thu, 22 Jan 2004 14:22:45 GMT
  Server: Apache/1.3.26 (Unix) mod_ssl/2.8.10 OpenSSL/0.9.6g SecureTransport/4.1.2
  Set-Cookie: FDX=8gLvC68DA9Q7oK7WokR89w==; path=/
  Stored cookie server 443 / nonpermanent 0 undefined FDX 8gLvC68DA9Q7oK7WokR89w==
  Accept-Ranges: bytes
  Expires: Thu, 01 Jan 1970 00:00:00 GMT
  Features: CHPWD;RTCK;STCK;ASC
  Connection: close
  Content-Type: text/plain; charset=UTF-8
  0K  292.97 KB/s
  Closing fd 4
  09:13:49 (292.97 KB/s) - `file' saved [30]
  $ cat ssl_log
  [Thu Jan 22 08:22:45 2004] [info] [16984] Connection to child 9 established (server server:443, client ip)
  [Thu Jan 22 08:22:45 2004] [info] [16984] Seeding PRNG with 1160 bytes of entropy
  [Thu Jan 22 08:22:45 2004] [info] [16984] Initial (No.1) HTTPS request received for child 9 (server server:443)
  [Thu Jan 22 08:22:46 2004] [info] [16984] Connection to child 9 closed with standard shutdown (server server:443, client ip)
  $ cat ssl_access_log
  10.211.90.71 - user [22/Jan/2004:08:22:46 -0600] "GET /file HTTP/1.0" 200 30
  $ cat ssl_error_log
  [Thu Jan 22 08:22:45 2004] [info] VIRTUAL HTTP LOGIN FROM ip [ip], user (class virt)
  $ cat error_log
  [Thu Jan 22 08:22:45 2004] [info] [16984] Setting tables to 0
  [Thu Jan 22 08:22:45 2004] [info] [16984] fdx_request_handler called for URL /file
  [Thu Jan 22 08:22:45 2004] [info] [16984] Timeout set to 300
  [Thu Jan 22 08:22:45 2004] [info] [16984] fdx_authenticate Starting new session
  [Thu Jan 22 08:22:45 2004] [info] [16984] Timeout set to 300
  [Thu Jan 22 08:22:45 2004] [info] [16984] isWSProxy=0 timeout=1074781665 lgi_failure_threshold=3
  [Thu Jan 22 08:22:45 2004] [info] [16984] Is Browser client 1
  [Thu Jan 22 08:22:45 2004] [info] [16984] fdx_check_credentials: useldap 0 reqldap 0 user user
  [Thu Jan 22 08:22:45 2004] [info] [16984] Timeout set to 300
  [Thu Jan 22 08:22:45 2004] [info] [16984] fdx_map_tilde: pw.dir server/directory/ ui.type 1 r-uri file

Is that last line the issue?

-----Original Message-----
From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]]
Sent: Thursday, January 22, 2004 9:00 AM
To: Simons, Rick
Cc: '[EMAIL PROTECTED]'
Subject: Re: Syntax question ...

Simons, Rick [EMAIL PROTECTED] writes:

> Another followup question (and thanks for the continued assistance):
>
>   -S / --server-response
>     Print the headers sent by HTTP servers and responses sent by FTP servers.
>
> I misinterpreted this switch to mean that the file would still be downloaded but the console would show the server messages -- not that the server messages would be concatenated onto the file.

Actually, your interpretation is correct. I mistook `-S' for `-s', which tells Wget to prepend the headers to the file. Sorry about the confusion. In other words, what you wrote should have worked.

> Perhaps it can be reworded in future versions? Also, shouldn't the contents of the file still appear in the file, along with the server responses?

It should.

>   $ cat testfile
>   Virtual user username logged in.
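One side note on reading the `-d' trace quoted above: the `Authorization: Basic' value is only base64, not encryption, so posting a debug log publishes the real credentials. A sketch with the placeholder pair `user:pass':

```shell
# Basic auth is base64("user:password") and trivially reversible:
printf 'user:pass' | base64          # prints dXNlcjpwYXNz
printf 'dXNlcjpwYXNz' | base64 -d    # prints user:pass
```

It is worth changing the password after a trace like this has gone to a public list.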
I have seen this bug reported before, always in conjunction with SSL, but I cannot repeat it. What is the server response? Is the server maybe sending a redirection that Wget does not recognize?
Re: Syntax question ...
Thanks for persisting with this. It doesn't look like a mishandled redirection -- the response headers exist, and they don't request a redirection or any kind of refresh. access_log shows that 30 bytes were transmitted. As it happens, the string "Virtual user ricks logged in.\n" is exactly thirty bytes long. If there is an error, it seems to be on the server's side; at least I can't see that Wget is doing anything obviously wrong in the request.

To modify the request, you could try one or more of these:

* Use `--user-agent' to pretend to be a web browser. The server software might be checking for browser versions.

* Use `--no-http-keep-alive' to get rid of the `Connection' header.

* Use `--header Accept: text/xml,application/xml,application/xhtml+xml,text/html;q=0.9,text/plain;q=0.8,image/png,image/jpeg,image/gif;q=0.2,*/*;q=0.1' to force the Accept header to the value that Mozilla uses and that apparently works on your site. For this to work, you need the CVS version of Wget.

Even better would be if you could try the same site with another OpenSSL-using client, such as curl. If it doesn't work, the problem might be related to the site. If it does work, we can take a look at what curl does differently.
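Put together, the three experiments might look like the sketch below; the User-Agent string is an arbitrary example, and `server', `file', `user', and `pass' are the thread's placeholders:

```shell
# All three header tweaks in one run (each can also be tried alone):
wget -d -S \
  --user-agent='Mozilla/4.0 (compatible)' \
  --no-http-keep-alive \
  --header='Accept: text/xml,application/xml,application/xhtml+xml,text/html;q=0.9,text/plain;q=0.8,image/png,image/jpeg,image/gif;q=0.2,*/*;q=0.1' \
  --http-user=user --http-passwd=pass \
  https://server/file
```

The single quotes matter: they keep the whole Accept value, commas, semicolons, and `*/*' included, as one argument to `--header'.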
RE: Syntax question ...
I tried all your suggestions except the CVS one, and the results were the same.

  $ curl -V
  curl 7.9.8 (i386-redhat-linux-gnu) libcurl 7.9.8 (OpenSSL 0.9.7a) (ipv6 enabled)
  $ curl https://server/file -uuser:pass
  Virtual user user logged in.

No file was created locally. Chalk it up as an HTTP server flaw?

-----Original Message-----
From: Hrvoje Niksic [mailto:[EMAIL PROTECTED]]
Sent: Thursday, January 22, 2004 1:10 PM
To: Simons, Rick
Cc: '[EMAIL PROTECTED]'
Subject: Re: Syntax question ...

Thanks for persisting with this. It doesn't look like a mishandled redirection -- the response headers exist, and they don't request a redirection or any kind of refresh. access_log shows that 30 bytes were transmitted. As it happens, the string "Virtual user ricks logged in.\n" is exactly thirty bytes long. If there is an error, it seems to be on the server's side; at least I can't see that Wget is doing anything obviously wrong in the request.

To modify the request, you could try one or more of these:

* Use `--user-agent' to pretend to be a web browser. The server software might be checking for browser versions.

* Use `--no-http-keep-alive' to get rid of the `Connection' header.

* Use `--header Accept: text/xml,application/xml,application/xhtml+xml,text/html;q=0.9,text/plain;q=0.8,image/png,image/jpeg,image/gif;q=0.2,*/*;q=0.1' to force the Accept header to the value that Mozilla uses and that apparently works on your site. For this to work, you need the CVS version of Wget.

Even better would be if you could try the same site with another OpenSSL-using client, such as curl. If it doesn't work, the problem might be related to the site. If it does work, we can take a look at what curl does differently.
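For what it's worth, "no file created locally" is expected with that curl invocation: curl writes the body to stdout unless told otherwise, so the printed banner *is* the downloaded content. A sketch (placeholders as before), plus a check of the thirty-byte observation from earlier in the thread:

```shell
# -o stores the response body in a file, mirroring wget's behaviour:
curl -u user:pass -o file https://server/file

# The banner both clients received is exactly the 30 bytes that
# access_log reported:
printf 'Virtual user ricks logged in.\n' | wc -c    # prints 30
```

So wget and curl are in agreement: both get the login banner back instead of the file contents, which points at the server.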
problem with # in path
Hi, I'm trying to get everything with

  wget -r -l 0 ftp://19.24.24.24/some/datase/C#Tool/

but I can't get anything, because wget cuts off everything from the `#' on -- it thinks it's a comment. Please, any help? Thanks in advance,

Miki

Peter Mikeska  [EMAIL PROTECTED]
A L C A T E L, System Engineer
IT Services MadaCom
phone: +421 44 5206316, fax: +421 44 5206356
RE: problem with # in path
It's more likely your system/shell that is doing it, if you're using Linux or UNIX. Try escaping the character:

  wget -r -l 0 ftp://19.24.24.24/some/datase/C\#Tool/

Mark Post

-----Original Message-----
From: Peter Mikeska [mailto:[EMAIL PROTECTED]]
Sent: Thursday, January 22, 2004 6:28 PM
To: [EMAIL PROTECTED]
Subject: problem with # in path

Hi, I'm trying to get everything with

  wget -r -l 0 ftp://19.24.24.24/some/datase/C#Tool/

but I can't get anything, because wget cuts off everything from the `#' on -- it thinks it's a comment. Please, any help? Thanks in advance,

Miki
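One caveat on the escaping suggestion: in POSIX shells a `#' in the middle of a word does not actually start a comment, and `#' is also the URL fragment delimiter, so wget itself may truncate the path even when the shell passes it through untouched. Quoting plus percent-encoding covers both cases; a sketch:

```shell
# Single quotes stop any shell interpretation of the URL:
wget -r -l 0 'ftp://19.24.24.24/some/datase/C#Tool/'

# Percent-encoding '#' as %23 keeps wget from treating the rest
# of the path as a fragment:
wget -r -l 0 'ftp://19.24.24.24/some/datase/C%23Tool/'
```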