Re: ftp and .netrc via proxy problem

2001-06-08 Thread Joe Cooper

Silly me.  I see FTP and Squid in the same sentence and assume it is 
YAFQ (yet another ftp question--they happen all the time on the Squid list).

In this case, wget I guess is acting correctly as an http client, so no 
problem there.

Looking at it again more closely, here's what I bet is happening:

First case: you provide rick@ in the hostname.  wget does not ask you for 
further auth information (it won't do that regardless of whether a proxy 
is in place--I just tested it).  The ftp server denies entry because a 
blank password is not acceptable login info in this case.

Second case: anonymous login, which is accepted when authenticating, but 
fails because the anonymous login chroots you to /home/ftp or some 
similar directory before doing anything else--so /path/to/your/file 
becomes /home/ftp/path/to/your/file, which does not exist on your server.

Third case: works because the ftp server doesn't chroot you to the ftp 
homedir first, and accepts your authentication because it is complete.
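
For concreteness, a rough sketch of what I take the three invocations to 
have been (hostname and path are placeholders, not copied from your 
report, and I'm leaving the ftp_proxy setting out):

  wget ftp://rick@ftp.example.com/path/to/your/file          # case 1: user, no password
  wget ftp://ftp.example.com/path/to/your/file               # case 2: anonymous login
  wget ftp://rick:secret@ftp.example.com/path/to/your/file   # case 3: user and password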

Seems to be expected behavior, doesn't it?  Or am I missing something again?

Hope this helps...

Richard Travett wrote:

 Hi Joe,
 
 
Squid is not a proxy for FTP clients, it only talks to HTTP clients, 
though it will talk to FTP servers on behalf of HTTP clients.

 
 Umm, that may be so (I have no idea so I'll take your word for it), but
 unless I've missed something I don't understand how this affects what
 I'm trying to do. I can retrieve stuff via ftp as attempt 3 shows, so I
 don't believe it is the proxy that is the problem, it's wget not
 supplying what I think is the correct information to the proxy in the
 first place. This means that the proxy is unable to retrieve the file
 on my behalf.
 
 Regards,
 



-- 
  Joe Cooper [EMAIL PROTECTED]
  Affordable Web Caching Proxy Appliances
  http://www.swelltech.com




RE: problem getting a file

2001-06-08 Thread Simha, Shuba

Thanks a bunch for the reply, Jan.

I got it to work!
All I had to do was include the following lines in my .wgetrc file:

http_proxy = http://ourproxyserver.com:1010/
use_proxy = on
user-agent = Mozilla/4.0 (compatible; MSIE 5.5; Windows NT 4.0)
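
Incidentally, the same settings can (I believe) also be given per 
invocation with -e, or through the standard proxy environment variable, 
without editing .wgetrc at all (hostname/file is just a placeholder, and 
the proxy URL is the one from the example above):

  wget -e use_proxy=on -e http_proxy=http://ourproxyserver.com:1010/ http://hostname/file

  # or, in a Bourne-style shell:
  export http_proxy=http://ourproxyserver.com:1010/
  wget http://hostname/file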

But I have another problem now. I want to get only the files starting with
"access" in a particular directory. After searching through the wget help and
the mailing list, I tried this:
wget --accept 'access*' http://hostname/directory/

But it only gets index.html from the directory! Any suggestions?
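
From what I can tell from the manual, --accept only filters files during a
*recursive* retrieval; a plain fetch of a directory URL just grabs the index
page. So presumably something along these lines is needed (hostname/directory
as above):

  wget -r -l1 --no-parent --accept 'access*' http://hostname/directory/

I gather index.html may still be downloaded so wget can find the links in it;
whether it is kept afterwards probably depends on the accept/reject handling.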

Thanks!
-Shuba


-Original Message-
From: Jan Prikryl [mailto:[EMAIL PROTECTED]]
Sent: Friday, June 08, 2001 3:45 AM
To: Simha, Shuba
Cc: Wget mailing list
Subject: Re: problem getting a file

Quoting Simha, Shuba :

 I am a first-time user of wget. I have wget within my Cygwin. I am
 trying to get a file from a remote host as explained below, but it
 fails to connect.  Can anybody point me where I am going wrong?

Could you please provide us with the full debugging output (-d switch)
of the failed wget session? Also, could you tell us which version of
wget you are using? 

It seems that the connect() system call fails as it cannot establish a
TCP/IP connection to port 80 on the remote host. Does `telnet hostname
80' work?

-- jan



Re: ftp and .netrc via proxy problem

2001-06-08 Thread Richard Travett

Hi Joe,

Thanks for your comments so far.

 Silly me.  I see FTP and Squid in the same sentence and assume it is 
 YAFQ (yet another ftp question--they happen all the time on the Squid list).

That's ok.

 In this case, wget I guess is acting correctly as an http client, so no 
 problem there.

[snip]

I agree with your interpretation of what is happening at the remote end,
i.e. why the ftp is failing...

 Seems to be expected behavior, doesn't it?  Or am I missing something again?

...but I don't think it is expected behaviour because it says in the
help:


URL Format
==========

   URL is an acronym for Uniform Resource Locator.  A uniform
resource locator is a compact string representation for a resource
available via the Internet.  Wget recognizes the URL syntax as per
RFC1738.  This is the most widely used form (square brackets denote
optional parts):

 http://host[:port]/directory/file
 ftp://host[:port]/directory/file

   You can also encode your username and password within a URL:

 ftp://user:password@host/path
 http://user:password@host/path

   Either USER or PASSWORD, or both, may be left out.  If you leave out
either the HTTP username or password, no authentication will be sent.
If you leave out the FTP username, `anonymous' will be used.  If you
leave out the FTP password, your email address will be supplied as a
default password.(1)

[snip]

   -- Footnotes --

   (1) If you have a `.netrc' file in your home directory, password
will also be searched for there.


I *do* have a .netrc file in my home directory - so in cases 1 and 2
wget should be looking there for my login information, shouldn't it?
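
For reference, the entry in it looks something like this (machine name and
password changed, obviously):

  machine ftp.somewhere.example
  login rick
  password secret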

Rick.
-- 
Richard Travett, Email: [EMAIL PROTECTED]
  May all your communications be free from electro-magnetic disturbances,
and your days be free from temporal distortions in the space time continuum
 This email does not represent a formal communication from Simoco.



Re: wget-1.7 does not compile with glibc1 (libc5)

2001-06-08 Thread Andre Majorel

On 2001-06-08 17:57 -0400, Parsons, Donald wrote:
 Previous versions up to 1.6 compiled fine.
 
 cd src && make CC='gcc' CPPFLAGS='' DEFS='-DHAVE_CONFIG_H -DSYSTEM_WGETRC=\"/usr/etc/wgetrc\" -DLOCALEDIR=\"/usr/share/locale\"' CFLAGS='-O2 -fomit-frame-pointer -march=pentium -mcpu=pentium -pipe' LDFLAGS='-s' LIBS='' prefix='/usr' exec_prefix='/usr' bindir='/usr/bin' infodir='/usr/info' mandir='/usr/man' manext='1'
 make[1]: Entering directory `/usr/src/wget-1.7/src'
 gcc -I. -I. -DHAVE_CONFIG_H -DSYSTEM_WGETRC=\"/usr/etc/wgetrc\" -DLOCALEDIR=\"/usr/share/locale\" -O2 -fomit-frame-pointer -march=pentium -mcpu=pentium -pipe -c utils.c
 utils.c: In function `read_file':
 utils.c:980: `MAP_FAILED' undeclared (first use in this function)
 utils.c:980: (Each undeclared identifier is reported only once
 utils.c:980: for each function it appears in.)
 make[1]: *** [utils.o] Error 1
 make[1]: Leaving directory `/usr/src/wget-1.7/src'
 make: *** [src] Error 2

Quick and dirty fix: insert the following in utils.c, before the
reference to MAP_FAILED:

#ifndef MAP_FAILED
#  define MAP_FAILED -1
#endif
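
If the bare -1 makes the compiler complain about comparing an integer with
the pointer mmap() returns, the POSIX-style definition may be the safer
variant:

#ifndef MAP_FAILED
#  define MAP_FAILED ((void *) -1)  /* value mmap() returns on failure */
#endif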

-- 
André Majorel [EMAIL PROTECTED]
http://www.teaser.fr/~amajorel/



dynamic web page using wget?

2001-06-08 Thread Jingwen Jin


Hi, do any of you know if wget allows us to retrieve dynamic query pages?
I tested

wget http://altavista.com/sites/search/web?q=music&kl=XX&pg=q

which queries for music at AltaVista. But wget doesn't work with this...
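
One thing I wonder about: the shell probably treats the & characters in the
URL as control operators, so maybe the whole URL needs to be quoted, along
these lines:

wget 'http://altavista.com/sites/search/web?q=music&kl=XX&pg=q'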

Can somebody please help?

Thanks for your attention,

Jingwen