Re: How do I get SSL support to work in 1.7?

2001-06-07 Thread wget

On Thu, Jun 07, 2001 at 11:42:08AM +0200, Hrvoje Niksic wrote:
> [EMAIL PROTECTED] writes:
>
> > Not surprising. Neither IRIX 6.5 nor Tru64 UNIX 4.0D has
> > /dev/random.  So, you need EGD or PRNGD to provide a substitute
> > for your missing /dev/random.  And the *client* software has to be
> > configured to support this.  So, if wget doesn't call RAND_egd() from
> > OpenSSL, there is *nothing* you can do.  And, from a quick perusal of
> > wget 1.7, it doesn't.  So, 1.7 is useless for https:// on any system
> > without /dev/random.
>
> Ouch.  I would be thankful for any patches that allowed the use of
> Wget/SSL on non-Linux systems.  (I know next to nothing about SSL
> myself.)
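
For reference, the missing call being discussed is roughly the
following (a minimal sketch; the EGD socket path varies by site and is
an assumption here):

    #include <openssl/rand.h>

    /* Seed OpenSSL's PRNG from an EGD/PRNGD socket on systems that
       lack /dev/random.  Returns 1 once the PRNG is seeded. */
    static int
    seed_prng_from_egd (void)
    {
      /* RAND_egd() returns the number of entropy bytes read from the
         socket, or -1 if the daemon could not be reached. */
      if (RAND_egd ("/var/run/egd-pool") > 0 && RAND_status () == 1)
        return 1;
      return 0;  /* still unseeded; SSL handshakes will fail */
    }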

Is Wget available via CVS somewhere or should patches be against 1.7?

-- 
albert chin ([EMAIL PROTECTED])



Re: How do I get SSL support to work in 1.7?

2001-06-07 Thread Jan Prikryl

Quoting [EMAIL PROTECTED] ([EMAIL PROTECTED]):

> Is Wget available via CVS somewhere or should patches be against 1.7?

See http://sunsite.dk/wget/wgetdev.html - I guess patches against 1.7
are fine, as the current difference from CVS is almost nil.

Thanks for your help!

-- jan

-------------------------------+---------------------------------------
  Jan Prikryl              icq | vr|vis center for virtual reality and
  [EMAIL PROTECTED]   83242638 | visualisation  http://www.vrvis.at
-------------------------------+---------------------------------------



Re: stealing from cURL

2001-06-07 Thread Hrvoje Niksic

Daniel Stenberg <[EMAIL PROTECTED]> writes:

> (Not related to this, but I thought I could throw this in: One of
> the blue-sky dreams I have for a rainy day, is converting wget to
> use libcurl as transport layer for FTP(S)/HTTP(S)...)

Such a thing is not entirely out of the question.  I'm not exactly
satisfied with Wget's backend code and I've been thinking about ways
to redesign it for years now.  But designing an HTTP layer is damned
hard.  You have to handle the concepts of connection and download,
as well as reconnecting, persistent and non-persistent connections,
etc.  Then come the proxies, redirections, protocols based on HTTP
which are not exactly HTTP, and a legion of other nuisances.
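
To make the point concrete, here is a hedged sketch of the split being
described; these structures are purely illustrative, not Wget's actual
code:

    /* A connection may outlive a single download (keep-alive) and may
       need to be re-established; a download is one transfer over it. */
    struct connection {
      int fd;               /* socket, or -1 when disconnected */
      char *host;
      int port;
      int persistent;       /* reuse for further downloads?    */
    };

    struct download {
      struct connection *conn;  /* possibly shared and reused      */
      char *url;
      int redirects_left;       /* cap on following redirections   */
    };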

Wget has traditionally been advertised as a no-dependency program,
i.e. people have reportedly installed it right after `gzip' and
`gcc'.  But if I found a library that handled all of the above issues
*graciously* (sorry folks, not libwww), I think I would prefer to use
it rather than implement all of those things myself.

I haven't looked at cURL before, but will do so.  Is the documentation
available online?  If you are willing to advertise its features, take
this as an invitation to do so.  :-)


(I'm now reading the curl(1) man page and finding many cool things to
steal, interface-wise.)



Re: stealing from cURL

2001-06-07 Thread T. Bharath

It is a fantastic library that supports multithreading too.  I use it
and it is easy to use.
Thanks to Daniel

Hrvoje Niksic wrote:

> Daniel Stenberg <[EMAIL PROTECTED]> writes:
>
> > (Not related to this, but I thought I could throw this in: One of
> > the blue-sky dreams I have for a rainy day, is converting wget to
> > use libcurl as transport layer for FTP(S)/HTTP(S)...)
>
> Such a thing is not entirely out of the question.  I'm not exactly
> satisfied with Wget's backend code and I've been thinking about ways
> to redesign it for years now.  But designing an HTTP layer is damned
> hard.  You have to handle the concepts of connection and download,
> as well as reconnecting, persistent and non-persistent connections,
> etc.  Then come the proxies, redirections, protocols based on HTTP
> which are not exactly HTTP, and a legion of other nuisances.
>
> Wget has traditionally been advertised as a no-dependency program,
> i.e. people have reportedly installed it right after `gzip' and
> `gcc'.  But if I found a library that handled all of the above issues
> *graciously* (sorry folks, not libwww), I think I would prefer to use
> it rather than implement all of those things myself.
>
> I haven't looked at cURL before, but will do so.  Is the documentation
> available online?  If you are willing to advertise its features, take
> this as an invitation to do so.  :-)
>
> (I'm now reading the curl(1) man page and finding many cool things to
> steal, interface-wise.)



Re: stealing from cURL

2001-06-07 Thread Daniel Stenberg

On 7 Jun 2001, Hrvoje Niksic wrote:

[ Disclaimer: please regard this discussion as what *could* become
  reality, if many things move in the right direction. A dream, a
  vision, plain fantasies. ]

> > (Not related to this, but I thought I could throw this in: One of
> > the blue-sky dreams I have for a rainy day, is converting wget to
> > use libcurl as transport layer for FTP(S)/HTTP(S)...)
>
> Such a thing is not entirely out of the question.  I'm not exactly
> satisfied with Wget's backend code and I've been thinking about ways to
> redesign it for years now.  But designing an HTTP layer is damned hard.
> You have to handle the concepts of connection and download, as well
> as reconnecting, persistent and non-persistent connections, etc.  Then
> come the proxies, redirections, protocols based on HTTP which are not
> exactly HTTP, and a legion of other nuisances.

Well, of course. libcurl is such a library today, and it works. It
supports all the things you mention above, and more, on multiple
platforms.

I've studied the wget sources before with this purpose in mind, and I realize
it isn't just an afternoon patch we're talking about.

> Wget has traditionally been advertised as a no-dependency program, i.e.
> people have reportedly installed it right after `gzip' and `gcc'.

I understand this perfectly.

> But if I found a library that handled all of the above issues
> *graciously* (sorry folks, not libwww), I think I would prefer to use
> it rather than implement all of those things myself.

It would of course put a great demand on the library in question, and
I'm not saying today's libcurl would fit into this shoe without any
glitches (I'm not saying it wouldn't either; I'm just realistic enough
to expect trouble).  The point is rather that it could be a benefit to
work out the problems and end up with an even more powerful transport
library that would, well, if not rule the world, at least be a mighty
fine library for synchronous URL transfers.

I can't help but agree with your sentiments about libwww.

> I haven't looked at cURL before, but will do so.  Is the documentation
> available online?

You'll find most libcurl docs from http://curl.haxx.se/libcurl/

> If you are willing to advertise its features, take this as an
> invitation to do so.  :-)

curl is the tool that is powered by libcurl, the client-side URL
transfer library. My focus right now, right here, is the library part.
libcurl supports FTP(S), HTTP(S), GOPHER, LDAP, FILE, TELNET and DICT.
It does transfers both ways (for FTP and HTTP), persistent connections,
cookies, POST, PUT, RFC 1867 posts, Kerberos 4, authentication and
more. Friendly people have written interfaces for all sorts of
languages, including PHP, Perl, Ruby, Python, Tcl and Java. libcurl
never does anything with the actual transferred data: it transfers, it
doesn't interpret or interfere with the contents.
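
As a taste of the library side, here is a minimal sketch using
libcurl's easy interface; the URL is an example and the error handling
is illustrative:

    #include <stdio.h>
    #include <curl/curl.h>

    int main(void)
    {
      CURL *curl;
      CURLcode res;

      curl = curl_easy_init();
      if (!curl)
        return 1;
      /* Tell the handle what to fetch; options persist across
         transfers on the same handle. */
      curl_easy_setopt(curl, CURLOPT_URL,
                       "ftp://ftp.example.com/pub/file.txt");
      /* With no write callback set, the transferred data goes to
         stdout, untouched by the library. */
      res = curl_easy_perform(curl);
      if (res != CURLE_OK)
        fprintf(stderr, "transfer failed, code %d\n", (int) res);
      curl_easy_cleanup(curl);
      return res == CURLE_OK ? 0 : 1;
    }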

I have this comparison table over at the curl site comparing HTTP/FTP tools:
http://curl.haxx.se/docs/comparison-table.html (probable curl bias)

> (I'm now reading the curl(1) man page and finding many cool things to
> steal, interface-wise.)

:-)

Of course (this may be unnecessary to add, but I feel I should): I am
not the only person behind curl/libcurl. More than 60 people are
credited with non-trivial contributions.

-- 
  Daniel Stenberg - http://daniel.haxx.se - +46-705-44 31 77
   ech`echo xiun|tr nu oc|sed 'sx\([sx]\)\([xoi]\)xo un\2\1 is xg'`ol



Re: wget 1.7, linux, -rpath

2001-06-07 Thread karl

> Replacing -rpath with -Wl,rpath

It has to be -Wl,-rpath, not -Wl,rpath (with the - on rpath too).

All I can say is that after I made that change, the SSL test worked for
me on Red Hat 7.1 (and so did the SSL functionality :).  I can see how
your other changes might be needed in other cases, though.
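
For concreteness, the distinction as it would appear on a link line
(the SSL library path here is only an example):

    # wrong: the linker sees "rpath /usr/local/ssl/lib" and misparses it
    cc -o wget *.o -L/usr/local/ssl/lib -lssl -lcrypto -Wl,rpath,/usr/local/ssl/lib

    # right: -Wl,-rpath,DIR passes "-rpath DIR" through to the linker
    cc -o wget *.o -L/usr/local/ssl/lib -lssl -lcrypto -Wl,-rpath,/usr/local/ssl/lib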

On another topic, you might consider upgrading to autoconf 2.50.  It has
a number of nice new features, and at least for me (i.e., Texinfo), the
upgrade was pretty much painless.

Thanks,
karl



ftp and .netrc via proxy problem

2001-06-07 Thread Richard Travett

Hi All,

Apologies for the length of this email but I wanted to include all the
information I thought would be needed.

I'm fairly new to wget, so it's possible I'm doing something wrong
here, but having read the docs etc. I still can't get this to work as I
think it ought to.

I have to access the net via a proxy as I am behind a firewall here at
work. I have an account on another machine externally and I often ftp
stuff from this machine to work.

I have my .wgetrc file as follows:

- .wgetrc file follows ---
# Allow continuation of d/l
continue=on
#
# Set The FTP proxy
ftp_proxy=http://proxy.simoco.com:880/
#
# Set the HTTP proxy
http_proxy=http://proxy.simoco.com:880/
#
# Set the FTP login name
login=anonymous
#
# Enable no clobbering
#noclobber=on
#
# no proxy for
no_proxy=serv13,serv13.simoco.com
#
# Set the login FTP passwd
[EMAIL PROTECTED]
#
# Set the Proxy username
proxy_user=myproxyaccount
#
# Proxy passwd
proxy_passwd=myproxypassword
#
# Print server responses
server_response=on
#
# Use the proxy
use_proxy=on
- end of .wgetrc file ---

I also have a .netrc file as follows:

- .netrc file follows ---

machine a.machine.com login rick password thisismypassword
snip several more entries
- end of .netrc file ---


Here are 3 attempts to retrieve a file from the remote machine. I added
some extra debug in netrc.c at the start of search_netrc to print the
host out as follows:

  DEBUGP (("SEARCH_NETRC: host:%s\n", host));

and in each case it reports that it is looking for the proxy machine!?

Shouldn't *all* of them work and if not why not?
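
For illustration, here is the behaviour at issue, sketched with
stand-in names (the real search_netrc in wget's netrc.c may differ;
this is an assumption, not the 1.7 code):

    struct url { const char *host, *user, *passwd; };

    /* Stand-in prototype for the .netrc lookup in netrc.c. */
    void search_netrc (const char *host, const char **user,
                       const char **passwd, int slack);

    static void
    fill_credentials (struct url *u, const struct url *proxy)
    {
      /* Expected: key the lookup on the remote FTP host ...       */
      if (!u->user || !u->passwd)
        search_netrc (u->host, &u->user, &u->passwd, 0);
      /* ... but the debug traces below show proxy->host being used
         instead, so the "machine a.machine.com" entry never matches. */
      (void) proxy;
    }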

== Attempt 1:

--- FAILS

serv13(599): wget -d ftp:[EMAIL PROTECTED]/dl/a.a
DEBUG output created by Wget 1.7 on solaris2.6.

parseurl ("ftp:[EMAIL PROTECTED]/dl/a.a") -> host a.machine.com -> ftp_type I ->
opath dl/a.a -> dir dl -> file a.a -> ndir dl
newpath: /dl/a.a
parseurl ("http://proxy.simoco.com:880/") -> host proxy.simoco.com -> port 880 ->
opath  -> dir  -> file  -> ndir
newpath: /
--17:01:49--  ftp:[EMAIL PROTECTED]/dl/a.a
   => `a.a'
Connecting to proxy.simoco.com:880... Caching proxy.simoco.com -> 193.150.150.3
Created fd 4.
connected!
SEARCH_NETRC: host:proxy.simoco.com
---request begin---
GET ftp:[EMAIL PROTECTED]/dl/a.a HTTP/1.0
User-Agent: Wget/1.7
Host: a.machine.com:21
Accept: */*
Proxy-Authorization: Basic BASE64PROXYUSERPASS

---request end---
Proxy request sent, awaiting response... HTTP/1.0 401 Unauthorized
2 Server: Squid/2.2.STABLE3
3 Mime-Version: 1.0
4 Date: Thu, 07 Jun 2001 15:31:01 GMT
5 Content-Type: text/html
6 Content-Length: 707
7 Expires: Thu, 07 Jun 2001 15:31:01 GMT
8 X-Squid-Error: ERR_ACCESS_DENIED 0
9 WWW-Authenticate: Basic realm="ftp rick"
10 X-Cache: MISS from ukcamnet01
11 Proxy-Connection: close
12 
Closing fd 4
Authorization failed.


== Attempt 2:

--- FAILS

serv13(600): wget -d ftp://a.machine.com/dl/a.a
DEBUG output created by Wget 1.7 on solaris2.6.

parseurl ("ftp://a.machine.com/dl/a.a") -> host a.machine.com -> ftp_type I ->
opath dl/a.a -> dir dl -> file a.a -> ndir dl
newpath: /dl/a.a
parseurl ("http://proxy.simoco.com:880/") -> host proxy.simoco.com -> port 880 ->
opath  -> dir  -> file  -> ndir
newpath: /
--17:10:59--  ftp://a.machine.com/dl/a.a
   => `a.a'
Connecting to proxy.simoco.com:880... Caching proxy.simoco.com -> 193.150.150.3
Created fd 4.
connected!
SEARCH_NETRC: host:proxy.simoco.com
---request begin---
GET ftp://a.machine.com/dl/a.a HTTP/1.0
User-Agent: Wget/1.7
Host: a.machine.com:21
Accept: */*
Proxy-Authorization: Basic BASE64PROXYUSERPASS

---request end---
Proxy request sent, awaiting response... HTTP/1.0 404 Not Found
2 Server: Squid/2.2.STABLE3
3 Mime-Version: 1.0
4 Date: Thu, 07 Jun 2001 15:40:14 GMT
5 Content-Type: text/html
6 Content-Length: 964
7 Expires: Thu, 07 Jun 2001 15:40:14 GMT
8 X-Squid-Error: ERR_FTP_NOT_FOUND 0
9 X-Cache: MISS from ukcamnet01
10 Proxy-Connection: close
11 
Closing fd 4
17:11:02 ERROR 404: Not Found.

== Attempt 3:

-- SUCCESS!!

serv13(601): wget -d ftp://rick:[EMAIL PROTECTED]/dl/a.a
DEBUG output created by Wget 1.7 on solaris2.6.

parseurl ("ftp://rick:[EMAIL PROTECTED]/dl/a.a") -> host a.machine.com ->
ftp_type I -> opath dl/a.a -> dir dl -> file a.a -> ndir dl
newpath: /dl/a.a
parseurl ("http://proxy.simoco.com:880/") -> host proxy.simoco.com -> port 880 ->
opath  -> dir  -> file  -> ndir
newpath: /
--17:11:41--  

Re: wget 1.7 configure errors

2001-06-07 Thread Hrvoje Niksic

Maciej W. Rozycki <[EMAIL PROTECTED]> writes:

> On 7 Jun 2001, Hrvoje Niksic wrote:
>
> > We build with rpath only if the SSL libraries are at a non-standard
> > location, i.e. one not recognized by the system (we first try to build
> > without rpath).  In that case, building with rpath or its moral
> > equivalent is the only way to produce a working executable.
>
> I think the following patch should solve the problem for platforms
> libtool supports.

Maciej, this is excellent.  With this, and possibly an upgrade to
libtool 1.4 (but not Autoconf 2.50), we can release Wget 1.7.1.



Re: ftp and .netrc via proxy problem

2001-06-07 Thread Joe Cooper

Squid is not a proxy for FTP clients; it only talks to HTTP clients,
though it will talk to FTP servers on behalf of HTTP clients.

Richard Travett wrote:

> Hi All,
>
> Apologies for the length of this email but I wanted to include all the
> information I thought would be needed.
>
> I'm fairly new to wget, so it's possible I'm doing something wrong
> here, but having read the docs etc. I still can't get this to work as
> I think it ought to.
>
> I have to access the net via a proxy as I am behind a firewall here at
> work. I have an account on another machine externally and I often ftp
> stuff from this machine to work.
>
> I have my .wgetrc file as follows:



--
 Joe Cooper [EMAIL PROTECTED]
 Affordable Web Caching Proxy Appliances
 http://www.swelltech.com