Re: stealing from cURL

2001-06-07 Thread Daniel Stenberg

On 7 Jun 2001, Hrvoje Niksic wrote:

[ Disclaimer, please regard this discussion as what *could* become reality,
  if many things would move in the right direction. A dream, a vision,
  plain fantasies. ]

> > (Not related to this, but I thought I could throw this in: One of
> > the blue-sky dreams I have for a rainy day, is converting wget to
> > use libcurl as transport layer for FTP(S)/HTTP(S)...)
>
> Such a thing is not entirely out of the question.  I'm not exactly
> satisfied with Wget's "backend" code and I've been thinking about ways to
> redesign it for years now.  But designing an HTTP layer is damned hard.
> You have to handle the concepts of "connection" and "download", as well
> as "reconnecting", persistent and non-persistent connections, etc.  Then
> come the proxies, redirections, protocols based on HTTP which are not
> exactly HTTP, and a legion of other nuisances.

Well, of course. libcurl is such a library today, and it works. It supports
all these things you mention above, and more. Multi-platform.

I've studied the wget sources before with this purpose in mind, and I realize
it isn't just an afternoon patch we're talking about.

> Wget has traditionally been advertised as a "no dependency" program, i.e.
> people have been reported to install it right after `gzip' and `gcc'.

I understand this perfectly.

> But if I found a library that handled all of the above issues
> *graciously* (sorry folks, not libwww), I think I would prefer to use it
> rather than implement all of those things myself.

It would of course put a great demand on the library in question, and I'm not
saying libcurl of today would fit right into this shoe without any glitch
(I'm not saying it doesn't either, I'm just realistic enough to expect
trouble).  The point would instead be that it could be a benefit to work out
the problems, and end up with an even more powerful transport library that
would, well, if not rule the world, at least be a mighty fine library. For
synchronous URL transfers.

I can't help but agree with your sentiments about libwww.

> I haven't looked at cURL before, but will do so.  Is the documentation
> available online?

You'll find most libcurl docs from http://curl.haxx.se/libcurl/

> If you are willing to advertise its features, take this as an invitation
> to do so.  :-)

curl is the tool that is powered by libcurl, the client-side URL transfer
library. My focus right now, right here, is the library parts. libcurl
supports FTP(S), HTTP(S), GOPHER, LDAP, FILE, TELNET and DICT. It does
transfers in both directions (for FTP and HTTP), persistent connections,
cookies, POST, PUT, RFC 1867 posts, Kerberos 4, authentication and more.
Friendly people have written interfaces for all sorts of languages, including
PHP, Perl, Ruby, Python, Tcl and Java. libcurl never does anything with the
actual transferred data. It transfers; it doesn't interpret or interfere with
the contents.

I have this comparison table over at the curl site comparing HTTP/FTP tools:
http://curl.haxx.se/docs/comparison-table.html (probable curl bias)

> (I'm now reading the curl(1) man page and finding many cool things to steal,
> interface-wise.)

:-)

Of course (this may be unnecessary to add, but I feel I should): I am not the
single person behind curl/libcurl. More than 60 people are credited with
non-trivial contributions.

-- 
  Daniel Stenberg - http://daniel.haxx.se - +46-705-44 31 77
   ech`echo xiun|tr nu oc|sed 'sx\([sx]\)\([xoi]\)xo un\2\1 is xg'`ol


Re: stealing from cURL

2001-06-07 Thread T. Bharath

It is a fantastic library that supports multithreading too. I use it
and it is easy to use.
Thanks to Daniel

Hrvoje Niksic wrote:

> Daniel Stenberg <[EMAIL PROTECTED]> writes:
>
> > (Not related to this, but I thought I could throw this in: One of
> > the blue-sky dreams I have for a rainy day, is converting wget to
> > use libcurl as transport layer for FTP(S)/HTTP(S)...)
>
> Such a thing is not entirely out of the question.  I'm not exactly
> satisfied with Wget's "backend" code and I've been thinking about ways
> to redesign it for years now.  But designing an HTTP layer is damned
> hard.  You have to handle the concepts of "connection" and "download",
> as well as "reconnecting", persistent and non-persistent connections,
> etc.  Then come the proxies, redirections, protocols based on HTTP
> which are not exactly HTTP, and a legion of other nuisances.
>
> Wget has traditionally been advertised as a "no dependency" program,
> i.e. people have been reported to install it right after `gzip' and
> `gcc'.  But if I found a library that handled all of the above issues
> *graciously* (sorry folks, not libwww), I think I would prefer to use
> it rather than implement all of those things myself.
>
> I haven't looked at cURL before, but will do so.  Is the documentation
> available online?  If you are willing to advertise its features, take
> this as an invitation to do so.  :-)
>
> (I'm now reading the curl(1) man page and finding many cool things to
> steal, interface-wise.)



Re: stealing from cURL

2001-06-07 Thread Hrvoje Niksic

Daniel Stenberg <[EMAIL PROTECTED]> writes:

> (Not related to this, but I thought I could throw this in: One of
> the blue-sky dreams I have for a rainy day, is converting wget to
> use libcurl as transport layer for FTP(S)/HTTP(S)...)

Such a thing is not entirely out of the question.  I'm not exactly
satisfied with Wget's "backend" code and I've been thinking about ways
to redesign it for years now.  But designing an HTTP layer is damned
hard.  You have to handle the concepts of "connection" and "download",
as well as "reconnecting", persistent and non-persistent connections,
etc.  Then come the proxies, redirections, protocols based on HTTP
which are not exactly HTTP, and a legion of other nuisances.

Wget has traditionally been advertised as a "no dependency" program,
i.e. people have been reported to install it right after `gzip' and
`gcc'.  But if I found a library that handled all of the above issues
*graciously* (sorry folks, not libwww), I think I would prefer to use
it rather than implement all of those things myself.

I haven't looked at cURL before, but will do so.  Is the documentation
available online?  If you are willing to advertise its features, take
this as an invitation to do so.  :-)


(I'm now reading the curl(1) man page and finding many cool things to
steal, interface-wise.)



Re: stealing from cURL

2001-06-07 Thread Hrvoje Niksic

Daniel Stenberg <[EMAIL PROTECTED]> writes:

> I'd just like to point out that "stealing from cURL" would in fact not
> convert the cURL source to GPL, since the curl source code is MIT
> licensed. The license requires the copyright text to be included.

I would add to this that, even if cURL were under the GPL, I would
still need an explicit assignment to the FSF before being able to
incorporate the code to Wget.  That's how GNU works.

(It would of course be perfectly legal for a third party to mix two
pieces of GPL'ed code; it's just that the FSF prefers to own the code
that goes in GNU.)

> Since I am the originator of most curl code (and quite probably of
> the particular SSL related code that has been referred to here), I am
> willing to donate source code from curl to wget, with a translated
> (to GPL) license for those parts, if need be.

I would be grateful for such a contribution.

Before doing that, we should find out if cURL code can be used in
Wget, and who would do the merge.  (I cannot do it because I don't
understand or use SSL at all.)  It would be a shame if your donated code
ended up not being used!



stealing from cURL

2001-06-07 Thread Daniel Stenberg

hi

I'd just like to point out that "stealing from cURL" would in fact not convert
the cURL source to GPL, since the curl source code is MIT licensed. The
license requires the copyright text to be included.

Since I am the originator of most curl code (and quite probably of the
particular SSL related code that has been referred to here), I am willing to
donate source code from curl to wget, with a translated (to GPL) license for
those parts, if need be.

curl has successfully supported https:// powered by SSLeay and OpenSSL for
many years now.

(Not related to this, but I thought I could throw this in: One of the
blue-sky dreams I have for a rainy day, is converting wget to use libcurl as
transport layer for FTP(S)/HTTP(S)...)

-- 
   Daniel Stenberg -- curl dude -- http://curl.haxx.se/