S3 <[EMAIL PROTECTED]> writes:

> I have a script that uses wget to download from a site using https.
> I have been using it for months and it has worked reliably.  This
> morning it stopped working.
[...]
> However, it works just fine in wget 1.9, curl, and Mozilla.

Were you using Wget 1.10.2 before this morning, or did you upgrade
just then?  SSL handling changed between 1.9 and 1.10, but not between
1.10 and 1.10.2.
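
If you're not sure which binary the script ends up running, a quick
check from the shell might help (just a sketch; adjust to your setup):

    # Which wget does the script actually run, and what version is it?
    which wget
    wget --version | head -n 1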

> ERROR: Certificate verification error for members.dyndns.org: unable to get
> local issuer certificate
> To connect to members.dyndns.org insecurely, use `--no-check-certificate'.
> Unable to establish SSL connection.

This means that the OpenSSL library Wget uses doesn't "see" the local
certificate that marks Equifax (the CA that signed the dyndns site's
certificate) as an authority you trust.  Wget does nothing special to
locate that certificate: it simply uses OpenSSL's defaults.  It is up
to the system administrator (or the distributor) to set up OpenSSL
with a reasonable certificate bundle.
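
If you want to see where your OpenSSL installation looks for
certificates, and whether verification of that site succeeds at the
OpenSSL level, something like this should do (the certs directory
below is typical, not guaranteed on Gentoo):

    # Print the directory OpenSSL was compiled to use for its
    # configuration and certificates (OPENSSLDIR).
    openssl version -d
    # Test verification directly against the site; look for
    # "Verify return code: 0 (ok)" near the end of the output.
    openssl s_client -connect members.dyndns.org:443 \
        -CApath /usr/lib/ssl/certs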

You should ask the Gentoo people what you need to do to get OpenSSL
to recognize a certificate bundle.  On Debian, the certificates are
in the `ca-certificates' package.  On SuSE, they're distributed along
with the openssl package.
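
For instance, on Debian the bundle can be installed like this (the
package name will differ on other distributions):

    # Debian: install the CA certificate bundle that OpenSSL
    # (and therefore Wget) will pick up automatically.
    apt-get install ca-certificates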

> However, it works just fine in wget 1.9, curl, and Mozilla.

Wget 1.9 doesn't verify server certificates at all.  Curl and Mozilla
ship their own certificate bundles.

If nothing else works, grab the CA bundle from
http://curl.haxx.se/docs/sslcerts.html and put something like
`ca_certificate=/location/of/the/file' in /etc/wgetrc.
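
As a rough sketch, assuming you saved the bundle as
/etc/ssl/certs/cacert.pem (pick whatever path you like):

    # Point Wget at the downloaded CA bundle system-wide.
    echo 'ca_certificate = /etc/ssl/certs/cacert.pem' >> /etc/wgetrc
    # Or per-invocation, without touching /etc/wgetrc:
    wget --ca-certificate=/etc/ssl/certs/cacert.pem \
        https://members.dyndns.org/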
