Unable to establish SSL connection
I am trying to download a file from an HTTPS site. The site uses certificates generated on another server. I issued the following command:

    wget --secure-protocol=auto https://httpshost/download/Package/test.pkg

I get the following errors:

    Error: Certificate verification error for httpshost: unable to get local issuer certificate.
    Error: certificate common name `cbr.x.systems' doesn't match requested host name `httpshost'.

(The names are made up.) How do I get wget to recognize the certificate for httpshost? If I use an openssl command that logs the certificate to a file, is there a way to extract the certificate from the log file and pass it to wget? Is there a document someone can point me to that explains in detail the steps I would need to perform these downloads?

Thanks,

Ron Cadima
Bally Gaming and Systems Group
6601 S. Bermuda Rd.
Las Vegas, NV 89119
Phone: Office: (702) 914-1074  Cell: (702) 280-8129
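One possible approach, sketched below under the assumption that the hostname is the made-up `httpshost` from the message: capture the server's certificate chain with `openssl s_client`, extract the PEM blocks from the log, and hand the result to wget's `--ca-certificate` option. This addresses the "unable to get local issuer certificate" error; it does not fix the common-name mismatch, which requires a certificate issued for the requested host name (or, as a last resort, `--no-check-certificate`).

```shell
# Hypothetical hostname from the report; substitute the real server.
HOST=httpshost

# 1. Log the server's certificate chain. -showcerts prints every
#    certificate the server presents, wrapped in BEGIN/END CERTIFICATE lines.
echo | openssl s_client -connect "$HOST:443" -showcerts 2>/dev/null > s_client.log

# 2. Extract the PEM certificate blocks from the log into a CA bundle.
sed -n '/-----BEGIN CERTIFICATE-----/,/-----END CERTIFICATE-----/p' \
    s_client.log > ca-bundle.pem

# 3. Point wget at the extracted bundle.
wget --ca-certificate=ca-bundle.pem "https://$HOST/download/Package/test.pkg"
```

The `sed` range expression in step 2 answers the "extract the certificate from the log file" question directly: it keeps only the lines between each BEGIN/END CERTIFICATE pair and drops the rest of the `s_client` chatter.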
wget 1.11 alpha1 [Fwd: Bug#378691: wget --continue doesn't work with HTTP]
Hello,

a wget -c problem report against the 1.11 alpha 1 version (http://bugs.debian.org/378691). I can reproduce the problem: if I already have 1 MB downloaded, wget -c doesn't continue. Instead it starts the download again.

Forwarded message:

    [EMAIL PROTECTED]:~$ strace -o wget-strace wget -c http://ftp.iasi.roedu.net/100MB
    --14:28:07--  http://ftp.iasi.roedu.net/100MB
    Resolving ftp.iasi.roedu.net... 192.129.4.120
    Connecting to ftp.iasi.roedu.net|192.129.4.120|:80... connected.
    HTTP request sent, awaiting response... 200 OK
    Length: 104857600 (100M) [text/plain]
    Saving to: `100MB.8'

The HTTP conversation:

    GET /100MB HTTP/1.0
    User-Agent: Wget/1.11-alpha-1
    Accept: */*
    Host: ftp.iasi.roedu.net
    Connection: Keep-Alive

    HTTP/1.1 200 OK
    Date: Tue, 18 Jul 2006 11:24:14 GMT
    Server: Apache/2.2.2 (Unix)
    Last-Modified: Sat, 03 Dec 2005 09:14:42 GMT
    ETag: a002e4cb-640-1dbb0480
    Accept-Ranges: bytes
    Content-Length: 104857600
    Keep-Alive: timeout=5, max=100
    Connection: Keep-Alive
    Content-Type: text/plain

With an older version of wget, same file, same server, it works. This version does work with FTP. A strace (attached) shows that it doesn't even check whether 100MB exists before sending the HTTP request.

-- System Information:
Debian Release: testing/unstable
  APT prefers experimental
  APT policy: (500, 'experimental'), (500, 'unstable'), (500, 'testing')
Architecture: i386 (i686)
Shell: /bin/sh linked to /bin/bash
Kernel: Linux 2.6.17-1-686
Locale: LANG=en_GB.UTF-8, LC_CTYPE=en_GB.UTF-8 (charmap=UTF-8)

Versions of packages wget depends on:
ii  libc6        2.3.999.2-8  GNU C Library: Shared libraries
ii  libssl0.9.8  0.9.8b-2     SSL shared libraries

--
Noèl Köthe noel debian.org
Debian GNU/Linux, www.debian.org
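The behavior the report expects can be sketched as a shell check (this is an illustration of the resume logic, not wget's actual source): before sending the request, a resuming client must look at the local file and, if it exists, ask the server for the remaining bytes via a Range header. The strace shows wget 1.11 alpha 1 skipping exactly this step.

```shell
# File name taken from the bug report; the check is a sketch of what
# `wget -c` should do before issuing the HTTP request.
local_file=100MB
if [ -f "$local_file" ]; then
    # $((...)) strips any padding some wc implementations emit.
    offset=$(($(wc -c < "$local_file")))
    echo "Range: bytes=${offset}-"   # header a resuming client should send
else
    echo "no partial file; requesting the whole resource"
fi
```

A server honoring such a request answers `206 Partial Content`; the `200 OK` with the full `Content-Length: 104857600` in the transcript above confirms that no Range header was sent.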
Re: wget 1.11 alpha1 [Fwd: Bug#378691: wget --continue doesn't work with HTTP]
Quoting Hrvoje Niksic [EMAIL PROTECTED]:

> Mauro, you will need to look at this one.

Part of the problem is that Wget decides to save to index.html.1 although -c is in use. That is solved with the patch attached below. But the other part is that hstat.local_file is a NULL pointer when stat(hstat.local_file, &st) is used to determine whether the file already exists in the -c case. That seems to be a result of your changes to the code -- previously, hstat.local_file would get initialized in http_loop.

This looks as if it could also be the cause of the problems which I reported some weeks ago for the timestamping mode (http://www.mail-archive.com/wget@sunsite.dk/msg09083.html).

J. Roderburg