Hello community,

here is the log from the commit of package wget.5582 for openSUSE:13.2:Update checked in at 2016-09-10 11:20:32

++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:13.2:Update/wget.5582 (Old)
 and      /work/SRC/openSUSE:13.2:Update/.wget.5582.new (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "wget.5582"

Changes:
--------
New Changes file:

--- /dev/null           2016-07-07 10:01:34.856033756 +0200
+++ /work/SRC/openSUSE:13.2:Update/.wget.5582.new/wget.changes  2016-09-10 11:20:34.000000000 +0200
@@ -0,0 +1,703 @@
+-------------------------------------------------------------------
+Wed Aug 31 15:04:50 UTC 2016 - josef.moell...@suse.com
+
+- Fixed a potential race condition by creating files with a .tmp
+  extension and making them accessible to the current user only.
+  (bsc#995964, CVE-2016-7098, wget-CVE-2016-7098.patch)
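The corresponding patch (wget-CVE-2016-7098.patch, quoted in full further down) implements this by switching the temporary case from fopen() to open() with an explicit user-only mode. As a minimal standalone sketch of that idiom — the file name, buffer size and error handling here are illustrative, not wget's actual code:

#include <fcntl.h>      /* open, O_CREAT, O_TRUNC, O_WRONLY */
#include <stdio.h>      /* snprintf, fdopen, fputs, fclose */
#include <sys/stat.h>   /* S_IRUSR, S_IWUSR */

/* Create "<name>.tmp" readable and writable by the owner only, so
   other local users can neither read the payload nor tamper with
   the file while the download is in flight. */
static FILE *
open_private_tmp (const char *name)
{
  char tmp[4096];
  if (snprintf (tmp, sizeof tmp, "%s.tmp", name) >= (int) sizeof tmp)
    return NULL;                        /* name too long */

  int fd = open (tmp, O_CREAT | O_TRUNC | O_WRONLY, S_IRUSR | S_IWUSR);
  if (fd < 0)
    return NULL;                        /* open failed */

  return fdopen (fd, "wb");
}

int
main (void)
{
  FILE *fp = open_private_tmp ("index.html");   /* hypothetical name */
  if (!fp)
    return 1;
  fputs ("payload\n", fp);
  return fclose (fp) == 0 ? 0 : 1;
}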
+-------------------------------------------------------------------
+Wed Jul 27 11:55:05 UTC 2016 - josef.moell...@suse.com
+
+- Fix for an HTTP-to-FTP redirection file name confusion
+  vulnerability
+  (bnc#984060, CVE-2016-4971, wget-ftp-path-CVE-2016-4971.patch).
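The patch for this (wget-ftp-path-CVE-2016-4971.patch, quoted in full further down) threads the originally requested URL through the FTP code and derives the local file name from it unless --trust-server-names is given. A self-contained sketch of just that selection logic; struct url and the helper names here are hypothetical stand-ins for wget's internals:

#include <stdbool.h>
#include <stdio.h>

/* Stand-ins for wget's parsed-URL type and its URL -> local file
   name helper; only the selection logic below reflects the patch. */
struct url { const char *file; };

static const char *
file_name_from (const struct url *u)
{
  return u->file;
}

/* After an HTTP -> FTP redirect, name the local file after the URL
   the user originally gave, unless --trust-server-names was set. */
static const char *
redirect_file_name (const struct url *redirected,
                    const struct url *original,
                    bool trust_server_names)
{
  const struct url *src =
    (trust_server_names || original == NULL) ? redirected : original;
  return file_name_from (src);
}

int
main (void)
{
  struct url typed = { "safe-name" };       /* what the user requested */
  struct url redir = { ".bash_profile" };   /* attacker-chosen name */

  puts (redirect_file_name (&redir, &typed, false));  /* -> "safe-name" */
  puts (redirect_file_name (&redir, &typed, true));   /* -> ".bash_profile" */
  return 0;
}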
+-------------------------------------------------------------------
+Wed Oct 29 20:37:47 UTC 2014 - andreas.stie...@gmx.de
+
+- GNU wget 1.16:
+  This release contains a fix for a symlink attack which could allow
+  a malicious FTP server to create arbitrary files, directories or
+  symbolic links and set their permissions when retrieving a
+  directory recursively through FTP. [CVE-2014-4877] [boo#902709]
+  * No longer create local symbolic links by default; the
+    --retr-symlinks=no option restores the previous behaviour
+  * Use libpsl for verifying cookie domains.
+  * Default progress bar output changed.
+  * Introduce --show-progress to force display of the progress bar.
+  * Introduce --no-config. The wgetrc files will not be read.
+  * Introduce --start-pos to allow starting downloads from a
+    specified position.
+  * Fix a problem with ISA Server Proxy and keep-alive connections.
+- refresh wget-libproxy.patch for upstream changes
+- make some dependencies, only required for the testsuite, optional
+
+-------------------------------------------------------------------
+Sun Jun  8 07:19:29 UTC 2014 - andreas.stie...@gmx.de
+
+- Disable the testsuite
+
+-------------------------------------------------------------------
+Tue Jan 21 15:32:00 UTC 2014 - kpet...@suse.com
+
+- Enabled the testsuite
+- Modified libproxy.patch to include Makefile in tests/
+
+-------------------------------------------------------------------
+Sun Jan 19 22:02:25 UTC 2014 - andreas.stie...@gmx.de
+
+- GNU wget 1.15
+  * Add support for --method.
+  * Add support for file names longer than MAX_FILE.
+  * Support FTP listing for the FTP Server on Windows Server 2008 R2.
+  * Fix a regression when -c and --content-disposition are used
+    together.
+  * Support shorthand URLs in an input file.
+  * Fix -c with servers that don't specify a content-length.
+  * Add support for MD5-SESS.
+  * Do not fail on non-fatal GNU TLS alerts during handshake.
+  * Add support for --https-only. When used, wget will follow only
+    HTTPS links in recursive mode.
+  * Support Perfect-Forward Secrecy in --secure-protocol.
+  * Fix a problem with some IRI links that were not followed when
+    contained in an HTML document.
+  * Support some FTP servers that return an empty list with "LIST -a".
+  * Specify Host with the HTTP CONNECT method.
+  * Use the correct HTTP method on a redirection.
+- verify source tarball signatures
+- modified patches:
+  * wget-1.14-openssl-no-intern.patch for upstream changes
+  * wget-fix-pod-syntax.diff for upstream changes
+
+-------------------------------------------------------------------
+Thu Jun 20 13:29:01 UTC 2013 - co...@suse.com
+
+- add wget-fix-pod-syntax.diff to fix build with perl 5.18
+
+-------------------------------------------------------------------
+Thu May  2 17:50:50 UTC 2013 - p.drou...@gmail.com
+
+- Update to version 1.14
+  + add support for content-on-error. It allows storing the HTTP
+    payload on 4xx or 5xx errors.
+  + add support for WARC files.
+  + fix a memory leak problem in the GNU TLS backend.
+  + autoreconf works again for distributed tarballs.
+  + print some diagnostic messages to stderr, not to stdout.
+  + report stdout close errors.
+  + accept the --report-speed option.
+  + enable client certificates when GNU TLS is used.
+  + add support for TLS Server Name Indication.
+  + accept the arguments --accept-regex and --reject-regex.
+  + the GNU TLS backend honors the timeout value correctly.
+  + add support for RFC 2617 Digest Access Authentication.
+- Drop patches obsoleted by upstream
+  + wget-sni.patch
+  + wget-stdio.h.patch
+- Rebase patches to work with upstream
+  + wget-openssl-no-intern.patch > wget-1.14-openssl-no-intern.patch
+  + wget-no-ssl-comp.patch > wget-1.14-no-ssl-comp.patch
+
+-------------------------------------------------------------------
+Thu May  2 09:49:33 UTC 2013 - seife+...@b1-systems.com
+
+- add makeinfo BuildRequires to fix build
+
+-------------------------------------------------------------------
+Fri Apr  5 09:51:58 UTC 2013 - idon...@suse.com
+
+- Add Source URL, see https://en.opensuse.org/SourceUrls
+
+-------------------------------------------------------------------
+Mon Nov 12 02:04:05 UTC 2012 - crrodrig...@opensuse.org
+
+- wget-no-ssl-comp.patch: Since the appearance of the "CRIME attack"
+  (CVE-2012-4929), HTTPS clients must not negotiate SSL compression.
+
+-------------------------------------------------------------------
+Thu Sep 27 13:46:49 UTC 2012 - crrodrig...@opensuse.org
+
+- Add wget-openssl-no-intern.patch to build with OPENSSL_NO_SSL_INTERN,
+  which is OpenSSL's poor man's version of visibility, to avoid
+  breaking application ABI on library-internal changes.
+
+-------------------------------------------------------------------
+Fri Jul 27 20:03:31 UTC 2012 - a...@suse.de
+
+- Fix build with missing gets declaration (glibc 2.16)
+
+-------------------------------------------------------------------
+Wed Mar 21 19:44:53 UTC 2012 - dims...@opensuse.org
+
+- Adjust wget-libproxy.patch: give debug output only when
+  opt.debug is set to a non-zero value, i.e. when -d is specified.
+  Fixes bnc#753242.
+
+-------------------------------------------------------------------
+Fri Dec  2 15:59:32 UTC 2011 - co...@suse.com
+
+- add automake as BuildRequires to avoid implicit dependency
+
+-------------------------------------------------------------------
+Wed Oct 19 09:34:59 UTC 2011 - m...@suse.com
+
+- New version: 1.13.4:
+  * Now --timestamping and --continue work well together.
+  * Return a network failure when FTP downloads fail and
+    --timestamping is specified.
+  * Support HTTP/1.1
+  * Fix some portability issues.
+  * Handle a malformed status line in an HTTP response properly.
+  * Ignore zero-length domains in $no_proxy.
+  * Exit with failure if -k is specified and -O is not a regular
+    file.
+  * Cope better with unclosed HTML tags.
+  * Print diagnostic messages to stderr, not stdout.
+  * Do not use an additional HEAD request when
+    --content-disposition is used, but use GET directly.
+  * Report the average transfer speed correctly when multiple
+    URLs are specified and -c influences the transferred data
+    amount.
+  * By default, on server redirects, use the original URL to get
+    the local file name. Closes CVE-2010-2252. This introduces a
+    backward incompatibility; any script that relies on the old
+    behaviour must use --trust-server-names.
+  * Fix a problem when -k is used and some URLs are specified
+    through CSS.
+  * Correctly convert URLs that need to be encoded to local files
+    when following links.
+  * Use persistent connections with proxies supporting them.
+  * Print the total download time as part of the summary for
+    recursive downloads.
+  * Now it is possible to specify a different startup
+    configuration file through the --config option.
+  * Fix an infinite loop with the error '<filename> has sprung
+    into existence' on a network error when -nc is used.
+  * Now --adjust-extension does not modify the file extension if
+    the file ends in .htm.
+  * HTTP/1.1 307 redirects now keep the request method.
+  * Now --no-parent doesn't fetch undesired files if HTTP and
+    HTTPS are used by the same host on different pages.
+  * Do not attempt to remove the file if it is not in the accept
+    rules but it is the output destination file.
+  * Introduce `show_all_dns_entries' to print all IP addresses
+    corresponding to a DNS name when it is resolved.
+- Adjust patches to the new version.
+- wget-1.12-nosslv2.patch got included upstream.
+
+-------------------------------------------------------------------
+Sat Oct 15 18:19:59 UTC 2011 - crrodrig...@opensuse.org
+
+- fix typo in the SNI patch: in the IPv6 case it should be
+  is_valid_ipv6_address() instead of is_valid_ipv4_address()
+- Add comment to the patch referencing the upstream tracker.
+
+-------------------------------------------------------------------
+Fri Oct 14 05:01:53 UTC 2011 - crrodrig...@opensuse.org
+
+- Update nosslv2 patch with the version in upstream
+- Wget now supports SNI (server name indication), patch
++++ 506 more lines (skipped)
++++ between /dev/null
++++ and /work/SRC/openSUSE:13.2:Update/.wget.5582.new/wget.changes

New:
----
  wget-1.14-no-ssl-comp.patch
  wget-1.14-openssl-no-intern.patch
  wget-1.16.tar.xz
  wget-1.16.tar.xz.sig
  wget-CVE-2016-7098.patch
  wget-fix-pod-syntax.diff
  wget-ftp-path-CVE-2016-4971.patch
  wget-libproxy.patch
  wget.changes
  wget.keyring
  wget.spec
  wgetrc.patch

++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

Other differences:
------------------
++++++ wget.spec ++++++
#
# spec file for package wget
#
# Copyright (c) 2016 SUSE LINUX GmbH, Nuernberg, Germany.
#
# All modifications and additions to the file contributed by third parties
# remain the property of their copyright owners, unless otherwise agreed
# upon. The license for this file, and modifications and additions to the
# file, is the same license as for the pristine package itself (unless the
# license for the pristine package is not an Open Source License, in which
# case the license is the MIT License). An "Open Source License" is a
# license that conforms to the Open Source Definition (Version 1.9)
# published by the Open Source Initiative.

# Please submit bugfixes or comments via http://bugs.opensuse.org/
#


%bcond_with regression_tests

Name:           wget
Version:        1.16
Release:        0
Summary:        A Tool for Mirroring FTP and HTTP Servers
License:        GPL-3.0+
Group:          Productivity/Networking/Web/Utilities
Url:            https://www.gnu.org/software/wget/
Source:         https://ftp.gnu.org/gnu/wget/%name-%version.tar.xz
Source1:        https://ftp.gnu.org/gnu/wget/%name-%version.tar.xz.sig
Source2:        https://savannah.gnu.org/project/memberlist-gpgkeys.php?group=wget&download=1#/wget.keyring
Patch0:         wgetrc.patch
Patch1:         wget-libproxy.patch
Patch5:         wget-1.14-openssl-no-intern.patch
Patch6:         wget-1.14-no-ssl-comp.patch
# PATCH-FIX-OPENSUSE fix pod syntax for perl 5.18 co...@suse.de
Patch7:         wget-fix-pod-syntax.diff
Patch8:         wget-ftp-path-CVE-2016-4971.patch
Patch9:         wget-CVE-2016-7098.patch
BuildRequires:  libpng-devel
%if 0%{suse_version} > 1110
BuildRequires:  libproxy-devel
%endif
BuildRequires:  automake
BuildRequires:  libidn-devel
BuildRequires:  makeinfo
BuildRequires:  openssl-devel
%if %{with regression_tests}
# For the Testsuite
BuildRequires:  perl-HTTP-Daemon
BuildRequires:  perl-IO-Socket-SSL
%endif
BuildRequires:  pkg-config
BuildRequires:  xz
PreReq:         %install_info_prereq
BuildRoot:      %{_tmppath}/%{name}-%{version}-build

%description
Wget enables you to retrieve WWW documents or FTP files from a server.
This can be done in script files or via the command line.

%prep
%setup -q
%patch0
%if 0%{suse_version} > 1110
%patch1 -p1
%endif
%patch5 -p1
%patch6
%patch7 -p1
%patch8 -p1
%patch9 -p1

%build
%if 0%{suse_version} > 1110
# only wget-libproxy.patch needs this
autoreconf --force
%endif
%configure --with-ssl=openssl
make %{?_smp_mflags}

%check
%if %{with regression_tests}
make -C tests/ check
%endif

%install
%makeinstall
%find_lang %{name}

%post
%install_info --info-dir=%{_infodir} %{_infodir}/%{name}.info.gz

%postun
%install_info_delete --info-dir=%{_infodir} %{_infodir}/%{name}.info.gz

%files -f %{name}.lang
%defattr(-,root,root)
%doc AUTHORS COPYING NEWS README MAILING-LIST
%doc doc/sample.wgetrc util/rmold.pl
%{_mandir}/*/wget*
%{_infodir}/wget*
%config(noreplace) %{_sysconfdir}/wgetrc
%{_bindir}/*

%changelog
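A note on the %bcond_with regression_tests conditional near the top of the spec: %bcond_with declares the condition off by default, so the perl-HTTP-Daemon/perl-IO-Socket-SSL dependencies and the %check testsuite run are picked up only when the package is rebuilt with the condition enabled (e.g. rpmbuild --with regression_tests). This matches the "Disable the testsuite" changelog entry above.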
++++++ wget-1.14-no-ssl-comp.patch ++++++
--- src/openssl.c.orig
+++ src/openssl.c
@@ -241,7 +241,9 @@
   /* The OpenSSL library can handle renegotiations automatically, so
      tell it to do so. */
   SSL_CTX_set_mode (ssl_ctx, SSL_MODE_AUTO_RETRY);
-
+#ifdef SSL_OP_NO_COMPRESSION
+  SSL_CTX_set_options(ssl_ctx, SSL_OP_NO_COMPRESSION);
+#endif
   return true;

  error:
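The patch guards the call with #ifdef because SSL_OP_NO_COMPRESSION only exists in OpenSSL 1.0.0 and newer. A small self-contained check that the option actually took effect on a context — the helper is ours for illustration, not part of wget or the patch:

#include <openssl/ssl.h>

/* Returns 1 when the context refuses TLS-level compression.  The
   option bit exists only in OpenSSL >= 1.0.0, hence the #ifdef --
   the same guard the patch above uses. */
static int
ctx_refuses_compression (SSL_CTX *ctx)
{
#ifdef SSL_OP_NO_COMPRESSION
  return (SSL_CTX_get_options (ctx) & SSL_OP_NO_COMPRESSION) != 0;
#else
  return 0;   /* old library: cannot even express the option */
#endif
}

int
main (void)
{
  SSL_library_init ();
  SSL_CTX *ctx = SSL_CTX_new (SSLv23_client_method ());
  if (!ctx)
    return 2;
#ifdef SSL_OP_NO_COMPRESSION
  SSL_CTX_set_options (ctx, SSL_OP_NO_COMPRESSION);
#endif
  int ok = ctx_refuses_compression (ctx);
  SSL_CTX_free (ctx);
  return ok ? 0 : 1;
}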
++++++ wget-1.14-openssl-no-intern.patch ++++++
---
 src/openssl.c | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)

Index: wget-1.15/src/openssl.c
===================================================================
--- wget-1.15.orig/src/openssl.c        2014-01-19 21:35:59.000000000 +0000
+++ wget-1.15/src/openssl.c     2014-01-19 21:37:27.000000000 +0000
@@ -29,6 +29,7 @@ Corresponding Source for a non-source fo
 shall include the source code for the parts of OpenSSL used as well
 as that of the covered work.  */

+#define OPENSSL_NO_SSL_INTERN
 #include "wget.h"

 #include <assert.h>
@@ -479,7 +480,7 @@ ssl_connect_wget (int fd, const char *ho
           DEBUGP (("SSL handshake timed out.\n"));
           goto timeout;
         }
-      if (scwt_ctx.result <= 0 || conn->state != SSL_ST_OK)
+      if (scwt_ctx.result <= 0 || SSL_get_state(conn) != SSL_ST_OK)
         goto error;

   ctx = xnew0 (struct openssl_transport_context);
++++++ wget-CVE-2016-7098.patch ++++++
Index: wget-1.16/src/http.c
===================================================================
--- wget-1.16.orig/src/http.c
+++ wget-1.16/src/http.c
@@ -39,6 +39,8 @@ as that of the covered work. */
 #include <errno.h>
 #include <time.h>
 #include <locale.h>
+#include <fcntl.h>
+
 #include "hash.h"
 #include "http.h"
@@ -1470,6 +1472,7 @@ struct http_stat
   wgint orig_file_size;         /* size of file to compare for
                                    time-stamping */
   time_t orig_file_tstamp;      /* time-stamp of file to compare for
                                  * time-stamping */
+  bool temporary;               /* downloading a temporary file */
 };

 static void
@@ -2479,6 +2482,15 @@ read_header:
           xfree_null (local_file);
         }

+      hs->temporary = opt.delete_after || opt.spider || !acceptable (hs->local_file);
+      if (hs->temporary)
+        {
+          char *tmp = NULL;
+          asprintf (&tmp, "%s.tmp", hs->local_file);
+          xfree (hs->local_file);
+          hs->local_file = tmp;
+        }
+
       /* TODO: perform this check only once. */
       if (!hs->existence_checked && file_exists_p (hs->local_file))
         {
@@ -2924,7 +2936,11 @@ read_header:
               open_id = 22;
               fp = fopen (hs->local_file, "wb", FOPEN_OPT_ARGS);
 #else /* def __VMS */
-              fp = fopen (hs->local_file, "wb");
+              if (hs->temporary)
+                fp = fdopen (open (hs->local_file, O_BINARY | O_CREAT | O_TRUNC | O_WRONLY, S_IRUSR | S_IWUSR), "wb");
+              else
+                fp = fopen (hs->local_file, "wb");
+
 #endif /* def __VMS [else] */
             }
           else
++++++ wget-fix-pod-syntax.diff ++++++
---
 doc/texi2pod.pl | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

Index: wget-1.15/doc/texi2pod.pl
===================================================================
--- wget-1.15.orig/doc/texi2pod.pl      2014-01-19 21:41:04.000000000 +0000
+++ wget-1.15/doc/texi2pod.pl   2014-01-19 21:41:31.000000000 +0000
@@ -294,7 +294,7 @@ while(<$inf>) {
         $_ = "\n=item C<$thing>\n";
       } else {
         # Entity escapes prevent munging by the <> processing below.
-        $_ = "\n=item $ic\<$thing\>\n";
+        $_ = "\n=item Z<>$ic\<$thing\>\n";
       }
     } else {
       $_ = "\n=item $ic\n";
++++++ wget-ftp-path-CVE-2016-4971.patch ++++++
>From e996e322ffd42aaa051602da182d03178d0f13e1 Mon Sep 17 00:00:00 2001
From: Giuseppe Scrivano <gscri...@redhat.com>
Date: Mon, 06 Jun 2016 19:20:24 +0000
Subject: ftp: understand --trust-server-names on an HTTP->FTP redirect

If --trust-server-names is not used, FTP will also get the destination
file name from the original URL specified by the user instead of the
redirected URL.  Closes CVE-2016-4971.

* src/ftp.c (ftp_get_listing): Add argument original_url.
(getftp): Likewise.
(ftp_loop_internal): Likewise.  Use original_url to generate the
file name if --trust-server-names is not provided.
(ftp_retrieve_glob): Likewise.
(ftp_loop): Likewise.

Signed-off-by: Giuseppe Scrivano <gscri...@redhat.com>
---
Index: wget-1.16/src/ftp.c
===================================================================
--- wget-1.16.orig/src/ftp.c
+++ wget-1.16/src/ftp.c
@@ -235,14 +235,15 @@ print_length (wgint size, wgint start, b
   logputs (LOG_VERBOSE, !authoritative ? _(" (unauthoritative)\n") : "\n");
 }

-static uerr_t ftp_get_listing (struct url *, ccon *, struct fileinfo **);
+static uerr_t ftp_get_listing (struct url *, struct url *, ccon *, struct fileinfo **);

 /* Retrieves a file with denoted parameters through opening an FTP
    connection to the server.  It always closes the data connection,
    and closes the control connection in case of error.  If warc_tmp
    is non-NULL, the downloaded data will be written there as well.  */
 static uerr_t
-getftp (struct url *u, wgint passed_expected_bytes, wgint *qtyread,
+getftp (struct url *u, struct url *original_url,
+        wgint passed_expected_bytes, wgint *qtyread,
         wgint restval, ccon *con, int count, wgint *last_expected_bytes,
         FILE *warc_tmp)
 {
@@ -992,7 +993,7 @@ Error in server response, closing contro
         {
           bool exists = false;
           struct fileinfo *f;
-          uerr_t _res = ftp_get_listing (u, con, &f);
+          uerr_t _res = ftp_get_listing (u, original_url, con, &f);
           /* Set the DO_RETR command flag again, because it gets unset when
              calling ftp_get_listing() and would otherwise cause an assertion
              failure earlier on when this function gets repeatedly called
@@ -1536,7 +1537,8 @@ Error in server response, closing contro
    This loop either gets commands from con, or (if ON_YOUR_OWN is
    set), makes them up to retrieve the file given by the URL.  */
 static uerr_t
-ftp_loop_internal (struct url *u, struct fileinfo *f, ccon *con, char **local_file)
+ftp_loop_internal (struct url *u, struct url *original_url, struct fileinfo *f,
+                   ccon *con, char **local_file)
 {
   int count, orig_lp;
   wgint restval, len = 0, qtyread = 0;
@@ -1560,7 +1562,7 @@ ftp_loop_internal (struct url *u, struct
   else
     {
       /* URL-derived file.  Consider "-O file" name. */
-      con->target = url_file_name (u, NULL);
+      con->target = url_file_name (opt.trustservernames || !original_url ? u : original_url, NULL);
       if (!opt.output_document)
         locf = con->target;
       else
@@ -1676,8 +1678,8 @@ ftp_loop_internal (struct url *u, struct

       /* If we are working on a WARC record, getftp should also write
          to the warc_tmp file. */
-      err = getftp (u, len, &qtyread, restval, con, count, &last_expected_bytes,
-                    warc_tmp);
+      err = getftp (u, original_url, len, &qtyread, restval, con, count,
+                    &last_expected_bytes, warc_tmp);

       if (con->csock == -1)
         con->st &= ~DONE_CWD;
@@ -1830,7 +1832,8 @@ Removing file due to --delete-after in f

 /* Return the directory listing in a reusable format.  The directory
    is specifed in u->dir.  */
 static uerr_t
-ftp_get_listing (struct url *u, ccon *con, struct fileinfo **f)
+ftp_get_listing (struct url *u, struct url *original_url, ccon *con,
+                 struct fileinfo **f)
 {
   uerr_t err;
   char *uf;                     /* url file name */
@@ -1851,7 +1854,7 @@ ftp_get_listing (struct url *u, ccon *co
   con->target = xstrdup (lf);
   xfree (lf);

-  err = ftp_loop_internal (u, NULL, con, NULL);
+  err = ftp_loop_internal (u, original_url, NULL, con, NULL);
   lf = xstrdup (con->target);
   xfree (con->target);
   con->target = old_target;
@@ -1874,8 +1877,9 @@ ftp_get_listing (struct url *u, ccon *co
   return err;
 }

-static uerr_t ftp_retrieve_dirs (struct url *, struct fileinfo *, ccon *);
-static uerr_t ftp_retrieve_glob (struct url *, ccon *, int);
+static uerr_t ftp_retrieve_dirs (struct url *, struct url *,
+                                 struct fileinfo *, ccon *);
+static uerr_t ftp_retrieve_glob (struct url *, struct url *, ccon *, int);
 static struct fileinfo *delelement (struct fileinfo *, struct fileinfo **);
 static void freefileinfo (struct fileinfo *f);
@@ -1887,7 +1891,8 @@ static void freefileinf
    If opt.recursive is set, after all files have been retrieved,
    ftp_retrieve_dirs will be called to retrieve the directories.  */
 static uerr_t
-ftp_retrieve_list (struct url *u, struct fileinfo *f, ccon *con)
+ftp_retrieve_list (struct url *u, struct url *original_url,
+                   struct fileinfo *f, ccon *con)
 {
   static int depth = 0;
   uerr_t err;
@@ -2046,7 +2051,7 @@ Already have correct symlink %s -> %s\n\
       else                /* opt.retr_symlinks */
         {
           if (dlthis)
-            err = ftp_loop_internal (u, f, con, NULL);
+            err = ftp_loop_internal (u, original_url, f, con, NULL);
         }                   /* opt.retr_symlinks */
       break;
     case FT_DIRECTORY:
@@ -2057,7 +2062,7 @@ Already have correct symlink %s -> %s\n\
     case FT_PLAINFILE:
       /* Call the retrieve loop.  */
       if (dlthis)
-        err = ftp_loop_internal (u, f, con, NULL);
+        err = ftp_loop_internal (u, original_url, f, con, NULL);
       break;
     case FT_UNKNOWN:
       logprintf (LOG_NOTQUIET, _("%s: unknown/unsupported file type.\n"),
@@ -2122,7 +2127,7 @@ Already have correct symlink %s -> %s\n\
   /* We do not want to call ftp_retrieve_dirs here */
   if (opt.recursive &&
       !(opt.reclevel != INFINITE_RECURSION && depth >= opt.reclevel))
-    err = ftp_retrieve_dirs (u, orig, con);
+    err = ftp_retrieve_dirs (u, original_url, orig, con);
   else if (opt.recursive)
     DEBUGP ((_("Will not retrieve dirs since depth is %d (max %d).\n"),
              depth, opt.reclevel));
@@ -2135,7 +2140,8 @@ Already have correct symlink %s -> %s\n\
    ftp_retrieve_glob on each directory entry.  The function knows
    about excluded directories.  */
 static uerr_t
-ftp_retrieve_dirs (struct url *u, struct fileinfo *f, ccon *con)
+ftp_retrieve_dirs (struct url *u, struct url *original_url,
+                   struct fileinfo *f, ccon *con)
 {
   char *container = NULL;
   int container_size = 0;
@@ -2185,7 +2191,7 @@ Not descending to %s as it is excluded/n
       odir = xstrdup (u->dir);  /* because url_set_dir will free u->dir. */
       url_set_dir (u, newdir);
-      ftp_retrieve_glob (u, con, GLOB_GETALL);
+      ftp_retrieve_glob (u, original_url, con, GLOB_GETALL);
       url_set_dir (u, odir);
       xfree (odir);
@@ -2244,14 +2250,15 @@ is_invalid_entry (struct fileinfo *f)
    GLOB_GLOBALL, use globbing; if it's GLOB_GETALL, download the
    whole directory.  */
 static uerr_t
-ftp_retrieve_glob (struct url *u, ccon *con, int action)
+ftp_retrieve_glob (struct url *u, struct url *original_url,
+                   ccon *con, int action)
 {
   struct fileinfo *f, *start;
   uerr_t res;

   con->cmd |= LEAVE_PENDING;

-  res = ftp_get_listing (u, con, &start);
+  res = ftp_get_listing (u, original_url, con, &start);
   if (res != RETROK)
     return res;
   /* First: weed out that do not conform the global rules given in
@@ -2347,7 +2354,7 @@ ftp_retrieve_glob (struct url *u, ccon *
   if (start)
     {
       /* Just get everything.  */
-      res = ftp_retrieve_list (u, start, con);
+      res = ftp_retrieve_list (u, original_url, start, con);
     }
   else
     {
@@ -2363,7 +2370,7 @@ ftp_retrieve_glob (struct url *u, ccon *
     {
       /* Let's try retrieving it anyway.  */
       con->st |= ON_YOUR_OWN;
-      res = ftp_loop_internal (u, NULL, con, NULL);
+      res = ftp_loop_internal (u, original_url, NULL, con, NULL);
       return res;
     }

@@ -2383,8 +2390,8 @@ ftp_retrieve_glob (struct url *u, ccon *
    of URL.  Inherently, its capabilities are limited on what can be
    encoded into a URL.  */
 uerr_t
-ftp_loop (struct url *u, char **local_file, int *dt, struct url *proxy,
-          bool recursive, bool glob)
+ftp_loop (struct url *u, struct url *original_url, char **local_file, int *dt,
+          struct url *proxy, bool recursive, bool glob)
 {
   ccon con;                     /* FTP connection */
   uerr_t res;
@@ -2405,16 +2412,17 @@ ftp_loop (struct url *u, char **local_fi
   if (!*u->file && !recursive)
     {
       struct fileinfo *f;
-      res = ftp_get_listing (u, &con, &f);
+      res = ftp_get_listing (u, original_url, &con, &f);

       if (res == RETROK)
         {
           if (opt.htmlify && !opt.spider)
             {
+              struct url *url_file = opt.trustservernames ? u : original_url;
               char *filename = (opt.output_document
                                 ? xstrdup (opt.output_document)
                                 : (con.target ? xstrdup (con.target)
-                                   : url_file_name (u, NULL)));
+                                   : url_file_name (url_file, NULL)));
               res = ftp_index (filename, u, f);
               if (res == FTPOK && opt.verbose)
                 {
@@ -2459,11 +2467,11 @@ ftp_loop (struct url *u, char **local_fi
           /* ftp_retrieve_glob is a catch-all function that gets called
              if we need globbing, time-stamping, recursion or preserve
              permissions.  Its third argument is just what we really need.  */
-          res = ftp_retrieve_glob (u, &con,
+          res = ftp_retrieve_glob (u, original_url, &con,
                                    ispattern ? GLOB_GLOBALL : GLOB_GETONE);
         }
       else
-        res = ftp_loop_internal (u, NULL, &con, local_file);
+        res = ftp_loop_internal (u, original_url, NULL, &con, local_file);
     }
   if (res == FTPOK)
     res = RETROK;
Index: wget-1.16/src/ftp.h
===================================================================
--- wget-1.16.orig/src/ftp.h
+++ wget-1.16/src/ftp.h
@@ -152,7 +152,8 @@ enum wget_ftp_fstatus
 };

 struct fileinfo *ftp_parse_ls (const char *, const enum stype);
-uerr_t ftp_loop (struct url *, char **, int *, struct url *, bool, bool);
+uerr_t ftp_loop (struct url *, struct url *, char **, int *, struct url *,
+                 bool, bool);

 uerr_t ftp_index (const char *, struct url *, struct fileinfo *);
Index: wget-1.16/src/retr.c
===================================================================
--- wget-1.16.orig/src/retr.c
+++ wget-1.16/src/retr.c
@@ -807,7 +807,8 @@ retrieve_url (struct url * orig_parsed,
       if (redirection_count)
         oldrec = glob = false;

-      result = ftp_loop (u, &local_file, dt, proxy_url, recursive, glob);
+      result = ftp_loop (u, orig_parsed, &local_file, dt, proxy_url,
+                         recursive, glob);
       recursive = oldrec;

       /* There is a possibility of having HTTP being redirected to
++++++ wget-libproxy.patch ++++++
---
 configure.ac      | 16 ++++++++++++++++
 src/Makefile.am   |  2 +-
 src/retr.c        | 37 +++++++++++++++++++++++++++++++++++++
 tests/Makefile.am |  1 +
 4 files changed, 55 insertions(+), 1 deletion(-)

Index: wget-1.16/configure.ac
===================================================================
--- wget-1.16.orig/configure.ac 2014-10-29 20:41:01.000000000 +0000
+++ wget-1.16/configure.ac      2014-10-29 20:41:05.000000000 +0000
@@ -366,6 +366,22 @@ else
 fi

+dnl
+dnl libproxy support
+dnl
+AC_ARG_ENABLE(libproxy,
+  [  --enable-libproxy       libproxy support for system wide proxy configuration])
+if test "${enable_libproxy}" != "no"
+then
+  PKG_CHECK_MODULES([libproxy], [libproxy-1.0], [enable_libproxy=yes], [enable_libproxy=no])
+fi
+if test "${enable_libproxy}" = "yes"
+then
+  AC_SUBST(libproxy_CFLAGS)
+  AC_SUBST(libproxy_LIBS)
+  AC_DEFINE([HAVE_LIBPROXY], 1, [Define when using libproxy])
+fi
+
 dnl **********************************************************************
 dnl Checks for IPv6
 dnl **********************************************************************
Index: wget-1.16/src/Makefile.am
===================================================================
--- wget-1.16.orig/src/Makefile.am      2014-10-29 20:41:01.000000000 +0000
+++ wget-1.16/src/Makefile.am   2014-10-29 20:41:05.000000000 +0000
@@ -37,7 +37,7 @@ endif
 # The following line is losing on some versions of make!
 DEFS += -DSYSTEM_WGETRC=\"$(sysconfdir)/wgetrc\" -DLOCALEDIR=\"$(localedir)\"
-LIBS += $(LIBICONV) $(LIBINTL) $(LIB_CLOCK_GETTIME)
+LIBS += $(LIBICONV) $(LIBINTL) $(libproxy_LIBS) $(LIB_CLOCK_GETTIME)

 EXTRA_DIST = css.l css.c css_.c build_info.c.in
Index: wget-1.16/src/retr.c
===================================================================
--- wget-1.16.orig/src/retr.c   2014-10-29 20:41:01.000000000 +0000
+++ wget-1.16/src/retr.c        2014-10-29 20:41:05.000000000 +0000
@@ -57,6 +57,10 @@ as that of the covered work.  */
 #include "html-url.h"
 #include "iri.h"

+#ifdef HAVE_LIBPROXY
+#include "proxy.h"
+#endif
+
 /* Total size of downloaded files.  Used to enforce quota.  */
 SUM_SIZE_INT total_downloaded_bytes;

@@ -1266,7 +1270,40 @@ getproxy (struct url *u)
       break;
     }
   if (!proxy || !*proxy)
+#ifdef HAVE_LIBPROXY
+    {
+      pxProxyFactory *pf = px_proxy_factory_new();
+      if (!pf)
+        {
+          debug_logprintf (_("Allocating memory for libproxy failed"));
+          return NULL;
+        }
+      int i;
+      char direct[] = "direct://";
+
+      debug_logprintf (_("asking libproxy about url '%s'\n"), u->url);
+      char **proxies = px_proxy_factory_get_proxies(pf, u->url);
+      if (proxies[0])
+        {
+          char *check = NULL;
+          asprintf(&check, "%s", proxies[0]);
+          debug_logprintf (_("libproxy suggest to use '%s'\n"), check);
+          if (strcmp(check, direct) != 0)
+            {
+              asprintf(&proxy, "%s", proxies[0]);
+              debug_logprintf (_("case 2: libproxy setting to use '%s'\n"), proxy);
+            }
+        }
+      for (i = 0; proxies[i]; i++) free(proxies[i]);
+      free(proxies);
+      free(pf);
+
+      if (!proxy || !*proxy)
+        return NULL;
+    }
+#else
     return NULL;
+#endif

   /* Handle shorthands.  `rewritten_storage' is a kludge to allow
      getproxy() to return static storage. */
Index: wget-1.16/tests/Makefile.am
===================================================================
--- wget-1.16.orig/tests/Makefile.am    2014-10-29 20:41:40.000000000 +0000
+++ wget-1.16/tests/Makefile.am 2014-10-29 20:42:18.000000000 +0000
@@ -33,6 +33,7 @@
 # Version: $(VERSION)
 #

+LIBS += $(libproxy_LIBS)

 ../src/wget$(EXEEXT):
        cd ../src && $(MAKE) $(AM_MAKEFLAGS)
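For context on the retr.c hunk above: the libproxy-1.0 C API it uses is small — create a factory, ask for proxy candidates for a URL, free the list. A minimal standalone example of those calls (the URL is illustrative; compile flags come from pkg-config libproxy-1.0; px_proxy_factory_free is the library's own destructor, used here where the patch calls plain free()):

#include <proxy.h>   /* libproxy-1.0 */
#include <stdio.h>
#include <stdlib.h>

int
main (void)
{
  pxProxyFactory *pf = px_proxy_factory_new ();
  if (!pf)
    return 1;

  /* Returns a NULL-terminated list of candidates such as
     "direct://" or "http://proxy.example.com:8080/". */
  char **proxies = px_proxy_factory_get_proxies (pf, "http://www.gnu.org/");
  for (int i = 0; proxies && proxies[i]; i++)
    {
      printf ("candidate proxy: %s\n", proxies[i]);
      free (proxies[i]);
    }
  free (proxies);

  px_proxy_factory_free (pf);
  return 0;
}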
++++++ wgetrc.patch ++++++
Index: doc/sample.wgetrc
===================================================================
--- doc/sample.wgetrc.orig
+++ doc/sample.wgetrc
@@ -114,6 +114,9 @@
 # To try ipv6 addresses first:
 #prefer-family = IPv6
+#
+# Let the DNS resolver decide whether to prefer IPv4 or IPv6
+prefer-family = none

 # Set default IRI support state
 #iri = off