Brian May <[email protected]> writes:

> Oh wait, this is a debian native package. Means I will probably have to
> patch the files directly, not rely on debian/patches. So was only
> working before because I was testing with patches applied.
>
> Curiously I am getting a test failure when testing without my patches.
Attached is the latest patch, now taking into account that this is a Debian
native package without any patches applied.
-- 
Brian May <[email protected]>
diff -Nru ikiwiki-3.20141016.4/CHANGELOG ikiwiki-3.20141016.4+deb8u1/CHANGELOG
--- ikiwiki-3.20141016.4/CHANGELOG	2017-01-12 05:18:52.000000000 +1100
+++ ikiwiki-3.20141016.4+deb8u1/CHANGELOG	2019-03-07 17:35:55.000000000 +1100
@@ -1,3 +1,10 @@
+ikiwiki (3.20141016.4+deb8u1) jessie-security; urgency=high
+
+  * Non-maintainer upload by the LTS Team.
+  * CVE-2019-9187: Fix server-side request forgery via aggregate plugin.
+
+ -- Brian May <[email protected]>  Thu, 07 Mar 2019 17:35:55 +1100
+
 ikiwiki (3.20141016.4) jessie-security; urgency=high
 
   * Reference CVE-2016-4561 in 3.20141016.3 changelog
diff -Nru ikiwiki-3.20141016.4/debian/changelog ikiwiki-3.20141016.4+deb8u1/debian/changelog
--- ikiwiki-3.20141016.4/debian/changelog	2017-01-12 05:18:52.000000000 +1100
+++ ikiwiki-3.20141016.4+deb8u1/debian/changelog	2019-03-07 17:35:55.000000000 +1100
@@ -1,3 +1,10 @@
+ikiwiki (3.20141016.4+deb8u1) jessie-security; urgency=high
+
+  * Non-maintainer upload by the LTS Team.
+  * CVE-2019-9187: Fix server-side request forgery via aggregate plugin.
+
+ -- Brian May <[email protected]>  Thu, 07 Mar 2019 17:35:55 +1100
+
 ikiwiki (3.20141016.4) jessie-security; urgency=high
 
   * Reference CVE-2016-4561 in 3.20141016.3 changelog
diff -Nru ikiwiki-3.20141016.4/debian/control ikiwiki-3.20141016.4+deb8u1/debian/control
--- ikiwiki-3.20141016.4/debian/control	2017-01-12 05:18:52.000000000 +1100
+++ ikiwiki-3.20141016.4+deb8u1/debian/control	2019-03-07 17:35:55.000000000 +1100
@@ -17,7 +17,8 @@
  libnet-openid-consumer-perl,
  libxml-feed-perl,
  libxml-parser-perl,
- libxml-twig-perl
+ libxml-twig-perl,
+ liblwpx-paranoidagent-perl,
 Maintainer: Simon McVittie <[email protected]>
 Uploaders: Josh Triplett <[email protected]>
 Standards-Version: 3.9.5
diff -Nru ikiwiki-3.20141016.4/doc/plugins/aggregate.mdwn ikiwiki-3.20141016.4+deb8u1/doc/plugins/aggregate.mdwn
--- ikiwiki-3.20141016.4/doc/plugins/aggregate.mdwn	2017-01-12 05:18:52.000000000 +1100
+++ ikiwiki-3.20141016.4+deb8u1/doc/plugins/aggregate.mdwn	2019-03-07 17:35:55.000000000 +1100
@@ -11,6 +11,10 @@
 one. Either the [[htmltidy]] or [[htmlbalance]] plugin is suggested, since
 feeds can easily contain html problems, some of which these plugins can fix.
 
+Installing the [[!cpan LWPx::ParanoidAgent]] Perl module is strongly
+recommended. The [[!cpan LWP]] module can also be used, but is susceptible
+to server-side request forgery.
+
 ## triggering aggregation
 
 You will need to run ikiwiki periodically from a cron job, passing it the
diff -Nru ikiwiki-3.20141016.4/doc/plugins/blogspam.mdwn ikiwiki-3.20141016.4+deb8u1/doc/plugins/blogspam.mdwn
--- ikiwiki-3.20141016.4/doc/plugins/blogspam.mdwn	2017-01-12 05:18:52.000000000 +1100
+++ ikiwiki-3.20141016.4+deb8u1/doc/plugins/blogspam.mdwn	2019-03-07 17:35:55.000000000 +1100
@@ -11,6 +11,8 @@
 go to your Preferences page, and click the "Comment Moderation" button.
 
 The plugin requires the [[!cpan JSON]] perl module.
+The [[!cpan LWPx::ParanoidAgent]] Perl module is recommended,
+although this plugin can also fall back to [[!cpan LWP]].
 
 You can control how content is tested via the `blogspam_options` setting.
 The list of options is [here](http://blogspam.net/api/testComment.html#options).
diff -Nru ikiwiki-3.20141016.4/doc/plugins/openid.mdwn ikiwiki-3.20141016.4+deb8u1/doc/plugins/openid.mdwn
--- ikiwiki-3.20141016.4/doc/plugins/openid.mdwn	2017-01-12 05:18:52.000000000 +1100
+++ ikiwiki-3.20141016.4+deb8u1/doc/plugins/openid.mdwn	2019-03-07 17:35:55.000000000 +1100
@@ -7,8 +7,11 @@
 The plugin needs the [[!cpan Net::OpenID::Consumer]] perl module.
 Version 1.x is needed in order for OpenID v2 to work.
 
-The [[!cpan LWPx::ParanoidAgent]] perl module is used if available, for
-added security. Finally, the [[!cpan Crypt::SSLeay]] perl module is needed
+The [[!cpan LWPx::ParanoidAgent]] Perl module is strongly recommended.
+The [[!cpan LWP]] module can also be used, but is susceptible to
+server-side request forgery.
+
+The [[!cpan Crypt::SSLeay]] Perl module is needed
 to support users entering "https" OpenID urls.
 
 This plugin is enabled by default, but can be turned off if you want to
diff -Nru ikiwiki-3.20141016.4/doc/plugins/pinger.mdwn ikiwiki-3.20141016.4+deb8u1/doc/plugins/pinger.mdwn
--- ikiwiki-3.20141016.4/doc/plugins/pinger.mdwn	2017-01-12 05:18:52.000000000 +1100
+++ ikiwiki-3.20141016.4+deb8u1/doc/plugins/pinger.mdwn	2019-03-07 17:35:55.000000000 +1100
@@ -10,9 +10,11 @@
 To configure what URLs to ping, use the [[ikiwiki/directive/ping]]
 [[ikiwiki/directive]].
 
-The [[!cpan LWP]] perl module is used for pinging. Or the [[!cpan
-LWPx::ParanoidAgent]] perl module is used if available, for added security.
-Finally, the [[!cpan Crypt::SSLeay]] perl module is needed to support pinging
+The [[!cpan LWPx::ParanoidAgent]] Perl module is strongly recommended.
+The [[!cpan LWP]] module can also be used, but is susceptible
+to server-side request forgery.
+
+The [[!cpan Crypt::SSLeay]] perl module is needed to support pinging
 "https" urls.
 
 By default the pinger will try to ping a site for 15 seconds before timing
diff -Nru ikiwiki-3.20141016.4/doc/security.mdwn ikiwiki-3.20141016.4+deb8u1/doc/security.mdwn
--- ikiwiki-3.20141016.4/doc/security.mdwn	2017-01-12 05:18:52.000000000 +1100
+++ ikiwiki-3.20141016.4+deb8u1/doc/security.mdwn	2019-03-07 17:35:55.000000000 +1100
@@ -526,3 +526,52 @@
 able to attach images. Upgrading ImageMagick to a version where
 CVE-2016-3714 has been fixed is also recommended, but at the time of
 writing no such version is available.
+
+## Server-side request forgery via aggregate plugin
+
+The ikiwiki maintainers discovered that the [[plugins/aggregate]] plugin
+did not use [[!cpan LWPx::ParanoidAgent]]. On sites where the
+aggregate plugin is enabled, authorized wiki editors could tell ikiwiki
+to fetch potentially undesired URIs even if LWPx::ParanoidAgent was
+installed:
+
+* local files via `file:` URIs
+* other URI schemes that might be misused by attackers, such as `gopher:`
+* hosts that resolve to loopback IP addresses (127.x.x.x)
+* hosts that resolve to RFC 1918 IP addresses (192.168.x.x etc.)
+
+This could be used by an attacker to publish information that should not have
+been accessible, cause denial of service by requesting "tarpit" URIs that are
+slow to respond, or cause undesired side-effects if local web servers implement
+["unsafe"](https://tools.ietf.org/html/rfc7231#section-4.2.1) GET requests.
+([[!debcve CVE-2019-9187]])
+
+Additionally, if the LWPx::ParanoidAgent module was not installed, the
+[[plugins/blogspam]], [[plugins/openid]] and [[plugins/pinger]] plugins
+would fall back to [[!cpan LWP]], which is susceptible to similar attacks.
+This is unlikely to be a practical problem for the blogspam plugin because
+the URL it requests is under the control of the wiki administrator, but
+the openid plugin can request URLs controlled by unauthenticated remote
+users, and the pinger plugin can request URLs controlled by authorized
+wiki editors.
+
+This is addressed in ikiwiki 3.20190228 as follows, with the same fixes
+backported to Debian 9 in version 3.20170111.1:
+
+* URI schemes other than `http:` and `https:` are not accepted, preventing
+  access to `file:`, `gopher:`, etc.
+
+* If a proxy is [[configured in the ikiwiki setup file|tips/using_a_proxy]],
+  it is used for all outgoing `http:` and `https:` requests. In this case
+  the proxy is responsible for blocking any requests that are undesired,
+  including loopback or RFC 1918 addresses.
+
+* If a proxy is not configured, and LWPx::ParanoidAgent is installed,
+  it will be used. This prevents loopback and RFC 1918 IP addresses, and
+  sets a timeout to avoid denial of service via "tarpit" URIs.
+
+* Otherwise, the ordinary LWP user-agent will be used. This allows requests
+  to loopback and RFC 1918 IP addresses, and has less robust timeout
+  behaviour. We are not treating this as a vulnerability: if this
+  behaviour is not acceptable for your site, please make sure to install
+  LWPx::ParanoidAgent or disable the affected plugins.
diff -Nru ikiwiki-3.20141016.4/doc/tips/using_a_proxy.mdwn ikiwiki-3.20141016.4+deb8u1/doc/tips/using_a_proxy.mdwn
--- ikiwiki-3.20141016.4/doc/tips/using_a_proxy.mdwn	1970-01-01 10:00:00.000000000 +1000
+++ ikiwiki-3.20141016.4+deb8u1/doc/tips/using_a_proxy.mdwn	2019-03-07 17:35:55.000000000 +1100
@@ -0,0 +1,22 @@
+Some ikiwiki plugins make outgoing HTTP requests from the web server:
+
+* [[plugins/aggregate]] (to download Atom and RSS feeds)
+* [[plugins/blogspam]] (to check whether a comment or edit is spam)
+* [[plugins/openid]] (to authenticate users)
+* [[plugins/pinger]] (to ping other ikiwiki installations)
+
+If your ikiwiki installation cannot contact the Internet without going
+through a proxy, you can configure this in the [[setup file|setup]] by
+setting environment variables:
+
+    ENV:
+        http_proxy: "http://proxy.example.com:8080"
+        https_proxy: "http://proxy.example.com:8080"
+        # optional
+        no_proxy: ".example.com,www.example.org"
+
+Note that some plugins will use the configured proxy for all destinations,
+even if they are listed in `no_proxy`.
+
+To avoid server-side request forgery attacks, ensure that your proxy does
+not allow requests to addresses that are considered to be internal.
diff -Nru ikiwiki-3.20141016.4/IkiWiki/Plugin/aggregate.pm ikiwiki-3.20141016.4+deb8u1/IkiWiki/Plugin/aggregate.pm
--- ikiwiki-3.20141016.4/IkiWiki/Plugin/aggregate.pm	2017-01-12 05:18:52.000000000 +1100
+++ ikiwiki-3.20141016.4+deb8u1/IkiWiki/Plugin/aggregate.pm	2019-03-07 17:35:55.000000000 +1100
@@ -513,7 +513,10 @@
 	}
 	$feed->{feedurl}=pop @urls;
 	}
-	my $ua=useragent();
+	# Using the for_url parameter makes sure we crash if used
+	# with an older IkiWiki.pm that didn't automatically try
+	# to use LWPx::ParanoidAgent.
+	my $ua=useragent(for_url => $feed->{feedurl});
 	my $res=URI::Fetch->fetch($feed->{feedurl}, UserAgent=>$ua);
 	if (! $res) {
 		$feed->{message}=URI::Fetch->errstr;
diff -Nru ikiwiki-3.20141016.4/IkiWiki/Plugin/blogspam.pm ikiwiki-3.20141016.4+deb8u1/IkiWiki/Plugin/blogspam.pm
--- ikiwiki-3.20141016.4/IkiWiki/Plugin/blogspam.pm	2017-01-12 05:18:52.000000000 +1100
+++ ikiwiki-3.20141016.4+deb8u1/IkiWiki/Plugin/blogspam.pm	2019-03-07 17:35:55.000000000 +1100
@@ -57,18 +57,10 @@
 	};
 	error $@ if $@;
 
-	eval q{use LWPx::ParanoidAgent};
-	if (!$@) {
-		$client=LWPx::ParanoidAgent->new(agent => $config{useragent});
-	}
-	else {
-		eval q{use LWP};
-		if ($@) {
-			error $@;
-			return;
-		}
-		$client=useragent();
-	}
+	# Using the for_url parameter makes sure we crash if used
+	# with an older IkiWiki.pm that didn't automatically try
+	# to use LWPx::ParanoidAgent.
+	$client=useragent(for_url => $config{blogspam_server});
 }
 
 sub checkcontent (@) {
diff -Nru ikiwiki-3.20141016.4/IkiWiki/Plugin/openid.pm ikiwiki-3.20141016.4+deb8u1/IkiWiki/Plugin/openid.pm
--- ikiwiki-3.20141016.4/IkiWiki/Plugin/openid.pm	2017-01-12 05:18:52.000000000 +1100
+++ ikiwiki-3.20141016.4+deb8u1/IkiWiki/Plugin/openid.pm	2019-03-07 17:35:55.000000000 +1100
@@ -237,14 +237,10 @@
 	eval q{use Net::OpenID::Consumer};
 	error($@) if $@;
 
-	my $ua;
-	eval q{use LWPx::ParanoidAgent};
-	if (! $@) {
-		$ua=LWPx::ParanoidAgent->new(agent => $config{useragent});
-	}
-	else {
-		$ua=useragent();
-	}
+	# We pass the for_url parameter, even though it's undef, because
+	# that will make sure we crash if used with an older IkiWiki.pm
+	# that didn't automatically try to use LWPx::ParanoidAgent.
+	my $ua=useragent(for_url => undef);
 
 	# Store the secret in the session.
 	my $secret=$session->param("openid_secret");
diff -Nru ikiwiki-3.20141016.4/IkiWiki/Plugin/pinger.pm ikiwiki-3.20141016.4+deb8u1/IkiWiki/Plugin/pinger.pm
--- ikiwiki-3.20141016.4/IkiWiki/Plugin/pinger.pm	2017-01-12 05:18:52.000000000 +1100
+++ ikiwiki-3.20141016.4+deb8u1/IkiWiki/Plugin/pinger.pm	2019-03-07 17:35:55.000000000 +1100
@@ -70,17 +70,16 @@
 
 	eval q{use Net::INET6Glue::INET_is_INET6}; # may not be available
 
 	my $ua;
-	eval q{use LWPx::ParanoidAgent};
-	if (!$@) {
-		$ua=LWPx::ParanoidAgent->new(agent => $config{useragent});
-	}
-	else {
-		eval q{use LWP};
-		if ($@) {
-			debug(gettext("LWP not found, not pinging"));
-			return;
-		}
-		$ua=useragent();
+	eval {
+		# We pass the for_url parameter, even though it's
+		# undef, because that will make sure we crash if used
+		# with an older IkiWiki.pm that didn't automatically
+		# try to use LWPx::ParanoidAgent.
+		$ua=useragent(for_url => undef);
+	};
+	if ($@) {
+		debug(gettext("LWP not found, not pinging").": $@");
+		return;
 	}
 	$ua->timeout($config{pinger_timeout} || 15);
diff -Nru ikiwiki-3.20141016.4/IkiWiki.pm ikiwiki-3.20141016.4+deb8u1/IkiWiki.pm
--- ikiwiki-3.20141016.4/IkiWiki.pm	2017-01-12 05:18:52.000000000 +1100
+++ ikiwiki-3.20141016.4+deb8u1/IkiWiki.pm	2019-03-07 17:35:55.000000000 +1100
@@ -2367,12 +2367,131 @@
 	$autofiles{$file}{generator}=$generator;
 }
 
-sub useragent () {
-	return LWP::UserAgent->new(
-		cookie_jar => $config{cookiejar},
-		env_proxy => 1, # respect proxy env vars
+sub useragent (@) {
+	my %params = @_;
+	my $for_url = delete $params{for_url};
+	# Fail safe, in case a plugin calling this function is relying on
+	# a future parameter to make the UA more strict
+	foreach my $key (keys %params) {
+		error "Internal error: useragent(\"$key\" => ...) not understood";
+	}
+
+	eval q{use LWP};
+	error($@) if $@;
+
+	my %args = ( agent => $config{useragent},
+		cookie_jar => $config{cookiejar},
+		env_proxy => 0,
+		protocols_allowed => [qw(http https)],
 	);
+	my %proxies;
+
+	if (defined $for_url) {
+		# We know which URL we're going to fetch, so we can choose
+		# whether it's going to go through a proxy or not.
+		#
+		# We reimplement http_proxy, https_proxy and no_proxy here, so
+		# that we are not relying on LWP implementing them exactly the
+		# same way we do.
+
+		eval q{use URI};
+		error($@) if $@;
+
+		my $proxy;
+		my $uri = URI->new($for_url);
+
+		if ($uri->scheme eq 'http') {
+			$proxy = $ENV{http_proxy};
+			# HTTP_PROXY is deliberately not implemented
+			# because the HTTP_* namespace is also used by CGI
+		}
+		elsif ($uri->scheme eq 'https') {
+			$proxy = $ENV{https_proxy};
+			$proxy = $ENV{HTTPS_PROXY} unless defined $proxy;
+		}
+		else {
+			$proxy = undef;
+		}
+
+		foreach my $var (qw(no_proxy NO_PROXY)) {
+			my $no_proxy = $ENV{$var};
+			if (defined $no_proxy) {
+				foreach my $domain (split /\s*,\s*/, $no_proxy) {
+					if ($domain =~ s/^\*?\.//) {
+						# no_proxy="*.example.com" or
+						# ".example.com": match suffix
+						# against .example.com
+						if ($uri->host =~ m/(^|\.)\Q$domain\E$/i) {
+							$proxy = undef;
+						}
+					}
+					else {
+						# no_proxy="example.com":
+						# match exactly example.com
+						if (lc $uri->host eq lc $domain) {
+							$proxy = undef;
+						}
+					}
+				}
+			}
+		}
+
+		if (defined $proxy) {
+			$proxies{$uri->scheme} = $proxy;
+			# Paranoia: make sure we can't bypass the proxy
+			$args{protocols_allowed} = [$uri->scheme];
+		}
+	}
+	else {
+		# The plugin doesn't know yet which URL(s) it's going to
+		# fetch, so we have to make some conservative assumptions.
+		my $http_proxy = $ENV{http_proxy};
+		my $https_proxy = $ENV{https_proxy};
+		$https_proxy = $ENV{HTTPS_PROXY} unless defined $https_proxy;
+
+		# We don't respect no_proxy here: if we are not using the
+		# paranoid user-agent, then we need to give the proxy the
+		# opportunity to reject undesirable requests.
+
+		# If we have one, we need the other: otherwise, neither
+		# LWPx::ParanoidAgent nor the proxy would have the
+		# opportunity to filter requests for the other protocol.
+		if (defined $https_proxy && defined $http_proxy) {
+			%proxies = (http => $http_proxy, https => $https_proxy);
+		}
+		elsif (defined $https_proxy) {
+			%proxies = (http => $https_proxy, https => $https_proxy);
+		}
+		elsif (defined $http_proxy) {
+			%proxies = (http => $http_proxy, https => $http_proxy);
+		}
+
+	}
+
+	if (scalar keys %proxies) {
+		# The configured proxy is responsible for deciding which
+		# URLs are acceptable to fetch and which URLs are not.
+		my $ua = LWP::UserAgent->new(%args);
+		foreach my $scheme (@{$ua->protocols_allowed}) {
+			unless ($proxies{$scheme}) {
+				error "internal error: $scheme is allowed but has no proxy";
+			}
+		}
+		# We can't pass the proxies in %args because that only
+		# works since LWP 6.24.
+		foreach my $scheme (keys %proxies) {
+			$ua->proxy($scheme, $proxies{$scheme});
+		}
+		return $ua;
+	}
+
+	eval q{use LWPx::ParanoidAgent};
+	if ($@) {
+		print STDERR "warning: installing LWPx::ParanoidAgent is recommended\n";
+		return LWP::UserAgent->new(%args);
+	}
+	return LWPx::ParanoidAgent->new(%args);
 }
 
 sub sortspec_translate ($$) {
diff -Nru ikiwiki-3.20141016.4/t/aggregate-file.t ikiwiki-3.20141016.4+deb8u1/t/aggregate-file.t
--- ikiwiki-3.20141016.4/t/aggregate-file.t	1970-01-01 10:00:00.000000000 +1000
+++ ikiwiki-3.20141016.4+deb8u1/t/aggregate-file.t	2019-03-07 17:35:55.000000000 +1100
@@ -0,0 +1,173 @@
+#!/usr/bin/perl
+use utf8;
+use warnings;
+use strict;
+
+use Encode;
+use Test::More;
+
+BEGIN {
+	plan(skip_all => "CGI not available")
+		unless eval q{
+			use CGI qw();
+			1;
+		};
+
+	plan(skip_all => "IPC::Run not available")
+		unless eval q{
+			use IPC::Run qw(run);
+			1;
+		};
+
+	use_ok('IkiWiki');
+	use_ok('YAML::XS');
+}
+
+# We check for English error messages
+$ENV{LC_ALL} = 'C';
+
+use Cwd qw(getcwd);
+use Errno qw(ENOENT);
+
+my $installed = $ENV{INSTALLED_TESTS};
+
+my @command;
+if ($installed) {
+	@command = qw(ikiwiki --plugin inline);
+}
+else {
+	ok(! system("make -s ikiwiki.out"));
+	@command = ("perl", "-I".getcwd."/blib/lib", './ikiwiki.out',
+		'--underlaydir='.getcwd.'/underlays/basewiki',
+		'--set', 'underlaydirbase='.getcwd.'/underlays',
+		'--templatedir='.getcwd.'/templates');
+}
+
+sub write_old_file {
+	my $name = shift;
+	my $dir = shift;
+	my $content = shift;
+	writefile($name, $dir, $content);
+	ok(utime(333333333, 333333333, "$dir/$name"));
+}
+
+sub write_setup_file {
+	my %params = @_;
+	my %setup = (
+		wikiname => 'this is the name of my wiki',
+		srcdir => getcwd.'/t/tmp/in',
+		destdir => getcwd.'/t/tmp/out',
+		url => 'http://example.com',
+		cgiurl => 'http://example.com/cgi-bin/ikiwiki.cgi',
+		cgi_wrapper => getcwd.'/t/tmp/ikiwiki.cgi',
+		cgi_wrappermode => '0751',
+		add_plugins => [qw(aggregate)],
+		disable_plugins => [qw(emailauth openid passwordauth)],
+		aggregate_webtrigger => 1,
+	);
+	if ($params{without_paranoia}) {
+		$setup{libdirs} = [getcwd.'/t/noparanoia'];
+	}
+	unless ($installed) {
+		$setup{ENV} = { 'PERL5LIB' => getcwd.'/blib/lib' };
+	}
+	writefile("test.setup", "t/tmp",
+		"# IkiWiki::Setup::Yaml - YAML formatted setup file\n" .
+		Dump(\%setup));
+}
+
+sub thoroughly_rebuild {
+	ok(unlink("t/tmp/ikiwiki.cgi") || $!{ENOENT});
+	ok(! system(@command, qw(--setup t/tmp/test.setup --rebuild --wrappers)));
+}
+
+sub run_cgi {
+	my (%args) = @_;
+	my ($in, $out);
+	my $method = $args{method} || 'GET';
+	my $environ = $args{environ} || {};
+	my $params = $args{params} || { do => 'prefs' };
+
+	my %defaults = (
+		SCRIPT_NAME => '/cgi-bin/ikiwiki.cgi',
+		HTTP_HOST => 'example.com',
+	);
+
+	my $cgi = CGI->new($args{params});
+	my $query_string = $cgi->query_string();
+	diag $query_string;
+
+	if ($method eq 'POST') {
+		$defaults{REQUEST_METHOD} = 'POST';
+		$in = $query_string;
+		$defaults{CONTENT_LENGTH} = length $in;
+	} else {
+		$defaults{REQUEST_METHOD} = 'GET';
+		$defaults{QUERY_STRING} = $query_string;
+	}
+
+	my %envvars = (
+		%defaults,
+		%$environ,
+	);
+	run(["./t/tmp/ikiwiki.cgi"], \$in, \$out, init => sub {
+		map {
+			$ENV{$_} = $envvars{$_}
+		} keys(%envvars);
+	});
+
+	return decode_utf8($out);
+}
+
+sub test {
+	my $content;
+
+	ok(! system(qw(rm -rf t/tmp)));
+	ok(! system(qw(mkdir t/tmp)));
+
+	write_old_file('aggregator.mdwn', 't/tmp/in',
+		'[[!aggregate name="ssrf" url="file://'.getcwd.'/t/secret.rss"]]'
+		.'[[!inline pages="internal(aggregator/*)"]]');
+
+	write_setup_file();
+	thoroughly_rebuild();
+
+	$content = run_cgi(
+		method => 'GET',
+		params => {
+			do => 'aggregate_webtrigger',
+		},
+	);
+	unlike($content, qr{creating new page});
+	unlike($content, qr{Secrets});
+	ok(! -e 't/tmp/in/.ikiwiki/transient/aggregator/ssrf');
+	ok(! -e 't/tmp/in/.ikiwiki/transient/aggregator/ssrf/Secrets_go_here._aggregated');
+
+	thoroughly_rebuild();
+	$content = readfile('t/tmp/out/aggregator/index.html');
+	unlike($content, qr{Secrets});
+
+	diag('Trying test again with LWPx::ParanoidAgent disabled');
+
+	write_setup_file(without_paranoia => 1);
+	thoroughly_rebuild();
+
+	$content = run_cgi(
+		method => 'GET',
+		params => {
+			do => 'aggregate_webtrigger',
+		},
+	);
+	unlike($content, qr{creating new page});
+	unlike($content, qr{Secrets});
+	ok(! -e 't/tmp/in/.ikiwiki/transient/aggregator/ssrf');
+	ok(! -e 't/tmp/in/.ikiwiki/transient/aggregator/ssrf/Secrets_go_here._aggregated');
+
+	thoroughly_rebuild();
+	$content = readfile('t/tmp/out/aggregator/index.html');
+	unlike($content, qr{Secrets});
+}
+
+test();
+
+done_testing();
diff -Nru ikiwiki-3.20141016.4/t/noparanoia/LWPx/ParanoidAgent.pm ikiwiki-3.20141016.4+deb8u1/t/noparanoia/LWPx/ParanoidAgent.pm
--- ikiwiki-3.20141016.4/t/noparanoia/LWPx/ParanoidAgent.pm	1970-01-01 10:00:00.000000000 +1000
+++ ikiwiki-3.20141016.4+deb8u1/t/noparanoia/LWPx/ParanoidAgent.pm	2019-03-07 17:35:55.000000000 +1100
@@ -0,0 +1,2 @@
+# make import fail
+0;
diff -Nru ikiwiki-3.20141016.4/t/secret.rss ikiwiki-3.20141016.4+deb8u1/t/secret.rss
--- ikiwiki-3.20141016.4/t/secret.rss	1970-01-01 10:00:00.000000000 +1000
+++ ikiwiki-3.20141016.4+deb8u1/t/secret.rss	2019-03-07 17:35:55.000000000 +1100
@@ -0,0 +1,11 @@
+<?xml version="1.0"?>
+<rss version="2.0">
+<channel>
+<title>Secrets go here</title>
+<description>Secrets go here</description>
+<item>
+	<title>Secrets go here</title>
+	<description>Secrets go here</description>
+</item>
+</channel>
+</rss>
diff -Nru ikiwiki-3.20141016.4/t/useragent.t ikiwiki-3.20141016.4+deb8u1/t/useragent.t
--- ikiwiki-3.20141016.4/t/useragent.t	1970-01-01 10:00:00.000000000 +1000
+++ ikiwiki-3.20141016.4+deb8u1/t/useragent.t	2019-03-07 17:35:55.000000000 +1100
@@ -0,0 +1,317 @@
+#!/usr/bin/perl
+use warnings;
+use strict;
+use Test::More;
+
+my $have_paranoid_agent;
+BEGIN {
+	plan(skip_all => 'LWP not available')
+		unless eval q{
+			use LWP qw(); 1;
+		};
+	use_ok("IkiWiki");
+	$have_paranoid_agent = eval q{
+		use LWPx::ParanoidAgent qw(); 1;
+	};
+}
+
+eval { useragent(future_feature => 1); };
+ok($@, 'future features should cause useragent to fail');
+
+diag "==== No proxy ====";
+delete $ENV{http_proxy};
+delete $ENV{https_proxy};
+delete $ENV{no_proxy};
+delete $ENV{HTTPS_PROXY};
+delete $ENV{NO_PROXY};
+
+diag "---- Unspecified URL ----";
+my $ua = useragent(for_url => undef);
+SKIP: {
+	skip 'paranoid agent not available', 1 unless $have_paranoid_agent;
+	ok($ua->isa('LWPx::ParanoidAgent'), 'uses ParanoidAgent if possible');
+}
+is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
+is($ua->proxy('http'), undef, 'No http proxy');
+is($ua->proxy('https'), undef, 'No https proxy');
+
+diag "---- Specified URL ----";
+$ua = useragent(for_url => 'http://example.com');
+SKIP: {
+	skip 'paranoid agent not available', 1 unless $have_paranoid_agent;
+	ok($ua->isa('LWPx::ParanoidAgent'), 'uses ParanoidAgent if possible');
+}
+is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
+is($ua->proxy('http'), undef, 'No http proxy');
+is($ua->proxy('https'), undef, 'No https proxy');
+
+diag "==== Proxy for everything ====";
+$ENV{http_proxy} = 'http://proxy:8080';
+$ENV{https_proxy} = 'http://sproxy:8080';
+delete $ENV{no_proxy};
+delete $ENV{HTTPS_PROXY};
+delete $ENV{NO_PROXY};
+
+diag "---- Unspecified URL ----";
+$ua = useragent(for_url => undef);
+ok(! $ua->isa('LWPx::ParanoidAgent'), 'should use proxy instead of ParanoidAgent');
+is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
+is($ua->proxy('http'), 'http://proxy:8080', 'should use proxy');
+is($ua->proxy('https'), 'http://sproxy:8080', 'should use CONNECT proxy');
+$ua = useragent(for_url => 'http://example.com');
+ok(! $ua->isa('LWPx::ParanoidAgent'), 'should use proxy instead of ParanoidAgent');
+is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http)]);
+is($ua->proxy('http'), 'http://proxy:8080', 'should use proxy');
+# We don't care what $ua->proxy('https') is, because it won't be used
+$ua = useragent(for_url => 'https://example.com');
+ok(! $ua->isa('LWPx::ParanoidAgent'), 'should use proxy instead of ParanoidAgent');
+is_deeply([sort @{$ua->protocols_allowed}], [sort qw(https)]);
+# We don't care what $ua->proxy('http') is, because it won't be used
+is($ua->proxy('https'), 'http://sproxy:8080', 'should use CONNECT proxy');
+
+diag "==== Selective proxy ====";
+$ENV{http_proxy} = 'http://proxy:8080';
+$ENV{https_proxy} = 'http://sproxy:8080';
+$ENV{no_proxy} = '*.example.net,example.com,.example.org';
+delete $ENV{HTTPS_PROXY};
+delete $ENV{NO_PROXY};
+
+diag "---- Unspecified URL ----";
+$ua = useragent(for_url => undef);
+ok(! $ua->isa('LWPx::ParanoidAgent'), 'should use proxy instead of ParanoidAgent');
+is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
+is($ua->proxy('http'), 'http://proxy:8080', 'should use proxy');
+is($ua->proxy('https'), 'http://sproxy:8080', 'should use CONNECT proxy');
+
+diag "---- Exact match for no_proxy ----";
+$ua = useragent(for_url => 'http://example.com');
+SKIP: {
+	skip 'paranoid agent not available', 1 unless $have_paranoid_agent;
+	ok($ua->isa('LWPx::ParanoidAgent'), 'uses ParanoidAgent if possible');
+}
+is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
+is($ua->proxy('http'), undef);
+is($ua->proxy('https'), undef);
+
+diag "---- Subdomain of exact domain in no_proxy ----";
+$ua = useragent(for_url => 'http://sub.example.com');
+ok(! $ua->isa('LWPx::ParanoidAgent'), 'should use proxy instead of ParanoidAgent');
+is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http)]);
+is($ua->proxy('http'), 'http://proxy:8080', 'should use proxy');
+
+diag "---- example.net matches *.example.net ----";
+$ua = useragent(for_url => 'https://example.net');
+SKIP: {
+	skip 'paranoid agent not available', 1 unless $have_paranoid_agent;
+	ok($ua->isa('LWPx::ParanoidAgent'), 'uses ParanoidAgent if possible');
+}
+is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
+is($ua->proxy('http'), undef);
+is($ua->proxy('https'), undef);
+
+diag "---- sub.example.net matches *.example.net ----";
+$ua = useragent(for_url => 'https://sub.example.net');
+SKIP: {
+	skip 'paranoid agent not available', 1 unless $have_paranoid_agent;
+	ok($ua->isa('LWPx::ParanoidAgent'), 'uses ParanoidAgent if possible');
+}
+is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
+is($ua->proxy('http'), undef);
+is($ua->proxy('https'), undef);
+
+diag "---- badexample.net does not match *.example.net ----";
+$ua = useragent(for_url => 'https://badexample.net');
+ok(! $ua->isa('LWPx::ParanoidAgent'), 'should use proxy instead of ParanoidAgent');
+is_deeply([sort @{$ua->protocols_allowed}], [sort qw(https)]);
+is($ua->proxy('https'), 'http://sproxy:8080', 'should use proxy');
+
+diag "---- example.org matches .example.org ----";
+$ua = useragent(for_url => 'https://example.org');
+SKIP: {
+	skip 'paranoid agent not available', 1 unless $have_paranoid_agent;
+	ok($ua->isa('LWPx::ParanoidAgent'), 'uses ParanoidAgent if possible');
+}
+is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
+is($ua->proxy('http'), undef);
+is($ua->proxy('https'), undef);
+
+diag "---- sub.example.org matches .example.org ----";
+$ua = useragent(for_url => 'https://sub.example.org');
+SKIP: {
+	skip 'paranoid agent not available', 1 unless $have_paranoid_agent;
+	ok($ua->isa('LWPx::ParanoidAgent'), 'uses ParanoidAgent if possible');
+}
+is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
+is($ua->proxy('http'), undef);
+is($ua->proxy('https'), undef);
+
+diag "---- badexample.org does not match .example.org ----";
+$ua = useragent(for_url => 'https://badexample.org');
+ok(! $ua->isa('LWPx::ParanoidAgent'), 'should use proxy instead of ParanoidAgent');
+is_deeply([sort @{$ua->protocols_allowed}], [sort qw(https)]);
+is($ua->proxy('https'), 'http://sproxy:8080', 'should use proxy');
+
+diag "==== Selective proxy (alternate variables) ====";
+$ENV{http_proxy} = 'http://proxy:8080';
+delete $ENV{https_proxy};
+$ENV{HTTPS_PROXY} = 'http://sproxy:8080';
+delete $ENV{no_proxy};
+$ENV{NO_PROXY} = '*.example.net,example.com,.example.org';
+
+diag "---- Unspecified URL ----";
+$ua = useragent(for_url => undef);
+ok(! $ua->isa('LWPx::ParanoidAgent'), 'should use proxy instead of ParanoidAgent');
+is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
+is($ua->proxy('http'), 'http://proxy:8080', 'should use proxy');
+is($ua->proxy('https'), 'http://sproxy:8080', 'should use CONNECT proxy');
+
+diag "---- Exact match for no_proxy ----";
+$ua = useragent(for_url => 'http://example.com');
+SKIP: {
+	skip 'paranoid agent not available', 1 unless $have_paranoid_agent;
+	ok($ua->isa('LWPx::ParanoidAgent'), 'uses ParanoidAgent if possible');
+}
+is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
+is($ua->proxy('http'), undef);
+is($ua->proxy('https'), undef);
+
+diag "---- Subdomain of exact domain in no_proxy ----";
+$ua = useragent(for_url => 'http://sub.example.com');
+ok(! $ua->isa('LWPx::ParanoidAgent'), 'should use proxy instead of ParanoidAgent');
+is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http)]);
+is($ua->proxy('http'), 'http://proxy:8080', 'should use proxy');
+
+diag "---- example.net matches *.example.net ----";
+$ua = useragent(for_url => 'https://example.net');
+SKIP: {
+	skip 'paranoid agent not available', 1 unless $have_paranoid_agent;
+	ok($ua->isa('LWPx::ParanoidAgent'), 'uses ParanoidAgent if possible');
+}
+is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
+is($ua->proxy('http'), undef);
+is($ua->proxy('https'), undef);
+
+diag "---- sub.example.net matches *.example.net ----";
+$ua = useragent(for_url => 'https://sub.example.net');
+SKIP: {
+	skip 'paranoid agent not available', 1 unless $have_paranoid_agent;
+	ok($ua->isa('LWPx::ParanoidAgent'), 'uses ParanoidAgent if possible');
+}
+is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
+is($ua->proxy('http'), undef);
+is($ua->proxy('https'), undef);
+
+diag "---- badexample.net does not match *.example.net ----";
+$ua = useragent(for_url => 'https://badexample.net');
+ok(! $ua->isa('LWPx::ParanoidAgent'), 'should use proxy instead of ParanoidAgent');
+is_deeply([sort @{$ua->protocols_allowed}], [sort qw(https)]);
+is($ua->proxy('https'), 'http://sproxy:8080', 'should use proxy');
+
+diag "---- example.org matches .example.org ----";
+$ua = useragent(for_url => 'https://example.org');
+SKIP: {
+	skip 'paranoid agent not available', 1 unless $have_paranoid_agent;
+	ok($ua->isa('LWPx::ParanoidAgent'), 'uses ParanoidAgent if possible');
+}
+is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
+is($ua->proxy('http'), undef);
+is($ua->proxy('https'), undef);
+
+diag "---- sub.example.org matches .example.org ----";
+$ua = useragent(for_url => 'https://sub.example.org');
+SKIP: {
+	skip 'paranoid agent not available', 1 unless $have_paranoid_agent;
+	ok($ua->isa('LWPx::ParanoidAgent'), 'uses ParanoidAgent if possible');
+}
+is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
+is($ua->proxy('http'), undef);
+is($ua->proxy('https'), undef);
+
+diag "---- badexample.org does not match .example.org ----";
+$ua = useragent(for_url => 'https://badexample.org');
+ok(! $ua->isa('LWPx::ParanoidAgent'), 'should use proxy instead of ParanoidAgent');
+is_deeply([sort @{$ua->protocols_allowed}], [sort qw(https)]);
+is($ua->proxy('https'), 'http://sproxy:8080', 'should use proxy');
+
+diag "==== Selective proxy (many variables) ====";
+$ENV{http_proxy} = 'http://proxy:8080';
+$ENV{https_proxy} = 'http://sproxy:8080';
+# This one should be ignored in favour of https_proxy
+$ENV{HTTPS_PROXY} = 'http://not.preferred.proxy:3128';
+# These two should be merged
+$ENV{no_proxy} = '*.example.net,example.com';
+$ENV{NO_PROXY} = '.example.org';
+
+diag "---- Unspecified URL ----";
+$ua = useragent(for_url => undef);
+ok(! $ua->isa('LWPx::ParanoidAgent'), 'should use proxy instead of ParanoidAgent');
+is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
+is($ua->proxy('http'), 'http://proxy:8080', 'should use proxy');
+is($ua->proxy('https'), 'http://sproxy:8080', 'should use CONNECT proxy');
+
+diag "---- Exact match for no_proxy ----";
+$ua = useragent(for_url => 'http://example.com');
+SKIP: {
+	skip 'paranoid agent not available', 1 unless $have_paranoid_agent;
+	ok($ua->isa('LWPx::ParanoidAgent'), 'uses ParanoidAgent if possible');
+}
+is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
+is($ua->proxy('http'), undef);
+is($ua->proxy('https'), undef);
+
+diag "---- Subdomain of exact domain in no_proxy ----";
+$ua = useragent(for_url => 'http://sub.example.com');
+ok(! $ua->isa('LWPx::ParanoidAgent'), 'should use proxy instead of ParanoidAgent');
+is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http)]);
+is($ua->proxy('http'), 'http://proxy:8080', 'should use proxy');
+
+diag "---- example.net matches *.example.net ----";
+$ua = useragent(for_url => 'https://example.net');
+SKIP: {
+	skip 'paranoid agent not available', 1 unless $have_paranoid_agent;
+	ok($ua->isa('LWPx::ParanoidAgent'), 'uses ParanoidAgent if possible');
+}
+is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
+is($ua->proxy('http'), undef);
+is($ua->proxy('https'), undef);
+
+diag "---- sub.example.net matches *.example.net ----";
+$ua = useragent(for_url => 'https://sub.example.net');
+SKIP: {
+	skip 'paranoid agent not available', 1 unless $have_paranoid_agent;
+	ok($ua->isa('LWPx::ParanoidAgent'), 'uses ParanoidAgent if possible');
+}
+is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
+is($ua->proxy('http'), undef);
+is($ua->proxy('https'), undef);
+
+diag "---- badexample.net does not match *.example.net ----";
+$ua = useragent(for_url => 'https://badexample.net');
+ok(! $ua->isa('LWPx::ParanoidAgent'), 'should use proxy instead of ParanoidAgent');
+is_deeply([sort @{$ua->protocols_allowed}], [sort qw(https)]);
+is($ua->proxy('https'), 'http://sproxy:8080', 'should use proxy');
+
+diag "==== One but not the other ====\n";
+$ENV{http_proxy} = 'http://proxy:8080';
+delete $ENV{https_proxy};
+delete $ENV{HTTPS_PROXY};
+delete $ENV{no_proxy};
+delete $ENV{NO_PROXY};
+$ua = useragent(for_url => undef);
+ok(! $ua->isa('LWPx::ParanoidAgent'), 'should use proxy instead of ParanoidAgent');
+is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
+is($ua->proxy('http'), 'http://proxy:8080', 'should use proxy');
+is($ua->proxy('https'), 'http://proxy:8080', 'should use proxy');
+
+delete $ENV{http_proxy};
+$ENV{https_proxy} = 'http://sproxy:8080';
+delete $ENV{HTTPS_PROXY};
+delete $ENV{no_proxy};
+delete $ENV{NO_PROXY};
+$ua = useragent(for_url => undef);
+ok(! $ua->isa('LWPx::ParanoidAgent'), 'should use proxy instead of ParanoidAgent');
+is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
+is($ua->proxy('http'), 'http://sproxy:8080', 'should use proxy');
+is($ua->proxy('https'), 'http://sproxy:8080', 'should use proxy');
+
+done_testing;
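
For anyone reviewing the `no_proxy` handling that the patched `useragent()` reimplements, here is a small sketch of the same matching rules in Python (a hypothetical re-implementation for illustration only, not part of the patch): entries such as `*.example.com` or `.example.com` match the domain itself and any subdomain, while a bare `example.com` matches only that exact host.

```python
import re

def proxy_bypassed(host, no_proxy):
    """Mirror the patched useragent() no_proxy rules:
    "*.example.com" / ".example.com" match example.com and any
    subdomain; a bare "example.com" matches only exactly."""
    for domain in re.split(r'\s*,\s*', no_proxy):
        stripped = re.sub(r'^\*?\.', '', domain)
        if stripped != domain:
            # leading "*." or "." was removed: suffix match, so the
            # host is either the domain itself or a subdomain of it
            if re.search(r'(^|\.)' + re.escape(stripped) + r'$', host, re.I):
                return True
        else:
            # no wildcard: exact, case-insensitive match only
            if host.lower() == domain.lower():
                return True
    return False
```

This matches the test expectations in t/useragent.t: with `no_proxy` set to `*.example.net,example.com,.example.org`, both `example.net` and `sub.example.net` bypass the proxy, `badexample.net` does not, and `sub.example.com` is proxied even though `example.com` is not.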
