Package: release.debian.org
Severity: normal
User: [email protected]
Usertags: unblock

Please unblock package ikiwiki to fix CVE-2019-9187. It should migrate
naturally before the hard freeze in any case, but it might be worthwhile
to fast-track it.

Now that ikiwiki is a non-native package, I intend to use Debian patches
rather than new upstream releases for any subsequent updates that target
buster.

unblock ikiwiki/3.20190228-1

Thanks,
    smcv
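For context, the class of server-side request forgery fixed by CVE-2019-9187 combines two defences in the patch below: an http/https scheme allow-list (blocking `file:`, `gopher:`, etc.) and LWPx::ParanoidAgent-style rejection of hosts that resolve to loopback or RFC 1918 addresses. This is a minimal Python sketch of that policy, not ikiwiki's Perl code; the function name and structure are illustrative only.

```python
import ipaddress
import socket
from urllib.parse import urlparse

def is_safe_url(url: str) -> bool:
    """Illustrative SSRF check: allow only http/https URLs whose host
    resolves exclusively to public, non-loopback addresses."""
    parts = urlparse(url)
    # Reject file:, gopher: and any other non-HTTP scheme outright.
    if parts.scheme not in ("http", "https"):
        return False
    if parts.hostname is None:
        return False
    try:
        infos = socket.getaddrinfo(parts.hostname, None)
    except socket.gaierror:
        return False
    for info in infos:
        addr = ipaddress.ip_address(info[4][0])
        # Block loopback (127.x.x.x), RFC 1918/private and link-local
        # ranges, as LWPx::ParanoidAgent does for outgoing requests.
        if addr.is_loopback or addr.is_private or addr.is_link_local:
            return False
    return True
```

Note that when a proxy is configured, ikiwiki instead delegates this filtering to the proxy, which is why the patch only falls back to checks like these when no proxy is in use.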
diffstat for ikiwiki-3.20190207 ikiwiki-3.20190228

 CHANGELOG                          |   43 +
 IkiWiki.pm                         |  127 +++-
 IkiWiki/Plugin/aggregate.pm        |    5 
 IkiWiki/Plugin/blogspam.pm         |   16 
 IkiWiki/Plugin/openid.pm           |   12 
 IkiWiki/Plugin/pinger.pm           |   21 
 IkiWiki/Plugin/po.pm               |   50 -
 debian/changelog                   |   44 +
 doc/bugs/po:_second_or_subsequent_inline_of_translated_page_inlines_.po_file__44___not_translated_content.mdwn |    3 
 doc/news/version_3.20170622.mdwn   |   31 
 doc/news/version_3.20190207.mdwn   |   34 +
 doc/plugins/aggregate.mdwn         |    4 
 doc/plugins/blogspam.mdwn          |    2 
 doc/plugins/openid.mdwn            |    7 
 doc/plugins/pinger.mdwn            |    8 
 doc/rcs/cvs.mdwn                   |    3 
 doc/security.mdwn                  |   49 +
 doc/tips/using_a_proxy.mdwn        |   22 
 ikiwiki.spec                       |    2 
 po/ikiwiki.pot                     |   60 -
 t/aggregate-file.t                 |  173 +++++
 t/noparanoia/LWPx/ParanoidAgent.pm |    2 
 t/po.t                             |   38 -
 t/secret.rss                       |   11 
 t/useragent.t                      |  317 ++++++++++
 25 files changed, 893 insertions(+), 191 deletions(-)

diff -Nru ikiwiki-3.20190207/CHANGELOG ikiwiki-3.20190228/CHANGELOG
--- ikiwiki-3.20190207/CHANGELOG        2019-02-07 11:08:41.000000000 +0000
+++ ikiwiki-3.20190228/CHANGELOG        2019-02-26 23:01:54.000000000 +0000
@@ -1,3 +1,46 @@
+ikiwiki (3.20190228) upstream; urgency=medium
+
+  * aggregate: Use LWPx::ParanoidAgent if available.
+    Previously blogspam, openid and pinger used this module if available,
+    but aggregate did not. This prevents server-side request forgery or
+    local file disclosure, and mitigates denial of service when slow
+    "tarpit" URLs are accessed.
+    (CVE-2019-9187)
+  * blogspam, openid, pinger: Use a HTTP proxy if configured, even if
+    LWPx::ParanoidAgent is installed.
+    Previously, only aggregate would obey proxy configuration. If a proxy
+    is used, the proxy (not ikiwiki) is responsible for preventing attacks
+    like CVE-2019-9187.
+  * aggregate, blogspam, openid, pinger: Do not access non-http, non-https
+    URLs.
+    Previously, these plugins would have allowed non-HTTP-based requests if
+    LWPx::ParanoidAgent was not installed. Preventing file URIs avoids local
+    file disclosure, and preventing other rarely-used URI schemes like
+    gopher mitigates request forgery attacks.
+  * aggregate, openid, pinger: Document LWPx::ParanoidAgent as strongly
+    recommended.
+    These plugins can request attacker-controlled URLs in some site
+    configurations.
+  * blogspam: Document LWPx::ParanoidAgent as desirable.
+    This plugin doesn't request attacker-controlled URLs, so it's
+    non-critical here.
+  * blogspam, openid, pinger: Consistently use cookiejar if configured.
+    Previously, these plugins would only obey this configuration if
+    LWPx::ParanoidAgent was not installed, but this appears to have been
+    unintended.
+  * po: Always filter .po files.
+    The po plugin in previous ikiwiki releases made the second and
+    subsequent filter call per (page, destpage) pair into a no-op,
+    apparently in an attempt to prevent *recursive* filtering (which as
+    far as we can tell can't happen anyway), with the undesired effect
+    of interpreting the raw .po file as page content (e.g. Markdown)
+    if it was inlined into the same page twice, which is apparently
+    something that tails.org does. Simplify this by deleting the code
+    that prevented repeated filtering. Thanks, intrigeri
+    (Closes: #911356)
+
+ -- Simon McVittie <[email protected]>  Tue, 26 Feb 2019 21:05:49 +0000
+
 ikiwiki (3.20190207) upstream; urgency=medium
 
   [ Amitai Schleier ]
diff -Nru ikiwiki-3.20190207/debian/changelog ikiwiki-3.20190228/debian/changelog
--- ikiwiki-3.20190207/debian/changelog 2019-02-07 11:13:08.000000000 +0000
+++ ikiwiki-3.20190228/debian/changelog 2019-02-26 23:04:42.000000000 +0000
@@ -1,3 +1,47 @@
+ikiwiki (3.20190228-1) unstable; urgency=high
+
+  * New upstream release
+    - aggregate: Use LWPx::ParanoidAgent if available.
+      Previously blogspam, openid and pinger used this module if available,
+      but aggregate did not. This prevents server-side request forgery or
+      local file disclosure, and mitigates denial of service when slow
+      "tarpit" URLs are accessed.
+      (CVE-2019-9187)
+    - blogspam, openid, pinger: Use a HTTP proxy if configured, even if
+      LWPx::ParanoidAgent is installed.
+      Previously, only aggregate would obey proxy configuration. If a proxy
+      is used, the proxy (not ikiwiki) is responsible for preventing attacks
+      like CVE-2019-9187.
+    - aggregate, blogspam, openid, pinger: Do not access non-http, non-https
+      URLs.
+      Previously, these plugins would have allowed non-HTTP-based requests if
+      LWPx::ParanoidAgent was not installed. Preventing file URIs avoids local
+      file disclosure, and preventing other rarely-used URI schemes like
+      gopher mitigates request forgery attacks.
+    - aggregate, openid, pinger: Document LWPx::ParanoidAgent as strongly
+      recommended.
+      These plugins can request attacker-controlled URLs in some site
+      configurations.
+    - blogspam: Document LWPx::ParanoidAgent as desirable.
+      This plugin doesn't request attacker-controlled URLs, so it's
+      non-critical here.
+    - blogspam, openid, pinger: Consistently use cookiejar if configured.
+      Previously, these plugins would only obey this configuration if
+      LWPx::ParanoidAgent was not installed, but this appears to have been
+      unintended.
+    - po: Always filter .po files.
+      The po plugin in previous ikiwiki releases made the second and
+      subsequent filter call per (page, destpage) pair into a no-op,
+      apparently in an attempt to prevent *recursive* filtering (which as
+      far as we can tell can't happen anyway), with the undesired effect
+      of interpreting the raw .po file as page content (e.g. Markdown)
+      if it was inlined into the same page twice, which is apparently
+      something that tails.org does. Simplify this by deleting the code
+      that prevented repeated filtering. Thanks, intrigeri
+      (Closes: #911356)
+
+ -- Simon McVittie <[email protected]>  Tue, 26 Feb 2019 23:04:42 +0000
+
 ikiwiki (3.20190207-1) unstable; urgency=medium
 
   [ Simon McVittie ]
diff -Nru ikiwiki-3.20190207/doc/bugs/po:_second_or_subsequent_inline_of_translated_page_inlines_.po_file__44___not_translated_content.mdwn ikiwiki-3.20190228/doc/bugs/po:_second_or_subsequent_inline_of_translated_page_inlines_.po_file__44___not_translated_content.mdwn
--- ikiwiki-3.20190207/doc/bugs/po:_second_or_subsequent_inline_of_translated_page_inlines_.po_file__44___not_translated_content.mdwn	2019-02-07 11:08:41.000000000 +0000
+++ ikiwiki-3.20190228/doc/bugs/po:_second_or_subsequent_inline_of_translated_page_inlines_.po_file__44___not_translated_content.mdwn	2019-02-26 23:01:54.000000000 +0000
@@ -177,3 +177,6 @@
 
 If it's valid to remove the `alreadyfiltered` mechanism, my
 `wip/po-filter-every-time` branch does that. Please test?
+
+> intrigeri says [this change works as intended on tails.org](https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=911356#41),
+> so I've applied it. [[done]] --[[smcv]]
diff -Nru ikiwiki-3.20190207/doc/news/version_3.20170622.mdwn ikiwiki-3.20190228/doc/news/version_3.20170622.mdwn
--- ikiwiki-3.20190207/doc/news/version_3.20170622.mdwn	2019-02-07 11:08:41.000000000 +0000
+++ ikiwiki-3.20190228/doc/news/version_3.20170622.mdwn	1970-01-01 01:00:00.000000000 +0100
@@ -1,31 +0,0 @@
-ikiwiki 3.20170622 released with [[!toggle text="these changes"]]
-[[!toggleable text="""
-   * `t/git-cgi.t`: Wait 1 second before doing a revert that should work.
-     This hopefully fixes a race condition in which the test failed
-     around 6% of the time. (Closes: #[862494](http://bugs.debian.org/862494))
-   * Guard against set-but-empty `REMOTE_USER` CGI variable on
-     misconfigured nginx servers, and in general treat sessions with
-     a set-but-empty name as if they were not signed in.
-   * When the CGI fails, print the error to stderr, not "Died"
-   * mdwn: Don't mangle <code>&lt;style&gt;</code> into <code>&lt;elyts&gt;</code> under some circumstances
-   * mdwn: Enable footnotes by default when using the default Discount
-     implementation. A new `mdwn_footnotes` option can be used to disable
-     footnotes in MultiMarkdown and Discount.
-   * mdwn: Don't enable alphabetically labelled ordered lists by
-     default when using the default Discount implementation. A new
-     `mdwn_alpha_list` option can be used to restore the old
-     interpretation.
-   * osm: Convert savestate hook into a changes hook. savestate is not
-     the right place to write wiki content, and in particular this
-     breaks websetup if osm's dependencies are not installed, even
-     if the osm plugin is not actually enabled.
-     (Closes: #[719913](http://bugs.debian.org/719913))
-   * toc: if the heading is of the form `<h1 id="...">`, use that for
-     the link in the table of contents (but continue to generate
-     `<a name="index42"></a>` in case someone was relying on it).
-     Thanks, [[Antoine Beaupré|anarcat]]
-   * color: Do not leak markup into contexts that take only the plain
-     text, such as toc
-   * meta: Document `\[[!meta name="foo" content="bar"]]`
-   * debian: Use preferred https URL for Format of `debian/copyright`
-   * debian: Declare compliance with Debian Policy 4.0.0"""]]
diff -Nru ikiwiki-3.20190207/doc/news/version_3.20190207.mdwn ikiwiki-3.20190228/doc/news/version_3.20190207.mdwn
--- ikiwiki-3.20190207/doc/news/version_3.20190207.mdwn	1970-01-01 01:00:00.000000000 +0100
+++ ikiwiki-3.20190228/doc/news/version_3.20190207.mdwn	2019-02-26 23:01:54.000000000 +0000
@@ -0,0 +1,34 @@
+ikiwiki 3.20190207 released with [[!toggle text="these changes"]]
+[[!toggleable text="""
+ * [ Amitai Schleier ]
+   * graph: Add an optional "file" parameter
+   * emailauth: When email can't be sent, show the error message
+   * osm: Don't raise errors if tags don't have attached icons
+   * cgi: Avoid C compiler warnings for waitpid() on NetBSD
+ * [ Simon McVittie ]
+   * Hide popup template content from documentation (Closes: #[898836](http://bugs.debian.org/898836))
+   * meta: Make [[!meta date]] show an error if dates are invalid or
+     Date::Parse can't be loaded
+   * inline: Cope with non-ASCII `rootpage` parameter.
+     Thanks, Feng Shu
+   * table: Cope with non-ASCII content in CSV format tables.
+     Thanks, Feng Shu
+   * trail: Allow unescaped punctuation in `pagenames` parameter
+   * comments: Hide "add comment" link from print stylesheet.
+     Thanks, Antoine Beaupré
+   * recentchangesdiff, relativedate, toggle:
+     Import JavaScript at the end of the page content, not the beginning,
+     so that the browser can render content as soon as possible.
+     Thanks, Antoine Beaupré
+   * debian: Allow Breezy as an alternative to bzr
+     Thanks, Jelmer Vernooij
+   * inline: Add basic test coverage for [[!inline rootpage]]
+   * table: Add basic test coverage
+   * po: Add enough test coverage to reproduce Debian #911356
+   * comments: Improve test coverage
+   * tests: Exercise Unicode more
+ * [ Joey Hess ]
+   * aggregate: Fix aggregation of posts without a title.
+     Thanks, Alexandre Oliva
+   * poll: Added postlink and posttrail options for better multi-page polls.
+   * Fix permalink to comments."""]]
\ No newline at end of file
diff -Nru ikiwiki-3.20190207/doc/plugins/aggregate.mdwn ikiwiki-3.20190228/doc/plugins/aggregate.mdwn
--- ikiwiki-3.20190207/doc/plugins/aggregate.mdwn       2019-02-07 11:08:41.000000000 +0000
+++ ikiwiki-3.20190228/doc/plugins/aggregate.mdwn       2019-02-26 23:01:54.000000000 +0000
@@ -11,6 +11,10 @@
 one. Either the [[htmltidy]] or [[htmlbalance]] plugin is suggested, since
 feeds can easily contain html problems, some of which these plugins can fix.
 
+Installing the [[!cpan LWPx::ParanoidAgent]] Perl module is strongly
+recommended. The [[!cpan LWP]] module can also be used, but is susceptible
+to server-side request forgery.
+
 ## triggering aggregation
 
 You will need to run ikiwiki periodically from a cron job, passing it the
diff -Nru ikiwiki-3.20190207/doc/plugins/blogspam.mdwn ikiwiki-3.20190228/doc/plugins/blogspam.mdwn
--- ikiwiki-3.20190207/doc/plugins/blogspam.mdwn        2019-02-07 11:08:41.000000000 +0000
+++ ikiwiki-3.20190228/doc/plugins/blogspam.mdwn        2019-02-26 23:01:54.000000000 +0000
@@ -11,6 +11,8 @@
 go to your Preferences page, and click the "Comment Moderation" button.
 
 The plugin requires the [[!cpan JSON]] perl module.
+The [[!cpan LWPx::ParanoidAgent]] Perl module is recommended,
+although this plugin can also fall back to [[!cpan LWP]].
 
 You can control how content is tested via the `blogspam_options` setting.
 The list of options is [here](http://blogspam.net/api/2.0/testComment.html#options).
diff -Nru ikiwiki-3.20190207/doc/plugins/openid.mdwn ikiwiki-3.20190228/doc/plugins/openid.mdwn
--- ikiwiki-3.20190207/doc/plugins/openid.mdwn  2019-02-07 11:08:41.000000000 +0000
+++ ikiwiki-3.20190228/doc/plugins/openid.mdwn  2019-02-26 23:01:54.000000000 +0000
@@ -7,8 +7,11 @@
 The plugin needs the [[!cpan Net::OpenID::Consumer]] perl module.
 Version 1.x is needed in order for OpenID v2 to work.
 
-The [[!cpan LWPx::ParanoidAgent]] perl module is used if available, for
-added security. Finally, the [[!cpan Crypt::SSLeay]] perl module is needed
+The [[!cpan LWPx::ParanoidAgent]] Perl module is strongly recommended.
+The [[!cpan LWP]] module can also be used, but is susceptible to
+server-side request forgery.
+
+The [[!cpan Crypt::SSLeay]] Perl module is needed
 to support users entering "https" OpenID urls.
 
 This plugin is enabled by default, but can be turned off if you want to
diff -Nru ikiwiki-3.20190207/doc/plugins/pinger.mdwn ikiwiki-3.20190228/doc/plugins/pinger.mdwn
--- ikiwiki-3.20190207/doc/plugins/pinger.mdwn  2019-02-07 11:08:41.000000000 +0000
+++ ikiwiki-3.20190228/doc/plugins/pinger.mdwn  2019-02-26 23:01:54.000000000 +0000
@@ -10,9 +10,11 @@
 To configure what URLs to ping, use the [[ikiwiki/directive/ping]]
 [[ikiwiki/directive]].
 
-The [[!cpan LWP]] perl module is used for pinging. Or the [[!cpan
-LWPx::ParanoidAgent]] perl module is used if available, for added security.
-Finally, the [[!cpan Crypt::SSLeay]] perl module is needed to support pinging
+The [[!cpan LWPx::ParanoidAgent]] Perl module is strongly recommended.
+The [[!cpan LWP]] module can also be used, but is susceptible
+to server-side request forgery.
+
+The [[!cpan Crypt::SSLeay]] perl module is needed to support pinging
 "https" urls.
 
 By default the pinger will try to ping a site for 15 seconds before timing
diff -Nru ikiwiki-3.20190207/doc/rcs/cvs.mdwn ikiwiki-3.20190228/doc/rcs/cvs.mdwn
--- ikiwiki-3.20190207/doc/rcs/cvs.mdwn 2019-02-07 11:08:41.000000000 +0000
+++ ikiwiki-3.20190228/doc/rcs/cvs.mdwn 2019-02-26 23:01:54.000000000 +0000
@@ -5,7 +5,8 @@
 
 ### Usage
 7. Install [[!cpan File::chdir]], [[!cpan File::ReadBackwards]],
-   [cvsps](http://www.cobite.com/cvsps/), and
+   [cvsps](http://www.cobite.com/cvsps/)
+   (note: probably not [cvsps3](http://www.catb.org/~esr/cvsps/)), and
    [cvsweb](http://www.freebsd.org/projects/cvsweb.html) or the like.
 7. Adjust CVS-related parameters in your setup file.
 
diff -Nru ikiwiki-3.20190207/doc/security.mdwn ikiwiki-3.20190228/doc/security.mdwn
--- ikiwiki-3.20190207/doc/security.mdwn        2019-02-07 11:08:41.000000000 +0000
+++ ikiwiki-3.20190228/doc/security.mdwn        2019-02-26 23:01:54.000000000 +0000
@@ -611,3 +611,52 @@
 in version 3.20141016.4.
 
 ([[!debcve CVE-2017-0356]]/OVE-20170111-0001)
+
+## Server-side request forgery via aggregate plugin
+
+The ikiwiki maintainers discovered that the [[plugins/aggregate]] plugin
+did not use [[!cpan LWPx::ParanoidAgent]]. On sites where the
+aggregate plugin is enabled, authorized wiki editors could tell ikiwiki
+to fetch potentially undesired URIs even if LWPx::ParanoidAgent was
+installed:
+
+* local files via `file:` URIs
+* other URI schemes that might be misused by attackers, such as `gopher:`
+* hosts that resolve to loopback IP addresses (127.x.x.x)
+* hosts that resolve to RFC 1918 IP addresses (192.168.x.x etc.)
+
+This could be used by an attacker to publish information that should not have
+been accessible, cause denial of service by requesting "tarpit" URIs that are
+slow to respond, or cause undesired side-effects if local web servers implement
+["unsafe"](https://tools.ietf.org/html/rfc7231#section-4.2.1) GET requests.
+([[!debcve CVE-2019-9187]])
+
+Additionally, if the LWPx::ParanoidAgent module was not installed, the
+[[plugins/blogspam]], [[plugins/openid]] and [[plugins/pinger]] plugins
+would fall back to [[!cpan LWP]], which is susceptible to similar attacks.
+This is unlikely to be a practical problem for the blogspam plugin because
+the URL it requests is under the control of the wiki administrator, but
+the openid plugin can request URLs controlled by unauthenticated remote
+users, and the pinger plugin can request URLs controlled by authorized
+wiki editors.
+
+This is addressed in ikiwiki 3.20190228 as follows, with the same fixes
+backported to Debian 9 in version 3.20170111.1:
+
+* URI schemes other than `http:` and `https:` are not accepted, preventing
+  access to `file:`, `gopher:`, etc.
+
+* If a proxy is [[configured in the ikiwiki setup file|tips/using_a_proxy]],
+  it is used for all outgoing `http:` and `https:` requests. In this case
+  the proxy is responsible for blocking any requests that are undesired,
+  including loopback or RFC 1918 addresses.
+
+* If a proxy is not configured, and LWPx::ParanoidAgent is installed,
+  it will be used. This prevents loopback and RFC 1918 IP addresses, and
+  sets a timeout to avoid denial of service via "tarpit" URIs.
+
+* Otherwise, the ordinary LWP user-agent will be used. This allows requests
+  to loopback and RFC 1918 IP addresses, and has less robust timeout
+  behaviour. We are not treating this as a vulnerability: if this
+  behaviour is not acceptable for your site, please make sure to install
+  LWPx::ParanoidAgent or disable the affected plugins.
diff -Nru ikiwiki-3.20190207/doc/tips/using_a_proxy.mdwn ikiwiki-3.20190228/doc/tips/using_a_proxy.mdwn
--- ikiwiki-3.20190207/doc/tips/using_a_proxy.mdwn      1970-01-01 01:00:00.000000000 +0100
+++ ikiwiki-3.20190228/doc/tips/using_a_proxy.mdwn      2019-02-26 23:01:54.000000000 +0000
@@ -0,0 +1,22 @@
+Some ikiwiki plugins make outgoing HTTP requests from the web server:
+
+* [[plugins/aggregate]] (to download Atom and RSS feeds)
+* [[plugins/blogspam]] (to check whether a comment or edit is spam)
+* [[plugins/openid]] (to authenticate users)
+* [[plugins/pinger]] (to ping other ikiwiki installations)
+
+If your ikiwiki installation cannot contact the Internet without going
+through a proxy, you can configure this in the [[setup file|setup]] by
+setting environment variables:
+
+    ENV:
+        http_proxy: "http://proxy.example.com:8080"
+        https_proxy: "http://proxy.example.com:8080"
+        # optional
+        no_proxy: ".example.com,www.example.org"
+
+Note that some plugins will use the configured proxy for all destinations,
+even if they are listed in `no_proxy`.
+
+To avoid server-side request forgery attacks, ensure that your proxy does
+not allow requests to addresses that are considered to be internal.
diff -Nru ikiwiki-3.20190207/IkiWiki/Plugin/aggregate.pm ikiwiki-3.20190228/IkiWiki/Plugin/aggregate.pm
--- ikiwiki-3.20190207/IkiWiki/Plugin/aggregate.pm      2019-02-07 11:08:41.000000000 +0000
+++ ikiwiki-3.20190228/IkiWiki/Plugin/aggregate.pm      2019-02-26 23:01:54.000000000 +0000
@@ -513,7 +513,10 @@
                        }
                        $feed->{feedurl}=pop @urls;
                }
-               my $ua=useragent();
+               # Using the for_url parameter makes sure we crash if used
+               # with an older IkiWiki.pm that didn't automatically try
+               # to use LWPx::ParanoidAgent.
+               my $ua=useragent(for_url => $feed->{feedurl});
                my $res=URI::Fetch->fetch($feed->{feedurl}, UserAgent=>$ua);
                if (! $res) {
                        $feed->{message}=URI::Fetch->errstr;
diff -Nru ikiwiki-3.20190207/IkiWiki/Plugin/blogspam.pm ikiwiki-3.20190228/IkiWiki/Plugin/blogspam.pm
--- ikiwiki-3.20190207/IkiWiki/Plugin/blogspam.pm       2019-02-07 11:08:41.000000000 +0000
+++ ikiwiki-3.20190228/IkiWiki/Plugin/blogspam.pm       2019-02-26 23:01:54.000000000 +0000
@@ -57,18 +57,10 @@
        };
        error $@ if $@;
 
-       eval q{use LWPx::ParanoidAgent};
-       if (!$@) {
-               $client=LWPx::ParanoidAgent->new(agent => $config{useragent});
-       }
-       else {
-               eval q{use LWP};
-               if ($@) {
-                       error $@;
-                       return;
-               }
-               $client=useragent();
-       }
+       # Using the for_url parameter makes sure we crash if used
+       # with an older IkiWiki.pm that didn't automatically try
+       # to use LWPx::ParanoidAgent.
+       $client=useragent(for_url => $config{blogspam_server});
 }
 
 sub checkcontent (@) {
diff -Nru ikiwiki-3.20190207/IkiWiki/Plugin/openid.pm ikiwiki-3.20190228/IkiWiki/Plugin/openid.pm
--- ikiwiki-3.20190207/IkiWiki/Plugin/openid.pm 2019-02-07 11:08:41.000000000 +0000
+++ ikiwiki-3.20190228/IkiWiki/Plugin/openid.pm 2019-02-26 23:01:54.000000000 +0000
@@ -219,14 +219,10 @@
        eval q{use Net::OpenID::Consumer};
        error($@) if $@;
 
-       my $ua;
-       eval q{use LWPx::ParanoidAgent};
-       if (! $@) {
-               $ua=LWPx::ParanoidAgent->new(agent => $config{useragent});
-       }
-       else {
-               $ua=useragent();
-       }
+       # We pass the for_url parameter, even though it's undef, because
+       # that will make sure we crash if used with an older IkiWiki.pm
+       # that didn't automatically try to use LWPx::ParanoidAgent.
+       my $ua=useragent(for_url => undef);
 
        # Store the secret in the session.
        my $secret=$session->param("openid_secret");
diff -Nru ikiwiki-3.20190207/IkiWiki/Plugin/pinger.pm ikiwiki-3.20190228/IkiWiki/Plugin/pinger.pm
--- ikiwiki-3.20190207/IkiWiki/Plugin/pinger.pm 2019-02-07 11:08:41.000000000 +0000
+++ ikiwiki-3.20190228/IkiWiki/Plugin/pinger.pm 2019-02-26 23:01:54.000000000 +0000
@@ -70,17 +70,16 @@
                eval q{use Net::INET6Glue::INET_is_INET6}; # may not be available
                
                my $ua;
-               eval q{use LWPx::ParanoidAgent};
-               if (!$@) {
-                       $ua=LWPx::ParanoidAgent->new(agent => $config{useragent});
-               }
-               else {
-                       eval q{use LWP};
-                       if ($@) {
-                               debug(gettext("LWP not found, not pinging"));
-                               return;
-                       }
-                       $ua=useragent();
+               eval {
+                       # We pass the for_url parameter, even though it's
+                       # undef, because that will make sure we crash if used
+                       # with an older IkiWiki.pm that didn't automatically
+                       # try to use LWPx::ParanoidAgent.
+                       $ua=useragent(for_url => undef);
+               };
+               if ($@) {
+                       debug(gettext("LWP not found, not pinging").": $@");
+                       return;
                }
                $ua->timeout($config{pinger_timeout} || 15);
                
diff -Nru ikiwiki-3.20190207/IkiWiki/Plugin/po.pm ikiwiki-3.20190228/IkiWiki/Plugin/po.pm
--- ikiwiki-3.20190207/IkiWiki/Plugin/po.pm     2019-02-07 11:08:41.000000000 +0000
+++ ikiwiki-3.20190228/IkiWiki/Plugin/po.pm     2019-02-26 23:01:54.000000000 +0000
@@ -51,7 +51,6 @@
        hook(type => "checkcontent", id => "po", call => \&checkcontent);
        hook(type => "canremove", id => "po", call => \&canremove);
        hook(type => "canrename", id => "po", call => \&canrename);
-       hook(type => "editcontent", id => "po", call => \&editcontent);
        hook(type => "formbuilder_setup", id => "po", call => \&formbuilder_setup, last => 1);
        hook(type => "formbuilder", id => "po", call => \&formbuilder);
 
@@ -303,9 +302,8 @@
        my $page = $params{page};
        my $destpage = $params{destpage};
        my $content = $params{content};
-       if (istranslation($page) && ! alreadyfiltered($page, $destpage)) {
+       if (istranslation($page)) {
                $content = po_to_markup($page, $content);
-               setalreadyfiltered($page, $destpage);
        }
        return $content;
 }
@@ -520,15 +518,6 @@
        return undef;
 }
 
-# As we're previewing or saving a page, the content may have
-# changed, so tell the next filter() invocation it must not be lazy.
-sub editcontent () {
-       my %params=@_;
-
-       unsetalreadyfiltered($params{page}, $params{page});
-       return $params{content};
-}
-
 sub formbuilder_setup (@) {
        my %params=@_;
        my $form=$params{form};
@@ -737,42 +726,6 @@
 }
 
 # ,----
-# | Blackboxes for private data
-# `----
-
-{
-       my %filtered;
-
-       sub alreadyfiltered($$) {
-               my $page=shift;
-               my $destpage=shift;
-
-               return exists $filtered{$page}{$destpage}
-                        && $filtered{$page}{$destpage} eq 1;
-       }
-
-       sub setalreadyfiltered($$) {
-               my $page=shift;
-               my $destpage=shift;
-
-               $filtered{$page}{$destpage}=1;
-       }
-
-       sub unsetalreadyfiltered($$) {
-               my $page=shift;
-               my $destpage=shift;
-
-               if (exists $filtered{$page}{$destpage}) {
-                       delete $filtered{$page}{$destpage};
-               }
-       }
-
-       sub resetalreadyfiltered() {
-               undef %filtered;
-       }
-}
-
-# ,----
 # | Helper functions
 # `----
 
@@ -1146,7 +1099,6 @@
                IkiWiki::rcs_update();
        }
        # Reinitialize module's private variables.
-       resetalreadyfiltered();
        resettranslationscache();
        flushmemoizecache();
        # Trigger a wiki refresh.
diff -Nru ikiwiki-3.20190207/IkiWiki.pm ikiwiki-3.20190228/IkiWiki.pm
--- ikiwiki-3.20190207/IkiWiki.pm       2019-02-07 11:08:41.000000000 +0000
+++ ikiwiki-3.20190228/IkiWiki.pm       2019-02-26 23:01:54.000000000 +0000
@@ -2469,12 +2469,131 @@
        $autofiles{$file}{generator}=$generator;
 }
 
-sub useragent () {
-       return LWP::UserAgent->new(
-               cookie_jar => $config{cookiejar},
-               env_proxy => 1,         # respect proxy env vars
+sub useragent (@) {
+       my %params = @_;
+       my $for_url = delete $params{for_url};
+       # Fail safe, in case a plugin calling this function is relying on
+       # a future parameter to make the UA more strict
+       foreach my $key (keys %params) {
+               error "Internal error: useragent(\"$key\" => ...) not understood";
+       }
+
+       eval q{use LWP};
+       error($@) if $@;
+
+       my %args = (
                agent => $config{useragent},
+               cookie_jar => $config{cookiejar},
+               env_proxy => 0,
+               protocols_allowed => [qw(http https)],
        );
+       my %proxies;
+
+       if (defined $for_url) {
+               # We know which URL we're going to fetch, so we can choose
+               # whether it's going to go through a proxy or not.
+               #
+               # We reimplement http_proxy, https_proxy and no_proxy here, so
+               # that we are not relying on LWP implementing them exactly the
+               # same way we do.
+
+               eval q{use URI};
+               error($@) if $@;
+
+               my $proxy;
+               my $uri = URI->new($for_url);
+
+               if ($uri->scheme eq 'http') {
+                       $proxy = $ENV{http_proxy};
+                       # HTTP_PROXY is deliberately not implemented
+                       # because the HTTP_* namespace is also used by CGI
+               }
+               elsif ($uri->scheme eq 'https') {
+                       $proxy = $ENV{https_proxy};
+                       $proxy = $ENV{HTTPS_PROXY} unless defined $proxy;
+               }
+               else {
+                       $proxy = undef;
+               }
+
+               foreach my $var (qw(no_proxy NO_PROXY)) {
+                       my $no_proxy = $ENV{$var};
+                       if (defined $no_proxy) {
+                               foreach my $domain (split /\s*,\s*/, $no_proxy) {
+                                       if ($domain =~ s/^\*?\.//) {
+                                               # no_proxy="*.example.com" or
+                                               # ".example.com": match suffix
+                                               # against .example.com
+                                                       if ($uri->host =~ m/(^|\.)\Q$domain\E$/i) {
+                                                       $proxy = undef;
+                                               }
+                                       }
+                                       else {
+                                               # no_proxy="example.com":
+                                               # match exactly example.com
+                                               if (lc $uri->host eq lc $domain) {
+                                                       $proxy = undef;
+                                               }
+                                       }
+                               }
+                       }
+               }
+
+               if (defined $proxy) {
+                       $proxies{$uri->scheme} = $proxy;
+                       # Paranoia: make sure we can't bypass the proxy
+                       $args{protocols_allowed} = [$uri->scheme];
+               }
+       }
+       else {
+               # The plugin doesn't know yet which URL(s) it's going to
+               # fetch, so we have to make some conservative assumptions.
+               my $http_proxy = $ENV{http_proxy};
+               my $https_proxy = $ENV{https_proxy};
+               $https_proxy = $ENV{HTTPS_PROXY} unless defined $https_proxy;
+
+               # We don't respect no_proxy here: if we are not using the
+               # paranoid user-agent, then we need to give the proxy the
+               # opportunity to reject undesirable requests.
+
+               # If we have one, we need the other: otherwise, neither
+               # LWPx::ParanoidAgent nor the proxy would have the
+               # opportunity to filter requests for the other protocol.
+               if (defined $https_proxy && defined $http_proxy) {
+                       %proxies = (http => $http_proxy, https => $https_proxy);
+               }
+               elsif (defined $https_proxy) {
+                       %proxies = (http => $https_proxy, https => $https_proxy);
+               }
+               elsif (defined $http_proxy) {
+                       %proxies = (http => $http_proxy, https => $http_proxy);
+               }
+
+       }
+
+       if (scalar keys %proxies) {
+               # The configured proxy is responsible for deciding which
+               # URLs are acceptable to fetch and which URLs are not.
+               my $ua = LWP::UserAgent->new(%args);
+               foreach my $scheme (@{$ua->protocols_allowed}) {
+                       unless ($proxies{$scheme}) {
+                               error "internal error: $scheme is allowed but has no proxy";
+                       }
+               }
+               # We can't pass the proxies in %args because that only
+               # works since LWP 6.24.
+               foreach my $scheme (keys %proxies) {
+                       $ua->proxy($scheme, $proxies{$scheme});
+               }
+               return $ua;
+       }
+
+       eval q{use LWPx::ParanoidAgent};
+       if ($@) {
+               print STDERR "warning: installing LWPx::ParanoidAgent is recommended\n";
+               return LWP::UserAgent->new(%args);
+       }
+       return LWPx::ParanoidAgent->new(%args);
 }
 
 sub sortspec_translate ($$) {
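(Reviewer's aside: the no_proxy handling added above distinguishes suffix patterns — "*.example.com" or ".example.com", which match the domain and all of its subdomains — from bare entries like "example.com", which match exactly. A minimal standalone sketch of that rule, using a hypothetical helper name; host_bypasses_proxy is not part of ikiwiki:)

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Sketch of the no_proxy matching rule implemented in useragent() above.
# Returns true if $host should bypass the proxy given a no_proxy list.
sub host_bypasses_proxy {
	my ($host, $no_proxy) = @_;
	foreach my $domain (split /\s*,\s*/, $no_proxy) {
		if ($domain =~ s/^\*?\.//) {
			# "*.example.com" or ".example.com": match the
			# domain itself or any subdomain of it
			return 1 if $host =~ m/(^|\.)\Q$domain\E$/i;
		}
		else {
			# "example.com": exact, case-insensitive match only
			return 1 if lc $host eq lc $domain;
		}
	}
	return 0;
}
```

So with no_proxy="*.example.net", sub.example.net bypasses the proxy but badexample.net does not, which is exactly what t/useragent.t later asserts.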
diff -Nru ikiwiki-3.20190207/ikiwiki.spec ikiwiki-3.20190228/ikiwiki.spec
--- ikiwiki-3.20190207/ikiwiki.spec     2019-02-07 11:08:41.000000000 +0000
+++ ikiwiki-3.20190228/ikiwiki.spec     2019-02-26 23:01:54.000000000 +0000
@@ -1,5 +1,5 @@
 Name:           ikiwiki
-Version: 3.20190207
+Version: 3.20190228
 Release:        1%{?dist}
 Summary:        A wiki compiler
 
diff -Nru ikiwiki-3.20190207/po/ikiwiki.pot ikiwiki-3.20190228/po/ikiwiki.pot
--- ikiwiki-3.20190207/po/ikiwiki.pot   2019-02-07 11:08:41.000000000 +0000
+++ ikiwiki-3.20190228/po/ikiwiki.pot   2019-02-26 23:01:54.000000000 +0000
@@ -8,7 +8,7 @@
 msgstr ""
 "Project-Id-Version: PACKAGE VERSION\n"
 "Report-Msgid-Bugs-To: \n"
-"POT-Creation-Date: 2019-02-07 11:08+0000\n"
+"POT-Creation-Date: 2019-02-26 23:01+0000\n"
 "PO-Revision-Date: YEAR-MO-DA HO:MI+ZONE\n"
 "Last-Translator: FULL NAME <EMAIL@ADDRESS>\n"
 "Language-Team: LANGUAGE <[email protected]>\n"
@@ -109,30 +109,30 @@
 msgid "could not find feed at %s"
 msgstr ""
 
-#: ../IkiWiki/Plugin/aggregate.pm:529
+#: ../IkiWiki/Plugin/aggregate.pm:532
 msgid "feed not found"
 msgstr ""
 
-#: ../IkiWiki/Plugin/aggregate.pm:540
+#: ../IkiWiki/Plugin/aggregate.pm:543
 #, perl-format
 msgid "(invalid UTF-8 stripped from feed)"
 msgstr ""
 
-#: ../IkiWiki/Plugin/aggregate.pm:548
+#: ../IkiWiki/Plugin/aggregate.pm:551
 #, perl-format
 msgid "(feed entities escaped)"
 msgstr ""
 
-#: ../IkiWiki/Plugin/aggregate.pm:558
+#: ../IkiWiki/Plugin/aggregate.pm:561
 msgid "feed crashed XML::Feed!"
 msgstr ""
 
-#: ../IkiWiki/Plugin/aggregate.pm:654
+#: ../IkiWiki/Plugin/aggregate.pm:657
 #, perl-format
 msgid "creating new page %s"
 msgstr ""
 
-#: ../IkiWiki/Plugin/aggregate.pm:684 ../IkiWiki/Plugin/edittemplate.pm:137
+#: ../IkiWiki/Plugin/aggregate.pm:687 ../IkiWiki/Plugin/edittemplate.pm:137
 msgid "failed to process template:"
 msgstr ""
 
@@ -191,7 +191,7 @@
 msgid "creating index page %s"
 msgstr ""
 
-#: ../IkiWiki/Plugin/blogspam.pm:139
+#: ../IkiWiki/Plugin/blogspam.pm:131
 msgid ""
 "Sorry, but that looks like spam to <a href=\"http://blogspam.net/";
 "\">blogspam</a>: "
@@ -732,7 +732,7 @@
 msgid "Ignoring ping directive for wiki %s (this wiki is %s)"
 msgstr ""
 
-#: ../IkiWiki/Plugin/pinger.pm:80
+#: ../IkiWiki/Plugin/pinger.pm:81
 msgid "LWP not found, not pinging"
 msgstr ""
 
@@ -740,87 +740,87 @@
 msgid "warning: Old po4a detected! Recommend upgrade to 0.35."
 msgstr ""
 
-#: ../IkiWiki/Plugin/po.pm:179
+#: ../IkiWiki/Plugin/po.pm:178
 #, perl-format
 msgid "%s is not a valid language code"
 msgstr ""
 
-#: ../IkiWiki/Plugin/po.pm:191
+#: ../IkiWiki/Plugin/po.pm:190
 #, perl-format
 msgid ""
 "%s is not a valid value for po_link_to, falling back to po_link_to=default"
 msgstr ""
 
-#: ../IkiWiki/Plugin/po.pm:196
+#: ../IkiWiki/Plugin/po.pm:195
 msgid ""
 "po_link_to=negotiated requires usedirs to be enabled, falling back to "
 "po_link_to=default"
 msgstr ""
 
-#: ../IkiWiki/Plugin/po.pm:473
+#: ../IkiWiki/Plugin/po.pm:471
 msgid "updated PO files"
 msgstr ""
 
-#: ../IkiWiki/Plugin/po.pm:496
+#: ../IkiWiki/Plugin/po.pm:494
 msgid ""
 "Can not remove a translation. If the master page is removed, however, its "
 "translations will be removed as well."
 msgstr ""
 
-#: ../IkiWiki/Plugin/po.pm:516
+#: ../IkiWiki/Plugin/po.pm:514
 msgid ""
 "Can not rename a translation. If the master page is renamed, however, its "
 "translations will be renamed as well."
 msgstr ""
 
-#: ../IkiWiki/Plugin/po.pm:975
+#: ../IkiWiki/Plugin/po.pm:928
 #, perl-format
 msgid "POT file (%s) does not exist"
 msgstr ""
 
-#: ../IkiWiki/Plugin/po.pm:989
+#: ../IkiWiki/Plugin/po.pm:942
 #, perl-format
 msgid "failed to copy underlay PO file to %s"
 msgstr ""
 
-#: ../IkiWiki/Plugin/po.pm:997
+#: ../IkiWiki/Plugin/po.pm:950
 #, perl-format
 msgid "failed to update %s"
 msgstr ""
 
-#: ../IkiWiki/Plugin/po.pm:1003
+#: ../IkiWiki/Plugin/po.pm:956
 #, perl-format
 msgid "failed to copy the POT file to %s"
 msgstr ""
 
-#: ../IkiWiki/Plugin/po.pm:1039
+#: ../IkiWiki/Plugin/po.pm:992
 msgid "N/A"
 msgstr ""
 
-#: ../IkiWiki/Plugin/po.pm:1050
+#: ../IkiWiki/Plugin/po.pm:1003
 #, perl-format
 msgid "failed to translate %s"
 msgstr ""
 
-#: ../IkiWiki/Plugin/po.pm:1133
+#: ../IkiWiki/Plugin/po.pm:1086
 msgid "removed obsolete PO files"
 msgstr ""
 
-#: ../IkiWiki/Plugin/po.pm:1190 ../IkiWiki/Plugin/po.pm:1202
-#: ../IkiWiki/Plugin/po.pm:1241
+#: ../IkiWiki/Plugin/po.pm:1142 ../IkiWiki/Plugin/po.pm:1154
+#: ../IkiWiki/Plugin/po.pm:1193
 #, perl-format
 msgid "failed to write %s"
 msgstr ""
 
-#: ../IkiWiki/Plugin/po.pm:1200
+#: ../IkiWiki/Plugin/po.pm:1152
 msgid "failed to translate"
 msgstr ""
 
-#: ../IkiWiki/Plugin/po.pm:1253
+#: ../IkiWiki/Plugin/po.pm:1205
 msgid "invalid gettext data, go back to previous page to continue edit"
 msgstr ""
 
-#: ../IkiWiki/Plugin/po.pm:1296
+#: ../IkiWiki/Plugin/po.pm:1248
 #, perl-format
 msgid "%s has invalid syntax: must use CODE|NAME"
 msgstr ""
@@ -1395,17 +1395,17 @@
 msgid "yes"
 msgstr ""
 
-#: ../IkiWiki.pm:2507
+#: ../IkiWiki.pm:2626
 #, perl-format
 msgid "invalid sort type %s"
 msgstr ""
 
-#: ../IkiWiki.pm:2528
+#: ../IkiWiki.pm:2647
 #, perl-format
 msgid "unknown sort type %s"
 msgstr ""
 
-#: ../IkiWiki.pm:2677
+#: ../IkiWiki.pm:2796
 #, perl-format
 msgid "cannot match pages: %s"
 msgstr ""
diff -Nru ikiwiki-3.20190207/t/aggregate-file.t ikiwiki-3.20190228/t/aggregate-file.t
--- ikiwiki-3.20190207/t/aggregate-file.t       1970-01-01 01:00:00.000000000 +0100
+++ ikiwiki-3.20190228/t/aggregate-file.t       2019-02-26 23:01:54.000000000 +0000
@@ -0,0 +1,173 @@
+#!/usr/bin/perl
+use utf8;
+use warnings;
+use strict;
+
+use Encode;
+use Test::More;
+
+BEGIN {
+       plan(skip_all => "CGI not available")
+               unless eval q{
+                       use CGI qw();
+                       1;
+               };
+
+       plan(skip_all => "IPC::Run not available")
+               unless eval q{
+                       use IPC::Run qw(run);
+                       1;
+               };
+
+       use_ok('IkiWiki');
+       use_ok('YAML::XS');
+}
+
+# We check for English error messages
+$ENV{LC_ALL} = 'C';
+
+use Cwd qw(getcwd);
+use Errno qw(ENOENT);
+
+my $installed = $ENV{INSTALLED_TESTS};
+
+my @command;
+if ($installed) {
+       @command = qw(ikiwiki --plugin inline);
+}
+else {
+       ok(! system("make -s ikiwiki.out"));
+       @command = ("perl", "-I".getcwd."/blib/lib", './ikiwiki.out',
+               '--underlaydir='.getcwd.'/underlays/basewiki',
+               '--set', 'underlaydirbase='.getcwd.'/underlays',
+               '--templatedir='.getcwd.'/templates');
+}
+
+sub write_old_file {
+       my $name = shift;
+       my $dir = shift;
+       my $content = shift;
+       writefile($name, $dir, $content);
+       ok(utime(333333333, 333333333, "$dir/$name"));
+}
+
+sub write_setup_file {
+       my %params = @_;
+       my %setup = (
+               wikiname => 'this is the name of my wiki',
+               srcdir => getcwd.'/t/tmp/in',
+               destdir => getcwd.'/t/tmp/out',
+               url => 'http://example.com',
+               cgiurl => 'http://example.com/cgi-bin/ikiwiki.cgi',
+               cgi_wrapper => getcwd.'/t/tmp/ikiwiki.cgi',
+               cgi_wrappermode => '0751',
+               add_plugins => [qw(aggregate)],
+               disable_plugins => [qw(emailauth openid passwordauth)],
+               aggregate_webtrigger => 1,
+       );
+       if ($params{without_paranoia}) {
+               $setup{libdirs} = [getcwd.'/t/noparanoia'];
+       }
+       unless ($installed) {
+               $setup{ENV} = { 'PERL5LIB' => getcwd.'/blib/lib' };
+       }
+       writefile("test.setup", "t/tmp",
+               "# IkiWiki::Setup::Yaml - YAML formatted setup file\n" .
+               Dump(\%setup));
+}
+
+sub thoroughly_rebuild {
+       ok(unlink("t/tmp/ikiwiki.cgi") || $!{ENOENT});
+       ok(! system(@command, qw(--setup t/tmp/test.setup --rebuild --wrappers)));
+}
+
+sub run_cgi {
+       my (%args) = @_;
+       my ($in, $out);
+       my $method = $args{method} || 'GET';
+       my $environ = $args{environ} || {};
+       my $params = $args{params} || { do => 'prefs' };
+
+       my %defaults = (
+               SCRIPT_NAME     => '/cgi-bin/ikiwiki.cgi',
+               HTTP_HOST       => 'example.com',
+       );
+
+       my $cgi = CGI->new($args{params});
+       my $query_string = $cgi->query_string();
+       diag $query_string;
+
+       if ($method eq 'POST') {
+               $defaults{REQUEST_METHOD} = 'POST';
+               $in = $query_string;
+               $defaults{CONTENT_LENGTH} = length $in;
+       } else {
+               $defaults{REQUEST_METHOD} = 'GET';
+               $defaults{QUERY_STRING} = $query_string;
+       }
+
+       my %envvars = (
+               %defaults,
+               %$environ,
+       );
+       run(["./t/tmp/ikiwiki.cgi"], \$in, \$out, init => sub {
+               map {
+                       $ENV{$_} = $envvars{$_}
+               } keys(%envvars);
+       });
+
+       return decode_utf8($out);
+}
+
+sub test {
+       my $content;
+
+       ok(! system(qw(rm -rf t/tmp)));
+       ok(! system(qw(mkdir t/tmp)));
+
+       write_old_file('aggregator.mdwn', 't/tmp/in',
+               '[[!aggregate name="ssrf" url="file://'.getcwd.'/t/secret.rss"]]'
+               .'[[!inline pages="internal(aggregator/*)"]]');
+
+       write_setup_file();
+       thoroughly_rebuild();
+
+       $content = run_cgi(
+               method => 'GET',
+               params => {
+                       do => 'aggregate_webtrigger',
+               },
+       );
+       unlike($content, qr{creating new page});
+       unlike($content, qr{Secrets});
+       ok(! -e 't/tmp/in/.ikiwiki/transient/aggregator/ssrf');
+       ok(! -e 't/tmp/in/.ikiwiki/transient/aggregator/ssrf/Secrets_go_here._aggregated');
+
+       thoroughly_rebuild();
+       $content = readfile('t/tmp/out/aggregator/index.html');
+       unlike($content, qr{Secrets});
+
+       diag('Trying test again with LWPx::ParanoidAgent disabled');
+
+       write_setup_file(without_paranoia => 1);
+       thoroughly_rebuild();
+
+       $content = run_cgi(
+               method => 'GET',
+               params => {
+                       do => 'aggregate_webtrigger',
+               },
+       );
+       unlike($content, qr{creating new page});
+       unlike($content, qr{Secrets});
+       ok(! -e 't/tmp/in/.ikiwiki/transient/aggregator/ssrf');
+       ok(! -e 't/tmp/in/.ikiwiki/transient/aggregator/ssrf/Secrets_go_here._aggregated');
+
+       thoroughly_rebuild();
+       $content = readfile('t/tmp/out/aggregator/index.html');
+       unlike($content, qr{Secrets});
+}
+
+test();
+
+done_testing();
diff -Nru ikiwiki-3.20190207/t/noparanoia/LWPx/ParanoidAgent.pm ikiwiki-3.20190228/t/noparanoia/LWPx/ParanoidAgent.pm
--- ikiwiki-3.20190207/t/noparanoia/LWPx/ParanoidAgent.pm       1970-01-01 01:00:00.000000000 +0100
+++ ikiwiki-3.20190228/t/noparanoia/LWPx/ParanoidAgent.pm       2019-02-26 23:01:54.000000000 +0000
@@ -0,0 +1,2 @@
+# make import fail
+0;
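(Reviewer's aside: this two-line stub works because Perl requires a loaded module file to end with a true value; returning 0 makes `use LWPx::ParanoidAgent` fail, so the test suite can exercise the plain LWP::UserAgent fallback even on hosts where the real module is installed. A self-contained demonstration — the temporary directory and file exist only for this illustration:)

```perl
#!/usr/bin/perl
use strict;
use warnings;
use File::Temp qw(tempdir);

# Recreate the two-line stub in a temporary @INC directory and show
# that loading it fails with Perl's standard diagnostic.
my $dir = tempdir(CLEANUP => 1);
mkdir "$dir/LWPx" or die $!;
open my $fh, '>', "$dir/LWPx/ParanoidAgent.pm" or die $!;
print $fh "# make import fail\n0;\n";
close $fh or die $!;

# Our stub shadows any real installation because it comes first in @INC.
unshift @INC, $dir;
my $loaded = eval { require LWPx::ParanoidAgent; 1 };
```

After this, $loaded is undef and $@ reports that the module "did not return a true value".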
diff -Nru ikiwiki-3.20190207/t/po.t ikiwiki-3.20190228/t/po.t
--- ikiwiki-3.20190207/t/po.t   2019-02-07 11:08:41.000000000 +0000
+++ ikiwiki-3.20190228/t/po.t   2019-02-26 23:01:54.000000000 +0000
@@ -447,25 +447,10 @@
        \s*
        <p>Entre\sles\sinlines</p>
        \s*
-       .*      # TODO: This paragraph gets mangled (Debian #911356)
-       \s*
-       <p>Après\sles\sinlines</p>
-}sx);
-
-TODO: {
-local $TODO = "Debian bug #911356";
-like($output{'debian911356.fr'}, qr{
-       <p>Avant\sla\spremière\sinline</p>
-       \s*
-       <p>Contenu\sfrançais</p>
-       \s*
-       <p>Entre\sles\sinlines</p>
-       \s*
        <p>Contenu\sfrançais</p>
        \s*
        <p>Après\sles\sinlines</p>
 }sx);
-};
 
 # Variation of Debian #911356 without using raw inlines.
 like($output{debian911356ish}, qr{
@@ -511,28 +496,6 @@
        \s*
        <p>Entre\sles\sinlines</p>
        \s*
-       .*      # TODO: This paragraph gets mangled (Debian #911356)
-       \s*
-       <p>Après\sles\sinlines</p>
-}sx);
-
-TODO: {
-local $TODO = "Debian bug #911356";
-like($output{'debian911356ish.fr'}, qr{
-       <p>Avant\sla\spremière\sinline</p>
-       \s*
-       <!--feedlinks-->
-       \s*
-       <div\sclass="inlinecontent">
-       \s*
-       <h6>debian911356-inlined\.fr</h6>
-       \s*
-       <p>Contenu\sfrançais</p>
-       \s*
-       </div><!--inlinecontent-->
-       \s*
-       <p>Entre\sles\sinlines</p>
-       \s*
        <!--feedlinks-->
        \s*
        <div\sclass="inlinecontent">
@@ -545,6 +508,5 @@
        \s*
        <p>Après\sles\sinlines</p>
 }sx);
-};
 
 done_testing;
diff -Nru ikiwiki-3.20190207/t/secret.rss ikiwiki-3.20190228/t/secret.rss
--- ikiwiki-3.20190207/t/secret.rss     1970-01-01 01:00:00.000000000 +0100
+++ ikiwiki-3.20190228/t/secret.rss     2019-02-26 23:01:54.000000000 +0000
@@ -0,0 +1,11 @@
+<?xml version="1.0"?>
+<rss version="2.0">
+<channel>
+<title>Secrets go here</title>
+<description>Secrets go here</description>
+<item>
+  <title>Secrets go here</title>
+  <description>Secrets go here</description>
+</item>
+</channel>
+</rss>
diff -Nru ikiwiki-3.20190207/t/useragent.t ikiwiki-3.20190228/t/useragent.t
--- ikiwiki-3.20190207/t/useragent.t    1970-01-01 01:00:00.000000000 +0100
+++ ikiwiki-3.20190228/t/useragent.t    2019-02-26 23:01:54.000000000 +0000
@@ -0,0 +1,317 @@
+#!/usr/bin/perl
+use warnings;
+use strict;
+use Test::More;
+
+my $have_paranoid_agent;
+BEGIN {
+       plan(skip_all => 'LWP not available')
+               unless eval q{
+                       use LWP qw(); 1;
+               };
+       use_ok("IkiWiki");
+       $have_paranoid_agent = eval q{
+               use LWPx::ParanoidAgent qw(); 1;
+       };
+}
+
+eval { useragent(future_feature => 1); };
+ok($@, 'future features should cause useragent to fail');
+
+diag "==== No proxy ====";
+delete $ENV{http_proxy};
+delete $ENV{https_proxy};
+delete $ENV{no_proxy};
+delete $ENV{HTTPS_PROXY};
+delete $ENV{NO_PROXY};
+
+diag "---- Unspecified URL ----";
+my $ua = useragent(for_url => undef);
+SKIP: {
+       skip 'paranoid agent not available', 1 unless $have_paranoid_agent;
+       ok($ua->isa('LWPx::ParanoidAgent'), 'uses ParanoidAgent if possible');
+}
+is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
+is($ua->proxy('http'), undef, 'No http proxy');
+is($ua->proxy('https'), undef, 'No https proxy');
+
+diag "---- Specified URL ----";
+$ua = useragent(for_url => 'http://example.com');
+SKIP: {
+       skip 'paranoid agent not available', 1 unless $have_paranoid_agent;
+       ok($ua->isa('LWPx::ParanoidAgent'), 'uses ParanoidAgent if possible');
+}
+is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
+is($ua->proxy('http'), undef, 'No http proxy');
+is($ua->proxy('https'), undef, 'No https proxy');
+
+diag "==== Proxy for everything ====";
+$ENV{http_proxy} = 'http://proxy:8080';
+$ENV{https_proxy} = 'http://sproxy:8080';
+delete $ENV{no_proxy};
+delete $ENV{HTTPS_PROXY};
+delete $ENV{NO_PROXY};
+
+diag "---- Unspecified URL ----";
+$ua = useragent(for_url => undef);
+ok(! $ua->isa('LWPx::ParanoidAgent'), 'should use proxy instead of ParanoidAgent');
+is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
+is($ua->proxy('http'), 'http://proxy:8080', 'should use proxy');
+is($ua->proxy('https'), 'http://sproxy:8080', 'should use CONNECT proxy');
+$ua = useragent(for_url => 'http://example.com');
+ok(! $ua->isa('LWPx::ParanoidAgent'), 'should use proxy instead of ParanoidAgent');
+is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http)]);
+is($ua->proxy('http'), 'http://proxy:8080', 'should use proxy');
+# We don't care what $ua->proxy('https') is, because it won't be used
+$ua = useragent(for_url => 'https://example.com');
+ok(! $ua->isa('LWPx::ParanoidAgent'), 'should use proxy instead of ParanoidAgent');
+is_deeply([sort @{$ua->protocols_allowed}], [sort qw(https)]);
+# We don't care what $ua->proxy('http') is, because it won't be used
+is($ua->proxy('https'), 'http://sproxy:8080', 'should use CONNECT proxy');
+
+diag "==== Selective proxy ====";
+$ENV{http_proxy} = 'http://proxy:8080';
+$ENV{https_proxy} = 'http://sproxy:8080';
+$ENV{no_proxy} = '*.example.net,example.com,.example.org';
+delete $ENV{HTTPS_PROXY};
+delete $ENV{NO_PROXY};
+
+diag "---- Unspecified URL ----";
+$ua = useragent(for_url => undef);
+ok(! $ua->isa('LWPx::ParanoidAgent'), 'should use proxy instead of ParanoidAgent');
+is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
+is($ua->proxy('http'), 'http://proxy:8080', 'should use proxy');
+is($ua->proxy('https'), 'http://sproxy:8080', 'should use CONNECT proxy');
+
+diag "---- Exact match for no_proxy ----";
+$ua = useragent(for_url => 'http://example.com');
+SKIP: {
+       skip 'paranoid agent not available', 1 unless $have_paranoid_agent;
+       ok($ua->isa('LWPx::ParanoidAgent'), 'uses ParanoidAgent if possible');
+}
+is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
+is($ua->proxy('http'), undef);
+is($ua->proxy('https'), undef);
+
+diag "---- Subdomain of exact domain in no_proxy ----";
+$ua = useragent(for_url => 'http://sub.example.com');
+ok(! $ua->isa('LWPx::ParanoidAgent'), 'should use proxy instead of ParanoidAgent');
+is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http)]);
+is($ua->proxy('http'), 'http://proxy:8080', 'should use proxy');
+
+diag "---- example.net matches *.example.net ----";
+$ua = useragent(for_url => 'https://example.net');
+SKIP: {
+       skip 'paranoid agent not available', 1 unless $have_paranoid_agent;
+       ok($ua->isa('LWPx::ParanoidAgent'), 'uses ParanoidAgent if possible');
+}
+is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
+is($ua->proxy('http'), undef);
+is($ua->proxy('https'), undef);
+
+diag "---- sub.example.net matches *.example.net ----";
+$ua = useragent(for_url => 'https://sub.example.net');
+SKIP: {
+       skip 'paranoid agent not available', 1 unless $have_paranoid_agent;
+       ok($ua->isa('LWPx::ParanoidAgent'), 'uses ParanoidAgent if possible');
+}
+is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
+is($ua->proxy('http'), undef);
+is($ua->proxy('https'), undef);
+
+diag "---- badexample.net does not match *.example.net ----";
+$ua = useragent(for_url => 'https://badexample.net');
+ok(! $ua->isa('LWPx::ParanoidAgent'), 'should use proxy instead of ParanoidAgent');
+is_deeply([sort @{$ua->protocols_allowed}], [sort qw(https)]);
+is($ua->proxy('https'), 'http://sproxy:8080', 'should use proxy');
+
+diag "---- example.org matches .example.org ----";
+$ua = useragent(for_url => 'https://example.org');
+SKIP: {
+       skip 'paranoid agent not available', 1 unless $have_paranoid_agent;
+       ok($ua->isa('LWPx::ParanoidAgent'), 'uses ParanoidAgent if possible');
+}
+is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
+is($ua->proxy('http'), undef);
+is($ua->proxy('https'), undef);
+
+diag "---- sub.example.org matches .example.org ----";
+$ua = useragent(for_url => 'https://sub.example.org');
+SKIP: {
+       skip 'paranoid agent not available', 1 unless $have_paranoid_agent;
+       ok($ua->isa('LWPx::ParanoidAgent'), 'uses ParanoidAgent if possible');
+}
+is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
+is($ua->proxy('http'), undef);
+is($ua->proxy('https'), undef);
+
+diag "---- badexample.org does not match .example.org ----";
+$ua = useragent(for_url => 'https://badexample.org');
+ok(! $ua->isa('LWPx::ParanoidAgent'), 'should use proxy instead of ParanoidAgent');
+is_deeply([sort @{$ua->protocols_allowed}], [sort qw(https)]);
+is($ua->proxy('https'), 'http://sproxy:8080', 'should use proxy');
+
+diag "==== Selective proxy (alternate variables) ====";
+$ENV{http_proxy} = 'http://proxy:8080';
+delete $ENV{https_proxy};
+$ENV{HTTPS_PROXY} = 'http://sproxy:8080';
+delete $ENV{no_proxy};
+$ENV{NO_PROXY} = '*.example.net,example.com,.example.org';
+
+diag "---- Unspecified URL ----";
+$ua = useragent(for_url => undef);
+ok(! $ua->isa('LWPx::ParanoidAgent'), 'should use proxy instead of ParanoidAgent');
+is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
+is($ua->proxy('http'), 'http://proxy:8080', 'should use proxy');
+is($ua->proxy('https'), 'http://sproxy:8080', 'should use CONNECT proxy');
+
+diag "---- Exact match for no_proxy ----";
+$ua = useragent(for_url => 'http://example.com');
+SKIP: {
+       skip 'paranoid agent not available', 1 unless $have_paranoid_agent;
+       ok($ua->isa('LWPx::ParanoidAgent'), 'uses ParanoidAgent if possible');
+}
+is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
+is($ua->proxy('http'), undef);
+is($ua->proxy('https'), undef);
+
+diag "---- Subdomain of exact domain in no_proxy ----";
+$ua = useragent(for_url => 'http://sub.example.com');
+ok(! $ua->isa('LWPx::ParanoidAgent'), 'should use proxy instead of ParanoidAgent');
+is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http)]);
+is($ua->proxy('http'), 'http://proxy:8080', 'should use proxy');
+
+diag "---- example.net matches *.example.net ----";
+$ua = useragent(for_url => 'https://example.net');
+SKIP: {
+       skip 'paranoid agent not available', 1 unless $have_paranoid_agent;
+       ok($ua->isa('LWPx::ParanoidAgent'), 'uses ParanoidAgent if possible');
+}
+is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
+is($ua->proxy('http'), undef);
+is($ua->proxy('https'), undef);
+
+diag "---- sub.example.net matches *.example.net ----";
+$ua = useragent(for_url => 'https://sub.example.net');
+SKIP: {
+       skip 'paranoid agent not available', 1 unless $have_paranoid_agent;
+       ok($ua->isa('LWPx::ParanoidAgent'), 'uses ParanoidAgent if possible');
+}
+is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
+is($ua->proxy('http'), undef);
+is($ua->proxy('https'), undef);
+
+diag "---- badexample.net does not match *.example.net ----";
+$ua = useragent(for_url => 'https://badexample.net');
+ok(! $ua->isa('LWPx::ParanoidAgent'), 'should use proxy instead of ParanoidAgent');
+is_deeply([sort @{$ua->protocols_allowed}], [sort qw(https)]);
+is($ua->proxy('https'), 'http://sproxy:8080', 'should use proxy');
+
+diag "---- example.org matches .example.org ----";
+$ua = useragent(for_url => 'https://example.org');
+SKIP: {
+       skip 'paranoid agent not available', 1 unless $have_paranoid_agent;
+       ok($ua->isa('LWPx::ParanoidAgent'), 'uses ParanoidAgent if possible');
+}
+is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
+is($ua->proxy('http'), undef);
+is($ua->proxy('https'), undef);
+
+diag "---- sub.example.org matches .example.org ----";
+$ua = useragent(for_url => 'https://sub.example.org');
+SKIP: {
+       skip 'paranoid agent not available', 1 unless $have_paranoid_agent;
+       ok($ua->isa('LWPx::ParanoidAgent'), 'uses ParanoidAgent if possible');
+}
+is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
+is($ua->proxy('http'), undef);
+is($ua->proxy('https'), undef);
+
+diag "---- badexample.org does not match .example.org ----";
+$ua = useragent(for_url => 'https://badexample.org');
+ok(! $ua->isa('LWPx::ParanoidAgent'), 'should use proxy instead of ParanoidAgent');
+is_deeply([sort @{$ua->protocols_allowed}], [sort qw(https)]);
+is($ua->proxy('https'), 'http://sproxy:8080', 'should use proxy');
+
+diag "==== Selective proxy (many variables) ====";
+$ENV{http_proxy} = 'http://proxy:8080';
+$ENV{https_proxy} = 'http://sproxy:8080';
+# This one should be ignored in favour of https_proxy
+$ENV{HTTPS_PROXY} = 'http://not.preferred.proxy:3128';
+# These two should be merged
+$ENV{no_proxy} = '*.example.net,example.com';
+$ENV{NO_PROXY} = '.example.org';
+
+diag "---- Unspecified URL ----";
+$ua = useragent(for_url => undef);
+ok(! $ua->isa('LWPx::ParanoidAgent'), 'should use proxy instead of ParanoidAgent');
+is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
+is($ua->proxy('http'), 'http://proxy:8080', 'should use proxy');
+is($ua->proxy('https'), 'http://sproxy:8080', 'should use CONNECT proxy');
+
+diag "---- Exact match for no_proxy ----";
+$ua = useragent(for_url => 'http://example.com');
+SKIP: {
+       skip 'paranoid agent not available', 1 unless $have_paranoid_agent;
+       ok($ua->isa('LWPx::ParanoidAgent'), 'uses ParanoidAgent if possible');
+}
+is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
+is($ua->proxy('http'), undef);
+is($ua->proxy('https'), undef);
+
+diag "---- Subdomain of exact domain in no_proxy ----";
+$ua = useragent(for_url => 'http://sub.example.com');
+ok(! $ua->isa('LWPx::ParanoidAgent'), 'should use proxy instead of ParanoidAgent');
+is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http)]);
+is($ua->proxy('http'), 'http://proxy:8080', 'should use proxy');
+
+diag "---- example.net matches *.example.net ----";
+$ua = useragent(for_url => 'https://example.net');
+SKIP: {
+       skip 'paranoid agent not available', 1 unless $have_paranoid_agent;
+       ok($ua->isa('LWPx::ParanoidAgent'), 'uses ParanoidAgent if possible');
+}
+is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
+is($ua->proxy('http'), undef);
+is($ua->proxy('https'), undef);
+
+diag "---- sub.example.net matches *.example.net ----";
+$ua = useragent(for_url => 'https://sub.example.net');
+SKIP: {
+       skip 'paranoid agent not available', 1 unless $have_paranoid_agent;
+       ok($ua->isa('LWPx::ParanoidAgent'), 'uses ParanoidAgent if possible');
+}
+is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
+is($ua->proxy('http'), undef);
+is($ua->proxy('https'), undef);
+
+diag "---- badexample.net does not match *.example.net ----";
+$ua = useragent(for_url => 'https://badexample.net');
+ok(! $ua->isa('LWPx::ParanoidAgent'), 'should use proxy instead of ParanoidAgent');
+is_deeply([sort @{$ua->protocols_allowed}], [sort qw(https)]);
+is($ua->proxy('https'), 'http://sproxy:8080', 'should use proxy');
+
+diag "==== One but not the other ====\n";
+$ENV{http_proxy} = 'http://proxy:8080';
+delete $ENV{https_proxy};
+delete $ENV{HTTPS_PROXY};
+delete $ENV{no_proxy};
+delete $ENV{NO_PROXY};
+$ua = useragent(for_url => undef);
+ok(! $ua->isa('LWPx::ParanoidAgent'), 'should use proxy instead of ParanoidAgent');
+is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
+is($ua->proxy('http'), 'http://proxy:8080', 'should use proxy');
+is($ua->proxy('https'), 'http://proxy:8080', 'should use proxy');
+
+delete $ENV{http_proxy};
+$ENV{https_proxy} = 'http://sproxy:8080';
+delete $ENV{HTTPS_PROXY};
+delete $ENV{no_proxy};
+delete $ENV{NO_PROXY};
+$ua = useragent(for_url => undef);
+ok(! $ua->isa('LWPx::ParanoidAgent'), 'should use proxy instead of ParanoidAgent');
+is_deeply([sort @{$ua->protocols_allowed}], [sort qw(http https)]);
+is($ua->proxy('http'), 'http://sproxy:8080', 'should use proxy');
+is($ua->proxy('https'), 'http://sproxy:8080', 'should use proxy');
+
+done_testing;
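(Reviewer's aside: the precedence these tests pin down is that the lowercase https_proxy wins over HTTPS_PROXY, both no_proxy and NO_PROXY are consulted, and http_proxy has no uppercase counterpart because HTTP_PROXY would collide with CGI's HTTP_* request-header namespace. A minimal sketch of the https resolution step; effective_proxies is an illustrative name, not part of ikiwiki:)

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Sketch of the proxy-variable precedence exercised by t/useragent.t:
# https_proxy is preferred, HTTPS_PROXY is the fallback; http_proxy
# stands alone because HTTP_PROXY is deliberately not implemented.
sub effective_proxies {
	my %env = @_;
	my $https = $env{https_proxy};
	$https = $env{HTTPS_PROXY} unless defined $https;
	return (http => $env{http_proxy}, https => $https);
}
```

For example, with both https_proxy and HTTPS_PROXY set, the lowercase value is the one used, matching the "Selective proxy (many variables)" block above.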
