Xqt has uploaded a new change for review. ( https://gerrit.wikimedia.org/r/344829 )

Change subject: [bugfix] Enable specialbots.py for PY3
......................................................................

[bugfix] Enable specialbots.py for PY3

- The object returned by info() behaves differently in Python 2 and Python 3,
  so the response headers have to be read in a version-specific way; this
  patch does exactly that (see the sketch below).

  See https://bugs.python.org/issue4773
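
  A minimal sketch of the difference, outside the patch (the URL below is
  only a placeholder): in Python 2, info() returns a mimetools.Message,
  which exposes getheader(); in Python 3 it returns an
  email.message.Message (http.client.HTTPMessage), which uses get().

    import sys

    if sys.version_info[0] > 2:
        from urllib.request import URLopener
    else:
        from urllib import URLopener

    # Placeholder URL for illustration only.
    file_url = 'https://example.org/robots.txt'
    info = URLopener().open(file_url).info()

    if sys.version_info[0] > 2:
        # Python 3: email.message.Message API
        content_type = info.get('Content-Type')
    else:
        # Python 2: mimetools.Message API
        content_type = info.getheader('Content-Type')

    print(content_type)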

Bug: T161457
Change-Id: I326d2f1aea543a4aae614a90d16c5756a45af4a0
---
M pywikibot/specialbots.py
1 file changed, 17 insertions(+), 9 deletions(-)


  git pull ssh://gerrit.wikimedia.org:29418/pywikibot/core refs/changes/29/344829/1

diff --git a/pywikibot/specialbots.py b/pywikibot/specialbots.py
index c524269..d252db9 100644
--- a/pywikibot/specialbots.py
+++ b/pywikibot/specialbots.py
@@ -3,7 +3,7 @@
 """Library containing special bots."""
 #
 # (C) Rob W.W. Hooft, Andre Engels 2003-2004
-# (C) Pywikibot team, 2003-2016
+# (C) Pywikibot team, 2003-2017
 #
 # Distributed under the terms of the MIT license.
 #
@@ -23,12 +23,10 @@
 from pywikibot import config
 
 from pywikibot.bot import BaseBot
-from pywikibot.tools import (
-    deprecated
-)
+from pywikibot.tools import PY2, deprecated
 from pywikibot.tools.formatter import color_format
 
-if sys.version_info[0] > 2:
+if not PY2:
     from urllib.parse import urlparse
     from urllib.request import URLopener
 
@@ -141,14 +139,24 @@
                 uo.addheader('Range', 'bytes=%s-' % rlen)
 
             infile = uo.open(file_url)
+            info = infile.info()
 
-            if 'text/html' in infile.info().getheader('Content-Type'):
+            if PY2:
+                content_type = info.getheader('Content-Type')
+                content_len = info.getheader('Content-Length')
+                accept_ranges = info.getheader('Accept-Ranges')
+            else:
+                content_type = info.get('Content-Type')
+                content_len = info.get('Content-Length')
+                accept_ranges = info.get('Accept-Ranges')
+
+
+            if 'text/html' in content_type:
                 pywikibot.output(u"Couldn't download the image: "
                                  "the requested URL was not found on server.")
                 return
 
-            content_len = infile.info().getheader('Content-Length')
-            accept_ranges = infile.info().getheader('Accept-Ranges') == 'bytes'
+            valid_ranges = accept_ranges == 'bytes'
 
             if resume:
                 _contents += infile.read()
@@ -166,7 +174,7 @@
                     pywikibot.output(
                         u"Connection closed at byte %s (%s left)"
                         % (rlen, content_len))
-                    if accept_ranges and rlen > 0:
+                    if valid_ranges and rlen > 0:
                         resume = True
                     pywikibot.output(u"Sleeping for %d seconds..." % dt)
                     time.sleep(dt)

-- 
To view, visit https://gerrit.wikimedia.org/r/344829
To unsubscribe, visit https://gerrit.wikimedia.org/r/settings

Gerrit-MessageType: newchange
Gerrit-Change-Id: I326d2f1aea543a4aae614a90d16c5756a45af4a0
Gerrit-PatchSet: 1
Gerrit-Project: pywikibot/core
Gerrit-Branch: master
Gerrit-Owner: Xqt <i...@gno.de>
