Hello community,

here is the log from the commit of package python3-requests for openSUSE:Factory checked in at 2015-02-27 10:59:54
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/python3-requests (Old)
 and      /work/SRC/openSUSE:Factory/.python3-requests.new (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

Package is "python3-requests"

Changes:
--------
--- /work/SRC/openSUSE:Factory/python3-requests/python3-requests.changes        2015-01-20 12:37:38.000000000 +0100
+++ /work/SRC/openSUSE:Factory/.python3-requests.new/python3-requests.changes   2015-02-27 11:00:29.000000000 +0100
@@ -1,0 +2,33 @@
+Thu Feb 26 02:44:36 UTC 2015 - a...@gmx.de
+
+- update to version 2.5.3:
+  * Revert changes to our vendored certificate bundle. For more
+    context see (#2455, #2456, and http://bugs.python.org/issue23476)
+
+- changes from version 2.5.2:
+  * Features and Improvements
+    - Add sha256 fingerprint support. (shazow/urllib3#540)
+    - Improve the performance of headers. (shazow/urllib3#544)
+  * Bugfixes
+    - Copy pip’s import machinery. When downstream redistributors
+      remove requests.packages.urllib3 the import machinery will
+      continue to let those same symbols work. Example usage in
+      requests’ documentation and 3rd-party libraries relying on the
+      vendored copies of urllib3 will work without having to fallback
+      to the system urllib3.
+    - Attempt to quote parts of the URL on redirect if unquoting and
+      then quoting fails. (#2356)
+    - Fix filename type check for multipart form-data uploads. (#2411)
+    - Properly handle the case where a server issuing digest
+      authentication challenges provides both auth and auth-int
+      qop-values. (#2408)
+    - Fix a socket leak. (shazow/urllib3#549)
+    - Fix multiple Set-Cookie headers properly. (shazow/urllib3#534)
+    - Disable the built-in hostname verification. (shazow/urllib3#526)
+    - Fix the behaviour of decoding an exhausted
+      stream. (shazow/urllib3#535)
+  * Security
+    - Pulled in an updated cacert.pem.
+    - Drop RC4 from the default cipher list. (shazow/urllib3#551)
+
+-------------------------------------------------------------------
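The sha256 fingerprint support noted in the changelog above (shazow/urllib3#540) amounts to hashing the peer's DER-encoded certificate and comparing it to a pinned digest. A minimal stdlib sketch of that check follows; the certificate bytes and function name are invented for illustration (urllib3's real entry point is its `assert_fingerprint` pool option, not this helper):

```python
# Hedged sketch of sha256 certificate-fingerprint pinning, the feature
# referenced as shazow/urllib3#540 above. The DER bytes are fabricated;
# this is an illustration, not urllib3's implementation.
import hashlib

def fingerprint_matches(cert_der, pinned_hex):
    """Compare a DER certificate against a pinned sha256 hex digest.

    Colons and case are ignored, since pins are often written AA:BB:...
    """
    pinned = pinned_hex.replace(':', '').lower()
    return hashlib.sha256(cert_der).hexdigest() == pinned

fake_der = b'not a real certificate'
pin = hashlib.sha256(fake_der).hexdigest()
print(fingerprint_matches(fake_der, pin.upper()))  # True
print(fingerprint_matches(b'tampered', pin))       # False
```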

Old:
----
  requests-2.5.1.tar.gz

New:
----
  requests-2.5.3.tar.gz

++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

Other differences:
------------------
++++++ python3-requests.spec ++++++
--- /var/tmp/diff_new_pack.CCAdQI/_old  2015-02-27 11:00:31.000000000 +0100
+++ /var/tmp/diff_new_pack.CCAdQI/_new  2015-02-27 11:00:31.000000000 +0100
@@ -1,7 +1,7 @@
 #
 # spec file for package python3-requests
 #
-# Copyright (c) 2015 SUSE LINUX Products GmbH, Nuernberg, Germany.
+# Copyright (c) 2015 SUSE LINUX GmbH, Nuernberg, Germany.
 #
 # All modifications and additions to the file contributed by third parties
 # remain the property of their copyright owners, unless otherwise agreed
@@ -17,7 +17,7 @@
 
 
 Name:           python3-requests
-Version:        2.5.1
+Version:        2.5.3
 Release:        0
 Url:            http://python-requests.org
 Summary:        Awesome Python HTTP Library That's Actually Usable

++++++ requests-2.5.1.tar.gz -> requests-2.5.3.tar.gz ++++++
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/requests-2.5.1/HISTORY.rst new/requests-2.5.3/HISTORY.rst
--- old/requests-2.5.1/HISTORY.rst      2014-12-23 18:50:19.000000000 +0100
+++ new/requests-2.5.3/HISTORY.rst      2015-02-24 17:33:01.000000000 +0100
@@ -3,6 +3,61 @@
 Release History
 ---------------
 
+2.5.3 (2015-02-24)
+++++++++++++++++++
+
+**Bugfixes**
+
+- Revert changes to our vendored certificate bundle. For more context see
+  (#2455, #2456, and http://bugs.python.org/issue23476)
+
+2.5.2 (2015-02-23)
+++++++++++++++++++
+
+**Features and Improvements**
+
+- Add sha256 fingerprint support. (`shazow/urllib3#540`_)
+
+- Improve the performance of headers. (`shazow/urllib3#544`_)
+
+**Bugfixes**
+
+- Copy pip's import machinery. When downstream redistributors remove
+  requests.packages.urllib3 the import machinery will continue to let those
+  same symbols work. Example usage in requests' documentation and 3rd-party
+  libraries relying on the vendored copies of urllib3 will work without having
+  to fallback to the system urllib3.
+
+- Attempt to quote parts of the URL on redirect if unquoting and then quoting
+  fails. (#2356)
+
+- Fix filename type check for multipart form-data uploads. (#2411)
+
+- Properly handle the case where a server issuing digest authentication
+  challenges provides both auth and auth-int qop-values. (#2408)
+
+- Fix a socket leak. (`shazow/urllib3#549`_)
+
+- Fix multiple ``Set-Cookie`` headers properly. (`shazow/urllib3#534`_)
+
+- Disable the built-in hostname verification. (`shazow/urllib3#526`_)
+
+- Fix the behaviour of decoding an exhausted stream. (`shazow/urllib3#535`_)
+
+**Security**
+
+- Pulled in an updated ``cacert.pem``.
+
+- Drop RC4 from the default cipher list. (`shazow/urllib3#551`_)
+
+.. _shazow/urllib3#551: https://github.com/shazow/urllib3/pull/551
+.. _shazow/urllib3#549: https://github.com/shazow/urllib3/pull/549
+.. _shazow/urllib3#544: https://github.com/shazow/urllib3/pull/544
+.. _shazow/urllib3#540: https://github.com/shazow/urllib3/pull/540
+.. _shazow/urllib3#535: https://github.com/shazow/urllib3/pull/535
+.. _shazow/urllib3#534: https://github.com/shazow/urllib3/pull/534
+.. _shazow/urllib3#526: https://github.com/shazow/urllib3/pull/526
+
 2.5.1 (2014-12-23)
 ++++++++++++++++++
 
@@ -103,7 +158,7 @@
 - Support for connect timeouts! Timeout now accepts a tuple (connect, read) which is used to set individual connect and read timeouts.
 - Allow copying of PreparedRequests without headers/cookies.
 - Updated bundled urllib3 version.
-- Refactored settings loading from environment — new `Session.merge_environment_settings`.
+- Refactored settings loading from environment -- new `Session.merge_environment_settings`.
 - Handle socket errors in iter_content.
 
 
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/requests-2.5.1/LICENSE new/requests-2.5.3/LICENSE
--- old/requests-2.5.1/LICENSE  2014-06-23 21:22:31.000000000 +0200
+++ new/requests-2.5.3/LICENSE  2015-02-24 17:27:00.000000000 +0100
@@ -1,4 +1,4 @@
-Copyright 2014 Kenneth Reitz
+Copyright 2015 Kenneth Reitz
 
    Licensed under the Apache License, Version 2.0 (the "License");
    you may not use this file except in compliance with the License.
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/requests-2.5.1/PKG-INFO new/requests-2.5.3/PKG-INFO
--- old/requests-2.5.1/PKG-INFO 2014-12-23 18:55:23.000000000 +0100
+++ new/requests-2.5.3/PKG-INFO 2015-02-24 17:33:44.000000000 +0100
@@ -1,6 +1,6 @@
 Metadata-Version: 1.1
 Name: requests
-Version: 2.5.1
+Version: 2.5.3
 Summary: Python HTTP for Humans.
 Home-page: http://python-requests.org
 Author: Kenneth Reitz
@@ -9,11 +9,11 @@
 Description: Requests: HTTP for Humans
         =========================
         
-        .. image:: https://badge.fury.io/py/requests.png
-            :target: http://badge.fury.io/py/requests
+        .. image:: https://img.shields.io/pypi/v/requests.svg
+            :target: https://pypi.python.org/pypi/requests
         
-        .. image:: https://pypip.in/d/requests/badge.png
-                :target: https://crate.io/packages/requests/
+        .. image:: https://img.shields.io/pypi/dm/requests.svg
+                :target: https://pypi.python.org/pypi/requests
         
         
         Requests is an Apache2 Licensed HTTP library, written in Python, for human
@@ -98,6 +98,61 @@
         Release History
         ---------------
         
+        2.5.3 (2015-02-24)
+        ++++++++++++++++++
+        
+        **Bugfixes**
+        
+        - Revert changes to our vendored certificate bundle. For more context see
+          (#2455, #2456, and http://bugs.python.org/issue23476)
+        
+        2.5.2 (2015-02-23)
+        ++++++++++++++++++
+        
+        **Features and Improvements**
+        
+        - Add sha256 fingerprint support. (`shazow/urllib3#540`_)
+        
+        - Improve the performance of headers. (`shazow/urllib3#544`_)
+        
+        **Bugfixes**
+        
+        - Copy pip's import machinery. When downstream redistributors remove
+          requests.packages.urllib3 the import machinery will continue to let those
+          same symbols work. Example usage in requests' documentation and 3rd-party
+          libraries relying on the vendored copies of urllib3 will work without having
+          to fallback to the system urllib3.
+        
+        - Attempt to quote parts of the URL on redirect if unquoting and then quoting
+          fails. (#2356)
+        
+        - Fix filename type check for multipart form-data uploads. (#2411)
+        
+        - Properly handle the case where a server issuing digest authentication
+          challenges provides both auth and auth-int qop-values. (#2408)
+        
+        - Fix a socket leak. (`shazow/urllib3#549`_)
+        
+        - Fix multiple ``Set-Cookie`` headers properly. (`shazow/urllib3#534`_)
+        
+        - Disable the built-in hostname verification. (`shazow/urllib3#526`_)
+        
+        - Fix the behaviour of decoding an exhausted stream. (`shazow/urllib3#535`_)
+        
+        **Security**
+        
+        - Pulled in an updated ``cacert.pem``.
+        
+        - Drop RC4 from the default cipher list. (`shazow/urllib3#551`_)
+        
+        .. _shazow/urllib3#551: https://github.com/shazow/urllib3/pull/551
+        .. _shazow/urllib3#549: https://github.com/shazow/urllib3/pull/549
+        .. _shazow/urllib3#544: https://github.com/shazow/urllib3/pull/544
+        .. _shazow/urllib3#540: https://github.com/shazow/urllib3/pull/540
+        .. _shazow/urllib3#535: https://github.com/shazow/urllib3/pull/535
+        .. _shazow/urllib3#534: https://github.com/shazow/urllib3/pull/534
+        .. _shazow/urllib3#526: https://github.com/shazow/urllib3/pull/526
+        
         2.5.1 (2014-12-23)
         ++++++++++++++++++
         
@@ -198,7 +253,7 @@
         - Support for connect timeouts! Timeout now accepts a tuple (connect, read) which is used to set individual connect and read timeouts.
         - Allow copying of PreparedRequests without headers/cookies.
         - Updated bundled urllib3 version.
-        - Refactored settings loading from environment — new `Session.merge_environment_settings`.
+        - Refactored settings loading from environment -- new `Session.merge_environment_settings`.
         - Handle socket errors in iter_content.
         
         
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/requests-2.5.1/README.rst new/requests-2.5.3/README.rst
--- old/requests-2.5.1/README.rst       2014-12-09 03:41:38.000000000 +0100
+++ new/requests-2.5.3/README.rst       2015-02-24 17:27:00.000000000 +0100
@@ -1,11 +1,11 @@
 Requests: HTTP for Humans
 =========================
 
-.. image:: https://badge.fury.io/py/requests.png
-    :target: http://badge.fury.io/py/requests
+.. image:: https://img.shields.io/pypi/v/requests.svg
+    :target: https://pypi.python.org/pypi/requests
 
-.. image:: https://pypip.in/d/requests/badge.png
-        :target: https://crate.io/packages/requests/
+.. image:: https://img.shields.io/pypi/dm/requests.svg
+        :target: https://pypi.python.org/pypi/requests
 
 
 Requests is an Apache2 Licensed HTTP library, written in Python, for human
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/requests-2.5.1/requests/__init__.py new/requests-2.5.3/requests/__init__.py
--- old/requests-2.5.1/requests/__init__.py     2014-12-23 18:50:46.000000000 +0100
+++ new/requests-2.5.3/requests/__init__.py     2015-02-24 17:33:01.000000000 +0100
@@ -36,17 +36,17 @@
 The other HTTP methods are supported - see `requests.api`. Full documentation
 is at <http://python-requests.org>.
 
-:copyright: (c) 2014 by Kenneth Reitz.
+:copyright: (c) 2015 by Kenneth Reitz.
 :license: Apache 2.0, see LICENSE for more details.
 
 """
 
 __title__ = 'requests'
-__version__ = '2.5.1'
-__build__ = 0x020501
+__version__ = '2.5.3'
+__build__ = 0x020503
 __author__ = 'Kenneth Reitz'
 __license__ = 'Apache 2.0'
-__copyright__ = 'Copyright 2014 Kenneth Reitz'
+__copyright__ = 'Copyright 2015 Kenneth Reitz'
 
 # Attempt to enable urllib3's SNI support, if possible
 try:
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/requests-2.5.1/requests/auth.py new/requests-2.5.3/requests/auth.py
--- old/requests-2.5.1/requests/auth.py 2014-12-23 18:43:13.000000000 +0100
+++ new/requests-2.5.3/requests/auth.py 2015-02-24 17:27:00.000000000 +0100
@@ -124,13 +124,15 @@
         s += os.urandom(8)
 
         cnonce = (hashlib.sha1(s).hexdigest()[:16])
-        noncebit = "%s:%s:%s:%s:%s" % (nonce, ncvalue, cnonce, qop, HA2)
         if _algorithm == 'MD5-SESS':
             HA1 = hash_utf8('%s:%s:%s' % (HA1, nonce, cnonce))
 
         if qop is None:
             respdig = KD(HA1, "%s:%s" % (nonce, HA2))
         elif qop == 'auth' or 'auth' in qop.split(','):
+            noncebit = "%s:%s:%s:%s:%s" % (
+                nonce, ncvalue, cnonce, 'auth', HA2
+                )
             respdig = KD(HA1, noncebit)
         else:
             # XXX handle auth-int.
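The auth.py hunk above (issue #2408) moves the `noncebit` construction inside the `qop` branch so that the literal token `'auth'` is hashed even when the server advertised `qop="auth,auth-int"`. A hedged, self-contained sketch of the RFC 2617 computation that the fix affects; helper names and values are illustrative, not requests internals:

```python
# Sketch of the RFC 2617 digest response the hunk above fixes: the chosen
# token 'auth' must go into the noncebit, never the raw comma-separated
# qop list the server sent. Helper names here are invented for clarity.
import hashlib

def _md5(s):
    return hashlib.md5(s.encode('utf-8')).hexdigest()

def digest_response(user, pwd, realm, method, path, nonce, ncvalue, cnonce, qop):
    ha1 = _md5('%s:%s:%s' % (user, realm, pwd))
    ha2 = _md5('%s:%s' % (method, path))
    if qop is None:
        return _md5('%s:%s:%s' % (ha1, nonce, ha2))
    if qop == 'auth' or 'auth' in qop.split(','):
        # The 2.5.2 fix: build noncebit with the literal token 'auth'.
        noncebit = '%s:%s:%s:%s:%s' % (nonce, ncvalue, cnonce, 'auth', ha2)
        return _md5('%s:%s' % (ha1, noncebit))
    raise NotImplementedError('auth-int is not handled in this sketch either')

r1 = digest_response('u', 'p', 'r', 'GET', '/', 'n', '00000001', 'c', 'auth')
r2 = digest_response('u', 'p', 'r', 'GET', '/', 'n', '00000001', 'c', 'auth,auth-int')
print(r1 == r2)  # True: the advertised qop list no longer changes the digest
```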
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/requests-2.5.1/requests/compat.py new/requests-2.5.3/requests/compat.py
--- old/requests-2.5.1/requests/compat.py       2014-12-23 18:43:04.000000000 +0100
+++ new/requests-2.5.3/requests/compat.py       2015-02-24 17:27:00.000000000 +0100
@@ -21,58 +21,6 @@
 #: Python 3.x?
 is_py3 = (_ver[0] == 3)
 
-#: Python 3.0.x
-is_py30 = (is_py3 and _ver[1] == 0)
-
-#: Python 3.1.x
-is_py31 = (is_py3 and _ver[1] == 1)
-
-#: Python 3.2.x
-is_py32 = (is_py3 and _ver[1] == 2)
-
-#: Python 3.3.x
-is_py33 = (is_py3 and _ver[1] == 3)
-
-#: Python 3.4.x
-is_py34 = (is_py3 and _ver[1] == 4)
-
-#: Python 2.7.x
-is_py27 = (is_py2 and _ver[1] == 7)
-
-#: Python 2.6.x
-is_py26 = (is_py2 and _ver[1] == 6)
-
-#: Python 2.5.x
-is_py25 = (is_py2 and _ver[1] == 5)
-
-#: Python 2.4.x
-is_py24 = (is_py2 and _ver[1] == 4)   # I'm assuming this is not by choice.
-
-
-# ---------
-# Platforms
-# ---------
-
-
-# Syntax sugar.
-_ver = sys.version.lower()
-
-is_pypy = ('pypy' in _ver)
-is_jython = ('jython' in _ver)
-is_ironpython = ('iron' in _ver)
-
-# Assume CPython, if nothing else.
-is_cpython = not any((is_pypy, is_jython, is_ironpython))
-
-# Windows-based system.
-is_windows = 'win32' in str(sys.platform).lower()
-
-# Standard Linux 2+ system.
-is_linux = ('linux' in str(sys.platform).lower())
-is_osx = ('darwin' in str(sys.platform).lower())
-is_hpux = ('hpux' in str(sys.platform).lower())   # Complete guess.
-is_solaris = ('solar==' in str(sys.platform).lower())   # Complete guess.
-
 try:
     import simplejson as json
 except (ImportError, SyntaxError):
@@ -99,7 +47,6 @@
     basestring = basestring
     numeric_types = (int, long, float)
 
-
 elif is_py3:
     from urllib.parse import urlparse, urlunparse, urljoin, urlsplit, urlencode, quote, unquote, quote_plus, unquote_plus, urldefrag
     from urllib.request import parse_http_list, getproxies, proxy_bypass
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/requests-2.5.1/requests/cookies.py new/requests-2.5.3/requests/cookies.py
--- old/requests-2.5.1/requests/cookies.py      2014-06-23 21:22:31.000000000 +0200
+++ new/requests-2.5.3/requests/cookies.py      2015-02-24 17:27:00.000000000 +0100
@@ -157,26 +157,28 @@
 
 
 class RequestsCookieJar(cookielib.CookieJar, collections.MutableMapping):
-    """Compatibility class; is a cookielib.CookieJar, but exposes a dict 
interface.
+    """Compatibility class; is a cookielib.CookieJar, but exposes a dict
+    interface.
 
     This is the CookieJar we create by default for requests and sessions that
     don't specify one, since some clients may expect response.cookies and
     session.cookies to support dict operations.
 
-    Don't use the dict interface internally; it's just for compatibility with
-    with external client code. All `requests` code should work out of the box
-    with externally provided instances of CookieJar, e.g., LWPCookieJar and
-    FileCookieJar.
-
-    Caution: dictionary operations that are normally O(1) may be O(n).
+    Requests does not use the dict interface internally; it's just for
+    compatibility with external client code. All requests code should work
+    out of the box with externally provided instances of ``CookieJar``, e.g.
+    ``LWPCookieJar`` and ``FileCookieJar``.
 
     Unlike a regular CookieJar, this class is pickleable.
-    """
 
+    .. warning:: dictionary operations that are normally O(1) may be O(n).
+    """
     def get(self, name, default=None, domain=None, path=None):
         """Dict-like get() that also supports optional domain and path args in
         order to resolve naming collisions from using one cookie jar over
-        multiple domains. Caution: operation is O(n), not O(1)."""
+        multiple domains.
+
+        .. warning:: operation is O(n), not O(1)."""
         try:
             return self._find_no_duplicates(name, domain, path)
         except KeyError:
@@ -199,37 +201,38 @@
         return c
 
     def iterkeys(self):
-        """Dict-like iterkeys() that returns an iterator of names of cookies 
from the jar.
-        See itervalues() and iteritems()."""
+        """Dict-like iterkeys() that returns an iterator of names of cookies
+        from the jar. See itervalues() and iteritems()."""
         for cookie in iter(self):
             yield cookie.name
 
     def keys(self):
-        """Dict-like keys() that returns a list of names of cookies from the 
jar.
-        See values() and items()."""
+        """Dict-like keys() that returns a list of names of cookies from the
+        jar. See values() and items()."""
         return list(self.iterkeys())
 
     def itervalues(self):
-        """Dict-like itervalues() that returns an iterator of values of 
cookies from the jar.
-        See iterkeys() and iteritems()."""
+        """Dict-like itervalues() that returns an iterator of values of cookies
+        from the jar. See iterkeys() and iteritems()."""
         for cookie in iter(self):
             yield cookie.value
 
     def values(self):
-        """Dict-like values() that returns a list of values of cookies from 
the jar.
-        See keys() and items()."""
+        """Dict-like values() that returns a list of values of cookies from the
+        jar. See keys() and items()."""
         return list(self.itervalues())
 
     def iteritems(self):
-        """Dict-like iteritems() that returns an iterator of name-value tuples 
from the jar.
-        See iterkeys() and itervalues()."""
+        """Dict-like iteritems() that returns an iterator of name-value tuples
+        from the jar. See iterkeys() and itervalues()."""
         for cookie in iter(self):
             yield cookie.name, cookie.value
 
     def items(self):
-        """Dict-like items() that returns a list of name-value tuples from the 
jar.
-        See keys() and values(). Allows client-code to call 
"dict(RequestsCookieJar)
-        and get a vanilla python dict of key value pairs."""
+        """Dict-like items() that returns a list of name-value tuples from the
+        jar. See keys() and values(). Allows client-code to call
+        ``dict(RequestsCookieJar)`` and get a vanilla python dict of key value
+        pairs."""
         return list(self.iteritems())
 
     def list_domains(self):
@@ -259,8 +262,9 @@
         return False  # there is only one domain in jar
 
     def get_dict(self, domain=None, path=None):
-        """Takes as an argument an optional domain and path and returns a 
plain old
-        Python dict of name-value pairs of cookies that meet the 
requirements."""
+        """Takes as an argument an optional domain and path and returns a plain
+        old Python dict of name-value pairs of cookies that meet the
+        requirements."""
         dictionary = {}
         for cookie in iter(self):
             if (domain is None or cookie.domain == domain) and (path is None
@@ -269,21 +273,24 @@
         return dictionary
 
     def __getitem__(self, name):
-        """Dict-like __getitem__() for compatibility with client code. Throws 
exception
-        if there are more than one cookie with name. In that case, use the more
-        explicit get() method instead. Caution: operation is O(n), not O(1)."""
+        """Dict-like __getitem__() for compatibility with client code. Throws
+        exception if there are more than one cookie with name. In that case,
+        use the more explicit get() method instead.
+
+        .. warning:: operation is O(n), not O(1)."""
 
         return self._find_no_duplicates(name)
 
     def __setitem__(self, name, value):
-        """Dict-like __setitem__ for compatibility with client code. Throws 
exception
-        if there is already a cookie of that name in the jar. In that case, 
use the more
-        explicit set() method instead."""
+        """Dict-like __setitem__ for compatibility with client code. Throws
+        exception if there is already a cookie of that name in the jar. In that
+        case, use the more explicit set() method instead."""
 
         self.set(name, value)
 
     def __delitem__(self, name):
-        """Deletes a cookie given a name. Wraps cookielib.CookieJar's 
remove_cookie_by_name()."""
+        """Deletes a cookie given a name. Wraps ``cookielib.CookieJar``'s
+        ``remove_cookie_by_name()``."""
         remove_cookie_by_name(self, name)
 
     def set_cookie(self, cookie, *args, **kwargs):
@@ -300,10 +307,11 @@
             super(RequestsCookieJar, self).update(other)
 
     def _find(self, name, domain=None, path=None):
-        """Requests uses this method internally to get cookie values. Takes as 
args name
-        and optional domain and path. Returns a cookie.value. If there are 
conflicting cookies,
-        _find arbitrarily chooses one. See _find_no_duplicates if you want an 
exception thrown
-        if there are conflicting cookies."""
+        """Requests uses this method internally to get cookie values. Takes as
+        args name and optional domain and path. Returns a cookie.value. If
+        there are conflicting cookies, _find arbitrarily chooses one. See
+        _find_no_duplicates if you want an exception thrown if there are
+        conflicting cookies."""
         for cookie in iter(self):
             if cookie.name == name:
                 if domain is None or cookie.domain == domain:
@@ -313,10 +321,11 @@
         raise KeyError('name=%r, domain=%r, path=%r' % (name, domain, path))
 
     def _find_no_duplicates(self, name, domain=None, path=None):
-        """__get_item__ and get call _find_no_duplicates -- never used in 
Requests internally.
-        Takes as args name and optional domain and path. Returns a 
cookie.value.
-        Throws KeyError if cookie is not found and CookieConflictError if 
there are
-        multiple cookies that match name and optionally domain and path."""
+        """Both ``__get_item__`` and ``get`` call this function: it's never
+        used elsewhere in Requests. Takes as args name and optional domain and
+        path. Returns a cookie.value. Throws KeyError if cookie is not found
+        and CookieConflictError if there are multiple cookies that match name
+        and optionally domain and path."""
         toReturn = None
         for cookie in iter(self):
             if cookie.name == name:
@@ -440,7 +449,7 @@
     """
     if not isinstance(cookiejar, cookielib.CookieJar):
         raise ValueError('You can only merge into CookieJar')
-    
+
     if isinstance(cookies, dict):
         cookiejar = cookiejar_from_dict(
             cookies, cookiejar=cookiejar, overwrite=False)
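The `_find_no_duplicates` docstring in the hunk above describes a lookup that raises on ambiguity rather than silently picking a cookie. A stdlib-only sketch of that idea (the tuple layout and function name are assumptions for this example, not requests' actual storage):

```python
# Illustrative sketch of the "no duplicates" lookup described in the
# cookies.py docstrings above: return the single matching value, raise
# KeyError when absent and a conflict error when the name is ambiguous.
class CookieConflictError(RuntimeError):
    pass

def find_no_duplicates(cookies, name, domain=None, path=None):
    """cookies is an iterable of (name, domain, path, value) tuples."""
    found = None
    for c_name, c_domain, c_path, c_value in cookies:
        if c_name != name:
            continue
        if domain is not None and c_domain != domain:
            continue
        if path is not None and c_path != path:
            continue
        if found is not None:
            raise CookieConflictError('multiple cookies named %r' % name)
        found = (c_value,)  # wrap so a stored None still counts as found
    if found is None:
        raise KeyError(name)
    return found[0]

jar = [('sid', 'a.example', '/', '1'), ('sid', 'b.example', '/', '2')]
print(find_no_duplicates(jar, 'sid', domain='a.example'))  # prints 1
```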
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/requests-2.5.1/requests/packages/__init__.py new/requests-2.5.3/requests/packages/__init__.py
--- old/requests-2.5.1/requests/packages/__init__.py    2014-12-23 18:43:04.000000000 +0100
+++ new/requests-2.5.3/requests/packages/__init__.py    2015-02-24 17:27:00.000000000 +0100
@@ -1,3 +1,95 @@
+"""
+Copyright (c) Donald Stufft, pip, and individual contributors
+
+Permission is hereby granted, free of charge, to any person obtaining
+a copy of this software and associated documentation files (the
+"Software"), to deal in the Software without restriction, including
+without limitation the rights to use, copy, modify, merge, publish,
+distribute, sublicense, and/or sell copies of the Software, and to
+permit persons to whom the Software is furnished to do so, subject to
+the following conditions:
+
+The above copyright notice and this permission notice shall be
+included in all copies or substantial portions of the Software.
+
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
+EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
+MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
+NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE
+LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION
+OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION
+WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
+"""
 from __future__ import absolute_import
 
-from . import urllib3
+import sys
+
+
+class VendorAlias(object):
+
+    def __init__(self):
+        self._vendor_name = __name__
+        self._vendor_pkg = self._vendor_name + "."
+
+    def find_module(self, fullname, path=None):
+        if fullname.startswith(self._vendor_pkg):
+            return self
+
+    def load_module(self, name):
+        # Ensure that this only works for the vendored name
+        if not name.startswith(self._vendor_pkg):
+            raise ImportError(
+                "Cannot import %s, must be a subpackage of '%s'." % (
+                    name, self._vendor_name,
+                )
+            )
+
+        # Check to see if we already have this item in sys.modules, if we do
+        # then simply return that.
+        if name in sys.modules:
+            return sys.modules[name]
+
+        # Check to see if we can import the vendor name
+        try:
+            # We do this dance here because we want to try and import this
+            # module without hitting a recursion error because of a bunch of
+            # VendorAlias instances on sys.meta_path
+            real_meta_path = sys.meta_path[:]
+            try:
+                sys.meta_path = [
+                    m for m in sys.meta_path
+                    if not isinstance(m, VendorAlias)
+                ]
+                __import__(name)
+                module = sys.modules[name]
+            finally:
+                # Re-add any additions to sys.meta_path that were made while
+                # during the import we just did, otherwise things like
+                # requests.packages.urllib3.poolmanager will fail.
+                for m in sys.meta_path:
+                    if m not in real_meta_path:
+                        real_meta_path.append(m)
+
+                # Restore sys.meta_path with any new items.
+                sys.meta_path = real_meta_path
+        except ImportError:
+            # We can't import the vendor name, so we'll try to import the
+            # "real" name.
+            real_name = name[len(self._vendor_pkg):]
+            try:
+                __import__(real_name)
+                module = sys.modules[real_name]
+            except ImportError:
+                raise ImportError("No module named '%s'" % (name,))
+
+        # If we've gotten here we've found the module we're looking for, either
+        # as part of our vendored package, or as the real name, so we'll add
+        # it to sys.modules as the vendored name so that we don't have to do
+        # the lookup again.
+        sys.modules[name] = module
+
+        # Finally, return the loaded module
+        return module
+
+
+sys.meta_path.append(VendorAlias())
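The VendorAlias hook added above registers a legacy find_module/load_module finder on sys.meta_path so that imports under the vendored name keep resolving. The core idea, aliasing one module name to another, can be seen with the stdlib alone; the `vendored` package name below is invented for this example, and pre-seeding sys.modules is a simplification of what the meta_path finder does lazily:

```python
# Minimal stdlib illustration of the aliasing idea behind VendorAlias
# above: make an import under a vendored name resolve to a module that
# is really installed under another name. 'vendored' is a made-up name;
# the real hook installs a sys.meta_path finder instead of pre-seeding
# sys.modules like this.
import sys
import types
import json

pkg = types.ModuleType('vendored')   # stand-in for the vendoring package
pkg.json = json                      # attribute access: vendored.json
sys.modules['vendored'] = pkg
sys.modules['vendored.json'] = json  # alias the submodule to the real one

import vendored.json                 # satisfied straight from sys.modules

print(vendored.json.loads('{"ok": true}'))  # {'ok': True}
print(vendored.json is json)                # True
```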
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/requests-2.5.1/requests/packages/urllib3/__init__.py new/requests-2.5.3/requests/packages/urllib3/__init__.py
--- old/requests-2.5.1/requests/packages/urllib3/__init__.py    2014-12-01 22:02:39.000000000 +0100
+++ new/requests-2.5.3/requests/packages/urllib3/__init__.py    2015-02-24 17:27:00.000000000 +0100
@@ -55,7 +55,7 @@
 del NullHandler
 
 
-# Set security warning to only go off once by default.
+# Set security warning to always go off by default.
 import warnings
 warnings.simplefilter('always', exceptions.SecurityWarning)
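The comment fix in the hunk above clarifies what `simplefilter('always', ...)` actually does: the warning fires on every occurrence, whereas the `'default'` action reports it only once per location. The difference can be observed with the stdlib alone; `SecurityWarning` below is a local stand-in for urllib3's exception class:

```python
# Demonstration of the warnings-filter behavior discussed above: the
# 'always' action re-emits a warning on every occurrence, while 'default'
# deduplicates by source location. SecurityWarning is a local stand-in.
import warnings

class SecurityWarning(Warning):
    pass

def emit_three():
    for _ in range(3):
        warnings.warn('insecure platform', SecurityWarning)

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter('always', SecurityWarning)
    emit_three()
print(len(caught))  # 3: every occurrence is reported

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter('default', SecurityWarning)
    emit_three()
print(len(caught))  # 1: repeats from the same location are suppressed
```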
 
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/requests-2.5.1/requests/packages/urllib3/_collections.py new/requests-2.5.3/requests/packages/urllib3/_collections.py
--- old/requests-2.5.1/requests/packages/urllib3/_collections.py        2014-12-01 22:02:39.000000000 +0100
+++ new/requests-2.5.3/requests/packages/urllib3/_collections.py        2015-02-24 17:27:00.000000000 +0100
@@ -1,7 +1,7 @@
 from collections import Mapping, MutableMapping
 try:
     from threading import RLock
-except ImportError: # Platform-specific: No threads available
+except ImportError:  # Platform-specific: No threads available
     class RLock:
         def __enter__(self):
             pass
@@ -10,16 +10,18 @@
             pass
 
 
-try: # Python 2.7+
+try:  # Python 2.7+
     from collections import OrderedDict
 except ImportError:
     from .packages.ordered_dict import OrderedDict
-from .packages.six import iterkeys, itervalues
+from .packages.six import iterkeys, itervalues, PY3
 
 
 __all__ = ['RecentlyUsedContainer', 'HTTPHeaderDict']
 
 
+MULTIPLE_HEADERS_ALLOWED = frozenset(['cookie', 'set-cookie', 'set-cookie2'])
+
 _Null = object()
 
 
@@ -97,7 +99,14 @@
             return list(iterkeys(self._container))
 
 
-class HTTPHeaderDict(MutableMapping):
+_dict_setitem = dict.__setitem__
+_dict_getitem = dict.__getitem__
+_dict_delitem = dict.__delitem__
+_dict_contains = dict.__contains__
+_dict_setdefault = dict.setdefault
+
+
+class HTTPHeaderDict(dict):
     """
     :param headers:
         An iterable of field-value pairs. Must not contain multiple field names
@@ -129,25 +138,72 @@
     'foo=bar, baz=quxx'
     >>> headers['Content-Length']
     '7'
-
-    If you want to access the raw headers with their original casing
-    for debugging purposes you can access the private ``._data`` attribute
-    which is a normal python ``dict`` that maps the case-insensitive key to a
-    list of tuples stored as (case-sensitive-original-name, value). Using the
-    structure from above as our example:
-
-    >>> headers._data
-    {'set-cookie': [('Set-Cookie', 'foo=bar'), ('set-cookie', 'baz=quxx')],
-    'content-length': [('content-length', '7')]}
     """
 
     def __init__(self, headers=None, **kwargs):
-        self._data = {}
-        if headers is None:
-            headers = {}
-        self.update(headers, **kwargs)
+        dict.__init__(self)
+        if headers is not None:
+            self.extend(headers)
+        if kwargs:
+            self.extend(kwargs)
+
+    def __setitem__(self, key, val):
+        return _dict_setitem(self, key.lower(), (key, val))
+
+    def __getitem__(self, key):
+        val = _dict_getitem(self, key.lower())
+        return ', '.join(val[1:])
+
+    def __delitem__(self, key):
+        return _dict_delitem(self, key.lower())
 
-    def add(self, key, value):
+    def __contains__(self, key):
+        return _dict_contains(self, key.lower())
+
+    def __eq__(self, other):
+        if not isinstance(other, Mapping) and not hasattr(other, 'keys'):
+            return False
+        if not isinstance(other, type(self)):
+            other = type(self)(other)
+        return dict((k1, self[k1]) for k1 in self) == dict((k2, other[k2]) for k2 in other)
+
+    def __ne__(self, other):
+        return not self.__eq__(other)
+
+    values = MutableMapping.values
+    get = MutableMapping.get
+    update = MutableMapping.update
+    
+    if not PY3: # Python 2
+        iterkeys = MutableMapping.iterkeys
+        itervalues = MutableMapping.itervalues
+
+    __marker = object()
+
+    def pop(self, key, default=__marker):
+        '''D.pop(k[,d]) -> v, remove specified key and return the corresponding value.
+          If key is not found, d is returned if given, otherwise KeyError is raised.
+        '''
+        # Using the MutableMapping function directly fails due to the private marker.
+        # Using ordinary dict.pop would expose the internal structures.
+        # So let's reinvent the wheel.
+        try:
+            value = self[key]
+        except KeyError:
+            if default is self.__marker:
+                raise
+            return default
+        else:
+            del self[key]
+            return value
+
+    def discard(self, key):
+        try:
+            del self[key]
+        except KeyError:
+            pass
+
+    def add(self, key, val):
         """Adds a (name, value) pair, doesn't overwrite the value if it already
         exists.
 
@@ -156,43 +212,108 @@
         >>> headers['foo']
         'bar, baz'
         """
-        self._data.setdefault(key.lower(), []).append((key, value))
+        key_lower = key.lower()
+        new_vals = key, val
+        # Keep the common case aka no item present as fast as possible
+        vals = _dict_setdefault(self, key_lower, new_vals)
+        if new_vals is not vals:
+            # new_vals was not inserted, as there was a previous one
+            if isinstance(vals, list):
+                # If already several items got inserted, we have a list
+                vals.append(val)
+            else:
+                # vals should be a tuple then, i.e. only one item so far
+                if key_lower in MULTIPLE_HEADERS_ALLOWED:
+                    # Need to convert the tuple to list for further extension
+                    _dict_setitem(self, key_lower, [vals[0], vals[1], val])
+                else:
+                    _dict_setitem(self, key_lower, new_vals)
+
+    def extend(*args, **kwargs):
+        """Generic import function for any type of header-like object.
+        Adapted version of MutableMapping.update in order to insert items
+        with self.add instead of self.__setitem__
+        """
+        if len(args) > 2:
+            raise TypeError("update() takes at most 2 positional "
+                            "arguments ({} given)".format(len(args)))
+        elif not args:
+            raise TypeError("update() takes at least 1 argument (0 given)")
+        self = args[0]
+        other = args[1] if len(args) >= 2 else ()
+        
+        if isinstance(other, Mapping):
+            for key in other:
+                self.add(key, other[key])
+        elif hasattr(other, "keys"):
+            for key in other.keys():
+                self.add(key, other[key])
+        else:
+            for key, value in other:
+                self.add(key, value)
+
+        for key, value in kwargs.items():
+            self.add(key, value)
 
     def getlist(self, key):
         """Returns a list of all the values for the named field. Returns an
         empty list if the key doesn't exist."""
-        return self[key].split(', ') if key in self else []
-
-    def copy(self):
-        h = HTTPHeaderDict()
-        for key in self._data:
-            for rawkey, value in self._data[key]:
-                h.add(rawkey, value)
-        return h
-
-    def __eq__(self, other):
-        if not isinstance(other, Mapping):
-            return False
-        other = HTTPHeaderDict(other)
-        return dict((k1, self[k1]) for k1 in self._data) == \
-                dict((k2, other[k2]) for k2 in other._data)
-
-    def __getitem__(self, key):
-        values = self._data[key.lower()]
-        return ', '.join(value[1] for value in values)
-
-    def __setitem__(self, key, value):
-        self._data[key.lower()] = [(key, value)]
-
-    def __delitem__(self, key):
-        del self._data[key.lower()]
-
-    def __len__(self):
-        return len(self._data)
-
-    def __iter__(self):
-        for headers in itervalues(self._data):
-            yield headers[0][0]
+        try:
+            vals = _dict_getitem(self, key.lower())
+        except KeyError:
+            return []
+        else:
+            if isinstance(vals, tuple):
+                return [vals[1]]
+            else:
+                return vals[1:]
+
+    # Backwards compatibility for httplib
+    getheaders = getlist
+    getallmatchingheaders = getlist
+    iget = getlist
 
     def __repr__(self):
-        return '%s(%r)' % (self.__class__.__name__, dict(self.items()))
+        return "%s(%s)" % (type(self).__name__, dict(self.itermerged()))
+
+    def copy(self):
+        clone = type(self)()
+        for key in self:
+            val = _dict_getitem(self, key)
+            if isinstance(val, list):
+                # Don't need to convert tuples
+                val = list(val)
+            _dict_setitem(clone, key, val)
+        return clone
+
+    def iteritems(self):
+        """Iterate over all header lines, including duplicate ones."""
+        for key in self:
+            vals = _dict_getitem(self, key)
+            for val in vals[1:]:
+                yield vals[0], val
+
+    def itermerged(self):
+        """Iterate over all headers, merging duplicate ones together."""
+        for key in self:
+            val = _dict_getitem(self, key)
+            yield val[0], ', '.join(val[1:])
+
+    def items(self):
+        return list(self.iteritems())
+
+    @classmethod
+    def from_httplib(cls, message, duplicates=('set-cookie',)): # Python 2
+        """Read headers from a Python 2 httplib message object."""
+        ret = cls(message.items())
+        # ret now contains only the last header line for each duplicate.
+        # Importing with all duplicates would be nice, but this would
+        # mean to repeat most of the raw parsing already done, when the
+        # message object was created. Extracting only the headers of interest 
+        # separately, the cookies, should be faster and requires less
+        # extra code.
+        for key in duplicates:
+            ret.discard(key)
+            for val in message.getheaders(key):
+                ret.add(key, val)
+        return ret
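For illustration, the storage scheme behind the new `HTTPHeaderDict` can be sketched as follows. All names here are made up for the example, and the sketch promotes every repeated header to a list, whereas the vendored class does that only for the cookie headers in `MULTIPLE_HEADERS_ALLOWED`:

```python
class MiniHeaderDict(dict):
    """Toy version of the new storage layout: each lowercased field name
    maps to a tuple (original-cased-name, value, value, ...)."""

    def __setitem__(self, key, val):
        # Store under the lowercased name; keep the original casing in slot 0.
        dict.__setitem__(self, key.lower(), (key, val))

    def __getitem__(self, key):
        vals = dict.__getitem__(self, key.lower())
        # Slot 0 is the original-cased name; the rest are values.
        return ', '.join(vals[1:])

    def add(self, key, val):
        key_lower = key.lower()
        new_vals = (key, val)
        # setdefault keeps the common "no item present" case fast.
        vals = dict.setdefault(self, key_lower, new_vals)
        if vals is not new_vals:          # key was already present
            if isinstance(vals, list):
                vals.append(val)          # already promoted to a list
            else:
                # Promote the tuple to a list so further values can be
                # appended (the real class restricts this to cookie headers).
                dict.__setitem__(self, key_lower, [vals[0], vals[1], val])


h = MiniHeaderDict()
h['Content-Length'] = '7'
h.add('Set-Cookie', 'foo=bar')
h.add('set-cookie', 'baz=quxx')
```

Lookups stay case-insensitive while the original casing survives in slot 0, which is what makes plain `dict` storage faster than the old dict-of-lists.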
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/requests-2.5.1/requests/packages/urllib3/connectionpool.py new/requests-2.5.3/requests/packages/urllib3/connectionpool.py
--- old/requests-2.5.1/requests/packages/urllib3/connectionpool.py      2014-12-01 22:02:39.000000000 +0100
+++ new/requests-2.5.3/requests/packages/urllib3/connectionpool.py      2015-02-24 17:27:00.000000000 +0100
@@ -72,6 +72,21 @@
         return '%s(host=%r, port=%r)' % (type(self).__name__,
                                          self.host, self.port)
 
+    def __enter__(self):
+        return self
+
+    def __exit__(self, exc_type, exc_val, exc_tb):
+        self.close()
+        # Return False to re-raise any potential exceptions
+        return False
+
+    def close(self):
+        """
+        Close all pooled connections and disable the pool.
+        """
+        pass
+
+
 # This is taken from http://hg.python.org/cpython/file/7aaba721ebc0/Lib/socket.py#l252
 _blocking_errnos = set([errno.EAGAIN, errno.EWOULDBLOCK])
 
@@ -266,6 +281,10 @@
         """
         pass
 
+    def _prepare_proxy(self, conn):
+        # Nothing to do for HTTP connections.
+        pass
+
     def _get_timeout(self, timeout):
         """ Helper that always returns a :class:`urllib3.util.Timeout` """
         if timeout is _Default:
@@ -349,7 +368,7 @@
 
         # Receive the response from the server
         try:
-            try:  # Python 2.7+, use buffering of HTTP responses
+            try:  # Python 2.7, use buffering of HTTP responses
                 httplib_response = conn.getresponse(buffering=True)
             except TypeError:  # Python 2.6 and older
                 httplib_response = conn.getresponse()
@@ -510,11 +529,18 @@
 
         try:
             # Request a connection from the queue.
+            timeout_obj = self._get_timeout(timeout)
             conn = self._get_conn(timeout=pool_timeout)
 
+            conn.timeout = timeout_obj.connect_timeout
+
+            is_new_proxy_conn = self.proxy is not None and not getattr(conn, 'sock', None)
+            if is_new_proxy_conn:
+                self._prepare_proxy(conn)
+
             # Make the request on the httplib connection object.
             httplib_response = self._make_request(conn, method, url,
-                                                  timeout=timeout,
+                                                  timeout=timeout_obj,
                                                   body=body, headers=headers)
 
             # If we're going to release the connection in ``finally:``, then
@@ -547,6 +573,14 @@
                 conn = None
             raise SSLError(e)
 
+        except SSLError:
+            # Treat SSLError separately from BaseSSLError to preserve
+            # traceback.
+            if conn:
+                conn.close()
+                conn = None
+            raise
+
         except (TimeoutError, HTTPException, SocketError, ConnectionError) as e:
             if conn:
                 # Discard the connection for these exceptions. It will be
@@ -554,14 +588,13 @@
                 conn.close()
                 conn = None
 
-            stacktrace = sys.exc_info()[2]
             if isinstance(e, SocketError) and self.proxy:
                 e = ProxyError('Cannot connect to proxy.', e)
             elif isinstance(e, (SocketError, HTTPException)):
                 e = ProtocolError('Connection aborted.', e)
 
-            retries = retries.increment(method, url, error=e,
-                                        _pool=self, _stacktrace=stacktrace)
+            retries = retries.increment(method, url, error=e, _pool=self,
+                                        _stacktrace=sys.exc_info()[2])
             retries.sleep()
 
             # Keep track of the error for the retry warning.
@@ -673,24 +706,26 @@
                           assert_fingerprint=self.assert_fingerprint)
             conn.ssl_version = self.ssl_version
 
-        if self.proxy is not None:
-            # Python 2.7+
-            try:
-                set_tunnel = conn.set_tunnel
-            except AttributeError:  # Platform-specific: Python 2.6
-                set_tunnel = conn._set_tunnel
-
-            if sys.version_info <= (2, 6, 4) and not self.proxy_headers:   # Python 2.6.4 and older
-                set_tunnel(self.host, self.port)
-            else:
-                set_tunnel(self.host, self.port, self.proxy_headers)
-
-            # Establish tunnel connection early, because otherwise httplib
-            # would improperly set Host: header to proxy's IP:port.
-            conn.connect()
-
         return conn
 
+    def _prepare_proxy(self, conn):
+        """
+        Establish tunnel connection early, because otherwise httplib
+        would improperly set Host: header to proxy's IP:port.
+        """
+        # Python 2.7+
+        try:
+            set_tunnel = conn.set_tunnel
+        except AttributeError:  # Platform-specific: Python 2.6
+            set_tunnel = conn._set_tunnel
+
+        if sys.version_info <= (2, 6, 4) and not self.proxy_headers:   # Python 2.6.4 and older
+            set_tunnel(self.host, self.port)
+        else:
+            set_tunnel(self.host, self.port, self.proxy_headers)
+
+        conn.connect()
+
     def _new_conn(self):
         """
         Return a fresh :class:`httplib.HTTPSConnection`.
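The pools (and, further down, `PoolManager`) gain `__enter__`/`__exit__` so they can be used as context managers that close their connections on exit. A toy sketch of the protocol (`PoolSketch` is a made-up name, not the vendored class):

```python
class PoolSketch(object):
    """Toy pool showing the context-manager protocol added above."""

    def __init__(self):
        self.closed = False

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc_val, exc_tb):
        self.close()
        # Returning False means exceptions raised inside the with-block
        # propagate instead of being swallowed.
        return False

    def close(self):
        """Close all pooled connections and disable the pool."""
        self.closed = True


with PoolSketch() as pool:
    inside = pool.closed        # still open inside the block
```

In urllib3 the subclasses override `close()` to actually drain the connection queue; the base-class stub above just flips a flag.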
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/requests-2.5.1/requests/packages/urllib3/contrib/pyopenssl.py new/requests-2.5.3/requests/packages/urllib3/contrib/pyopenssl.py
--- old/requests-2.5.1/requests/packages/urllib3/contrib/pyopenssl.py   2014-12-01 22:02:39.000000000 +0100
+++ new/requests-2.5.3/requests/packages/urllib3/contrib/pyopenssl.py   2015-02-24 17:27:00.000000000 +0100
@@ -191,6 +191,11 @@
                 return b''
             else:
                 raise
+        except OpenSSL.SSL.ZeroReturnError as e:
+            if self.connection.get_shutdown() == OpenSSL.SSL.RECEIVED_SHUTDOWN:
+                return b''
+            else:
+                raise
         except OpenSSL.SSL.WantReadError:
             rd, wd, ed = select.select(
                 [self.socket], [], [], self.socket.gettimeout())
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/requests-2.5.1/requests/packages/urllib3/poolmanager.py new/requests-2.5.3/requests/packages/urllib3/poolmanager.py
--- old/requests-2.5.1/requests/packages/urllib3/poolmanager.py 2014-08-29 21:37:47.000000000 +0200
+++ new/requests-2.5.3/requests/packages/urllib3/poolmanager.py 2015-02-24 17:27:00.000000000 +0100
@@ -8,7 +8,7 @@
 from ._collections import RecentlyUsedContainer
 from .connectionpool import HTTPConnectionPool, HTTPSConnectionPool
 from .connectionpool import port_by_scheme
-from .exceptions import LocationValueError
+from .exceptions import LocationValueError, MaxRetryError
 from .request import RequestMethods
 from .util.url import parse_url
 from .util.retry import Retry
@@ -64,6 +64,14 @@
         self.pools = RecentlyUsedContainer(num_pools,
                                            dispose_func=lambda p: p.close())
 
+    def __enter__(self):
+        return self
+
+    def __exit__(self, exc_type, exc_val, exc_tb):
+        self.clear()
+        # Return False to re-raise any potential exceptions
+        return False
+
     def _new_pool(self, scheme, host, port):
         """
         Create a new :class:`ConnectionPool` based on host, port and scheme.
@@ -167,7 +175,14 @@
         if not isinstance(retries, Retry):
             retries = Retry.from_int(retries, redirect=redirect)
 
-        kw['retries'] = retries.increment(method, redirect_location)
+        try:
+            retries = retries.increment(method, url, response=response, _pool=conn)
+        except MaxRetryError:
+            if retries.raise_on_redirect:
+                raise
+            return response
+
+        kw['retries'] = retries
         kw['redirect'] = redirect
 
         log.info("Redirecting %s -> %s" % (url, redirect_location))
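The redirect handling above now increments the retry counter and, once redirects are exhausted, either re-raises or hands back the redirect response itself depending on `raise_on_redirect`. A self-contained sketch of that control flow (all classes below are stand-ins, not urllib3's):

```python
class MaxRetryError(Exception):
    """Stand-in for urllib3.exceptions.MaxRetryError."""


class RetrySketch(object):
    """Toy retry bookkeeping: counts down remaining redirects."""

    def __init__(self, redirects, raise_on_redirect=True):
        self.redirects = redirects
        self.raise_on_redirect = raise_on_redirect

    def increment(self):
        if self.redirects <= 0:
            raise MaxRetryError('too many redirects')
        return RetrySketch(self.redirects - 1, self.raise_on_redirect)


def urlopen_redirect(retries, response):
    """Mirror of the new logic: on exhaustion, re-raise only when
    raise_on_redirect is set, otherwise return the redirect response."""
    try:
        retries = retries.increment()
    except MaxRetryError:
        if retries.raise_on_redirect:
            raise
        return response
    return 'followed-redirect'
```

Before this change, exhausting redirects inside `urlopen` always raised; returning the last response lets callers opt out via `raise_on_redirect=False`.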
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/requests-2.5.1/requests/packages/urllib3/response.py new/requests-2.5.3/requests/packages/urllib3/response.py
--- old/requests-2.5.1/requests/packages/urllib3/response.py    2014-08-29 21:37:47.000000000 +0200
+++ new/requests-2.5.3/requests/packages/urllib3/response.py    2015-02-24 17:27:00.000000000 +0100
@@ -4,12 +4,11 @@
 
 from ._collections import HTTPHeaderDict
 from .exceptions import ProtocolError, DecodeError, ReadTimeoutError
-from .packages.six import string_types as basestring, binary_type
+from .packages.six import string_types as basestring, binary_type, PY3
 from .connection import HTTPException, BaseSSLError
 from .util.response import is_fp_closed
 
 
-
 class DeflateDecoder(object):
 
     def __init__(self):
@@ -21,6 +20,9 @@
         return getattr(self._obj, name)
 
     def decompress(self, data):
+        if not data:
+            return data
+
         if not self._first_try:
             return self._obj.decompress(data)
 
@@ -36,9 +38,23 @@
                 self._data = None
 
 
+class GzipDecoder(object):
+
+    def __init__(self):
+        self._obj = zlib.decompressobj(16 + zlib.MAX_WBITS)
+
+    def __getattr__(self, name):
+        return getattr(self._obj, name)
+
+    def decompress(self, data):
+        if not data:
+            return data
+        return self._obj.decompress(data)
+
+
 def _get_decoder(mode):
     if mode == 'gzip':
-        return zlib.decompressobj(16 + zlib.MAX_WBITS)
+        return GzipDecoder()
 
     return DeflateDecoder()
 
@@ -76,9 +92,10 @@
                  strict=0, preload_content=True, decode_content=True,
                  original_response=None, pool=None, connection=None):
 
-        self.headers = HTTPHeaderDict()
-        if headers:
-            self.headers.update(headers)
+        if isinstance(headers, HTTPHeaderDict):
+            self.headers = headers
+        else:
+            self.headers = HTTPHeaderDict(headers)
         self.status = status
         self.version = version
         self.reason = reason
@@ -202,7 +219,7 @@
 
             except BaseSSLError as e:
                 # FIXME: Is there a better way to differentiate between SSLErrors?
-                if not 'read operation timed out' in str(e):  # Defensive:
+                if 'read operation timed out' not in str(e):  # Defensive:
                     # This shouldn't happen but just in case we're missing an edge
                     # case, let's avoid swallowing SSL errors.
                     raise
@@ -267,14 +284,16 @@
         Remaining parameters are passed to the HTTPResponse constructor, along
         with ``original_response=r``.
         """
-
-        headers = HTTPHeaderDict()
-        for k, v in r.getheaders():
-            headers.add(k, v)
+        headers = r.msg
+        if not isinstance(headers, HTTPHeaderDict):
+            if PY3: # Python 3
+                headers = HTTPHeaderDict(headers.items())
+            else: # Python 2
+                headers = HTTPHeaderDict.from_httplib(headers)
 
         # HTTPResponse objects in Python 3 don't have a .strict attribute
         strict = getattr(r, 'strict', 0)
-        return ResponseCls(body=r,
+        resp = ResponseCls(body=r,
                            headers=headers,
                            status=r.status,
                            version=r.version,
@@ -282,6 +301,7 @@
                            strict=strict,
                            original_response=r,
                            **response_kw)
+        return resp
 
     # Backwards-compatibility methods for httplib.HTTPResponse
     def getheaders(self):
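The new `GzipDecoder` above wraps a zlib decompressor configured for the gzip format, and the `if not data` guard is the fix for decoding an exhausted stream (shazow/urllib3#535). A simplified runnable version (`GzipDecoderSketch` is a made-up name):

```python
import gzip
import io
import zlib


class GzipDecoderSketch(object):
    """Simplified copy of the GzipDecoder added above."""

    def __init__(self):
        # 16 + MAX_WBITS tells zlib to expect a gzip header and trailer.
        self._obj = zlib.decompressobj(16 + zlib.MAX_WBITS)

    def decompress(self, data):
        if not data:
            # An exhausted stream yields empty chunks; short-circuit
            # instead of handing them to zlib.
            return data
        return self._obj.decompress(data)


# Build a gzip payload in memory and decode it with the sketch.
buf = io.BytesIO()
with gzip.GzipFile(fileobj=buf, mode='wb') as f:
    f.write(b'hello world')

dec = GzipDecoderSketch()
out = dec.decompress(buf.getvalue())
tail = dec.decompress(b'')   # exhausted stream: returns b'' harmlessly
```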
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/requests-2.5.1/requests/packages/urllib3/util/connection.py new/requests-2.5.3/requests/packages/urllib3/util/connection.py
--- old/requests-2.5.1/requests/packages/urllib3/util/connection.py     2014-08-29 21:37:47.000000000 +0200
+++ new/requests-2.5.3/requests/packages/urllib3/util/connection.py     2015-02-24 17:27:00.000000000 +0100
@@ -82,6 +82,7 @@
             err = _
             if sock is not None:
                 sock.close()
+                sock = None
 
     if err is not None:
         raise err
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/requests-2.5.1/requests/packages/urllib3/util/retry.py new/requests-2.5.3/requests/packages/urllib3/util/retry.py
--- old/requests-2.5.1/requests/packages/urllib3/util/retry.py  2014-12-01 22:02:39.000000000 +0100
+++ new/requests-2.5.3/requests/packages/urllib3/util/retry.py  2015-02-24 17:27:00.000000000 +0100
@@ -190,7 +190,7 @@
         return isinstance(err, (ReadTimeoutError, ProtocolError))
 
     def is_forced_retry(self, method, status_code):
-        """ Is this method/response retryable? (Based on method/codes 
whitelists)
+        """ Is this method/status code retryable? (Based on method/codes 
whitelists)
         """
         if self.method_whitelist and method.upper() not in self.method_whitelist:
             return False
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/requests-2.5.1/requests/packages/urllib3/util/ssl_.py new/requests-2.5.3/requests/packages/urllib3/util/ssl_.py
--- old/requests-2.5.1/requests/packages/urllib3/util/ssl_.py   2014-12-01 22:02:39.000000000 +0100
+++ new/requests-2.5.3/requests/packages/urllib3/util/ssl_.py   2015-02-24 17:27:00.000000000 +0100
@@ -1,5 +1,5 @@
 from binascii import hexlify, unhexlify
-from hashlib import md5, sha1
+from hashlib import md5, sha1, sha256
 
 from ..exceptions import SSLError
 
@@ -29,8 +29,8 @@
 except ImportError:
     _DEFAULT_CIPHERS = (
         'ECDH+AESGCM:DH+AESGCM:ECDH+AES256:DH+AES256:ECDH+AES128:DH+AES:ECDH+HIGH:'
-        'DH+HIGH:ECDH+3DES:DH+3DES:RSA+AESGCM:RSA+AES:RSA+HIGH:RSA+3DES:ECDH+RC4:'
-        'DH+RC4:RSA+RC4:!aNULL:!eNULL:!MD5'
+        'DH+HIGH:ECDH+3DES:DH+3DES:RSA+AESGCM:RSA+AES:RSA+HIGH:RSA+3DES:!aNULL:'
+        '!eNULL:!MD5'
     )
 
 try:
@@ -96,7 +96,8 @@
     # this digest.
     hashfunc_map = {
         16: md5,
-        20: sha1
+        20: sha1,
+        32: sha256,
     }
 
     fingerprint = fingerprint.replace(':', '').lower()
@@ -211,7 +212,9 @@
 
     context.verify_mode = cert_reqs
     if getattr(context, 'check_hostname', None) is not None:  # Platform-specific: Python 3.2
-        context.check_hostname = (context.verify_mode == ssl.CERT_REQUIRED)
+        # We do our own verification, including fingerprints and alternative
+        # hostnames. So disable it here
+        context.check_hostname = False
     return context
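The `hashfunc_map` change above is how sha256 fingerprint support works: the hash function is selected by the digest length implied by the fingerprint string (16 bytes for md5, 20 for sha1, now 32 for sha256). A sketch of that dispatch without the certificate plumbing (function and variable names below are made up, not urllib3's API):

```python
from binascii import unhexlify
from hashlib import md5, sha1, sha256

# Digest length in bytes -> hash function; the 32/sha256 entry is the
# new addition from shazow/urllib3#540.
HASHFUNC_MAP = {16: md5, 20: sha1, 32: sha256}


def fingerprint_matches(cert_der, fingerprint):
    """Return True when cert_der hashes to the (possibly colon-separated)
    hex fingerprint; the hash function is picked by fingerprint length."""
    fingerprint = fingerprint.replace(':', '').lower()
    hashfunc = HASHFUNC_MAP.get(len(fingerprint) // 2)  # two hex chars per byte
    if hashfunc is None:
        raise ValueError('Fingerprint of invalid length: %s' % fingerprint)
    return hashfunc(cert_der).digest() == unhexlify(fingerprint.encode('ascii'))


fake_cert = b'DER bytes would go here'          # stand-in, not a real cert
good = sha256(fake_cert).hexdigest()
```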
 
 
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/requests-2.5.1/requests/utils.py new/requests-2.5.3/requests/utils.py
--- old/requests-2.5.1/requests/utils.py        2014-12-23 18:43:04.000000000 +0100
+++ new/requests-2.5.3/requests/utils.py        2015-02-24 17:27:00.000000000 +0100
@@ -25,7 +25,8 @@
 from . import certs
 from .compat import parse_http_list as _parse_list_header
 from .compat import (quote, urlparse, bytes, str, OrderedDict, unquote, is_py2,
-                     builtin_str, getproxies, proxy_bypass, urlunparse)
+                     builtin_str, getproxies, proxy_bypass, urlunparse,
+                     basestring)
 from .cookies import RequestsCookieJar, cookiejar_from_dict
 from .structures import CaseInsensitiveDict
 from .exceptions import InvalidURL
@@ -115,7 +116,8 @@
 def guess_filename(obj):
     """Tries to guess the filename of the given object."""
     name = getattr(obj, 'name', None)
-    if name and isinstance(name, builtin_str) and name[0] != '<' and name[-1] != '>':
+    if (name and isinstance(name, basestring) and name[0] != '<' and
+            name[-1] != '>'):
         return os.path.basename(name)
 
 
@@ -418,10 +420,18 @@
     This function passes the given URI through an unquote/quote cycle to
     ensure that it is fully and consistently quoted.
     """
-    # Unquote only the unreserved characters
-    # Then quote only illegal characters (do not quote reserved, unreserved,
-    # or '%')
-    return quote(unquote_unreserved(uri), safe="!#$%&'()*+,/:;=?@[]~")
+    safe_with_percent = "!#$%&'()*+,/:;=?@[]~"
+    safe_without_percent = "!#$&'()*+,/:;=?@[]~"
+    try:
+        # Unquote only the unreserved characters
+        # Then quote only illegal characters (do not quote reserved,
+        # unreserved, or '%')
+        return quote(unquote_unreserved(uri), safe=safe_with_percent)
+    except InvalidURL:
+        # We couldn't unquote the given URI, so let's try quoting it, but
+        # there may be unquoted '%'s in the URI. We need to make sure they're
+        # properly quoted so they do not cause issues elsewhere.
+        return quote(uri, safe=safe_without_percent)
 
 
 def address_in_network(ip, net):
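The redirect fix above (#2356) first tries the normal unquote/quote cycle and falls back to re-quoting the raw URI when it contains a malformed escape such as `%pp`. A standalone sketch of both halves (here `ValueError` stands in for requests' `InvalidURL`; names are mine, not the library's):

```python
try:                      # Python 3
    from urllib.parse import quote
except ImportError:       # Python 2
    from urllib import quote

# RFC 3986 unreserved characters.
UNRESERVED_SET = frozenset(
    'ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789-._~')


def unquote_unreserved(uri):
    """Un-escape only %XX sequences that decode to unreserved characters;
    raise ValueError on a malformed escape such as '%pp'."""
    parts = uri.split('%')
    for i in range(1, len(parts)):
        h = parts[i][0:2]
        if len(h) == 2 and h.isalnum():
            try:
                c = chr(int(h, 16))
            except ValueError:
                raise ValueError('Invalid percent-escape sequence: %r' % h)
            if c in UNRESERVED_SET:
                parts[i] = c + parts[i][2:]
            else:
                parts[i] = '%' + parts[i]
        else:
            parts[i] = '%' + parts[i]
    return ''.join(parts)


def requote_uri_sketch(uri):
    safe_with_percent = "!#$%&'()*+,/:;=?@[]~"
    safe_without_percent = "!#$&'()*+,/:;=?@[]~"
    try:
        # Unquote only unreserved characters, then quote only illegal ones.
        return quote(unquote_unreserved(uri), safe=safe_with_percent)
    except ValueError:
        # Malformed escape: quote the raw URI and let '%' become '%25'.
        return quote(uri, safe=safe_without_percent)
```

Already-quoted URIs pass through unchanged, while an unquoted `%` gets escaped instead of raising out of the redirect path.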
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/requests-2.5.1/requests.egg-info/PKG-INFO new/requests-2.5.3/requests.egg-info/PKG-INFO
--- old/requests-2.5.1/requests.egg-info/PKG-INFO       2014-12-23 18:55:23.000000000 +0100
+++ new/requests-2.5.3/requests.egg-info/PKG-INFO       2015-02-24 17:33:43.000000000 +0100
@@ -1,6 +1,6 @@
 Metadata-Version: 1.1
 Name: requests
-Version: 2.5.1
+Version: 2.5.3
 Summary: Python HTTP for Humans.
 Home-page: http://python-requests.org
 Author: Kenneth Reitz
@@ -9,11 +9,11 @@
 Description: Requests: HTTP for Humans
         =========================
         
-        .. image:: https://badge.fury.io/py/requests.png
-            :target: http://badge.fury.io/py/requests
+        .. image:: https://img.shields.io/pypi/v/requests.svg
+            :target: https://pypi.python.org/pypi/requests
         
-        .. image:: https://pypip.in/d/requests/badge.png
-                :target: https://crate.io/packages/requests/
+        .. image:: https://img.shields.io/pypi/dm/requests.svg
+                :target: https://pypi.python.org/pypi/requests
         
         
         Requests is an Apache2 Licensed HTTP library, written in Python, for human
@@ -98,6 +98,61 @@
         Release History
         ---------------
         
+        2.5.3 (2015-02-24)
+        ++++++++++++++++++
+        
+        **Bugfixes**
+        
+        - Revert changes to our vendored certificate bundle. For more context see
+          (#2455, #2456, and http://bugs.python.org/issue23476)
+        
+        2.5.2 (2015-02-23)
+        ++++++++++++++++++
+        
+        **Features and Improvements**
+        
+        - Add sha256 fingerprint support. (`shazow/urllib3#540`_)
+        
+        - Improve the performance of headers. (`shazow/urllib3#544`_)
+        
+        **Bugfixes**
+        
+        - Copy pip's import machinery. When downstream redistributors remove
+          requests.packages.urllib3 the import machinery will continue to let those
+          same symbols work. Example usage in requests' documentation and 3rd-party
+          libraries relying on the vendored copies of urllib3 will work without having
+          to fallback to the system urllib3.
+        
+        - Attempt to quote parts of the URL on redirect if unquoting and then quoting
+          fails. (#2356)
+        
+        - Fix filename type check for multipart form-data uploads. (#2411)
+        
+        - Properly handle the case where a server issuing digest authentication
+          challenges provides both auth and auth-int qop-values. (#2408)
+        
+        - Fix a socket leak. (`shazow/urllib3#549`_)
+        
+        - Fix multiple ``Set-Cookie`` headers properly. (`shazow/urllib3#534`_)
+        
+        - Disable the built-in hostname verification. (`shazow/urllib3#526`_)
+        
+        - Fix the behaviour of decoding an exhausted stream. (`shazow/urllib3#535`_)
+        
+        **Security**
+        
+        - Pulled in an updated ``cacert.pem``.
+        
+        - Drop RC4 from the default cipher list. (`shazow/urllib3#551`_)
+        
+        .. _shazow/urllib3#551: https://github.com/shazow/urllib3/pull/551
+        .. _shazow/urllib3#549: https://github.com/shazow/urllib3/pull/549
+        .. _shazow/urllib3#544: https://github.com/shazow/urllib3/pull/544
+        .. _shazow/urllib3#540: https://github.com/shazow/urllib3/pull/540
+        .. _shazow/urllib3#535: https://github.com/shazow/urllib3/pull/535
+        .. _shazow/urllib3#534: https://github.com/shazow/urllib3/pull/534
+        .. _shazow/urllib3#526: https://github.com/shazow/urllib3/pull/526
+        
         2.5.1 (2014-12-23)
         ++++++++++++++++++
         
@@ -198,7 +253,7 @@
         - Support for connect timeouts! Timeout now accepts a tuple (connect, read) which is used to set individual connect and read timeouts.
         - Allow copying of PreparedRequests without headers/cookies.
         - Updated bundled urllib3 version.
-        - Refactored settings loading from environment — new `Session.merge_environment_settings`.
+        - Refactored settings loading from environment -- new `Session.merge_environment_settings`.
         - Handle socket errors in iter_content.
         
         
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/requests-2.5.1/test_requests.py new/requests-2.5.3/test_requests.py
--- old/requests-2.5.1/test_requests.py 2014-12-23 18:43:04.000000000 +0100
+++ new/requests-2.5.3/test_requests.py 2015-02-24 17:27:00.000000000 +0100
@@ -301,13 +301,15 @@
         r = s.get(url)
         assert r.status_code == 200
 
-    def test_connection_error(self):
+    def test_connection_error_invalid_domain(self):
         """Connecting to an unknown domain should raise a ConnectionError"""
         with pytest.raises(ConnectionError):
-            requests.get("http://fooobarbangbazbing.httpbin.org";)
+            requests.get("http://doesnotexist.google.com";)
 
+    def test_connection_error_invalid_port(self):
+        """Connecting to an invalid port should raise a ConnectionError"""
         with pytest.raises(ConnectionError):
-            requests.get("http://httpbin.org:1";)
+            requests.get("http://httpbin.org:1";, timeout=1)
 
     def test_LocationParseError(self):
         """Inputing a URL that cannot be parsed should raise an InvalidURL 
error"""
@@ -1265,6 +1267,32 @@
             'http://localhost.localdomain:5000/v1.0/') == {}
         assert get_environ_proxies('http://www.requests.com/') != {}
 
+    def test_guess_filename_when_int(self):
+        from requests.utils import guess_filename
+        assert None is guess_filename(1)
+
+    def test_guess_filename_when_filename_is_an_int(self):
+        from requests.utils import guess_filename
+        fake = type('Fake', (object,), {'name': 1})()
+        assert None is guess_filename(fake)
+
+    def test_guess_filename_with_file_like_obj(self):
+        from requests.utils import guess_filename
+        from requests import compat
+        fake = type('Fake', (object,), {'name': b'value'})()
+        guessed_name = guess_filename(fake)
+        assert b'value' == guessed_name
+        assert isinstance(guessed_name, compat.bytes)
+
+    def test_guess_filename_with_unicode_name(self):
+        from requests.utils import guess_filename
+        from requests import compat
+        filename = b'value'.decode('utf-8')
+        fake = type('Fake', (object,), {'name': filename})()
+        guessed_name = guess_filename(fake)
+        assert filename == guessed_name
+        assert isinstance(guessed_name, compat.str)
+
     def test_is_ipv4_address(self):
         from requests.utils import is_ipv4_address
         assert is_ipv4_address('8.8.8.8')
@@ -1301,6 +1329,22 @@
         assert username == percent_encoding_test_chars
         assert password == percent_encoding_test_chars
 
+    def test_requote_uri_with_unquoted_percents(self):
+        """Ensure we handle unquoted percent signs in redirects.
+
+        See: https://github.com/kennethreitz/requests/issues/2356
+        """
+        from requests.utils import requote_uri
+        bad_uri = 'http://example.com/fiz?buz=%ppicture'
+        quoted = 'http://example.com/fiz?buz=%25ppicture'
+        assert quoted == requote_uri(bad_uri)
+
+    def test_requote_uri_properly_requotes(self):
+        """Ensure requoting doesn't break expectations."""
+        from requests.utils import requote_uri
+        quoted = 'http://example.com/fiz?buz=%25ppicture'
+        assert quoted == requote_uri(quoted)
+
 
 class TestMorselToCookieExpires(unittest.TestCase):
 

-- 
To unsubscribe, e-mail: opensuse-commit+unsubscr...@opensuse.org
For additional commands, e-mail: opensuse-commit+h...@opensuse.org
