Hello community,

here is the log from the commit of package python3-keepalive for 
openSUSE:Factory checked in at 2015-12-03 13:31:34
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/python3-keepalive (Old)
 and      /work/SRC/openSUSE:Factory/.python3-keepalive.new (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

Package is "python3-keepalive"

Changes:
--------
--- /work/SRC/openSUSE:Factory/python3-keepalive/python3-keepalive.changes      2015-11-17 14:22:38.000000000 +0100
+++ /work/SRC/openSUSE:Factory/.python3-keepalive.new/python3-keepalive.changes 2015-12-03 13:31:35.000000000 +0100
@@ -1,0 +2,50 @@
+Fri Nov 27 20:33:28 UTC 2015 - [email protected]
+
+- update to version 0.5:
+  * fixed the usage of some urllib.Request getters also on POST
+    requests, see commit 018f118 for further context (issue #2)
+  * b'' prefix is already known by py 2.6, so upgraded setup
+    meta-information (issue #2)
+  * more specific python versions because of the patch introduced by
+    issue #2
+  * trying to fix issue #2 by resorting to the b string prefix, which
+    raises the minimum python version to >=2.7
+
+-------------------------------------------------------------------
+Thu Nov  5 16:32:54 UTC 2015 - [email protected]
+
+- specfile:
+  * updated license
+
+- update to version 0.4.1:
+  * added license header to the setup.py file
+  * fixed missing import
+  * prepared the next dev iteration, which this time I hope won't be
+    needed...
+
+- changes from version 0.4:
+  * added module __version__ metadata as discussed in
+    rdflib/sparqlwrapper#65
+  * updated license headers
+  * improved patch from 018f1180845160ef72b5fe50c9efbf4fed7da53e to
+    address the issue identified by @xflr6 in RDFLib/sparqlwrapper#65
+    patched to avoid TypeError exceptions (Can't convert 'bytes' object
+    to str implicitly) in Python 3.x
+  * better wording
+  * updated meta information
+
+- changes from version 0.3:
+  * removed usage of some urllib.Request getters that were deprecated in
+    3.3 and removed in 3.4, see:
+    https://docs.python.org/3.3/library/urllib.request.html#request-objects
+  * added 2to3 support
+  * patched ssl factory support
+  * imported newer version (from 2006)
+
+-------------------------------------------------------------------
+Wed Nov  4 16:37:24 UTC 2015 - [email protected]
+
+- update to version 0.1.1:
+  (no changelog available)
+
+-------------------------------------------------------------------
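For context on the getter removals mentioned in the 0.3 and 0.5 entries: urllib's Request lost get_selector(), get_host(), etc. in Python 3.4 in favor of plain attributes, which is what the patched keepalive.py probes for with hasattr(). A minimal Python 3 sketch of that fallback pattern (the URL and helper name are illustrative, not part of the package):

```python
from urllib.request import Request

def request_selector(req):
    """Return the request path, preferring the modern attribute.

    Python 3.4+ only has `req.selector`; older versions (and the
    urllib2 Request in Python 2) expose `req.get_selector()` instead,
    so probe with hasattr() the way the patched keepalive.py does.
    """
    if hasattr(req, 'selector'):
        return req.selector
    return req.get_selector()

req = Request('http://example.com/some/path?q=1')
print(request_selector(req))  # -> /some/path?q=1
```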

Old:
----
  keepalive-0.1.tar.gz

New:
----
  keepalive-0.5.tar.gz

++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

Other differences:
------------------
++++++ python3-keepalive.spec ++++++
--- /var/tmp/diff_new_pack.6YCgTz/_old  2015-12-03 13:31:36.000000000 +0100
+++ /var/tmp/diff_new_pack.6YCgTz/_new  2015-12-03 13:31:36.000000000 +0100
@@ -1,7 +1,7 @@
 #
 # spec file for package python3-keepalive
 #
-# Copyright (c) 2015 SUSE LINUX Products GmbH, Nuernberg, Germany.
+# Copyright (c) 2015 SUSE LINUX GmbH, Nuernberg, Germany.
 #
 # All modifications and additions to the file contributed by third parties
 # remain the property of their copyright owners, unless otherwise agreed
@@ -17,10 +17,10 @@
 
 
 Name:           python3-keepalive
-Version:        0.1
+Version:        0.5
 Release:        0
 Summary:        Urllib keepalive support
-License:        GPL-3.0
+License:        LGPL-2.1+
 Group:          Development/Languages/Python
 Url:            https://github.com/wikier/keepalive
Source:         https://pypi.python.org/packages/source/k/keepalive/keepalive-%{version}.tar.gz
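The issue #2 work summarized in the changelog comes down to mixing bytes and str: Python 2 treats them as one type, while Python 3 raises TypeError (older 3.x phrased it as "Can't convert 'bytes' object to str implicitly"; the exact message varies by version), hence the switch to b"" literals for the read buffer. A self-contained Python 3 illustration (variable names are only loosely modeled on the patched HTTPResponse):

```python
# In Python 2, str and bytes are the same type, so '' + b'' works.
# Python 3 keeps them separate and raises TypeError on mixing, which
# is why the patch switched the buffer literals to b"".
buf = b""            # bytes buffer, like the patched _rbuf
chunk = b"HTTP/1.1 200 OK\r\n"
buf = buf + chunk    # bytes + bytes: fine

try:
    "header: " + chunk   # str + bytes: TypeError in Python 3
except TypeError as e:
    print("TypeError:", e)
```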

++++++ keepalive-0.1.tar.gz -> keepalive-0.5.tar.gz ++++++
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/keepalive-0.1/PKG-INFO new/keepalive-0.5/PKG-INFO
--- old/keepalive-0.1/PKG-INFO  2015-11-03 14:01:48.000000000 +0100
+++ new/keepalive-0.5/PKG-INFO  2015-11-27 18:44:38.000000000 +0100
@@ -1,18 +1,24 @@
 Metadata-Version: 1.1
 Name: keepalive
-Version: 0.1
-Summary: urllib keepalive support
+Version: 0.5
+Summary: urllib keepalive support for python
 Home-page: https://github.com/wikier/keepalive
-Author: Sergio Fernandez
+Author: Sergio Fernández
 Author-email: [email protected]
-License: GNU GPL
+License: GNU LGPL
 Download-URL: https://github.com/wikier/keepalive/releases
 Description: An HTTP handler for `urllib2` that supports HTTP 1.1 and keepalive.
 Keywords: python http urllib keepalive
 Platform: any
-Classifier: Development Status :: 3 - Alpha
+Classifier: Development Status :: 4 - Beta
 Classifier: Intended Audience :: Developers
+Classifier: Topic :: Software Development :: Libraries :: Python Modules
 Classifier: License :: OSI Approved :: Apache Software License
 Classifier: Operating System :: OS Independent
 Classifier: Programming Language :: Python :: 2
+Classifier: Programming Language :: Python :: 2.6
+Classifier: Programming Language :: Python :: 2.7
 Classifier: Programming Language :: Python :: 3
+Classifier: Programming Language :: Python :: 3.3
+Classifier: Programming Language :: Python :: 3.4
+Classifier: Programming Language :: Python :: 3.5
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/keepalive-0.1/keepalive/__init__.py 
new/keepalive-0.5/keepalive/__init__.py
--- old/keepalive-0.1/keepalive/__init__.py     2015-11-03 13:51:56.000000000 +0100
+++ new/keepalive-0.5/keepalive/__init__.py     2015-11-27 18:43:52.000000000 +0100
@@ -1 +1,25 @@
+# -*- coding: utf-8 -*-
+
+#   This library is free software; you can redistribute it and/or
+#   modify it under the terms of the GNU Lesser General Public
+#   License as published by the Free Software Foundation; either
+#   version 2.1 of the License, or (at your option) any later version.
+#
+#   This library is distributed in the hope that it will be useful,
+#   but WITHOUT ANY WARRANTY; without even the implied warranty of
+#   MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
+#   Lesser General Public License for more details.
+#
+#   You should have received a copy of the GNU Lesser General Public
+#   License along with this library; if not, write to the 
+#      Free Software Foundation, Inc., 
+#      59 Temple Place, Suite 330, 
+#      Boston, MA  02111-1307  USA
+
+# This file was part of urlgrabber, a high-level cross-protocol url-grabber
+# Copyright 2002-2004 Michael D. Stenner, Ryan Tomayko
+# Copyright 2015 Sergio Fernández
+
 from keepalive import *
+
+__version__ = "0.5"
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' 
'--exclude=.svnignore' old/keepalive-0.1/keepalive/keepalive.py 
new/keepalive-0.5/keepalive/keepalive.py
--- old/keepalive-0.1/keepalive/keepalive.py    2015-11-03 12:39:19.000000000 +0100
+++ new/keepalive-0.5/keepalive/keepalive.py    2015-11-25 12:48:40.000000000 +0100
@@ -1,7 +1,26 @@
-"""An HTTP handler for urllib2 that supports HTTP 1.1 and keepalive.
+# -*- coding: utf-8 -*-
+
+#   This library is free software; you can redistribute it and/or
+#   modify it under the terms of the GNU Lesser General Public
+#   License as published by the Free Software Foundation; either
+#   version 2.1 of the License, or (at your option) any later version.
+#
+#   This library is distributed in the hope that it will be useful,
+#   but WITHOUT ANY WARRANTY; without even the implied warranty of
+#   MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
+#   Lesser General Public License for more details.
+#
+#   You should have received a copy of the GNU Lesser General Public
+#   License along with this library; if not, write to the 
+#      Free Software Foundation, Inc., 
+#      59 Temple Place, Suite 330, 
+#      Boston, MA  02111-1307  USA
+
+# This file was part of urlgrabber, a high-level cross-protocol url-grabber
+# Copyright 2002-2004 Michael D. Stenner, Ryan Tomayko
+# Copyright 2015 Sergio Fernández
 
-This is keepalive.py from the urlgrabber project, available under the GNU LGPL
-ported to python 3 (with help from 2to3).
+"""An HTTP handler for urllib2 that supports HTTP 1.1 and keepalive.
 
 >>> import urllib2
 >>> from keepalive import HTTPHandler
@@ -11,6 +30,11 @@
 >>> 
 >>> fo = urllib2.urlopen('http://www.python.org')
 
+If a connection to a given host is requested, and all of the existing
+connections are still in use, another connection will be opened.  If
+the handler tries to use an existing connection but it fails in some
+way, it will be closed and removed from the pool.
+
 To remove the handler, simply re-run build_opener with no arguments, and
 install that opener.
 
@@ -22,6 +46,12 @@
   close_all()
   open_connections()
 
+NOTE: using the close_connection and close_all methods of the handler
+should be done with care when using multiple threads.
+  * there is nothing that prevents another thread from creating new
+    connections immediately after connections are closed
+  * no checks are done to prevent in-use connections from being closed
+
 >>> keepalive_handler.close_all()
 
 EXTRA ATTRIBUTES AND METHODS
@@ -46,125 +76,295 @@
   urllib2 tries to do clever things with error codes 301, 302, 401,
   and 407, and it wraps the object upon return.
 
-  You can optionally set the module-level global HANDLE_ERRORS to 0,
-  in which case the handler will always return the object directly.
-  If you like the fancy handling of errors, don't do this.  If you
-  prefer to see your error codes, then do.
+  For python versions earlier than 2.4, you can avoid this fancy error
+  handling by setting the module-level global HANDLE_ERRORS to zero.
+  You see, prior to 2.4, it's the HTTP Handler's job to determine what
+  to handle specially, and what to just pass up.  HANDLE_ERRORS == 0
+  means "pass everything up".  In python 2.4, however, this job no
+  longer belongs to the HTTP Handler and is now done by a NEW handler,
+  HTTPErrorProcessor.  Here's the bottom line:
+
+    python version < 2.4
+        HANDLE_ERRORS == 1  (default) pass up 200, treat the rest as
+                            errors
+        HANDLE_ERRORS == 0  pass everything up, error processing is
+                            left to the calling code
+    python version >= 2.4
+        HANDLE_ERRORS == 1  pass up 200, treat the rest as errors
+        HANDLE_ERRORS == 0  (default) pass everything up, let the
+                            other handlers (specifically,
+                            HTTPErrorProcessor) decide what to do
+
+  In practice, setting the variable either way makes little difference
+  in python 2.4, so for the most consistent behavior across versions,
+  you probably just want to use the defaults, which will give you
+  exceptions on errors.
 
 """
 
-import urllib.request, urllib.error, urllib.parse
-import http.client
+# $Id: keepalive.py,v 1.17 2006/12/08 00:14:16 mstenner Exp $
+
+import urllib2
+import httplib
 import socket
+import thread
 
-#STRING_VERSION = '.'.join(map(str, VERSION))
-DEBUG = 0
-HANDLE_ERRORS = 1
+DEBUG = None
 
-class HTTPHandler(urllib.request.HTTPHandler):
+import sys
+if sys.version_info < (2, 4): HANDLE_ERRORS = 1
+else: HANDLE_ERRORS = 0
+    
+class ConnectionManager:
+    """
+    The connection manager must be able to:
+      * keep track of all existing
+      """
     def __init__(self):
-        urllib.request.HTTPHandler.__init__(self)
-        self._connections = {}
-    
+        self._lock = thread.allocate_lock()
+        self._hostmap = {} # map hosts to a list of connections
+        self._connmap = {} # map connections to host
+        self._readymap = {} # map connection to ready state
+
+    def add(self, host, connection, ready):
+        self._lock.acquire()
+        try:
+            if not self._hostmap.has_key(host): self._hostmap[host] = []
+            self._hostmap[host].append(connection)
+            self._connmap[connection] = host
+            self._readymap[connection] = ready
+        finally:
+            self._lock.release()
+
+    def remove(self, connection):
+        self._lock.acquire()
+        try:
+            try:
+                host = self._connmap[connection]
+            except KeyError:
+                pass
+            else:
+                del self._connmap[connection]
+                del self._readymap[connection]
+                self._hostmap[host].remove(connection)
+                if not self._hostmap[host]: del self._hostmap[host]
+        finally:
+            self._lock.release()
+
+    def set_ready(self, connection, ready):
+        try: self._readymap[connection] = ready
+        except KeyError: pass
+        
+    def get_ready_conn(self, host):
+        conn = None
+        self._lock.acquire()
+        try:
+            if self._hostmap.has_key(host):
+                for c in self._hostmap[host]:
+                    if self._readymap[c]:
+                        self._readymap[c] = 0
+                        conn = c
+                        break
+        finally:
+            self._lock.release()
+        return conn
+
+    def get_all(self, host=None):
+        if host:
+            return list(self._hostmap.get(host, []))
+        else:
+            return dict(self._hostmap)
+
+class KeepAliveHandler:
+    def __init__(self):
+        self._cm = ConnectionManager()
+        
+    #### Connection Management
+    def open_connections(self):
+        """return a list of connected hosts and the number of connections
+        to each.  [('foo.com:80', 2), ('bar.org', 1)]"""
+        return [(host, len(li)) for (host, li) in self._cm.get_all().items()]
+
     def close_connection(self, host):
-        """close connection to <host>
+        """close connection(s) to <host>
         host is the host:port spec, as in 'www.cnn.com:8080' as passed in.
         no error occurs if there is no connection to that host."""
-        self._remove_connection(host, close=1)
-    
-    def open_connections(self):
-        """return a list of connected hosts"""
-        return list(self._connections.keys())
-    
+        for h in self._cm.get_all(host):
+            self._cm.remove(h)
+            h.close()
+        
     def close_all(self):
         """close all open connections"""
-        for host, conn in list(self._connections.items()):
-            conn.close()
-        self._connections = {}
-    
-    def _remove_connection(self, host, close=0):
-        if host in self._connections:
-            if close: self._connections[host].close()
-            del self._connections[host]
-    
-    def _start_connection(self, h, req):
-        try:
-            if req.has_data():
-                data = req.get_data()
-                h.putrequest('POST', req.get_selector())
-                if 'Content-type' not in req.headers:
-                    h.putheader('Content-type',
-                                'application/x-www-form-urlencoded')
-                if 'Content-length' not in req.headers:
-                    h.putheader('Content-length', '%d' % len(data))
-            else:
-                h.putrequest('GET', req.get_selector())
-        except socket.error as err:
-            raise urllib.error.URLError(err)
+        for host, conns in self._cm.get_all().items():
+            for h in conns:
+                self._cm.remove(h)
+                h.close()
         
-        for args in self.parent.addheaders:
-            h.putheader(*args)
-        for k, v in list(req.headers.items()):
-            h.putheader(k, v)
-        h.endheaders()
-        if req.has_data():
-            h.send(data)
-    
-    def do_open(self, http_class, req):
-        host = req.get_host()
-        if not host:
-            raise urllib.error.URLError('no host given')
+    def _request_closed(self, request, host, connection):
+        """tells us that this request is now closed and the the
+        connection is ready for another request"""
+        self._cm.set_ready(connection, 1)
+
+    def _remove_connection(self, host, connection, close=0):
+        if close: connection.close()
+        self._cm.remove(connection)
         
+    #### Transaction Execution
+    def do_open(self, req):
+        host = req.host
+        if not host:
+            raise urllib2.URLError('no host given')
+
         try:
-            need_new_connection = 1
-            h = self._connections.get(host)
-            if not h is None:
-                try:
-                    self._start_connection(h, req)
-                except socket.error as e:
-                    r = None
-                else:
-                    try: r = h.getresponse()
-                    except http.client.ResponseNotReady as e: r = None
-                
-                if r is None or r.version == 9:
-                    # httplib falls back to assuming HTTP 0.9 if it gets a
-                    # bad header back.  This is most likely to happen if
-                    # the socket has been closed by the server since we
-                    # last used the connection.
-                    if DEBUG: print("failed to re-use connection to %s" % host)
-                    h.close()
-                else:
-                    if DEBUG: print("re-using connection to %s" % host)
-                    need_new_connection = 0
-            if need_new_connection:
-                if DEBUG: print("creating new connection to %s" % host)
-                h = http_class(host)
-                self._connections[host] = h
-                self._start_connection(h, req)
+            h = self._cm.get_ready_conn(host)
+            while h:
+                r = self._reuse_connection(h, req, host)
+
+                # if this response is non-None, then it worked and we're
+                # done.  Break out, skipping the else block.
+                if r: break
+
+                # connection is bad - possibly closed by server
+                # discard it and ask for the next free connection
+                h.close()
+                self._cm.remove(h)
+                h = self._cm.get_ready_conn(host)
+            else:
+                # no (working) free connections were found.  Create a new one.
+                h = self._get_connection(host)
+                if DEBUG: DEBUG.info("creating new connection to %s (%d)",
+                                     host, id(h))
+                self._cm.add(host, h, 0)
+                self._start_transaction(h, req)
                 r = h.getresponse()
-        except socket.error as err:
-            raise urllib.error.URLError(err)
-        
+        except (socket.error, httplib.HTTPException), err:
+            raise urllib2.URLError(err)
+            
+        if DEBUG: DEBUG.info("STATUS: %s, %s", r.status, r.reason)
+
         # if not a persistent connection, don't try to reuse it
-        if r.will_close: self._remove_connection(host)
-        
-        if DEBUG:
-            print("STATUS: %s, %s" % (r.status, r.reason))
+        if r.will_close:
+            if DEBUG: DEBUG.info('server will close connection, discarding')
+            self._cm.remove(h)
+
         r._handler = self
         r._host = host
         r._url = req.get_full_url()
+        r._connection = h
+        r.code = r.status
+        r.headers = r.msg
+        r.msg = r.reason
         
         if r.status == 200 or not HANDLE_ERRORS:
             return r
         else:
-            return self.parent.error('http', req, r, r.status, r.reason, r.msg)
-    
+            return self.parent.error('http', req, r,
+                                     r.status, r.msg, r.headers)
+
+    def _reuse_connection(self, h, req, host):
+        """start the transaction with a re-used connection
+        return a response object (r) upon success or None on failure.
+        This DOES not close or remove bad connections in cases where
+        it returns.  However, if an unexpected exception occurs, it
+        will close and remove the connection before re-raising.
+        """
+        try:
+            self._start_transaction(h, req)
+            r = h.getresponse()
+            # note: just because we got something back doesn't mean it
+            # worked.  We'll check the version below, too.
+        except (socket.error, httplib.HTTPException):
+            r = None
+        except:
+            # adding this block just in case we've missed
+            # something we will still raise the exception, but
+            # lets try and close the connection and remove it
+            # first.  We previously got into a nasty loop
+            # where an exception was uncaught, and so the
+            # connection stayed open.  On the next try, the
+            # same exception was raised, etc.  The tradeoff is
+            # that it's now possible this call will raise
+            # a DIFFERENT exception
+            if DEBUG: DEBUG.error("unexpected exception - closing " + \
+                                  "connection to %s (%d)", host, id(h))
+            self._cm.remove(h)
+            h.close()
+            raise
+                    
+        if r is None or r.version == 9:
+            # httplib falls back to assuming HTTP 0.9 if it gets a
+            # bad header back.  This is most likely to happen if
+            # the socket has been closed by the server since we
+            # last used the connection.
+            if DEBUG: DEBUG.info("failed to re-use connection to %s (%d)",
+                                 host, id(h))
+            r = None
+        else:
+            if DEBUG: DEBUG.info("re-using connection to %s (%d)", host, id(h))
+
+        return r
+
+    def _start_transaction(self, h, req):
+        try:
+            if req.data:
+                data = req.data
+                if hasattr(req, 'selector'):
+                    h.putrequest('POST', req.selector)
+                else:
+                    h.putrequest('POST', req.get_selector())
+                if not req.headers.has_key('Content-type'):
+                    h.putheader('Content-type',
+                                'application/x-www-form-urlencoded')
+                if not req.headers.has_key('Content-length'):
+                    h.putheader('Content-length', '%d' % len(data))
+            else:
+                if hasattr(req, 'selector'):
+                    h.putrequest('GET', req.selector)
+                else:
+                    h.putrequest('GET', req.get_selector())
+        except (socket.error, httplib.HTTPException), err:
+            raise urllib2.URLError(err)
+
+        for args in self.parent.addheaders:
+            h.putheader(*args)
+        for k, v in req.headers.items():
+            h.putheader(k, v)
+        h.endheaders()
+        if req.data:
+            h.send(data)
+
+    def _get_connection(self, host):
+        return NotImplementedError
+
+class HTTPHandler(KeepAliveHandler, urllib2.HTTPHandler):
+    def __init__(self):
+        KeepAliveHandler.__init__(self)
+
     def http_open(self, req):
-        return self.do_open(HTTPConnection, req)
+        return self.do_open(req)
 
-class HTTPResponse(http.client.HTTPResponse):
+    def _get_connection(self, host):
+        return HTTPConnection(host)
 
+class HTTPSHandler(KeepAliveHandler, urllib2.HTTPSHandler):
+    def __init__(self, ssl_factory=None):
+        KeepAliveHandler.__init__(self)
+        if not ssl_factory:
+            try:
+                import sslfactory
+                ssl_factory = sslfactory.get_factory()
+            except ImportError:
+                pass
+        self._ssl_factory = ssl_factory
+    
+    def https_open(self, req):
+        return self.do_open(req)
+
+    def _get_connection(self, host):
+        try: return self._ssl_factory.get_https_connection(host)
+        except AttributeError: return HTTPSConnection(host)
+        
+class HTTPResponse(httplib.HTTPResponse):
     # we need to subclass HTTPResponse in order to
     # 1) add readline() and readlines() methods
     # 2) add close_connection() methods
@@ -186,24 +386,34 @@
 
     def __init__(self, sock, debuglevel=0, strict=0, method=None):
         if method: # the httplib in python 2.3 uses the method arg
-            http.client.HTTPResponse.__init__(self, sock, debuglevel, method)
+            httplib.HTTPResponse.__init__(self, sock, debuglevel, method)
         else: # 2.2 doesn't
-            http.client.HTTPResponse.__init__(self, sock, debuglevel)
+            httplib.HTTPResponse.__init__(self, sock, debuglevel)
         self.fileno = sock.fileno
+        self.code = None
         self._rbuf = b""
         self._rbufsize = 8096
         self._handler = None # inserted by the handler later
         self._host = None    # (same)
         self._url = None     # (same)
+        self._connection = None # (same)
+
+    _raw_read = httplib.HTTPResponse.read
 
-    _raw_read = http.client.HTTPResponse.read
+    def close(self):
+        if self.fp:
+            self.fp.close()
+            self.fp = None
+            if self._handler:
+                self._handler._request_closed(self, self._host,
+                                              self._connection)
 
     def close_connection(self):
+        self._handler._remove_connection(self._host, self._connection, close=1)
         self.close()
-        self._handler._remove_connection(self._host, close=1)
         
     def info(self):
-        return self.msg
+        return self.headers
 
     def geturl(self):
         return self._url
@@ -221,16 +431,16 @@
                 return s
 
         s = self._rbuf + self._raw_read(amt)
-        self._rbuf = ''
+        self._rbuf = b""
         return s
 
     def readline(self, limit=-1):
-        data = ""
-        i = self._rbuf.find(b'\n')
+        data = b""
+        i = self._rbuf.find('\n')
         while i < 0 and not (0 < limit <= len(self._rbuf)):
             new = self._raw_read(self._rbufsize)
             if not new: break
-            i = new.find(b'\n')
+            i = new.find('\n')
             if i >= 0: i = i + len(self._rbuf)
             self._rbuf = self._rbuf + new
         if i < 0: i = len(self._rbuf)
@@ -252,9 +462,12 @@
         return list
 
 
-class HTTPConnection(http.client.HTTPConnection):
+class HTTPConnection(httplib.HTTPConnection):
     # use the modified response class
     response_class = HTTPResponse
+
+class HTTPSConnection(httplib.HTTPSConnection):
+    response_class = HTTPResponse
     
 #########################################################################
 #####   TEST FUNCTIONS
@@ -264,85 +477,86 @@
     global HANDLE_ERRORS
     orig = HANDLE_ERRORS
     keepalive_handler = HTTPHandler()
-    opener = urllib.request.build_opener(keepalive_handler)
-    urllib.request.install_opener(opener)
+    opener = urllib2.build_opener(keepalive_handler)
+    urllib2.install_opener(opener)
     pos = {0: 'off', 1: 'on'}
     for i in (0, 1):
-        print("  fancy error handling %s (HANDLE_ERRORS = %i)" % (pos[i], i))
+        print "  fancy error handling %s (HANDLE_ERRORS = %i)" % (pos[i], i)
         HANDLE_ERRORS = i
         try:
-            fo = urllib.request.urlopen(url)
+            fo = urllib2.urlopen(url)
             foo = fo.read()
             fo.close()
             try: status, reason = fo.status, fo.reason
             except AttributeError: status, reason = None, None
-        except IOError as e:
-            print("  EXCEPTION: %s" % e)
+        except IOError, e:
+            print "  EXCEPTION: %s" % e
             raise
         else:
-            print("  status = %s, reason = %s" % (status, reason))
+            print "  status = %s, reason = %s" % (status, reason)
     HANDLE_ERRORS = orig
     hosts = keepalive_handler.open_connections()
-    print("open connections:", ' '.join(hosts))
+    print "open connections:", hosts
     keepalive_handler.close_all()
 
 def continuity(url):
-    from hashlib import md5
+    import md5
     format = '%25s: %s'
     
     # first fetch the file with the normal http handler
-    opener = urllib.request.build_opener()
-    urllib.request.install_opener(opener)
-    fo = urllib.request.urlopen(url)
+    opener = urllib2.build_opener()
+    urllib2.install_opener(opener)
+    fo = urllib2.urlopen(url)
     foo = fo.read()
     fo.close()
-    m = md5(foo)
-    print(format % ('normal urllib', m.hexdigest()))
+    m = md5.new(foo)
+    print format % ('normal urllib', m.hexdigest())
 
     # now install the keepalive handler and try again
-    opener = urllib.request.build_opener(HTTPHandler())
-    urllib.request.install_opener(opener)
+    opener = urllib2.build_opener(HTTPHandler())
+    urllib2.install_opener(opener)
 
-    fo = urllib.request.urlopen(url)
+    fo = urllib2.urlopen(url)
     foo = fo.read()
     fo.close()
-    m = md5(foo)
-    print(format % ('keepalive read', m.hexdigest()))
+    m = md5.new(foo)
+    print format % ('keepalive read', m.hexdigest())
 
-    fo = urllib.request.urlopen(url)
-    foo = b''
+    fo = urllib2.urlopen(url)
+    foo = ''
     while 1:
         f = fo.readline()
         if f: foo = foo + f
         else: break
     fo.close()
-    m = md5(foo)
-    print(format % ('keepalive readline', m.hexdigest()))
+    m = md5.new(foo)
+    print format % ('keepalive readline', m.hexdigest())
 
 def comp(N, url):
-    print('  making %i connections to:\n  %s' % (N, url))
+    print '  making %i connections to:\n  %s' % (N, url)
 
     sys.stdout.write('  first using the normal urllib handlers')
     # first use normal opener
-    opener = urllib.request.build_opener()
-    urllib.request.install_opener(opener)
+    opener = urllib2.build_opener()
+    urllib2.install_opener(opener)
     t1 = fetch(N, url)
-    print('  TIME: %.3f s' % t1)
+    print '  TIME: %.3f s' % t1
 
     sys.stdout.write('  now using the keepalive handler       ')
     # now install the keepalive handler and try again
-    opener = urllib.request.build_opener(HTTPHandler())
-    urllib.request.install_opener(opener)
+    opener = urllib2.build_opener(HTTPHandler())
+    urllib2.install_opener(opener)
     t2 = fetch(N, url)
-    print('  TIME: %.3f s' % t2)
-    print('  improvement factor: %.2f' % (t1/t2, ))
+    print '  TIME: %.3f s' % t2
+    print '  improvement factor: %.2f' % (t1/t2, )
     
 def fetch(N, url, delay=0):
+    import time
     lens = []
     starttime = time.time()
     for i in range(N):
         if delay and i > 0: time.sleep(delay)
-        fo = urllib.request.urlopen(url)
+        fo = urllib2.urlopen(url)
         foo = fo.read()
         fo.close()
         lens.append(len(foo))
@@ -352,22 +566,59 @@
     for i in lens[1:]:
         j = j + 1
         if not i == lens[0]:
-            print("WARNING: inconsistent length on read %i: %i" % (j, i))
+            print "WARNING: inconsistent length on read %i: %i" % (j, i)
 
     return diff
 
+def test_timeout(url):
+    global DEBUG
+    dbbackup = DEBUG
+    class FakeLogger:
+        def debug(self, msg, *args): print msg % args
+        info = warning = error = debug
+    DEBUG = FakeLogger()
+    print "  fetching the file to establish a connection"
+    fo = urllib2.urlopen(url)
+    data1 = fo.read()
+    fo.close()
+ 
+    i = 20
+    print "  waiting %i seconds for the server to close the connection" % i
+    while i > 0:
+        sys.stdout.write('\r  %2i' % i)
+        sys.stdout.flush()
+        time.sleep(1)
+        i -= 1
+    sys.stderr.write('\r')
+
+    print "  fetching the file a second time"
+    fo = urllib2.urlopen(url)
+    data2 = fo.read()
+    fo.close()
+
+    if data1 == data2:
+        print '  data are identical'
+    else:
+        print '  ERROR: DATA DIFFER'
+
+    DEBUG = dbbackup
+
+    
 def test(url, N=10):
-    print("checking error hander (do this on a non-200)")
+    print "checking error hander (do this on a non-200)"
     try: error_handler(url)
-    except IOError as e:
-        print("exiting - exception will prevent further tests")
+    except IOError, e:
+        print "exiting - exception will prevent further tests"
         sys.exit()
-    print()
-    print("performing continuity test (making sure stuff isn't corrupted)")
+    print
+    print "performing continuity test (making sure stuff isn't corrupted)"
     continuity(url)
-    print()
-    print("performing speed comparison")
+    print
+    print "performing speed comparison"
     comp(N, url)
+    print
+    print "performing dropped-connection check"
+    test_timeout(url)
     
 if __name__ == '__main__':
     import time
@@ -376,6 +627,7 @@
         N = int(sys.argv[1])
         url = sys.argv[2]
     except:
-        print("%s <integer> <url>" % sys.argv[0])
+        print "%s <integer> <url>" % sys.argv[0]
     else:
         test(url, N)
+
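For context, the hunks above revert the test harness from `urllib.request` back to the Python 2 `urllib2` API (the package relies on `use_2to3` for Python 3). The installation pattern the tests exercise looks like the sketch below, shown here with the stock `urllib.request.HTTPHandler` since the keepalive handler itself is defined elsewhere in the package (treating the `keepalive` import as an assumption):

```python
import urllib.request

# The patched test code does the equivalent of this with keepalive's own
# HTTPHandler; we substitute the standard-library handler so the sketch is
# self-contained. install_opener() makes the opener the default for all
# subsequent urllib.request.urlopen() calls in the process.
opener = urllib.request.build_opener(urllib.request.HTTPHandler())
urllib.request.install_opener(opener)
```

With keepalive installed, replacing `urllib.request.HTTPHandler()` with `keepalive.HTTPHandler()` would make every later `urlopen()` call reuse persistent HTTP 1.1 connections, which is what the speed-comparison test above measures.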
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/keepalive-0.1/keepalive.egg-info/PKG-INFO new/keepalive-0.5/keepalive.egg-info/PKG-INFO
--- old/keepalive-0.1/keepalive.egg-info/PKG-INFO       2015-11-03 14:01:42.000000000 +0100
+++ new/keepalive-0.5/keepalive.egg-info/PKG-INFO       2015-11-27 18:44:33.000000000 +0100
@@ -1,18 +1,24 @@
 Metadata-Version: 1.1
 Name: keepalive
-Version: 0.1
-Summary: urllib keepalive support
+Version: 0.5
+Summary: urllib keepalive support for python
 Home-page: https://github.com/wikier/keepalive
-Author: Sergio Fernandez
+Author: Sergio Fernández
 Author-email: [email protected]
-License: GNU GPL
+License: GNU LGPL
 Download-URL: https://github.com/wikier/keepalive/releases
 Description: An HTTP handler for `urllib2` that supports HTTP 1.1 and keepalive.
 Keywords: python http urllib keepalive
 Platform: any
-Classifier: Development Status :: 3 - Alpha
+Classifier: Development Status :: 4 - Beta
 Classifier: Intended Audience :: Developers
+Classifier: Topic :: Software Development :: Libraries :: Python Modules
 Classifier: License :: OSI Approved :: Apache Software License
 Classifier: Operating System :: OS Independent
 Classifier: Programming Language :: Python :: 2
+Classifier: Programming Language :: Python :: 2.6
+Classifier: Programming Language :: Python :: 2.7
 Classifier: Programming Language :: Python :: 3
+Classifier: Programming Language :: Python :: 3.3
+Classifier: Programming Language :: Python :: 3.4
+Classifier: Programming Language :: Python :: 3.5
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/keepalive-0.1/setup.py new/keepalive-0.5/setup.py
--- old/keepalive-0.1/setup.py  2015-11-03 14:01:13.000000000 +0100
+++ new/keepalive-0.5/setup.py  2015-11-25 12:44:10.000000000 +0100
@@ -1,14 +1,54 @@
+# -*- coding: utf-8 -*-
+
+#   This library is free software; you can redistribute it and/or
+#   modify it under the terms of the GNU Lesser General Public
+#   License as published by the Free Software Foundation; either
+#   version 2.1 of the License, or (at your option) any later version.
+#
+#   This library is distributed in the hope that it will be useful,
+#   but WITHOUT ANY WARRANTY; without even the implied warranty of
+#   MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
+#   Lesser General Public License for more details.
+#
+#   You should have received a copy of the GNU Lesser General Public
+#   License along with this library; if not, write to the 
+#      Free Software Foundation, Inc., 
+#      59 Temple Place, Suite 330, 
+#      Boston, MA  02111-1307  USA
+
+# This file was part of urlgrabber, a high-level cross-protocol url-grabber
+# Copyright 2002-2004 Michael D. Stenner, Ryan Tomayko
+# Copyright 2015 Sergio Fernández
+
 from setuptools import setup
+import sys
+
+try:
+    import six
+    py3 = six.PY3
+except:
+    py3 = sys.version_info[0] >= 3
+
+# metadata
+if py3:
+    import re
+    _version_re = re.compile(r'__version__\s*=\s*"(.*)"')
+    for line in open('keepalive/__init__.py', encoding='utf-8'):
+        version_match = _version_re.match(line)
+        if version_match:
+            _version = version_match.group(1)
+else:
+    import keepalive
+    _version = keepalive.__version__
 
 setup(
       name = 'keepalive',
-      version = 0.1,
-      description = 'urllib keepalive support',
+      version = _version,
+      description = 'urllib keepalive support for python',
      long_description = 'An HTTP handler for `urllib2` that supports HTTP 1.1 and keepalive.',
-      license = 'GNU GPL',
-      author = "Benjamin Woodruff",
-      author_email = "[email protected]",
-      maintainer = 'Sergio Fernandez',
+      license = 'GNU LGPL',
+      author = "mstenner, rtomayko",
+      maintainer = 'Sergio Fernández',
       maintainer_email = '[email protected]',
       url = 'https://github.com/wikier/keepalive',
       download_url = 'https://github.com/wikier/keepalive/releases',
@@ -17,13 +57,19 @@
       requires = [],
       install_requires = [],
       classifiers =  [
-        'Development Status :: 3 - Alpha',
+        'Development Status :: 4 - Beta',
         'Intended Audience :: Developers',
+        'Topic :: Software Development :: Libraries :: Python Modules',
         'License :: OSI Approved :: Apache Software License',
         'Operating System :: OS Independent',
         'Programming Language :: Python :: 2',
+        'Programming Language :: Python :: 2.6',
+        'Programming Language :: Python :: 2.7',
         'Programming Language :: Python :: 3',
+        'Programming Language :: Python :: 3.3',
+        'Programming Language :: Python :: 3.4',
+        'Programming Language :: Python :: 3.5',
       ],
       keywords = 'python http urllib keepalive',
-      use_2to3 = True,
+      use_2to3 = True
 )
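The patched setup.py above avoids importing the package under Python 3 (its source is Python 2 until `use_2to3` runs) by scanning `keepalive/__init__.py` for the `__version__` assignment with a regex. That technique can be sketched in isolation like this (the helper name `find_version` is ours, not the upstream code's):

```python
import re

# Same pattern as the patched setup.py: match a top-level
# __version__ = "..." assignment and capture the version string,
# so the version can be read without importing the module.
_version_re = re.compile(r'__version__\s*=\s*"(.*)"')

def find_version(lines):
    """Return the first captured version string, or None if absent."""
    for line in lines:
        m = _version_re.match(line)
        if m:
            return m.group(1)
    return None

print(find_version(['# comment', '__version__ = "0.5"']))  # → 0.5
```

Reading the version textually rather than via `import keepalive` is a common workaround for packages whose source is not importable on the interpreter running setup.py.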

