** Description changed:

  Ubuntu version: Ubuntu 14.04.1 LTS
  python-urllib3 version: 1.7.1-1build1
  
  Steps to reproduce:
  
  1. set up an http proxy
  2. configure a ProxyManager to use said http proxy
- 3. make successive GET requests to https://pypi.python.org/ 
+ 3. make successive GET requests to https://pypi.python.org/
  
  example script: https://gist.github.com/stratoukos/7545c5c909fa9b5d1cfb
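For reference, a minimal sketch of what the linked gist does (the proxy URL and request count here are illustrative assumptions; the exact script is at the gist above):

```python
# Minimal sketch of the reproduction (illustrative; see the gist for the
# exact script). Assumes a local HTTP proxy, e.g. Squid on localhost:3128.
import urllib3

def make_proxied_pool(proxy_url="http://localhost:3128/"):
    # ProxyManager routes every request through proxy_url; HTTPS requests
    # are tunnelled through the proxy with CONNECT.
    return urllib3.ProxyManager(proxy_url)

if __name__ == "__main__":
    pool = make_proxied_pool()
    # With the buggy package, requests stop going through the proxy once
    # the server drops the connection (around request 24 against PyPI).
    for i in range(50):
        r = pool.request("GET", "https://pypi.python.org/")
        print(i, r.status)
```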
  
  What happens:
  
  urllib3 stops using the http proxy after the connection is dropped
  (after 24 requests in my testing with pypi; other people have seen
  different numbers)
  
  What I expected to happen:
  
  urllib3 should always use the proxy
  
  Other Info:
  
  This has been fixed in commit 1c30a1f3 of urllib3 and included in its
  1.8.3 release. This bug also affects pip and requests as reported here:
  https://github.com/pypa/pip/issues/1805
  
  I really hope the bugfix will be backported, since pip is currently
  unusable behind an outgoing firewall on 14.04
+ 
+ [Impact]
+ urllib3 stops using the proxy after a connection is dropped, making users of
+ python-urllib3 (such as pip) that are behind a firewall unable to connect to
+ external sites.
+ 
+ [Test Case]
+ 1. Start a trusty VM
+ 2. Get the test script ($ wget https://gist.githubusercontent.com/stratoukos/7545c5c909fa9b5d1cfb/raw/456381dff95d503818d35c393e71ec0272ab08d3/gistfile1.py -O test.py)
+ 3. Install Squid ($ apt-get install squid)
+ 4. Install python-urllib3 if it's not installed ($ apt-get install python-urllib3)
+ 5. Block outgoing connections (
+     $ sudo iptables -A OUTPUT -m owner --uid-owner root -j ACCEPT
+     $ sudo iptables -A OUTPUT -m owner --uid-owner proxy -j ACCEPT
+     $ sudo iptables -A OUTPUT -p tcp --dport 80 -j DROP
+     $ sudo iptables -A OUTPUT -p tcp --dport 443 -j DROP
+ )
+ 6. Run the test script ($ python test.py)
+ 7. In another terminal, tail the squid log ($ sudo tailf /var/log/squid3/access.log)
+ 
+ With python-urllib3 1.7.1-1build1, one sees a connection timeout after 24 or
+ so requests, while with the backported package one sees hits in the proxy
+ log, meaning requests keep going through the proxy after a reconnect.
+ 
+ [Regression Potential]
+ 
+ The fix was released in 1.8.3 and current upstream is at 1.9.1, so the
+ potential for the upstream fix itself to be wrong is minimal. That said,
+ current upstream moved some modules around, so the backport recreates the
+ intent of the patch in the older codebase available on Trusty, making the
+ minimal number of changes necessary to get things working.
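For context, the class of bug the patch addresses can be sketched in stdlib terms (this is an illustrative analogy, not urllib3's actual code): tunnel setup done once for a pool is lost when a dropped connection is replaced, unless it is re-applied on every newly created connection.

```python
# Illustrative stdlib-only sketch (not urllib3's code) of the fix's intent:
# configure the CONNECT tunnel on every new connection, so that a reconnect
# after a dropped connection still goes through the proxy.
import http.client

def new_tunnelled_conn(proxy_host, proxy_port, dest_host, dest_port=443):
    # Open a connection to the proxy, then ask it to tunnel (CONNECT) to
    # the real destination; TLS is negotiated through the tunnel.
    conn = http.client.HTTPSConnection(proxy_host, proxy_port)
    conn.set_tunnel(dest_host, dest_port)
    return conn
```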

-- 
You received this bug notification because you are a member of Ubuntu
Bugs, which is subscribed to Ubuntu.
https://bugs.launchpad.net/bugs/1412545

Title:
  proxy isn't used after a dropped connection

To manage notifications about this bug go to:
https://bugs.launchpad.net/ubuntu/+source/python-urllib3/+bug/1412545/+subscriptions

-- 
ubuntu-bugs mailing list
[email protected]
https://lists.ubuntu.com/mailman/listinfo/ubuntu-bugs
