On 4/9/2013 7:41 AM, cab...@gmail.com wrote:
> Hi,
> I have been using Java/Perl professionally for many years and have recently
> been trying to learn Python 3. As my first program, I tried writing a class
> for a small project, and I am having a really hard time understanding
> exception handling in urllib and in Python in general...
> Basically, what I want to do is very simple,
Very funny ;-). What you are trying to do, as your first project, is
interact with the large, multi-layered, non-deterministic monster known
as the Internet, with timeout handling, through multiple layers of library
code. When it comes to exception handling, this is about the most
complex thing you can do.
> try to fetch something with "urllib.request.urlopen(request)", and:
> - If the request times out or the connection is reset, retry n times
> - If it still fails, return an error
> - If it works, return the content.
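For concreteness, here is a minimal sketch of that retry flow. The names
fetch_with_retry and MAX_RETRIES, and the choice of which exceptions count as
retryable, are illustrative assumptions rather than anything urllib prescribes:

import socket
import urllib.error
import urllib.request

MAX_RETRIES = 3   # illustrative; pick whatever n you need

# Which failures are worth retrying is an assumption here; adjust the
# tuple to match the failures you actually see in practice.
RETRYABLE = (socket.timeout, ConnectionResetError, urllib.error.URLError)

def fetch_with_retry(request):
    """Return (content, None) on success, or (None, error_message) after n failed tries."""
    last_error = None
    for attempt in range(MAX_RETRIES):
        try:
            with urllib.request.urlopen(request, timeout=10) as response:
                return response.read(), None
        except RETRYABLE as exc:
            last_error = exc              # remember the failure and try again
    return None, "giving up after {} attempts: {}".format(MAX_RETRIES, last_error)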
> But this simple requirement became a nightmare for me. I am really confused
> about how I should be checking this, because:
> - When the connection times out, I sometimes get a URLError with its "reason"
>   attribute set to a socket.timeout, and checking
>   isinstance(exception.reason, socket.timeout) works fine
> - But sometimes I get a socket.timeout exception directly, and it has no
>   "reason" attribute, so the check above fails.
If you are curious why you get different exceptions for seemingly the same
problem, you can look at the printed traceback to see where the different
exceptions come from. Either don't catch the exceptions, re-raise them, or
explicitly grab the traceback (from sys.exc_info(), I believe).
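For example, while experimenting you can print the full traceback from inside
the handler and still re-raise, so the failure stays visible; this only needs
the standard traceback module (the example.com URL is just a stand-in):

import traceback
import urllib.request

request = urllib.request.Request("http://example.com/")   # stand-in for your request

try:
    with urllib.request.urlopen(request, timeout=5) as response:
        content = response.read()
except Exception:
    traceback.print_exc()   # shows exactly which layer raised which exception
    raise                   # re-raise so the error is not silently swallowed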
> - Connection reset is a totally different exception
> - Not to mention, some exceptions have msg / reason / errno fields but some
>   don't, so there is no way of knowing the exception details unless you check
>   them one by one. The only common thing I could find was to call __str__().
The system is probably a bit more ragged than it might be if completely
re-designed from scratch.
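On the point above about inconsistent msg/reason/errno attributes: getattr with
a default lets you probe for them without knowing which exception class you got,
and str(exc) is always available as a last resort. A small illustration
(describe is a made-up helper):

def describe(exc):
    """Summarise an exception from whatever attributes it happens to carry."""
    parts = [type(exc).__name__]
    for name in ("reason", "errno", "msg"):
        value = getattr(exc, name, None)   # None when the attribute is absent
        if value is not None:
            parts.append("{}={!r}".format(name, value))
    parts.append(str(exc))                 # __str__() always works, as noted above
    return " ".join(parts)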
> - Since there are too many possible exceptions, you need to catch
>   BaseException (I received URLError, socket.timeout, ConnectionRefusedError,
>   ConnectionResetError, and BadStatusLine, and none of them share a common
>   parent). And catching the top-level exception is not a good thing.
You are right, catching BaseException is bad. In particular, it will
catch KeyboardInterrupt from a user trying to stop the process. It is
also unnecessary as all the exceptions you want to catch are derived
from Exception, which itself is derived from BaseException.
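In practice a tuple of more specific classes is usually enough. In Python 3.3,
URLError, socket.timeout, ConnectionRefusedError and ConnectionResetError are
all subclasses of OSError, and BadStatusLine is an http.client.HTTPException,
so a pair like the one below should cover everything listed above (treat the
exact tuple as an assumption to verify against your own traffic):

import http.client
import urllib.request

# OSError covers URLError, socket.timeout and the Connection*Error classes
# in Python 3.3; HTTPException covers BadStatusLine and friends.
NETWORK_ERRORS = (OSError, http.client.HTTPException)

try:
    with urllib.request.urlopen("http://example.com/", timeout=5) as response:
        content = response.read()
except NETWORK_ERRORS as exc:
    print("request failed:", exc)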
> So I ended up writing the following, but from everything I know, this looks
> really ugly and wrong?
> try:
>     response = urllib.request.urlopen(request)
>     content = response.read()
> except BaseException as ue:

except Exception as ue:

>     if (isinstance(ue, socket.timeout) or (hasattr(ue, "reason") and
>             isinstance(ue.reason, socket.timeout)) or isinstance(ue, ConnectionResetError)):
>         print("REQUEST TIMED OUT")
>
> or, something like:
>
> except:

except Exception:

>     (a1, a2, a3) = sys.exc_info()
>     errorString = a2.__str__()
>     if ((errorString.find("Connection reset by peer") >= 0) or
>             (errorString.find("error timed out") >= 0)):
--
Terry Jan Reedy
--
http://mail.python.org/mailman/listinfo/python-list