On Jan 26, 2008 11:14 PM, Raymond Hettinger <[EMAIL PROTECTED]> wrote:
> The idea that programmers are confused by int(3.7)-->3 may not be nuts, but
> it doesn't match any experience I've had with any programmer, ever.
In C, I'm pretty sure I've seen people write (long)x where they'd have been better off with lround(x). They know that the cast truncates, and generally that they actually wanted to round, but the type they wanted to get to was foremost in their mind, so they didn't stop to think about it and write what they really wanted. It's not that they're confused; it's that they're accepting a default that shouldn't be a default.

Your other points seem to have been answered already, although people will disagree on how compelling the answers are, so I won't repeat them here.

--
Namasté,
Jeffrey Yasskin
_______________________________________________
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com