John Machin [EMAIL PROTECTED] wrote:
1. Robustness: Both versions will crash (in the sense of an unhandled
2. Efficiency: I don't see the disk I/O inefficiency in calling
3. Don't itemise perceived flaws in other people's postings. It may
give off a hostile impression.
Anders J. Munch wrote:
Another way is the strategy of "it's easier to ask forgiveness than to
ask permission."
If you replace:
    if(not os.path.isdir(zfdir)):
        os.makedirs(zfdir)
with:
    try:
        os.makedirs(zfdir)
    except EnvironmentError:
        pass
then not only will your
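A minimal runnable sketch of the EAFP pattern from the post. The `ensure_dir` helper name is made up for illustration; note that a bare `except EnvironmentError: pass` as quoted also swallows real failures (permissions, read-only filesystem), so checking `errno` is a safer variant of the same idea:

```python
import errno
import os
import tempfile

def ensure_dir(path):
    # EAFP: attempt the creation and suppress only the
    # "already exists" error; re-raise anything else.
    try:
        os.makedirs(path)
    except EnvironmentError as e:
        if e.errno != errno.EEXIST:
            raise

base = tempfile.mkdtemp()
target = os.path.join(base, "a", "b")
ensure_dir(target)
ensure_dir(target)  # second call is a no-op instead of an error
print(os.path.isdir(target))
```

This also avoids the race in the check-then-create version: another process can create the directory between `os.path.isdir` and `os.makedirs`.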
Bulba! [EMAIL PROTECTED] wrote:
One of the posters inspired me to do profiling on my newbie script
(pasted below). After measurements I have found that the speed
of Python, at least in the area where my script works, is surprisingly
high.
Pretty good code for someone who calls himself a
On Fri, 2004-12-31 at 11:17, Jeremy Bowers wrote:
I would point out a couple of other ideas, though you may be aware of
them: Compressing all the files separately, if they are small, may greatly
reduce the final compression since similarities between the files can not
be exploited.
True;
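The point about inter-file redundancy can be demonstrated with the standard library. The sketch below (file names and contents are invented for the demonstration) compares ten small, similar "files" compressed separately, as `zipfile` compresses each member, against the same data compressed as one stream, tar-then-compress style:

```python
import io
import random
import zipfile
import zlib

# Ten small "files" sharing a common 1000-byte block, differing
# only in a short tail.
rng = random.Random(0)
common = bytes(rng.randrange(256) for _ in range(1000))
files = [("f%d.txt" % i, common + b"unique tail %d" % i)
         for i in range(10)]

# Compressed separately, as zipfile does for each archive member:
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
    for name, data in files:
        zf.writestr(name, data)
separate = len(buf.getvalue())

# Compressed as a single stream, so the repeats of `common`
# across files can be exploited:
joint = len(zlib.compress(b"".join(d for _, d in files), 9))

print(separate > joint)
```

Here the joint stream is markedly smaller, because deflate can refer back to the first copy of the shared block instead of re-encoding it per file.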
On Fri, 31 Dec 2004 13:19:44 +0100, Reinhold Birkenfeld
[EMAIL PROTECTED] wrote:
True; however, it's my understanding that compressing individual files
also means that in the case of damage to the archive it is possible to
recover the files after the damaged file. This cannot be guaranteed
Bulba! [EMAIL PROTECTED] writes:
The only thing I'm missing in this picture is knowing whether my script
could be further optimised (not that I actually need better
performance, I'm just curious what the possible solutions could be).
Any takers among the experienced guys?
There's another
One of the posters inspired me to do profiling on my newbie script
(pasted below). After measurements I have found that the speed
of Python, at least in the area where my script works, is surprisingly
high.
This is the experiment: a script recreates the folder hierarchy
somewhere else and stores
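The profiling step described above can be reproduced with the standard-library profiler. A minimal sketch, where `build_archive_stub` is a made-up stand-in for the script's real work (walking a tree and zipping files), not the poster's actual code:

```python
import cProfile
import io
import pstats

def build_archive_stub():
    # Hypothetical placeholder for the script's workload.
    return sum(i * i for i in range(200000))

profiler = cProfile.Profile()
profiler.enable()
build_archive_stub()
profiler.disable()

# Report the three most expensive calls by cumulative time.
report = io.StringIO()
pstats.Stats(profiler, stream=report) \
    .sort_stats("cumulative").print_stats(3)
print(report.getvalue())
```

Running a whole script under the profiler (`python -m cProfile myscript.py`) gives the same table without modifying the code.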