Sworddragon added the comment:
I have extended the benchmark a little and here are my new results:
concatenate_string():    0.037489
concatenate_bytes():     2.920202
concatenate_bytearray(): 0.157311
concatenate_string_io(): 0.035397
concatenate_bytes_io():
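The benchmark attachment itself isn't shown in the thread; a minimal sketch of what functions with these names might look like (the iteration count N is an assumption, the original count is not given):

```python
import io
import timeit

N = 10_000  # assumption: the attachment's real iteration count is unknown

def concatenate_string():
    # str += benefits from CPython's in-place resize optimization
    s = ''
    for _ in range(N):
        s += 'x'
    return s

def concatenate_bytes():
    # bytes are immutable and lack that optimization, so this is quadratic
    b = b''
    for _ in range(N):
        b += b'x'
    return b

def concatenate_bytearray():
    # bytearray is mutable, so += extends in place
    b = bytearray()
    for _ in range(N):
        b += b'x'
    return bytes(b)

def concatenate_string_io():
    buf = io.StringIO()
    for _ in range(N):
        buf.write('x')
    return buf.getvalue()

def concatenate_bytes_io():
    buf = io.BytesIO()
    for _ in range(N):
        buf.write(b'x')
    return buf.getvalue()

for func in (concatenate_string, concatenate_bytes, concatenate_bytearray,
             concatenate_string_io, concatenate_bytes_io):
    print('%s(): %f' % (func.__name__, timeit.timeit(func, number=1)))
```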
R. David Murray added the comment:
Please take these observations and questions to python-list. They aren't
really appropriate for the bug tracker. We aren't going to add the
optimization shortcut for bytes unless someone does a bunch of convincing on
python-ideas, which seems unlikely (but
Sworddragon added the comment:
> We aren't going to add the optimization shortcut for bytes
There is still the question: Why isn't this going to be optimized?
--
___
Python tracker rep...@bugs.python.org
http://bugs.python.org/issue19801
New submission from Sworddragon:
The attached testcase concatenates a string 10 times and then a bytes
object 10 times. Here is my result:
sworddragon@ubuntu:~/tmp$ ./test.py
String: 0.03165316581726074
Bytes : 0.5805566310882568
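The attached testcase isn't included in the thread; a minimal sketch of a comparable str-vs-bytes += timing (the iteration count N is an assumption, not taken from the attachment):

```python
import time

N = 100_000  # assumption: chosen so the bytes case stays tolerable

def bench(empty, piece):
    """Time N repeated += concatenations starting from an empty object."""
    start = time.time()
    acc = empty
    for _ in range(N):
        acc += piece
    return time.time() - start

# str += is fast because CPython can often resize the string in place;
# bytes += must copy the whole accumulated buffer on every iteration.
print('String:', bench('', 'x'))
print('Bytes :', bench(b'', b'x'))
```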
--
components: Benchmarks
R. David Murray added the comment:
It is definitely not a good idea to rely on that optimization of += for
strings. Obviously bytes doesn't have the same optimization. (Strings didn't
either for a while in Python 3, and there was some controversy around adding
it back exactly because one
Changes by R. David Murray rdmur...@bitdance.com:
--
type: behavior -> performance
___
Antoine Pitrou added the comment:
Indeed. If you want to concatenate a lot of bytes objects efficiently, there
are three solutions:
- concatenate to a bytearray
- write to a io.BytesIO object
- use b''.join to concatenate all objects at once
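The three approaches above can be sketched as follows (the sample `chunks` data is made up for illustration):

```python
import io

# Hypothetical input: many small bytes objects to combine into one.
chunks = [b'chunk-%d' % i for i in range(1000)]

# 1. Concatenate to a bytearray: mutable, so += extends in place.
buf = bytearray()
for chunk in chunks:
    buf += chunk
result_bytearray = bytes(buf)

# 2. Write to an io.BytesIO object, then read the whole buffer back.
stream = io.BytesIO()
for chunk in chunks:
    stream.write(chunk)
result_bytesio = stream.getvalue()

# 3. Join all objects at once -- a single allocation, usually the
#    fastest and clearest option when all pieces are already in hand.
result_join = b''.join(chunks)

assert result_bytearray == result_bytesio == result_join
```

All three avoid the quadratic behavior of repeated `bytes +=`, which copies the entire accumulated buffer on every iteration.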
--
nosy: +pitrou
Changes by Benjamin Peterson benja...@python.org:
--
resolution:  -> wont fix
status: open -> closed
___