Yhojann Aguilera added the comment:
For big files (e.g. >= 1 GB) the whole string cannot be loaded into memory; a
file stream obtained with open() has to be used instead.
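A minimal sketch of what this means in practice: csv.reader consumes a file object lazily, row by row, so the file is never held in memory as one string. The sample data below is hypothetical, standing in for a large exported file:

```python
import csv
import io

# Hypothetical sample data standing in for a large CSV export;
# csv.reader pulls one line at a time from the underlying stream,
# so memory use stays constant regardless of file size.
data = io.StringIO('a;b;"c;d"\r\n1;2;3\r\n')

rows = []
for row in csv.reader(data, delimiter=';', quotechar='"'):
    rows.append(row)

print(rows)  # [['a', 'b', 'c;d'], ['1', '2', '3']]
```

The same pattern works with a real file opened via `open(path, newline='')` in place of the StringIO object.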
--
___
Python tracker
<https://bugs.python.org/issu
Yhojann Aguilera added the comment:
Thanks, it works fine, but why not offer the option to work in binary anyway?
The delimiters can be represented as binary values. In Python it is difficult
to autodetect the character encoding of a file.
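For what it's worth, something close to "binary mode" can already be emulated, since csv.reader accepts any iterable of strings: open the file in binary and decode each line explicitly before the reader sees it. A sketch, assuming the bytes are Latin-1 encoded (the sample data is hypothetical):

```python
import csv
import io

# Stand-in for a file opened with open(path, 'rb'); the bytes here
# are Latin-1 encoded on purpose.
raw = io.BytesIO('año;ñu\r\n'.encode('latin-1'))

# Decode each binary line under the caller's control, then hand the
# resulting string iterator to csv.reader.
decoded_lines = (line.decode('latin-1') for line in raw)
rows = list(csv.reader(decoded_lines, delimiter=';', quotechar='"'))

print(rows)  # [['año', 'ñu']]
```

This keeps the decoding step out of the text-mode layer, which is the closest the current API gets to working on raw bytes.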
New submission from Yhojann Aguilera :
Unable to parse a CSV file with a Latin ISO charset.
with open('./exported.csv', newline='') as csvFileHandler:
    csvHandler = csv.reader(csvFileHandler, delimiter=';', quotechar='"')
    for line in c
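The usual remedy is to pass an explicit encoding to open(), so Python decodes the bytes as the file was written rather than with the locale default. A sketch, assuming the file is ISO-8859-1 encoded (the file written below is a hypothetical stand-in for exported.csv):

```python
import csv
import os
import tempfile

# Write a small Latin-1 encoded CSV to stand in for exported.csv.
path = os.path.join(tempfile.mkdtemp(), 'exported.csv')
with open(path, 'wb') as f:
    f.write('año;ñu\r\n'.encode('iso-8859-1'))

# encoding= tells Python how to decode the bytes; without it, a
# UTF-8 locale default would raise UnicodeDecodeError on 0xF1.
with open(path, newline='', encoding='iso-8859-1') as csvFileHandler:
    rows = list(csv.reader(csvFileHandler, delimiter=';', quotechar='"'))

print(rows)  # [['año', 'ñu']]
```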
Yhojann Aguilera added the comment:
I would expect that when an error occurs, Python tells me what the problem is.
The abort/core error is a problem at a lower level than Python, because Python
is not able to recognize or handle the error.
The main problem is that I exceeded the maximum number of
Yhojann Aguilera added the comment:
Same problem using Python 3.6.8 on Ubuntu 18.04 LTS.
For now, I work around it with:
LD_PRELOAD=libgcc_s.so.1 python3 ...
For more details and PoCs: https://github.com/WHK102/wss/issues/2
--
nosy: +Yhojann Aguilera
versions: +Python 3.6
New submission from Yhojann Aguilera :
Functions like PyUnicode_FromString use a printf-style format for the char
array argument. Example: in a module, PyUnicode_FromString("a%22b"); interprets
the %22 as 22 blank spaces. A double quote in the module adds a backslash. PoC:
I try to send a string f