New submission from Oliver Margetts <oliver.marge...@gmail.com>:

Creating large enums takes a significant amount of time. Moreover, the cost 
appears to be nonlinear in the number of entries in the enum. Locally, 
importing a single Python file and taking this to the extreme:

 1000 entries - 0.058s
10000 entries - 4.327s
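
The attached huge.py is not reproduced here, but a minimal sketch of this kind 
of measurement might look like the following (the class name and member names 
M0..Mn are placeholders; actual timings will vary by machine and Python 
version):

```python
import time
from enum import Enum

def build_enum(n):
    """Create an Enum with n members via the functional API, return elapsed seconds."""
    start = time.perf_counter()
    Enum("Huge", [(f"M{i}", i) for i in range(n)])
    return time.perf_counter() - start

for n in (1000, 10000):
    print(f"{n:>5} entries - {build_enum(n):.3f}s")
```

On an affected interpreter, the second line takes far more than ten times as 
long as the first, which is the nonlinearity being reported.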

This is partially addressed by https://bugs.python.org/issue38659, and I can 
confirm that using `@_simple_enum` does not have this problem. But that API 
appears to be intended for internal use only, so the 'happy path' for 
user-defined enums is still slow.

Note that the cost is not simply parsing the file or creating the instances; 
it is tied to the cardinality of a single enum. Creating 100 enums with 100 
entries each is far faster than creating a single 10000-entry enum.
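
That comparison can be sketched as follows (again using the functional API 
with placeholder names; the helper functions here are illustrative, not from 
the attached file):

```python
import time
from enum import Enum

def make_one(total):
    """A single enum holding every member."""
    return Enum("Big", [(f"M{i}", i) for i in range(total)])

def make_many(groups, per_group):
    """The same total member count, split across many small enums."""
    return [
        Enum(f"Small{j}", [(f"M{i}", i) for i in range(per_group)])
        for j in range(groups)
    ]

start = time.perf_counter()
make_one(10_000)
t_single = time.perf_counter() - start

start = time.perf_counter()
make_many(100, 100)      # 100 * 100 = 10,000 members in total
t_many = time.perf_counter() - start

print(f"single 10000-entry enum: {t_single:.3f}s")
print(f"100 x 100-entry enums:   {t_many:.3f}s")
```

Both calls create 10,000 members in total, so any large gap between the two 
timings points at per-enum superlinear work rather than per-member cost.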

----------
files: huge.py
messages: 403512
nosy: olliemath
priority: normal
severity: normal
status: open
title: Enum creation non-linear in the number of values
type: performance
versions: Python 3.10, Python 3.11, Python 3.7
Added file: https://bugs.python.org/file50332/huge.py

_______________________________________
Python tracker <rep...@bugs.python.org>
<https://bugs.python.org/issue45417>
_______________________________________