On Tue, Jun 16, 2020 at 9:30 PM Victor Stinner <vstin...@python.org> wrote:
>
> Le mar. 16 juin 2020 à 10:42, Inada Naoki <songofaca...@gmail.com> a écrit :
> > Hmm, is there any chance to add a DeprecationWarning in Python 3.9?
>
> In my experience, more and more projects are running their test suite
> with -Werror, which is a good thing. Introducing a new warning is
> likely to "break" many of these projects. For example, in Fedora, we
> run the test suite when we build a package. If a test fails, the
> package build fails and we have to decide to either ignore the failing
> tests (not good) or find a solution to repair the tests (update the
> code base to new C API functions).
>

But Python 3.9 is still in the beta phase, so we have enough time to get feedback.
If the new warning turns out to be unacceptable breakage, we can remove it in the RC phase.
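To make the breakage concrete: with `-W error` (or an equivalent `filterwarnings = error` setting in a test configuration), any new DeprecationWarning becomes an exception. A minimal sketch (the warning text here is illustrative, not the actual message CPython would emit):

```python
import warnings

def calls_deprecated_api():
    # Stand-in for a C extension that triggers a DeprecationWarning
    # (e.g. by calling a deprecated PyUnicode_* function).
    warnings.warn("PyUnicode_AsUnicode() is deprecated", DeprecationWarning)
    return "ok"

# With the "default" filter, the warning is at most printed; the call succeeds.
with warnings.catch_warnings():
    warnings.simplefilter("default")
    assert calls_deprecated_api() == "ok"

# Under -W error (simulated here with simplefilter), the same call raises,
# which is what makes a previously passing test suite fail.
with warnings.catch_warnings():
    warnings.simplefilter("error")
    try:
        calls_deprecated_api()
        raised = False
    except DeprecationWarning:
        raised = True
assert raised
```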

>
> > It is an interesting idea, but I think it is too complex.
> > Fixing all packages in the PyPI would be a better approach.
>
> It's not the first time we have had to make such a decision. "Fixing
> all PyPI packages" is not possible. Python core developers' resources
> are limited, so we can only port a very small number of packages.
> Breaking packages on purpose forces developers to upgrade their code
> base, so it should work better than deprecation warnings. But it is
> likely to make some people unhappy.
>

OK, my terminology was wrong.  Not all packages, but almost all actively
maintained packages.

* This change doesn't affect pure Python packages.
* Most of the rest use Cython.  Since I have already reported an issue
  to Cython, regenerating with a new Cython release will fix them.
* Most of the rest already support PEP 393.

So I expect only a few percent of active packages will be affected.

This is a list of uses of deprecated APIs in the top 4000 packages,
excluding PyArg_ParseTuple(AndKeywords).
Files generated by Cython are excluded.  But many of the remaining hits
are still false positives (e.g. code guarded by `#if PY2`).
https://github.com/methane/notes/blob/master/2020/wchar-cache/deprecated-use
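For reference, such a scan can be sketched as follows; the API subset and the `#if PY2` heuristic are illustrative assumptions, not the actual script used to generate the list:

```python
import re

# A few of the deprecated wchar_t-cache APIs (illustrative subset).
DEPRECATED = [
    "PyUnicode_AsUnicode",
    "PyUnicode_AsUnicodeAndSize",
    "PyUnicode_FromUnicode",
    "Py_UNICODE",
]
PATTERN = re.compile("|".join(re.escape(name) for name in DEPRECATED))

def scan(source: str):
    """Yield (line_number, line) for lines mentioning a deprecated API,
    skipping lines inside a crude `#if PY2`-style guard: a common source
    of false positives, since that code never compiles on Python 3."""
    in_py2_block = False
    for lineno, line in enumerate(source.splitlines(), 1):
        stripped = line.strip()
        if stripped.startswith("#if") and "PY2" in stripped:
            in_py2_block = True
        elif in_py2_block and stripped.startswith(("#else", "#endif")):
            in_py2_block = False
        elif not in_py2_block and PATTERN.search(line):
            yield lineno, stripped

example = """\
#if PY2
    Py_UNICODE *buf = PyUnicode_AsUnicode(obj);  /* false positive */
#endif
    wchar_t *w = PyUnicode_AsUnicodeAndSize(obj, &size);  /* real hit */
"""
hits = list(scan(example))
print(hits)  # only the line outside the #if PY2 block is reported
```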

I have already filed some issues and sent some pull requests since I
created this thread.

> Having a separate hash table would avoid breaking many PyPI
> packages by continuing to provide backward compatibility. We could
> even consider disabling it by default, but providing a temporary
> option to opt in to backward compatibility. For example, "python3.10
> -X unicode_compat".
>
> I proposed sys.set_python_compat_version(version) in the rejected PEP
> 606, but this PEP was too broad:
> https://www.python.org/dev/peps/pep-0606/
>
> The question is whether it's worth paying the maintenance burden on
> the Python side, or whether to drop backward compatibility if it's
> "too expensive".
>
> I understood that your first motivation is to reduce the
> PyASCIIObject structure size. Using a hash table, the overhead would
> only be paid by users of the deprecated functions. But it requires
> keeping the code and so continuing to maintain it. Maybe I missed
> some drawbacks.
>

Memory usage is the most important motivation.  But the runtime cost of
PyUnicode_READY and the maintenance cost of legacy unicode matter too.
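The memory point can be seen from Python directly: since PEP 393, compact ASCII strings cost one byte per character, and the wchar_t* cache being discussed adds a per-object pointer on top of that. A quick sketch (exact totals vary by CPython version and platform, but the per-character growth holds):

```python
import sys

# Since PEP 393, CPython stores strings compactly: ASCII text uses
# 1 byte per character, so size grows linearly with length.
ascii_1000 = sys.getsizeof("a" * 1000)
ascii_2000 = sys.getsizeof("a" * 2000)
print(ascii_2000 - ascii_1000)  # 1000: one byte per extra ASCII char

# Non-ASCII text needs wider code units (2 or 4 bytes per character),
# so the same length costs more memory.
wide_1000 = sys.getsizeof("\u03b1" * 1000)
assert wide_1000 > ascii_1000
```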

I will reconsider your idea.  But I still feel that helping the many
third parties migrate is the most constructive way.

Regards,

-- 
Inada Naoki  <songofaca...@gmail.com>
_______________________________________________
Python-Dev mailing list -- python-dev@python.org
To unsubscribe send an email to python-dev-le...@python.org
https://mail.python.org/mailman3/lists/python-dev.python.org/
Message archived at 
https://mail.python.org/archives/list/python-dev@python.org/message/KBBR2KQPNKSPQIPR5UKW2ALM3QGNDBEU/
Code of Conduct: http://python.org/psf/codeofconduct/