On 18/11/2021 19:07, Stefan van der Walt wrote:
If we do this, we should probably go through each of the 200+ open PRs (or, at
least, the non-conflicted ones), apply the formatter, and then squash each PR
into a single commit. We could do that by script.
We had to deal with this issue in
Yes, there has also been a lot of such spam on other @python.org mailing
lists lately.
I sent a message to postmaster / python.org about it earlier today. Will
get back once there is a response.
--
Roman
On 29/09/2021 11:28, Andras Deak wrote:
On Wed, Sep 29, 2021 at 11:15 AM Ralf Gommers
For the first benchmark, A.dot(B) with A real and B complex is apparently a
known performance issue: https://github.com/numpy/numpy/issues/10468
In general, it might be worth trying different BLAS backends. For
instance, if you install numpy from conda-forge you should be able to
switch to a different BLAS implementation.
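A minimal sketch of the real-times-complex slowdown mentioned above, together with a common workaround: splitting the complex operand so both products go through the fast real-valued BLAS path. The array sizes and the workaround itself are illustrative assumptions on my part, not something prescribed in the linked issue.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((200, 200))                                   # real matrix
B = rng.standard_normal((200, 200)) + 1j * rng.standard_normal((200, 200))  # complex matrix

direct = A.dot(B)                            # mixed real/complex dot (the slow path)
split = A.dot(B.real) + 1j * A.dot(B.imag)   # two real GEMM calls instead

# Both give the same result up to floating-point rounding.
assert np.allclose(direct, split)
```

Whether the split version is actually faster depends on the BLAS backend in use, so it is worth benchmarking both on your own setup.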
One issue with adding a tolerance to np.unique for floats: say you have
[0, 0.1, 0.2, 0.3, 0.4, 0.5] with atol=0.15.
Should this return a single element or multiple ones? On one side, each
consecutive float is closer than the tolerance to the next one, but the
first one and the last one are much further apart than the tolerance.
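A small illustration of the chaining ambiguity described above: every neighbouring pair is within atol, yet the endpoints are not, so a tolerance-aware np.unique has no single obvious answer. The array and tolerance are the ones from the example; the diff-based check is just my way of making the ambiguity concrete.

```python
import numpy as np

vals = np.array([0.0, 0.1, 0.2, 0.3, 0.4, 0.5])
atol = 0.15

# Every adjacent pair differs by ~0.1, which is within the tolerance...
pairwise_close = np.diff(vals) < atol

# ...but the endpoints differ by 0.5, far outside it.
endpoints_close = abs(vals[-1] - vals[0]) < atol

print(pairwise_close.all())   # -> True: chaining merges everything into one group
print(endpoints_close)        # -> False: yet the extremes are clearly distinct
```

So a "merge everything connected by the tolerance" rule collapses the whole array to one value, while a "within atol of a representative" rule keeps several, and neither behaviour is obviously the right default.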
To add to the previously listed projects that would benefit from this,
we are currently considering starting to use some (minimal) type
annotations in scikit-learn.
--
Roman Yurchak
On 24/03/2020 18:00, Stephan Hoyer wrote:
When we started numpy-stubs [1] a few years ago, putting type
annotations