All,

I am excited to announce the release of MyGrad 2.0.
MyGrad's primary goal is to make automatic differentiation accessible and easy to use across the NumPy ecosystem (see [1] for more detailed comments).

Source: https://github.com/rsokl/MyGrad
Docs: https://mygrad.readthedocs.io/en/latest/

MyGrad's only dependency is NumPy, and (as of version 2.0) it makes keen use of NumPy's excellent protocols for overriding functions and ufuncs. Thus you can "drop in" a MyGrad tensor into your pure-NumPy code and compute derivatives through it; a short sketch of what this looks like appears at the end of this message. Ultimately, MyGrad could be extended to bring autodiff to other array-based libraries like CuPy, Sparse, and Dask.

For full release notes see [2]. Feedback, critiques, and ideas are welcome!

Cheers,
Ryan Soklaski

[1] MyGrad is not meant to "compete" with the likes of PyTorch and JAX, which are fantastically fast and powerful autodiff libraries. Rather, its emphasis is on being lightweight and seamless to use in NumPy-centric workflows.

[2] https://mygrad.readthedocs.io/en/latest/changes.html#v2-0-0
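P.S. To illustrate the "drop in" behavior described above, here is a rough sketch of typical usage. It relies on MyGrad's documented tensor / backward / grad API together with NumPy's function and ufunc overrides; the particular expression is just an illustrative example, not code from the release itself:

    import numpy as np
    import mygrad as mg

    x = mg.tensor([1.0, 2.0, 3.0])  # a MyGrad tensor, used in place of an ndarray
    y = np.sum(np.exp(x) * 2.0)     # plain NumPy functions dispatch back to MyGrad
    y.backward()                    # back-propagate through the NumPy expressions
    print(x.grad)                   # dy/dx as a plain numpy array: 2 * exp(x)

Because the NumPy calls are overridden rather than replaced, the same expression runs unchanged on ordinary ndarrays when no MyGrad tensor is involved.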