I wouldn't discount the performance impact of these functions on real-world
benchmarks. Just to name a couple of examples:

  *   A 7x speed-up of np.exp and np.log results in a 2x speed-up when training
neural networks like logistic regression [1]. I would expect np.tanh to show
similar results for neural networks (see the first sketch after this list).
  *   Vectorizing even simple functions like np.maximum results in a 1.3x
speed-up of sklearn's KMeans algorithm [2] (see the second sketch after this list).
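
To make the first point concrete, here is a minimal sketch (not the benchmark
from [1]; the data sizes, learning rate, and iteration count are made up for
illustration) of a plain-NumPy logistic regression loop. np.exp is called on
every activation for the sigmoid and np.log on every prediction for the
cross-entropy loss, each iteration, so a faster np.exp/np.log shows up almost
directly in training time:

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.standard_normal((50_000, 64)).astype(np.float32)
    y = rng.integers(0, 2, size=50_000).astype(np.float32)
    w = np.zeros(64, dtype=np.float32)

    lr = 0.1
    for _ in range(50):
        z = X @ w
        p = 1.0 / (1.0 + np.exp(-z))                  # np.exp over all samples
        loss = -np.mean(y * np.log(p + 1e-7)
                        + (1 - y) * np.log(1 - p + 1e-7))  # np.log over all samples
        grad = X.T @ (p - y) / len(y)
        w -= lr * grad
    print("final loss:", loss)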
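
For the second point, a sketch of the kind of call site where np.maximum shows
up in a KMeans iteration (this is illustrative, not sklearn's actual code
path): squared Euclidean distances computed via the ||x||^2 - 2 x.c + ||c||^2
expansion can go slightly negative from floating-point cancellation, and the
elementwise clip runs over n_samples * n_clusters values:

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.standard_normal((50_000, 32)).astype(np.float32)   # samples
    C = rng.standard_normal((8, 32)).astype(np.float32)        # cluster centers

    x_sq = np.einsum('ij,ij->i', X, X)[:, None]
    c_sq = np.einsum('ij,ij->i', C, C)[None, :]
    d2 = x_sq - 2.0 * (X @ C.T) + c_sq
    d2 = np.maximum(d2, 0.0)        # elementwise clip; one np.maximum call per iteration
    labels = np.argmin(d2, axis=1)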

Raghuveer

[1] https://github.com/numpy/numpy/pull/13134
[2] https://github.com/numpy/numpy/pull/14867