zhipeng93 commented on a change in pull request #73:
URL: https://github.com/apache/flink-ml/pull/73#discussion_r834119649
##########
File path: flink-ml-core/src/main/java/org/apache/flink/ml/linalg/BLAS.java
##########
@@ -32,9 +32,31 @@ public static double asum(DenseVector x) {
}
/** y += a * x . */
- public static void axpy(double a, DenseVector x, DenseVector y) {
+ public static void axpy(double a, Vector x, DenseVector y) {
        Preconditions.checkArgument(x.size() == y.size(), "Vector size mismatched.");
- JAVA_BLAS.daxpy(x.size(), a, x.values, 1, y.values, 1);
+ if (x instanceof SparseVector) {
+ axpy(a, (SparseVector) x, y);
+ } else {
+ axpy(a, (DenseVector) x, y);
+ }
+ }
+
+ /** Computes the hadamard product of the two vectors (y = y \hdot x). */
+ public static void hDot(Vector x, Vector y) {
Review comment:
Good question! There has also been a discussion of why BLAS does not
provide a Hadamard product [1] --- mostly concluding that there are no
performance gains to be had.
However, I'd like to point out that introducing `hDot` here is simply for
convenience: I need to use it twice.
By the way, with this method we could also replace the corresponding logic
in MinMaxScaler, which would certainly simplify that implementation.
What do you think?
[1] https://github.com/xianyi/OpenBLAS/issues/1083
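As a rough illustration of what such a convenience method computes, here is a plain-array sketch of an in-place Hadamard product (the class and array-based signature are hypothetical, not the actual flink-ml `BLAS.hDot` implementation, which operates on `Vector` types):

```java
/** Hypothetical sketch: in-place element-wise (Hadamard) product, y[i] *= x[i]. */
public class HDotSketch {

    public static void hDot(double[] x, double[] y) {
        if (x.length != y.length) {
            throw new IllegalArgumentException("Vector size mismatched.");
        }
        // A simple loop suffices; as noted in [1], BLAS offers no faster primitive.
        for (int i = 0; i < x.length; i++) {
            y[i] *= x[i];
        }
    }

    public static void main(String[] args) {
        double[] x = {1.0, 2.0, 3.0};
        double[] y = {4.0, 5.0, 6.0};
        hDot(x, y);
        // y is now {4.0, 10.0, 18.0}
        System.out.println(java.util.Arrays.toString(y));
    }
}
```

In MinMaxScaler-style rescaling, the per-feature multiply step is exactly this pattern, which is why having `hDot` available would simplify that code path.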
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]