Github user dbtsai commented on a diff in the pull request:
https://github.com/apache/spark/pull/3462#discussion_r20970806
--- Diff: mllib/src/main/scala/org/apache/spark/mllib/linalg/Vectors.scala
---
@@ -261,6 +261,57 @@ object Vectors {
sys.error("Unsupported Breeze vector type: " + v.getClass.getName)
}
}
+
+ /**
+ * Returns the p-norm of this vector.
+ * @param vector input vector.
+ * @param p norm.
+ * @return norm in L^p^ space.
+ */
+ private[spark] def norm(vector: Vector, p: Double): Double = {
+ require(p >= 1.0)
+ val values = vector match {
+ case dv: DenseVector => dv.values
+ case sv: SparseVector => sv.values
+ case v => throw new IllegalArgumentException("Do not support vector type " + v.getClass)
+ }
+ val size = values.size
+
+ if (p == 1) {
--- End diff --
ha~ It only works if I change the type from `Double` to `Int`. See the Oracle
doc you referenced: `The Java Virtual Machine's tableswitch and lookupswitch
instructions operate only on int data. Because operations on byte, char, or
short values are internally promoted to int, a switch whose expression
evaluates to one of those types is compiled as though it evaluated to type int.`
With
```scala
def fun1(p: Int) = {
(p: @switch) match {
case 1 => 1
case 2 => 2
case _ => p
}
}
```
I got
```
public fun1(I)I
L0
LINENUMBER 147 L0
ILOAD 1
ISTORE 2
ILOAD 2
TABLESWITCH
1: L1
2: L2
default: L3
L3
LINENUMBER 150 L3
FRAME APPEND [I]
ILOAD 1
GOTO L4
L2
LINENUMBER 149 L2
FRAME SAME
ICONST_2
GOTO L4
L1
LINENUMBER 148 L1
FRAME SAME
ICONST_1
L4
LINENUMBER 147 L4
FRAME SAME1 I
IRETURN
L5
LOCALVARIABLE this Lorg/apache/spark/mllib/stat/Test$; L0 L5 0
LOCALVARIABLE p I L0 L5 1
MAXSTACK = 1
MAXLOCALS = 3
```
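For comparison, the same restriction is visible directly in Java: `switch` only accepts int-promotable selectors, so an `int` selector can compile to `TABLESWITCH`, while a `double` selector is rejected by `javac` outright and has to be written as an if/else chain, which is why the `p == 1` comparison above stays as plain branches. A minimal sketch (not from the PR; the class and method names here are hypothetical):

```java
// Demonstrates that Java's switch works on int (eligible for TABLESWITCH),
// while a double selector is not allowed at all and must use if/else.
public class SwitchDemo {
    // Same shape as the Scala fun1 example: an int selector with
    // consecutive cases, which javac can compile to TABLESWITCH.
    static int fun1(int p) {
        switch (p) {
            case 1:  return 1;
            case 2:  return 2;
            default: return p;
        }
    }

    // `switch (p)` on a double does not compile, so a Double-typed
    // parameter like the norm's `p` needs explicit comparisons instead.
    static double dispatch(double p) {
        if (p == 1.0) {
            return 1.0;
        } else if (p == 2.0) {
            return 2.0;
        } else {
            return p;
        }
    }

    public static void main(String[] args) {
        System.out.println(fun1(2));       // 2
        System.out.println(dispatch(3.0)); // 3.0
    }
}
```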