[jira] [Commented] (SPARK-10356) MLlib: Normalization should use absolute values

2015-08-30 Thread Sean Owen (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-10356?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14721511#comment-14721511
 ] 

Sean Owen commented on SPARK-10356:
---

Exactly. Your code does not compute a 1-norm, because you are summing the 
elements rather than their absolute values. The normalization in Spark is 
correct. To be clear, normalization makes the norm equal to 1; you are testing 
a different condition, which need not hold. 
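
To illustrate in plain Scala (a minimal sketch without Spark; {{normalizeL1}} is a hypothetical helper written here to mirror the standard L1 definition, not Spark API):

```scala
// Hypothetical helper mirroring what an L1 normalizer does: divide each
// element by the sum of ABSOLUTE values (the 1-norm).
def normalizeL1(v: Array[Double]): Array[Double] = {
  val norm = v.map(math.abs).sum
  v.map(_ / norm)
}

val normalized = normalizeL1(Array(-1.0, 2.0))
// The plain sum is 1/3, not 1 ...
println(normalized.sum)
// ... but the sum of absolute values (the 1-norm) is 1.
println(normalized.map(math.abs).sum)
```

Only the second quantity is guaranteed to be 1 after L1 normalization; the plain sum equals 1 only when all entries are non-negative.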

> MLlib: Normalization should use absolute values
> ---
>
> Key: SPARK-10356
> URL: https://issues.apache.org/jira/browse/SPARK-10356
> Project: Spark
>  Issue Type: Bug
>  Components: MLlib
>Affects Versions: 1.4.1
>Reporter: Carsten Schnober
>  Labels: easyfix
>   Original Estimate: 2h
>  Remaining Estimate: 2h
>
> The normalizer does not handle vectors with negative values properly. It can 
> be tested with the following code:
> {code}
> val normalized = new Normalizer(1.0).transform(v: Vector)
> normalized.toArray.sum == 1.0
> {code}
> This yields true if all values in Vector v are positive, but false when v 
> contains one or more negative values. This is because the values in v are 
> used directly, without applying {{abs()}}.
> This (probably) does not occur for {{p=2.0}} because the values are squared 
> and hence positive anyway.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Commented] (SPARK-10356) MLlib: Normalization should use absolute values

2015-08-30 Thread Carsten Schnober (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-10356?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14721502#comment-14721502
 ] 

Carsten Schnober commented on SPARK-10356:
--

According to 
[Wikipedia|https://en.wikipedia.org/wiki/Norm_%28mathematics%29#p-norm], each 
value's absolute value should be used to compute the norm:

{{||x||_p := (sum_i |x_i|^p)^(1/p)}}

For p = 1, this reduces to:

{{||x||_1 := sum_i |x_i|}}

I suppose the issue is thus actually located in the {{norm()}} method.
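
The definition above can be written out directly; {{pNorm}} here is a hypothetical illustration of the formula, not MLlib's {{norm()}}:

```scala
// p-norm per the definition above: ||x||_p = (sum_i |x_i|^p)^(1/p).
// Note that abs() is applied before raising to the power p.
def pNorm(x: Array[Double], p: Double): Double =
  math.pow(x.map(xi => math.pow(math.abs(xi), p)).sum, 1.0 / p)

println(pNorm(Array(-1.0, 2.0), 1.0)) // |-1| + |2| = 3
println(pNorm(Array(-1.0, 2.0), 2.0)) // sqrt(1 + 4) ~= 2.236
```

For even p the abs() is a no-op, which is why the issue would only surface for odd p such as p = 1.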





[jira] [Commented] (SPARK-10356) MLlib: Normalization should use absolute values

2015-08-30 Thread Sean Owen (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-10356?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14721494#comment-14721494
 ] 

Sean Owen commented on SPARK-10356:
---

It's not true that the sum of the elements will be 1 after this normalization. 
It's true that the sum of their absolute values will be.

{code}
scala> val v = Vectors.dense(-1.0, 2.0)
v: org.apache.spark.mllib.linalg.Vector = [-1.0,2.0]
scala> new Normalizer(1.0).transform(v)
res2: org.apache.spark.mllib.linalg.Vector = 
[-0.3333333333333333,0.6666666666666666]
{code}

That looks correct. You're not expecting the result to have all positive 
entries, right?
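
The same arithmetic can be checked without Spark (a sketch under the assumption that {{Normalizer(1.0)}} divides each element by the sum of absolute values):

```scala
val v = Array(-1.0, 2.0)
val norm1 = v.map(math.abs).sum  // 1-norm: |-1| + |2| = 3.0
val scaled = v.map(_ / norm1)    // signs are preserved: (-1/3, 2/3)
println(scaled.mkString("[", ",", "]"))
```

The negative entry stays negative; only its magnitude is rescaled so the absolute values sum to 1.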
