Github user GeorgeDittmar commented on a diff in the pull request:
https://github.com/apache/spark/pull/6112#discussion_r31382733
--- Diff: mllib/src/main/scala/org/apache/spark/mllib/linalg/Vectors.scala ---
@@ -717,6 +719,49 @@ class SparseVector(
      new SparseVector(size, ii, vv)
    }
  }
+
+  override def argmax: Int = {
+    if (size == 0) {
+      -1
+    } else {
+
+      // grab first active index and value by default
+      var maxIdx = indices(0)
+      var maxValue = values(0)
+
+      foreachActive { (i, v) =>
+        if (v > maxValue) {
+          maxIdx = i
+          maxValue = v
+        }
+      }
+
+      // look for inactive values in case all active values are negative
+      if (size != values.size && maxValue <= 0) {
--- End diff --
So are you thinking of the case where we have an inactive value that's set
to something like 1? I don't think the API allows you to do that. My
understanding of this case is that we will return idx=0 if 0 is the only max
value found. It's technically correct since that active zero happens at the very
beginning of the vector. I don't think we should skip it just because someone
decided to create a sparse vector with an active zero value. I'm pretty sure I
cover this case in my unit tests, but I'll go back to the code real quick to
double check.
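To make that concrete, here is a minimal sketch of the two corner cases in
play (it assumes the argmax added by this patch; the expected results are my
reading of the behavior described above, not a guarantee about the final code):

    import org.apache.spark.mllib.linalg.SparseVector

    // An explicitly stored (active) zero at index 0 with a negative active
    // value elsewhere: 0.0 is the largest value in the vector, so argmax
    // should return 0.
    val withActiveZero = new SparseVector(3, Array(0, 2), Array(0.0, -1.0))
    println(withActiveZero.argmax) // expected: 0

    // All active values negative and some indices inactive: the implicit
    // zeros are the maximum, so the inactive-value fallback in the diff above
    // should kick in and return an inactive index (the first one, i.e. 0).
    val allNegative = new SparseVector(4, Array(1, 3), Array(-2.0, -5.0))
    println(allNegative.argmax) // expected: 0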
Also, no worries. Better to find bugs than not, right? lol.