Github user mengxr commented on a diff in the pull request:

    https://github.com/apache/spark/pull/4279#discussion_r23832753
  
    --- Diff: mllib/src/main/scala/org/apache/spark/mllib/linalg/distributed/BlockMatrix.scala ---
    @@ -172,6 +174,36 @@ class BlockMatrix(
         assert(cols <= nCols, s"The number of columns $cols is more than claimed $nCols.")
       }
     
    +  def validate(): Unit = {
    +    logDebug("Validating BlockMatrix...")
    +    // check if the matrix is larger than the claimed dimensions
    +    estimateDim()
    +    logDebug("BlockMatrix dimensions are okay...")
    +
    +    // Check if there are multiple MatrixBlocks with the same index.
    +    val indexCounts = blockInfo.countByKey().foreach { case (key, cnt) =>
    +      if (cnt > 1) {
    +        throw new SparkException(s"There are MatrixBlocks with duplicate indices. Please remove " +
    +          s"blocks with duplicate indices. You may call reduceByKey on the underlying RDD and " +
    +          s"sum the duplicates. You may convert the matrices to Breeze before summing them up.")
    --- End diff --
    
    You have `s"..."` but no variable is interpolated inside the string. We should at
    least include `key` in the error message; it should be sufficient to say
    "Found duplicate block coordinate: $key.". Suggesting `sum the duplicates` is just
    our guess, which may confuse users whose correct fix is something else.
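    
    As a rough sketch of the suggested wording (not the final implementation;
    `checkDuplicateBlocks` is a hypothetical helper name, and the `blockInfo` key/value
    types are assumed from the diff above):
    
    ```scala
    import org.apache.spark.SparkException
    import org.apache.spark.rdd.RDD
    
    // Hypothetical helper sketching the suggested duplicate check; blockInfo is
    // assumed to be keyed by block coordinates (blockRowIndex, blockColIndex).
    def checkDuplicateBlocks(blockInfo: RDD[((Int, Int), (Int, Int))]): Unit = {
      blockInfo.countByKey().foreach { case (key, cnt) =>
        if (cnt > 1) {
          // Report the offending coordinate instead of guessing what the user should do.
          throw new SparkException(s"Found duplicate block coordinate: $key.")
        }
      }
    }
    ```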


