srowen commented on a change in pull request #21632: [SPARK-19591][ML][MLlib]
Add sample weights to decision trees
URL: https://github.com/apache/spark/pull/21632#discussion_r243595499
##########
File path: mllib/src/main/scala/org/apache/spark/ml/tree/impl/BaggedPoint.scala
##########
@@ -33,13 +33,20 @@ import org.apache.spark.util.random.XORShiftRandom
 * this datum has 1 copy, 0 copies, and 4 copies in the 3 subsamples, respectively.
 *
 * @param datum Data instance
- * @param subsampleWeights Weight of this instance in each subsampled dataset.
- *
- * TODO: This does not currently support (Double) weighted instances. Once MLlib has weighted
- *   dataset support, update. (We store subsampleWeights as Double for this future extension.)
+ * @param subsampleCounts Number of samples of this instance in each subsampled dataset.
+ * @param sampleWeight The weight of this instance.
 */
-private[spark] class BaggedPoint[Datum](val datum: Datum, val subsampleWeights: Array[Double])
-  extends Serializable
+private[spark] class BaggedPoint[Datum](
+    val datum: Datum,
+    val subsampleCounts: Array[Int],
+    val sampleWeight: Double) extends Serializable {
Review comment:
Consider adding an auxiliary constructor that defaults sampleWeight to 1.0, so callers don't have to pass 1.0 explicitly at every call site.
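As a minimal sketch of what that could look like (hypothetical: the private[spark] modifier is dropped and a String datum is used so it compiles standalone; this is not the actual patch):

```scala
// Simplified version of BaggedPoint with the suggested auxiliary constructor.
class BaggedPoint[Datum](
    val datum: Datum,
    val subsampleCounts: Array[Int],
    val sampleWeight: Double) extends Serializable {

  // Auxiliary constructor: default sampleWeight to 1.0 for unweighted data,
  // so call sites don't need to pass 1.0 explicitly.
  def this(datum: Datum, subsampleCounts: Array[Int]) =
    this(datum, subsampleCounts, 1.0)
}

object BaggedPointDemo {
  def main(args: Array[String]): Unit = {
    // Unweighted call site: no trailing 1.0 needed.
    val unweighted = new BaggedPoint("row", Array(1, 0, 4))
    println(unweighted.sampleWeight)

    // Weighted call site still passes the weight explicitly.
    val weighted = new BaggedPoint("row", Array(1, 0, 4), 0.5)
    println(weighted.sampleWeight)
  }
}
```

An equivalent alternative would be a default parameter value (sampleWeight: Double = 1.0), which avoids the extra constructor at the cost of a slightly different binary interface.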