Github user jkbradley commented on a diff in the pull request:
https://github.com/apache/spark/pull/5330#discussion_r27743710
--- Diff: mllib/src/main/scala/org/apache/spark/mllib/tree/GradientBoostedTrees.scala ---
@@ -195,17 +195,24 @@ object GradientBoostedTrees extends Logging {
     baseLearners(0) = firstTreeModel
     baseLearnerWeights(0) = 1.0
     val startingModel = new GradientBoostedTreesModel(Regression,
       Array(firstTreeModel), Array(1.0))
-    logDebug("error of gbt = " + loss.computeError(startingModel, input))
+
+    var predError: RDD[(Double, Double)] = GradientBoostedTreesModel.
+      computeInitialPredictionAndError(input, 1.0, firstTreeModel, loss)
+    logDebug("error of gbt = " + predError.values.mean())
     // Note: A model of type regression is used since we require raw prediction
     timer.stop("building tree 0")
-    var bestValidateError = if (validate) loss.computeError(startingModel,
-      validationInput) else 0.0
+    var validatePredError: RDD[(Double, Double)] = GradientBoostedTreesModel.
+      computeInitialPredictionAndError(validationInput, 1.0, firstTreeModel, loss)
+    var bestValidateError = if (validate) validatePredError.values.mean() else 0.0
     var bestM = 1
     // psuedo-residual for second iteration
-    data = input.map(point => LabeledPoint(loss.gradient(startingModel, point),
-      point.features))
+    data = predError.zip(input).map {
+      case ((pred, _), point) => LabeledPoint(loss.gradient(pred, point.label), point.features)
--- End diff --
You can check out Algorithm 1 in this paper: Friedman, "Stochastic Gradient
Boosting," 1999.
Intuitively, we want to minimize the loss, and the gradient of the loss gives a
useful per-instance signal indicating how well or poorly we are doing at
predicting each instance. Since the gradient points in the direction of
increasing loss, we want to negate it before fitting the next tree.
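
To make the sign concrete, here is a minimal sketch (my own toy example, not
MLlib's Loss API) assuming a squared-error loss: negating its gradient gives
back the ordinary residual, which is exactly the pseudo-residual the next tree
is fit to.

// Minimal sketch, not MLlib code: assume squared-error loss
// L(label, pred) = 0.5 * (label - pred)^2, whose gradient with respect to
// the prediction is (pred - label). Negating the gradient recovers the
// ordinary residual (label - pred), i.e. the pseudo-residual target for
// the next tree.
object PseudoResidualSketch {
  def gradient(pred: Double, label: Double): Double = pred - label

  def main(args: Array[String]): Unit = {
    val label = 3.0
    val pred = 2.5                               // current model's prediction
    val pseudoResidual = -gradient(pred, label)  // = 0.5, points toward lower loss
    println(s"pseudo-residual = $pseudoResidual")
  }
}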