Hi,

I'm trying to use LBFGS as the optimizer. Do I need to apply feature scaling
myself via StandardScaler, or does LBFGS do it by default?
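
In case scaling is actually needed, this is roughly what I had in mind (just a
sketch using MLlib's StandardScaler on the parsedata RDD from the code below;
I haven't verified that it helps):

import org.apache.spark.mllib.feature.StandardScaler
import org.apache.spark.mllib.regression.LabeledPoint

// Fit the scaler on the raw feature vectors (withMean = true needs dense vectors)
val scaler = new StandardScaler(withMean = true, withStd = true)
  .fit(parsedata.map(_.features))

// Rebuild the labeled points with scaled features before appending the bias
val scaled = parsedata.map(p => LabeledPoint(p.label, scaler.transform(p.features)))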

The following code generated the error "Failure again! Giving up and returning.
Maybe the objective is just poorly behaved?".

import org.apache.spark.mllib.linalg.Vectors
import org.apache.spark.mllib.optimization.{LBFGS, LeastSquaresGradient, SquaredL2Updater}
import org.apache.spark.mllib.regression.LabeledPoint
import org.apache.spark.mllib.util.MLUtils

// Parse "label,f1 f2 ..." lines into LabeledPoints
val data = sc.textFile("file:///data/Train/final2.train")
val parsedata = data.map { line =>
  val partsdata = line.split(',')
  LabeledPoint(partsdata(0).toDouble, Vectors.dense(partsdata(1).split(' ').map(_.toDouble)))
}

// Append the bias (intercept) term and cache the training set
val train = parsedata.map(x => (x.label, MLUtils.appendBias(x.features))).cache()

val numCorrections = 10
val convergenceTol = 1e-4
val maxNumIterations = 50
val regParam = 0.1
val initialWeightsWithIntercept = Vectors.dense(new Array[Double](2))

val (weightsWithIntercept, loss) = LBFGS.runLBFGS(train,
  new LeastSquaresGradient(),
  new SquaredL2Updater(),
  numCorrections,
  convergenceTol,
  maxNumIterations,
  regParam,
  initialWeightsWithIntercept)
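
After the call returns, I was planning to split the result back into weights
and an intercept like this (a sketch; I'm assuming the appended bias ends up
as the last element of the vector):

// Assuming appendBias puts the bias term last: last element = intercept, rest = weights
val intercept = weightsWithIntercept(weightsWithIntercept.size - 1)
val weights = Vectors.dense(weightsWithIntercept.toArray.dropRight(1))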

Did I implement LBFGS for linear regression via "LeastSquaresGradient()"
correctly?

Thanks
Tri
