Hi Arunkumar,

I think L-BFGS will not work, since the L-BFGS algorithm assumes that
the objective function stays the same (i.e., the data does not change)
for the entire optimization process in order to construct the
approximate Hessian matrix. In the streaming case the data changes
from batch to batch, which breaks that assumption and will cause
problems for the algorithm.
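
If you want to keep updating a linear model on a DStream, a more
natural fit is StreamingLinearRegressionWithSGD, which refines the
weights incrementally on every batch instead of re-running a batch
optimizer. A rough sketch, assuming parsedData is a
DStream[LabeledPoint] and numFeatures matches your feature dimension
(both names borrowed from your sample code):

import org.apache.spark.mllib.linalg.Vectors
import org.apache.spark.mllib.regression.StreamingLinearRegressionWithSGD

// parsedData: DStream[LabeledPoint]  (assumed, as in your sample code)
val model = new StreamingLinearRegressionWithSGD()
  .setInitialWeights(Vectors.zeros(numFeatures))
  .setStepSize(0.1)
  .setNumIterations(50)

// Train continuously: the weights are updated on each incoming batch.
model.trainOn(parsedData)

// Predict on the same stream, keeping the true label for comparison.
model.predictOnValues(parsedData.map(lp => (lp.label, lp.features))).print()

SGD copes with the changing mini-batches because each update only
needs the gradient on the current batch, rather than a history of
curvature information the way L-BFGS does.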

Sincerely,

DB Tsai
-------------------------------------------------------
Blog: https://www.dbtsai.com


On Mon, Mar 16, 2015 at 3:19 PM, EcoMotto Inc. <ecomot...@gmail.com> wrote:
> Hello,
>
> I am new to the Spark Streaming API.
>
> I wanted to ask whether I can apply LBFGS (with LeastSquaresGradient) to
> streaming data. Currently I am using foreachRDD to iterate over the DStream,
> and I am generating a model from each RDD. Am I doing anything logically
> wrong here?
> Thank you.
>
> Sample Code:
>
> import scala.collection.mutable.ArrayBuffer
>
> import org.apache.spark.mllib.linalg.Vectors
> import org.apache.spark.mllib.optimization.{LBFGS, LeastSquaresGradient, SimpleUpdater}
> import org.apache.spark.mllib.regression.LinearRegressionModel
>
> val algorithm = new LBFGS(new LeastSquaresGradient(), new SimpleUpdater())
> var initialWeights =
>   Vectors.dense(Array.fill(numFeatures)(scala.util.Random.nextDouble()))
> var isFirst = true
> // Placeholder model; it is replaced as soon as the first batch is fitted.
> var model = new LinearRegressionModel(Vectors.zeros(numFeatures - 1), 1.0)
>
> parsedData.foreachRDD { rdd =>
>   if (isFirst) {
>     val weights = algorithm.optimize(rdd, initialWeights)
>     val w = weights.toArray
>     // The first entry is treated as the intercept, the rest as feature weights.
>     val intercept = w.head
>     model = new LinearRegressionModel(Vectors.dense(w.drop(1)), intercept)
>     isFirst = false
>   } else {
>     // Warm-start the optimizer from the previous batch's model.
>     val ab = ArrayBuffer[Double]()
>     ab.insert(0, model.intercept)
>     ab.appendAll(model.weights.toArray)
>     print("Intercept = " + model.intercept + " :: modelWeights = " + model.weights)
>     initialWeights = Vectors.dense(ab.toArray)
>     print("Initial Weights: " + initialWeights)
>     val weights = algorithm.optimize(rdd, initialWeights)
>     val w = weights.toArray
>     val intercept = w.head
>     model = new LinearRegressionModel(Vectors.dense(w.drop(1)), intercept)
>   }
> }
>
> Best Regards,
> Arunkumar

