Github user mengxr commented on the pull request:
https://github.com/apache/spark/pull/582#issuecomment-42623415
LGTM. Thanks!
---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabl
Github user asfgit closed the pull request at:
https://github.com/apache/spark/pull/582
---
Github user pwendell commented on the pull request:
https://github.com/apache/spark/pull/582#issuecomment-42623759
Thanks, merged.
---
Github user mengxr commented on the pull request:
https://github.com/apache/spark/pull/582#issuecomment-42257123
@dbtsai Could you please update the `mllib-optimization.md` and include an
example of L-BFGS?
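For reference, the kind of L-BFGS walkthrough being requested for `mllib-optimization.md` might demonstrate something like the following. This is a hedged sketch in Python using `scipy.optimize`, not MLlib's Scala API (`LBFGS.runLBFGS`); the synthetic data, regularization value, and logistic-loss setup are illustrative assumptions, not taken from the PR.

```python
# Sketch: L2-regularized logistic regression trained with L-BFGS.
# Illustrative only -- MLlib's equivalent lives in
# org.apache.spark.mllib.optimization.LBFGS (Scala).
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
true_w = np.array([1.5, -2.0, 0.5])          # hypothetical ground truth
y = (X @ true_w + rng.normal(scale=0.1, size=200) > 0).astype(float)

reg = 0.1  # L2 regularization strength (plays the role of regParam)

def loss_and_grad(w):
    p = 1.0 / (1.0 + np.exp(-(X @ w)))       # sigmoid probabilities
    loss = -np.mean(y * np.log(p + 1e-12) + (1 - y) * np.log(1 - p + 1e-12))
    loss += 0.5 * reg * w @ w                # L2 penalty
    grad = X.T @ (p - y) / len(y) + reg * w
    return loss, grad

res = minimize(loss_and_grad, np.zeros(3), jac=True, method="L-BFGS-B",
               options={"maxiter": 100})
```

After optimization, `res.x` holds the fitted weights and `res.fun` the final regularized loss, which should be well below the all-zeros starting loss of ln 2.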
---
Please remove it; there is no stochastic BFGS. We will put an ADMM
wrapper over BFGS, which has better optimization properties than SGD.
On Apr 28, 2014 10:34 PM, "mengxr" wrote:
> Github user mengxr commented on the pull request:
>
> https://github.com/apache/spark/pull/582#issuecomme
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/582#issuecomment-41753419
All automated tests passed.
Refer to this link for build results:
https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/14577/
---
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/582#issuecomment-41753418
Merged build finished. All automated tests passed.
---
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/582#issuecomment-41751530
Merged build triggered.
---
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/582#issuecomment-41751535
Merged build started.
---
Github user dbtsai commented on the pull request:
https://github.com/apache/spark/pull/582#issuecomment-41751464
Makes sense from the inverse-of-Hessian point of view. Just remove it!
---
Github user mengxr commented on the pull request:
https://github.com/apache/spark/pull/582#issuecomment-41749238
I prefer removing the miniBatchFraction. Those quasi-Newton methods
approximate the inverse of the Hessian. It doesn't make sense if the
gradients are computed from a varying o
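The inconsistency being pointed out can be made concrete. L-BFGS builds its inverse-Hessian approximation from the pairs s = x_{k+1} - x_k and y = ∇f(x_{k+1}) - ∇f(x_k), and the update only preserves positive definiteness when sᵀy > 0. If the two gradients come from different mini-batches, sᵀy can go negative even for a convex objective. A toy numeric sketch (the per-sample losses and numbers are illustrative, not from the PR):

```python
# Per-sample quadratic losses f_a(x) = a * x^2 / 2, gradient a * x.
# The L-BFGS curvature condition requires s * y > 0.
x_k, x_next = 1.0, 0.5
s = x_next - x_k                              # step: -0.5

# Consistent: both gradients evaluated on the same batch (a = 1).
y_consistent = 1.0 * x_next - 1.0 * x_k       # -0.5
# Inconsistent: batch with a = 1 at x_k, batch with a = 10 at x_next.
y_inconsistent = 10.0 * x_next - 1.0 * x_k    # 4.0

curv_consistent = s * y_consistent            # 0.25  -> valid update
curv_inconsistent = s * y_inconsistent        # -2.0  -> violates s*y > 0
```

With a fixed mini-batch the curvature pair stays valid; with batches that change between gradient evaluations, the Hessian approximation can be corrupted even though every per-sample loss is convex.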
Github user dbtsai commented on the pull request:
https://github.com/apache/spark/pull/582#issuecomment-41740842
@mengxr Just did some hacking on trying to implement the right "stochastic"
L-BFGS, and it kind of works as long as we don't change the objective function.
But there is no go
Github user mengxr commented on the pull request:
https://github.com/apache/spark/pull/582#issuecomment-41643015
I think it is good to remove `miniBatchFraction` from `LBFGS`'s params in
this PR, unless someone has a good understanding of the behavior of
"stochastic" L-BFGS.
---
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/582#issuecomment-41623307
All automated tests passed.
Refer to this link for build results:
https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/14541/
---
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/582#issuecomment-41623306
Merged build finished. All automated tests passed.
---
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/582#issuecomment-41619500
Merged build triggered.
---
Github user AmplabJenkins commented on the pull request:
https://github.com/apache/spark/pull/582#issuecomment-41619512
Merged build started.
---
GitHub user dbtsai opened a pull request:
https://github.com/apache/spark/pull/582
[SPARK-1157][MLlib] Bug fix: lossHistory should be monotonically decreasing
Instead of recording the loss in the costFun each time the optimizer calls
costFun, we get the loss from the api provide
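The bug this PR fixes can be illustrated outside Spark: an L-BFGS line search may evaluate the cost function several times per iteration, including at trial points whose loss is higher, so a history recorded on every costFun call need not be monotone, while a history recorded once per accepted iterate is. A scipy-based sketch of that distinction (not the MLlib code; the Rosenbrock objective is an illustrative stand-in):

```python
# Recording loss per cost-function call vs. per accepted iteration.
import numpy as np
from scipy.optimize import minimize

def rosenbrock(x):
    return (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2

per_call = []          # appended on EVERY evaluation, incl. line-search trials
def cost(x):
    f = rosenbrock(x)
    per_call.append(f)
    return f

per_iter = []          # appended once per accepted iterate (optimizer callback)
res = minimize(cost, np.array([-1.2, 1.0]), method="L-BFGS-B",
               callback=lambda x: per_iter.append(rosenbrock(x)))

# The per-iteration history is non-increasing (the line search enforces
# sufficient decrease); the per-call history is much longer and, in general,
# not monotone.
monotone = all(b <= a for a, b in zip(per_iter, per_iter[1:]))
```

This is the same reason recording the loss inside costFun produced a non-monotone lossHistory: the optimizer's internal trial evaluations leak into the record.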