[
https://issues.apache.org/jira/browse/SPARK-10408?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15351922#comment-15351922
]
Alexander Ulanov commented on SPARK-10408:
------------------------------------------
Here is the PR: https://github.com/apache/spark/pull/13621
> Autoencoder
> -----------
>
> Key: SPARK-10408
> URL: https://issues.apache.org/jira/browse/SPARK-10408
> Project: Spark
> Issue Type: Umbrella
> Components: ML
> Affects Versions: 1.5.0
> Reporter: Alexander Ulanov
> Assignee: Alexander Ulanov
> Priority: Minor
>
> Goal: Implement various types of autoencoders.
> Requirements:
> 1) Basic (deep) autoencoder that supports different types of inputs: binary,
> real in [0, 1], and real in (-inf, +inf).
> 2) Sparse autoencoder, i.e., L1 regularization. It should be added as a
> feature to the MLP and then reused here.
> 3) Denoising autoencoder.
> 4) Stacked autoencoder for pre-training of deep networks. It should support
> arbitrary network layers. (A minimal sketch of items 1-3, with a note on
> stacking, follows the references below.)
> References:
> 1. Vincent, Pascal, et al. "Extracting and composing robust features with
> denoising autoencoders." Proceedings of the 25th International Conference on
> Machine Learning. ACM, 2008.
> http://www.iro.umontreal.ca/~vincentp/Publications/denoising_autoencoders_tr1316.pdf
> 2. Rifai, Salah, et al. "Contractive auto-encoders: Explicit invariance
> during feature extraction." Proceedings of the 28th International Conference
> on Machine Learning. 2011.
> http://machinelearning.wustl.edu/mlpapers/paper_files/ICML2011Rifai_455.pdf
> 3. Vincent, P., Larochelle, H., Lajoie, I., Bengio, Y., and Manzagol, P.-A.
> "Stacked denoising autoencoders: Learning useful representations in a deep
> network with a local denoising criterion." Journal of Machine Learning
> Research, 11 (2010): 3371-3408.
> http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.297.3484&rep=rep1&type=pdf
> 4. Bengio, Yoshua, et al. "Greedy layer-wise training of deep networks."
> Advances in Neural Information Processing Systems 19 (2007): 153.
> http://www.iro.umontreal.ca/~lisa/pointeurs/dbn_supervised_tr1282.pdf
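To make the requirements above concrete, here is a minimal, self-contained Scala sketch of a single-hidden-layer denoising autoencoder with an L1 weight penalty, covering items 1-3. It is a toy illustration only, not the MLP-based design in the PR; the object and helper names in it are hypothetical.
{code:scala}
import scala.util.Random

// Hypothetical toy, not a Spark ML API. One sigmoid encode/decode layer for
// inputs in [0, 1], masking-noise corruption, and an L1 subgradient on the
// weights, trained by plain SGD.
object DenoisingAESketch {
  val rng = new Random(42)

  def sigmoid(v: Array[Double]): Array[Double] =
    v.map(x => 1.0 / (1.0 + math.exp(-x)))

  // y = W x + b
  def affine(w: Array[Array[Double]], x: Array[Double], b: Array[Double]): Array[Double] =
    w.zip(b).map { case (row, bi) =>
      row.zip(x).map { case (wi, xi) => wi * xi }.sum + bi
    }

  // Masking noise: zero each component with probability p (Vincent et al., 2008).
  def corrupt(x: Array[Double], p: Double): Array[Double] =
    x.map(v => if (rng.nextDouble() < p) 0.0 else v)

  def main(args: Array[String]): Unit = {
    val d = 8; val h = 4                         // visible / hidden sizes
    val lr = 0.5; val lambda = 1e-4; val p = 0.3 // step size, L1 weight, noise level
    val w1 = Array.fill(h, d)(rng.nextGaussian() * 0.1); val b1 = Array.fill(h)(0.0)
    val w2 = Array.fill(d, h)(rng.nextGaussian() * 0.1); val b2 = Array.fill(d)(0.0)
    val data = Array.fill(64)(Array.fill(d)(if (rng.nextDouble() < 0.5) 0.0 else 1.0))

    for (_ <- 1 to 200; x <- data) {
      val xc = corrupt(x, p)                  // encode the CORRUPTED input...
      val z = sigmoid(affine(w1, xc, b1))
      val y = sigmoid(affine(w2, z, b2))
      // ...but backprop the (halved) squared error against the CLEAN input.
      val d2 = y.zip(x).map { case (yi, xi) => (yi - xi) * yi * (1 - yi) }
      val d1 = Array.tabulate(h) { j =>
        (0 until d).map(i => w2(i)(j) * d2(i)).sum * z(j) * (1 - z(j))
      }
      // SGD step; lambda * signum(w) is the L1 subgradient (sparsity, item 2).
      for (i <- 0 until d; j <- 0 until h)
        w2(i)(j) -= lr * (d2(i) * z(j) + lambda * math.signum(w2(i)(j)))
      for (i <- 0 until d) b2(i) -= lr * d2(i)
      for (j <- 0 until h; i <- 0 until d)
        w1(j)(i) -= lr * (d1(j) * xc(i) + lambda * math.signum(w1(j)(i)))
      for (j <- 0 until h) b1(j) -= lr * d1(j)
    }

    val meanErr = data.map { x =>
      val y = sigmoid(affine(w2, sigmoid(affine(w1, x, b1)), b2))
      y.zip(x).map { case (yi, xi) => (yi - xi) * (yi - xi) }.sum
    }.sum / data.length
    println(f"mean reconstruction error: $meanErr%.4f")
  }
}
{code}
For item 4, greedy layer-wise pre-training in the sense of Bengio et al. (2007) would train such an autoencoder on the raw data, then train a second one on the resulting hidden codes, and so on, finally unrolling the encoders into a deep network for supervised fine-tuning.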