Hi, Marcus

Thanks for your reply. Sorry for forgetting to CC the mailing list in my
last email.

> I think adding another model besides WGAN or SGAN would fulfill that
> requirement.

Which model do you think we could add besides WGAN or SGAN? Do you mean
adding a model associated with WGAN or SGAN?

What's more, do you think it is feasible to implement both WGAN and SGAN
separately?

Thanks

2017-03-19 23:32 GMT+08:00 Marcus Edel <[email protected]>:

> Hello YuLun
>
> welcome and thanks for getting in touch!
>
> I think the WGAN is wonderful, so I want to implement it too. And I'm
> wondering whether implementing just one of the two modules, SGAN or WGAN,
> is enough for three months of work. But when I looked at integrating the
> two modules, I found there is not much in common between them. So I'm not
> sure what I should do. Can you give me some advice and guide me on what I
> should do next?
>
>
> It is a really great idea and a well-written paper. Regarding whether
> implementing a single model, SGAN or WGAN, is enough work for GSoC: I
> don't think so, even if you'd like to implement a bunch of different test
> scenarios. I think adding another model besides WGAN or SGAN would fulfill
> that requirement. What do you think?
>
> Thanks,
> Marcus
>
> On 19 Mar 2017, at 08:40, YuLun Cai <[email protected]> wrote:
>
> Hello,
>    I am YuLun Cai from China. I am currently in my first year of Master's
> studies. I am interested in participating in GSoC 17 with mlpack on the
> Essential Deep Learning Modules project.
>
>    Among the topics given on the wiki page, I am interested in
> implementing GAN modules. I have taken a course in Advanced Machine
> Learning, and I've finished the Stanford course "CS231n: Convolutional
> Neural Networks for Visual Recognition" through self-study, which helped
> me a lot in understanding deep learning.
>
>    I've successfully built mlpack from source on my own machine, and then
> I looked at the source code in the ANN module (the activation_functions,
> lots of layers, and the API in ffn.hpp and rnn.hpp) to learn how to build
> a neural network in mlpack.
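>
>    As a rough sketch, this is how I currently understand a small
> feed-forward classifier would be put together with the ffn.hpp API; the
> exact headers, template parameters, and the 1-based label convention are
> just my reading of the code, so they may be off:
>
>     #include <mlpack/core.hpp>
>     #include <mlpack/methods/ann/layer/layer.hpp>
>     #include <mlpack/methods/ann/ffn.hpp>
>
>     using namespace mlpack::ann;
>
>     int main()
>     {
>       // Tiny synthetic problem: 10-dimensional points, two classes
>       // (labels 1 and 2, assuming NegativeLogLikelihood expects 1-based
>       // labels).
>       arma::mat data = arma::randu<arma::mat>(10, 200);
>       arma::mat labels = arma::ones<arma::mat>(1, 200);
>       labels.cols(100, 199).fill(2);
>
>       // 10 -> 8 -> 2 feed-forward network with a log-softmax output.
>       FFN<NegativeLogLikelihood<>, RandomInitialization> model;
>       model.Add<Linear<>>(10, 8);
>       model.Add<SigmoidLayer<>>();
>       model.Add<Linear<>>(8, 2);
>       model.Add<LogSoftMax<>>();
>
>       model.Train(data, labels);
>
>       arma::mat predictions;
>       model.Predict(data, predictions);
>     }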
>
>    I also looked at the resources about GANs on the GSoC project wiki. I
> think "Stacked Generative Adversarial Networks"[1] is interesting: it
> consists of a top-down stack of GANs and tries to invert the hierarchical
> representations of a discriminative bottom-up deep network to generate
> images.
>
>    In addition, the Wasserstein GAN paper[2] has recently gotten a lot of
> attention, and many people think it is excellent:
>    * It proposes a new GAN training algorithm that works well on the
> common GAN datasets.
>    * There is only a small difference between the original GAN training
> loop and the WGAN one (a toy sketch of that loop follows below).
>    * Its training algorithm is backed up by theory: it clarifies that the
> original GAN sometimes provides no useful gradient for training when the
> KL or JS divergence is used, and proves that the Wasserstein distance
> always provides a gradient.
>    * The WGAN can train the discriminator (critic) to convergence,
> improves the stability of learning, and gets rid of mode collapse.
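>
>    To make the "small difference" concrete, here is a toy 1-D sketch of
> the WGAN training loop (not mlpack code, just plain Armadillo): no log in
> the loss, several critic steps per generator step, and weight clipping to
> keep the critic roughly Lipschitz. I use plain gradient steps instead of
> RMSProp, both networks are just linear functions (so this toy can only
> match the mean of the real distribution), and the constants are arbitrary
> choices of mine:
>
>     #include <armadillo>
>     #include <algorithm>
>     #include <iostream>
>
>     int main()
>     {
>       arma::arma_rng::set_seed(42);
>
>       const size_t batchSize = 64;
>       const size_t nCritic = 5;   // Critic updates per generator update.
>       const double clip = 0.1;    // Weight clipping (the paper uses 0.01).
>       const double lrCritic = 0.05, lrGen = 0.05;
>
>       // Critic f(x) = w * x; generator g(z) = a * z + b; real data ~ N(3, 1).
>       double w = 0.0, a = 1.0, b = 0.0;
>
>       for (size_t iter = 0; iter < 2000; ++iter)
>       {
>         // Critic: maximize E[f(real)] - E[f(g(z))], no log anywhere.
>         for (size_t t = 0; t < nCritic; ++t)
>         {
>           arma::vec real = 3.0 + arma::randn<arma::vec>(batchSize);
>           arma::vec fake = a * arma::randn<arma::vec>(batchSize) + b;
>
>           // Analytic gradient of the critic objective for a linear critic.
>           w += lrCritic * (arma::mean(real) - arma::mean(fake));
>           w = std::max(-clip, std::min(clip, w));  // Weight clipping.
>         }
>
>         // Generator: minimize -E[f(g(z))]; gradients w.r.t. a and b.
>         arma::vec z = arma::randn<arma::vec>(batchSize);
>         a -= lrGen * (-w * arma::mean(z));
>         b -= lrGen * (-w);
>       }
>
>       // The generated mean a * E[z] + b should end up close to 3.
>       std::cout << "generated mean ~= " << b << std::endl;
>     }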
>
>    I think the WGAN is wonderful, so I want to implement it too. And I'm
> wondering whether implementing just one of the two modules, SGAN or WGAN,
> is enough for three months of work. But when I looked at integrating the
> two modules, I found there is not much in common between them. So I'm not
> sure what I should do. Can you give me some advice and guide me on what I
> should do next?
> Thanks
>
> [1] https://arxiv.org/abs/1612.04357
> [2] https://arxiv.org/abs/1701.07875
>