[mlpack] GSoC Proposal: Graph Convolutional Networks (GCN)
Hello Mentors,

Thank you for your reviews regarding the project on IRC. They've helped me shape the proposal well. The following is the flow of the proposal for Graph Convolutional Networks. I've tried to keep it as concise as possible. We could discuss any part of the proposal in depth if required. Please provide suggestions or comments. Thanks.

*Proposal:* Since Graph Convolutional Networks work on graphs, they would be a good extension to mlpack's breadth. Reference paper: https://arxiv.org/abs/1609.02907

*Preparatory work:*
1. Explore sparse matrices and dense-sparse matrix multiplication in Armadillo and mlpack.
2. Graphs can be represented as sparse matrices (adjacency matrices), so input/output of graph data would not be an issue.

*API structure:* The API for GCN would be very similar to mlpack's regular convolution operation, convolution layer, and CNN. I've gone through the source code for convolution in the mlpack repository.

*Phase 1:*
1. Implement the graph convolutional *operation*: H(l+1) = σ(D̂^(-1/2) Ã D̂^(-1/2) H(l) W(l)) [see the paper: https://arxiv.org/abs/1609.02907]
2. Testing & documentation for the graph convolutional *operation*.

*Phase 2:*
1. Implement the graph convolutional *layer*: Forward(), Backward(), Serialize(), Gradient(), etc.
2. Testing & documentation for the same.

*Phase 3:*
1. Implement a graph convolutional *network* in mlpack/models.
2. Testing, documentation & benchmarking of the graph convolutional *network*, benchmarking against the TensorFlow/Keras implementation provided by the authors of the paper.

*Future scope:*
1. Spatio-temporal graph networks (ST-GCN, DCRNN)
2. Graph attention networks (GAT)

*References:*
1. (curated repositories) https://github.com/Jiakui/awesome-gcn
2. (survey paper) https://arxiv.org/abs/1901.00596

*One query:* The graph datasets available online come in different formats: GraphML, adjacency list in a CSV, edge list in a CSV, etc.
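To make the Phase 1 propagation rule concrete, here is a small numerical sketch (NumPy rather than Armadillo, purely illustrative; the toy graph, features, and weights are made up and a real mlpack version would use sparse matrices):

```python
import numpy as np

def gcn_layer(A, H, W):
    """One GCN propagation step: H' = ReLU(D^-1/2 @ A_hat @ D^-1/2 @ H @ W),
    where A_hat = A + I adds self-loops and D is A_hat's degree matrix."""
    A_hat = A + np.eye(A.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    # Symmetric normalization: D^-1/2 A_hat D^-1/2, done by row/column scaling.
    A_norm = A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    return np.maximum(0.0, A_norm @ H @ W)  # sigma = ReLU

# Toy 3-node path graph 0-1-2, 2 input features, identity weights.
A = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
H = np.array([[1., 0.],
              [0., 1.],
              [1., 1.]])
W = np.eye(2)
H1 = gcn_layer(A, H, W)
print(H1.shape)  # (3, 2): each node now mixes its neighbors' features
```

The same computation maps onto Armadillo's `sp_mat`-times-`mat` product, which is why the preparatory work on dense-sparse multiplication matters.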
These formats would have to be converted to an adjacency matrix. Most of the time the conversion is *straightforward* in O(E) time, where E is the number of edges. What are your views on having a ToAdjacencyMatrix() utility in mlpack? I think it should not be in mlpack; we could say that mlpack expects an adjacency matrix (sparse or dense) for graph networks. But if you think this functionality should exist, I could start implementing it as soon as possible.

Thank you once again!

Regards,
Hemal Mamtora
Final Year, Computer Engineering
Sardar Patel Institute of Technology, Mumbai
Contact: +91 75061 89728
___
mlpack mailing list
mlpack@lists.mlpack.org
http://knife.lugatgt.org/cgi-bin/mailman/listinfo/mlpack
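For reference, the O(E) edge-list conversion discussed in the query above could look roughly like this (a pure-Python sketch; the function name mirrors the hypothetical ToAdjacencyMatrix() from the mail, and the edge list is made up):

```python
def to_adjacency_matrix(edges, num_nodes, directed=False):
    """Convert an edge list [(u, v), ...] into a dense adjacency matrix.
    One pass over the edges, so O(E) time; real graph data would more
    likely be stored in a sparse matrix."""
    A = [[0] * num_nodes for _ in range(num_nodes)]
    for u, v in edges:
        A[u][v] = 1
        if not directed:
            A[v][u] = 1  # undirected edge appears in both directions
    return A

edges = [(0, 1), (1, 2)]          # e.g. rows parsed from an edge-list CSV
A = to_adjacency_matrix(edges, 3)
print(A)  # [[0, 1, 0], [1, 0, 1], [0, 1, 0]]
```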
[mlpack] GSOC_IDEA_CLARIFICATION
Hello,

I am Shivam Behl, a computer engineering undergraduate at Thapar Institute, India. I am not very clear about the meaning of the idea "Application of ANN Algorithms implemented in mlpack". Does it mean using the pre-implemented algorithms to make Kaggle-like kernels/mini-projects that demonstrate the potential use of mlpack? If so, what factors should we focus on to make it GSoC level? And would deployment as an application/website be counted? We are expected to make modifications to the mlpack library for this project, right? Sorry if this feels like a silly question; I am in the process of understanding the library now and studying and shortlisting other ideas. I was not able to completely understand the expectations for this one.

Regards,
Shivam Behl
Re: [mlpack] GSoC Proposal for implementing HyperNEAT and es-HyperNEAT
Hey Pranav,

It would be great if my NEAT implementation could be extended. I think you should focus on CPPNs and HyperNEAT, since it seems infeasible to also create ES-HyperNEAT in the time left. CPPNs are pretty nice - off the top of my head, it would be cool if we could add something to the website where people could use CPPNs to generate art like on PicBreeder or EndlessForms. It would also be nice to have HyperNEAT. An important issue to address would be how you could use the existing NEAT implementation and extend it to HyperNEAT - what changes would be required, if any.

On Sat, Mar 14, 2020 at 2:34 AM PESALADINNE PRANAV REDDY . < f20180...@hyderabad.bits-pilani.ac.in> wrote:
> Hey everyone, My name is Pranav Reddy and my idea for GSoC is to implement
> HyperNEAT and, if time permits, ES-HyperNEAT as well. I feel this is a
> good idea since, as far as I've seen, there are very few HyperNEAT
> implementations out there.
>
> All of this would use the NEAT implementation that was added last
> year, as HyperNEAT relies on it. HyperNEAT also involves CPPNs, which I plan
> to implement first. Since CPPNs are very similar to ANNs, this shouldn't be
> too much of a problem.
> Following that, I will implement HyperNEAT based on the paper
> http://eplex.cs.ucf.edu/publications/2009/stanley-alife09. For this we
> would mainly be applying the NEAT algorithm to a CPPN. I will also be
> implementing a user-defined substrate as described in the aforementioned
> paper.
>
> On completion of HyperNEAT, if time permits, I would also like to implement
> Evolvable Substrate HyperNEAT (ES-HyperNEAT), as it builds on HyperNEAT
> directly. For this, the substrate would also have to evolve with each
> generation. Further details can be found in this paper:
> http://eplex.cs.ucf.edu/publications/2012/risi-alife12. I will only
> complete this if there is time, of course, but I hope that I am able to.
>
> Of course, testing is also a very important part, and I will test each
> method in the following ways:
> CPPN:
> I think the best test for this would be creating images using CPPNs to
> view spatial patterns such as bilateral symmetry, imperfect symmetry,
> repetition with variation, etc., as can be seen here:
> http://picbreeder.org/.
> HyperNEAT:
> For now, my idea is to test this using the visual discrimination experiment
> in the paper http://eplex.cs.ucf.edu/publications/2009/stanley-alife09.
> If I can think of a better experiment, or if anyone has any suggestions, I
> will do that.
> ES-HyperNEAT:
> As of yet, I have not been able to find any experiment that does not
> involve using robots in a controlled environment, so any suggestions for
> this test would be greatly appreciated.
>
> Another reason I think this project would be appropriate is that it is a
> very sequential project, which will result in at least something solid
> being merged into the codebase in case everything planned is not completed
> on time. I will provide a more detailed phase-by-phase implementation plan,
> hopefully in a few days. Any suggestions are greatly appreciated. Also,
> sorry if it was a long read. Thanks in advance.
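The CPPN test idea above (rendering a function of coordinates and checking for spatial regularities like bilateral symmetry) can be sketched in miniature. This is a hypothetical, hand-wired CPPN rather than one evolved by NEAT; the choice of Gaussian and sine nodes is just illustrative:

```python
import math

def cppn(x, y):
    """A tiny hand-wired CPPN: a Gaussian of x gives bilateral symmetry
    about x = 0, and a sine of y gives repetition along y. Output is an
    intensity in [0, 1]. A real CPPN's topology and weights would be
    evolved by NEAT rather than fixed by hand."""
    h = math.exp(-(x * x))                  # Gaussian node: even in x
    return 0.5 * (1.0 + math.sin(3.0 * y) * h)

# Sampling cppn over a grid yields an image; bilateral symmetry means
# the intensity at (x, y) equals the intensity at (-x, y).
left = cppn(-0.5, 0.2)
right = cppn(0.5, 0.2)
print(abs(left - right) < 1e-12)  # True
```

A unit test along these lines (assert the rendered pattern is mirror-symmetric) would not need any image viewing at all, which may suit mlpack's test suite better than visual inspection.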
[mlpack] GSoC 2020 Proposal Discussion
Hi everyone,

I'm going to be applying for GSoC this summer, and my proposal centers around extending the Reinforcement Learning module of mlpack. In summary, what I propose to do is:

1) Rainbow DQN
Three of its six components are already implemented in mlpack, leaving Dueling, Distributional, and Noisy DQN. Since these are extensions to the standard DQN, which is already implemented, I estimate that implementing these three DQNs, and then Rainbow itself, along with documentation and tests, should take around 6 weeks.

2) Actor-Critic models and algorithms
If I'm not mistaken, mlpack does not currently implement Actor-Critic models, though there is an issue open (#2262). If no one has implemented it by then, I propose laying the foundation by implementing the basic architecture and, if time permits, adding one of the more state-of-the-art algorithms for training it (e.g. A2C). On the other hand, if it is implemented, I will stick to implementing extensions to the Actor-Critic model (I'm currently deliberating between A2C, Soft Actor-Critic, and Optimistic Actor-Critic) in the remaining 6 weeks.

I'd like to know if the timeline I've proposed is realistic, if all assumptions are correct, and if all the algorithms mentioned are of relevance to mlpack. Thanks for your time!

Yours sincerely,
Sriram S K
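Of the three remaining Rainbow components mentioned above, the dueling architecture has the simplest core idea: the Q-values are reassembled from a state-value stream and a mean-centered advantage stream. A small NumPy sketch of just that aggregation step (the surrounding network is omitted, and the value/advantage numbers are made up):

```python
import numpy as np

def dueling_aggregate(value, advantages):
    """Dueling DQN head: Q(s, a) = V(s) + (A(s, a) - mean_a A(s, a)).
    Subtracting the mean advantage keeps the V/A decomposition
    identifiable, as in the dueling-networks paper."""
    return value + (advantages - advantages.mean())

V = 1.0                          # output of the state-value stream
A = np.array([0.5, -0.5, 1.0])   # advantage stream, one entry per action
Q = dueling_aggregate(V, A)
print(Q)  # mean of Q equals V; the best action keeps the largest advantage
```

Since mlpack's existing DQN already separates the network from the learning loop, this component would mostly be a new network head rather than a change to the training algorithm itself.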