Re: [DISCUSS] Proposing MXNet for the Apache Incubator
Please sign me up as a committer - I've been working with Mu at work on MXNet (Amazon) and would love to get more involved in the project. GitHub ID: jspisak

On 2017-01-05 21:12 (-0800), Henri Yandell wrote:
> Hello Incubator,
>
> I'd like to propose a new incubator Apache MXNet podling.
>
> The existing MXNet project (http://mxnet.io - 1.5 years old, 15 committers,
> 200 contributors) is very interested in joining Apache. MXNet is an
> open-source deep learning framework that allows you to define, train, and
> deploy deep neural networks on a wide array of devices, from cloud
> infrastructure to mobile devices.
>
> The wiki proposal page is located here:
>
> https://wiki.apache.org/incubator/MXNetProposal
>
> I've included the text below in case anyone wants to focus on parts of it
> in a reply.
>
> Looking forward to your thoughts, and for lots of interested Apache members
> to volunteer to mentor the project in addition to Sebastian and myself.
>
> Currently the list of committers is based on the current active coders, so
> we're also very interested in hearing from anyone else who is interested in
> working on the project, be they a current or future contributor!
>
> Thanks,
>
> Hen
> On behalf of the MXNet project
>
> -
>
> = MXNet: Apache Incubator Proposal =
>
> == Abstract ==
>
> MXNet is a Flexible and Efficient Library for Deep Learning
>
> == Proposal ==
>
> MXNet is an open-source deep learning framework that allows you to define,
> train, and deploy deep neural networks on a wide array of devices, from
> cloud infrastructure to mobile devices. It is highly scalable, allowing for
> fast model training, and supports a flexible programming model and multiple
> languages. MXNet allows you to mix symbolic and imperative programming
> flavors to maximize both efficiency and productivity. MXNet is built on a
> dynamic dependency scheduler that automatically parallelizes both symbolic
> and imperative operations on the fly. A graph optimization layer on top of
> that makes symbolic execution fast and memory efficient. The MXNet library
> is portable and lightweight, and it scales to multiple GPUs and multiple
> machines.
>
> == Background ==
>
> Deep learning is a subset of machine learning and refers to a class of
> algorithms that use a hierarchical approach with non-linearities to
> discover and learn representations within data. Deep learning has recently
> become very popular due to its applicability to, and advancement of, domains
> such as computer vision, speech recognition, natural language understanding,
> and recommender systems. With pervasive and cost-effective cloud computing,
> large labeled datasets, and continued algorithmic innovation, deep learning
> has become one of the most popular classes of algorithms for machine
> learning practitioners in recent years.
>
> == Rationale ==
>
> The adoption of deep learning is quickly expanding from the initial deep
> domain experts rooted in academia to data scientists and developers working
> to deploy intelligent services and products. Deep learning, however, has
> many challenges. These include model training time (which can take days to
> weeks), programmability (not everyone writes Python or C++, or likes
> symbolic programming), and balancing production readiness (support for
> things like failover) with development flexibility (the ability to program
> in different ways, support for new operators and model types) and speed of
> execution (fast and scalable model training). Other frameworks excel on
> some, but not all, of these aspects.
>
> == Initial Goals ==
>
> MXNet is a fairly established project on GitHub, with its first code
> contribution in April 2015 and roughly 200 contributors. It is used by
> several large companies and some of the top research institutions on the
> planet. The initial goals would be the following:
>
> 1. Move the existing codebase(s) to Apache
> 1. Integrate with the Apache development process/sign CLAs
> 1. Ensure all dependencies are compliant with Apache License version 2.0
> 1. Incremental development and releases per Apache guidelines
> 1. Establish engineering discipline and a predictable release cadence of
> high-quality releases
> 1. Expand the community beyond the current base of expert-level users
> 1. Improve usability and the overall developer/user experience
> 1. Add additional functionality to address newer problem types and
> algorithms
>
> == Current Status ==
>
> === Meritocracy ===
>
> The MXNet project already operates on meritocratic principles. Today, MXNet
> has developers worldwide and has accepted multiple major patches from a
> diverse set of contributors in both industry and academia. We would like to
> follow ASF meritocratic principles to encourage more developers to
> contribute to this project. We know that only active and committed
> developers from a diverse set of backgrounds can make MXNet a successful
> project. We are also improving the d
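The proposal's central technical claim is the ability to mix symbolic (declare a graph, then execute it) and imperative (execute each operation immediately) programming. As a rough illustration of that distinction - a toy sketch in plain Python, not MXNet's actual API - the two styles compute the same result very differently:

```python
# Toy illustration of imperative vs. symbolic execution styles.
# This is NOT MXNet code; it only mimics the two programming models
# the proposal describes.

# Imperative style: each operation runs as soon as it is written.
def imperative_example():
    a = 2.0
    b = 3.0
    c = a * b        # computed right now
    d = c + 1.0      # computed right now
    return d

# Symbolic style: first declare a computation graph, then bind
# inputs and execute it as a whole (which is what lets a framework
# apply graph-level optimization before running anything).
class Sym:
    def __init__(self, fn):
        self.fn = fn
    def __mul__(self, other):
        return Sym(lambda env: self.fn(env) * other.fn(env))
    def __add__(self, other):
        return Sym(lambda env: self.fn(env) + other.fn(env))
    def bind(self, **env):
        # "bind" supplies the variable values and runs the whole graph
        return self.fn(env)

def var(name):
    return Sym(lambda env: env[name])

def const(v):
    return Sym(lambda env: v)

# Declare the graph once; nothing is computed yet.
graph = var("a") * var("b") + const(1.0)

print(imperative_example())      # 7.0
print(graph.bind(a=2.0, b=3.0))  # 7.0
```

In MXNet itself the imperative side is the NDArray interface and the symbolic side is the Symbol interface; the sketch above only shows why deferring execution gives the scheduler a whole graph to optimize, while eager execution is easier to write and debug.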
Re: [DISCUSS] Proposing MXNet for the Apache Incubator
Awesome! Please sign me up as a committer - I've been working with Mu on MXNet (Amazon) and would love to get more involved with the project! GitHub ID: jspisak

Sent from Joe's iPhone

On 2017-01-06 08:44 (-0800), Henri Yandell wrote:
> Understood. I saw that Greg had recently approved another podling to do
> this. Though, assuming approved, there will still need to be some infra
> headscratching on the 3,000 issues currently on the main dmlc/mxnet repo
> and how imports are best done :) The simplest would be to transfer the
> current repo as is over at GitHub - not sure if that's been done before.
>
> On Thu, Jan 5, 2017 at 11:32 PM, Henry Saputra wrote:
>
> > This is great news and I am looking forward to it =)
> >
> > According to the proposal, the community wants to stick with GitHub
> > issues for tracking issues and bugs?
> > I suppose this needs a nod from Greg Stein as rep from Apache Infra to
> > confirm that this is ok for incubation, and how it would impact
> > graduation.
> >
> > - Henry
> >
> > On Thu, Jan 5, 2017 at 9:12 PM, Henri Yandell wrote:
> >
> > > Hello Incubator,
> > >
> > > I'd like to propose a new incubator Apache MXNet podling.
> > >
> > > The existing MXNet project (http://mxnet.io - 1.5 years old, 15
> > > committers, 200 contributors) is very interested in joining Apache.