pymc3

2017-10-02 Thread Dominic Divakaruni
Anyone interested in helping out with an MXNet backend for pymc3 now that Theano is dead? https://twitter.com/twiecki/status/914594840456900608 https://github.com/pymc-devs/pymc4_prototypes
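For context, the main thing such a backend has to supply is gradients of model log-densities, which Theano previously provided through symbolic differentiation. Below is a minimal sketch (illustrative only, not from the thread) of the equivalent MXNet autograd machinery, using a standard-normal log-density as a stand-in for a real pymc3 model:

    # Minimal sketch: the autograd step a pymc3-style backend would rely on,
    # shown with MXNet's mx.autograd API (MXNet ~0.12 or later assumed).
    import mxnet as mx

    x = mx.nd.array([0.5])       # parameter value
    x.attach_grad()              # allocate storage for d(logp)/dx

    with mx.autograd.record():   # trace ops for reverse-mode differentiation
        logp = -0.5 * x * x      # standard-normal log-density, up to a constant

    logp.backward()              # compute the gradient
    print(x.grad.asnumpy())      # -> [-0.5], i.e. -x evaluated at x = 0.5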

Re: What's everyone working on?

2017-10-02 Thread Dominic Divakaruni
Seb is talking about support for CUDA 9 and cuDNN 7. Pull requests below. @ptrendx and Dick Carter are working through some performance issues but should be done in a week (hopefully). Jun, Bhavin, the TensorRT runtime is a different subject. Nvidia is helping build a converter for MXNet models. Not

Cutting a v0.12 release for Volta V100 support in MXNet

2017-10-02 Thread Seb Kiureghian
We are close to making MXNet the first DL framework to support training and inference on the new and super-fast Nvidia Volta V100. It would be great to get MXNet in front of developers who are at the cutting edge of Deep Learning. What do you all think of a v0.12 release sometime this month to

Re: What's everyone working on?

2017-10-02 Thread Bhavin Thaker
Hi Seb: please use a different email thread for new topics of discussion. Hi Jun: I think Seb may be referring to Volta V100 support in MXNet and NOT P4/P40 inference accelerators. Corrections/clarifications welcome. Bhavin Thaker. On Mon, Oct 2, 2017 at 8:22 PM Jun Wu

Re: What's everyone working on?

2017-10-02 Thread Chris Olivier
+1 On Mon, Oct 2, 2017 at 8:04 PM Dominic Divakaruni <dominic.divakar...@gmail.com> wrote: > On Mon, Oct 2, 2017 at 8:02 PM Seb Kiureghian wrote: > > It would be awesome if MXNet were the first DL framework to support Nvidia Volta. What do you all think about

Re: What's everyone working on?

2017-10-02 Thread Dominic Divakaruni
On Mon, Oct 2, 2017 at 8:02 PM Seb Kiureghian wrote: > It would be awesome if MXNet were the first DL framework to support Nvidia Volta. What do you all think about cutting a v0.12 release once that integration is ready? > On Wed, Sep 27, 2017 at 10:38 PM, Jun Wu

Re: What's everyone working on?

2017-10-02 Thread Seb Kiureghian
It would be awesome if MXNet were the first DL framework to support Nvidia Volta. What do you all think about cutting a v0.12 release once that integration is ready? On Wed, Sep 27, 2017 at 10:38 PM, Jun Wu wrote: > I had been working on the sparse tensor project with

Re: MXNet Slack Channel

2017-10-02 Thread Kenta Iwasaki
Hi Seb, Might you invite me as well? Many thanks, Kenta Iwasaki On Tue, Oct 3, 2017 at 6:07 AM, Seb Kiureghian wrote: > invited > On Mon, Oct 2, 2017 at 3:00 PM, Joshua Arnold wrote: > > I would like to join the MXNet slack channel. >

Re: PR builds are currently failing due to a known issue

2017-10-02 Thread Meghna Baijal
Hi Jason, I did go through some of Beam’s source code but did not find any way to overcome my current problem. Could you please point me in the right direction? Thanks, Meghna Baijal On Thu, Sep 28, 2017 at 10:32 AM, Daniel Pono Takamori wrote: > Unfortunately we won't be able

Re: MXNet Slack Channel

2017-10-02 Thread Seb Kiureghian
invited On Mon, Oct 2, 2017 at 3:00 PM, Joshua Arnold wrote: > I would like to join the MXNet slack channel. > > Thanks, > Josh >

Re: Apache MXNet build failures are mostly valid - verify before merge

2017-10-02 Thread Gautam
Thanks All. I've created the JIRA to mark the master branch as protected: https://issues.apache.org/jira/browse/INCUBATOR-205. We also need to add all the committers as code owners, as discussed in Slack; I've opened a PR for it: https://github.com/apache/incubator-mxnet/pull/8128. Good
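For anyone unfamiliar with the mechanism: GitHub resolves code owners from a CODEOWNERS file in the repository, with one pattern per line followed by the owning usernames. The snippet below is only an illustrative sketch of that format with placeholder names and paths; it is not the contents of the PR above.

    # .github/CODEOWNERS (illustrative sketch; usernames and paths are placeholders)
    *                 @committer-one @committer-two
    docs/             @docs-maintainer
    src/operator/     @backend-maintainer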

Re: New Apache MXNet logo idea

2017-10-02 Thread Lupesko, Hagay
Like it! It’s cute, fun and playful. For me it also evokes speed. Hagay On 9/28/17, 20:08, "Henri Yandell" wrote: Love :) Lots of good connections here. Nice feather style/colour in the bunny's silhouette, nice "magic" overlay for connection to