I agree we should track the third-party packages that make MXNet more prosperous :)
Before building the CI, I suggest creating the related labels in GitHub, like
sockeye, gluonCV, gluonNLP, etc., and giving them high priority
so that issues/PRs can be fixed quickly and these important
Update on issues 1 and 4:
For 1., I fixed the notice year in the master branch. If we are to create a new
rc, the fix should be cherry-picked.
For 4., MKLDNN has found the issue and posted the fix in their master
branch. I'm requesting that the fix be backported for the minor version
On Sun, Feb 10, 2019 at 10:43 PM Hagay Lupesko wrote:
> Wanted to chime in as well.
> I have reviewed the design shared in the mail offline with Ankit, Lai and
> Naveen (we work in the same team in Amazon).
> I think it does a good job at simplifying many low-complexity training use
Thanks for the proposal, Felix. On one hand, I agree that richer workloads from
the ecosystem help find issues in MXNet early. On the other hand, I'm
concerned about tightly coupling the development of the projects.
Monitoring the upstream library and addressing problems for upgrading
I started a proposal page on it
It is a big chunk of work and needs some serious analysis - but it's a
starting point for a conversation :)
On Tue, Jan 22, 2019 at 1:56 PM Carin Meier wrote:
> I've heard
Can we move the VOTE forward, since the RAT license should not be a problem that
blocks the release? We can always add that one in a future release (e.g 1.4.1
As you may be aware, the 1.4.0 release started very early this year and has been
delayed a couple of times until now. From the
I do believe it benefits the MXNet community; MXNet 1.4 is an important
release with many useful features for our users:
1. Java Inference API, JVM memory management, Julia APIs
2. Multiple important directional experimental features - Subgraph API,
control flow operators, Topology aware
I was wondering if there was any particular reason why we are building and
testing mxnet with USE_LIBJPEG_TURBO=0. I noticed that we are shipping it
with USE_LIBJPEG_TURBO=1 (e.g. make/pip/pip_linux_cpu.mk).
I ran into issues trying to compile mxnet with the libjpeg-turbo flag on
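For reference, a build invocation with the flag enabled might look like the sketch below; the library path is an assumed Ubuntu location for libjpeg-turbo, not something stated in the thread.

```shell
# Hypothetical sketch: building MXNet from source with libjpeg-turbo enabled,
# matching the pip build config (make/pip/pip_linux_cpu.mk).
# The USE_LIBJPEG_TURBO_PATH value below is an assumed Ubuntu install location
# for the libjpeg-turbo development package.
make -j"$(nproc)" \
    USE_LIBJPEG_TURBO=1 \
    USE_LIBJPEG_TURBO_PATH=/usr/lib/x86_64-linux-gnu
```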
Dear community -
Based on Justin's and the community's feedback, I'm suggesting we restart the vote.
+1: 2 votes (Henri, Jason)
-1: 1 vote (Luciano)
+1: 1 vote (Kellen)
The community is investigating feedback from Luciano that the exclusion
file is too broad
Horovod is going to release its 0.16.0 in the coming week with MXNet
integration. We need to release 1.4.0, which includes all the dependencies
for the Horovod integration.
On Mon, Feb 11, 2019 at 9:30 PM Steffen Rochel
> Dear community -
> based on Justin's and
Thank you for the request! The CI team is currently working on improving
our benchmarking platform and will evaluate this request carefully.
On Mon, Feb 11, 2019 at 3:59 PM Carin Meier wrote:
> Can't speak for the CI team, but in general I think that it is good idea.
+100 on Iblis's thoughts:
"We know tools and frameworks keep changing.
People learn the lesson from making and attempting.
It's just the path of the human technology evolution.
The point is the ideas/experiences
which this community is going to surprise you at."
On Mon, Feb 11, 2019 at
Can't speak for the CI team, but in general I think that it is good idea.
On a separate note, I've been playing around with Sockeye recently, and it's
great! Awesome work, and I'm glad to see MXNet used for such cutting-edge use
I'd love to see closer collaboration with the Sockeye team and
Well, I'm not going to talk about technical stuff.
You can find some design concepts on doc or wiki.
For me, working on MXNet is a rare chance to verify my ideas of
a machine learning framework.
I would like to ask around whether there is interest in the community to
test nightly builds of MXNet with third-party packages that depend on MXNet
and act as early adopters. The goal is to catch regressions in MXNet early,
allowing time for bug fixes before a new release is cut.
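As a sketch of what such early-adopter testing could look like in a nightly CI job, here is a minimal, hypothetical driver; the downstream package list, version string, and test commands are illustrative assumptions, not an agreed design.

```python
# Hypothetical sketch of a nightly compatibility-test driver for MXNet.
# The downstream package names and test commands below are assumptions
# chosen for illustration, not a fixed list.
DOWNSTREAM = {
    "sockeye": ["python", "-m", "pytest", "--pyargs", "sockeye"],
    "gluoncv": ["python", "-m", "pytest", "--pyargs", "gluoncv"],
}


def build_jobs(nightly_version):
    """Return the ordered command list for one nightly run: install the
    nightly MXNet build first, then install and test each downstream
    package against it."""
    jobs = [["pip", "install", "--pre", "mxnet==" + nightly_version]]
    for pkg, test_cmd in DOWNSTREAM.items():
        jobs.append(["pip", "install", pkg])
        jobs.append(test_cmd)
    return jobs
```

A scheduler (cron, Jenkins, etc.) would run these commands in a clean environment each night and report any downstream test failures back to the MXNet issue tracker.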