Hello dev@,
I would like to ask whether there is interest in the community in testing
nightly builds of MXNet together with third-party packages that depend on
MXNet, acting as early adopters. The goal is to catch regressions in MXNet
early, allowing time for bug fixes before a new release is cut.
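As a rough sketch, such an early-adopter job could install a nightly build and run a downstream package's test suite against it. The package names, pre-release channel, and test command below are illustrative assumptions, not an actual CI setup; the dry-run wrapper only prints the commands so the shape of the job is easy to see:

```shell
# Hypothetical nightly smoke-test job for a downstream MXNet package.
# Package names and channels here are illustrative assumptions.
DRY_RUN=1   # set to 0 to actually execute the commands

run() {
  # In dry-run mode, print the command instead of executing it.
  if [ "${DRY_RUN:-0}" = "1" ]; then
    echo "+ $*"
  else
    "$@"
  fi
}

run pip install --pre mxnet                 # assumed pre-release channel
run pip install sockeye                     # downstream package under test
run python -m pytest --pyargs sockeye -q    # fail fast on regressions
```

If any step fails against the nightly build, the job would flag a potential regression before a release candidate is cut.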
Well, I'm not going to talk about technical details here.
You can find some design concepts in the docs or on the wiki.
(https://mxnet.incubator.apache.org/versions/master/architecture/index.html)
For me, working on MXNet is a rare chance to test my ideas about
a machine learning framework.
While implementing MXNet
+100 on Iblis's thoughts:
"We know tools and frameworks keep changing.
People learn the lesson from making and attempting.
It's just the path of the human technology evolution.
The point is the ideas/experiences
which this community is going to surprise you at."
- Carin
On Mon, Feb 11, 2019 at
Can't speak for the CI team, but in general I think it is a good idea.
On a separate note, I've been playing around with Sockeye recently and it's
great! Awesome work and glad to see MXNet used for such cutting edge use
cases.
I'd love to see closer collaboration with the Sockeye team and MXNet
Hi Felix,
Thank you for the request! The CI team is currently working on improving
our benchmarking platform and will evaluate this request carefully.
Chance Bair
On Mon, Feb 11, 2019 at 3:59 PM Carin Meier wrote:
> Can't speak for the CI team, but in general I think it is a good idea.
>
I agree with tracking the third-party packages that make MXNet more prosperous :)
Before building the CI, I suggest creating the related labels in GitHub, such as
sockeye, gluonCV, and gluonNLP, and giving high priority to these issues/PRs,
so the issues/PRs can be fixed quickly and these important
On Sun, Feb 10, 2019 at 10:43 PM Hagay Lupesko wrote:
> Wanted to chime in as well.
> I have reviewed the design shared in the mail offline with Ankit, Lai and
> Naveen (we work in the same team in Amazon).
>
> I think it does a good job at simplifying many low-complexity training use
> cases
An update on issues 1 and 4:
For 1, I fixed the year in the NOTICE file on the master branch [1]. If we are to
create a new rc, the fix should be cherry-picked.
For 4, MKLDNN has found the issue [2] and posted the fix in their master
branch. I'm requesting that the fix be backported for the minor version 0.1
Thanks for the proposal, Felix. On one hand, I agree that richer workload from
the ecosystem helps find issues in MXNet early. On the other hand, I'm
concerned about tightly coupling the development of projects.
Monitoring the upstream library and addressing problems for upgrading
dependency sh
I started a proposal page on it
https://cwiki.apache.org/confluence/pages/viewpage.action?pageId=103089990
It is a big chunk of work and needs some serious analysis - but it's a
starting point for a conversation :)
On Tue, Jan 22, 2019 at 1:56 PM Carin Meier wrote:
> Thanks!
>
> I've heard this
Hi All,
Can we move the VOTE forward, since the RAT license issue should not be a problem
that blocks the release? We can always add that one in a future release (e.g. 1.4.1
or 1.5.0).
As you may be aware, the 1.4.0 release started very early this year and has been
delayed a couple of times until now. From the Apac
I do believe in the benefit to the MXNet community; MXNet 1.4 is an important
release with many useful features for our users:
1. Java Inference API, JVM memory management, Julia APIs
2. Multiple important directional experimental features: Subgraph API,
control flow operators, topology-aware all-reduce
Dear community -
based on Justin's and community feedback I'm suggesting to restart the vote.
Current status:
binding votes:
+1: 2 votes (Henri, Jason)
-1: 1 vote (Luciano)
non-binding:
+1: 1 vote (Kellen)
The community is investigating feedback from Luciano that the exclusion
file is too broad a
+1 binding
Horovod is going to release its 0.16.0 in the coming week with MXNet
integration. We need to release 1.4.0, which includes all the dependencies
for Horovod integration.
Best,
Lin
On Mon, Feb 11, 2019 at 9:30 PM Steffen Rochel
wrote:
> Dear community -
> based on Justin's and communi
Hello everyone,
I was wondering if there is any particular reason why we are building and
testing MXNet with USE_LIBJPEG_TURBO=0. I noticed that we are shipping it
with USE_LIBJPEG_TURBO=1 (e.g. make/pip/pip_linux_cpu.mk).
I ran into issues trying to compile MXNet with the libjpeg-turbo flag on
Ubuntu
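For anyone reproducing this, the flag is a make-time option; a build-configuration sketch along these lines should exercise the same path the pip configs enable. The Ubuntu dev-package name is my assumption here; check your distribution's repositories:

```shell
# Install the libjpeg-turbo headers first (Ubuntu package name assumed):
#   sudo apt-get install libjpeg-turbo8-dev
# Then build MXNet with the same flag the pip make configs set:
make -j"$(nproc)" USE_LIBJPEG_TURBO=1
```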