+1, and I suggest consolidating all maintenance releases under the same
major.minor version into a single branch.
On Wed, Jan 24, 2018 at 9:06 PM, Meghna Baijal wrote:
I agree. If the release candidate is being cut from the master branch, it
should be considered a minor release.
Anyway, the effort involved in the release process is exactly the same in
either case.
Thanks,
Meghna
On Jan 24, 2018 8:56 PM, "Marco de Abreu" wrote:
Are there any particular reasons why we are classifying this release as a
patch instead of a minor release? As far as I know, we don't have any tests
in place to detect API changes and thus can't guarantee that this is an
actual patch release. Considering the fact that PRs have been merged
without
The profiling PR contains a small breaking change, but I don't think it's
going into 1.0.1.
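For context on the patch-vs-minor question above: under semantic versioning, the patch slot is reserved for backward-compatible bug fixes only, new features require at least a minor bump, and a breaking change like the one mentioned would force a major bump. A minimal sketch of that rule (a hypothetical helper for illustration, not part of any MXNet release tooling):

```python
def next_version(current: str, has_breaking: bool, has_features: bool) -> str:
    """Return the next version string per semver.org rules."""
    major, minor, patch = map(int, current.split("."))
    if has_breaking:      # backward-incompatible API change
        return f"{major + 1}.0.0"
    if has_features:      # new, backward-compatible functionality
        return f"{major}.{minor + 1}.0"
    return f"{major}.{minor}.{patch + 1}"  # bug fixes only

# A release cut from master that includes feature PRs:
print(next_version("1.0.0", has_breaking=False, has_features=True))  # → 1.1.0
```

By this rule, "1.0.1" is only accurate if the branch contains nothing beyond the bug-fix PRs listed in the release notes.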
On Wed, Jan 24, 2018 at 6:48 PM Haibin Lin wrote:
Hi everyone,
Since the plan was to cut a branch from the master branch, the code will
include changes other than the bug fix PRs noted in the release note. Is
anyone aware of any API changes in the current MXNet master branch? In
particular, are there backward incompatible ones?
Best,
Haibin
Hello,
we just had a test failure in test_operator_gpu.test_correlation (sourced
from unittest/test_operator.py) on the master branch (tracked at
https://github.com/apache/incubator-mxnet/issues/9553).
Could somebody please have a look?
Best regards,
Marco
Marco,
Thanks a lot for looking through this! Some comments below -
1. *R-package:* Before we create the final tarball for the release, the
R-package is explicitly removed from the cloned MXNet repo. The only info I
have in this regard is that “there are some unresolved licensing issues
Hi Meghna,
thank you for driving the licensing issues!
- R-package: In the linked wiki, you mention that the R-package is not a
part of the release. Could you please elaborate? From my understanding, all
files in the GitHub repository are part of the release.
- Dockerfiles: I just checked another
Hello,
This is an update on the current status of the license fixes (all details
in the wiki linked below):
1. I am constantly updating this wiki, so you can check it at any time
to know the status -
https://cwiki.apache.org/confluence/display/MXNET/MXNet+Source+Licenses
2. All 7 PRs
Hi,
We have identified that CUDA/cuDNN autotune produces a significant
spike in RAM usage while finding the best convolution algorithm.
As far as we understand, this happens inside the cuDNN library. But on
platforms like the TX1, where we only have 4 GB, this is problematic, as
the spike is close to 4 GB.
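For anyone hitting this on memory-constrained devices like the TX1, one possible workaround (assuming the documented MXNet environment variable covers the code path in question) is to disable cuDNN autotuning so MXNet skips the algorithm benchmarking pass that allocates the large temporary workspaces:

```shell
# Disable cuDNN convolution autotuning; MXNet then falls back to a
# default algorithm instead of benchmarking candidates, avoiding the
# temporary workspace allocations that cause the memory spike.
export MXNET_CUDNN_AUTOTUNE_DEFAULT=0
echo "MXNET_CUDNN_AUTOTUNE_DEFAULT=$MXNET_CUDNN_AUTOTUNE_DEFAULT"
```

The trade-off is potentially slower convolutions, since the best algorithm for the hardware is no longer selected automatically.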