eric-haibin-lin commented on issue #18543:
URL:
https://github.com/apache/incubator-mxnet/issues/18543#issuecomment-663098706
I'm still looking into this. Currently the mirror pass requires some
shape/type information which is missing at the point of calling the pass
gigasquid commented on issue #17783:
URL:
https://github.com/apache/incubator-mxnet/issues/17783#issuecomment-663085890
@saudet @szha - I think this would be a good path forward (from the Clojure
perspective)
This is an automated message from the Apache Git Service. To respond to the
message, please log on to GitHub and use the URL above to go to the specific comment.
mxnet-bot commented on pull request #18660:
URL: https://github.com/apache/incubator-mxnet/pull/18660#issuecomment-663083306
Jenkins CI successfully triggered : [centos-cpu]
Yiyan66 commented on pull request #18660:
URL: https://github.com/apache/incubator-mxnet/pull/18660#issuecomment-663083249
@mxnet-bot run ci [centos-cpu]
szha commented on issue #18776:
URL:
https://github.com/apache/incubator-mxnet/issues/18776#issuecomment-663071856
Thanks for reporting. We are removing that op in the RNN layer in 2.0.
It will need to be registered in 1.x. That operator should be registered as a
simple concatenation fo
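The kind of converter registration szha describes can be sketched schematically. This is a hypothetical illustration, not MXNet's actual mx2onnx API: the registry, decorator, and function names below are invented for the sketch; the real fix would register a conversion function for `_rnn_param_concat` in the ONNX export op table, reusing plain concatenation.

```python
import numpy as np

# Hypothetical converter registry mimicking how an ONNX exporter looks up a
# conversion function per op type (names are illustrative, not MXNet's API).
CONVERTERS = {}

def register(op_type):
    def wrapper(fn):
        CONVERTERS[op_type] = fn
        return fn
    return wrapper

def convert(op_type, *arrays, **attrs):
    if op_type not in CONVERTERS:
        # Mirrors the error reported in #18776.
        raise AttributeError(
            f"No conversion function registered for op type {op_type} yet.")
    return CONVERTERS[op_type](*arrays, **attrs)

# _rnn_param_concat just concatenates the RNN parameter arrays, so the
# conversion can delegate to ordinary concatenation along the given axis.
@register("_rnn_param_concat")
def convert_rnn_param_concat(*arrays, dim=0):
    return np.concatenate(arrays, axis=dim)

w, b = np.ones((2, 3)), np.zeros((1, 3))
out = convert("_rnn_param_concat", w, b, dim=0)
print(out.shape)  # (3, 3)
```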
ptrendx opened a new pull request #18778:
URL: https://github.com/apache/incubator-mxnet/pull/18778
@szha
mxnet-bot commented on pull request #18778:
URL: https://github.com/apache/incubator-mxnet/pull/18778#issuecomment-663069406
Hey @ptrendx , Thanks for submitting the PR
All tests are already queued to run once. If tests fail, you can trigger one
or more tests again with the following co
josephevans commented on pull request #18752:
URL: https://github.com/apache/incubator-mxnet/pull/18752#issuecomment-663066996
@mxnet-bot run ci [unix-gpu]
mxnet-bot commented on pull request #18752:
URL: https://github.com/apache/incubator-mxnet/pull/18752#issuecomment-663067039
Jenkins CI successfully triggered : [unix-gpu]
szha commented on issue #18774:
URL:
https://github.com/apache/incubator-mxnet/issues/18774#issuecomment-663065512
@niranjannilekani thanks for reporting. @hetong007 could you help on this?
szha commented on issue #17783:
URL:
https://github.com/apache/incubator-mxnet/issues/17783#issuecomment-663064169
@saudet this looks awesome! An 18% improvement in throughput is quite
significant for switching the way of integration for a frontend binding. I
think we should definitely st
anko-intel commented on a change in pull request #18777:
URL: https://github.com/apache/incubator-mxnet/pull/18777#discussion_r459504958
File path: src/operator/tensor/elemwise_sum.cc
@@ -118,11 +118,26 @@ void ElementWiseSumComputeExCPU(const nnvm::NodeAttrs& attrs,
anko-intel commented on a change in pull request #18777:
URL: https://github.com/apache/incubator-mxnet/pull/18777#discussion_r459502487
File path: src/operator/tensor/elemwise_sum.cc
@@ -118,11 +118,26 @@ void ElementWiseSumComputeExCPU(const nnvm::NodeAttrs& attrs,
mxnet-bot commented on pull request #18777:
URL: https://github.com/apache/incubator-mxnet/pull/18777#issuecomment-663042745
Hey @bgawrych , Thanks for submitting the PR
All tests are already queued to run once. If tests fail, you can trigger one
or more tests again with the following c
bgawrych opened a new pull request #18777:
URL: https://github.com/apache/incubator-mxnet/pull/18777
## Description ##
This PR fixes a bug which occurs when training the GluonCV DeepLab model with
oneDNN support.
Original issue: https://github.com/dmlc/gluon-cv/issues/1368
To reproduce:
saudet commented on issue #17783:
URL:
https://github.com/apache/incubator-mxnet/issues/17783#issuecomment-662994965
Hi, instead of JNA, I would be happy to provide bindings for the C API and
maintain packages based on the JavaCPP Presets here:
https://github.com/bytedeco/javacpp-preset
xizi opened a new issue #18776:
URL: https://github.com/apache/incubator-mxnet/issues/18776
Exporting the rnn.GRU op failed; the error message is as follows:
AttributeError: No conversion function registered for op type
_rnn_param_concat yet.
--
bgawrych commented on pull request #18708:
URL: https://github.com/apache/incubator-mxnet/pull/18708#issuecomment-662950170
@TaoLv @PatricZhao Can you take a look?
CC: @Yiyan66 as owner of the fix "npx.softmax for 0-sized inputs" (#18158)
--
Yiyan66 commented on pull request #18660:
URL: https://github.com/apache/incubator-mxnet/pull/18660#issuecomment-662903653
@mxnet-bot run ci [centos-cpu, centos-gpu]
mxnet-bot commented on pull request #18660:
URL: https://github.com/apache/incubator-mxnet/pull/18660#issuecomment-662903709
Jenkins CI successfully triggered : [centos-cpu, centos-gpu]
chinakook commented on issue #18643:
URL:
https://github.com/apache/incubator-mxnet/issues/18643#issuecomment-662876105
A very big and complex graph with dynamic ops may fail deferred inference.
You can modularize it and then compose the modules.
--
wkcn commented on pull request #18714:
URL: https://github.com/apache/incubator-mxnet/pull/18714#issuecomment-662869997
@ChaiBapchya , This PR looks good to me too. : )
chinakook commented on issue #18643:
URL:
https://github.com/apache/incubator-mxnet/issues/18643#issuecomment-662868075
I also encountered this problem; deferred inference may fail.
bgawrych commented on pull request #18708:
URL: https://github.com/apache/incubator-mxnet/pull/18708#issuecomment-662852190
@mxnet-bot run ci [edge, unix-gpu]
mxnet-bot commented on pull request #18708:
URL: https://github.com/apache/incubator-mxnet/pull/18708#issuecomment-662852213
Jenkins CI successfully triggered : [edge, unix-gpu]
DickJC123 commented on pull request #18424:
URL: https://github.com/apache/incubator-mxnet/pull/18424#issuecomment-662849662
@MoisesHer Thanks for your patience. I know you've been waiting on this
PR's functionality for your own PR. Stay tuned; I'm hopeful you'll be unblocked
soon.
---
DickJC123 commented on pull request #18424:
URL: https://github.com/apache/incubator-mxnet/pull/18424#issuecomment-662846028
Sorry, I got side-tracked on a different PR and let this sit idle for a while.
At this point, I'm done with the prep of this PR and would be happy to respond
to a rev
DickJC123 opened a new issue #18775:
URL: https://github.com/apache/incubator-mxnet/issues/18775
## Description
This was observed during the development of
https://github.com/apache/incubator-mxnet/pull/18424. I've developed a fix
which I've added via commit
https://github.com/apache/
niranjannilekani opened a new issue #18774:
URL: https://github.com/apache/incubator-mxnet/issues/18774
## Description
I am unable to install the mxnet package in R 4.0.1
### Error Message
package ‘mxnet’ was installed before R 4.0.0: please re-install it
## To Reproduce
(I
leezu commented on issue #18764:
URL:
https://github.com/apache/incubator-mxnet/issues/18764#issuecomment-662839367
That's a separate problem. @eric-haibin-lin mentioned the problem does not
apply to 1.x nightly build
szha commented on issue #18764:
URL:
https://github.com/apache/incubator-mxnet/issues/18764#issuecomment-662832752
It still seems to be the case in 2.0 in #18772
leezu commented on issue #15832:
URL:
https://github.com/apache/incubator-mxnet/issues/15832#issuecomment-662817805
https://github.com/apache/incubator-mxnet/issues/10988
leezu commented on issue #10988:
URL:
https://github.com/apache/incubator-mxnet/issues/10988#issuecomment-662817730
```
[2020-07-23T03:30:14.821Z]
tests/python/gpu/test_numpy_fallback.py::test_np_fallback_decorator PASSED [
19%]
[2020-07-23T03:30:16.176Z]
tests/python/gpu/test_oper
josephevans commented on pull request #18752:
URL: https://github.com/apache/incubator-mxnet/pull/18752#issuecomment-662817291
@mxnet-bot run ci [website]
mxnet-bot commented on pull request #18752:
URL: https://github.com/apache/incubator-mxnet/pull/18752#issuecomment-662817305
Jenkins CI successfully triggered : [website]
ChaiBapchya commented on pull request #18714:
URL: https://github.com/apache/incubator-mxnet/pull/18714#issuecomment-662811732
@wkcn Appreciate your help with this cherry-pick. Changes look good to me.
Can we merge it?
wkcn commented on pull request #18714:
URL: https://github.com/apache/incubator-mxnet/pull/18714#issuecomment-662805396
Hi @ChaiBapchya , all tests passed : )
ChaiBapchya commented on pull request #18773:
URL: https://github.com/apache/incubator-mxnet/pull/18773#issuecomment-662787469
Looks good since it's a cherry-pick.
For functionality check, broadcast_axis specific tests in unix-cpu should
verify that on CI.
For performance check, do y
ptrendx merged pull request #18742:
URL: https://github.com/apache/incubator-mxnet/pull/18742
access2rohit commented on pull request #18773:
URL: https://github.com/apache/incubator-mxnet/pull/18773#issuecomment-662783995
@leezu @sandeep-krishnamurthy can you help merge? These are cherry-picked
from master (already merged there)
mxnet-bot commented on pull request #18773:
URL: https://github.com/apache/incubator-mxnet/pull/18773#issuecomment-662783346
Hey @access2rohit , Thanks for submitting the PR
All tests are already queued to run once. If tests fail, you can trigger one
or more tests again with the followi
access2rohit opened a new pull request #18773:
URL: https://github.com/apache/incubator-mxnet/pull/18773
## Description ##
Backport the broadcast_axis optimization for CPU and GPU to MXNet 1.7.x
## Checklist ##
### Essentials ###
Please feel free to remove inapplicable items f
leezu commented on issue #18772:
URL:
https://github.com/apache/incubator-mxnet/issues/18772#issuecomment-662781755
Horovod includes the MXNet C++ headers and based on them interacts with the
Engine:
https://github.com/horovod/horovod/blob/cf022be959a7c9431a8415729758b26dec1a87e5/
ZheyuYe commented on issue #18766:
URL:
https://github.com/apache/incubator-mxnet/issues/18766#issuecomment-662778617
@eric-haibin-lin Thanks a lot. I'll try this version.
DickJC123 commented on pull request #18424:
URL: https://github.com/apache/incubator-mxnet/pull/18424#issuecomment-662773737
I have another clean-up commit or two on this...
eric-haibin-lin commented on issue #18766:
URL:
https://github.com/apache/incubator-mxnet/issues/18766#issuecomment-662772739
I have a version that seems to work with mxnet built from source here:
https://github.com/eric-haibin-lin/horovod/tree/mx2
feel free to try it out if you need i
eric-haibin-lin commented on issue #18772:
URL:
https://github.com/apache/incubator-mxnet/issues/18772#issuecomment-662772210
```
[1,0]:(gdb) bt
[1,0]:#0  0x77419b80 in pthread_mutex_lock () from /lib64/libpthread.so.0
[1,0]:#1  0x7fff68a1b81d in mxnet::engine::Thread
eric-haibin-lin opened a new issue #18772:
URL: https://github.com/apache/incubator-mxnet/issues/18772
I am working on a bug fix for mxnet master with my horovod branch:
https://github.com/eric-haibin-lin/horovod/tree/mx2
I noticed that the example passes if I use mxnet built from s
gilbertfrancois edited a comment on issue #18751:
URL:
https://github.com/apache/incubator-mxnet/issues/18751#issuecomment-662755383
Coming back to mxnet: it looks like it is possible to do a forward pass
(inference mode) on cpu when the BatchNorm is placed with Dense layers. Because
on C
gilbertfrancois commented on issue #18751:
URL:
https://github.com/apache/incubator-mxnet/issues/18751#issuecomment-662756717
I suspect that the behaviour would be corrected if the update of moving_mean and
moving_var on GPU were done in the backward pass, as it is on CPU. It will
solve the N
gilbertfrancois commented on issue #18751:
URL:
https://github.com/apache/incubator-mxnet/issues/18751#issuecomment-662755383
Coming back to mxnet: it looks like it is possible to do a forward pass
(inference mode) on cpu when the BatchNorm is placed with Dense layers. But on
gpu, it trie
gilbertfrancois edited a comment on issue #18751:
URL:
https://github.com/apache/incubator-mxnet/issues/18751#issuecomment-662750082
Ok, I see that. But I guess it is the same intended behaviour as PyTorch
nn.BatchNorm1d for Dense layers, which takes as input (N, C). The normalization
is
mxnet-bot commented on pull request #18771:
URL: https://github.com/apache/incubator-mxnet/pull/18771#issuecomment-662753844
Hey @leezu , Thanks for submitting the PR
All tests are already queued to run once. If tests fail, you can trigger one
or more tests again with the following comm
leezu opened a new pull request #18771:
URL: https://github.com/apache/incubator-mxnet/pull/18771
- Delete unused Dockerfiles
- Delete unused install/*.sh scripts
- Consolidate ubuntu_gpu_tensorrt and ubuntu_gpu
- Remove deprecated logic in ci/build.py (no longer needed with
do
mxnet-bot commented on pull request #18742:
URL: https://github.com/apache/incubator-mxnet/pull/18742#issuecomment-662752044
Jenkins CI successfully triggered : [unix-gpu]
ChaiBapchya commented on pull request #18742:
URL: https://github.com/apache/incubator-mxnet/pull/18742#issuecomment-662752028
@mxnet-bot run ci [unix-gpu]
gilbertfrancois commented on issue #18751:
URL:
https://github.com/apache/incubator-mxnet/issues/18751#issuecomment-662750082
Ok, I see that. But I guess it is the same intended behaviour as PyTorch
nn.BatchNorm1d, which takes as input (N, C). The normalization is done over C
features. E.
nabulsi commented on issue #18759:
URL:
https://github.com/apache/incubator-mxnet/issues/18759#issuecomment-662740615
@mseth10 the wheel is currently enough for me. I can move forward now, but I
am worried that in the next few days/weeks I will find I need something more
and will have to
TristonC commented on issue #18751:
URL:
https://github.com/apache/incubator-mxnet/issues/18751#issuecomment-662730136
After nn.Flatten(), the batch norm is actually performed on a 1xCx1x1
Tensor, where C is 9408 for the first batch norm layer in tail, and it is 32
for the second batch mo
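The effect TristonC describes can be illustrated with the textbook batch-norm statistics in plain NumPy (this is not MXNet's implementation, just the math): with a batch size of 1, each feature's mean equals the value itself, the variance is 0, and the normalized output collapses to zeros, with eps keeping the division finite.

```python
import numpy as np

# Training-mode batch norm over the batch dimension, written out directly.
def batch_norm_train(x, gamma=1.0, beta=0.0, eps=1e-5):
    mean = x.mean(axis=0, keepdims=True)  # per-feature mean over the batch
    var = x.var(axis=0, keepdims=True)    # per-feature variance over the batch
    return gamma * (x - mean) / np.sqrt(var + eps) + beta

# Batch size 1 after nn.Flatten(): a (1, 9408) input, matching the first
# batch-norm layer discussed in the issue.
x = np.random.randn(1, 9408)
y = batch_norm_train(x)
print(np.allclose(y, 0.0))  # True: (x - mean) is exactly 0 when N == 1
```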
gilbertfrancois commented on issue #18751:
URL:
https://github.com/apache/incubator-mxnet/issues/18751#issuecomment-662729925
Hi @TristonC, the project is for training. I adapted the script. It now does
one training step, with a forward and backward pass and a validation step.
The h
TristonC removed a comment on issue #18751:
URL:
https://github.com/apache/incubator-mxnet/issues/18751#issuecomment-662721253
It looks like there is a bug in doing batch norm with a 1D array when
the batch size is 1. For example, in this case, after flatten, the vector size
is 9408,
TristonC commented on issue #18751:
URL:
https://github.com/apache/incubator-mxnet/issues/18751#issuecomment-662721253
It looks like there is a bug in doing batch norm with a 1D array when
the batch size is 1. For example, in this case, after flatten, the vector size
is 9408, which m
DickJC123 commented on a change in pull request #18424:
URL: https://github.com/apache/incubator-mxnet/pull/18424#discussion_r459109387
File path: src/c_api/c_api_test.cc
@@ -106,3 +106,25 @@ int MXRemoveSubgraphPropertyOpNamesV2(const char* prop_name) {
}
API_END(
DickJC123 commented on a change in pull request #18424:
URL: https://github.com/apache/incubator-mxnet/pull/18424#discussion_r459107133
File path: tests/python/gpu/test_kvstore_gpu.py
@@ -20,8 +20,8 @@
import os
import mxnet as mx
import numpy as np
-import pytest
-fro
mseth10 commented on issue #18759:
URL:
https://github.com/apache/incubator-mxnet/issues/18759#issuecomment-662716632
@nabulsi that's great news. I have not yet tested the cross compilation
script provided on the installation page, and it might need some fixing. Until
that is done, is the
aaronmarkham merged pull request #66:
URL: https://github.com/apache/incubator-mxnet-site/pull/66
TristonC commented on issue #18751:
URL:
https://github.com/apache/incubator-mxnet/issues/18751#issuecomment-662687910
@gilbertfrancois I did a quick test, to answer your question:
> I don't understand why y_out from MyNet with BatchNorm on GPU still
contains real numbers, given that th
ys2843 opened a new pull request #66:
URL: https://github.com/apache/incubator-mxnet-site/pull/66
This PR contains the latest website build based on the mxnet master branch. This
is to sync the beta-stage website with the latest mxnet website.
---
leezu opened a new issue #18770:
URL: https://github.com/apache/incubator-mxnet/issues/18770
MXNet ignores the byte-order of numpy data-types and always uses native
byte-order.
```
>>> import mxnet as mx
>>> import numpy as np
>>> print(mx.np.arange(10, dtype=np.dtype('>> pr
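The byte-order behaviour leezu reports can be seen in plain NumPy, where a dtype carries an explicit byte order that any consumer must honor. A minimal sketch (pure NumPy, no MXNet; the swapped value shown assumes a little-endian host):

```python
import numpy as np

# A big-endian int32 dtype: '>' marks the byte order explicitly.
be = np.dtype('>i4')
a = np.arange(10, dtype=be)
print(a.dtype.byteorder)  # '>'

# Reinterpreting the raw bytes with the native byte order changes the values
# on a little-endian machine. This is the kind of corruption a consumer gets
# if it ignores the dtype's byte order and assumes native layout.
native = a.view(a.dtype.newbyteorder('='))
print(native[1])  # 16777216 on little-endian hosts (the bytes of 1, swapped)
```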
szha commented on issue #16167:
URL:
https://github.com/apache/incubator-mxnet/issues/16167#issuecomment-662620865
@fhieber we are planning to release the first public beta on this somewhere
in August. At the moment we are finalizing some API changes and also validating
them in GluonNLP.
szha commented on pull request #18478:
URL: https://github.com/apache/incubator-mxnet/pull/18478#issuecomment-662619908
Yes sounds good.
ys2843 commented on pull request #65:
URL:
https://github.com/apache/incubator-mxnet-site/pull/65#issuecomment-662605128
> Please add description.
Done, @sandeep-krishnamurthy
sandeep-krishnamurthy commented on pull request #65:
URL:
https://github.com/apache/incubator-mxnet-site/pull/65#issuecomment-662596749
Please add description.
wkcn commented on pull request #18714:
URL: https://github.com/apache/incubator-mxnet/pull/18714#issuecomment-662593292
Hi @ChaiBapchya ,
I found a compilation error in
http://jenkins.mxnet-ci.amazon-ml.com/blue/organizations/jenkins/mxnet-validation%2Funix-cpu/detail/PR-18714/3/pipeline
ys2843 opened a new pull request #65:
URL: https://github.com/apache/incubator-mxnet-site/pull/65
xidulu commented on pull request #18758:
URL: https://github.com/apache/incubator-mxnet/pull/18758#issuecomment-662566711
@leezu
Thx for pointing that out, I will take a look at it.
szha edited a comment on pull request #18758:
URL: https://github.com/apache/incubator-mxnet/pull/18758#issuecomment-662183511
~not yet. I think we should provide the automation in CI first, which may
take some time. in the meantime, I think we can check in the jupyter notebook
for now wit
leezu commented on pull request #18758:
URL: https://github.com/apache/incubator-mxnet/pull/18758#issuecomment-662563255
@xidulu @szha it's supported via
https://github.com/apache/incubator-mxnet/blob/243ade93bcb8b7962d1faeb89c98409e3ae0d7a4/docs/python_docs/python/Makefile#L31-L33
--
TristonC commented on issue #18751:
URL:
https://github.com/apache/incubator-mxnet/issues/18751#issuecomment-662547038
@gilbertfrancois Is your project for training or inference? Your script
uses autograd, but it does not call backward(). The reason I asked is that
BatchNorm behave
aaronmarkham merged pull request #64:
URL: https://github.com/apache/incubator-mxnet-site/pull/64
marcoabreu commented on issue #18753:
URL:
https://github.com/apache/incubator-mxnet/issues/18753#issuecomment-662538274
Since resolving a merge conflict will always result in a new commit hash,
it's inevitable that it's rerun by CI.
Merge conflict or not - a PR status should go sta
DickJC123 commented on issue #18753:
URL:
https://github.com/apache/incubator-mxnet/issues/18753#issuecomment-662534016
When my in-development PRs become unmergeable due to other accepted PRs, I
merge the current master with my PR and push the result, which generates
another CI run. Is i
fhieber commented on issue #16167:
URL:
https://github.com/apache/incubator-mxnet/issues/16167#issuecomment-662345601
@szha is there a recent estimate on the timeline for MXNet 2.0? Would you
recommend developing downstream toolkits (e.g. Sockeye) against the master
branch now, or rather w
suyz526 edited a comment on issue #18727:
URL:
https://github.com/apache/incubator-mxnet/issues/18727#issuecomment-662315508
Hi,
In
[https://mxnet.apache.org/api/python/docs/api](https://mxnet.apache.org/api/python/docs/api),
almost every page is blank, e.g.
[ndarray](https://mxn
suyz526 commented on issue #18727:
URL:
https://github.com/apache/incubator-mxnet/issues/18727#issuecomment-662315508
Hi,
In
[https://mxnet.apache.org/api/python/docs/api](https://mxnet.apache.org/api/python/docs/api),
almost every page is empty.
This 'beta' page works:
[
ciyongch commented on pull request #18478:
URL: https://github.com/apache/incubator-mxnet/pull/18478#issuecomment-662275312
Ok, got it. Then let's keep it as is for now and try to finalize the fix
in the next release.
Actually, how to handle the dual license issue or re-license the t
ChaiBapchya commented on pull request #18714:
URL: https://github.com/apache/incubator-mxnet/pull/18714#issuecomment-662272392
@wkcn cherry-picking 1.x into 1.6 doesn't quite resolve the CI issues... any idea?
ChaiBapchya commented on pull request #18742:
URL: https://github.com/apache/incubator-mxnet/pull/18742#issuecomment-662271682
@mxnet-bot run ci [unix-gpu]