yuxihu commented on issue #15260: Gluon Trainer fails with Horovod
DistributedTrainer
URL:
https://github.com/apache/incubator-mxnet/issues/15260#issuecomment-503219146
@chandana1332 Please reopen the issue if you have any additional questions.
yuxihu closed issue #15260: Gluon Trainer fails with Horovod DistributedTrainer
URL: https://github.com/apache/incubator-mxnet/issues/15260
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the URL above to go to the specific comment.
chandana1332 commented on issue #15260: Gluon Trainer fails with Horovod
DistributedTrainer
URL:
https://github.com/apache/incubator-mxnet/issues/15260#issuecomment-503220043
@yuxihu thank you! That worked.
anirudh2290 commented on a change in pull request #15171: Upgrade archive
utility and add back FC improvement
URL: https://github.com/apache/incubator-mxnet/pull/15171#discussion_r294933030
##
File path: Makefile
##
@@ -368,10 +368,32 @@ endif
# Guard against
Zha0q1 commented on a change in pull request #15210: [WIP] Custom Operator
Profiling Enhancement
URL: https://github.com/apache/incubator-mxnet/pull/15210#discussion_r294941672
##
File path: src/engine/naive_engine.cc
##
@@ -159,7 +160,11 @@ class NaiveEngine final :
This is an automated email from the ASF dual-hosted git repository.
zhengda pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git
The following commit(s) were added to refs/heads/master by this push:
new d7e2139 [MXNET-1417][Performance]
zheng-da merged pull request #15262: [MXNET-1417][Performance] Caching Dynamic
Shape Checking Result
URL: https://github.com/apache/incubator-mxnet/pull/15262
larroy commented on a change in pull request #15109: [DOC] refine autograd docs
URL: https://github.com/apache/incubator-mxnet/pull/15109#discussion_r295004565
##
File path: docs/api/python/autograd/autograd.md
##
@@ -76,7 +82,63 @@ Detailed tutorials are available in Part
piyushghai edited a comment on issue #15254:
mxnet(mxnet-full_2.11-linux-x86_64-gpu-1.5.0-SNAPSHOT) cannot support cuda10.1?
URL:
https://github.com/apache/incubator-mxnet/issues/15254#issuecomment-503237205
> **Thank you all. I lowered the CUDA version(9.2), and now it's OK. But my
code
frankfliu commented on issue #10883: make err on RK3399
URL:
https://github.com/apache/incubator-mxnet/issues/10883#issuecomment-503224582
@mxnet-label-bot add [Pending Requester Info]
Zha0q1 opened a new pull request #15210: [WIP] Custom Operator Profiling
Enhancement
URL: https://github.com/apache/incubator-mxnet/pull/15210
## Description ##
fix: https://github.com/apache/incubator-mxnet/issues/15241
I have implemented the new feature.
Need to add test cases.
Zha0q1 closed pull request #15210: [WIP] Custom Operator Profiling Enhancement
URL: https://github.com/apache/incubator-mxnet/pull/15210
larroy commented on a change in pull request #15109: [DOC] refine autograd docs
URL: https://github.com/apache/incubator-mxnet/pull/15109#discussion_r294991307
##
File path: docs/api/python/autograd/autograd.md
##
@@ -76,7 +82,63 @@ Detailed tutorials are available in Part
larroy commented on a change in pull request #15109: [DOC] refine autograd docs
URL: https://github.com/apache/incubator-mxnet/pull/15109#discussion_r294991025
##
File path: docs/api/python/autograd/autograd.md
##
@@ -76,7 +82,63 @@ Detailed tutorials are available in Part
larroy commented on a change in pull request #15109: [DOC] refine autograd docs
URL: https://github.com/apache/incubator-mxnet/pull/15109#discussion_r295007167
##
File path: docs/api/python/autograd/autograd.md
##
@@ -76,7 +82,63 @@ Detailed tutorials are available in Part
reminisce pushed a commit to branch numpy
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git
The following commit(s) were added to refs/heads/numpy by this push:
new 96520cb [numpy] Fix d2l chapter 5
Ishitori commented on issue #15268: Backward doesn't work on LSTM with
sequence_length
URL:
https://github.com/apache/incubator-mxnet/issues/15268#issuecomment-503258503
@stephenrawls, any help with that?
reminisce merged pull request #15264: [numpy] Fix d2l chapter 5
URL: https://github.com/apache/incubator-mxnet/pull/15264
larroy commented on a change in pull request #15171: Upgrade archive utility
and add back FC improvement
URL: https://github.com/apache/incubator-mxnet/pull/15171#discussion_r294985177
##
File path: Makefile
##
@@ -368,10 +368,32 @@ endif
# Guard against displaying
larroy commented on a change in pull request #15210: Custom Operator Profiling
Enhancement
URL: https://github.com/apache/incubator-mxnet/pull/15210#discussion_r294984824
##
File path: src/engine/naive_engine.cc
##
@@ -159,7 +160,11 @@ class NaiveEngine final : public
larroy commented on a change in pull request #15109: [DOC] refine autograd docs
URL: https://github.com/apache/incubator-mxnet/pull/15109#discussion_r295006893
##
File path: docs/api/python/autograd/autograd.md
##
@@ -76,7 +82,63 @@ Detailed tutorials are available in Part
larroy commented on a change in pull request #15109: [DOC] refine autograd docs
URL: https://github.com/apache/incubator-mxnet/pull/15109#discussion_r295007691
##
File path: docs/api/python/autograd/autograd.md
##
@@ -76,7 +82,63 @@ Detailed tutorials are available in Part
leleamol commented on issue #15268: Backward doesn't work on LSTM with
sequence_length
URL:
https://github.com/apache/incubator-mxnet/issues/15268#issuecomment-503302822
@mxnet-label-bot add [Bug, Gluon]
Zha0q1 commented on a change in pull request #15210: [WIP] Custom Operator
Profiling Enhancement
URL: https://github.com/apache/incubator-mxnet/pull/15210#discussion_r294941153
##
File path: src/engine/naive_engine.cc
##
@@ -159,7 +160,11 @@ class NaiveEngine final :
marcoabreu pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet-site.git
The following commit(s) were added to refs/heads/asf-site by this push:
new 8b377ef Bump the publish
larroy commented on a change in pull request #15109: [DOC] refine autograd docs
URL: https://github.com/apache/incubator-mxnet/pull/15109#discussion_r294991726
##
File path: docs/api/python/autograd/autograd.md
##
@@ -76,7 +82,63 @@ Detailed tutorials are available in Part
larroy commented on issue #14535: [DOC] Updated install instructions for mac
URL: https://github.com/apache/incubator-mxnet/pull/14535#issuecomment-503288770
@aaronmarkham what do you suggest? To change python3 commands to just
python? I thought there was a discussion about dropping
larroy commented on a change in pull request #15109: [DOC] refine autograd docs
URL: https://github.com/apache/incubator-mxnet/pull/15109#discussion_r295006364
##
File path: docs/api/python/autograd/autograd.md
##
@@ -76,7 +82,63 @@ Detailed tutorials are available in Part
larroy commented on a change in pull request #15109: [DOC] refine autograd docs
URL: https://github.com/apache/incubator-mxnet/pull/15109#discussion_r29501
##
File path: docs/api/python/autograd/autograd.md
##
@@ -76,7 +82,63 @@ Detailed tutorials are available in Part
pallabdatta commented on issue #14653: [Feature Request] Support ONNX export of
LayerNorm operator
URL:
https://github.com/apache/incubator-mxnet/issues/14653#issuecomment-503297084
Hi, I am interested in having LayerNorm support in ONNX for the
PyTorch-to-ONNX workflow.
Thanks so much in
abhinavs95 commented on a change in pull request #15150: Fix dumps for Constant
initializer
URL: https://github.com/apache/incubator-mxnet/pull/15150#discussion_r294935660
##
File path: python/mxnet/initializer.py
##
@@ -464,6 +464,12 @@ def __init__(self, value):
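The change under review in #15150 concerns how an initializer serializes itself. A hedged sketch of the round-trip concern, using a hypothetical simplified class rather than the actual `python/mxnet/initializer.py` code:

```python
import json

# Hypothetical, simplified Constant initializer (not the actual mxnet code):
# dumps() must emit the class name plus its kwargs so that the value can be
# reconstructed faithfully from the JSON string later.
class Constant:
    def __init__(self, value):
        self.value = value

    def dumps(self):
        # serialize as [name, kwargs], mirroring a registry-style format
        return json.dumps(["constant", {"value": self.value}])

name, kwargs = json.loads(Constant(0.5).dumps())
print(name, kwargs["value"])  # constant 0.5
```

The point of the sketch is only that the stored value must survive the JSON round trip; the real fix may differ in detail.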
piyushghai commented on issue #15254:
mxnet(mxnet-full_2.11-linux-x86_64-gpu-1.5.0-SNAPSHOT) cannot support cuda10.1?
URL:
https://github.com/apache/incubator-mxnet/issues/15254#issuecomment-503237205
> **Thank you all. I lowered the CUDA version(9.2), and now it's OK. But my
code can't
arcadiaphy opened a new issue #15267: Java examples broken with mxnet mkldnn
build
URL: https://github.com/apache/incubator-mxnet/issues/15267
## Description
I've built the scala-package with mxnet mkldnn and run the java demo
ImageClassification, but the demo is broken.
##
mxnet-label-bot commented on issue #15267: Java examples broken with mxnet
mkldnn build
URL:
https://github.com/apache/incubator-mxnet/issues/15267#issuecomment-503245717
Hey, this is the MXNet Label Bot.
Thank you for submitting the issue! I will try and suggest some labels so
that
Ishitori opened a new issue #15268: Backward doesn't work on LSTM with
sequence_length
URL: https://github.com/apache/incubator-mxnet/issues/15268
## Description
LSTM with out-of-the-box variable length was introduced in [this
PR](https://github.com/apache/incubator-mxnet/pull/14208/).
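The issue above concerns the `sequence_length` support added in PR #14208. A hedged NumPy sketch (not MXNet's implementation) of what per-sequence masking means for an LSTM output:

```python
import numpy as np

# Sketch of sequence_length masking: outputs past each sequence's true
# length are zeroed so padding steps do not contribute to the loss (and
# hence to the backward pass). Shapes follow the (max_len, batch, hidden)
# layout; all names here are illustrative.
outputs = np.ones((4, 2, 3))                  # max_len=4, batch=2, hidden=3
seq_len = np.array([2, 4])                    # true length per batch item
steps = np.arange(outputs.shape[0])           # 0..max_len-1
mask = steps[:, None] < seq_len[None, :]      # (max_len, batch) boolean
masked = outputs * mask[:, :, None]

print(masked[3, 0].sum())  # 0.0 — steps >= 2 of the first sequence are zeroed
```

The reported bug is that backward fails when such a mask is in play; the sketch only shows the forward-side semantics.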
mxnet-label-bot commented on issue #15268: Backward doesn't work on LSTM with
sequence_length
URL:
https://github.com/apache/incubator-mxnet/issues/15268#issuecomment-503257407
Hey, this is the MXNet Label Bot.
Thank you for submitting the issue! I will try and suggest some labels so
larroy commented on issue #9859: How to run model trained with mxnet 1.0 on
android
URL:
https://github.com/apache/incubator-mxnet/issues/9859#issuecomment-503276413
Broken means amalgamation is broken? Could you paste some more information
on what's broken and what the error is?
larroy commented on a change in pull request #15109: [DOC] refine autograd docs
URL: https://github.com/apache/incubator-mxnet/pull/15109#discussion_r295005557
##
File path: docs/api/python/autograd/autograd.md
##
@@ -76,7 +82,63 @@ Detailed tutorials are available in Part
jmerkow commented on issue #14421: Updating mxnet from 1.0.0, networks give
different outputs
URL:
https://github.com/apache/incubator-mxnet/issues/14421#issuecomment-503291825
@larroy https://github.com/apache/incubator-mxnet/pull/15026
marcoabreu pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet-site.git
The following commit(s) were added to refs/heads/asf-site by this push:
new 1c64e06 Bump the publish
leleamol commented on issue #15268: Backward doesn't work on LSTM with
sequence_length
URL:
https://github.com/apache/incubator-mxnet/issues/15268#issuecomment-503331624
I could reproduce this issue. Here is a full callstack.
ubuntu@ip-172-31-31-181:~$ python lstm_test.py
leleamol commented on issue #15266: could not use blas when building with cmake
URL:
https://github.com/apache/incubator-mxnet/issues/15266#issuecomment-503331958
@mxnet-label-bot add [Build, Blas]
Zha0q1 commented on a change in pull request #15210: Custom Operator Profiling
Enhancement
URL: https://github.com/apache/incubator-mxnet/pull/15210#discussion_r295050855
##
File path: src/engine/naive_engine.cc
##
@@ -159,7 +160,11 @@ class NaiveEngine final : public
larroy commented on issue #15253: Add higher order gradient support `sigmoid`,
`tan`, `tanh`
URL: https://github.com/apache/incubator-mxnet/pull/15253#issuecomment-503337665
@mxnet-label-bot add [operator]
wkcn pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git
The following commit(s) were added to refs/heads/master by this push:
new ccbbf6b Fix java install docs (#15250)
junrushao1994 commented on issue #15262: [MXNET-1417][Performance] Caching
Dynamic Shape Checking Result
URL: https://github.com/apache/incubator-mxnet/pull/15262#issuecomment-503318433
> So the root cause of the degradation was the additional shape inferencing
passes?
Yep, the
larroy commented on issue #15262: [MXNET-1417][Performance] Caching Dynamic
Shape Checking Result
URL: https://github.com/apache/incubator-mxnet/pull/15262#issuecomment-503318341
Thanks for fixing this, and it's cool that your original commit enables
dynamic shape.
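The fix discussed in #15262 caches the result of the dynamic-shape check so repeated executions skip the extra inference passes. A generic, hedged sketch of that memoization pattern (hypothetical names, not MXNet's actual code):

```python
from functools import lru_cache

# Record how often the expensive work actually runs.
calls = []

@lru_cache(maxsize=None)
def has_dynamic_shape(graph_id):
    # Stands in for a costly shape-inference pass over the graph.
    calls.append(graph_id)
    return graph_id % 2 == 0  # dummy predicate for the sketch

for _ in range(3):
    has_dynamic_shape(7)  # only the first call does the work

print(len(calls))  # 1
```

The design choice is the usual one for caching: the check is a pure function of the graph, so its result can be keyed on the graph's identity and reused across executions.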
apeforest commented on a change in pull request #15210: Custom Operator
Profiling Enhancement
URL: https://github.com/apache/incubator-mxnet/pull/15210#discussion_r295041240
##
File path: tests/python/unittest/test_profiler.py
##
@@ -269,6 +269,129 @@ def
apeforest commented on a change in pull request #15210: Custom Operator
Profiling Enhancement
URL: https://github.com/apache/incubator-mxnet/pull/15210#discussion_r295041736
##
File path: src/operator/custom/custom.cc
##
@@ -345,7 +345,7 @@ void ForwardEx(const
apeforest commented on a change in pull request #15210: Custom Operator
Profiling Enhancement
URL: https://github.com/apache/incubator-mxnet/pull/15210#discussion_r295041821
##
File path: src/operator/custom/custom.cc
##
@@ -415,7 +415,8 @@ void BackwardEx(const
apeforest commented on a change in pull request #15210: Custom Operator
Profiling Enhancement
URL: https://github.com/apache/incubator-mxnet/pull/15210#discussion_r295041352
##
File path: tests/python/unittest/test_profiler.py
##
@@ -269,6 +269,129 @@ def
larroy commented on issue #15253: Add higher order gradient support `sigmoid`,
`tan`, `tanh`
URL: https://github.com/apache/incubator-mxnet/pull/15253#issuecomment-503337603
@mxnet-label-bot add [pr-awaiting-review,autograd]
larroy commented on issue #15270: Fix warnings in CLang.
URL: https://github.com/apache/incubator-mxnet/pull/15270#issuecomment-503337489
@mxnet-label-bot add [pr-awaiting-review,build]
adamcrussell commented on issue #15271: build error on OS X
URL:
https://github.com/apache/incubator-mxnet/issues/15271#issuecomment-503339878
Note that the fix I found is included in what I wrote above. I wasn't sure
enough of the cause to submit this as a PR and figured a note about
NeoZhangJianyu commented on issue #15251: cpp package fails to build
URL:
https://github.com/apache/incubator-mxnet/issues/15251#issuecomment-503359834
@aaronmarkham
1. Have you run 'make clean' before changing the make parameters?
2. There is an error in your log:
leleamol commented on issue #15267: Java examples broken with mxnet mkldnn build
URL:
https://github.com/apache/incubator-mxnet/issues/15267#issuecomment-503307060
@mxnet-label-bot add [Java, Scala, MKLDNN]
apeforest commented on a change in pull request #15210: Custom Operator
Profiling Enhancement
URL: https://github.com/apache/incubator-mxnet/pull/15210#discussion_r295040348
##
File path: src/engine/threaded_engine.cc
##
@@ -333,9 +333,14 @@ void
apeforest commented on a change in pull request #15210: Custom Operator
Profiling Enhancement
URL: https://github.com/apache/incubator-mxnet/pull/15210#discussion_r295040225
##
File path: src/engine/naive_engine.cc
##
@@ -159,7 +160,11 @@ class NaiveEngine final : public
Roshrini commented on issue #15130: Add NaiveEngine tests in CI
URL: https://github.com/apache/incubator-mxnet/pull/15130#issuecomment-503329801
@xinyu-intel Thanks for working on this. I agree with Marco that the naive
engine is slower, so it's better to have these as nightly tests. But how
ptrendx opened a new pull request #15272: Proper bulking of ops not using
FCompute
URL: https://github.com/apache/incubator-mxnet/pull/15272
## Description ##
PR #13890 made bulking of ops more performant in the hybridized models with
`static_alloc=True`. However, it was limited to ops
stephenrawls commented on issue #15268: Backward doesn't work on LSTM with
sequence_length
URL:
https://github.com/apache/incubator-mxnet/issues/15268#issuecomment-503340350
Looking at this now.
apeforest commented on a change in pull request #15253: Add higher order
gradient support `sigmoid`, `tan`, `tanh`
URL: https://github.com/apache/incubator-mxnet/pull/15253#discussion_r295062112
##
File path: src/operator/tensor/elemwise_unary_op_basic.cc
##
@@ -121,7
anirudh2290 opened a new issue #15273: cuda memcheck failures with different
cuda versions
URL: https://github.com/apache/incubator-mxnet/issues/15273
This was encountered during work on the PR:
https://github.com/apache/incubator-mxnet/pull/15118. This is also related to
mxnet-label-bot commented on issue #15273: cuda memcheck failures with
different cuda versions
URL:
https://github.com/apache/incubator-mxnet/issues/15273#issuecomment-503344419
Hey, this is the MXNet Label Bot.
Thank you for submitting the issue! I will try and suggest some labels
anirudh2290 commented on a change in pull request #15210: Custom Operator
Profiling Enhancement
URL: https://github.com/apache/incubator-mxnet/pull/15210#discussion_r295047266
##
File path: src/engine/naive_engine.cc
##
@@ -159,7 +160,11 @@ class NaiveEngine final :
larroy commented on a change in pull request #14779: [Don't merge] Fully
connected, higher order grad
URL: https://github.com/apache/incubator-mxnet/pull/14779#discussion_r295031340
##
File path: tests/python/unittest/test_gluon.py
##
@@ -915,6 +915,24 @@ def
larroy commented on issue #15262: [MXNET-1417][Performance] Caching Dynamic
Shape Checking Result
URL: https://github.com/apache/incubator-mxnet/pull/15262#issuecomment-503317603
@junrushao1994 it would have been nice if you had tagged me for review as
well, since I was looking into this
larroy commented on issue #15262: [MXNET-1417][Performance] Caching Dynamic
Shape Checking Result
URL: https://github.com/apache/incubator-mxnet/pull/15262#issuecomment-503317990
So the root cause of the degradation was the additional shape inferencing
passes?
junrushao1994 commented on issue #15262: [MXNET-1417][Performance] Caching
Dynamic Shape Checking Result
URL: https://github.com/apache/incubator-mxnet/pull/15262#issuecomment-503318145
@larroy Sure! Will tag you next time
larroy opened a new pull request #15270: Fix warnings in CLang.
URL: https://github.com/apache/incubator-mxnet/pull/15270
## Description ##
In file included from ../src/kvstore/kvstore.cc:28:
../src/kvstore/./kvstore_local.h:281:23: warning: lambda capture 'this' is
not used
mseth10 commented on issue #14721: Random number generator seed setting does
not always work for `mxnet.ndarray.linalg.potrf`
URL:
https://github.com/apache/incubator-mxnet/issues/14721#issuecomment-503358310
@iaroslav-ai I ran on Ubuntu.
wkcn commented on issue #15250: Fix java install docs
URL: https://github.com/apache/incubator-mxnet/pull/15250#issuecomment-503362746
Thanks for your contribution!
wkcn merged pull request #15250: Fix java install docs
URL: https://github.com/apache/incubator-mxnet/pull/15250
Zha0q1 commented on a change in pull request #15210: Custom Operator Profiling
Enhancement
URL: https://github.com/apache/incubator-mxnet/pull/15210#discussion_r295029975
##
File path: src/engine/naive_engine.cc
##
@@ -159,7 +160,11 @@ class NaiveEngine final : public
larroy opened a new pull request #15269: [DOC] Clarify that global pooling is
going to reset padding
URL: https://github.com/apache/incubator-mxnet/pull/15269
This behaviour changed from older MXNet versions in which global pooling
would consider padding. This clarifies the user
larroy commented on issue #15269: [DOC] Clarify that global pooling is going to
reset padding
URL: https://github.com/apache/incubator-mxnet/pull/15269#issuecomment-503337147
@mxnet-label-bot add [doc,pr-awaiting-review]
adamcrussell opened a new issue #15271: build error on OS X
URL: https://github.com/apache/incubator-mxnet/issues/15271
Note: Providing complete information in the most concise form is the best
way to get help. This issue template serves as the checklist for essential
information to most
mxnet-label-bot commented on issue #15271: build error on OS X
URL:
https://github.com/apache/incubator-mxnet/issues/15271#issuecomment-503339485
Hey, this is the MXNet Label Bot.
Thank you for submitting the issue! I will try and suggest some labels so
that the appropriate MXNet
apeforest commented on a change in pull request #15253: Add higher order
gradient support `sigmoid`, `tan`, `tanh`
URL: https://github.com/apache/incubator-mxnet/pull/15253#discussion_r295082102
##
File path: src/operator/tensor/elemwise_unary_op_basic.cc
##
@@ -121,7
reminisce opened a new pull request #15264: [numpy] Fix d2l chapter 5
URL: https://github.com/apache/incubator-mxnet/pull/15264
## Description ##
(Brief description on what this PR is about)
## Checklist ##
### Essentials ###
Please feel free to remove inapplicable items for
anirudh2290 commented on a change in pull request #15164: [C++] Improve
inference script to support benchmark on Imagenet
URL: https://github.com/apache/incubator-mxnet/pull/15164#discussion_r294622396
##
File path: cpp-package/include/mxnet-cpp/initializer.h
##
@@ -91,6
anirudh2290 commented on a change in pull request #15164: [C++] Improve
inference script to support benchmark on Imagenet
URL: https://github.com/apache/incubator-mxnet/pull/15164#discussion_r294620813
##
File path: cpp-package/example/inference/unit_test_imagenet_inference.sh
anirudh2290 commented on a change in pull request #15164: [C++] Improve
inference script to support benchmark on Imagenet
URL: https://github.com/apache/incubator-mxnet/pull/15164#discussion_r294620902
##
File path: cpp-package/example/inference/README.md
##
@@ -30,34
lucinyaLi commented on issue #14810: Add the Gluon Implementation of Deformable
Convolution
URL: https://github.com/apache/incubator-mxnet/pull/14810#issuecomment-502969476
I met an error when I used the deformable conv layer presented above.
The error is: MXNetError: [14:16:07]
lucinyaLi edited a comment on issue #14810: Add the Gluon Implementation of
Deformable Convolution
URL: https://github.com/apache/incubator-mxnet/pull/14810#issuecomment-502994530
@suyz526
This is a part of my net; I changed some of the original conv layers in SSD
to deformable conv.
suyz526 commented on issue #14810: Add the Gluon Implementation of Deformable
Convolution
URL: https://github.com/apache/incubator-mxnet/pull/14810#issuecomment-502997580
> > CustomOp
> > This is a part of my net, I change some original conv layer in SSD into
the deformable conv. There
lucinyaLi commented on issue #14810: Add the Gluon Implementation of Deformable
Convolution
URL: https://github.com/apache/incubator-mxnet/pull/14810#issuecomment-503000372
@suyz526
init_scale = mx.nd.array([0.229, 0.224, 0.225], ctx=mx.gpu(1)).reshape((1,
3, 1, 1)) * 255
Or
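The truncated snippet above reshapes a per-channel scale to `(1, 3, 1, 1)`. A hedged NumPy illustration (shapes only; 0.229/0.224/0.225 are the usual ImageNet std constants, and the array/batch names here are made up) of why that shape broadcasts over an NCHW batch:

```python
import numpy as np

# The (1, 3, 1, 1) reshape lets a per-channel scale broadcast across an
# NCHW image batch: the length-3 axis lines up with the channel axis, and
# the size-1 axes stretch over batch, height, and width.
scale = np.array([0.229, 0.224, 0.225]).reshape((1, 3, 1, 1)) * 255
batch = np.ones((2, 3, 4, 4))   # N=2 images, 3 channels, 4x4 pixels
normalized = batch / scale      # broadcasts channel-wise

print(normalized.shape)  # (2, 3, 4, 4)
```

This mirrors the broadcasting semantics of `mx.nd` arrays; the sketch uses NumPy only so it runs standalone.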
mouryarishik commented on issue #9686: [Discussion] MXNet 2.0 Roadmap (was:
APIs that might be a good idea to break in 2.0)
URL:
https://github.com/apache/incubator-mxnet/issues/9686#issuecomment-503000563
I'd like to give some suggestions to improve the Estimator experience.
- The
marcoabreu pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet-site.git
The following commit(s) were added to refs/heads/asf-site by this push:
new b7c3096 Bump the publish
lucinyaLi commented on issue #14810: Add the Gluon Implementation of Deformable
Convolution
URL: https://github.com/apache/incubator-mxnet/pull/14810#issuecomment-502994530
> CustomOp
This is a part of my net; I changed some of the original conv layers in SSD
to deformable conv. There is
mxnet-label-bot commented on issue #15265: Run pretrained model on android
URL:
https://github.com/apache/incubator-mxnet/issues/15265#issuecomment-503009111
Hey, this is the MXNet Label Bot.
Thank you for submitting the issue! I will try and suggest some labels so
that the