[GitHub] [incubator-mxnet] mxnet-bot commented on pull request #18608: Cherry-pick #18310 #18355
mxnet-bot commented on pull request #18608: URL: https://github.com/apache/incubator-mxnet/pull/18608#issuecomment-660782639 Jenkins CI successfully triggered : [unix-cpu] This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org
[GitHub] [incubator-mxnet] MoisesHer commented on pull request #18608: Cherry-pick #18310 #18355
MoisesHer commented on pull request #18608: URL: https://github.com/apache/incubator-mxnet/pull/18608#issuecomment-660782622 @mxnet-bot run ci [unix-cpu]
[GitHub] [incubator-mxnet] MoisesHer commented on pull request #18608: Cherry-pick #18310 #18355
MoisesHer commented on pull request #18608: URL: https://github.com/apache/incubator-mxnet/pull/18608#issuecomment-660781370 @mxnet-bot run ci [unix-gpu]
[GitHub] [incubator-mxnet] mxnet-bot commented on pull request #18608: Cherry-pick #18310 #18355
mxnet-bot commented on pull request #18608: URL: https://github.com/apache/incubator-mxnet/pull/18608#issuecomment-660781384 Jenkins CI successfully triggered : [unix-gpu]
[incubator-mxnet-site] branch asf-site updated: Bump the publish timestamp.
This is an automated email from the ASF dual-hosted git repository. aaronmarkham pushed a commit to branch asf-site in repository https://gitbox.apache.org/repos/asf/incubator-mxnet-site.git The following commit(s) were added to refs/heads/asf-site by this push: new 540feca Bump the publish timestamp. 540feca is described below commit 540fecac8d3eb31705273293b4e7f61fbee52716 Author: mxnet-ci AuthorDate: Mon Jul 20 00:42:11 2020 + Bump the publish timestamp. --- date.txt | 1 + 1 file changed, 1 insertion(+) diff --git a/date.txt b/date.txt new file mode 100644 index 000..dc47863 --- /dev/null +++ b/date.txt @@ -0,0 +1 @@ +Mon Jul 20 00:42:11 UTC 2020
[GitHub] [incubator-mxnet] DickJC123 edited a comment on pull request #18694: Unittest tolerance handling improvements
DickJC123 edited a comment on pull request #18694: URL: https://github.com/apache/incubator-mxnet/pull/18694#issuecomment-660720326 Thanks @szha! You probably saw that I've been struggling to get a passing CI - running into and fixing many issues unrelated to my PR along the way.
[GitHub] [incubator-mxnet] szha commented on issue #18755: test_gluon_probability_v2.py::test_gluon_kl and test_gluon_probability_v1.py::test_gluon_kl_v1 are flaky
szha commented on issue #18755: URL: https://github.com/apache/incubator-mxnet/issues/18755#issuecomment-660710856 fixed by #18694
[GitHub] [incubator-mxnet] szha closed issue #18737: test_operator_gpu.py::test_batchnorm_with_type inadvertently retests cases, misses others
szha closed issue #18737: URL: https://github.com/apache/incubator-mxnet/issues/18737
[GitHub] [incubator-mxnet] szha commented on issue #18731: test_numpy_default_dtype::test_default_float_dtype is not flagging true_divide op issues
szha commented on issue #18731: URL: https://github.com/apache/incubator-mxnet/issues/18731#issuecomment-660710766 fixed by #18694
[GitHub] [incubator-mxnet] szha closed issue #18736: unittest/test_numpy_interoperability.py inadvertently puts additional unittests on a fixed seed
szha closed issue #18736: URL: https://github.com/apache/incubator-mxnet/issues/18736
[GitHub] [incubator-mxnet] szha commented on issue #18747: unittests using @retry decorator can segfault if they fail
szha commented on issue #18747: URL: https://github.com/apache/incubator-mxnet/issues/18747#issuecomment-660710842 fixed by #18694
[GitHub] [incubator-mxnet] szha closed issue #18731: test_numpy_default_dtype::test_default_float_dtype is not flagging true_divide op issues
szha closed issue #18731: URL: https://github.com/apache/incubator-mxnet/issues/18731
[GitHub] [incubator-mxnet] szha commented on issue #18737: test_operator_gpu.py::test_batchnorm_with_type inadvertently retests cases, misses others
szha commented on issue #18737: URL: https://github.com/apache/incubator-mxnet/issues/18737#issuecomment-660710816 fixed by #18694
[GitHub] [incubator-mxnet] szha commented on issue #18736: unittest/test_numpy_interoperability.py inadvertently puts additional unittests on a fixed seed
szha commented on issue #18736: URL: https://github.com/apache/incubator-mxnet/issues/18736#issuecomment-660710800 fixed by #18694
[GitHub] [incubator-mxnet] szha closed issue #18755: test_gluon_probability_v2.py::test_gluon_kl and test_gluon_probability_v1.py::test_gluon_kl_v1 are flaky
szha closed issue #18755: URL: https://github.com/apache/incubator-mxnet/issues/18755
[GitHub] [incubator-mxnet] szha closed issue #18747: unittests using @retry decorator can segfault if they fail
szha closed issue #18747: URL: https://github.com/apache/incubator-mxnet/issues/18747
[GitHub] [incubator-mxnet] szha commented on pull request #18694: Unittest tolerance handling improvements
szha commented on pull request #18694: URL: https://github.com/apache/incubator-mxnet/pull/18694#issuecomment-660710673 Thanks for the fixes, @DickJC123. They are really helpful. I did a code review and examined the analysis included in each of them as well as the specific fixes.
[incubator-mxnet] branch master updated: Unittest tolerance handling improvements (#18694)
This is an automated email from the ASF dual-hosted git repository. zhasheng pushed a commit to branch master in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git The following commit(s) were added to refs/heads/master by this push: new 146b49e Unittest tolerance handling improvements (#18694) 146b49e is described below

commit 146b49ead32b941f74db694f2d453cb25650d252
Author: Dick Carter
AuthorDate: Sun Jul 19 14:12:50 2020 -0700

Unittest tolerance handling improvements (#18694)

* Add sm arch 80 to Makefile
* Add TF32 to cuBLAS GEMMs Signed-off-by: Serge Panev
* Add CUDA version guards Signed-off-by: Serge Panev
* Remove useless TF32 for double and old CUDA version Signed-off-by: Serge Panev
* Factorize VERSION_ADJUSTED_TF32_MATH Signed-off-by: Serge Panev
* Add TF32 considerations to test_util.py:check_consistency()
* Bypass test_gluon_gpu.py:test_large_models if gmem >32GB
* Default tols in assert_almost_equal() now a function of dtype and ctx
* Expand types listed by default_tols()
* Fix pylint
* All with_seed() tests to waitall in teardown
* Elevate MXNET_TEST_SEED logging to WARNING
* Revert test_gluon_gpu.py:test_rnn_layer to default tols
* Fix test_gluon_model_zoo_gpu.py::test_inference and test_operator_gpy.py::test_np_linalg_{solve,tensorinv}
* test_numpy_interoperability.py to not fix seed for rest of CI
* Further fix to test_np_linalg_tensorinv
* Fix test_gluon_data.py:test_dataloader_context when run on 1-GPU system.
* Fix test_operator_gpu.py::test_embedding_with_type
* Fix test_operator_gpu.py::{test_*convolution_large_c,test_np_linalg_tensorsolve}
* Remove unneeded print() from test_numpy_interoperability.py
* Unify tol handling of check_consistency() and assert_almost_equal(). Test tweaks.
* Add tol handling of assert_almost_equal() with number args
* Add tol handling of bool comparisons
* Fix test_numpy_op.py::test_np_random_rayleigh
* Fix test_operator_gpu.py::test_batchnorm_with_type
* Fix test_gluon.py::test_sync_batchnorm in cpu selftest
* Improve unittest failure reporting
* Add to robustness of test_operator_gpu.py::test_embedding_with_type
* Check_consistency() to use equal backward gradients for increased test robustness
* Fix test_operator_gpu.py::test_{fully_connected,gemm}. Add default_numeric_eps().
* test_utils.py fix for numeric gradient calc
* Reinstate rtol=1e-2 for test_operator.py::test_order
* Remove auto-cast of check_consistency() input data to least precise dtype (not needed)
* Fix test_operator.py::test_{reciprocol,cbrt,rcbrt}_op
* Expand default float64 numeric_eps for test_operator_gpu.py::test_sofmin
* Fix segfault-on-error of @retry decorator. Add test isolation.
* assert_almost_equal() to handle a,b scalars
* Fix test_operator_gpu.py::test_gluon_{mvn,mvn_v1} race
* Fix test_operator_gpu.py::test_flatten_slice_after_conv via scale
* Remove test_utils.py:almost_equal_ignore_nan()
* Fix sample vs. pop variance issue with test_numpy_op.py::test_npx_batch_norm
* Expose test_utils.py:effective_dtype() and use to fix test_operator_gpu.py::test_np_linalg_svd
* Fix true_divide int_array / int_scalar -> float_array to honor np_default_dtype
* Try test_elemwise_binary_ops serial to avoid pytest worker crash
* Fix (log_)softmax backward on empty ndarray
* Temporarily log all CI seeds to troubleshoot seed non-determinism
* Revert "Temporarily log all CI seeds to troubleshoot seed non-determinism" This reverts commit f60eff20785b812ac4fcd70d51359ee0cbfb3e47.
* Temp log all CI seeds to troubleshoot unwanted seed determinism
* Revert "Add sm arch 80 to Makefile" This reverts commit f9306cecc53b0633ef5f5b7b000802fbf0d73fe9.
* Same fix of sample vs. pop variance issue, now with test_operator_gpu.py::test_batchnorm
* Revert "Temp log all CI seeds to troubleshoot unwanted seed determinism" This reverts commit ff328efb0be3445690669d5437a6af575ff12b49.
* Marking test_sparse_dot_grad with garbage_expected after teardown error
* Fix flakiness of test_gluon_probability{_v1,_v2}.py::test_gluon_kl{_v1,}
* Temp skip of test_aggregate_duplication on gpu
* Add seeding to test_{numpy,}_contrib_gluon_data_vision.py. Make created files unique.
* Add ndarray module isolation to help debug test_bbox_augmenters worker crash
* Marking test_sparse_square_sum serial after pytest worker crash
* Fix flakiness of test_gluon_probability{_v1,_v2}.py::test_half_cauchy{_v1,}
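The "default tols as a function of dtype" idea in the commit message above can be sketched roughly as follows. This is a hypothetical `default_tols`/`almost_equal` pair for illustration, not MXNet's actual test_utils.py implementation (which also factors in the context):

```python
import numpy as np

# Hypothetical sketch: lower-precision dtypes get looser default
# (rtol, atol) pairs, so tests need not hand-tune tolerances per dtype.
def default_tols(dtype):
    tols = {
        np.dtype(np.float16): (1e-2, 1e-4),
        np.dtype(np.float32): (1e-4, 1e-6),
        np.dtype(np.float64): (1e-7, 1e-9),
    }
    return tols.get(np.dtype(dtype), (1e-4, 1e-6))

def almost_equal(a, b):
    # Compare using the looser of the two operands' default tolerances.
    rtol, atol = max(default_tols(a.dtype), default_tols(b.dtype))
    return np.allclose(a, b, rtol=rtol, atol=atol)
```

A float16 result compared against a float64 reference would then automatically be judged at float16-appropriate tolerances.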
[GitHub] [incubator-mxnet] szha merged pull request #18694: Unittest tolerance handling improvements
szha merged pull request #18694: URL: https://github.com/apache/incubator-mxnet/pull/18694
[GitHub] [incubator-mxnet] D-Roberts commented on pull request #18757: Add qr backward for wide inputs with nrows < ncols
D-Roberts commented on pull request #18757: URL: https://github.com/apache/incubator-mxnet/pull/18757#issuecomment-660704858 @mxnet-bot run ci [edge, clang]
[GitHub] [incubator-mxnet] mxnet-bot commented on pull request #18757: Add qr backward for wide inputs with nrows < ncols
mxnet-bot commented on pull request #18757: URL: https://github.com/apache/incubator-mxnet/pull/18757#issuecomment-660704876 Jenkins CI successfully triggered : [edge, clang]
[GitHub] [incubator-mxnet] mjdenkowski commented on issue #18699: Simplified HybridBlock.forward commit made Sockeye 4% slower
mjdenkowski commented on issue #18699: URL: https://github.com/apache/incubator-mxnet/issues/18699#issuecomment-660692437 This benchmark was run using a trained [Sockeye](https://github.com/awslabs/sockeye) model. This is a full sequence-to-sequence model with HybridBlocks glued together by Python code. We're running inference (translating a test set with beam search), so there shouldn't be any backward operations. We can share a model with input data and a run script if that will help with debugging. Alternatively, do you have an idea of what good tests would be for determining frontend vs backend speed regressions or multi-API overhead? Are there build/runtime options for MXNet that we can try with our sample model? If we have a good idea of everything that changed in commit 83b5170, we can work through it by process of elimination.
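One generic way to separate per-call (frontend) overhead from backend compute, sketched here with NumPy standing in for any framework: time a trivially small op, where Python dispatch overhead dominates, against a large op, where compute dominates. Comparing the small-op numbers across two commits isolates frontend regressions; the helper name and op sizes here are illustrative assumptions, not an established MXNet benchmark.

```python
import time
import numpy as np

def per_call_seconds(fn, repeat=1000):
    # Average wall-clock time of one call of fn().
    start = time.perf_counter()
    for _ in range(repeat):
        fn()
    return (time.perf_counter() - start) / repeat

small = np.ones((2, 2), dtype=np.float32)
large = np.ones((512, 512), dtype=np.float32)

# Tiny input: measured time is mostly per-call (frontend/dispatch) overhead.
overhead = per_call_seconds(lambda: small @ small)
# Large input: measured time is mostly backend compute.
compute = per_call_seconds(lambda: large @ large, repeat=50)
```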
[GitHub] [incubator-mxnet] ys2843 edited a comment on pull request #18738: set website default version - test redirect
ys2843 edited a comment on pull request #18738: URL: https://github.com/apache/incubator-mxnet/pull/18738#issuecomment-660381271

> Not sure, but this didn't work: http://ec2-34-219-134-42.us-west-2.compute.amazonaws.com/community/contribute.html
> Shouldn't the dropdown read 1.6 if the redirect worked?
>
> You can use the beta site for testing redirects. It's perfect for this kind of testing.
> https://github.com/apache/incubator-mxnet-site/tree/beta-site

Let me clarify what this PR does. The redirect test in this PR goes from http://ec2-34-219-134-42.us-west-2.compute.amazonaws.com/versions/1.1.0/community/contribute.html to http://ec2-34-219-134-42.us-west-2.compute.amazonaws.com/versions/1.6/community/contribute.html. The redirect only takes effect when a user visits `/versions/1.1.0/community/contribute.html` from outside the mxnet website, e.g. from a Google search result or by typing the URL into the browser directly; we detect this by checking whether the HTTP `referer` request header is "apache.mxnet.org". This is the behavior I would like to verify on the mxnet website server; it works well on the preview host. To prevent redirect loops, if a user goes `/versions/1.1.0` => top nav bar => community => contribute, they also reach the contribution page, but the redirect won't be triggered because they are coming from within the mxnet website. It looks like this beta git repo is too large to clone...
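The referer-based rule described above can be sketched as a small decision helper. This is a hypothetical illustration of the logic, not the actual site server configuration; the function name and path constants are assumptions:

```python
from urllib.parse import urlparse

OLD_PATH = "/versions/1.1.0/community/contribute.html"
NEW_PATH = "/versions/1.6/community/contribute.html"

def redirect_target(path, referer):
    # Redirect only external visitors (search results, direct URL entry).
    # Requests whose Referer host is the mxnet site itself are left alone,
    # which prevents redirect loops when navigating via the top nav bar.
    if path != OLD_PATH:
        return None
    host = urlparse(referer).netloc if referer else ""
    if host.endswith("apache.mxnet.org"):
        return None  # internal navigation: no redirect
    return NEW_PATH  # external entry: redirect to the 1.6 page
```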
[incubator-mxnet-site] branch asf-site updated: Publish triggered by CI
This is an automated email from the ASF dual-hosted git repository.

aaronmarkham pushed a commit to branch asf-site in repository https://gitbox.apache.org/repos/asf/incubator-mxnet-site.git

The following commit(s) were added to refs/heads/asf-site by this push:
     new 5563c5b  Publish triggered by CI

commit 5563c5b9366ffb6402d9ffae829ae2b9eb4c2e19
Author: mxnet-ci
AuthorDate: Sun Jul 19 18:42:28 2020 +0000

    Publish triggered by CI
---
 api/python/docs/_modules/mxnet/util.html | 82
 date.txt                                 |  1 -
 feed.xml                                 |  2 +-
 3 files changed, 42 insertions(+), 43 deletions(-)

The diff to api/python/docs/_modules/mxnet/util.html (index b0c301a..42e8ea84) removes the `[docs]` source-link anchors from the rendered definitions of `set_np_shape`, `is_np_shape`, `np_shape`, `use_np_shape`, `set_module`, `np_array`, and `use_np_array`, touching the corresponding `return` lines as well; the `is_np_array` hunk keeps its `[docs]` anchor unchanged. The docstrings themselves are untouched: they describe the NumPy shape semantics (`()` is the shape of a scalar tensor, and shapes containing `0`, for example `(0,)` or `(1, 0, 2)`, denote the shapes of zero-size tensors; this is off by default for backward compatibility) and the `np_shape`/`np_array` scopes and decorators.
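The docstrings quoted in the diff above describe MXNet's NumPy shape semantics: `()` is the shape of a scalar tensor, and a `0` in any dimension makes a tensor zero-size. The same convention can be illustrated with plain NumPy (MXNet itself is not needed for this sketch):

```python
import numpy as np

# A 0-d array: its shape is the empty tuple (), the "scalar tensor" case.
scalar = np.array(3.14)
assert scalar.shape == ()
assert scalar.ndim == 0

# Any shape containing 0, e.g. (1, 0, 2), denotes a zero-size tensor.
empty = np.zeros((1, 0, 2))
assert empty.shape == (1, 0, 2)
assert empty.size == 0
```

When `np_shape` semantics are off, legacy MXNet instead uses shapes with unknown dimensions for these cases, which is why the scope/decorator machinery in `mxnet/util.py` exists.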
[incubator-mxnet-site] branch asf-site updated: Bump the publish timestamp.
aaronmarkham pushed a commit to branch asf-site in repository https://gitbox.apache.org/repos/asf/incubator-mxnet-site.git

The following commit(s) were added to refs/heads/asf-site by this push:
     new 8d000be  Bump the publish timestamp.

commit 8d000beccdacee575b37bda8de41fa8cb83ad803
Author: mxnet-ci
AuthorDate: Sun Jul 19 18:42:34 2020 +0000

    Bump the publish timestamp.
---
 date.txt | 1 +
 1 file changed, 1 insertion(+)

diff --git a/date.txt b/date.txt
new file mode 100644
index 000..47de1df
--- /dev/null
+++ b/date.txt
@@ -0,0 +1 @@
+Sun Jul 19 18:42:34 UTC 2020
[GitHub] [incubator-mxnet] D-Roberts opened a new pull request #18757: Add qr backward for wide inputs with nrows < ncols
D-Roberts opened a new pull request #18757:
URL: https://github.com/apache/incubator-mxnet/pull/18757

As titled. This is a resubmit of [#18197](https://github.com/apache/incubator-mxnet/pull/18197). In addition, the tests were re-verified for robustness:

```
MXNET_TEST_COUNT=1 pytest -v ~/workspace/incubator-mxnet/tests/python/unittest/test_numpy_op.py::test_np_linalg_qr
incubator-mxnet/tests/python/unittest/test_numpy_op.py::test_np_linalg_qr PASSED [100%]
```

For a given input, the obtained gradient has the same values as with TensorFlow. The implemented method is similar to the one in [tf](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/ops/linalg_grad.py). Here are cross-checked examples:

```
import mxnet as mx
import mxnet.numpy as np
import numpy as _np

_np.random.seed(42)
data_np = _np.random.uniform(-1, 1, (3, 5)).astype(_np.float32)
data = np.array(data_np, dtype='float32')
data.attach_grad()
with mx.autograd.record():
    ret = np.linalg.qr(data)
mx.autograd.backward(ret)
print(data.grad)

[[ 0.7569422   0.5140486  -0.48962986 -0.48962957 -0.48962957]
 [-2.414882   -1.2380642  -1.6602778  -1.6602782  -1.6602782 ]
 [ 0.2763981  -0.5044659   0.06115001  0.06115055  0.06115055]]

import tensorflow as tf
import numpy as _np

_np.random.seed(42)
data_np = _np.random.uniform(-1, 1, (3, 5)).astype(_np.float32)
data = tf.convert_to_tensor(data_np)
with tf.GradientTape() as g:
    g.watch(data)
    ret = tf.linalg.qr(data)
print(g.gradient(ret, data))

tf.Tensor(
[[ 0.75694233  0.51404876 -0.48962957 -0.48962957 -0.48962957]
 [-2.414882   -1.2380638  -1.6602784  -1.6602781  -1.6602781 ]
 [ 0.276398   -0.50446594  0.0611502   0.06115052  0.06115052]], shape=(3, 5), dtype=float32)
```

At a high level, the methodology is: partition/split the input A into two matrices X and Y, and split the matrix R (from the A = QR decomposition) into two matrices U and V. Then X = QU, and X_grad is obtained by applying the gradient derivation from the square-input case (m = n) with an adjusted Q_grad. Y_grad is obtained separately; A_grad is then the concatenation of X_grad and Y_grad.

### Changes ###
- [ ] qr backward wide input
- [ ] tests

This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. For queries about this service, please contact Infrastructure at: us...@infra.apache.org
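The partition step in the PR description can be sanity-checked numerically. The sketch below verifies only the structural identity the method rests on (names X, Y, U, V follow the PR description; the gradient formula itself is omitted): for a wide A with m < n, split A = [X | Y] at column m and R = [U | V] accordingly, and X = QU holds, so the square-case backward derivation applies to the X block.

```python
import numpy as np

rng = np.random.default_rng(42)
m, n = 3, 5                      # wide input: nrows < ncols
A = rng.uniform(-1, 1, (m, n))

Q, R = np.linalg.qr(A)           # reduced mode: Q is (m, m), R is (m, n)

# Partition A into [X | Y] and R into [U | V] at column m.
X, Y = A[:, :m], A[:, m:]
U, V = R[:, :m], R[:, m:]

# X = Q U: the square-case backward rule can be applied to the X block,
# while Y = Q V gives the contribution of the trailing columns to Y_grad.
assert np.allclose(X, Q @ U)
assert np.allclose(Y, Q @ V)
```

Since A = QR column-by-column, the leading m columns of A are exactly Q times the leading m columns of R, which is why the split is taken at column m.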
[GitHub] [incubator-mxnet] mxnet-bot commented on pull request #18757: Add qr backward for wide inputs with nrows < ncols
mxnet-bot commented on pull request #18757:
URL: https://github.com/apache/incubator-mxnet/pull/18757#issuecomment-660656983

Hey @D-Roberts, thanks for submitting the PR. All tests are already queued to run once. If tests fail, you can trigger one or more tests again with the following commands:
- To trigger all jobs: @mxnet-bot run ci [all]
- To trigger specific jobs: @mxnet-bot run ci [job1, job2]

**CI supported jobs**: [website, miscellaneous, centos-gpu, windows-gpu, unix-cpu, unix-gpu, windows-cpu, edge, sanity, clang, centos-cpu]

_Note_: Only the following 3 roles can trigger CI: PR Author, MXNet Committer, Jenkins Admin. All CI tests must pass before the PR can be merged.
[incubator-mxnet-site] branch asf-site updated: Publish triggered by CI
aaronmarkham pushed a commit to branch asf-site in repository https://gitbox.apache.org/repos/asf/incubator-mxnet-site.git

The following commit(s) were added to refs/heads/asf-site by this push:
     new 0a5d928  Publish triggered by CI

commit 0a5d92884698add65220717ad632d55315a738a3
Author: mxnet-ci
AuthorDate: Sun Jul 19 12:42:41 2020 +0000

    Publish triggered by CI
---
 api/python/docs/_modules/mxnet/util.html | 82
 date.txt                                 |  1 -
 feed.xml                                 |  2 +-
 3 files changed, 42 insertions(+), 43 deletions(-)

The diff to api/python/docs/_modules/mxnet/util.html (index 42e8ea84..b0c301a) is the reverse of the earlier publish commit: it adds the `[docs]` source-link anchors back to the rendered definitions of `set_np_shape`, `is_np_shape`, `np_shape`, `use_np_shape`, `set_module`, `np_array`, and `use_np_array`; the `is_np_array` hunk keeps its `[docs]` anchor unchanged. The docstrings (NumPy shape semantics, the `np_shape`/`np_array` scopes and decorators) are untouched.
[incubator-mxnet-site] branch asf-site updated: Bump the publish timestamp.
aaronmarkham pushed a commit to branch asf-site in repository https://gitbox.apache.org/repos/asf/incubator-mxnet-site.git

The following commit(s) were added to refs/heads/asf-site by this push:
     new db769b1  Bump the publish timestamp.

commit db769b128f50f7712c18929b48908f1e6d2249ec
Author: mxnet-ci
AuthorDate: Sun Jul 19 12:42:47 2020 +0000

    Bump the publish timestamp.
---
 date.txt | 1 +
 1 file changed, 1 insertion(+)

diff --git a/date.txt b/date.txt
new file mode 100644
index 000..7b6f26b
--- /dev/null
+++ b/date.txt
@@ -0,0 +1 @@
+Sun Jul 19 12:42:47 UTC 2020
[GitHub] [incubator-mxnet] ChaiBapchya commented on pull request #18608: Cherry-pick #18310 #18355
ChaiBapchya commented on pull request #18608:
URL: https://github.com/apache/incubator-mxnet/pull/18608#issuecomment-660611884

@MoisesHer sorry about hitting that flaky test. Please retrigger the unix-cpu pipeline. Hopefully that should be the last retrigger for this PR.

> We might consider to include this patch in 1.7.0 release if there's rc2, otherwise, this patch will be in v1.7.x branch and the binary release

@ciyongch sounds good to me.
[GitHub] [incubator-mxnet] ChaiBapchya commented on issue #16848: Flaky test_np_mixed_precision_binary_funcs
ChaiBapchya commented on issue #16848:
URL: https://github.com/apache/incubator-mxnet/issues/16848#issuecomment-660611712

http://jenkins.mxnet-ci.amazon-ml.com/blue/organizations/jenkins/mxnet-validation%2Funix-cpu/detail/PR-18608/3/pipeline

Unrelated PR: https://github.com/apache/incubator-mxnet/pull/18608
[GitHub] [incubator-mxnet] wkcn commented on issue #18751: gluon.nn.BatchNorm seems to swap updated values of moving_mean and moving_var on GPU.
wkcn commented on issue #18751:
URL: https://github.com/apache/incubator-mxnet/issues/18751#issuecomment-660600872

The values of moving_mean and moving_var are not consistent between CPU and GPU. The value on CPU is the population variance (v / n), but the one on GPU (cuDNN) is the sample variance (v / (n - 1)).

Refer: https://github.com/apache/incubator-mxnet/pull/18694/files#diff-cb652780258e73a9cd08568f38929aa2R1554

Line 1554 in tests/python/unittest/test_operator.py:
```# cudnn uses m-1 in the denominator of its sample variance calculation, not m```
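The discrepancy described above can be reproduced without MXNet at all. A minimal NumPy sketch contrasting the two denominators (`n` for the population variance on the CPU path, `n - 1` for the sample variance on the cuDNN path):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])
n = x.size

# Population variance: divide the sum of squared deviations by n (CPU path).
pop_var = np.var(x)            # ddof=0 (default) -> v / n
# Sample variance: divide by n - 1 (cuDNN path).
samp_var = np.var(x, ddof=1)   # -> v / (n - 1)

print(pop_var, samp_var)       # 1.25 vs ~1.6667: same data, different moving_var
```

The two differ by a factor of n / (n - 1), so the gap shrinks as the per-batch reduction size grows, which is why the inconsistency is easy to miss on large batches.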
[incubator-mxnet-site] branch asf-site updated: Publish triggered by CI
aaronmarkham pushed a commit to branch asf-site in repository https://gitbox.apache.org/repos/asf/incubator-mxnet-site.git

The following commit(s) were added to refs/heads/asf-site by this push:
     new 994d956  Publish triggered by CI

commit 994d956f0f4afb5e8cbbbafda2c8737cc44cb591
Author: mxnet-ci
AuthorDate: Sun Jul 19 06:43:08 2020 +0000

    Publish triggered by CI
---
 api/python/docs/_modules/mxnet/util.html | 82
 date.txt                                 |  1 -
 feed.xml                                 |  2 +-
 3 files changed, 42 insertions(+), 43 deletions(-)

The diff to api/python/docs/_modules/mxnet/util.html (index b0c301a..42e8ea84) is the same toggle as the first publish commit: it removes the `[docs]` source-link anchors from the rendered definitions of `set_np_shape`, `is_np_shape`, `np_shape`, `use_np_shape`, `set_module`, `np_array`, and `use_np_array`; the `is_np_array` hunk keeps its `[docs]` anchor unchanged. The docstrings (NumPy shape semantics, the `np_shape`/`np_array` scopes and decorators) are untouched.
[incubator-mxnet-site] branch asf-site updated: Bump the publish timestamp.
aaronmarkham pushed a commit to branch asf-site in repository https://gitbox.apache.org/repos/asf/incubator-mxnet-site.git

The following commit(s) were added to refs/heads/asf-site by this push:
     new 53d20fc  Bump the publish timestamp.

commit 53d20fc1532a8883f673dcba2cd34adc87526c6e
Author: mxnet-ci
AuthorDate: Sun Jul 19 06:43:13 2020 +0000

    Bump the publish timestamp.
---
 date.txt | 1 +
 1 file changed, 1 insertion(+)

diff --git a/date.txt b/date.txt
new file mode 100644
index 000..e73c71b
--- /dev/null
+++ b/date.txt
@@ -0,0 +1 @@
+Sun Jul 19 06:43:13 UTC 2020
[GitHub] [incubator-mxnet] ciyongch commented on pull request #18608: Cherry-pick #18310 #18355
ciyongch commented on pull request #18608:
URL: https://github.com/apache/incubator-mxnet/pull/18608#issuecomment-660595392

Hi @ChaiBapchya @MoisesHer, please check the failed job. BTW, as it's for the binary release, I think it's not a mandatory patch for the source release (the current release candidate is rc1). We might consider including this patch in the 1.7.0 release if there's an rc2; otherwise, this patch will land in the v1.7.x branch and the binary release. What do you think?