ckt624 opened a new issue #15856: unix GPU CI fails randomly
URL: https://github.com/apache/incubator-mxnet/issues/15856

## Description

I submitted a PR of tensordot operators to the master branch, and it fails the unix GPU test `test_parallel_random_seed_setting_for_context` from time to time (http://jenkins.mxnet-ci.amazon-ml.com/blue/organizations/jenkins/mxnet-validation%2Funix-gpu/detail/PR-15820/4/pipeline). The failure appears to happen randomly.

Error messages:

```
======================================================================
FAIL: test_operator_gpu.test_parallel_random_seed_setting_for_context
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/nose/case.py", line 197, in runTest
    self.test(*self.arg)
  File "/usr/local/lib/python2.7/dist-packages/nose/util.py", line 620, in newfunc
    return func(*arg, **kw)
  File "/work/mxnet/tests/python/gpu/../unittest/common.py", line 177, in test_new
    orig_test(*args, **kwargs)
  File "/work/mxnet/tests/python/gpu/../unittest/test_random.py", line 549, in test_parallel_random_seed_setting_for_context
    assert same(samples_imp[i - 1], samples_imp[i])
AssertionError:
-------------------- >> begin captured logging << --------------------
common: INFO: Setting test np/mx/python random seeds, use MXNET_TEST_SEED=1666910558 to reproduce.
--------------------- >> end captured logging << ---------------------
```
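For context on what the failing assertion checks: the test draws samples after seeding the RNG repeatedly and asserts consecutive draws are identical. The following is a minimal NumPy sketch of that invariant, not the actual MXNet test (`same`, the seed value, and the sample shape are taken from the log; the RNG here is NumPy's, used purely for illustration):

```python
import numpy as np

def same(a, b):
    # Elementwise equality, mirroring the role of the `same` helper
    # in the failing assertion.
    return np.array_equal(a, b)

# Re-seeding with the same seed should reproduce the same draws; the
# flaky test asserts this property for samples drawn in parallel contexts.
seed = 1666910558  # seed reported by MXNET_TEST_SEED in the captured log
samples = []
for _ in range(2):
    rng = np.random.RandomState(seed)
    samples.append(rng.uniform(size=(4, 4)))

assert same(samples[0], samples[1])
```

To reproduce the original failure, the captured log suggests running the test with `MXNET_TEST_SEED=1666910558` set in the environment.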
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

With regards,
Apache Git Services