haojin2 opened a new issue #17414: flaky test_deconvolution2d
URL: https://github.com/apache/incubator-mxnet/issues/17414
 
 
   
http://jenkins.mxnet-ci.amazon-ml.com/blue/organizations/jenkins/mxnet-validation%2Funix-gpu/detail/PR-17392/2/pipeline/
   ```
   ======================================================================
   FAIL: test_ops.test_deconvolution2d
   0: diff32(0.00E+00) | diff16(2.44E-04) | atol32(0.00E+00) | atol16(-1.30E-04) | orig.min(2.50E-01)
   0: diff32(0.00E+00) | diff16(4.57E-04) | atol32(0.00E+00) | atol16(-9.39E-06) | orig.min(1.17E-02)
   0: diff32(0.00E+00) | diff16(4.68E-04) | atol32(0.00E+00) | atol16(-1.01E-05) | orig.min(1.26E-02)
   0: diff32(0.00E+00) | diff16(4.85E-04) | atol32(0.00E+00) | atol16(-9.82E-06) | orig.min(1.24E-02)
   0: diff32(0.00E+00) | diff16(4.54E-04) | atol32(0.00E+00) | atol16(-8.10E-06) | orig.min(1.03E-02)
   0: diff32(0.00E+00) | diff16(4.41E-04) | atol32(0.00E+00) | atol16(-8.99E-06) | orig.min(1.13E-02)
   0: diff32(0.00E+00) | diff16(4.85E-04) | atol32(0.00E+00) | atol16(-6.78E-06) | orig.min(1.01E-02)
   0: diff32(0.00E+00) | diff16(4.67E-04) | atol32(0.00E+00) | atol16(-8.06E-06) | orig.min(1.01E-02)
   0: diff32(0.00E+00) | diff16(4.00E-04) | atol32(0.00E+00) | atol16(-9.63E-06) | orig.min(1.19E-02)
   0: diff32(0.00E+00) | diff16(4.78E-04) | atol32(0.00E+00) | atol16(-9.14E-06) | orig.min(1.12E-02)
   0: diff32(0.00E+00) | diff16(1.04E-03) | atol32(0.00E+00) | atol16(-3.58E-05) | orig.min(3.98E-02)
   0: diff32(0.00E+00) | diff16(9.42E-04) | atol32(0.00E+00) | atol16(-5.53E-05) | orig.min(7.25E-02)
   0: diff32(0.00E+00) | diff16(9.63E-04) | atol32(0.00E+00) | atol16(-2.19E-05) | orig.min(2.51E-02)
   0: diff32(0.00E+00) | diff16(6.88E-04) | atol32(0.00E+00) | atol16(5.38E-04) | orig.min(4.33E-04)
   0: diff32(0.00E+00) | diff16(6.06E-04) | atol32(0.00E+00) | atol16(4.57E-04) | orig.min(5.93E-04)
   0: diff32(0.00E+00) | diff16(6.66E-04) | atol32(0.00E+00) | atol16(5.88E-04) | orig.min(3.23E-04)
   0: diff32(0.00E+00) | diff16(7.59E-04) | atol32(0.00E+00) | atol16(-4.53E-06) | orig.min(1.02E-03)
   0: diff32(0.00E+00) | diff16(6.95E-04) | atol32(0.00E+00) | atol16(-1.54E-06) | orig.min(3.35E-04)
   0: diff32(0.00E+00) | diff16(5.83E-04) | atol32(0.00E+00) | atol16(-2.28E-06) | orig.min(4.62E-04)
   0: diff32(0.00E+00) | diff16(2.43E-04) | atol32(0.00E+00) | atol16(-1.21E-05) | orig.min(1.50E-02)
   0: diff32(0.00E+00) | diff16(2.43E-04) | atol32(0.00E+00) | atol16(-1.20E-05) | orig.min(1.33E-02)
   0: diff32(0.00E+00) | diff16(3.89E-04) | atol32(0.00E+00) | atol16(-2.43E-05) | orig.min(2.77E-02)
   0: diff32(0.00E+00) | diff16(4.64E-04) | atol32(0.00E+00) | atol16(-8.47E-06) | orig.min(1.11E-02)
   0: diff32(0.00E+00) | diff16(4.31E-04) | atol32(0.00E+00) | atol16(-1.07E-05) | orig.min(1.33E-02)
   0: diff32(0.00E+00) | diff16(4.41E-04) | atol32(0.00E+00) | atol16(-8.00E-06) | orig.min(1.05E-02)
   0: diff32(0.00E+00) | diff16(4.77E-04) | atol32(0.00E+00) | atol16(-6.45E-06) | orig.min(1.01E-02)
   0: diff32(0.00E+00) | diff16(4.86E-04) | atol32(0.00E+00) | atol16(-7.03E-06) | orig.min(1.02E-02)
   0: diff32(0.00E+00) | diff16(4.82E-04) | atol32(0.00E+00) | atol16(-6.37E-06) | orig.min(1.02E-02)
   0: diff32(0.00E+00) | diff16(4.70E-04) | atol32(0.00E+00) | atol16(-2.45E-05) | orig.min(2.71E-02)
   0: diff32(0.00E+00) | diff16(2.40E-04) | atol32(0.00E+00) | atol16(-1.42E-04) | orig.min(1.93E-01)
   0: diff32(0.00E+00) | diff16(1.96E-04) | atol32(0.00E+00) | atol16(-1.98E-05) | orig.min(2.41E-02)
   0: diff32(0.00E+00) | diff16(2.33E-04) | atol32(0.00E+00) | atol16(-2.53E-05) | orig.min(2.59E-02)
   0: diff32(0.00E+00) | diff16(2.43E-04) | atol32(0.00E+00) | atol16(-1.39E-05) | orig.min(2.04E-02)
   0: diff32(0.00E+00) | diff16(3.77E-04) | atol32(0.00E+00) | atol16(-2.54E-05) | orig.min(2.63E-02)
   0: diff32(0.00E+00) | diff16(3.67E-04) | atol32(0.00E+00) | atol16(-8.44E-06) | orig.min(1.16E-02)
   0: diff32(0.00E+00) | diff16(3.84E-04) | atol32(0.00E+00) | atol16(-9.13E-06) | orig.min(1.18E-02)
   0: diff32(0.00E+00) | diff16(4.79E-04) | atol32(0.00E+00) | atol16(-1.09E-05) | orig.min(1.51E-02)
   downloading sample input
   Downloading /home/jenkins_slave/.mxnet/models/resnet18_v2-a81db45f.zip from https://apache-mxnet.s3-accelerate.dualstack.amazonaws.com/gluon/models/resnet18_v2-a81db45f.zip...
   LeNet-5 test
   Running inference in MXNet
   Running inference in MXNet-TensorRT
   MXNet accuracy: 99.100000
   MXNet-TensorRT accuracy: 99.090000
   ----------------------------------------------------------------------
   Traceback (most recent call last):
     File "/usr/local/lib/python3.6/dist-packages/nose/case.py", line 198, in runTest
       self.test(*self.arg)
     File "/work/mxnet/tests/python/tensorrt/../unittest/common.py", line 215, in test_new
       orig_test(*args, **kwargs)
     File "/work/mxnet/tests/python/tensorrt/test_ops.py", line 210, in test_deconvolution2d
       rtol_fp16=rtol_fp16, atol_fp16=atol_fp16)
     File "/work/mxnet/tests/python/tensorrt/test_ops.py", line 107, in check_single_sym
       assert_allclose(fp32, orig, rtol=rtol_fp32, atol=atol_fp32)
     File "/usr/local/lib/python3.6/dist-packages/numpy/testing/_private/utils.py", line 1533, in assert_allclose
       verbose=verbose, header=header, equal_nan=equal_nan)
     File "/usr/local/lib/python3.6/dist-packages/numpy/testing/_private/utils.py", line 846, in assert_array_compare
       raise AssertionError(msg)
   AssertionError: 
   Not equal to tolerance rtol=1e-06, atol=0
   
   Mismatched elements: 2 / 16128 (0.0124%)
   Max absolute difference: 1.4305115e-06
   Max relative difference: 1.2375308e-06
    x: array([[[[0.69503 , 0.950935, 0.999062, ..., 0.769287, 0.784347,
             0.537922],
            [1.179859, 2.278583, 1.939123, ..., 1.17782 , 1.370589,...
    y: array([[[[0.69503 , 0.950935, 0.999061, ..., 0.769287, 0.784347,
             0.537922],
            [1.179859, 2.278583, 1.939123, ..., 1.17782 , 1.370589,...
   -------------------- >> begin captured logging << --------------------
   common: INFO: Setting test np/mx/python random seeds, use MXNET_TEST_SEED=552091553 to reproduce.
   --------------------- >> end captured logging << ---------------------
   ```
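
For context on the flakiness: the failing assertion is the FP32 comparison in `check_single_sym` (`assert_allclose(fp32, orig, rtol=rtol_fp32, atol=atol_fp32)`), run with `rtol=1e-06` and `atol=0`. The reported max relative difference of 1.2375308e-06 is only marginally above that rtol, and just 2 of 16128 elements miss the tolerance, which looks like a borderline tolerance rather than a real numerical bug. The snippet below is a minimal sketch (not part of the MXNet test suite) that applies numpy's documented closeness criterion to the numbers taken from the log; `x` and `y` are the two visibly mismatched elements in the truncated array printout, and the looser 2e-06 tolerance at the end is purely illustrative.

```python
# Minimal sketch, not the MXNet test itself: numpy.testing.assert_allclose accepts
# an element only if |actual - desired| is within atol + rtol * |desired|, so with
# atol=0 the check reduces to a pure relative-difference test against rtol.
import numpy as np

rtol, atol = 1e-06, 0.0           # tolerances used for the failing fp32 comparison
max_rel_diff = 1.2375308e-06      # "Max relative difference" reported in the log

print(max_rel_diff <= rtol)       # False: 1.24e-06 narrowly exceeds 1e-06

# The two mismatched values visible in the truncated printout above:
x = np.float32(0.999062)          # TensorRT fp32 output element
y = np.float32(0.999061)          # reference MXNet output element
np.testing.assert_allclose(x, y, rtol=2e-06, atol=0)   # illustrative only: passes
```

Per the captured logging above, rerunning the TensorRT test module with `MXNET_TEST_SEED=552091553` exported in the environment should reproduce this particular draw of random inputs.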
