leezu commented on issue #15856:
URL: https://github.com/apache/incubator-mxnet/issues/15856#issuecomment-622617937


   ```
   [2020-05-01T23:59:35.381Z] ________________ test_parallel_random_seed_setting_for_context _________________
   [2020-05-01T23:59:35.381Z] 
   [2020-05-01T23:59:35.381Z]     @with_seed()
   [2020-05-01T23:59:35.381Z]     def test_parallel_random_seed_setting_for_context():
   [2020-05-01T23:59:35.381Z]         seed_to_test = 1234
   [2020-05-01T23:59:35.381Z]         dev_type = mx.context.current_context().device_type
   [2020-05-01T23:59:35.381Z]         for dtype in ['float16', 'float32', 'float64']:
   [2020-05-01T23:59:35.381Z]             samples_imp = []
   [2020-05-01T23:59:35.381Z]             samples_sym = []
   [2020-05-01T23:59:35.381Z]             # Collect random number samples from the generators of all devices, each seeded with the same number.
   [2020-05-01T23:59:35.381Z]             for dev_id in range(0, mx.context.num_gpus() if dev_type == 'gpu' else 1):
   [2020-05-01T23:59:35.381Z]                 with mx.Context(dev_type, dev_id):
   [2020-05-01T23:59:35.381Z]                     ctx = mx.context.current_context()
   [2020-05-01T23:59:35.381Z]                     # Avoid excessive test cpu runtimes.
   [2020-05-01T23:59:35.381Z]                     num_temp_seeds = 25 if dev_type == 'gpu' else 1
   [2020-05-01T23:59:35.381Z]                     # To flush out a possible race condition, run multiple times.
   [2020-05-01T23:59:35.381Z]                     for _ in range(20):
   [2020-05-01T23:59:35.381Z]                         # Create enough samples such that we get a meaningful distribution.
   [2020-05-01T23:59:35.381Z]                         shape = (200, 200)
   [2020-05-01T23:59:35.381Z]                         params = { 'low': -1.5, 'high': 3.0 }
   [2020-05-01T23:59:35.381Z]                         params.update(shape=shape, dtype=dtype)
   [2020-05-01T23:59:35.381Z]     
   [2020-05-01T23:59:35.381Z]                         # Check imperative. `uniform` uses parallel rng.
   [2020-05-01T23:59:35.381Z]                         seed = set_seed_variously_for_context(ctx, 1, num_temp_seeds, seed_to_test)
   [2020-05-01T23:59:35.381Z]                         rnds = mx.nd.random.uniform(**params)
   [2020-05-01T23:59:35.381Z]                         samples_imp.append(rnds.asnumpy())
   [2020-05-01T23:59:35.381Z]     
   [2020-05-01T23:59:35.381Z]                         # Check symbolic. `uniform` uses parallel rng.
   [2020-05-01T23:59:35.381Z]                         X = mx.sym.Variable("X")
   [2020-05-01T23:59:35.381Z]                         Y = mx.sym.random.uniform(**params) + X
   [2020-05-01T23:59:35.381Z]                         x = mx.nd.zeros(shape, dtype=dtype)
   [2020-05-01T23:59:35.381Z]                         xgrad = mx.nd.zeros(shape, dtype=dtype)
   [2020-05-01T23:59:35.381Z]                         yexec = Y.bind(ctx, {'X' : x}, {'X': xgrad})
   [2020-05-01T23:59:35.381Z]                         set_seed_variously_for_context(ctx, seed, num_temp_seeds, seed_to_test)
   [2020-05-01T23:59:35.381Z]                         yexec.forward(is_train=True)
   [2020-05-01T23:59:35.381Z]                         yexec.backward(yexec.outputs[0])
   [2020-05-01T23:59:35.381Z]                         samples_sym.append(yexec.outputs[0].asnumpy())
   [2020-05-01T23:59:35.381Z]             # The samples should be identical across different gpu devices.
   [2020-05-01T23:59:35.381Z]             for i in range(1, len(samples_imp)):
   [2020-05-01T23:59:35.381Z]                 assert same(samples_imp[i - 1], samples_imp[i])
   [2020-05-01T23:59:35.381Z]             for i in range(1, len(samples_sym)):
   [2020-05-01T23:59:35.381Z] >               assert same(samples_sym[i - 1], samples_sym[i])
   [2020-05-01T23:59:35.381Z] E               assert False
   [2020-05-01T23:59:35.381Z] E                +  where False = same(array([[ 5.83024027e-01,  1.67935735e+00,  8.77294878e-01,\n         2.51641507e+00,  2.95223944e+00,  2.75676270e+00,\n...-1.33459247e-01,\n        -7.95632855e-01,  1.66822078e+00, -1.47758483e-01,\n         1.39249882e+00, -2.99302394e-01]]), array([[ 5.83024027e-001,  1.67935735e+000,  8.77294878e-001,\n         2.51641507e+000,  2.95223944e+000,  2.75676270e...59247e-001,\n        -7.95632855e-001,  1.66822078e+000, -1.47758483e-001,\n         1.39249882e+000, -2.99302394e-001]]))
   [2020-05-01T23:59:35.381Z] 
   [2020-05-01T23:59:35.381Z] tests/python/unittest/test_random.py:560: AssertionError
   ```
   
http://jenkins.mxnet-ci.amazon-ml.com/blue/rest/organizations/jenkins/pipelines/mxnet-validation/pipelines/unix-gpu/branches/master/runs/1928/nodes/349/steps/510/log/?start=0
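   For context, the invariant the failing test asserts can be sketched with the standard-library `random` module (a minimal stand-in for MXNet's per-device parallel RNG, not the actual implementation): independent generators seeded with the same value must produce identical sample streams.
   
   ```python
   import random
   
   def draw(seed, n=200 * 200, low=-1.5, high=3.0):
       # Each "device" gets its own generator, seeded identically;
       # n, low, high mirror the shape and params in the test above.
       rng = random.Random(seed)
       return [rng.uniform(low, high) for _ in range(n)]
   
   # Samples from two independently seeded generators must match exactly,
   # which is the cross-device identity the test checks via same().
   assert draw(1234) == draw(1234)
   # A different seed yields a different stream.
   assert draw(1234) != draw(5678)
   ```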

