xidulu opened a new issue #18527:
URL: https://github.com/apache/incubator-mxnet/issues/18527
## Description

Several GPU tests in `tests/python/gpu/test_operator_gpu.py` fail on the unix-gpu CI pipeline for PR-18403: `test_preloaded_multi_sgd`, `test_ifft`, `test_fft` (worker crash), `test_pooling_nhwc_with_type`, `test_lstm_forget_bias`, and `test_take_with_type`. The failures fall into three groups: `_random_randint` shape-inference errors for scalar draws, FFI `TypeError: Don't know how to convert type` errors when `mxnet.numpy` functions receive `mxnet.numpy.ndarray` or `Symbol` arguments, and a legacy-operator `TypeError` from `_contrib_allclose` being fed an MXNet numpy ndarray.
```
=================================== FAILURES ===================================
[2020-06-08T10:57:34.306Z] ___________________________ test_preloaded_multi_sgd ___________________________
[2020-06-08T10:57:34.306Z] [gw2] linux -- Python 3.6.9 /usr/bin/python3
[2020-06-08T10:57:34.306Z]
[2020-06-08T10:57:34.306Z]     @with_seed()
[2020-06-08T10:57:34.306Z]     def test_preloaded_multi_sgd():
[2020-06-08T10:57:34.306Z]         dtypes = ['float16', 'float32']
[2020-06-08T10:57:34.306Z]         momentums = [None, 0.9]
[2020-06-08T10:57:34.306Z]         min_nparam = 5
[2020-06-08T10:57:34.306Z]         max_nparam = 10
[2020-06-08T10:57:34.306Z]         maxdim = 6
[2020-06-08T10:57:34.306Z]         maxndim = 4
[2020-06-08T10:57:34.306Z]         for dtype in dtypes:
[2020-06-08T10:57:34.306Z]             use_master_weights_list = [False,] if dtype == 'float32' else [True, False]
[2020-06-08T10:57:34.306Z]             for use_master_weights in use_master_weights_list:
[2020-06-08T10:57:34.306Z]                 for momentum in momentums:
[2020-06-08T10:57:34.306Z] >                   nparam = np.random.randint(min_nparam + 1, max_nparam + 1)
[2020-06-08T10:57:34.306Z]
[2020-06-08T10:57:34.306Z] tests/python/gpu/test_operator_gpu.py:451:
[2020-06-08T10:57:34.306Z] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[2020-06-08T10:57:34.306Z] python/mxnet/numpy/random.py:79: in randint
[2020-06-08T10:57:34.306Z]     return _mx_nd_np.random.randint(low, high, size, dtype, ctx, out)
[2020-06-08T10:57:34.306Z] python/mxnet/ndarray/numpy/random.py:91: in randint
[2020-06-08T10:57:34.306Z]     return _npi.random_randint(low, high, shape=size, dtype=dtype, ctx=ctx, out=out)
[2020-06-08T10:57:34.306Z] <string>:58: in random_randint
[2020-06-08T10:57:34.306Z]     ???
[2020-06-08T10:57:34.306Z] mxnet/cython/ndarray.pyx:219: in mxnet._cy3.ndarray._imperative_invoke
[2020-06-08T10:57:34.306Z]     ???
[2020-06-08T10:57:34.306Z] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[2020-06-08T10:57:34.306Z]
[2020-06-08T10:57:34.306Z] > ???
[2020-06-08T10:57:34.306Z] E mxnet.base.MXNetError: Traceback (most recent call last):
[2020-06-08T10:57:34.306Z] E [bt] (9) /usr/bin/python3() [0x509a90]
[2020-06-08T10:57:34.306Z] E [bt] (8) /usr/bin/python3() [0x507d64]
[2020-06-08T10:57:34.306Z] E [bt] (7) /usr/bin/python3(_PyEval_EvalFrameDefault+0x444) [0x50bfb4]
[2020-06-08T10:57:34.306Z] E [bt] (6) /usr/bin/python3() [0x50a635]
[2020-06-08T10:57:34.306Z] E [bt] (5) /work/mxnet/python/mxnet/_cy3/ndarray.cpython-36m-x86_64-linux-gnu.so(+0x14a80) [0x7fb629d83a80]
[2020-06-08T10:57:34.306Z] E [bt] (4) /work/mxnet/python/mxnet/../../build/libmxnet.so(MXImperativeInvokeEx+0x7a) [0x7fb69ca2fe1a]
[2020-06-08T10:57:34.306Z] E [bt] (3) /work/mxnet/python/mxnet/../../build/libmxnet.so(MXImperativeInvokeImpl(void*, int, void**, int*, void***, int, char const**, char const**)+0x5d4) [0x7fb69ca2f254]
[2020-06-08T10:57:34.306Z] E [bt] (2) /work/mxnet/python/mxnet/../../build/libmxnet.so(mxnet::Imperative::Invoke(mxnet::Context const&, nnvm::NodeAttrs const&, std::vector<mxnet::NDArray*, std::allocator<mxnet::NDArray*> > const&, std::vector<mxnet::NDArray*, std::allocator<mxnet::NDArray*> > const&)+0xf9) [0x7fb69cb77c79]
[2020-06-08T10:57:34.306Z] E [bt] (1) /work/mxnet/python/mxnet/../../build/libmxnet.so(mxnet::imperative::SetShapeType(mxnet::Context const&, nnvm::NodeAttrs const&, std::vector<mxnet::NDArray*, std::allocator<mxnet::NDArray*> > const&, std::vector<mxnet::NDArray*, std::allocator<mxnet::NDArray*> > const&, mxnet::DispatchMode*)+0x86c) [0x7fb69cb8878c]
[2020-06-08T10:57:34.306Z] E [bt] (0) /work/mxnet/python/mxnet/../../build/libmxnet.so(dmlc::LogMessageFatal::~LogMessageFatal()+0x7f) [0x7fb69c88c82f]
[2020-06-08T10:57:34.306Z] E File "/work/mxnet/src/imperative/./imperative_utils.h", line 173
[2020-06-08T10:57:34.306Z] E MXNetError: Operator _random_randint inferring shapes failed.
[2020-06-08T10:57:34.306Z] E input shapes:
[2020-06-08T10:57:34.306Z] E output shapes:
[2020-06-08T10:57:34.306Z] E None
[2020-06-08T10:57:34.306Z] E operator attributes:
[2020-06-08T10:57:34.306Z] E dtype : int64
[2020-06-08T10:57:34.306Z] E shape : ()
[2020-06-08T10:57:34.306Z] E __profiler_scope__ : <unk>:
[2020-06-08T10:57:34.306Z] E ctx : gpu(0)
[2020-06-08T10:57:34.306Z] E high : 11
[2020-06-08T10:57:34.306Z] E low : 6
[2020-06-08T10:57:34.306Z]
[2020-06-08T10:57:34.306Z] mxnet/cython/./base.pyi:41: MXNetError
[2020-06-08T10:57:34.306Z] __________________________________ test_ifft ___________________________________
[2020-06-08T10:57:34.306Z] [gw1] linux -- Python 3.6.9 /usr/bin/python3
[2020-06-08T10:57:34.306Z]
[2020-06-08T10:57:34.306Z]     @with_seed()
[2020-06-08T10:57:34.306Z]     def test_ifft():
[2020-06-08T10:57:34.306Z]         nrepeat = 2
[2020-06-08T10:57:34.306Z]         maxdim = 10
[2020-06-08T10:57:34.306Z]         for repeat in range(nrepeat):
[2020-06-08T10:57:34.306Z]             for order in [2,4]:
[2020-06-08T10:57:34.306Z]                 shape = tuple(np.random.randint(1, maxdim, size=order))
[2020-06-08T10:57:34.306Z] >               check_ifft(shape)
[2020-06-08T10:57:34.306Z]
[2020-06-08T10:57:34.306Z] tests/python/gpu/test_operator_gpu.py:179:
[2020-06-08T10:57:34.306Z] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[2020-06-08T10:57:34.306Z] tests/python/gpu/test_operator_gpu.py:118: in check_ifft
[2020-06-08T10:57:34.306Z]     init = [np.random.normal(size=shape, scale=1.0)]
[2020-06-08T10:57:34.306Z] python/mxnet/numpy/random.py:207: in normal
[2020-06-08T10:57:34.306Z]     return _mx_nd_np.random.normal(loc, scale, size, dtype, ctx, out)
[2020-06-08T10:57:34.306Z] python/mxnet/ndarray/numpy/random.py:179: in normal
[2020-06-08T10:57:34.306Z]     return _api_internal.normal(loc, scale, size, ctx, dtype, out)
[2020-06-08T10:57:34.306Z] mxnet/_ffi/_cython/./function.pxi:188: in mxnet._ffi._cy3.core.FunctionBase.__call__
[2020-06-08T10:57:34.306Z]     ???
[2020-06-08T10:57:34.306Z] mxnet/_ffi/_cython/./function.pxi:132: in mxnet._ffi._cy3.core.FuncCall
[2020-06-08T10:57:34.306Z]     ???
[2020-06-08T10:57:34.306Z] mxnet/_ffi/_cython/./function.pxi:36: in mxnet._ffi._cy3.core.make_arg
[2020-06-08T10:57:34.306Z]     ???
[2020-06-08T10:57:34.306Z] mxnet/_ffi/_cython/./convert.pxi:73: in mxnet._ffi._cy3.core.convert_object
[2020-06-08T10:57:34.306Z]     ???
[2020-06-08T10:57:34.306Z] mxnet/_ffi/_cython/./convert.pxi:55: in mxnet._ffi._cy3.core.convert_tuple
[2020-06-08T10:57:34.306Z]     ???
[2020-06-08T10:57:34.306Z] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[2020-06-08T10:57:34.306Z]
[2020-06-08T10:57:34.306Z] > ???
[2020-06-08T10:57:34.306Z] E TypeError: Don't know how to convert type <class 'mxnet.numpy.ndarray'>
[2020-06-08T10:57:34.306Z]
[2020-06-08T10:57:34.306Z] mxnet/_ffi/_cython/./convert.pxi:81: TypeError
[2020-06-08T10:57:34.306Z] ____________________ tests/python/gpu/test_operator_gpu.py _____________________
[2020-06-08T10:57:34.306Z] [gw0] linux -- Python 3.6.9 /usr/bin/python3
[2020-06-08T10:57:34.306Z] worker 'gw0' crashed while running 'tests/python/gpu/test_operator_gpu.py::test_fft'
[2020-06-08T10:57:34.564Z] _________________________ test_pooling_nhwc_with_type __________________________
[2020-06-08T10:57:34.564Z] [gw1] linux -- Python 3.6.9 /usr/bin/python3
[2020-06-08T10:57:34.564Z]
[2020-06-08T10:57:34.564Z]     @with_seed()
[2020-06-08T10:57:34.564Z]     def test_pooling_nhwc_with_type():
[2020-06-08T10:57:34.564Z]         def make_pooling_syms(**kwargs):
[2020-06-08T10:57:34.564Z]             # Conventional NCHW layout pooling
[2020-06-08T10:57:34.564Z]             sym = mx.sym.Pooling(**kwargs)
[2020-06-08T10:57:34.564Z]             # NHWC pooling
[2020-06-08T10:57:34.564Z]             data = mx.sym.Variable('pool_data')
[2020-06-08T10:57:34.564Z]             sym_nhwc = mx.sym.transpose(data, axes=(0,2,3,1))
[2020-06-08T10:57:34.564Z]             sym_nhwc = mx.sym.Pooling(sym_nhwc, layout='NHWC', **kwargs)
[2020-06-08T10:57:34.564Z]             sym_nhwc = mx.sym.transpose(sym_nhwc, axes=(0,3,1,2), name='pool')
[2020-06-08T10:57:34.564Z]             return [sym, sym_nhwc]
[2020-06-08T10:57:34.564Z]
[2020-06-08T10:57:34.564Z]         # While the float32 and float64 output is reliably consistent, float16 departs occasionally.
[2020-06-08T10:57:34.564Z]         # We compare nhwc and nchw results only within a given precision.
[2020-06-08T10:57:34.564Z]         for data_type in [np.float64, np.float32, np.float16]:
[2020-06-08T10:57:34.564Z]             # NHWC pooling only enabled on GPU with CUDNN
[2020-06-08T10:57:34.564Z]             ctx_list = [{'ctx': mx.gpu(0), 'pool_data': (10, 2, 10, 10), 'type_dict': {'pool_data': data_type}}]
[2020-06-08T10:57:34.564Z]             symlist = make_pooling_syms(name='pool', kernel=(3,3), stride=(2,2), pool_type='max')
[2020-06-08T10:57:34.564Z] >           check_consistency_NxM(symlist, ctx_list)
[2020-06-08T10:57:34.564Z]
[2020-06-08T10:57:34.564Z] tests/python/gpu/test_operator_gpu.py:1107:
[2020-06-08T10:57:34.564Z] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[2020-06-08T10:57:34.564Z] tests/python/gpu/test_operator_gpu.py:647: in check_consistency_NxM
[2020-06-08T10:57:34.564Z]     check_consistency(np.repeat(sym_list, len(ctx_list)), ctx_list * len(sym_list), scale=0.5)
[2020-06-08T10:57:34.564Z] python/mxnet/numpy/multiarray.py:5748: in repeat
[2020-06-08T10:57:34.564Z]     return _mx_nd_np.repeat(a, repeats, axis)
[2020-06-08T10:57:34.564Z] python/mxnet/ndarray/numpy/_op.py:4079: in repeat
[2020-06-08T10:57:34.564Z]     return _api_internal.repeat(a, repeats, axis)
[2020-06-08T10:57:34.564Z] mxnet/_ffi/_cython/./function.pxi:188: in mxnet._ffi._cy3.core.FunctionBase.__call__
[2020-06-08T10:57:34.564Z]     ???
[2020-06-08T10:57:34.564Z] mxnet/_ffi/_cython/./function.pxi:120: in mxnet._ffi._cy3.core.FuncCall
[2020-06-08T10:57:34.564Z]     ???
[2020-06-08T10:57:34.564Z] mxnet/_ffi/_cython/./function.pxi:107: in mxnet._ffi._cy3.core.FuncCall3
[2020-06-08T10:57:34.564Z]     ???
[2020-06-08T10:57:34.564Z] mxnet/_ffi/_cython/./function.pxi:36: in mxnet._ffi._cy3.core.make_arg
[2020-06-08T10:57:34.564Z]     ???
[2020-06-08T10:57:34.564Z] mxnet/_ffi/_cython/./convert.pxi:75: in mxnet._ffi._cy3.core.convert_object
[2020-06-08T10:57:34.564Z]     ???
[2020-06-08T10:57:34.564Z] mxnet/_ffi/_cython/./convert.pxi:63: in mxnet._ffi._cy3.core.convert_list
[2020-06-08T10:57:34.564Z]     ???
[2020-06-08T10:57:34.564Z] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[2020-06-08T10:57:34.564Z]
[2020-06-08T10:57:34.564Z] > ???
[2020-06-08T10:57:34.564Z] E TypeError: Don't know how to convert type <class 'mxnet.symbol.symbol.Symbol'>
[2020-06-08T10:57:34.564Z]
[2020-06-08T10:57:34.564Z] mxnet/_ffi/_cython/./convert.pxi:81: TypeError
[2020-06-08T10:57:34.564Z] ____________________________ test_lstm_forget_bias _____________________________
[2020-06-08T10:57:34.564Z] [gw1] linux -- Python 3.6.9 /usr/bin/python3
[2020-06-08T10:57:34.564Z]
[2020-06-08T10:57:34.564Z]     @with_seed()
[2020-06-08T10:57:34.564Z]     @assert_raises_cudnn_not_satisfied(min_version='5.1.10')
[2020-06-08T10:57:34.564Z]     def test_lstm_forget_bias():
[2020-06-08T10:57:34.564Z]         forget_bias = 2.0
[2020-06-08T10:57:34.564Z]         fused = mx.rnn.FusedRNNCell(10, forget_bias=forget_bias, num_layers=2, mode='lstm', prefix='')
[2020-06-08T10:57:34.564Z]
[2020-06-08T10:57:34.564Z]         dshape = (32, 1, 20)
[2020-06-08T10:57:34.564Z]         data = mx.sym.Variable('data')
[2020-06-08T10:57:34.564Z]
[2020-06-08T10:57:34.564Z]         sym, _ = fused.unroll(1, data, merge_outputs=True)
[2020-06-08T10:57:34.564Z]         mod = mx.mod.Module(sym, label_names=None, context=mx.gpu(0))
[2020-06-08T10:57:34.564Z]         mod.bind(data_shapes=[('data', dshape)], label_shapes=None)
[2020-06-08T10:57:34.564Z]
[2020-06-08T10:57:34.564Z]         mod.init_params()
[2020-06-08T10:57:34.564Z]
[2020-06-08T10:57:34.564Z]         args, auxs = mod.get_params()
[2020-06-08T10:57:34.564Z]         args = fused.unpack_weights(args)
[2020-06-08T10:57:34.564Z]
[2020-06-08T10:57:34.564Z]         bias_name = next(x for x in args if x.endswith('f_bias'))
[2020-06-08T10:57:34.564Z]         expected_bias = forget_bias * np.ones(10, )
[2020-06-08T10:57:34.564Z] >       mx.test_utils.assert_allclose(args[bias_name], expected_bias)
[2020-06-08T10:57:34.564Z]
[2020-06-08T10:57:34.564Z] tests/python/gpu/test_operator_gpu.py:1778:
[2020-06-08T10:57:34.564Z] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[2020-06-08T10:57:34.564Z] python/mxnet/test_utils.py:641: in assert_allclose
[2020-06-08T10:57:34.564Z]     assert_almost_equal(a, b, rtol=rtol, atol=atol, equal_nan=equal_nan)
[2020-06-08T10:57:34.564Z] python/mxnet/test_utils.py:601: in assert_almost_equal
[2020-06-08T10:57:34.564Z]     output = mx.nd.contrib.allclose(a, b, rtol, atol, equal_nan)
[2020-06-08T10:57:34.564Z] <string>:70: in allclose
[2020-06-08T10:57:34.564Z]     ???
[2020-06-08T10:57:34.564Z] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[2020-06-08T10:57:34.564Z]
[2020-06-08T10:57:34.564Z] op_name = '_contrib_allclose', func_name = 'allclose'
[2020-06-08T10:57:34.564Z] args = [
[2020-06-08T10:57:34.564Z] [2. 2. 2. 2. 2. 2. 2. 2. 2. 2.]
[2020-06-08T10:57:34.564Z] <NDArray 10 @gpu(0)>, array([2., 2., 2., 2., 2., 2., 2., 2., 2., 2.], ctx=gpu(0))]
[2020-06-08T10:57:34.564Z] out = None
[2020-06-08T10:57:34.564Z]
[2020-06-08T10:57:34.564Z]     def _verify_all_legacy_ndarrays(op_name, func_name, args, out):
[2020-06-08T10:57:34.564Z]         """Verify if all the arrays are legacy ndarrays.
[2020-06-08T10:57:34.564Z]
[2020-06-08T10:57:34.564Z]         Parameters
[2020-06-08T10:57:34.564Z]         ----------
[2020-06-08T10:57:34.564Z]         op_name : str
[2020-06-08T10:57:34.564Z]             Operator full name registered in backend.
[2020-06-08T10:57:34.564Z]         func_name : str
[2020-06-08T10:57:34.564Z]             Operator name exposed to users. This is usually the name by stripping off
[2020-06-08T10:57:34.564Z]             the prefix of the full operator names registered in backend.
[2020-06-08T10:57:34.564Z]         args : list of arrays
[2020-06-08T10:57:34.564Z]             Input ndarray arguments to be checked.
[2020-06-08T10:57:34.564Z]         out : ndarray or None or list of ndarrays
[2020-06-08T10:57:34.564Z]             User-provided output ndarrays.
[2020-06-08T10:57:34.564Z]         """
[2020-06-08T10:57:34.564Z]         from ..numpy import ndarray as np_ndarray
[2020-06-08T10:57:34.564Z]         for arr in args:
[2020-06-08T10:57:34.564Z]             if (arr is not None) and (isinstance(arr, np_ndarray)):
[2020-06-08T10:57:34.565Z]                 raise TypeError('Operator `{}` registered in backend is known as `{}` in Python. '
[2020-06-08T10:57:34.565Z]                                 'This is a legacy operator which can only accept '
[2020-06-08T10:57:34.565Z]                                 'legacy ndarrays, while received an MXNet numpy ndarray. '
[2020-06-08T10:57:34.565Z]                                 'Please call `as_nd_ndarray()` upon the numpy ndarray to '
[2020-06-08T10:57:34.565Z]                                 'convert it to a legacy ndarray, and then feed the converted '
[2020-06-08T10:57:34.565Z]                                 'array to this operator.'
[2020-06-08T10:57:34.565Z] >                               .format(op_name, func_name))
[2020-06-08T10:57:34.565Z] E TypeError: Operator `_contrib_allclose` registered in backend is known as `allclose` in Python. This is a legacy operator which can only accept legacy ndarrays, while received an MXNet numpy ndarray. Please call `as_nd_ndarray()` upon the numpy ndarray to convert it to a legacy ndarray, and then feed the converted array to this operator.
[2020-06-08T10:57:34.565Z]
[2020-06-08T10:57:34.565Z] python/mxnet/ndarray/register.py:98: TypeError
[2020-06-08T10:57:34.565Z] _____________________________ test_take_with_type ______________________________
[2020-06-08T10:57:34.565Z] [gw3] linux -- Python 3.6.9 /usr/bin/python3
[2020-06-08T10:57:34.565Z]
[2020-06-08T10:57:34.565Z]     @with_seed()
[2020-06-08T10:57:34.565Z]     def test_take_with_type():
[2020-06-08T10:57:34.565Z]         sym = mx.sym.take(name='take')
[2020-06-08T10:57:34.565Z]         for data_ndim in range(2, 5):
[2020-06-08T10:57:34.565Z]             for idx_ndim in range(1, 4):
[2020-06-08T10:57:34.565Z]                 data_shape = ()
[2020-06-08T10:57:34.565Z]                 for _ in range(data_ndim):
[2020-06-08T10:57:34.565Z] >                   data_shape += (np.random.randint(low=3, high=6), )
[2020-06-08T10:57:34.565Z]
[2020-06-08T10:57:34.565Z] tests/python/gpu/test_operator_gpu.py:1683:
[2020-06-08T10:57:34.565Z] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[2020-06-08T10:57:34.565Z] python/mxnet/numpy/random.py:79: in randint
[2020-06-08T10:57:34.565Z]     return _mx_nd_np.random.randint(low, high, size, dtype, ctx, out)
[2020-06-08T10:57:34.565Z] python/mxnet/ndarray/numpy/random.py:91: in randint
[2020-06-08T10:57:34.565Z]     return _npi.random_randint(low, high, shape=size, dtype=dtype, ctx=ctx, out=out)
[2020-06-08T10:57:34.565Z] <string>:58: in random_randint
[2020-06-08T10:57:34.565Z]     ???
[2020-06-08T10:57:34.565Z] mxnet/cython/ndarray.pyx:219: in mxnet._cy3.ndarray._imperative_invoke
[2020-06-08T10:57:34.565Z]     ???
[2020-06-08T10:57:34.565Z] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[2020-06-08T10:57:34.565Z]
[2020-06-08T10:57:34.565Z] > ???
[2020-06-08T10:57:34.565Z] E mxnet.base.MXNetError: Traceback (most recent call last):
[2020-06-08T10:57:34.565Z] E [bt] (9) /usr/bin/python3() [0x509a90]
[2020-06-08T10:57:34.565Z] E [bt] (8) /usr/bin/python3() [0x507d64]
[2020-06-08T10:57:34.565Z] E [bt] (7) /usr/bin/python3(_PyEval_EvalFrameDefault+0x444) [0x50bfb4]
[2020-06-08T10:57:34.565Z] E [bt] (6) /usr/bin/python3() [0x50a635]
[2020-06-08T10:57:34.565Z] E [bt] (5) /work/mxnet/python/mxnet/_cy3/ndarray.cpython-36m-x86_64-linux-gnu.so(+0x14a80) [0x7fe3cb2b0a80]
[2020-06-08T10:57:34.565Z] E [bt] (4) /work/mxnet/python/mxnet/../../build/libmxnet.so(MXImperativeInvokeEx+0x7a) [0x7fe43e421e1a]
[2020-06-08T10:57:34.565Z] E [bt] (3) /work/mxnet/python/mxnet/../../build/libmxnet.so(MXImperativeInvokeImpl(void*, int, void**, int*, void***, int, char const**, char const**)+0x5d4) [0x7fe43e421254]
[2020-06-08T10:57:34.565Z] E [bt] (2) /work/mxnet/python/mxnet/../../build/libmxnet.so(mxnet::Imperative::Invoke(mxnet::Context const&, nnvm::NodeAttrs const&, std::vector<mxnet::NDArray*, std::allocator<mxnet::NDArray*> > const&, std::vector<mxnet::NDArray*, std::allocator<mxnet::NDArray*> > const&)+0xf9) [0x7fe43e569c79]
[2020-06-08T10:57:34.565Z] E [bt] (1) /work/mxnet/python/mxnet/../../build/libmxnet.so(mxnet::imperative::SetShapeType(mxnet::Context const&, nnvm::NodeAttrs const&, std::vector<mxnet::NDArray*, std::allocator<mxnet::NDArray*> > const&, std::vector<mxnet::NDArray*, std::allocator<mxnet::NDArray*> > const&, mxnet::DispatchMode*)+0x86c) [0x7fe43e57a78c]
[2020-06-08T10:57:34.565Z] E [bt] (0) /work/mxnet/python/mxnet/../../build/libmxnet.so(dmlc::LogMessageFatal::~LogMessageFatal()+0x7f) [0x7fe43e27e82f]
[2020-06-08T10:57:34.565Z] E File "/work/mxnet/src/imperative/./imperative_utils.h", line 173
[2020-06-08T10:57:34.565Z] E MXNetError: Operator _random_randint inferring shapes failed.
[2020-06-08T10:57:34.565Z] E input shapes:
[2020-06-08T10:57:34.565Z] E output shapes:
[2020-06-08T10:57:34.565Z] E None
[2020-06-08T10:57:34.565Z] E operator attributes:
[2020-06-08T10:57:34.565Z] E dtype : int64
[2020-06-08T10:57:34.565Z] E shape : ()
[2020-06-08T10:57:34.565Z] E __profiler_scope__ : <unk>:
[2020-06-08T10:57:34.565Z] E ctx : gpu(0)
[2020-06-08T10:57:34.565Z] E high : 6
[2020-06-08T10:57:34.565Z] E low : 3
[2020-06-08T10:57:34.565Z]
[2020-06-08T10:57:34.565Z] mxnet/cython/./base.pyi:41: MXNetError
[2020-06-08T10:57:34.565Z] =============================== warnings summary ===============================
```
Full CI log: http://jenkins.mxnet-ci.amazon-ml.com/blue/rest/organizations/jenkins/pipelines/mxnet-validation/pipelines/unix-gpu/branches/PR-18403/runs/1/nodes/385/steps/478/log/?start=0
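
The `_random_randint` failures (`test_preloaded_multi_sgd`, `test_take_with_type`) and the FFI conversion errors (`test_ifft`, `test_pooling_nhwc_with_type`) look like two symptoms of the same thing: `np` inside the test module resolving to `mxnet.numpy` rather than official NumPy. The tracebacks go through `python/mxnet/numpy/random.py`, a scalar draw is dispatched to the `_npi.random_randint` operator with `shape : ()` and fails shape inference, and the resulting `mxnet.numpy.ndarray` (or the `Symbol` list passed to `np.repeat`) then cannot be converted by the cython FFI. A minimal sketch of the API difference; that the test module aliases `np` to `mxnet.numpy` is an assumption inferred from the traceback paths, not verified here:

```python
import numpy as onp        # official NumPy
import mxnet as mx
from mxnet import npx

npx.set_np()               # enable numpy (zero-dim shape) semantics

low, high = 6, 11

# Official NumPy returns a plain Python scalar, directly usable as a loop
# bound or as a component of a shape tuple:
nparam = int(onp.random.randint(low, high))

# mxnet.numpy instead dispatches the _npi.random_randint operator for a
# 0-dim int64 result (shape ()); in the log above that dispatch is exactly
# what fails shape inference on gpu(0).
nparam_mx = mx.np.random.randint(low, high)
```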