reraise(exc_type, exc_value, exc_trace)

  File "/usr/local/lib/python2.7/dist-packages/theano/compile/function_module.py", line 884, in __call__

    self.fn() if output_subset is None else\

RuntimeError: Cuda error: k_elemwise_unary_rowmajor_copy: unspecified launch failure. (n_blocks=3829, n_threads_per_block=256)


Apply node that caused the error: GpuAlloc(GpuElemwise{Composite{(i0 * (Composite{(((i0 - i1) * i2) + i3)}(i1, i2, i3, i4) + Abs(Composite{(((i0 - i1) * i2) + i3)}(i1, i2, i3, i4))))}}[(0, 1)].0, TensorConstant{1}, TensorConstant{16}, TensorConstant{175}, TensorConstant{175}, TensorConstant{2})

Toposort index: 225

Inputs types: [CudaNdarrayType(float32, (False, False, False, False, True)), TensorType(int64, scalar), TensorType(int64, scalar), TensorType(int64, scalar), TensorType(int64, scalar), TensorType(int8, scalar)]

Inputs shapes: [(1, 16, 175, 175, 1), (), (), (), (), ()]

Inputs strides: [(0, 30625, 175, 1, 0), (), (), (), (), ()]

Inputs values: ['not shown', array(1), array(16), array(175), array(175), array(2, dtype=int8)]

Outputs clients: [[Rebroadcast{0}(GpuAlloc.0)]]


HINT: Re-running with most Theano optimizations disabled could give you a back-trace of when this node was created. This can be done by setting the Theano flag 'optimizer=fast_compile'. If that does not work, Theano optimizations can be disabled with 'optimizer=None'.

HINT: Use the Theano flag 'exception_verbosity=high' for a debugprint and storage map footprint of this apply node.
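
For reference, this is roughly how I plan to apply those hints on the next run. It is only a sketch: it assumes the flags are picked up from the THEANO_FLAGS environment variable, which has to be set before theano is imported.

    # Sketch: enable the flags from the HINTs above via THEANO_FLAGS.
    # The variable must be set before theano is imported to take effect.
    import os
    os.environ['THEANO_FLAGS'] = 'optimizer=fast_compile,exception_verbosity=high'

    import theano  # imported only after the flags are set

    print(theano.config.optimizer)            # expect 'fast_compile'
    print(theano.config.exception_verbosity)  # expect 'high'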


What could be the problem? I am running on two GPUs (K80s) simultaneously, on AWS; a sketch of the kind of two-GPU launch I mean is below. It had run many times without problems until this run, which failed as shown above.
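
Just to make the setup concrete, here is a minimal sketch of what I mean by using two GPUs at once: two independent Theano processes, each pinned to one K80 with the old GPU backend's 'device' flag. The script name 'train.py' is only a placeholder, not my actual code.

    # Sketch only: launch two separate processes, one per K80, using the
    # old backend's 'device' flag (gpu0 / gpu1). 'train.py' is a placeholder.
    import os
    import subprocess

    procs = []
    for dev in ('gpu0', 'gpu1'):
        env = dict(os.environ)
        env['THEANO_FLAGS'] = 'device=%s,floatX=float32' % dev
        procs.append(subprocess.Popen(['python', 'train.py'], env=env))

    for p in procs:
        p.wait()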
