Hi, I have already set optimizer = None, but Theano still raises a MemoryError.
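
One thing worth checking is whether the setting is actually being picked up: THEANO_FLAGS takes precedence over .theanorc, so forcing the flag from the environment before Theano is imported rules out a stale or mis-located config file. A minimal sketch (the training code itself is omitted; exception_verbosity=high just makes Theano's error reports more detailed):

```python
import os

# Must be set *before* theano is imported; THEANO_FLAGS overrides
# anything in .theanorc, so this confirms the optimizer setting applies.
os.environ["THEANO_FLAGS"] = "optimizer=None,exception_verbosity=high"

# import theano  # the import has to come after the line above
```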

Thanks,
Tsuyoshi.

On Friday, May 24, 2019 at 3:47:16 PM UTC+9, toto haryanto wrote:
>
> Dear Miyamoto,
> It seems to be an out-of-memory problem.
> You can open the file *.theanorc* and edit the *optimizer* setting:
> try *optimizer = fast_run*, or if that does not help, set
> *optimizer = None*.
>
> Thanks
>
> Regards
> Toto 
>
>
>
> *Toto Haryanto*
>
> *Doctoral Student*
> *Faculty of Computer Science*
> *Universitas Indonesia*
>
> ==========================
> Office: 
> Department of Computer Science, IPB
> Jl. Meranti Wing 20 Level 5
> Kampus Darmaga-Bogor
> (0251)8625584
> <http://cs.ipb.ac.id/~bioinfo/>
> email : totoha...@[ipb.ac.id || apps.ipb.ac.id]
> http://totoharyanto.staff.ipb.ac.id
> http://cs.ipb.ac.id/~bioinfo/
> ==========================
>
>
> On Fri, May 24, 2019 at 10:55 AM 宮本剛 <[email protected]> wrote:
>
>> Hi.
>>
>> I tried to run a sample Python program from the book 
>> Deep-Learning-with-Keras 
>> <https://github.com/PacktPublishing/Deep-Learning-with-Keras>,
>> but it raises a MemoryError with the following details:
>>
>> MemoryError: 
>> Apply node that caused the error: Elemwise{sgn}(Elemwise{add,no_inplace}.0)
>> Toposort index: 419
>> Inputs types: [TensorType(float32, 4D)]
>> Inputs shapes: [(3, 400, 400, 64)]
>> Inputs strides: [(40960000, 1600, 4, 640000)]
>> Inputs values: ['not shown']
>> Inputs type_num: [11]
>> Outputs clients: [[Elemwise{mul}(Elemwise{mul}.0, Elemwise{sgn}.0)]]
>>
>> Backtrace when the node is created(use Theano flag traceback.limit=N to make 
>> it longer):
>>   File "C:\ProgramData\Anaconda3\lib\site-packages\theano\gradient.py", line 
>> 1326, in access_grad_cache
>>     term = access_term_cache(node)[idx]
>>   File "C:\ProgramData\Anaconda3\lib\site-packages\theano\gradient.py", line 
>> 1021, in access_term_cache
>>     output_grads = [access_grad_cache(var) for var in node.outputs]
>>   File "C:\ProgramData\Anaconda3\lib\site-packages\theano\gradient.py", line 
>> 1021, in <listcomp>
>>     output_grads = [access_grad_cache(var) for var in node.outputs]
>>   File "C:\ProgramData\Anaconda3\lib\site-packages\theano\gradient.py", line 
>> 1326, in access_grad_cache
>>     term = access_term_cache(node)[idx]
>>   File "C:\ProgramData\Anaconda3\lib\site-packages\theano\gradient.py", line 
>> 1021, in access_term_cache
>>     output_grads = [access_grad_cache(var) for var in node.outputs]
>>   File "C:\ProgramData\Anaconda3\lib\site-packages\theano\gradient.py", line 
>> 1021, in <listcomp>
>>     output_grads = [access_grad_cache(var) for var in node.outputs]
>>   File "C:\ProgramData\Anaconda3\lib\site-packages\theano\gradient.py", line 
>> 1326, in access_grad_cache
>>     term = access_term_cache(node)[idx]
>>   File "C:\ProgramData\Anaconda3\lib\site-packages\theano\gradient.py", line 
>> 1162, in access_term_cache
>>     new_output_grads)
>>
>> Debugprint of the apply node: 
>> Elemwise{sgn} [id A] <TensorType(float32, 4D)> ''   
>>  |Elemwise{add,no_inplace} [id B] <TensorType(float32, 4D)> ''   
>>    |InplaceDimShuffle{0,2,3,1} [id C] <TensorType(float32, 4D)> ''   
>>    | |AbstractConv2d{convdim=2, border_mode='half', subsample=(1, 1), 
>> filter_flip=True, imshp=(None, 3, None, None), kshp=(64, 3, 3, 3), 
>> filter_dilation=(1, 1), num_groups=1, unshared=False} [id D] 
>> <TensorType(float32, 4D)> ''   
>>    |   |InplaceDimShuffle{0,3,1,2} [id E] <TensorType(float32, 4D)> ''   
>>    |   | |Join [id F] <TensorType(float32, 4D)> ''   
>>    |   |   |TensorConstant{0} [id G] <TensorType(int8, scalar)>
>>    |   |   |/variable [id H] <TensorType(float32, 4D)>
>>    |   |   |/variable [id I] <TensorType(float32, 4D)>
>>    |   |   |/placeholder [id J] <TensorType(float32, 4D)>
>>    |   |InplaceDimShuffle{3,2,0,1} [id K] <TensorType(float32, 4D)> ''   
>>    |     |block1_conv1/kernel [id L] <TensorType(float32, 4D)>
>>    |Reshape{4} [id M] <TensorType(float32, (True, True, True, False))> ''   
>>      |block1_conv1/bias [id N] <TensorType(float32, vector)>
>>      |MakeVector{dtype='int64'} [id O] <TensorType(int64, vector)> ''   
>>        |Elemwise{Cast{int64}} [id P] <TensorType(int64, scalar)> ''   
>>        | |TensorConstant{1} [id Q] <TensorType(int8, scalar)>
>>        |Elemwise{Cast{int64}} [id R] <TensorType(int64, scalar)> ''   
>>        | |TensorConstant{1} [id Q] <TensorType(int8, scalar)>
>>        |Elemwise{Cast{int64}} [id S] <TensorType(int64, scalar)> ''   
>>        | |TensorConstant{1} [id Q] <TensorType(int8, scalar)>
>>        |Subtensor{int64} [id T] <TensorType(int64, scalar)> ''   
>>          |Shape [id U] <TensorType(int64, vector)> ''   
>>          | |block1_conv1/bias [id N] <TensorType(float32, vector)>
>>          |Constant{0} [id V] <int64>
>>
>>
>> I don't know what's wrong.
>>
>>
>>
>
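
For context on why memory runs out even with the optimizer disabled: the failing node's input in the quoted traceback has shape (3, 400, 400, 64) in float32, which is already a sizeable buffer on its own, and backpropagation keeps many such intermediates alive at once. A rough size check:

```python
# Shape and dtype taken from the quoted traceback:
# Inputs shapes: [(3, 400, 400, 64)], float32 (4 bytes per element)
shape = (3, 400, 400, 64)
num_elements = 1
for dim in shape:
    num_elements *= dim
nbytes = num_elements * 4
print(f"{nbytes / 2**20:.1f} MiB per intermediate tensor")  # ~117 MiB
```

With dozens of nodes like this in the gradient graph, a 32-bit Python build or a machine with only a few GB of RAM can easily run out; shrinking the input image size (400x400 here) or the batch size reduces this footprint directly.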

-- 

--- 
You received this message because you are subscribed to the Google Groups 
"theano-users" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to [email protected].
To view this discussion on the web visit 
https://groups.google.com/d/msgid/theano-users/08903454-b4f8-4b87-ab2e-fa72fa43ae7c%40googlegroups.com.
For more options, visit https://groups.google.com/d/optout.
