To be safer, uninstall Theano a few times first and make sure you can no
longer import it before reinstalling.
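
For instance, a minimal sketch (repeat the uninstall until pip reports that
nothing is left, then check that the import fails):

    pip uninstall theano
    pip uninstall theano            # repeat until pip says it is not installed
    python -c "import theano"       # should now fail with ImportError
    pip install git+https://github.com/Theano/Theano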

On Nov 15, 2016, at 16:20, "Pascal Lamblin" <[email protected]> wrote:

> On Tue, Nov 15, 2016, Chi Ku wrote:
> > Hi Pascal,
> >
> >    How would I find out which version of Theano I have?
>
> You can try printing theano.version.version; it is generated when you run
> "pip install".
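>
> For example, from a Python shell:
>
>     import theano
>     print(theano.version.version)   # a release prints e.g. '0.8.2'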
>
> >    Can I use the following command to install the latest development version?
> >
> >       <sudo> pip install <--user> <--no-deps> git+https://github.com/Theano/Theano
>
> I think so.
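>
> Concretely, for a per-user install that leaves the dependencies alone,
> something like:
>
>     pip install --user --no-deps git+https://github.com/Theano/Theano
>
> (prefix with sudo and drop --user for a system-wide install).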
>
> >
> >    Thanks.
> >
> > Chi
> >
> >
> >
> >
> >
> >
> > -----Original Message-----
> > From: [email protected] [mailto:[email protected]] On Behalf Of Pascal Lamblin
> > Sent: Monday, November 14, 2016 12:37 PM
> > To: [email protected]
> > Subject: Re: [theano-users] TypeError: Cannot convert Type TensorType(float64, 3D) (of Variable Subtensor{:int64:}.0) into Type TensorType(float64, (False, True, False)). You can try to manually convert Subtensor{:int64:}.0 into a TensorType(float64, (False, Tr...
> >
> > Which version of Theano are you using?
> > If you are using the 0.8.2 release, can you try the latest development version?
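> >
> > If upgrading doesn't help, the error message itself points at a manual
> > conversion; a minimal sketch, assuming x names the offending variable and
> > its middle axis always has length 1:
> >
> >     import theano.tensor as T
> >     x = T.addbroadcast(x, 1)   # mark axis 1 as broadcastable, i.e.
> >                                # TensorType(float64, (False, True, False))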
> >
> > On Sat, Nov 12, 2016, [email protected] wrote:
> > >
> > > I got the following errors when calling tensor.grad() to compute the
> > > symbolic gradient of the finetune_cost of a hybrid DBN-RNN model.
> > > I tried forming this expression in several different ways, without any
> > > success. I need help from experts. A tar file of the source code and
> > > data is attached.
> > >
> > > The pretraining code for file hybrid_array.py ran for 0.48m
> > > ... getting the finetuning functions
> > > Traceback (most recent call last):
> > >   File "/usr/lib/python2.7/pdb.py", line 1314, in main
> > >     pdb._runscript(mainpyfile)
> > >   File "/usr/lib/python2.7/pdb.py", line 1233, in _runscript
> > >     self.run(statement)
> > >   File "/usr/lib/python2.7/bdb.py", line 400, in run
> > >     exec cmd in globals, locals
> > >   File "<string>", line 1, in <module>
> > >   File "hybrid_array.py", line 494, in <module>
> > >     test_DBN()
> > >   File "hybrid_array.py", line 412, in test_DBN
> > >     learning_rate=finetune_lr
> > >   File "hybrid_array.py", line 253, in build_finetune_functions
> > >     gparams = T.grad(self.finetune_cost, self.params)
> > >   File "/usr/local/lib/python2.7/dist-packages/theano/gradient.py", line 561, in grad
> > >     grad_dict, wrt, cost_name)
> > >   File "/usr/local/lib/python2.7/dist-packages/theano/gradient.py", line 1324, in _populate_grad_dict
> > >     rval = [access_grad_cache(elem) for elem in wrt]
> > >   File "/usr/local/lib/python2.7/dist-packages/theano/gradient.py", line 1279, in access_grad_cache
> > >     term = access_term_cache(node)[idx]
> > >   File "/usr/local/lib/python2.7/dist-packages/theano/gradient.py", line 973, in access_term_cache
> > >     output_grads = [access_grad_cache(var) for var in node.outputs]
> > >   File "/usr/local/lib/python2.7/dist-packages/theano/gradient.py", line 1279, in access_grad_cache
> > >     term = access_term_cache(node)[idx]
> > >   File "/usr/local/lib/python2.7/dist-packages/theano/gradient.py", line 973, in access_term_cache
> > >     output_grads = [access_grad_cache(var) for var in node.outputs]
> > >   File "/usr/local/lib/python2.7/dist-packages/theano/gradient.py", line 1279, in access_grad_cache
> > >     term = access_term_cache(node)[idx]
> > >   File "/usr/local/lib/python2.7/dist-packages/theano/gradient.py", line 973, in access_term_cache
> > >     output_grads = [access_grad_cache(var) for var in node.outputs]
> > >   File "/usr/local/lib/python2.7/dist-packages/theano/gradient.py", line 1279, in access_grad_cache
> > >     term = access_term_cache(node)[idx]
> > >   File "/usr/local/lib/python2.7/dist-packages/theano/gradient.py", line 973, in access_term_cache
> > >     output_grads = [access_grad_cache(var) for var in node.outputs]
> > >   File "/usr/local/lib/python2.7/dist-packages/theano/gradient.py", line 1279, in access_grad_cache
> > >     term = access_term_cache(node)[idx]
> > >   File "/usr/local/lib/python2.7/dist-packages/theano/gradient.py", line 973, in access_term_cache
> > >     output_grads = [access_grad_cache(var) for var in node.outputs]
> > >   File "/usr/local/lib/python2.7/dist-packages/theano/gradient.py", line 1279, in access_grad_cache
> > >     term = access_term_cache(node)[idx]
> > >   File "/usr/local/lib/python2.7/dist-packages/theano/gradient.py", line 973, in access_term_cache
> > >     output_grads = [access_grad_cache(var) for var in node.outputs]
> > >   File "/usr/local/lib/python2.7/dist-packages/theano/gradient.py", line 1279, in access_grad_cache
> > >     term = access_term_cache(node)[idx]
> > >   File "/usr/local/lib/python2.7/dist-packages/theano/gradient.py", line 973, in access_term_cache
> > >     output_grads = [access_grad_cache(var) for var in node.outputs]
> > >   File "/usr/local/lib/python2.7/dist-packages/theano/gradient.py", line 1279, in access_grad_cache
> > >     term = access_term_cache(node)[idx]
> > >   File "/usr/local/lib/python2.7/dist-packages/theano/gradient.py", line 973, in access_term_cache
> > >     output_grads = [access_grad_cache(var) for var in node.outputs]
> > >   File "/usr/local/lib/python2.7/dist-packages/theano/gradient.py", line 1279, in access_grad_cache
> > >     term = access_term_cache(node)[idx]
> > >   File "/usr/local/lib/python2.7/dist-packages/theano/gradient.py", line 973, in access_term_cache
> > >     output_grads = [access_grad_cache(var) for var in node.outputs]
> > >   File "/usr/local/lib/python2.7/dist-packages/theano/gradient.py", line 1279, in access_grad_cache
> > >     term = access_term_cache(node)[idx]
> > >   File "/usr/local/lib/python2.7/dist-packages/theano/gradient.py", line 973, in access_term_cache
> > >     output_grads = [access_grad_cache(var) for var in node.outputs]
> > >   File "/usr/local/lib/python2.7/dist-packages/theano/gradient.py", line 1279, in access_grad_cache
> > >     term = access_term_cache(node)[idx]
> > >   File "/usr/local/lib/python2.7/dist-packages/theano/gradient.py", line 973, in access_term_cache
> > >     output_grads = [access_grad_cache(var) for var in node.outputs]
> > >   File "/usr/local/lib/python2.7/dist-packages/theano/gradient.py", line 1279, in access_grad_cache
> > >     term = access_term_cache(node)[idx]
> > >   File "/usr/local/lib/python2.7/dist-packages/theano/gradient.py", line 973, in access_term_cache
> > >     output_grads = [access_grad_cache(var) for var in node.outputs]
> > >   File "/usr/local/lib/python2.7/dist-packages/theano/gradient.py", line 1279, in access_grad_cache
> > >     term = access_term_cache(node)[idx]
> > >   File "/usr/local/lib/python2.7/dist-packages/theano/gradient.py", line 973, in access_term_cache
> > >     output_grads = [access_grad_cache(var) for var in node.outputs]
> > >   File "/usr/local/lib/python2.7/dist-packages/theano/gradient.py", line 1279, in access_grad_cache
> > >     term = access_term_cache(node)[idx]
> > >   File "/usr/local/lib/python2.7/dist-packages/theano/gradient.py", line 973, in access_term_cache
> > >     output_grads = [access_grad_cache(var) for var in node.outputs]
> > >   File "/usr/local/lib/python2.7/dist-packages/theano/gradient.py", line 1279, in access_grad_cache
> > >     term = access_term_cache(node)[idx]
> > >   File "/usr/local/lib/python2.7/dist-packages/theano/gradient.py", line 1113, in access_term_cache
> > >     input_grads = node.op.grad(inputs, new_output_grads)
> > >   File "/usr/local/lib/python2.7/dist-packages/theano/scan_module/scan_op.py", line 2523, in grad
> > >     outputs = local_op(*outer_inputs)
> > >   File "/usr/local/lib/python2.7/dist-packages/theano/gof/op.py", line 611, in __call__
> > >     node = self.make_node(*inputs, **kwargs)
> > >   File "/usr/local/lib/python2.7/dist-packages/theano/scan_module/scan_op.py", line 430, in make_node
> > >     new_inputs.append(format(outer_seq, as_var=inner_seq))
> > >   File "/usr/local/lib/python2.7/dist-packages/theano/scan_module/scan_op.py", line 422, in format
> > >     rval = tmp.filter_variable(rval)
> > >   File "/usr/local/lib/python2.7/dist-packages/theano/tensor/type.py", line 233, in filter_variable
> > >     self=self))
> > > TypeError: Cannot convert Type TensorType(float64, 3D) (of Variable
> > > Subtensor{:int64:}.0) into Type TensorType(float64, (False, True, False)).
> > > You can try to manually convert Subtensor{:int64:}.0 into a
> > > TensorType(float64, (False, True, False)).
> > > Uncaught exception. Entering post mortem debugging
> > > Running 'cont' or 'step' will restart the program
> > > > /usr/local/lib/python2.7/dist-packages/theano/tensor/type.py(233)filter_variable()
> > > -> self=self))
> > >
> > >
> > >
> >
> >
> >
> > --
> > Pascal
> >
> >
>
> --
> Pascal
>

