I am similarly dealing with a large number of video clips. I have stored the 
clips in LMDB format. I am loading the dataset as follows: 
X_train, y_train, _ = load_data(some arguments)
X_val, y_val, _ = load_data(some arguments)
X_test, y_test, _ = load_data(some arguments)

data = (X_train, y_train, X_val, y_val, X_test, y_test)

Here the number of X_train clips is 912, and X_val and X_test each have 144 clips.
Is this the right way to do it? My program raises a memory error when I try to 
expand the dataset size. Can anyone suggest a way to load the data in batches 
directly from the LMDB database, rather than all at once? A block of code 
would be very helpful.
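
To make the request concrete, below is a rough sketch of the kind of generator 
I have in mind. It is only a sketch: it assumes each record was written with 
pickle.dumps((clip, label)) under a zero-padded integer key like '00000042', 
which may not match the actual layout, and iterate_minibatches and lmdb_path 
are placeholder names.

import pickle

import lmdb
import numpy as np

def iterate_minibatches(lmdb_path, batch_size, shuffle=True):
    # Open the database read-only; lock=False permits concurrent readers.
    env = lmdb.open(lmdb_path, readonly=True, lock=False)
    n_samples = env.stat()['entries']
    indices = np.arange(n_samples)
    if shuffle:
        np.random.shuffle(indices)
    with env.begin() as txn:
        for start in range(0, n_samples, batch_size):
            clips, labels = [], []
            for i in indices[start:start + batch_size]:
                # Assumes records were stored as pickle.dumps((clip, label))
                # under zero-padded keys; adjust to the real key format.
                raw = txn.get('{:08d}'.format(i).encode('ascii'))
                clip, label = pickle.loads(raw)
                clips.append(clip)
                labels.append(label)
            # Only one minibatch is held in memory at a time.
            yield np.asarray(clips, dtype=np.float32), np.asarray(labels)

The training loop would then just be (with 'train_lmdb' a placeholder path 
and train_fn a compiled Theano function):

for X_batch, y_batch in iterate_minibatches('train_lmdb', 32):
    train_fn(X_batch, y_batch)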

On Thursday, September 15, 2016 at 8:57:53 PM UTC+5:30, Jose Carranza wrote:
>
> Hi guys
>
> I have a fairly big dataset (a million+ images for training) that I want to 
> use to train a model from scratch in Theano. In Caffe we use LMDB, but I 
> haven't seen any best practices in Theano for anything bigger than MNIST. 
> Can somebody suggest the best option for pulling data into Theano/Lasagne? 
> I need something that is not 100% in memory but that can pull data in 
> batches (ideally shuffled batches).
>
> Thx in advance
>
