I hope you are doing fine. I just have a question, and thought you might
have an idea on it. I have some large images (e.g. 1000x1000 pixels) and
would like to train a convolutional neural network on them.
I used a GPU instance from Amazon Web Services (AWS), but training takes
a long time, even though I picked a fairly fast instance.
My question is: would there be a better way to train on such large
images, especially since their number grows day after day, so I expect
to end up with a fairly large dataset?
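One thing I considered is shrinking the images before training, since the
input size drives most of the cost. Here is a minimal sketch of average
pooling with numpy (the image is randomly generated just for illustration;
the factor of 4 is an arbitrary choice, not a recommendation):

```python
import numpy as np

def downsample(img: np.ndarray, factor: int) -> np.ndarray:
    """Average-pool a (H, W) or (H, W, C) image by an integer factor."""
    # Trim so height and width divide evenly by the factor.
    h = img.shape[0] - img.shape[0] % factor
    w = img.shape[1] - img.shape[1] % factor
    img = img[:h, :w]
    # Reshape into factor x factor tiles and average each tile.
    new_shape = (h // factor, factor, w // factor, factor) + img.shape[2:]
    return img.reshape(new_shape).mean(axis=(1, 3))

# A stand-in 1000x1000 grayscale image reduced to 250x250.
img = np.random.rand(1000, 1000)
small = downsample(img, 4)
print(small.shape)  # (250, 250)
```

Whether the network can still learn from the smaller images depends on the
task, of course; this only shows the mechanics.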
At the moment, I'm feeding all my data to the network and waiting until
all the epochs have run over the dataset. I thought of maybe running
fewer epochs at a time, saving the model, and building on it with the
next batch of images. But for the first small images I tried, I got an
accuracy of 0, so I wasn't sure whether it was reliable to keep training
that model on the next batches of images.
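The staged training I have in mind could be sketched like this. It is a toy
numpy example (a linear model with plain gradient descent, not my actual
network, and the file name is made up), just to show the save-checkpoint-
and-resume pattern:

```python
import numpy as np

def train(w, X, y, epochs, lr=0.1):
    """Plain gradient descent on mean squared error for a linear model."""
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w = w - lr * grad
    return w

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
true_w = np.array([1.0, -2.0, 0.5])
y = X @ true_w

# Stage 1: train for a limited number of epochs, then checkpoint to disk.
w = train(np.zeros(3), X, y, epochs=50)
np.save("checkpoint.npy", w)

# Stage 2 (e.g. after new data arrives): load the checkpoint and continue.
w = np.load("checkpoint.npy")
w = train(w, X, y, epochs=50)
```

In a real deep-learning framework the same idea applies, just with the
framework's own save/load functions for the model weights.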
Is there a way to train a large dataset of large images that is better
than just feeding all the data into the network and waiting, which can
sometimes take a day? Or is that the normal case, and is what I'm
currently doing (feeding in the whole dataset) the correct way?
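By "feeding all the data" I mean loading everything into memory at once.
An alternative I have been wondering about is a generator that streams
mini-batches from disk, so only one batch is in memory at a time. A sketch,
with a stand-in loader and made-up file names instead of real image files:

```python
import numpy as np

def batch_generator(paths, batch_size, load):
    """Yield mini-batches lazily, loading only batch_size files at a time."""
    for start in range(0, len(paths), batch_size):
        chunk = paths[start:start + batch_size]
        yield np.stack([load(p) for p in chunk])

# Stand-in loader for the sketch; in practice this would read an image file.
fake_load = lambda path: np.zeros((1000, 1000))

paths = [f"img_{i}.npy" for i in range(10)]  # hypothetical file names
for epoch in range(2):  # one pass over the data per epoch
    for batch in batch_generator(paths, batch_size=4, load=fake_load):
        pass  # a training step on this batch would go here
```

With 10 files and a batch size of 4 this yields batches of 4, 4, and 2
images per epoch, and the full dataset never has to fit in memory.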
Thanks so much, and apologies for the disturbance.
code-quality mailing list