Never mind, I think I am going to go ahead with passing the training set as 
the validation set for now.
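
In case it helps anyone else, here is a rough (untested) sketch of that workaround, assuming the tutorial-style load_data that returns train/valid/test pairs:

    from logistic_sgd import load_data

    # load_data returns [(train_set_x, train_set_y),
    #                    (valid_set_x, valid_set_y),
    #                    (test_set_x, test_set_y)]
    datasets = load_data(dataset)

    train_set_x, train_set_y = datasets[0]
    test_set_x, test_set_y = datasets[2]

    # No separate validation split here, so point "validation" at the
    # training data; the rest of convolutional_mlp.py can stay unchanged,
    # it just does its early stopping on the training error instead.
    valid_set_x, valid_set_y = train_set_x, train_set_y

With this, n_valid_batches should simply end up equal to n_train_batches.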

On Tuesday, September 6, 2016 at 8:17:02 PM UTC+5:30, Mallika Agarwal wrote:
>
> Hello, 
>
> I have a dataset divided into just a train and test set. Is there a way I 
> can skip the "validation" part? 
>
> Could someone guide me on how to do this? I can't simply remove the part 
> where the validation score is checked, can I?
>
>             if (iter + 1) % validation_frequency == 0:
>
>                 # compute zero-one loss on validation set
>                 validation_losses = [validate_model(i) for i
>                                      in range(n_valid_batches)]
>                 this_validation_loss = numpy.mean(validation_losses)
>                 print('epoch %i, minibatch %i/%i, validation error %f %%' %
>                       (epoch, minibatch_index + 1, n_train_batches,
>                        this_validation_loss * 100.))
>
>                 # if we got the best validation score until now
>                 if this_validation_loss < best_validation_loss:
>
>                     #improve patience if loss improvement is good enough
>                     if this_validation_loss < best_validation_loss *  \
>                        improvement_threshold:
>                         patience = max(patience, iter * patience_increase)
>
>                     # save best validation score and iteration number
>                     best_validation_loss = this_validation_loss
>                     best_iter = iter
>
>                     # test it on the test set
>                     test_losses = [
>                         test_model(i)
>                         for i in range(n_test_batches)
>                     ]
>                     test_score = numpy.mean(test_losses)
>                     print(('     epoch %i, minibatch %i/%i, test error of '
>                            'best model %f %%') %
>                           (epoch, minibatch_index + 1, n_train_batches,
>                            test_score * 100.))
>
> This is from convolutional_mlp.py. 
>
> Thanks in anticipation!
>
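
P.S. For the archives, the reason I didn't just delete that block: best_validation_loss and patience are what the full script's early stopping is based on, so ripping it out breaks that bookkeeping. If early stopping isn't wanted at all, a rough (untested) sketch of what the periodic check could be reduced to, reusing the names from the snippet above:

    if (iter + 1) % validation_frequency == 0:
        # no validation set: skip the early-stopping bookkeeping and just
        # report the current test error for monitoring
        test_losses = [test_model(i) for i in range(n_test_batches)]
        test_score = numpy.mean(test_losses)
        print('epoch %i, minibatch %i/%i, test error %f %%' %
              (epoch, minibatch_index + 1, n_train_batches,
               test_score * 100.))

Note that this only reports the test error during training; it shouldn't be used to pick the best model.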
