Yes, you understood correctly.
You can see the implementation in the code:
https://github.com/scikit-learn/scikit-learn/blob/master/sklearn/neural_network/multilayer_perceptron.py#L491
It calls ``train_test_split``, so it's a random subset of the data.
Currently the API doesn't allow providing your own validation set.
What is the use-case for that?
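To illustrate, here is a minimal sketch of how the ``early_stopping`` / ``validation_fraction`` parameters are used (dataset and parameter values are just for demonstration):

```python
from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=500, random_state=0)

# With early_stopping=True, fit() internally holds out
# validation_fraction of the training data (a random subset,
# via train_test_split) and stops once the score on that
# held-out set stops improving.
clf = MLPClassifier(early_stopping=True,
                    validation_fraction=0.1,
                    max_iter=500,
                    random_state=0)
clf.fit(X, y)

# n_iter_ is the number of epochs actually run; with early
# stopping it is typically well below max_iter.
print(clf.n_iter_)
```

After fitting, ``clf.validation_scores_`` holds the per-epoch scores on the internal validation set, which is handy for checking when training actually stopped.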
Andy
On 08/11/2017 05:57 PM, fabian.si...@gmx.net wrote:
Hello Scikit-Learn Team,
I've got a question concerning the implementation of early stopping in
MLPClassifier. I am using it in combination with RandomizedSearchCV.
The fraction used for validation in early stopping is set with the
parameter validation_fraction of MLPClassifier. How is the validation
set extracted from the training set? Does the function simply take
the last X% of the training set? Is there a way to set this
validation set manually?
I wonder whether I correctly understand the functionality: the neural
net is trained on the training data, and after every epoch its
performance is evaluated on the validation set (which is internally
selected by the MLPClassifier)? Once the net stops training, the
performance on the held-out data (parameter "cv" in
RandomizedSearchCV) is determined?
Thank you very much for your help!
Kind Regards,
Fabian Sippl
_______________________________________________
scikit-learn mailing list
scikit-learn@python.org
https://mail.python.org/mailman/listinfo/scikit-learn