RE: Learning Fails with 4 Layers in ANN Training with SGDOptimizer

2016-02-16 Thread Ulanov, Alexander
Hi Hayri, The default MLP optimizer is LBFGS. SGD is available only through the private interface and its use is discouraged for several reasons. With regard to SGD in general, the parameters are very specific to the dataset and network configuration; one needs to find them empirically. The
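For reference, the public spark.ml API trains MultilayerPerceptronClassifier with the default L-BFGS optimizer mentioned above. A minimal sketch, assuming a DataFrame trainingData with "features" and "label" columns and the layer sizes from the original post (maxIter, blockSize, and seed values here are illustrative, not from the thread):

import org.apache.spark.ml.classification.MultilayerPerceptronClassifier

// Layer sizes from the original post: 82 inputs, two hidden layers, 29 output classes.
val trainer = new MultilayerPerceptronClassifier()
  .setLayers(Array(82, 100, 30, 29))
  .setTol(1e-5)
  .setMaxIter(100)      // illustrative value; not from the original post
  .setBlockSize(128)
  .setSeed(1234L)

// trainingData is assumed to be a DataFrame with "features" and "label" columns.
// fit() runs the default L-BFGS optimizer; no SGD parameters are exposed here.
val model = trainer.fit(trainingData)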

Learning Fails with 4 Layers in ANN Training with SGDOptimizer

2016-02-09 Thread Hayri Volkan Agun
Hi Everyone, When MultilayerPerceptronClassifier is set to three or four layers and the SGDOptimizer's parameters are selected as follows: tol = 1e-5, numIter = 1, layers = 82,100,30,29, stepSize = 0.05, sigmoid function in all layers, learning finishes but it does not converge. What may be the
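The configuration described above is only reachable through Spark's internal ANN code (org.apache.spark.ml.ann), which is private[ml]. The sketch below is an assumption based on the Spark 1.6 internals (FeedForwardTopology, FeedForwardTrainer, SGDOptimizer) and will not compile from ordinary user code; it is shown only to make the posted parameters concrete:

// Assumption: these classes live in org.apache.spark.ml.ann and are private[ml],
// so this sketch only compiles from code placed inside that package.
import org.apache.spark.ml.ann.{FeedForwardTopology, FeedForwardTrainer}
import org.apache.spark.mllib.linalg.Vector
import org.apache.spark.rdd.RDD

// 82 inputs, hidden layers of 100 and 30, 29 outputs; the "false" flag keeps a
// sigmoid output layer instead of softmax, matching "sigmoid function in all layers".
val topology = FeedForwardTopology.multiLayerPerceptron(Array(82, 100, 30, 29), false)
val trainer = new FeedForwardTrainer(topology, 82, 29)

// Switch from the default L-BFGS to SGD with the parameters from the post.
trainer.SGDOptimizer
  .setNumIterations(1)       // numIter = 1 as posted
  .setStepSize(0.05)
  .setConvergenceTol(1e-5)

// data: RDD of (features, one-hot encoded label) pairs, assumed to exist.
def fit(data: RDD[(Vector, Vector)]) = trainer.train(data)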