Thanks for replying, Ted. So, as per our understanding, you are saying that
the developer should expose these parameters beforehand, and while testing,
the user will assign values to them based on the optimal values generated by
some higher-level (dynamic) technique. Is that correct?
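For concreteness, here is a minimal sketch of what "exposing the parameters" could look like, assuming a plain builder-style holder plus an outer sweep standing in for the higher-level learner. All class and method names here are hypothetical illustrations, not the actual Mahout API, and the validation step is a placeholder rather than real training.

```java
public class MlpParamsDemo {
    // Hypothetical parameter holder: every data-dependent hyperparameter is
    // a settable field instead of a constant buried in the trainer.
    static final class MlpParams {
        double learningRate = 0.1;     // step size for gradient updates
        double momentum = 0.9;         // fraction of previous update carried over
        double thresholdError = 1e-3;  // stop once training error falls below this
        int[] layerSizes = {4, 8, 1};  // input, hidden, and output layer widths

        MlpParams learningRate(double v) { learningRate = v; return this; }
        MlpParams momentum(double v)     { momentum = v;     return this; }
    }

    // Placeholder evaluation: pretend a rate of 0.1 is optimal for this data.
    // In practice this would train the network and return validation error.
    static double dummyValidationError(MlpParams p) {
        return Math.abs(p.learningRate - 0.1);
    }

    public static void main(String[] args) {
        // Because the parameters are exposed, a higher-level tuner (or a
        // test) can sweep candidate values and keep the best configuration.
        double bestError = Double.MAX_VALUE;
        double bestRate = Double.NaN;
        for (double rate : new double[] {0.01, 0.1, 0.5}) {
            MlpParams p = new MlpParams().learningRate(rate).momentum(0.5);
            double error = dummyValidationError(p);
            if (error < bestError) { bestError = error; bestRate = rate; }
        }
        System.out.println("best learning rate: " + bestRate);
    }
}
```

The point is only the shape: the trainer takes an explicit parameter object, so tests can pin values and a tuner can search over them.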


On Sat, Oct 19, 2013 at 11:57 AM, Ted Dunning <[email protected]> wrote:

> That has been the practice in Mahout so far.
>
> Generally, a higher level learner is used to adjust those parameters, but
> it is important for testing purposes to expose them.
>
>
> On Sat, Oct 19, 2013 at 6:16 AM, Sushanth Bhat(MT2012147) <
> [email protected]> wrote:
>
> > Hi,
> >
> > We are implementing a multi-layer perceptron neural network trained with
> > back-propagation for Mahout. Some parameters, such as the learning rate,
> > momentum, activation function, threshold error, number of layers, and
> > number of neurons in the hidden layers, depend on the input data. Are we
> > supposed to make these parameters configurable by the user?
> >
> >
> > Thanks and regards,
> > Sushanth Bhat
> > IIIT-Bangalore
> >
>



-- 
Surabhi
http://www.linkedin.com/pub/surabhi-pandey/22/46/904
