Hi Ryan,

Thanks a lot, this is helpful. I will try something like this:

class FFNWrapper
{
  ...

  template<typename MatType, typename LabelsType>
  void Train(const MatType& data,
             const LabelsType& labels,
             const std::vector<int>& hiddenLayers)
  {
    // Based on the vector, build one hidden layer per entry and then
    // start the training.
  }

  ...
};

Hope this works.

Thanks,
Ambica
-----Original Message-----
From: Ryan Curtin <r...@ratml.org>
Sent: 13 November 2020 08:20
To: Ambica Prasad <ambica.pra...@arm.com>
Cc: Benson Muite <benson_mu...@emailplus.org>; mlpack@lists.mlpack.org
Subject: Re: [mlpack] Tutorial for HyperParameterTuning for FFNs (Ambica Prasad)

Hi Ambica,

There's one more thing worth mentioning.  The hyperparameter tuner works with 
mlpack classifiers (or regressors) whose hyperparameters are specified in the 
Train() call.  So, for instance, you could implement a class that works a 
little like this:

class FFNWrapper
{
  ...

  template<typename MatType, typename LabelsType>
  void Train(const MatType& data,
             const LabelsType& labels,
             const bool addSecondLayer)
  {
    // In this method you would build the network, and if
    // `addSecondLayer` is true, you would add a second layer, then do
    // the training.
  }

  ...
};

Now that is just one idea for a single boolean parameter, but you could extend 
it to search over architectures, so long as you can express the parameters of 
the architecture as parameters to Train().  Then I think the hyperparameter 
tuner could work for that situation.
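To make that pattern concrete, here is a minimal self-contained sketch.  Note 
that `FFNWrapper`, `BestArchitecture`, and the fake "loss" below are all 
illustrative stand-ins, not mlpack's API: in a real wrapper, Train() would 
build an mlpack FFN with one layer per entry of `hiddenLayers` and return the 
validation loss after training.  The stand-in just lets the pattern compile 
and run on its own.

```cpp
#include <cassert>
#include <cstdlib>
#include <vector>

// Hypothetical stand-in for an mlpack FFN wrapper: the architecture is
// passed as a hyperparameter to Train(), as suggested above.
class FFNWrapper
{
 public:
  double Train(const std::vector<double>& /* data */,
               const std::vector<double>& /* labels */,
               const std::vector<int>& hiddenLayers)
  {
    // A real implementation would construct the network from
    // hiddenLayers, train it, and return the validation loss.  Here we
    // fake a loss that is smallest when the total width is near 24.
    int total = 0;
    for (int width : hiddenLayers)
      total += width;
    return std::abs(total - 24);
  }
};

// Grid search over candidate architectures: the loop the hyperparameter
// tuner would automate for you.
std::vector<int> BestArchitecture(
    const std::vector<std::vector<int>>& candidates)
{
  std::vector<double> data, labels;  // empty placeholders for the sketch
  FFNWrapper model;
  double bestLoss = 1e300;
  std::vector<int> best;
  for (const std::vector<int>& arch : candidates)
  {
    const double loss = model.Train(data, labels, arch);
    if (loss < bestLoss)
    {
      bestLoss = loss;
      best = arch;
    }
  }
  return best;
}
```

The important design point is only that every architectural choice is an 
argument to Train(), so the tuner (or a plain loop like the one above) can 
vary it from the outside.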

I hope this is helpful!  I know it would be a bit of implementation work, but 
it should work (maybe with minor modifications). :)

On Wed, Nov 11, 2020 at 07:47:48PM +0000, Ambica Prasad wrote:
> Thanks Benson, I get it now.
>
> Thanks,
> Ambica
>
> -----Original Message-----
> From: Benson Muite <benson_mu...@emailplus.org>
> Sent: 12 November 2020 00:50
> To: Ambica Prasad <ambica.pra...@arm.com>; mlpack@lists.mlpack.org
> Subject: Re: [mlpack] Tutorial for HyperParameterTuning for FFNs
> (Ambica Prasad)
>
> Hi Ambica,
> If the aim is to avoid overfitting and choose a reasonable number of 
> parameters, then Dropout might help reduce the size of the grid search you 
> need to do: you will likely still need to write code to change the number of 
> layers, but dropout effectively changes the layer size for you during the 
> training phase.
> Regards,
> Benson
> On 11/10/20 5:17 PM, Ambica Prasad wrote:
> > Hi Benson,
> >
> > I am not sure how I would use DropOut to perform a grid-search over my 
> > parameters. Could you elaborate?
> >
> > Thanks,
> > Ambica
> >
> > -----Original Message-----
> > From: mlpack <mlpack-boun...@lists.mlpack.org> On Behalf Of Benson
> > Muite
> > Sent: 08 November 2020 00:04
> > To: mlpack@lists.mlpack.org
> > Subject: Re: [mlpack] Tutorial for HyperParameterTuning for FFNs
> > (Ambica Prasad)
> >
> > You may also want to examine the documentation on dropout:
> > https://www.mlpack.org/doc/mlpack-3.0.4/doxygen/classmlpack_1_1ann_1_1Dropout.html
> >
> > On 11/7/20 9:15 PM, Aakash kaushik wrote:
> >> Hey Ambica,
> >>
> >> There is not a specific tutorial available for that, but you can
> >> always put the layer sizes in an array and loop over it for variable
> >> layer sizes, or sample random integers from a range.  For the number
> >> of layers, I believe you have to change it manually every time, but
> >> I am not totally sure about that.
> >>
> >> Best,
> >> Aakash
> >>
> >> On Sat, Nov 7, 2020 at 10:30 PM <mlpack-requ...@lists.mlpack.org
> >> <mailto:mlpack-requ...@lists.mlpack.org>> wrote:
> >>
> >>      ---------- Forwarded message ----------
> >>      From: Ambica Prasad <ambica.pra...@arm.com
> >>      <mailto:ambica.pra...@arm.com>>
> >>      To: "mlpack@lists.mlpack.org <mailto:mlpack@lists.mlpack.org>"
> >>      <mlpack@lists.mlpack.org <mailto:mlpack@lists.mlpack.org>>
> >>      Cc:
> >>      Bcc:
> >>      Date: Sat, 7 Nov 2020 02:36:39 +0000
> >>      Subject: [mlpack] Tutorial for HyperParameterTuning for FFNs
> >>
> >>      Hi Guys,
> >>
> >>      Is there an example or a tutorial that explains how to perform the
> >>      hyperparameter tuning for FFNs, where I can evaluate the network on
> >>      different numbers of layers and layer sizes?
> >>
> >>      Thanks,
> >>      Ambica
> >>
> >>      IMPORTANT NOTICE: The contents of this email and any attachments are
> >>      confidential and may also be privileged. If you are not the intended
> >>      recipient, please notify the sender immediately and do not disclose
> >>      the contents to any other person, use it for any purpose, or store
> >>      or copy the information in any medium. Thank you.
> >>      _______________________________________________
> >>      mlpack mailing list
> >>      mlpack@lists.mlpack.org <mailto:mlpack@lists.mlpack.org>
> >>      http://knife.lugatgt.org/cgi-bin/mailman/listinfo/mlpack
> >>      <http://knife.lugatgt.org/cgi-bin/mailman/listinfo/mlpack>
> >>
> >>
> >
>
>
>

--
Ryan Curtin    | "Happy premise #2: There is no giant foot trying
r...@ratml.org | to squash me." - Kit Ramsey
