I do not believe the perceptron trainer is multithreaded, but it should still be
fast.
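
For the Maxent case discussed below, here is a minimal sketch of enabling
multithreaded training via TrainingParameters.THREADS_PARAM. The thread count,
the training file name "pos-train.txt", and the POS tagger setup are placeholder
assumptions for illustration, not taken from this thread:

    import java.io.File;
    import java.nio.charset.StandardCharsets;

    import opennlp.tools.postag.POSModel;
    import opennlp.tools.postag.POSSample;
    import opennlp.tools.postag.POSTaggerFactory;
    import opennlp.tools.postag.POSTaggerME;
    import opennlp.tools.postag.WordTagSampleStream;
    import opennlp.tools.util.InputStreamFactory;
    import opennlp.tools.util.MarkableFileInputStreamFactory;
    import opennlp.tools.util.ObjectStream;
    import opennlp.tools.util.PlainTextByLineStream;
    import opennlp.tools.util.TrainingParameters;

    public class ThreadedMaxentTraining {

        public static void main(String[] args) throws Exception {
            // Default parameters use the MAXENT algorithm.
            TrainingParameters params = TrainingParameters.defaultParams();

            // Let the Maxent trainer use several threads ("4" is an arbitrary example).
            params.put(TrainingParameters.THREADS_PARAM, "4");

            // To select the perceptron trainer instead (which, per the note above,
            // is not multithreaded), one would set:
            // params.put(TrainingParameters.ALGORITHM_PARAM,
            //            opennlp.tools.ml.perceptron.PerceptronTrainer.PERCEPTRON_VALUE);

            // "pos-train.txt" is a placeholder for a word_tag formatted training file.
            InputStreamFactory in =
                new MarkableFileInputStreamFactory(new File("pos-train.txt"));
            ObjectStream<POSSample> samples =
                new WordTagSampleStream(new PlainTextByLineStream(in, StandardCharsets.UTF_8));

            POSModel model = POSTaggerME.train("en", samples, params, new POSTaggerFactory());
            System.out.println("Trained POS model for language: " + model.getLanguage());
        }
    }

The Threads parameter only affects the Maxent trainer; as noted above, the
perceptron trainer would ignore it.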

On 1/3/17, 1:44 PM, "Damiano Porta" <damianopo...@gmail.com> wrote:

    Hi William, thank you!
    Is there a similar option for the perceptron (perceptron sequence) trainer too?
    
    2017-01-03 19:41 GMT+01:00 William Colen <co...@apache.org>:
    
    > Damiano,
    >
    > If you are using Maxent, try TrainingParameters.THREADS_PARAM
    >
    > https://opennlp.apache.org/documentation/1.7.0/apidocs/opennlp-tools/opennlp/tools/util/TrainingParameters.html#THREADS_PARAM
    >
    > William
    >
    > 2017-01-03 16:27 GMT-02:00 Damiano Porta <damianopo...@gmail.com>:
    >
    > > I am training a new POS tagger and lemmatizer.
    > >
    > > 2017-01-03 19:24 GMT+01:00 Russ, Daniel (NIH/CIT) [E] <dr...@mail.nih.gov>:
    > >
    > > > Can you be a little more specific?  What trainer are you using?
    > > > Thanks
    > > > Daniel
    > > >
    > > > On 1/3/17, 1:22 PM, "Damiano Porta" <damianopo...@gmail.com> wrote:
    > > >
    > > >     Hello,
    > > >     I have a very, very big training set. Is there a way to speed up
    > > >     the training process? I have only changed the Xmx option inside
    > > >     bin/opennlp.
    > > >
    > > >     Thanks
    > > >     Damiano
    > > >
    > > >
    > > >
    > >
    >
    
