On 11/17/11 2:48 AM, Srivatsan Ramanujam wrote:
I was trying to run the OpenNLP POSTagger on Android and noticed that it
was taking way too long to load a large model (4-5 MB in size).
I had to settle for a much smaller model to get the POS tagger
to run in reasonable time on Android.

Please give us some details about the device you used.
How much memory does it have?

Having a model in memory can easily require a few tens of MB.
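
For reference, a minimal sketch of how the model is usually loaded and used with the OpenNLP API is below. The file name en-pos-maxent.bin is just the commonly distributed English maxent model; substitute whatever model you actually load on the device. Constructing the POSModel from the stream is the expensive step, so it should be done once and the tagger reused.

    import java.io.FileInputStream;
    import java.io.InputStream;

    import opennlp.tools.postag.POSModel;
    import opennlp.tools.postag.POSTaggerME;

    public class PosTagExample {
        public static void main(String[] args) throws Exception {
            // Loading the model deserializes the maxent parameters into memory;
            // this is the slow, memory-hungry part, especially on a phone.
            try (InputStream in = new FileInputStream("en-pos-maxent.bin")) {
                POSModel model = new POSModel(in);
                POSTaggerME tagger = new POSTaggerME(model);

                // Tag an already tokenized sentence.
                String[] tokens = {"This", "is", "a", "test", "."};
                String[] tags = tagger.tag(tokens);

                for (int i = 0; i < tokens.length; i++) {
                    System.out.println(tokens[i] + "/" + tags[i]);
                }
            }
        }
    }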

Jörn
