Sure. It is actually throughout the document. All I had to do was search
for "database" in

https://opennlp.apache.org/documentation/1.7.2/manual/opennlp.html

I pulled the text above from there as a copy and paste.

I think the solution I was looking for would be to hand a model a database
connection (the equivalent of an input stream), let it create its tables, and
have it write its internal model format to the database in whatever way the
object would save itself. Loading would be controlled the same way. I don't
just want to store the byte representation in a database, which is essentially
saving a file inside a database; I was wondering whether the objects could
save their internal state to a database and then load and re-read it. If not,
that might be an interesting feature.
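
To be concrete, the byte-representation route (essentially a file inside the
database, and not quite what I'm after) would look roughly like the following.
This is an untested sketch: the "models" table and its columns are made up,
and it assumes BaseModel's serialize(OutputStream) method plus the models'
InputStream constructors.

    import java.io.ByteArrayInputStream;
    import java.io.ByteArrayOutputStream;
    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import opennlp.tools.postag.POSModel;

    public class ModelBlobStore {

        // Serialize the model to bytes and store them in a made-up "models" table.
        static void save(Connection conn, String name, POSModel model) throws Exception {
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            model.serialize(out);
            try (PreparedStatement ps = conn.prepareStatement(
                    "INSERT INTO models (name, data) VALUES (?, ?)")) {
                ps.setString(1, name);
                ps.setBytes(2, out.toByteArray());
                ps.executeUpdate();
            }
        }

        // Read the bytes back and let OpenNLP rebuild the model from a stream.
        static POSModel load(Connection conn, String name) throws Exception {
            try (PreparedStatement ps = conn.prepareStatement(
                    "SELECT data FROM models WHERE name = ?")) {
                ps.setString(1, name);
                try (ResultSet rs = ps.executeQuery()) {
                    rs.next();
                    return new POSModel(new ByteArrayInputStream(rs.getBytes("data")));
                }
            }
        }
    }

What I was hoping for instead is that the model object itself would read and
write its own tables rather than round-tripping through a single blob.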

Thanks,
~Ben

On Fri, Apr 14, 2017 at 12:46 PM, Daniel Russ <dr...@apache.org> wrote:

> I won't say it is impossible.  I have a custom BaseModel (like POSModel or
> ParserModel, but it does something different).  If you use the command-line
> tool, there is a call to CmdLineUtil.writeModel(String, File, BaseModel).
> Notice that it requires a file.  If you want to save a model in a database,
> I can think of a few ways of doing it.
>
> 1.  (Expert knowledge required)  Edit the code to read from and write to the
> database.
> 2.  Write the model to a file, read it back as a large byte[], and write that
> to a blob.  To load, read the blob from the database, write it to a file, and
> read the file into OpenNLP.
> 3.  Use something like MapDB?  Read in the models and put them into a MapDB
> database (it looks like a HashMap, but is actually a key-value database); see
> the sketch below.
>
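> A rough, untested sketch of option 3, assuming MapDB 3.x (the store file,
> map name, and model file below are just examples):
>
>   import java.nio.file.Files;
>   import java.nio.file.Paths;
>   import org.mapdb.DB;
>   import org.mapdb.DBMaker;
>   import org.mapdb.HTreeMap;
>   import org.mapdb.Serializer;
>
>   public class MapDbModelStore {
>       public static void main(String[] args) throws Exception {
>           // Open (or create) a file-backed MapDB store.
>           DB db = DBMaker.fileDB("models.db").make();
>
>           // A persistent map from model name to serialized model bytes.
>           HTreeMap<String, byte[]> models = db
>               .hashMap("models", Serializer.STRING, Serializer.BYTE_ARRAY)
>               .createOrOpen();
>
>           // Store the bytes of an existing model file, then read them back.
>           models.put("en-pos-maxent",
>               Files.readAllBytes(Paths.get("en-pos-maxent.bin")));
>           byte[] bytes = models.get("en-pos-maxent");
>           // 'bytes' can then be fed to a model constructor via a
>           // ByteArrayInputStream, as with the blob approach.
>
>           db.close();
>       }
>   }
>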
> I think 1 and 2 would be difficult.  If 3 works for you, let me know.  Even
> if 3 does work, I don't think it will work with the command-line tool.  The
> best option may be to stick with the file-based models.  Is there some
> reason that won't work?  You might want to consider adding a feature
> request at opennlp.apache.org > issue tracker (though you may have to
> create a login).
>
> Can you tell me where in the documentation it talks about saving models to
> a database?  That should probably be fixed, since we can't do it now.
> Daniel
>
