I am fine with the suggested approach.

Maybe we should do a minor release after this is merged in order to
unblock the Solr folks?

Regards
Richard

On Thursday, 2023-10-12 at 12:41 -0400, Jeff Zemerick wrote:
> Hi all,
> 
> I created OPENNLP-1515 to change the ONNX Runtime dependency from
> onnxruntime-gpu to onnxruntime. This change will remove GPU support and
> cause OpenNLP to always use the CPU for inference. The reason for the
> change is that the onnxruntime dependency supports Linux, Windows, and
> macOS x64, while the onnxruntime-gpu dependency only supports Linux and
> Windows (https://onnxruntime.ai/docs/get-started/with-java.html). I
> think OpenNLP should support as many operating systems as possible out
> of the box instead of favoring GPU. Please take a look at the pull
> request: https://github.com/apache/opennlp/pull/551
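>
> For reference, the pom-level change roughly amounts to swapping one
> artifact for the other. This is only a sketch: it assumes the
> dependency is declared in the opennlp-dl module and that the Maven
> Central coordinates use the com.microsoft.onnxruntime group ID with
> the onnxruntime_gpu artifact ID; check the actual pom for the exact
> names and version property.
>
>     <!-- before: GPU runtime, Linux and Windows x64 only -->
>     <dependency>
>       <groupId>com.microsoft.onnxruntime</groupId>
>       <artifactId>onnxruntime_gpu</artifactId>
>       <version>${onnxruntime.version}</version>
>     </dependency>
>
>     <!-- after: CPU runtime, Linux, Windows, and macOS x64 -->
>     <dependency>
>       <groupId>com.microsoft.onnxruntime</groupId>
>       <artifactId>onnxruntime</artifactId>
>       <version>${onnxruntime.version}</version>
>     </dependency>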
> 
> This change is proposed partly to support OpenNLP's ONNX integration
> in Apache Solr: https://github.com/apache/solr/pull/1999
> 
> I think GPU support in OpenNLP should remain easily accessible to
> users, so I opened OPENNLP-1516 to capture that, with a link to one
> possible method. If the above PR is merged, a user can still enable
> GPU support in OpenNLP by manually replacing onnxruntime.jar with
> onnxruntime-gpu.jar on their classpath until OPENNLP-1516 is resolved.
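>
> For users on a Maven build, a rough sketch of that swap is to exclude
> the CPU runtime that OpenNLP pulls in transitively and declare the GPU
> runtime directly. Again, this assumes the opennlp-dl module and the
> com.microsoft.onnxruntime:onnxruntime / onnxruntime_gpu coordinates;
> double-check the artifact IDs and versions before relying on it.
>
>     <dependency>
>       <groupId>org.apache.opennlp</groupId>
>       <artifactId>opennlp-dl</artifactId>
>       <version>${opennlp.version}</version>
>       <exclusions>
>         <!-- drop the CPU-only ONNX Runtime pulled in by OpenNLP -->
>         <exclusion>
>           <groupId>com.microsoft.onnxruntime</groupId>
>           <artifactId>onnxruntime</artifactId>
>         </exclusion>
>       </exclusions>
>     </dependency>
>     <!-- add the GPU-enabled runtime instead (Linux/Windows x64 only) -->
>     <dependency>
>       <groupId>com.microsoft.onnxruntime</groupId>
>       <artifactId>onnxruntime_gpu</artifactId>
>       <version>${onnxruntime.version}</version>
>     </dependency>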
> 
> All comments/suggestions are welcome!
> 
> Thanks,
> Jeff
