[
https://issues.apache.org/jira/browse/OPENNLP-1515?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17774510#comment-17774510
]
ASF GitHub Bot commented on OPENNLP-1515:
-----------------------------------------
rzo1 commented on PR #551:
URL: https://github.com/apache/opennlp/pull/551#issuecomment-1759604170
I agree with you @jzonthemtn - maybe we can attach a `classifier` jar
for GPU support in a separate issue? People consuming it via Maven could
then use the classifier to pull in the correct transitive dependencies
without needing excludes.
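A sketch of how a consumer might opt in via such a classifier (the `gpu` classifier below is hypothetical; no such artifact exists yet):

    <dependency>
        <groupId>org.apache.opennlp</groupId>
        <artifactId>opennlp-dl</artifactId>
        <version>${opennlp.version}</version>
        <!-- Hypothetical classifier; the name "gpu" is illustrative only -->
        <classifier>gpu</classifier>
    </dependency>

One caveat for the design: Maven classifier jars share the main artifact's POM, so the transitive dependency set cannot actually differ per classifier; a separate module, or an optional GPU dependency that the consumer adds explicitly, may be needed instead.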
> Default to onnxruntime instead of onnxruntime-gpu
> -------------------------------------------------
>
> Key: OPENNLP-1515
> URL: https://issues.apache.org/jira/browse/OPENNLP-1515
> Project: OpenNLP
> Issue Type: Task
> Components: Deep Learning
> Reporter: Jeff Zemerick
> Assignee: Jeff Zemerick
> Priority: Major
>
> The onnxruntime-gpu dependency is currently being included by opennlp-dl.
> <dependency>
>     <groupId>com.microsoft.onnxruntime</groupId>
>     <!-- This dependency supports CPU and GPU -->
>     <artifactId>onnxruntime_gpu</artifactId>
>     <version>${onnxruntime.version}</version>
> </dependency>
> The problem is that GPU support is only available on Linux and Windows, not on
> macOS. I think it would be best to use the onnxruntime dependency instead.
> But we still need to make it easy for OpenNLP to use the GPU.
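> A sketch of the proposed change, swapping in the CPU-capable artifact (same
> group and version property as above):
>
>     <dependency>
>         <groupId>com.microsoft.onnxruntime</groupId>
>         <!-- CPU-only runtime; available on Linux, Windows, and macOS -->
>         <artifactId>onnxruntime</artifactId>
>         <version>${onnxruntime.version}</version>
>     </dependency>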
--
This message was sent by Atlassian Jira
(v8.20.10#820010)