[ https://issues.apache.org/jira/browse/OPENNLP-1515?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17774504#comment-17774504 ]
ASF GitHub Bot commented on OPENNLP-1515:
-----------------------------------------
rzo1 commented on PR #551:
URL: https://github.com/apache/opennlp/pull/551#issuecomment-1759590853
Something like:
```xml
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0
                             http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>

  <parent>
    <groupId>org.apache.opennlp</groupId>
    <artifactId>opennlp</artifactId>
    <version>2.3.1-SNAPSHOT</version>
    <relativePath>../pom.xml</relativePath>
  </parent>

  <groupId>org.apache.opennlp</groupId>
  <artifactId>opennlp-dl</artifactId>
  <name>Apache OpenNLP DL</name>

  <properties>
    <onxx-runtime.artifact.name>onnxruntime_gpu</onxx-runtime.artifact.name>
  </properties>

  <profiles>
    <profile>
      <id>system-osx</id>
      <activation>
        <os>
          <family>mac</family>
        </os>
      </activation>
      <properties>
        <onxx-runtime.artifact.name>onnxruntime</onxx-runtime.artifact.name>
      </properties>
    </profile>
  </profiles>

  <dependencies>
    <dependency>
      <groupId>org.apache.opennlp</groupId>
      <artifactId>opennlp-tools</artifactId>
      <version>${project.version}</version>
    </dependency>
    <dependency>
      <groupId>com.microsoft.onnxruntime</groupId>
      <artifactId>${onxx-runtime.artifact.name}</artifactId>
      <version>${onnxruntime.version}</version>
    </dependency>
    <dependency>
      <groupId>org.slf4j</groupId>
      <artifactId>slf4j-api</artifactId>
    </dependency>
    <dependency>
      <groupId>org.junit.jupiter</groupId>
      <artifactId>junit-jupiter-api</artifactId>
      <scope>test</scope>
    </dependency>
    <dependency>
      <groupId>org.junit.jupiter</groupId>
      <artifactId>junit-jupiter-engine</artifactId>
      <scope>test</scope>
    </dependency>
    <dependency>
      <groupId>org.slf4j</groupId>
      <artifactId>slf4j-simple</artifactId>
      <scope>test</scope>
    </dependency>
  </dependencies>
</project>
```
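As a rough sketch of how such a switch could be exercised, assuming the property name and profile id from the snippet above (they do not exist in the current build):
```bash
# Default build: resolves onnxruntime_gpu on Linux/Windows; on macOS the
# system-osx profile auto-activates and swaps in the CPU-only onnxruntime.
mvn clean install

# Force the CPU-only runtime on any platform by overriding the property
# from the command line (user properties take precedence over POM properties).
mvn clean install -Donxx-runtime.artifact.name=onnxruntime
```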
Think it would propagate to the transitive dependencies as well, but it would
need special treatment for building the zip releases (i.e. include both libs).
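Purely as an illustration of the "include both libs" part, the distribution module (the `opennlp-distr` name and the use of `${onnxruntime.version}` here are assumptions) could declare both runtimes explicitly so the assembly picks them up:
```xml
<!-- Hypothetical fragment for a distribution module (e.g. opennlp-distr):
     declare both ONNX Runtime artifacts so the binary zip ships the
     CPU-only and the GPU-enabled native libraries side by side. -->
<dependencies>
  <!-- CPU-only runtime: available on Linux, Windows and macOS -->
  <dependency>
    <groupId>com.microsoft.onnxruntime</groupId>
    <artifactId>onnxruntime</artifactId>
    <version>${onnxruntime.version}</version>
  </dependency>
  <!-- GPU-enabled runtime: Linux and Windows only -->
  <dependency>
    <groupId>com.microsoft.onnxruntime</groupId>
    <artifactId>onnxruntime_gpu</artifactId>
    <version>${onnxruntime.version}</version>
  </dependency>
</dependencies>
```
A dependencySet in the assembly descriptor would then copy both jars into the zip, regardless of which artifact opennlp-dl itself resolved.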
> Default to onnxruntime instead of onnxruntime-gpu
> -------------------------------------------------
>
> Key: OPENNLP-1515
> URL: https://issues.apache.org/jira/browse/OPENNLP-1515
> Project: OpenNLP
> Issue Type: Task
> Components: Deep Learning
> Reporter: Jeff Zemerick
> Assignee: Jeff Zemerick
> Priority: Major
>
> The onnxruntime-gpu dependency is currently being included by opennlp-dl.
> <dependency>
>     <groupId>com.microsoft.onnxruntime</groupId>
>     <!-- This dependency supports CPU and GPU -->
>     <artifactId>onnxruntime_gpu</artifactId>
>     <version>${onnxruntime.version}</version>
> </dependency>
> The problem is that GPU support is only available on Linux and Windows, not on OSX.
> I think it would be best to use the onnxruntime dependency instead.
> But we need to make it easy for OpenNLP to use the GPU.