[
https://issues.apache.org/jira/browse/BEAM-13970?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17516349#comment-17516349
]
Beam JIRA Bot commented on BEAM-13970:
--------------------------------------
This issue is assigned but has not received an update in 30 days so it has been
labeled "stale-assigned". If you are still working on the issue, please give an
update and remove the label. If you are no longer working on the issue, please
unassign so someone else may work on it. In 7 days the issue will be
automatically unassigned.
> RunInference V1
> ---------------
>
> Key: BEAM-13970
> URL: https://issues.apache.org/jira/browse/BEAM-13970
> Project: Beam
> Issue Type: New Feature
> Components: sdk-py-core
> Reporter: Andy Ye
> Assignee: Andy Ye
> Priority: P2
> Labels: run-inference, stale-assigned
>
> Users of machine learning frameworks must currently implement their own
> transforms for running ML inference. The exception is the TensorFlow
> [RunInference
> transform|https://github.com/tensorflow/tfx-bsl/blob/master/tfx_bsl/beam/run_inference.py].
> However, this is hosted in its own
> [repo|https://github.com/tensorflow/tfx-bsl], and has an
> [API|https://www.tensorflow.org/tfx/tfx_bsl/api_docs/python/tfx_bsl/public/beam/RunInference]
> that is exclusively geared towards the TensorFlow TFX library. Our goal is
> to add new implementations of RunInference for two other popular machine
> learning frameworks: scikit-learn and PyTorch.
> Please see the main design document
> [here|https://s.apache.org/inference-sklearn-pytorch].
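As a rough illustration of the framework-agnostic pattern such a transform relies on, here is a minimal, framework-free sketch: a handler loads its model once per worker and then scores incoming batches. The class and method names below (`ModelHandler`, `load_model`, `run_inference`) are illustrative assumptions based on the issue description, not the actual Beam API; the `DoublingModelHandler` is a hypothetical stand-in for a scikit-learn or PyTorch handler.

```python
from typing import Any, Callable, Iterable, Iterator, Sequence


class ModelHandler:
    """Hypothetical per-framework adapter: one subclass per ML framework."""

    def load_model(self) -> Any:
        # e.g. unpickle a scikit-learn estimator or torch.load() a module
        raise NotImplementedError

    def run_inference(self, batch: Sequence[Any], model: Any) -> Iterable[Any]:
        # e.g. model.predict(batch) for scikit-learn
        raise NotImplementedError


class DoublingModelHandler(ModelHandler):
    """Toy stand-in for a real scikit-learn or PyTorch handler."""

    def load_model(self) -> Callable[[int], int]:
        return lambda x: 2 * x  # placeholder for an actual model load

    def run_inference(self, batch, model):
        return [model(x) for x in batch]


def run_inference(batches: Iterable[Sequence[Any]],
                  handler: ModelHandler) -> Iterator[Any]:
    # The model is loaded once and reused across batches, which is the
    # main efficiency point of a shared RunInference transform.
    model = handler.load_model()
    for batch in batches:
        yield from handler.run_inference(batch, model)


print(list(run_inference([[1, 2], [3]], DoublingModelHandler())))  # prints [2, 4, 6]
```

In a real pipeline this logic would live inside a Beam PTransform so each worker pays the model-loading cost once, but the sketch above only illustrates the handler contract the design document proposes.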
--
This message was sent by Atlassian Jira
(v8.20.1#820001)