[ https://issues.apache.org/jira/browse/BEAM-13970?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17520005#comment-17520005 ]

Beam JIRA Bot commented on BEAM-13970:
--------------------------------------

This issue was marked "stale-assigned" and has not received a public comment in 
7 days. It is now automatically unassigned. If you are still working on it, you 
can assign it to yourself again. Please also give an update about the status of 
the work.

> RunInference V1
> ---------------
>
>                 Key: BEAM-13970
>                 URL: https://issues.apache.org/jira/browse/BEAM-13970
>             Project: Beam
>          Issue Type: New Feature
>          Components: sdk-py-core
>            Reporter: Andy Ye
>            Priority: P2
>              Labels: run-inference
>
> Users of machine learning frameworks currently have to implement their own 
> transforms for running ML inference. The exception is the TensorFlow 
> [RunInference 
> transform|https://github.com/tensorflow/tfx-bsl/blob/master/tfx_bsl/beam/run_inference.py].
>  However, that transform is hosted in its own 
> [repo|https://github.com/tensorflow/tfx-bsl] and exposes an 
> [API|https://www.tensorflow.org/tfx/tfx_bsl/api_docs/python/tfx_bsl/public/beam/RunInference]
>  geared exclusively towards the TensorFlow TFX library. Our goal is to add 
> new implementations of RunInference for two other popular machine learning 
> frameworks: scikit-learn and PyTorch.
> Please see the main design document 
> [here|https://s.apache.org/inference-sklearn-pytorch]. A sketch of possible 
> usage is included below.
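> For illustration, a minimal sketch of what a scikit-learn RunInference usage 
> could look like in a Beam Python pipeline. The module, class, and parameter 
> names here are assumptions for illustration only; the actual API is defined 
> in the design document linked above:
> {code:python}
> import numpy
> import apache_beam as beam
> 
> # Illustrative import paths; the final module layout is set by the design doc.
> from apache_beam.ml.inference.base import RunInference
> from apache_beam.ml.inference.sklearn_inference import SklearnModelHandlerNumpy
> 
> with beam.Pipeline() as pipeline:
>     _ = (
>         pipeline
>         # A couple of in-memory feature vectors standing in for real input data.
>         | 'CreateExamples' >> beam.Create(
>             [numpy.array([1.0, 2.0]), numpy.array([3.0, 4.0])])
>         # The model handler loads a pickled scikit-learn model from the given
>         # (placeholder) path and runs predictions on each batch of examples.
>         | 'RunInference' >> RunInference(
>             SklearnModelHandlerNumpy(model_uri='/path/to/model.pickle'))
>         | 'PrintPredictions' >> beam.Map(print))
> {code}
> A PyTorch implementation would follow the same pattern, swapping in a 
> PyTorch-specific model handler.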



--
This message was sent by Atlassian Jira
(v8.20.1#820001)
