[
https://issues.apache.org/jira/browse/BEAM-14337?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Anand Inguva updated BEAM-14337:
--------------------------------
Description:
Some PyTorch models that subclass torch.nn.Module take extra parameters in their forward call. These extra parameters can be passed as a Dict or as positional arguments.
Example of a PyTorch model supported by Hugging Face:
[https://huggingface.co/bert-base-uncased]
[Some torch models on Hugging Face|https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/modeling_bert.py]
E.g.:
[https://huggingface.co/docs/transformers/model_doc/bert#transformers.BertModel]
{code:python}
inputs = {
    "input_ids": tensor_1,
    "attention_mask": tensor_2,
    "token_type_ids": tensor_3,
}
# BertModel.from_pretrained returns a subclass of torch.nn.Module.
model = BertModel.from_pretrained("bert-base-uncased")
# The model's forward method accepts the dict keys as keyword arguments.
outputs = model(**inputs)
{code}
[Transformers models|https://pytorch.org/hub/huggingface_pytorch-transformers/] available through PyTorch Hub are supported by Hugging Face as well.
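The dict-unpacking pattern this issue relies on can be sketched without any torch dependency. A minimal sketch, assuming hypothetical names: `ToyModel` stands in for a torch.nn.Module whose forward takes optional extra parameters, and `run_inference` shows how a keyed batch would flow into the model's keyword arguments via {{**kwargs}}.

```python
class ToyModel:
    """Stands in for a torch.nn.Module whose forward takes extra keyword args."""

    def forward(self, input_ids, attention_mask=None, token_type_ids=None):
        # A real model would push tensors through layers; here we only
        # record which arguments arrived, to show the kwargs plumbing.
        return {
            "input_ids": input_ids,
            "used_mask": attention_mask is not None,
            "used_token_types": token_type_ids is not None,
        }

    # torch.nn.Module dispatches __call__ to forward; mimic that here.
    __call__ = forward


def run_inference(model, batch):
    """Unpack each keyed example into the model's keyword arguments."""
    return [model(**example) for example in batch]


batch = [
    {"input_ids": [101, 2054, 102], "attention_mask": [1, 1, 1]},
    {"input_ids": [101, 102]},  # extra parameters are optional per example
]
results = run_inference(ToyModel(), batch)
```

Because the extra parameters are passed by key rather than by position, each example may supply a different subset of them, which is the behavior the PyTorch model handler would need to support.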
> Support **kwargs for PyTorch models.
> ------------------------------------
>
> Key: BEAM-14337
> URL: https://issues.apache.org/jira/browse/BEAM-14337
> Project: Beam
> Issue Type: Sub-task
> Components: sdk-py-core
> Reporter: Anand Inguva
> Assignee: Anand Inguva
> Priority: P2
>
--
This message was sent by Atlassian Jira
(v8.20.7#820007)