damccorm opened a new issue, #21453:
URL: https://github.com/apache/beam/issues/21453

   Some PyTorch models that subclass torch.nn.Module take extra parameters in 
their forward method. These extra parameters can be passed either positionally 
or as keyword arguments (for example, by unpacking a Dict with `**`). 
   
   Example of PyTorch models supported by Hugging Face -> 
[https://huggingface.co/bert-base-uncased](https://huggingface.co/bert-base-uncased)
   
   [Some Torch models on Hugging 
Face](https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/modeling_bert.py)
   
   E.g.: 
[https://huggingface.co/docs/transformers/model_doc/bert#transformers.BertModel](https://huggingface.co/docs/transformers/model_doc/bert#transformers.BertModel)
   ```python
   inputs = {
        "input_ids": tensor1,
        "attention_mask": tensor2,
        "token_type_ids": tensor3,
   }
   
   # BertModel is a subclass of torch.nn.Module.
   model = BertModel.from_pretrained("bert-base-uncased")
   
   # The dict keys must match parameter names in the model's forward method;
   # unpacking with ** passes them as keyword arguments.
   outputs = model(**inputs)
   ```
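   
   The calling convention above can be sketched without installing 
transformers. `ToyModel` below is a hypothetical stand-in for a 
torch.nn.Module subclass such as BertModel (it mimics only the `__call__` 
-> `forward` dispatch, not any real model behavior): the same forward 
method accepts the extra inputs either positionally or via dict unpacking.
   
   ```python
   # Hypothetical stand-in for a torch.nn.Module subclass whose forward
   # method takes extra named parameters (e.g. attention_mask).
   class ToyModel:
       def forward(self, input_ids, attention_mask=None, token_type_ids=None):
           # A real model would run tensors through its layers; here we
           # just echo which inputs were provided.
           return {
               "input_ids": input_ids,
               "attention_mask": attention_mask,
               "token_type_ids": token_type_ids,
           }
   
       # torch.nn.Module routes __call__ to forward(); mimic that here.
       def __call__(self, *args, **kwargs):
           return self.forward(*args, **kwargs)
   
   model = ToyModel()
   
   # Positional style:
   out_positional = model([101, 102], [1, 1], [0, 0])
   
   # Dict-unpacking style, matching the BertModel example above; the dict
   # keys must match the forward method's parameter names.
   inputs = {
       "input_ids": [101, 102],
       "attention_mask": [1, 1],
       "token_type_ids": [0, 0],
   }
   out_keyword = model(**inputs)
   
   assert out_positional == out_keyword
   ```
   
   An inference API that only supports a single positional tensor input 
cannot drive such models, which is why the extra-parameter case needs 
explicit support.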
   
    
   
   The [Transformers](https://pytorch.org/hub/huggingface_pytorch-transformers/) 
models integrated with PyTorch Hub are supported by Hugging Face as well. 
   
    
   
   Imported from Jira 
[BEAM-14337](https://issues.apache.org/jira/browse/BEAM-14337). Original Jira 
may contain additional context.
   Reported by: Anand Inguva.
   Subtask of issue #21435

