gigasquid commented on issue #15023: Extend Clojure BERT example
URL: https://github.com/apache/incubator-mxnet/pull/15023#issuecomment-495288807
 
 
   Thanks @daveliepmann - Looks great so far!
   Here are the answers to your questions:
   
   _is there a way to pass the fine-tuned model directly to the infer API, 
rather than creating a factory over a saved checkpoint?_
   
   No - there currently isn't a way to do that. It's a good idea to investigate 
:)
   
   _is my interpretation of results correct? I included some individual 
samples that surprised me._
   
   A couple of things to keep in mind:
   
   1. The example is really only for demonstration purposes. The original Gluon 
NLP tutorial notes this in its Conclusion section: 
https://gluon-nlp.mxnet.io/examples/sentence_embedding/bert.html
   
   ```
   For demonstration purpose, we skipped the warmup learning rate schedule 
   and validation on dev dataset used in the original implementation.
   ```
   
   We don't have a validation set, only a training set, and that is going to 
affect the fine-tuning. We are also running it for only 3 epochs, so we end 
up with a training accuracy of only about 0.70.
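
   To make the missing-validation point concrete, here is a minimal sketch (with made-up placeholder data, not the dataset from the example) of holding out a dev split so fine-tuning can be checked against unseen data instead of being judged on training accuracy alone. The function name `train_dev_split` and all parameters are illustrative, not part of the example or the MXNet API:

   ```python
   import random

   def train_dev_split(rows, dev_fraction=0.1, seed=42):
       """Shuffle rows deterministically and hold out a dev portion."""
       rows = rows[:]                      # copy so the caller's list is untouched
       random.Random(seed).shuffle(rows)
       cut = int(len(rows) * (1 - dev_fraction))
       return rows[:cut], rows[cut:]       # (train, dev)

   # Hypothetical sentence-pair rows standing in for the real dataset.
   sentences = [f"sample-{i}" for i in range(100)]
   train, dev = train_dev_split(sentences)
   print(len(train), len(dev))  # 90 10
   ```

   Evaluating on `dev` after each epoch would also show whether 3 epochs is enough or whether the model is under- or over-fitting.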
   
   2. Your results are going to be better the closer your input is to the 
fine-tuning data. Some of your made-up sentences may also contain words that 
are not in the vocab; any word that is not in there gets assigned the 
unknown token.
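
   The unknown-token behavior can be sketched like this. This is an illustrative stand-in, not the actual MXNet/Gluon vocabulary API: the tiny `vocab` map, the `"[UNK]"` label, and the `to_ids` helper are all hypothetical.

   ```python
   # Toy vocabulary: every word the model knows, plus an unknown token at id 0.
   vocab = {"[UNK]": 0, "the": 1, "movie": 2, "was": 3, "great": 4}

   def to_ids(tokens, vocab, unk="[UNK]"):
       """Map each token to its vocab id, falling back to the unknown token."""
       return [vocab.get(t, vocab[unk]) for t in tokens]

   # "splendiferous" is out of vocab, so it collapses to the [UNK] id,
   # and the model sees no information about that word at all.
   print(to_ids(["the", "movie", "was", "splendiferous"], vocab))  # [1, 2, 3, 0]
   ```

   That collapse is why sentences with rare or invented words can score surprisingly: several very different inputs can map to the same id sequence.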
   
   In general, I think it's a great addition and I am happy to see it come 
along :)
