Attention mechanisms are becoming increasingly important in natural language
processing. To better understand how they work, I read the paper [A Structured
Self-Attentive Sentence Embedding](https://arxiv.org/abs/1703.03130).
This PR is an implementation of that paper and covers most of the details it
describes. For evaluation, it reproduces the sentiment (star-rating)
classification of user reviews, one of the three experiments in the original
paper, using the same data set: [The reviews of Yelp
Data](https://www.kaggle.com/yelp-dataset/yelp-dataset#yelp_academic_dataset_review.json).
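The core of the paper is the structured self-attention that turns the LSTM hidden states `H` (shape `n × 2u`) into an `r`-hop attention matrix `A = softmax(W_s2 · tanh(W_s1 · Hᵀ))` and a sentence embedding `M = A · H`. Below is a minimal NumPy sketch of that computation (not the actual code in this PR); the names `W_s1`, `W_s2`, `d_a`, and `r` follow the paper's notation, and the toy shapes are illustrative:

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax along the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def structured_self_attention(H, W_s1, W_s2):
    """A = softmax(W_s2 @ tanh(W_s1 @ H.T)); M = A @ H (Eqs. 5-6 of the paper)."""
    A = softmax(W_s2 @ np.tanh(W_s1 @ H.T), axis=-1)  # (r, n): r attention hops over n timesteps
    M = A @ H                                          # (r, 2u): r weighted sums of hidden states
    return A, M

# toy example: n=5 timesteps, 2u=8 hidden units, d_a=4, r=2 hops
rng = np.random.default_rng(0)
H = rng.standard_normal((5, 8))
W_s1 = rng.standard_normal((4, 8))
W_s2 = rng.standard_normal((2, 4))
A, M = structured_self_attention(H, W_s1, W_s2)
```

Each row of `A` is a probability distribution over timesteps, so each of the `r` hops can focus on a different part of the sentence.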
 

**Implemented**
1.  **The attention mechanism proposed in the original paper.**
2.  **The penalization term that encourages diversity among attention hops.**
3.  **The parameter pruning proposed in the appendix of the paper.**
4.  **Gradient clipping and learning rate decay.**
5.  **Softmax cross-entropy loss with per-class weights.**
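The penalization term (item 2) is the paper's `P = ‖A·Aᵀ − I‖_F²`, which pushes the `r` attention distributions toward being mutually orthogonal (and each close to one-hot). A minimal NumPy sketch, assuming `A` has shape `(r, n)` with rows summing to 1:

```python
import numpy as np

def penalization(A):
    """Squared Frobenius norm of (A @ A.T - I), the diversity penalty from the paper."""
    r = A.shape[0]
    return np.sum((A @ A.T - np.eye(r)) ** 2)
```

When the hops pick disjoint one-hot positions the penalty is zero; when all hops attend identically it is maximal, so adding it to the classification loss discourages redundant attention.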
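For item 5, weighting the cross-entropy by class helps with the imbalanced star-rating distribution in the Yelp data. A hedged NumPy sketch of one way to compute such a loss (the function name and weighting scheme are illustrative, not the PR's actual code):

```python
import numpy as np

def weighted_softmax_ce(logits, labels, class_weights):
    """Softmax cross-entropy averaged with per-class weights.

    logits: (batch, C), labels: (batch,) int class ids, class_weights: (C,).
    """
    z = logits - logits.max(axis=1, keepdims=True)          # stable log-softmax
    log_probs = z - np.log(np.exp(z).sum(axis=1, keepdims=True))
    nll = -log_probs[np.arange(len(labels)), labels]        # per-sample negative log-likelihood
    w = class_weights[labels]                               # weight each sample by its class
    return (w * nll).sum() / w.sum()
```

With uniform weights this reduces to the ordinary mean cross-entropy; upweighting rare star ratings keeps them from being dominated by the frequent ones during training.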

[ Full content available at: 
https://github.com/apache/incubator-mxnet/pull/12535 ]