References
1. Mikolov T, Sutskever I, Chen K, et al. Distributed representations of words and phrases and their compositionality. Neural Information Processing Systems, Neural Information Processing Systems Foundation, Dec 5-10, 2013, Lake Tahoe, NV, United States, 2013: 3111-3119
2. Sutskever I, Vinyals O, Le Q V. Sequence to sequence learning with neural networks. Neural Information Processing Systems, Neural Information Processing Systems Foundation, Dec 8-13, 2014, Montreal, QC, Canada, 2014: 3104-3112
3. Kim Y. Convolutional neural networks for sentence classification. Conference on Empirical Methods in Natural Language Processing, Association for Computational Linguistics, Oct 25-29, 2014, Doha, Qatar, 2014: 1746-1751
4. Socher R, Perelygin A, Wu J Y, et al. Recursive deep models for semantic compositionality over a sentiment treebank. Conference on Empirical Methods in Natural Language Processing, Association for Computational Linguistics, Oct 18-21, 2013, Seattle, WA, United States, 2013: 1631-1642
5. Tai K S, Socher R, Manning C D. Improved semantic representations from tree-structured long short-term memory networks. ACL-IJCNLP, Association for Computational Linguistics, Jul 26-31, 2015, Beijing, China, 2015: 1556-1566
6. Tang D Y, Qin B, Liu T. Document modeling with gated recurrent neural network for sentiment classification. Conference on Empirical Methods in Natural Language Processing, Association for Computational Linguistics, Sep 17-21, 2015, Lisbon, Portugal, 2015: 1422-1432
7. Le Q V, Mikolov T. Distributed representations of sentences and documents. International Conference on Machine Learning, International Machine Learning Society, Jun 21-26, 2014, Beijing, China, 2014: 2931-2939
8. Lei T, Barzilay R, Jaakkola T. Molding CNNs for text: non-linear, non-consecutive convolutions. Conference on Empirical Methods in Natural Language Processing, Association for Computational Linguistics, Sep 17-21, 2015, Lisbon, Portugal, 2015: 1565-1575
9. Zhang R, Lee H, Radev D. Dependency sensitive convolutional neural networks for modeling sentences and documents. Conference of the North American Chapter of the Association for Computational Linguistics, Association for Computational Linguistics, Jun 12-17, 2016, San Diego, CA, United States, 2016: 1512-1521
10. Zhang X, Zhao J B, LeCun Y. Character-level convolutional networks for text classification. Neural Information Processing Systems, Neural Information Processing Systems Foundation, Dec 7-12, 2015, Montreal, QC, Canada, 2015: 649-657
11. Zhou C T, Sun C L, Liu Z Y, et al. A C-LSTM neural network for text classification. 2015. https://arxiv.org/abs/1511.08630
12. Conneau A, Schwenk H, Barrault L, et al. Very deep convolutional networks for text classification. Conference of the European Chapter of the Association for Computational Linguistics, Association for Computational Linguistics, Apr 3-7, 2017, Valencia, Spain, 2017: 1107-1116
13. He K M, Zhang X Y, Ren S Q, et al. Deep residual learning for image recognition. IEEE Computer Society Conference on Computer Vision and Pattern Recognition, IEEE Computer Society, Jun 26-Jul 1, 2016, Las Vegas, NV, United States, 2016: 770-778
14. Zhang T Y, Huang M L, Zhao L. Learning structured representation for text classification via reinforcement learning. AAAI Conference on Artificial Intelligence, Feb 2-7, 2018, New Orleans, LA, United States, 2018: 6053-6060
15. Bahdanau D, Cho K, Bengio Y. Neural machine translation by jointly learning to align and translate. ICLR 2015, May 7-9, 2015, San Diego, CA, United States, 2015
16. Xu K, Ba J, Kiros R, et al. Show, attend and tell: neural image caption generation with visual attention. International Conference on Machine Learning, International Machine Learning Society, Jul 6-11, 2015, Lille, France, 2015: 2048-2057
17. Luong M T, Pham H, Manning C D. Effective approaches to attention-based neural machine translation. Conference on Empirical Methods in Natural Language Processing, Association for Computational Linguistics, Sep 17-21, 2015, Lisbon, Portugal, 2015: 1412-1421
18. Lin Z H, Feng M W, dos Santos C N, et al. A structured self-attentive sentence embedding. ICLR 2017, Apr 24-26, 2017, Toulon, France, 2017
19. Yin W P, Schütze H, Xiang B, et al. ABCNN: attention-based convolutional neural network for modeling sentence pairs. Annual Meeting of the Association for Computational Linguistics, Aug 7-12, 2016, Berlin, Germany, 2016: 259-272
20. Wang Y Q, Huang M L, Zhao L, et al. Attention-based LSTM for aspect-level sentiment classification. Conference on Empirical Methods in Natural Language Processing, Nov 1-5, 2016, Austin, TX, United States, 2016: 606-615
21. Pennington J, Socher R, Manning C D. GloVe: global vectors for word representation. Conference on Empirical Methods in Natural Language Processing, Association for Computational Linguistics, Oct 25-29, 2014, Doha, Qatar, 2014: 1532-1543
22. Huang G, Liu Z, van der Maaten L, et al. Densely connected convolutional networks. Conference on Computer Vision and Pattern Recognition, Institute of Electrical and Electronics Engineers Inc, Jul 21-26, 2017, Honolulu, HI, United States, 2017: 2261-2269
23. Nair V, Hinton G E. Rectified linear units improve restricted Boltzmann machines. International Conference on Machine Learning, Jun 21-25, 2010, Haifa, Israel, 2010: 807-814
24. Hinton G E, Srivastava N, et al. Improving neural networks by preventing co-adaptation of feature detectors. 2012-07-03. https://arxiv.org/abs/1207.0580v1
25. Ramachandran P, Zoph B, Le Q V. Searching for activation functions. ICLR 2018, Apr 30-May 3, 2018, Vancouver, BC, Canada, 2018
26. Cho K, van Merrienboer B, Gulcehre C, et al. Learning phrase representations using RNN encoder-decoder for statistical machine translation. Conference on Empirical Methods in Natural Language Processing, Association for Computational Linguistics, Oct 25-29, 2014, Doha, Qatar, 2014: 1724-1734
27. Qian Q, Huang M L, Zhu X Y, et al. Linguistically regularized LSTMs for sentiment classification. Annual Meeting of the Association for Computational Linguistics, Association for Computational Linguistics, Jul 30-Aug 4, 2017, Vancouver, BC, Canada, 2017: 1679-1689
28. Du J C, Gui L, Xu R F, et al. A convolutional attention model for text classification. Natural Language Processing and Chinese Computing, Springer Verlag, Nov 8-12, 2017, Dalian, China, 2018: 183-195
29. Zhang Y, Roller S, Wallace B C. MGNC-CNN: a simple approach to exploiting multiple word embeddings for sentence classification. NAACL HLT 2016, Jun 12-17, 2016, San Diego, CA, United States, 2016: 1522-1527