[1] SUTSKEVER I, VINYALS O, LE Q V. Sequence to sequence learning with neural networks[C]//Proceedings of 2014 annual conference on neural information processing systems (NIPS). Montreal:Neural Information Processing Systems Foundation, 2014:3104-3112.
[2] LE H T, LE T M. An approach to abstractive text summarization[C]//Proceedings of 2013 soft computing and pattern recognition (SoCPaR).Hanoi:IEEE, 2013:371-376.
[3] ZHAO W J, LIU Z B. Research on web event extraction and related algorithms based on Chinese FrameNet[J]. Information studies: theory & application, 2016, 39(10):112-116.
[4] ZHANG H, ZHAO Y H. Constructing a semantic graph-based model for medical multi-document summarization[J]. Library and information service, 2017, 61(8):112-119.
[5] KHAN A, SALIM N, FARMAN H, et al. Abstractive text summarization based on improved semantic graph approach[J]. International journal of parallel programming, 2018,46(1):1-25.
[6] WANG Z C, SUN R, JI D H. Event-guided abstractive multi-document summarization method[J]. Application research of computers, 2017, 34(2):343-346.
[7] BAHDANAU D, CHO K, BENGIO Y. Neural machine translation by jointly learning to align and translate[EB/OL].[2017-12-30]. https://arxiv.org/pdf/1409.0473.pdf.
[8] RUSH A M, CHOPRA S, WESTON J. A neural attention model for abstractive sentence summarization[EB/OL].[2017-12-30].https://arxiv.org/pdf/1509.00685.
[9] CHOPRA S, AULI M, RUSH A M. Abstractive sentence summarization with attentive recurrent neural networks[C]//Proceedings of the 2016 conference of the North American chapter of the Association for Computational Linguistics: human language technologies. San Diego:ACL, 2016:93-98.
[10] GULCEHRE C, AHN S, NALLAPATI R, et al. Pointing the unknown words[C]//Proceedings of the 54th annual meeting of the Association for Computational Linguistics. Berlin:ACL, 2016:140-149.
[11] MIAO Y, BLUNSOM P. Language as a latent variable:discrete generative models for sentence compression[C]//Proceedings of the 2016 conference on empirical methods in natural language processing. Austin:EMNLP, 2016:319-328.
[12] XIE M Y. An automatic text summarization model based on text categories[J]. Computer knowledge and technology, 2018, 14(1):206-208.
[13] JEAN S, CHO K, MEMISEVIC R, et al. On using very large target vocabulary for neural machine translation[EB/OL].[2018-02-10]. https://arxiv.org/pdf/1412.2007.pdf.
[14] XIE Z, AVATI A, ARIVAZHAGAN N, et al. Neural language correction with character-based attention[EB/OL].[2017-12-30]. https://arxiv.org/pdf/1603.09727.
[15] LUONG M T, SUTSKEVER I, LE Q V, et al. Addressing the rare word problem in neural machine translation[C]//Proceedings of the 53rd annual meeting of the Association for Computational Linguistics. Beijing:ACL, 2015:11-19.
[16] GU J, LU Z, LI H, et al. Incorporating copying mechanism in sequence-to-sequence learning[C]//Proceedings of the 54th annual meeting of the Association for Computational Linguistics. Berlin:ACL, 2016:1631-1640.
[17] NALLAPATI R, ZHOU B, SANTOS C N D, et al. Abstractive text summarization using sequence-to-sequence RNNs and beyond[C]//Proceedings of the 20th SIGNLL conference on computational natural language learning. Berlin:CoNLL, 2016:280-290.
[18] TU Z, LU Z, LIU Y, et al. Modeling coverage for neural machine translation[C]//Proceedings of the 54th annual meeting of the Association for Computational Linguistics. Berlin:ACL, 2016:76-85.
[19] CHO K, MERRIENBOER B V, GULCEHRE C, et al. Learning phrase representations using RNN encoder-decoder for statistical machine translation[EB/OL].[2018-03-01]. https://arxiv.org/pdf/1406.1078.pdf.
[20] HOCHREITER S, SCHMIDHUBER J. Long short-term memory[J]. Neural computation, 1997, 9(8):1735-1780.
[21] SEE A, LIU P J, MANNING C D. Get to the point:summarization with pointer-generator networks[C]//Proceedings of the 55th annual meeting of the Association for Computational Linguistics. Vancouver:ACL, 2017:1073-1083.
[22] HU B, CHEN Q, ZHU F. LCSTS:a large scale Chinese short text summarization dataset[C]//Proceedings of the 2015 conference on empirical methods in natural language processing. Lisbon:EMNLP, 2015:2667-2671.
[23] SUN J. Jieba: Chinese word segmentation tool[EB/OL].[2017-10-20]. https://pypi.python.org/pypi/jieba/.
[24] LIN C Y. ROUGE:a package for automatic evaluation of summaries[EB/OL].[2017-12-30]. http://www.aclweb.org/anthology/W04-1013.
[25] MIHALCEA R, TARAU P. TextRank:bringing order into texts[C]//Proceedings of the 2004 conference on empirical methods in natural language processing. Barcelona:EMNLP, 2004:404-411.