Research Paper

Sun Mengting, postdoctoral researcher, PhD; Zhang Lin, professor, PhD, doctoral supervisor, corresponding author, E-mail: linzhang1117@whu.edu.cn.


Semantic Feature Mining and Analysis of Peer Review Comments from the Perspective of “Aspect-Intention” Correlation

  • Sun Mengting ,
  • Zhang Lin
  • 1 School of Information Management, Nanjing University, Nanjing 210023;
    2 Center for Science, Technology & Education Assessment (CSTEA), Wuhan University, Wuhan 430072;
    3 School of Information Management, Wuhan University, Wuhan 430072

Received date: 2024-04-22

  Revised date: 2024-07-09

  Online published: 2025-02-26


Cite this article

SUN M, ZHANG L. Semantic feature mining and analysis of peer review comments from the perspective of “aspect-intention” correlation[J]. Library and information service, 2025, 69(4): 77-91. DOI: 10.13266/j.issn.0252-3116.2025.04.007

Abstract

[Purpose/Significance] Peer review serves as a pivotal quality assurance mechanism in academic knowledge production, and reviewers’ comments carry implicit semantic information. Using text classification models to identify and analyze the semantic features of these comments can deepen our understanding of the written communication in peer review, help open the “black box” of the review process, and offer new ideas for strengthening the quality assessment and monitoring of peer review. [Method/Process] This paper introduced an “aspect-intention” semantic mining framework covering two semantic layers of a review comment: the part of the paper it addresses and the purpose it conveys. Leveraging data from the open peer review journal eLife, it constructed an annotated dataset, DAIPRV1. It then trained and tested SciBERT-based multi-class text classification models to identify the aspects and intentions of reviewers’ comments, and analyzed the distribution of semantic features based on the classification results. [Result/Conclusion] The analysis reveals that reviewers predominantly focus on the empirical aspects of papers, followed by presentational and theoretical elements. In terms of intention, reviewers most often give instructions, urging authors to clarify their work, provide additional experiments or evidence, and make editing or formatting revisions; evaluative and summarizing comments come next, and a single comment frequently combines multiple instructions. Viewed jointly, aspects and intentions correspond: comments on a given aspect of a paper tend to carry characteristic intentions. By exploring semantic features from the perspective of “aspect-intention” correlation, this paper enriches the ways of understanding the peer review process and deepens the comprehension of peer review as an important form of scholarly communication.
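The classification setup described above — assigning each review comment to semantic categories such as its aspect — can be illustrated, independently of the paper's actual SciBERT model and DAIPRV1 dataset, with a minimal sketch. The three aspect labels below echo the abstract (empirical, presentational, theoretical); the training sentences, bag-of-words features, and hyperparameters are invented for this dependency-free stand-in and are not from the paper:

```python
import math

# Aspect labels taken from the abstract; everything else here is an
# illustrative stand-in for the paper's fine-tuned SciBERT classifier.
LABELS = ["empirical", "presentation", "theoretical"]

TRAIN = [
    ("please report the sample size of the experiment", "empirical"),
    ("the control condition in the experiment is unclear", "empirical"),
    ("figure 2 axis labels are too small", "presentation"),
    ("several typos in the abstract should be fixed", "presentation"),
    ("the hypothesis does not follow from the framework", "theoretical"),
    ("the theoretical model omits a key assumption", "theoretical"),
]

def featurize(text, vocab):
    """Bag-of-words count vector over the training vocabulary."""
    vec = [0.0] * len(vocab)
    for tok in text.split():
        if tok in vocab:
            vec[vocab[tok]] += 1.0
    return vec

def softmax(zs):
    m = max(zs)
    exps = [math.exp(z - m) for z in zs]
    s = sum(exps)
    return [e / s for e in exps]

def train(data, labels, epochs=200, lr=0.5):
    """Multinomial logistic regression fitted by per-sample gradient descent."""
    vocab = {}
    for text, _ in data:
        for tok in text.split():
            vocab.setdefault(tok, len(vocab))
    # one weight row per label; the extra column is a bias term
    W = [[0.0] * (len(vocab) + 1) for _ in labels]
    for _ in range(epochs):
        for text, gold in data:
            x = featurize(text, vocab) + [1.0]
            probs = softmax([sum(w * xi for w, xi in zip(row, x)) for row in W])
            for k, row in enumerate(W):
                grad = probs[k] - (1.0 if labels[k] == gold else 0.0)
                for j in range(len(row)):
                    row[j] -= lr * grad * x[j]
    return vocab, W

def predict(text, vocab, W, labels):
    x = featurize(text, vocab) + [1.0]
    probs = softmax([sum(w * xi for w, xi in zip(row, x)) for row in W])
    return labels[max(range(len(labels)), key=probs.__getitem__)]

vocab, W = train(TRAIN, LABELS)
print(predict("please clarify the sample size calculation", vocab, W, LABELS))
```

The same framing extends to the intention labels (instructional, evaluative, summarizing); in the paper both tasks are handled by fine-tuned SciBERT models rather than this toy feature encoding.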

References

[1] MERTON R K. The sociology of science: theoretical and empirical investigations[M]. Chicago: University of Chicago Press, 1973: 339.
[2] 楚宾, 哈克特. 难有同行的科学:同行评议与美国科学政策[M]. 北京: 北京大学出版社, 2011. (CHUBIN D E, HACKETT E J. Peerless science: peer review and U.S. science policy[M]. Beijing: Peking University Press, 2011.)
[3] MULLIGAN A, HALL L, RAPHAEL E. Peer review in a changing world: an international study measuring the attitudes of researchers[J]. Journal of the American Society for Information Science and Technology, 2013, 64(1): 132-161.
[4] CARNEIRO C F D, QUEIROZ V G S, MOULIN T C, et al. Comparing quality of reporting between preprints and peer-reviewed articles in the biomedical literature[J]. Research integrity and peer review, 2020, 5(1): 16.
[5] BENOS D J, BASHARI E, CHAVES J M, et al. The ups and downs of peer review[J]. Advances in physiology education, 2007, 31(2): 145-152.
[6] THURNER S, HANEL R. Peer-review in a world with rational scientists: toward selection of the average[J]. The European physical journal B, 2011, 84(4): 707-711.
[7] ESAREY J. Does peer review identify the best papers? a simulation study of editors, reviewers, and the scientific publication process[J]. Political science & politics, 2017, 50(4): 963-969.
[8] COWLEY S J. How peer-review constrains cognition: on the frontline in the knowledge sector[J]. Frontiers in psychology, 2015, 6:1706.
[9] WOOD M. Beyond journals and peer review: towards a more flexible ecosystem for scholarly communication[M/OL]. [2024-12-05]. https://www.qeios.com/read/SWKKOC.
[10] TEPLITSKIY M. Frame search and re-search: how quantitative sociological articles change during peer review[J]. The American sociologist, 2016, 47(2): 264-288.
[11] 孙梦婷, 张琳. 从“要素公开”到“开放参与”:开放同行评议策略研究[J]. 图书情报知识, 2024, 41(2): 67-80. (SUN M, ZHANG L. From “element transparency” to “open participation”: an investigation on open peer review strategies[J]. Documentation, information & knowledge, 2024, 41(2): 67-80.)
[12] FALK DELGADO A, GARRETSON G, FALK DELGADO A. The language of peer review reports on articles published in the BMJ, 2014–2017: an observational study[J]. Scientometrics, 2019, 120(3): 1225-1235.
[13] SUN Z, CLARK CAO C, MA C, et al. The academic status of reviewers predicts their language use[J]. Journal of informetrics, 2023, 17(4): 101449.
[14] BULJAN I, GARCIA-COSTA D, GRIMALDO F, et al. Large-scale language analysis of peer review reports[J]. eLife, 2020, 9.
[15] STEPHEN D. Peer reviewers equally critique theory, method, and writing, with limited effect on the final content of accepted manuscripts[J]. Scientometrics, 2022, 127(6): 3413-3435.
[16] MUNGRA P, WEBBER P. Peer review process in medical research publications: language and content comments[J]. English for specific purposes, 2010, 29(1): 43-53.
[17] HERBER O R, BRADBURY-JONES C, BÖLING S, et al. What feedback do reviewers give when reviewing qualitative manuscripts? A focused mapping review and synthesis[J]. BMC medical research methodology, 2020, 20(1): 122.
[18] QIN C, ZHANG C. Which structure of academic articles do referees pay more attention to?: perspective of peer review and full-text of academic articles[J]. Aslib journal of information management, 2023, 75(5): 884-916.
[19] ZONG Q, XIE Y, LIANG J. Does open peer review improve citation count? evidence from a propensity score matching analysis of PeerJ[J]. Scientometrics, 2020, 125(1): 607-623.
[20] CHAKRABORTY S, GOYAL P, MUKHERJEE A. Aspect-based sentiment analysis of scientific reviews[C] //Proceedings of the ACM/IEEE joint conference on digital libraries in 2020. Wuhan: Association for Computing Machinery, 2020: 207-216.
[21] GHOSAL T, KUMAR S, BHARTI P K, et al. Peer review analyze: a novel benchmark resource for computational analysis of peer reviews[J]. Plos one, 2022, 17(1): e0259238.
[22] YUAN W, LIU P, NEUBIG G. Can we automate scientific reviewing?[J]. Journal of artificial intelligence research, 2022, 75: 42.
[23] CHENG L, BING L, YU Q, et al. APE: argument pair extraction from peer review and rebuttal via multi-task learning[C]//Proceedings of the 2020 conference on empirical methods in natural language processing. Association for Computational Linguistics, 2020: 7000-7011.
[24] FROMM M, FAERMAN E, BERRENDORF M, et al. Argument mining driven analysis of peer-reviews[C]//Proceedings of the AAAI conference on artificial intelligence, 2021: 4758-4766.
[25] HUA X, NIKOLOV M, BADUGU N, et al. Argument mining for understanding peer reviews[C]//Proceedings of the 2019 conference of the North American chapter of the Association for Computational Linguistics: human language technologies. Minneapolis: Association for Computational Linguistics, 2019: 2131-2137.
[26] KENNARD N, O’GORMAN T, DAS R, et al. DISAPERE: a dataset for discourse structure in peer review discussions[C]//Proceedings of the 2022 conference of the North American chapter of the Association for Computational Linguistics: human language technologies. Seattle: Association for Computational Linguistics, 2022: 1234-1249.
[27] KING S R F. Consultative review is worth the wait[J]. eLife, 2017, 6: e32012.
[28] 刘丽萍, 刘春丽. eLife开放同行评审模式研究[J]. 中国科技期刊研究, 2019, 30(9): 949-955. (LIU L, LIU C. Research on the open peer review mode of eLife[J]. Chinese journal of scientific and technical periodicals, 2019, 30(9): 949-955.)
[29] 于曦. eLife开放同行评议模式改革与启示[J]. 中国科技期刊研究, 2023, 34(5): 609-614. (YU X. eLife’s open peer review model reform and enlightenment[J]. Chinese journal of scientific and technical periodicals, 2023, 34(5): 609-614.)
[30] BORDAGE G. Reasons reviewers reject and accept manuscripts: the strengths and weaknesses in medical education reports[J]. Academic medicine, 2001, 76(9): 889-896.
[31] STRANG D, SILER K. Revising as reframing: original submissions versus published papers in Administrative Science Quarterly, 2005 to 2009[J]. Sociological theory, 2015, 33(1): 71-96.
[32] KANG D, AMMAR W, DALVI B, et al. A dataset of peer reviews (PeerRead): collection, insights and NLP applications[C]//Proceedings of the 2018 conference of the North American chapter of the Association for Computational Linguistics: human language technologies. New Orleans: Association for Computational Linguistics, 2018: 1647-1661.
[33] KUZNETSOV I, BUCHMANN J, EICHLER M, et al. Revise and resubmit: an intertextual model of text-based collaboration in peer review[J]. Computational linguistics, 2022, 48(4): 949-986.
[34] 秦成磊, 韩茹雪, 周昊旻,等. 同行评审意见类型识别及其在不同被引频次下的分布研究[J]. 图书情报工作, 2022, 66(13): 102-117. (QIN C, HAN R, ZHOU H, et al. Identification of peer review comments types and research on their distribution at different citation frequencies[J]. Library and information service, 2022, 66(13): 102-117.)
[35] WANG K, WAN X. Sentiment analysis of peer review texts for scholarly papers[C]//The 41st international ACM SIGIR conference on research & development in information retrieval. New York: ACM, 2018.
[36] BHARTI P K, GHOSAL T, AGARWAL M, et al. BetterPR: a dataset for estimating the constructiveness of peer review comments[M]. Cham: Springer, 2022: 500-505.