Evaluating the language style of AI-generated peer review
Zheng, Xiang; Xiong, Tiancheng
Permalink
https://hdl.handle.net/2142/126242
Description
- Title: Evaluating the language style of AI-generated peer review
- Author(s): Zheng, Xiang; Xiong, Tiancheng
- Issue Date: 2025-03-11
- Keyword(s): Large language model; Peer review; Evaluation
- Abstract: Peer review is a cornerstone of science, ensuring the integrity of scholarly communication. However, it has faced criticism for potential bias and harsh language style. This study explores the language style of peer reviews written by large language models (LLMs), specifically GPT-4, to examine whether it can mitigate such issues. By comparing human reviews from NIPS 2022 with GPT-4-generated reviews under different tonal conditions (normal, kind, harsh), this pilot study assesses the objectivity, clarity, and sentiment of AI-generated feedback. Using sentiment and subjectivity analyses, we find that GPT-4’s reviews exhibit lower subjectivity and more positive sentiment, even under prompts encouraging harshness. GPT-4 also avoids outright rejection and offers more favorable outcomes. However, this tendency to minimize negative feedback raises concerns about its critical evaluation capacity. The results suggest that LLMs may complement human peer review but must be used cautiously to maintain rigor. Future work will expand this analysis using a large-scale dataset to further validate these findings.
- Publisher: iSchools
- Series/Report Name or Number: iConference 2025 Proceedings
- Type of Resource: Other
- Genre of Resource: Conference Poster
- Language: eng
- Handle URL: https://hdl.handle.net/2142/126242
- Copyright and License Information: Copyright 2025 is held by Xiang Zheng and Tiancheng Xiong. Copyright permissions, when appropriate, must be obtained directly from the author.
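The sentiment and subjectivity comparison described in the abstract can be sketched roughly as below. This is an illustrative toy only: the tiny word lists and the example review texts are assumptions made up for demonstration (the poster does not name its tooling; a real analysis would use an established scorer such as TextBlob or VADER).

```python
# Toy sketch of comparing polarity and subjectivity between two sets of
# reviews (e.g. human vs. GPT-4). Lexicons here are illustrative assumptions.
import re

POSITIVE = {"clear", "novel", "strong", "interesting", "solid"}
NEGATIVE = {"weak", "unclear", "flawed", "poor", "trivial"}
SUBJECTIVE_CUES = {"i", "feel", "believe", "think", "seems", "really"}

def tokenize(text: str) -> list[str]:
    return re.findall(r"[a-z']+", text.lower())

def polarity(text: str) -> float:
    """Balance of positive vs. negative sentiment words, mapped to [-1, 1]."""
    words = tokenize(text)
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    total = pos + neg
    return 0.0 if total == 0 else (pos - neg) / total

def subjectivity(text: str) -> float:
    """Share of tokens that are subjectivity cues, in [0, 1]."""
    words = tokenize(text)
    return 0.0 if not words else sum(w in SUBJECTIVE_CUES for w in words) / len(words)

# Hypothetical example reviews (not from the study's data):
human_review = "I feel the method is weak and the writing is unclear."
llm_review = "The method is novel and the experiments are solid."

print(polarity(human_review), polarity(llm_review))
print(subjectivity(human_review), subjectivity(llm_review))
```

Per the abstract's finding, the LLM-style review would score higher polarity and lower subjectivity than the human one under this kind of measure.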
Owning Collections
iConference 2025 Posters (primary)
Posters presented at the 2025 iConference: https://www.ischools.org/iconference