Knowing HE standards: How good are students at evaluating academic work?


To become effective learners, students need to develop good evaluative judgment skills. Unfortunately, numerous studies find that self-evaluation estimates provided by undergraduates often differ significantly from the marks awarded by the tutor. This suggests that students either have a poor grasp of the assessment criteria, or find it difficult to apply standards to their own work because of their emotional investment. They may demonstrate a better understanding of standards when asked to judge the work of their peers. We use data from a cohort of second-year undergraduates to compare students' ability to accurately self- and peer-evaluate an assessed essay. We find that peer evaluation is more accurate, on average, than self-evaluation but shows greater dispersion, and there is limited evidence that misconceptions about standards are consistent across self- and peer-evaluation.

Publication DOI:
Divisions: College of Business and Social Sciences > Aston Business School > Economics, Finance & Entrepreneurship
College of Business and Social Sciences > Aston Business School
College of Business and Social Sciences > Aston Business School > Centre for Personal Financial Wellbeing
Additional Information: This is an Accepted Manuscript version of the following article, accepted for publication in Higher Education Research & Development. Jon Guest & Robert Riegler (2021) Knowing HE standards: how good are students at evaluating academic work?, Higher Education Research & Development. It is deposited under the terms of the Creative Commons Attribution-NonCommercial License, which permits non-commercial re-use, distribution, and reproduction in any medium, provided the original work is properly cited.
Uncontrolled Keywords: Peer-evaluation, evaluation accuracy, evaluation consistency, evaluation design, self-evaluation, Education
Publication ISSN: 0729-4360
Last Modified: 16 Apr 2024 07:23
Date Deposited: 21 Dec 2020 10:05
Full Text Link:
Related URLs: https://www.tan ... rnalCode=cher20 (Publisher URL)
http://www.scop ... tnerID=8YFLogxK (Scopus URL)
PURE Output Type: Article
Published Date: 2022
Published Online Date: 2021-01-06
Accepted Date: 2020-11-15
Authors: Guest, Jon (ORCID Profile 0000-0001-6139-905X)
Riegler, Robert (ORCID Profile 0000-0002-0423-5080)



Version: Accepted Version

