ACADEMIC JOURNAL OF EDUCATIONAL RESEARCH AND MANAGEMENT (AJERM)

A Comparative Review of CET4 Essay Writing Assessments with Insights from the CEFR Level Descriptors

E-ISSN: 2390-4383

P-ISSN: 1330-3473

DOI: https://iigdpublishers.com/article/961

The comparative analysis of essay writing assessment criteria, particularly the Common European Framework of Reference for Languages (CEFR) and the College English Test Band 4 (CET4), has attracted growing attention in language testing research. Building on research from the past decade, this study compares CET4 writing rubrics with CEFR level descriptors, with a particular focus on the influence of task complexity as framed by the CEFR level descriptors. Underpinned by Robinson’s Cognition Hypothesis (CH) and Skehan’s Limited Attentional Capacity Model (LACM), this study explores how variations in task complexity affect linguistic outcomes in CET4 essay writing. Robinson’s CH suggests that increased task complexity promotes greater lexical and syntactic complexity, whereas Skehan’s LACM emphasises trade-offs among accuracy, fluency, and complexity under cognitive constraints. Over the past decade, research has applied these theoretical models to assess task complexity dimensions, specifically “+/- planning time (PT)” and “+/- few elements (FE)”, in CET4 essay writing, and empirical findings have demonstrated significant correlations between task complexity and CEFR-aligned performance indicators. Accordingly, this study conducts a comprehensive review of the relationship between CET4 writing rubrics and CEFR level descriptors, identifying points of convergence and divergence in the writing criteria. Findings suggest that integrating the CEFR into CET4 assessment practices enhances comprehensiveness, supporting more specific evaluation dimensions for essay writing. Overall, this study offers insights into writing assessment practice by integrating theoretical perspectives with recent empirical evidence. The findings hold implications for educators, particularly in bridging CET4 and CEFR writing assessments and enhancing task-based essay writing instruction.

Keywords: Essay Writing, CET4 Writing Rubrics, CEFR Level Descriptors, Assessment, Comparison.

Volume 9, Issue 10 | October 2025

Changlin Li, Nik Aloesnita Nik Mohd Alwi & Mohammad Musab Azmat Ali

Bui, G., & Skehan, P. (2018). Complexity, fluency and accuracy in L2 speech and writing: Investigating influences of task structure and processing load. Language Learning, 68(2), 409-445. https://doi.org/10.1111/lang.12283


Bygate, M., Skehan, P., & Swain, M. (Eds.). (2013). Researching pedagogic tasks: Second language learning, teaching, and testing. Routledge. 


Chen, L. C., Chang, K. H., Yang, S. C., & Chen, S. C. (2023). A corpus-based word classification method for detecting difficulty level of English proficiency tests. Applied Sciences, 13(3), Article 1699. https://doi.org/10.3390/app13031699


National College English Testing Committee. (2016). National College English Test Band 4 and Band 6 Syllabus (2016 Revised Edition).


Crossley, S. A. (2020). A corpus analysis of academic writing and how it informs writing instruction on a university pre-sessional course. Journal of Writing Research, 11(3), 415-443.


Crossley, S. A., Allen, L. Q., & McNamara, D. S. (2019). Coh-Metrix model-based automatic assessment of interpreting quality. Interpreting in the Age of Global Communication, 181-198. https://doi.org/10.1007/978-981-15-8554-8_9 
