Full metadata record
DC field: Value [Language]

dc.contributor.author: Gautschi, Curtis
dc.date.accessioned: 2021-01-21T08:29:30Z
dc.date.available: 2021-01-21T08:29:30Z
dc.date.issued: 2019-09
dc.identifier.uri: https://digitalcollection.zhaw.ch/handle/11475/21341
dc.description.abstract: There is growing interest in automated essay scoring (AES) and in the individual measures that can be automatically calculated and used in AES, including readability, lexical diversity and complexity indices. These are particularly attractive for second-language learning, given their potential to assist teachers in writing-skills assessment and diagnostics. Although relatively little has been done specifically with regard to AES and CEFR-level prediction (Yannakoudakis et al. 2018: 253), some CEFR-based AES tools are currently available, either as standalone tools (e.g., writeandimprove.com - Cambridge English) or as part of comprehensive placement tests (e.g., the Pearson English Placement test). These tools, however, may not adequately meet an institution's need for quick rating of large numbers of texts: free tools require manual entry of individual texts and are thus inefficient to use, while tools offering bulk grading under subscription fees, and full placement tests with automatic assessment, may be cost-prohibitive. This paper describes the design, results and further development of an AES tool and CEFR-level prediction algorithm created and experimentally implemented at a major University of Applied Sciences in Switzerland as part of an online English placement test. In line with current research, the algorithm was developed using a prediction-accuracy, pseudo-black-box approach (see Vanhove et al. 2019, Yannakoudakis 2013) and a small training corpus of texts with known CEFR levels (N = 50). Written and run entirely in an R environment (R Core Team 2017), with the koRpus package (Michalke 2018) as its workhorse, the tool can be integrated with other advanced text analyses available in R. As the tool can handle bulk grading of large numbers of texts, it is ideal for placement testing; the algorithm is also efficient, calculating the CEFR levels of 400 student essays in 15 minutes of runtime. While gold-standard (human) validation evidence is still required, external validation checks demonstrate the accuracy of the tool: its CEFR-level predictions for a selection of texts were closer to those texts' official CEFR levels than the predictions produced by other online AES systems. Further research perspectives and dissemination in the form of a Shiny web app and an R package will also be discussed. (An illustrative sketch of such an R/koRpus pipeline follows the metadata record below.) [de_CH]
dc.language.iso: en [de_CH]
dc.rights: Licence according to publishing contract [de_CH]
dc.subject: Automated essay scoring [de_CH]
dc.subject: Placement testing for writing [de_CH]
dc.subject: Computerized assessment [de_CH]
dc.subject: EFL writing assessment [de_CH]
dc.subject.ddc: 410.285: Computational linguistics [de_CH]
dc.subject.ddc: 808: Rhetoric and writing [de_CH]
dc.title: Predicting CEFR levels of student essays in placement tests using an automated essay scoring tool in R : a corpus-based approach [de_CH]
dc.type: Conference: other [de_CH]
dcterms.type: Text [de_CH]
zhaw.departement: Angewandte Linguistik [de_CH]
zhaw.organisationalunit: Institute of Language Competence (ILC) [de_CH]
zhaw.conference.details: 8th International Conference on Writing Analytics, Zürich, 5-6 September 2019 [de_CH]
zhaw.funding.eu: No [de_CH]
zhaw.originated.zhaw: Yes [de_CH]
zhaw.publication.status: publishedVersion [de_CH]
zhaw.publication.review: Peer review (Abstract) [de_CH]
zhaw.webfeed: Sprachkompetenz [de_CH]
zhaw.author.additional: No [de_CH]
zhaw.display.portrait: Yes [de_CH]
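
The abstract describes an AES pipeline written entirely in R, with the koRpus package computing readability, lexical diversity and complexity indices for large batches of essays. The following is a minimal, non-authoritative sketch of what such a batch feature-extraction step could look like; the directory layout, the particular indices (Flesch, FOG, TTR, MTLD) and the helper function extract_features() are illustrative assumptions, not the configuration of the actual tool.

## Hedged sketch of batch feature extraction with koRpus (assumptions:
## plain-text essays in ./essays/, English texts, koRpus.lang.en installed;
## the chosen indices are examples, not those used by the actual tool).
library(koRpus)
library(koRpus.lang.en)

essay_files <- list.files("essays", pattern = "\\.txt$", full.names = TRUE)

extract_features <- function(path) {
  # Tokenise one essay with koRpus' built-in tokenizer (TreeTagger optional)
  tagged <- tokenize(path, format = "file", lang = "en")
  # Readability indices; syllable counts are estimated internally via hyphen()
  rd <- readability(tagged, index = c("Flesch", "FOG"), quiet = TRUE)
  # Lexical diversity; char = c() skips the slow characteristic curves
  ld <- lex.div(tagged, measure = c("TTR", "MTLD"), char = c(), quiet = TRUE)
  data.frame(
    file   = basename(path),
    flesch = rd@Flesch$RE,   # Flesch Reading Ease
    fog    = rd@FOG$FOG,     # Gunning FOG grade
    ttr    = ld@TTR,         # type-token ratio
    mtld   = ld@MTLD$MTLD    # Measure of Textual Lexical Diversity
  )
}

# One row of numeric features per essay; this removes the manual text entry
# that the abstract criticises in free online tools.
features <- do.call(rbind, lapply(essay_files, extract_features))

Because each essay is processed independently, the same loop scales to the bulk-grading scenario mentioned in the abstract (400 essays) and could be parallelised if runtime matters.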
Appears in collections: Publikationen Angewandte Linguistik

Files in This Item:
There are no files associated with this item.
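
The abstract characterises the CEFR-level prediction step only as a prediction-accuracy, pseudo-black-box approach trained on a 50-text corpus with known CEFR levels; the actual model is not disclosed. Purely as an illustration of how such a step could be attached to the feature table sketched above, the snippet below fits an ordinal logistic regression with MASS::polr; the model choice, the data frame `train` and its `cefr` column are assumptions, not the author's algorithm.

## Hedged sketch: predicting CEFR levels from the features extracted above.
## Assumes a data frame `train` holding the same feature columns plus a
## `cefr` column with the known levels of the N = 50 reference texts.
library(MASS)

train$cefr <- factor(train$cefr,
                     levels = c("A1", "A2", "B1", "B2", "C1", "C2"),
                     ordered = TRUE)

# Ordinal logistic regression respects the ordered nature of CEFR levels;
# the tool described in the abstract may use a different model entirely.
fit <- polr(cefr ~ flesch + fog + ttr + mtld, data = train, Hess = TRUE)

# Predicted CEFR level for each new essay scored earlier
features$cefr_pred <- predict(fit, newdata = features, type = "class")

Leave-one-out cross-validation on the small training corpus would then provide the kind of prediction-accuracy evidence the abstract refers to, ahead of the gold-standard human validation it notes is still required.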
Citation formats:
APA: Gautschi, C. (2019, September). Predicting CEFR levels of student essays in placement tests using an automated essay scoring tool in R : a corpus-based approach. 8th International Conference on Writing Analytics, Zürich, 5-6 September 2019.
Harvard: Gautschi, C. (2019) ‘Predicting CEFR levels of student essays in placement tests using an automated essay scoring tool in R : a corpus-based approach’, in 8th International Conference on Writing Analytics, Zürich, 5-6 September 2019.
IEEE: C. Gautschi, “Predicting CEFR levels of student essays in placement tests using an automated essay scoring tool in R : a corpus-based approach,” in 8th International Conference on Writing Analytics, Zürich, 5-6 September 2019, Sep. 2019.
ISO 690: GAUTSCHI, Curtis, 2019. Predicting CEFR levels of student essays in placement tests using an automated essay scoring tool in R : a corpus-based approach. In: 8th International Conference on Writing Analytics, Zürich, 5-6 September 2019. Conference presentation. September 2019.
Chicago: Gautschi, Curtis. 2019. “Predicting CEFR Levels of Student Essays in Placement Tests Using an Automated Essay Scoring Tool in R : A Corpus-Based Approach.” Conference presentation. In 8th International Conference on Writing Analytics, Zürich, 5-6 September 2019.
MLA: Gautschi, Curtis. “Predicting CEFR Levels of Student Essays in Placement Tests Using an Automated Essay Scoring Tool in R : A Corpus-Based Approach.” 8th International Conference on Writing Analytics, Zürich, 5-6 September 2019, 2019.

