Sterett H. Mercer, Joanna E. Cannon

Validity of automated learning progress assessment in English written expression for students with learning difficulties

Shortlink: https://www.waxmann.com/artikelART104836
DOI: https://doi.org/10.31244/jero.2022.01.03


Abstract

We evaluated the validity of an automated approach to learning progress assessment (aLPA) for English written expression. Participants (n = 105) were students in Grades 2–12 who had parent-identified learning difficulties and received academic tutoring through a community-based organization. Participants completed narrative writing samples in the fall and spring of one academic year, and a subset (n = 33) also completed a standardized writing assessment in the spring. The narrative writing samples were evaluated using aLPA, four hand-scored written expression curriculum-based measures (WE-CBM), and ratings of writing quality. Results indicated that (a) aLPA and WE-CBM scores were highly correlated with ratings of writing quality; (b) aLPA scores and the more complex WE-CBM scores showed acceptable correlations with the standardized writing subtest assessing spelling and grammar, but not with the subtest assessing substantive quality; and (c) aLPA scores showed small, statistically significant improvements from fall to spring. These findings provide preliminary evidence that aLPA can be used to efficiently score narrative writing samples for progress monitoring, with some evidence that aLPA scores can serve as a general indicator of writing skill. Because automated scoring in aLPA performs comparably to WE-CBM hand scoring, it may improve scoring feasibility and increase the likelihood that educators implement aLPA for decision-making.

Keywords
automated text evaluation, learning progress assessment, written expression, curriculum-based measurement, learning difficulties