
TEA Rolls Out Computer Scoring of STAAR Exams

Student filling out answers to a test | Image by Constantine Pankin/Shutterstock

State education officials have quietly rolled out computer scoring for written answers on student achievement tests.

The Texas Education Agency (TEA) upgraded its scoring process so that only 1 in 4 students’ written responses on the State of Texas Assessments of Academic Readiness (STAAR) exam will be read by an actual person. The change, which was announced in December, has been questioned by some education officials.

“This is surprising news to me as a member of the House Public Education Committee, as I do not recall ever receiving notice of this novel and experimental method for grading high-stakes STAAR tests,” wrote Rep. Gina Hinojosa (D-Austin) in a letter to Commissioner Mike Morath, according to The Dallas Morning News.

However, Jake Kobersky, a spokesperson for TEA, claimed that the committee was informed of plans to move to automated scoring for the STAAR in August 2022 in the interest of lowering the cost of human scoring, estimated at approximately $15-$20 million a year, the DMN reported.

The tool, referred to as an “automated scoring engine,” uses a hybrid scoring model to grade all short constructed-response and extended constructed-response questions on the STAAR exam. Human scorers then review at least 25% of the machine-graded responses, acting as “second readers” to check and monitor the computer-generated scores.

The change only applies to written answers in English, as Spanish STAAR assessments will continue to be graded entirely by human scorers.

TEA claims that the automated essay scoring process was already tested against the spring 2023 STAAR results and proved as accurate as human scorers, according to the DMN. Yet the agency has had technological mishaps in the past, such as in 2016, when thousands of students had their exam scores voided after difficulties logging in and staying online during testing.

As such, some have been skeptical of the scoring engine’s reliability, especially since it awarded zero points to 8 in 10 written responses on high schoolers’ English II exam this fall. Chris Rozunick, director of TEA’s assessment development division, told the DMN that the dip in scores was not related to the automated scoring process.

“At the very least, they should do a pilot or study for a pretty long time,” said State Board of Education member Pat Hardy (R-Fort Worth), according to the DMN. “It’s an area that needs more exploration. … It just seems so cold.”

The new scoring process comes not long after a complete redesign of the STAAR exam, as previously covered in The Dallas Express. The new format, released in spring 2023, capped multiple-choice questions at 75% of the exam in favor of open-ended questions to better reflect classroom learning as well as students’ cross-curricular knowledge.

TEA data showed that less than half of students statewide scored at grade level or above on the 2022-2023 STAAR exam, as covered in The Dallas Express. Only 44% of students in Dallas ISD, which has struggled academically, hit that threshold the following academic year. Meanwhile, the accountability scores given to school districts for that year have not been released due to ongoing litigation over TEA’s new formula for calculating performance grades.
