
    More students got zeros on STAAR writing questions. It isn’t just a Texas problem.

By Silas Allen


There are a number of theories on why so many Texas students received zeros on certain question types on this year’s state test. But one thing is certain: The issue isn’t unique to Texas.

In Fort Worth and across the state, the number of students who received zeros on constructed response questions on this year’s State of Texas Assessments of Academic Readiness, or STAAR, exams grew substantially over last year. That growth follows the rollout of a newly redesigned state test and a shift to automated scoring. Texas Education Agency officials say they’re confident the new exam still offers an accurate look at how students are doing.

    But a national testing expert says many states across the country are seeing the same trend.

    “It’s not a Texas problem. It seems to be a national problem,” said Scott Marion, executive director of the National Center for the Improvement of Educational Assessment.

    TEA rolls out STAAR redesign, online grading

    In December, the state education agency rolled out an “automated scoring engine,” a computer-based grading system, to evaluate STAAR exams. Officials said the grading engine would score three-quarters of all essay questions, and human graders would continue to handle the rest. The agency had used computer-based grading for closed-ended questions like multiple choice for years.

    The shift to automated grading isn’t the only change the agency has made to the test over the past few years. Last year, the agency rolled out a newly redesigned state test that featured fewer multiple choice questions and more constructed response prompts. State education officials said the new test was designed to more closely resemble the instruction students receive in class.

    The redesigned test also features new kinds of questions that students had never dealt with before, including extended constructed response questions that ask students to draw on information from a reading passage to craft an in-depth response to a prompt.

    This year, the number of Texas students in grades 3-8 receiving a zero on constructed response questions jumped nearly 20% from last year’s total, which itself represented a sharp uptick over the number of zeros students logged before the STAAR redesign. About 35% of test-takers across the state received a zero on a constructed response question on last spring’s exam, TEA records show.

In the Fort Worth Independent School District, about 54% of test-takers received a zero on a constructed response question this year, according to TEA data. That’s 14,280 students in grades 3-8, a 12% uptick over last year.

    Students receiving a zero on constructed response questions don’t necessarily fail the entire exam. About 76.5% of students across the state passed this year’s state test, compared to about 78.1% last year, according to TEA data.

    Rise in zeros is a national issue

    Marion, the testing expert, said the growing number of zeros on constructed response questions is a major topic of conversation in his meetings with clients. Marion’s center works with education officials in 35 states and about 40 large school districts, and many are seeing the same thing, he said. Although there’s little reliable data to explain the trend, Marion said he suspects some part of it may come from a growing number of test-takers simply refusing to take the questions seriously.

    “My hunches are that we’re dealing with some post-pandemic residue of kids saying, ‘I’ve been through the pandemic. What are you going to do to me now?’” he said.

If states want to get more students to do a better job of engaging with test questions, they need to do a better job of writing them, Marion said. One way of doing that is by using a strategy educators call a cognitive lab. Test developers give students a series of sample questions and ask them to talk through their thought process as they respond. If the students have trouble understanding what they’re being asked to do, developers can go back and clarify the question. If they understand it but won’t engage with it, the developers can talk with them about how they could make the prompt more interesting.

    Although the trend in Texas began just after the state shifted to computer-based grading, Marion said he doubts the two changes are related. While no automated scoring system is perfect, Marion said the one Texas adopted is generally reliable. Also, he said, the uptick in zeros seems to exist in both human-graded and computer-graded tests.

    TEA: STAAR difficulty doesn’t vary by year

    Chris Rozunick, director of the assessment development division at TEA, cautioned against making too much of the uptick in zeros on constructed response questions. The agency always works to make the STAAR consistent in its level of difficulty from one year to the next, but only at the overall test level, she said. Individual sections within the test can vary, she said — if one section is harder one year, another section will be easier.

    There are a number of ways a student could fail to earn any points on a constructed response question, including writing a response that answers the question but lacks an organizational structure or one that doesn’t include any relevant evidence, according to the agency’s grading rubric. But usually, when a student receives a zero in that section, it means that they didn’t demonstrate that they could answer the question directly, Rozunick said.

    That doesn’t always mean the student left the question blank, Rozunick said. Often, students offer responses that are clear, concise and well-constructed, but don’t deal with the question at hand, she said. Under the state’s scoring scale, those students would receive no points for such a response, she said.

    “As soon as a student shows some knowledge to answer, to interact with and engage with a question, they will generally start to see points,” she said.

Although she said there’s probably some merit to the theory that students nationally aren’t willing to engage with constructed response questions, Rozunick said she suspects that the uptick in zeros in Texas has more to do with changes to the test format. The new extended constructed response questions are a major departure from the kinds of questions students have seen on the test before, she said.

    “In the past, we generally asked kids to write ‘how was your summer?’ or ‘what’s your favorite hobby?’” she said. “And when you’re asking someone to take a piece of text and analyze it and create an answer, that, of course, is a very different set of skills.”

    The good news, Rozunick said, is that the issue most likely has more to do with students being unfamiliar with that kind of test question than it does with their ability to answer it. Over the coming years, as students get more practice with the new question types, she expects to see those scores begin to bounce back.
