
    Texas Educators Blame Test for English Learners’ Low Test Scores

    By Keaton Peters


    This article was originally published in The Texas Tribune.

    English-learning students’ scores on a state test designed to measure their mastery of the language fell sharply and have stayed low since 2018 — a drop that bilingual educators say might have less to do with students’ skills and more with sweeping design changes and the automated computer scoring system that were introduced that year.

    English learners who used to speak to a teacher at their school as part of the Texas English Language Proficiency Assessment System now sit in front of a computer and respond to prompts through a microphone. The Texas Education Agency uses software programmed to recognize and evaluate students’ speech.

    Students’ scores dropped after the new test was introduced, a Texas Tribune analysis shows. In the previous four years, about half of all students in grades 4-12 who took the test earned the top score on the speaking portion, which they needed to be considered fully fluent in English. Since 2018, only about 10% of test takers have gotten the top score in speaking each year.


    Passing TELPAS is not a graduation requirement, but the test scores can impact students. Bilingual educators say students who don’t test out of TELPAS often have to remain longer in remedial English courses, which might limit their elective options and keep their teachers from recommending them for advanced courses that would help make them better candidates when they apply for college.

    The way the state education agency currently tests English learners’ skills frustrates some educators who say many of their students are already fully capable of communicating in English but might be getting low marks in the test because of the design changes.


    “You’re putting [students] in an artificial environment, which already reduces the ability of students to give you natural language,” said Jennifer Phillips, an educator with two decades of experience teaching bilingual students in Texas. “It’s a flawed system.”

    TELPAS scores also account for 3% of the grades the TEA gives school districts and campuses in its A-F accountability rating system. Though TELPAS scores represent only a small portion of that rating, they might carry more weight at a time when districts have grown increasingly worried about how the state evaluates their performance. Several districts have sued TEA to block the release of the last two years of ratings, arguing that recent changes to the metrics made it harder to get a good rating and could make them more susceptible to state intervention.

    TEA’s use of an automated scoring engine to score portions of TELPAS has also come under scrutiny after the agency used the same tool to evaluate short-answer and essay questions in this year’s State of Texas Assessments of Academic Readiness, the state’s standardized test that all students in grades 4-12 take to measure their understanding of core subjects. Educators are wary of using an automated system to score STAAR and list it as one of their complaints in the districts’ latest lawsuit against the state.

    Testing English learners’ skills

    When students enter a public school in Texas, they are classified as “emergent bilingual” if they indicate they speak a language other than English at home and fail a preliminary English assessment. About a quarter of Texas students have that designation.

    Federal law requires Texas to assess English learners’ progress regularly. Texas is one of only a handful of states that developed its own test instead of adopting the exam used in other parts of the country.

    Each spring, about a million emergent bilingual students in Texas public schools take the TELPAS exam, which consists of four parts: listening, reading, writing and speaking.

    Before 2018, teachers with TELPAS training would administer the test at students’ schools. Listening and reading evaluations were, and still remain, multiple-choice sections measuring student comprehension. For writing, teachers would gather and assess a sample of students’ work in the classroom throughout the school year. For speaking, teachers would talk to students in one-on-one evaluations or fill out a rubric based on their observations of students’ English fluency throughout the year.

    When the TEA moved the test online, it changed the testing environment and scoring method. The change sought to standardize the test and make the results more reliable, an agency spokesperson said. The automated scoring technology helped deliver speaking assessment results more quickly. Last year, the automated scoring system started evaluating students’ written responses.

    In each of the four assessment categories, students get a score of beginner, intermediate, advanced or advanced high. Students have to continue taking the test each year until they score advanced high in at least three categories; they may score advanced in the other one and still pass. Before this year, students had to score advanced high in every domain.

    Several bilingual educators the Tribune spoke with for this story said the low test scores students have received since the test was changed do not reflect their actual performance in the classroom, adding that many English learners communicate better than their scores suggest. While English-learning students’ scores have improved on the STAAR test since 2021, the TELPAS scores — particularly in speaking — have remained low since the test was changed.

    “It is a little disheartening,” said Ericka Dillon, director of bilingual education and English as a Second Language courses at Northside ISD in the San Antonio area. The district has about 14,500 emergent bilingual students, a significant number of whom are proficient in English but struggle to reach advanced high on the TELPAS assessment, she said.

    “They’re doing the best that they can, but they still won’t be able to meet that criteria,” Dillon said.

    In response to a Tribune data analysis showing that the share of students earning passing TELPAS speaking scores dropped after TEA redesigned the test and introduced the automated scoring system, an agency spokesperson said, “It’s not uncommon to see performance adjustments when student performance is evaluated in a standardized manner across the state.” The spokesperson also noted that speaking and writing are by nature more challenging than listening and reading.

    The TEA has vigorously defended its automated scoring engine, rejecting comparisons of the technology to artificial intelligence. The agency has said humans oversee and train the system as well as monitor its results. The TEA said a technical advisory council has approved the technology, and when the program encounters a student response its training does not know how to handle, it routes the response to a human scorer.

    This year, the TEA said that at least 25% of the TELPAS writing and speaking assessments were re-routed to a human scorer to check the program’s work. That number oscillated between 17% and 23% in the previous six years, according to public records obtained by the Tribune.

    Score changes after human reviews

    One of the reasons educators are skeptical of TELPAS’ automated system is how scores sometimes change when they ask for a review. When a review is requested, humans rescore the speaking and writing assessments.

    Last year, 9% of the TELPAS speaking assessments that TEA reviewed got a higher score; that number was 13% the year before. The automated system initially scored more than 95% of the assessments that improved after a second look, public records show.

    Spring Branch ISD officials said the percentage of assessments that improved after requesting a rescore was even higher at their district. They sent more than 800 speaking assessments for rescoring in 2022, and more than a third got a better score after they were reviewed. The next year, about half of their submissions improved after rescoring, officials said.

    “If the evidence from our rescoring submissions is any indication, the system leaves a lot to be desired for its accuracy,” said Keith Haffey, executive director of assessment and compliance at Spring Branch ISD.

    It’s unclear how many assessments would lead to a better grade after a second look since most results go unchallenged. The number of rescored assessments each year is less than 1% of the total TELPAS tests administered. Educators say they have to weigh costs and time constraints when deciding whether to request a rescore. Reviews are free if they result in a better score; if they don’t, schools have to pay $50 per rescoring request.

    In addition, educators say it’s not easy to decide which results to challenge because they haven’t had access to students’ audio responses. This contrasts with STAAR results: Written student responses are readily available online to districts.

    “If we can’t hear how they did on TELPAS, we can’t say if this is where they really are or not,” Dillon said.

    The TEA says district testing coordinators can request listening sessions, but some educators said the agency’s director of student assessments told them only parents can request the files. A TEA spokesperson said that person misspoke.

    In response to district feedback, the TEA spokesperson said districts and parents will have easier access to all TELPAS responses starting in the 2024-25 school year.

    Not an “accurate reflection”

    Edith Treviño, known affectionately as Dr. ET, used to be the ESL specialist for the TEA’s education service center in Edinburg. Now she runs a private consulting practice helping students pass TELPAS.

    Treviño said she worries that the automated scoring system penalizes students who are fluent in English but speak with an accent, mix in a few words from their native tongue or stray from using academic language.

    “Children are not supposed to answer like regular people, according to TELPAS,” she said.

    To score advanced high in the test’s speaking portion, students must respond to each prompt with answers that last 45 to 90 seconds. They have two chances to record a response and they need to use academic language fitting their grade level.

    But Treviño said the prompts are often simple and do not require long answers. In a recent TikTok video, she said some questions were like asking students to identify an orange.


    Because passing TELPAS is not a graduation requirement and scores only account for a small portion of campus and district accountability ratings, some schools do not prioritize helping students prepare for the test. But the results can affect students’ educational journey.

    Many school districts enroll English-learning students in ESL courses, which can prevent them from taking certain electives and advanced courses because of scheduling conflicts. Teachers or staff might also hesitate to recommend a student to advanced courses if they are still taking ESL courses, Phillips said. Those advanced courses, especially at the high school level, are crucial to being competitive in college admissions.

    She said any school policies that keep English learners from participating in advanced courses would amount to language-based discrimination. Nevertheless, she said it’s a common practice she’s observed in her career as an educator and while studying for her doctorate in education.

    “It’s not in the law, but it’s in practice,” Phillips said.

    Not being able to test out of TELPAS can also affect students’ experience in school. Kids who fail the test could internalize the failure, which in turn can make them vulnerable to further academic struggles, Phillips said.

    “What this does to children’s self-esteem is horrible,” Treviño said, particularly for students who can speak English well but have test results that tell them they are not proficient.

    Carlene Thomas, the former ESL coordinator for the TEA who now is the CEO of an education consulting company, said she would like to see the TEA use more sophisticated tools that enable more conversational student responses to ensure TELPAS is “meaningful in how students interact socially and with content material.”

    She added that educators should also help students by giving them more opportunities to practice speaking English during class, relying less on direct translation and ensuring they understand the stakes and structure of the test.

    But as of now, she said, “TELPAS is not giving us an accurate reflection of where our students are.”



    This article originally appeared in The Texas Tribune at https://www.texastribune.org/2024/08/13/texas-telpas-bilingual-students-test-scores/. The Texas Tribune is a member-supported, nonpartisan newsroom informing and engaging Texans on state politics and policy. Learn more at texastribune.org.
