Reflecting on ACCESS Test Scores and WIDA: A Deeper Dive

By: Mirla Rodriguez

After encountering numerous concerns from fellow multilingual learner (ML) teachers about ACCESS test scores and the WIDA framework, I initially dismissed my own apprehensions as mere personal bias against the system. However, connecting with ML educators across various states through social media prompted a more thorough investigation into the true nature and impact of ACCESS tests.

This article stems from that research and is designed to raise thoughtful questions for readers to consider. By delving into these issues, I hope to foster a deeper understanding and reflection on the challenges and implications surrounding ACCESS testing.

Educators have raised several issues with ACCESS test scores and how they are used in decision-making. Carolyn N. Waters, Ph.D., addresses these concerns in “Policy and Practice Brief 6: Considerations for Using ACCESS Test Scores in Decision-Making,” drawing on a survey conducted shortly after the 2019 Virginia state ACCESS testing period.

The survey of 273 K-12 English Learner educators in Virginia highlighted several potential threats to the validity of the ACCESS test, with reliability being a key concern. Reliability, or the consistency of scores across different settings, was questioned because of score discrepancies between the paper and electronic versions of the test, inadequate and noisy testing environments, and inconsistent test preparation practices. Preparation varied significantly: some schools offered none, while others provided targeted or extensive preparation using materials from teachers or from WIDA, the organization behind the ACCESS test. Together, these factors threaten the test’s reliability and validity and raise concerns about its usefulness in educational decision-making.

In an insightful article titled “8 Reasons Not to Take Student WIDA ACCESS Scores Too Seriously” by Madeleine Clays (2024), the author outlines several reasons to approach these scores with caution. Here are the key points she highlights:

  1. Unfamiliar Test Administrators: Students are often assessed by strangers, which can affect their comfort levels and performance.
  2. Unfamiliar Testing Environments: The WIDA ACCESS assessments frequently take place in unfamiliar settings, which can be unsettling for students.
  3. Distractions During Testing: There are often numerous distractions present during the testing period, which can hinder students’ ability to focus.
  4. Mid-Year Timing: The assessment is conducted halfway through the school year, which may not reflect students’ true progress.
  5. Post-Holiday Testing: Testing typically occurs right after a two-week break, when students might not be fully re-engaged with their school routine.
  6. Limited Computer Experience: Many English learners have limited experience with computers, which can affect their test performance.
  7. Competing School Events: Other events at school can compete for students’ attention, making it difficult for them to concentrate on the test.
  8. General Test Anxiety: Many students do not perform well in test scenarios, which can lead to scores that do not accurately represent their abilities.

These points suggest that while WIDA ACCESS scores can provide some insights, they should not be the sole measure of a student’s language proficiency or academic potential.

District departments responsible for English language programs should actively communicate with school leaders and English language educators to address these challenges and to ensure that the conditions for WIDA ACCESS testing are adequate. This will help prepare students for a successful testing experience.

Educators and parents must voice their concerns and support students so they can perform at their best on this assessment, ensuring that their scores more accurately represent their proficiency in the four language domains.

In the article “Are WIDA Test Results Appropriately Reflecting Multilingual Learners’ Language Skills According to ESOL Teachers’ Experiences? Results of a Pilot Study,” published in the GATESOL Journal in 2024 by Emily Patterson and Elke Schneider of the Richard W. Riley College of Education at Winthrop University, the authors examine the effectiveness of WIDA ACCESS testing. Their study contributes to the limited research surrounding WIDA ACCESS assessments. While studies affiliated with WIDA suggest that these tests are useful for making decisions based on standardized testing data, teaching strategies, and teacher evaluations, the authors’ independent research highlights several problems with WIDA ACCESS’s effectiveness. Their findings align with those of Coulter (2017), Waters (2020), and Lopez and Garcia (2020), who emphasize the need for better standardized language proficiency assessments to address the gap between the actual and perceived language skills of multilingual learners.

Despite these concerns, WIDA ACCESS tests remain the primary tool for assessing language proficiency in multilingual learners. Further research is needed to evaluate the authenticity of WIDA ACCESS testing from the viewpoint of ESOL teachers. Expanding on this small-scale study, future research should incorporate diverse geographic, socioeconomic, and cultural perspectives, with broader participation spanning early childhood through high school ESOL experiences and voices from rural, suburban, and urban educational settings. Comparing WIDA testing practices and student exit rates with those of other standardized language tests used in states that do not employ WIDA ACCESS could also provide valuable insights. Overall, the study’s findings have the potential to improve ESOL education and assessment, benefiting both the ESOL field and the wider educational community.

This blog was created to inspire and motivate ML teachers to dig deeper into research on the ACCESS test and WIDA. Numerous case studies can help amplify our voices and help schools understand what lies behind this annual test. It is important to move away from blaming teachers when “students do not exit the ESL program.” The situation is far more complex than it appears.

By understanding these assessments better, we can work together to create more informed educational strategies that truly support our students’ diverse needs.

References

Clays, M. (2024). 8 Reasons Not to Take Student WIDA ACCESS Scores Too Seriously. Owlcation. https://owlcation.com/academia/8-Reasons-WIDA-ACCESS-Student-Scores-May-Be-Invalid

 
