Unpacking the Data Blackout: Why Washington’s OSPI Took Down Student Test Scores and What It Means for Our Kids


A recent move by Washington’s Office of Superintendent of Public Instruction (OSPI) to pull years of student test score data offline has ignited a fierce debate about government transparency and accountability in education, with critics suggesting it masks a deeper problem of student underperformance.

The sudden disappearance of historical student test score data from the OSPI website has sent ripples through the state’s education community, sparking urgent questions about transparency and accountability. While OSPI attributes the removal to outdated information, a prominent conservative think tank suggests a more concerning motive: obscuring declining student performance.

The Official Explanation vs. Mounting Criticism

According to OSPI Communications Director Katy Payne, the student test score data was removed because it was “multiple years out of date and had not been updated since the amendment to our state’s federal Every Student Succeeds Act (ESSA) plan.” This amendment, she explained, extended the timeframe for meeting specific educational targets.

However, this explanation has been met with significant skepticism from Todd Myers, Vice President for Research at the Washington Policy Center. Myers contends that his persistent inquiries about updating the data directly led to the agency “scrubbing” the information entirely. He views this action as a clear example of officials potentially hiding inconvenient data when Washington’s children are falling short of educational goals.

The Battle for Transparency: Behind the Data Removal

The controversy began with specific achievement targets set under the ESSA plan. In 2017, ambitious goals were established for 2027, aiming for 90% student proficiency across various subjects. While the pandemic prompted a reasonable extension of these targets to 2029, the ability to track progress remained critical. Myers noted that a dedicated webpage, which used to show performance against these targets, displayed data only up to 2018-2019.

His attempts to get clarification from OSPI were met with initial assurances that current data was available elsewhere, followed by silence. After repeated emails and a public records request, Myers observed a pivotal shift: the webpage dedicated to tracking progress against these crucial goals was completely removed. Katy Payne stated that any delay in her response was unintentional, attributing it to requests “slipping through the cracks” and an expectation that his public records request would yield the necessary information.

The Context-Free Data Problem

Myers argues that the currently available data lacks the necessary context for meaningful evaluation. For instance, the 2024-25 report card indicates that 70.9% of students scored at “Levels 2, 3, and 4” in English Language Arts, while only 63.3% did so in Math. These figures fall significantly short of the 90% proficiency goal previously displayed on the now-removed targets page.

Adding to the complexity, recent releases of OSPI’s student test score data have seen adjustments to proficiency measures, making accurate year-over-year comparisons difficult. This change further hinders the ability to clearly gauge improvement or decline in student outcomes. The Washington State Board of Education publishes comprehensive information on academic standards and assessments, which can offer deeper context on these proficiency levels.

Myers steadfastly maintains that the data was removed deliberately to obscure the reality that Washington students are not meeting crucial improvement goals. “When the data are inconvenient to the government, they simply hide the data,” he remarked, emphasizing a pattern he frequently observes.

Spending vs. Outcomes: A Broader Critique

This debate extends beyond data transparency to a fundamental critique of education spending. Myers points out that despite significant increases in education funding and competitive teacher compensation in Washington, student outcomes, as measured by the National Assessment of Educational Progress (NAEP) scores, have actually declined during the same period. This trend is a national concern, with detailed reports on state-by-state NAEP performance available from the National Center for Education Statistics (NCES).

He argues that this indicates a failure in current educational strategies and highlights the ineffectiveness of simply “throwing more money at the problem.” For Myers, the core issue is holding government and politicians accountable for how public funds are spent and for the promises made regarding student achievement.

A History of Data Challenges in Education

The concerns raised by Myers echo historical challenges in ensuring integrity and transparency in educational data. Across the country, the pressure to demonstrate improved test scores has, at times, led to unfortunate incidents of data manipulation.

One notable case in Washington occurred at Beacon Hill International School in Seattle. In a startling discovery, district officials noticed “improbably high scores” and “heavy erasure marks” on tests submitted. A state investigation confirmed a “huge number of erasures of incorrect answers” that were subsequently replaced with correct ones. This tampering led to the invalidation of the entire school’s state test results—a first for Washington—and prompted Seattle Public Schools to launch an independent investigation.

Similarly, the infamous Atlanta Public Schools cheating scandal involved administrators pressuring teachers to alter student test scores, often in connection with performance-based bonuses for staff. While Seattle teachers are not paid under a comparable bonus structure, principals and administrators there do receive performance-based bonuses, creating a potential, albeit indirect, incentive for score manipulation.

The introduction of new, computer-based tests like the Smarter Balanced Assessment, aligned with Common Core State Standards, aims to reduce the odds of physical tampering seen in earlier paper-based exams. However, the underlying pressure to meet targets and demonstrate progress remains a constant factor in the education landscape.

What This Means for Washington Education

OSPI’s removal of accessible, comparable student test data is more than an administrative tweak; it has real implications for every stakeholder in Washington’s education system.

  • For Parents: Without clear, historical data, it becomes significantly harder for parents to assess the performance of their local schools and advocate effectively for their children’s educational needs.
  • For Students: A lack of transparent data can obscure areas where students are struggling, potentially delaying crucial interventions and support.
  • For Policymakers and Educators: Reliable and accessible data is the bedrock of informed decision-making. Its absence hinders the ability to evaluate existing programs, identify successful strategies, and allocate resources effectively.
  • For Accountability: The core of educational accountability lies in the ability to measure progress against stated goals. When data is removed or presented without context, it erodes public trust and makes accountability challenging to enforce.

The conversation around OSPI’s actions underscores the vital importance of data transparency in maintaining public confidence and ensuring that Washington’s education system truly serves its students. As concerned community members, it is crucial to continue advocating for clear, accessible, and historically contextualized data to hold our educational leaders accountable and ensure a brighter future for all Washington children.
