QUESTIONS ACCURACY OF "IMPROVED" 1998 SCORES
Date: Sat, 12 Sep 1998 23:13:05
From: "Lynn M Stuter"
Subject: WASL Scores
Okay folks, here is the first snafu in the recently released statewide
assessment scores. According to a Seattle Times article of 9/8/98, the
scores were figured differently this year. It seems that Ms Bergeson
suddenly came to the realization that schools shouldn't be penalized
for those students who didn't take the test last year. In short, last
year's scores were figured using the number of children ELIGIBLE to
take the test -- i.e., the number of children enrolled.
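To make the arithmetic concrete, here is a minimal sketch in Python.
The counts are made up for illustration; only the two formulas -- pass
rate over enrollment versus pass rate over the number actually tested
-- come from the article's description.

    # Hypothetical counts for one school; only the two formulas
    # below reflect the two scoring methods described above.
    enrolled = 1000   # children eligible to take the test
    tested = 930      # children who actually sat for the test
    passed = 420      # children who met the standard

    rate_vs_eligible = 100 * passed / enrolled  # last year's method
    rate_vs_tested = 100 * passed / tested      # this year's method

    print(f"Over eligible enrollment: {rate_vs_eligible:.1f}%")  # 42.0%
    print(f"Over actual number tested: {rate_vs_tested:.1f}%")   # 45.2%

Because the number tested can never exceed enrollment, the second
figure is always at least as high.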
It is indicated, in this same Seattle Times article, that last year's
scores have been "adjusted" so they are comparable to this year's
scores.
It is also noted, in this same Seattle Times article, how much the
scores increased or decreased from last year's. But on checking, they
figured the point differentiation using last year's scores as computed
from the number eligible to take the test against this year's scores
as computed from the actual number who took the test. This is
comparing apples and oranges. The point spread also looks much better
that way than it would against last year's "adjusted" scores.
When the assessment scores were released last year, I pointed out that
the scores were figured using the enrollment figures rather than the
actual number who took the test, thus skewing the scores and making
them look worse than they actually were. Following are the published
scores from last year, figured using the eligible enrollment, and the
same scores figured using the actual number that took the test.
Last Year's Scores:

               Eligible Enrollment   Actual # Tested
  Math                 21.1               21.9
  Reading              47.0               50.3
  Writing              41.7               47.2
  Listening            61.3               64.7
Now this year's scores, figured using the actual number tested:

               Actual # Tested   Point Differentiation
  Math                31                 +9.1
  Reading             56                 +5.7
  Writing             37                -10.2
  Listening           71                 +6.3
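Using the Reading row from the tables as a worked example (the other
subjects work the same way), a short sketch of the two comparisons:

    # Reading figures taken from the tables above.
    last_year_vs_eligible = 47.0  # last year, over eligible enrollment
    last_year_vs_tested = 50.3    # last year, over actual number tested
    this_year_vs_tested = 56.0    # this year, over actual number tested

    # Apples to oranges: mixes the two methods, inflating the gain.
    print(f"{this_year_vs_tested - last_year_vs_eligible:+.1f}")  # +9.0

    # Apples to apples: the same method in both years.
    print(f"{this_year_vs_tested - last_year_vs_tested:+.1f}")    # +5.7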
When the test scores were released this year, no mention was made of
the following criteria, each of which directly affects the reliability
of the test and of the test scores:
1. The number of children who were given "special accommodation," meaning
that they had helpers who read them the questions and wrote down the answers.
2. The number of questions from last year's test that also appeared on
this year's test. At a meeting in Spokane this past spring, Bergeson
noted that a body of questions that most children answered correctly
last year would be included on this year's test. This skews the scores.
3. The standards used to figure pass/fail this year. Were they the
same as those used last year, i.e. (checked arithmetically in the
sketch following this list):
Math 40 of a possible 60 points (67%)
Reading 31 of a possible 43 points (72%)
Writing 9 of a possible 12 points (75%)
Listening 7 of a possible 10 points (70%)
4. The caliber of the questions used on this year's test as compared
to last year's.
5. The rubrics used to score this year's test as compared to last
year's.
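A quick arithmetic check of the cut scores listed in item 3; the point
totals come from the list above, and the percentage is simply points
required over points possible:

    # Cut scores from item 3: (points required, points possible).
    standards = {
        "Math": (40, 60),
        "Reading": (31, 43),
        "Writing": (9, 12),
        "Listening": (7, 10),
    }
    for subject, (required, possible) in standards.items():
        print(f"{subject}: {required}/{possible}"
              f" = {100 * required / possible:.0f}%")
    # Math 67%, Reading 72%, Writing 75%, Listening 70%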
Because the scoring of the Washington Assessment of Student Learning
is subjective, each of these criteria directly affects the
comparability of the scores from last year and this year. This is why
the test is not reliable.
Lynn M Stuter