STUTER QUESTIONS VALIDITY OF ASSESSMENT "IMPROVEMENT"
Here is another citizen questioning the near-universal and miraculous
"improvement" on the Washington state 4th grade assessment.
Statewide, math improved from 80% flunking to only 70% flunking. If a
year's notice isn't enough to prove that 100% of students can pass
this test, how much time do they need?
Nearly every school from best to worst showed dramatic improvements.
Did the basket-case schools really change their programs? I doubt it.
Later study of the KIRIS showed that claims of improvement were largely
bogus - teachers were teaching to the test, and the test itself had
been revised to be easier. It is entirely possible that some of the
questions I objected to as being "above grade level" were removed;
indeed, the samples originally posted on the Commission on Student
Learning website have been withdrawn, possibly because of problems
with the questions. It is also possible that problems were found with
some of the 1997 pilot test questions that would remove them from
future tests.
In the first year of California's CLAS test, the number of students
allowed to score the top score was deliberately set to zero; the
next year, a small percentage were allowed to score that high, and
they claimed "improvement".
------- Forwarded Message Follows -------
Date sent: Thu, 10 Sep 1998 12:38:10
From: "Lynn M Stuter"
Subject: 4th Grade Assessment
The following letter was sent to House and Senate Republicans. Please
contact your legislators, asking them to investigate. LynnS
>We are hearing that the 4th grade statewide assessment scores rose this
year. All must be wonderful and good, and education reform must be working.
>Well, before anyone gets too jubilant about all of this, I think there are
some questions that should be answered:
>1. Were the test scores figured in the same manner as last year? The
indication that I received from Bergeson's presentation at the
Accountability Task Force Hearing in Spokane is that they were not, that
children who were included last year were not included this year.
>2. Were the same "standards" used as last year -- the same "cut scores"
for pass or fail?
>3. Were the same rubrics used this year as last year to score the tests?
>4. Were the questions asked this year of the same caliber (number of hard
questions, number of easy questions) as last year?
>5. Were there any questions on the test this year that were also on the
test last year, and, if so, how many? Last Spring, while in Spokane,
Bergeson told a group of teachers that there would be a core group of
questions on the test this year that most of the children got right last year.
>6. Were the children given any help on the test? Our information is that
some children were given "helpers" -- people who read the questions to the
children and wrote down the answers.
>Every question asked here would affect the scores one way or another.
Before we get too jubilant about these test scores, I would encourage
Legislators to get the answers to the above questions.
>Lynn M Stuter