In article <[EMAIL PROTECTED]>, [EMAIL PROTECTED] (Alan Zaslavsky) wrote:

> Dear Gene,
>
> I share your feelings about MCAS (fortunately my daughter finished high
> school before it came into effect, but that's no consolation for the
> thousands of other kids who are supposed to be deprived of their high
> school diplomas), and had some similar reactions to the article.

<SNIP>

Each of the 1539 schools in MA was given a rating on whether its scores
had gone up: "F" for failed to meet expectations, "A" for approached,
"M" for met, and "E" for exceeded. My second daughter was a 4th grader
in a school that had one of the highest MCAS scores in the state last
year. On 1/9/01, the Department of Education rated her school "F" for
failing to meet expectations. Every school in my town was rated "F."
Unfortunately, the Department of Education has cloaked this evaluation
process in a veneer of mathematical rigor, when the analysis appears to
be anything but rigorous.

> On the statistical issue, there are two ways that this could be done
> that have different implications. If you form categories based on the
> actual score in the previous year, then you have a classic regression
> to the mean situation. The magnitude of the regression to the mean
> effect could be estimated knowing the variances and sample sizes at
> the various schools; it is even possible that allowing a few points
> smaller required advance at the higher-scoring schools might adjust
> for that effect (although I would be surprised!).
>
> Another approach would be to define groups by some other variable
> related to but distinct from outcomes, e.g., inner city versus
> suburbs, percent minority, or percent in poverty. Although the groups
> will differ in their baseline values, there is no regression to the
> mean effect there.
>
> However, either of these analyses begs the question of the potential
> for improvement in different schools. An excellent school may already
> be doing everything it should be doing and have no way to improve,
> while a low-scoring school may have many possible avenues to
> improvement. There may also be issues about the appropriateness of
> the educational strategies that might be adopted. Somebody might
> argue that at a school already doing well on the MCAS, improvements
> could only be obtained by teaching to the MCAS at the cost of less
> investment in AP exams. (I personally have little confidence in the
> educational incentives of the MCAS at any level; I am just laying out
> possible arguments.) Presumably some arguments could be advanced to
> the opposite effect as well.
>
> Best regards,
> Alan Zaslavsky

The local schools are already being forced to teach to the test. I
reviewed my older daughter's science text and thought it was appalling.
There would be a mediocre 10-page discussion of pressure in the ocean
and atmosphere, followed by an inane 10-page discussion of pressure in
the blood system, with little to unite the two beyond a term, pressure,
that was very poorly defined in both. I told her teacher that I didn't
envy him having to teach from a book structured so poorly. He said the
book was the best of a bad lot, and that they chose it over one they
preferred because its content was closer to what the MCAS tests. He
said that under their previous model, the earth sciences were covered
as a unified package in one year, with the life sciences in the years
before and after.
However, the MCAS tests both earth and life sciences in one exam, so
they couldn't go a year without covering both with the same text. I
fear that decisions like this are being made statewide.

--
Eugene D. Gallagher
ECOS, UMASS/Boston
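
P.S. For anyone who wants to see Alan's regression-to-the-mean point in
action, below is a minimal simulation sketch in Python. The school count
matches the figure above; the score scale, the variance values, and the
two-year, noise-only setup are assumptions chosen purely for
illustration, not a model of the Department's actual procedure.

    import random
    import statistics

    random.seed(1)

    N_SCHOOLS = 1539   # number of rated MA schools, per the post
    TRUE_SD = 10.0     # spread of true school means (assumed)
    NOISE_SD = 5.0     # sampling noise in an observed yearly mean (assumed)

    # Each school has a stable "true" quality; no school actually changes.
    true_means = [random.gauss(230.0, TRUE_SD) for _ in range(N_SCHOOLS)]

    # Observed scores in two consecutive years: truth plus independent noise.
    year1 = [m + random.gauss(0, NOISE_SD) for m in true_means]
    year2 = [m + random.gauss(0, NOISE_SD) for m in true_means]

    # Split schools into quartiles by their year-1 (baseline) score.
    ranked = sorted(range(N_SCHOOLS), key=lambda i: year1[i])
    quartiles = [ranked[q * N_SCHOOLS // 4:(q + 1) * N_SCHOOLS // 4]
                 for q in range(4)]

    for q, idx in enumerate(quartiles, start=1):
        gain = statistics.mean(year2[i] - year1[i] for i in idx)
        print(f"quartile {q} by year-1 score: mean year-2 'gain' = {gain:+.2f}")

    # Under a naive "scores must go up" rule, the share of top-quartile
    # schools that would be rated "F" purely because of sampling noise:
    top = quartiles[3]
    f_share = sum(1 for i in top if year2[i] < year1[i]) / len(top)
    print(f"top-quartile schools whose observed score fell: {f_share:.0%}")

    # Theory: with reliability r = TRUE_SD**2 / (TRUE_SD**2 + NOISE_SD**2),
    # E[year2 - year1 | year1] = (r - 1) * (year1 - grand mean), so the
    # highest scorers "fall" and the lowest "rise" with no real change.

Run as written, the bottom quartile "improves" and the top quartile
"declines" by a few points each, and (on these assumed variances) a
clear majority of top-quartile schools show a year-over-year drop, even
though every school's true quality is constant. That is exactly the
trap Alan describes for any scheme that grades schools on change from
their own prior-year score.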
