School rankings don't tell you what you need to know

This is part 2 of a series on education in DC. See part 1.
When choosing between public schools, or deciding whether to send a child to private school or move to the suburbs, many parents look at the test scores listed online for each school. But DCPS, like many state school systems, reports just the percentage of students who scored "proficient" or higher in math and reading. That number doesn't actually tell parents what they need to know.
Take 2 children going to 2 different schools whose teachers are equally good. One year, a number of other kids, who happen to be lower-achieving, move from one school to the other. One school's proficiency numbers go up, while the other's go down. But for all the other kids, nothing may have changed.
A parent considering those 2 schools, however, would suddenly think one was better or worse than the other, and make choices on that basis. If those other kids influence things like classroom discipline, it might matter, but it could well be that both schools remain just as good as they were before. In short, the numbers are misleading.
How much will a school help your child?
Every parent wants the best for his or her child. A parent who can choose among places to live, and is choosing based on the schools, will look at these test score rankings, but the rankings don't really answer the question such a parent is asking.
What they need to know is simple: if I send my kid to this school versus that, will he or she come out more high-achieving? Or, another way to look at it is: if one could clone a kid and send him or her simultaneously to every school in the area, would the kid have more proficiency at the end at some of the schools versus others?
It doesn't inherently matter if the other kids at one school achieve more than those at another; all a parent really cares about is his or her own kid. Parents don't need the percentage of proficiency. What they really want is what educators call a "value added" measure.
We could broadly say that there are 3 components of a kid's achievement:
- The influence of parents and others at home
- The quality of the teaching and instructional resources at the school
- The influence of other students
#3 matters because the other students in a class can affect each child's learning. For example, a high-achieving student might learn less if there are a lot of low-achieving students in the class, because the teacher has to spend more time on basic concepts. Or, if most kids bully and mock higher-achieving students, many kids may pretend not to know answers in class.
On the other hand, having a higher-achieving environment can push lower-achieving kids to work a little harder, perhaps, unless things are too hard, and a peer environment that rewards hard work can also positively reinforce a student's efforts.
If you look at a DCPS school profile, however, it doesn't separate #1, #2, and #3. A school's higher proficiency could simply mean that its kids come from more privileged backgrounds (factor #1), and say nothing about the experience at the school. It might be that sending your child to that school instead of another has no actual benefit.
In fact, we can estimate some of this. Race does not equal income, but at DCPS the two are very highly correlated. While Hardy Middle School in Georgetown has lower average proficiency than Deal Middle School, near Tenleytown, if you look only at white students, the proficiency is the same.
Given that a kid's race (or income) is not going to change based on what school he or she goes to, there's no reason to believe the educational experience at one is better or worse. Yet a number of parents in the area don't want to send their kids to Hardy. Is it just a generalized fear of being in a school with poor and minority children?
Rankings can drive segregation
We know that lower-income students, on average, enter school worse off achievement-wise than their higher-income peers. There are many reasons for this. Their parents are less likely to have been able to spend as much time reading to them and teaching them outside of school. They are less likely to have sent their kids to camps and other programs with academic enrichment. They may also have fewer books available at home, and so on.
Moreover, studies have shown that lower-income kids perform worse on standardized tests in general. All of these factors add up to the fact that a school with only higher-income kids might have higher test scores than a school with a mix of incomes even if the intellectual ability and teacher quality are exactly the same.
Assume you have a school with a bunch of terrific teachers that is doing a great job educating its kids. One year, a group of lower-income kids enters the school. Say the teachers do just as terrific a job educating both the existing kids and the new kids, so the existing kids don't lose out at all. Yet the school's average test score will go down.
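To make the arithmetic concrete, here's a minimal sketch of that scenario. The proficiency cutoff and all the scores are hypothetical, not actual DCPS data:

```python
# Hypothetical illustration: a school's proficiency rate drops when new
# lower-scoring students enroll, even though every existing student's
# score is completely unchanged.

PROFICIENT = 70  # assumed proficiency cutoff on a 0-100 scale

def proficiency_rate(scores):
    """Percentage of students scoring at or above the cutoff."""
    return 100 * sum(s >= PROFICIENT for s in scores) / len(scores)

existing = [85, 78, 92, 74, 88, 71]   # current students, all proficient
newcomers = [55, 60, 48, 65]          # new students scoring below the cutoff

print(proficiency_rate(existing))              # 100.0
print(proficiency_rate(existing + newcomers))  # 60.0
```

The teaching didn't change and no existing student's score moved, yet the reported number fell from 100% to 60%.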
This reinforces the fact that we are measuring and reporting the wrong thing. But most people don't necessarily know this, nor do they have better data available, so they'll understandably choose the highest-performing school they can, even if "highest-performing" only really means "school with the fewest lower-income kids."
There are better metrics
Steve Glazerman pointed out some of these same flaws and recommended using a "value added" measure instead. This is the kind of calculation the IMPACT teacher evaluation uses, but DCPS could report an average across all teachers for the school.
Instead of answering the question "what percentage of kids at this school are doing well?", which depends heavily on who goes there, this number would say how much each kid will probably gain from going to the school, which is really what will help parents make choices.
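A toy version of such a calculation, assuming we had each student's score at the start and end of the year (all names and numbers below are hypothetical), might look like this:

```python
# A sketch of a simple "value added" measure: the average per-student gain
# over the year, rather than the share of students above a cutoff.

def value_added(start_scores, end_scores):
    """Average per-student score gain from start to end of year."""
    gains = [end - start for start, end in zip(start_scores, end_scores)]
    return sum(gains) / len(gains)

# School A: affluent intake, high absolute scores, modest gains
a_start = [80, 85, 90, 75]
a_end   = [85, 90, 95, 80]   # every student gains 5 points

# School B: lower-income intake, lower absolute scores, bigger gains
b_start = [50, 55, 60, 45]
b_end   = [60, 65, 70, 55]   # every student gains 10 points

print(value_added(a_start, a_end))  # 5.0
print(value_added(b_start, b_end))  # 10.0
```

On a proficiency ranking with a cutoff of 70, School A would look far better; on this gain-based measure, School B is adding twice as much. Real value-added models like the one behind IMPACT are statistically much more sophisticated, but this is the basic idea.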
Have you used the school ranking data in DC or elsewhere? What would make it more useful?