Saturday, December 11, 2010

Charter School Accountability

A common criticism leveled at charters is that they can pick and choose which students come to their school, and they can remove students more easily than traditional public schools can. The argument goes that this artificially inflates their performance: because they have that ability, they are not doing the same job that traditional public schools are.

This criticism can be seen in the first comment on The Notebook article responding to Marc Mannella's publication of the KIPP Open Book. The commenter writes:

"Marc

Thank you for your post. I like the concept of an Open Book. Maybe you can do a follow-up post where you share both your struggles and strategies to retain your students. I noticed in your Open Book Executive Summary that KIPP retains 87%-90% of your students. What are the reasons you lose your students? What does the quantitative and qualitative data say about those students? What percentage of students leave because KIPP is not the right educational fit?

There is this perception that some charter schools dump students that don't fit their mold.

It is great that you are willing to be transparent to settle these perceptions."


While this commenter took the time to navigate to the Open Book, the commenter did not take the time to read the graph that KIPP clearly presents right next to the retention data: in 2008-09, KIPP lost 2% of its students because they moved and 10% for "other reasons," defined as "e.g. transportation, parents deciding the school was not a good fit for their student." Nowhere in the Open Book, interestingly enough, can you find total enrollment figures, so you cannot determine how many students that is. Even so, this is something of a non-starter-- 10% leave the school in one year for reasons other than moving? That does not seem like an enormous amount, especially since we have no baseline to compare it to. A more pointed question would be to break this 10% down against overall enrollment figures, especially because KIPP says that it tracks all of this (footnote 3). That would please the critics and me.

Otherwise, aside from some abysmal graphs (the average growth chart purports to show a single number spanning the two years, yet it displays 5th- and 8th-grade scores. Are these just the average scores for each grade, with the growth measured between the two? Then what do the individual bars represent? And all of this contradicts the subtitle, "Average of 2004 - 2010 PSSA Data," which suggests they subtracted two averages), the document represents an excellent step toward quieting critics. Public schools don't make their teacher retention data available-- KIPP does, and it throws in some resumes and administrator pay scales to boot.