What is value added education? Education Week provides an answer. Value-added education is based on a statistical analysis that measures the effectiveness of schools and teachers by the amount of academic progress students make from one year to the next. It tracks the "value" that schools add to individual students' learning, separate from background characteristics such as race and poverty. My understanding is that instead of comparing one student with others, as norm-referenced tests do, or against an established standard, value-added assessments measure how far the student has progressed by the end of the school year compared with where he or she was at the start of it. I see it as a pre-test/post-test type of analysis. I don't believe that this type of analysis tells us much about whether or not a student will be able to pass the Ohio Graduation Test (OGT).
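To make the pre-test/post-test idea concrete, here is a minimal sketch of the simplest possible "value added" calculation. The scores are entirely hypothetical, and real value-added models use much more elaborate statistical adjustments; this only illustrates the gain-over-the-year concept described above.

```python
# Minimal sketch of the pre-test/post-test "value added" idea.
# Scores are hypothetical; real value-added models are far more complex.

def value_added(pre_scores, post_scores):
    """Average gain from start of year to end of year."""
    gains = [post - pre for pre, post in zip(pre_scores, post_scores)]
    return sum(gains) / len(gains)

# Hypothetical fall and spring scores for five students.
fall = [52, 61, 45, 70, 58]
spring = [60, 66, 57, 74, 63]

print(value_added(fall, spring))  # prints 6.8 (average points gained)
```

Note what this number does not tell you: a student can post a positive gain and still finish the year far below the bar needed to pass a test like the OGT, which is exactly the concern raised above.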
Let me say that I am not a fan of charter schools. I think they serve as a brain drain and a financial strain on the public school system. According to a Vindicator report in October 2007, nearly one out of every four public school pupils in the city was enrolled in a charter school last year. This report also indicated that six of the 12 charter schools in the city of Youngstown are ranked in academic emergency, the lowest rating given by the state. So what the new report from the Ohio Alliance for Public Charter Schools tells me is that someone has designed another measurement technique to justify the continued existence of charter schools. The new measurement technique does not tell us what the child knows, only that the child knows something at the end of the year that he or she did not know at the beginning of the year.
OAPCS said in its report that 57% of the seven Youngstown charter schools it reviewed met or exceeded the state's academic growth expectations, whereas only 38% of the 16 city public schools reviewed by the state met or exceeded them. I am not a statistician, but it appears to me that these statistics are not giving us an accurate picture of what is going on in these schools. In October 2007 there were 12 charter schools in Youngstown, yet only seven appear in the OAPCS figures. Fifty-seven percent of seven is about four schools, so only 4 of the city's 12 charter schools showed that value was added, roughly a third of them. The Youngstown City schools showed value added in about 6 of the 16 schools reviewed, or 38%.
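As a quick sanity check, the percentages quoted above can be turned back into whole-school counts. This is a back-of-the-envelope sketch using only the figures in the text (7 charter schools reviewed, 12 in the city, 16 public schools reviewed), not data from either report:

```python
# Back-of-the-envelope check of the counts behind the quoted percentages.
charters_reviewed = 7    # Youngstown charter schools in the OAPCS report
charters_total = 12      # charter schools in the city as of October 2007
publics_reviewed = 16    # city public schools reviewed by the state

charters_met = round(0.57 * charters_reviewed)  # about 4 schools
publics_met = round(0.38 * publics_reviewed)    # about 6 schools

# Share of ALL city charter schools that showed value added, versus the
# share of reviewed public schools that did.
print(charters_met / charters_total)   # about 0.33
print(publics_met / publics_reviewed)  # 0.375
```

The point of the exercise: a percentage computed over only the reviewed schools looks very different once it is set against the full count of schools in the city.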
It is also important to keep in mind that the total student body of the charter schools in 2007 numbered 2,615, whereas the student body of the city schools was approximately 8,200. Also, I do not believe the charter schools in Youngstown included special ed students, with the exception of the Mollie Kessler charter school. This data does not convince me that charter schools are doing any better than public schools, especially as there are no charter schools at the secondary (high school) level.
Anyone who has ever worked in the education field or dealt with charter schools firsthand knows the charter school administration can influence, to a certain degree, who gets to attend its schools. First of all, students are selected through the use of a lottery, where their names are literally drawn out of a pool of hopefuls. This has a positive effect on the mentality of the students and their parents. If selected, they feel chosen, as if they've won a prize. Where do the "losers" go? The same place most of us did: public school. In addition, at the first sign of trouble, a student can be expelled from a charter school and sent back to public school. Does the public school have that luxury? No, because public schools are where the buck stops.
And after all of that (the best kids with the most involved parents competing in a lottery to get in; no pesky union or certified teachers to deal with or pay adequately; no special ed students; and the luxury of kicking out problem kids at the first hint of trouble), they have yet to outperform public schools on standardized tests.
Charter schools in Youngstown run from kindergarten through the eighth grade. The students then leave the charter school system and go to the public high school system, where high school teachers are thrust into the position of bringing students up to the level they must attain in order to pass the OGT. Simultaneously, teachers must deal with adolescent behavioral problems. To make matters worse, it is implied that teachers' evaluations are going to be tied to the academic achievement of students. Let's face it: there are many students who simply do not care about school or academics, and there is little that teachers, as hard as they try, can do to change that.
A report from the Thomas B. Fordham Institute speaks volumes. It notes that data derived from value-added statistical analysis raises questions about how to define success in a school. It seems that some schools with high levels of overall achievement, many in wealthy districts, are failing to show adequate growth over time, while some lower-achieving schools are showing gobs of growth over time. My question is: exactly what are we measuring? It certainly isn't the quantity of knowledge that students are receiving.
There are glaring flaws in the overall rating scheme. For example, a school that has an excellent rating at, say, 95% will be unable to make AYP unless it goes up to, say, 96%, which would be very hard and unreasonable to expect. What about the school that is at 99%, if there is one? Will that school eventually fall short because it can't maintain 100% forever? Crazy, isn't it?

But here's the bigger problem. Why didn't the Vindicator explain what "value added" data really means? Why didn't the Vindicator ask questions about the validity of a value-added education assessment, or of a report published by a special interest group (the Ohio Alliance for Public Charter Schools)? I understand newspapers are supposed to report the facts, but the author of the article certainly seems to have become drunk on the charter school "kool-aid" if you ask me. Even the headline "Charter Schools Top City’s in Performance" is completely unsubstantiated. What the "data" actually express is that city charter schools show a greater increase over last year's numbers when compared with the city schools. Even a cursory comparison of the report cards for Youngstown City and Eagle Heights Academy (a charter school mentioned in the Vindy article) shows that out of 19 indicators for 2006-07 grade-level standardized test passage rates in grades 3-8, Youngstown tops Eagle Heights in 14 of them.
According to a report issued by the Ohio Education Association and the Coalition for Public Education, Ohio's public school districts far outperformed the state's chain of publicly funded, privately operated charter schools, per the ODE report card data. But alas, if charter schools are here to stay, I believe it should be a requirement that they provide education K-12, thereby teaching the whole child, and that they be held accountable for whether or not their students can pass the Ohio Graduation Test. I believe that anything less than this level of accountability is just playing hinky with the numbers. The very fact that the state's better schools cannot show "value added" because they can't improve on an AYP that is already good to excellent makes this new "value added" criterion laughable.