Wednesday, April 8, 2009
The ninth grade bulge (with graduation implications) [UPDATED]
Ninth grade is always Kentucky's largest grade. The graph above is one illustration, tracking one class through six years of testing. There, you see almost 3,000 more students in grade 9 than in any other grade. Our public schools retain more students in that grade than any other, producing a dramatic bulge in total students.
From the graph, I could say, "Out of every 100 ninth graders tested, 82 made it to twelfth grade testing four years later."
I could also say "Out of every 100 seventh graders in spring testing, 88 made it to twelfth grade testing six years later."
Looking at the graph, you can see why the two results are different, and how the giant helping of retained students makes the problem look much worse.
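To make the arithmetic concrete, here is a minimal sketch of the two calculations in Python. The counts are placeholders chosen to be in the right neighborhood, not the actual Kentucky testing numbers behind the graph.

```python
# A minimal sketch of the two calculations, with placeholder counts
# (NOT the actual Kentucky testing numbers behind the graph).

grade_7_tested = 50_000    # spring grade 7 count for one class (placeholder)
grade_9_tested = 53_000    # grade 9 count two years later, swollen by retained students
grade_12_tested = 44_000   # grade 12 count in the testing year (placeholder)

# "Out of every 100 ninth graders..." divides by the bulging grade 9 count.
per_100_ninth_graders = 100 * grade_12_tested / grade_9_tested

# Starting from grade 7 avoids the retention bulge in the denominator.
per_100_seventh_graders = 100 * grade_12_tested / grade_7_tested

print(f"Per 100 ninth graders tested:   {per_100_ninth_graders:.0f}")   # ~83
print(f"Per 100 seventh graders tested: {per_100_seventh_graders:.0f}") # ~88
```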
Here's the thing.
A common approach to describing graduation rates begins with the phrase "out of every 100 ninth graders...." That means "We divided graduates by ninth graders." It also means "We included a bunch of ninth graders in this year's math that we also included in last year's math." It means "We're doing a double count."
No matter how many years a student spends in grade nine, that student can only collect one high school diploma.
The method that divides graduates by ninth graders is simply wrong. If you hear numbers presented that way, I urge you to ask how many times they count repeating ninth graders.
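To see how repeaters distort the figure, here is a small example, again with invented counts. Even if nobody actually dropped out, dividing by a ninth grade count that includes repeaters, students who already sat in an earlier year's denominator, pushes the apparent rate below 100.

```python
# Invented counts to show the double count; not real Kentucky data.
first_time_ninth_graders = 50_000   # students in grade 9 for the first time
repeaters = 3_000                   # students repeating grade 9 this year

# The "divide by ninth graders" method uses the combined count.
ninth_grade_count = first_time_ninth_graders + repeaters

# Suppose every first-time ninth grader eventually earns a diploma.
# The repeaters were already in last year's denominator, so this year's
# apparent rate falls short of 100 even though no one was lost.
graduates = first_time_ninth_graders
apparent_rate = 100 * graduates / ninth_grade_count
print(f"Apparent graduation rate: {apparent_rate:.0f}%")  # ~94%
```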
We have a big enough problem with the students we really lose, without claiming that we lose even more.
Two small added notes. First, thanks to Steve Clements for questioning me about graduation issues this weekend: his thoughts led to this post. Second, I said above that the same bulge happens every year. The numbers are below, and a PDF graphed version is here.
Update: the numbers above use a fall count of grade 12 for the classes of 2007 and 2008, and a spring count for earlier years. I've added a new post here that shows spring counts consistently, allowing a better comparison over time.
Labels: Graduation
Susan,
I took a look at the KPR numbers for 2007 and 2008 Grade 12 when I saw the sudden jumps in the percentages you listed for those years in the table at the end of your blog. That anomaly is a clue that something isn’t right. It isn’t.
I talked to Kevin Hill at the Kentucky Department of Education about KPR reporting changes that occurred in 2007. Those changes led to your mixing apples and oranges because the 2004 to 2006 12th grade numbers you used are for spring testing in the graduation year, while the 2007 and 2008 numbers are for fall testing. Fall testing takes place before a notable number of early graduations and 12th grade dropouts thin out the spring test samples. As a consequence, the 12th grade data you used isn’t comparable across all the years. The numbers for 2007 and 2008 are far too high.
Anyway, I appreciate you pointing me at the changes that appeared in the KPR reporting of 12th grade students in CATS starting in 2007. There seem to be some interesting things going on. Mr. Hill indicated he’d be checking into that.
By the way, the correct spring testing number of 12th graders in 2008 isn’t the 44,130 figure you used. It is only 40,924 (check the data in the 2008 Statewide KPR for writing portfolios). Using that “apples to apples” number, I get a “Grade 12 as Percent of Grade 9” figure of only 76 percent, not 82 percent. That “apples to apples” number is little changed from all the earlier results.
There are more problems with the assumption implied in your heading that your numbers have graduation implications. This is why your approach wasn’t recommended when the US Department of Education looked very deeply at this issue several years ago. Until high accuracy student tracking systems are available in a state, the feds recommend using another formula called the “Averaged Freshman Graduation Rate.”
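For readers unfamiliar with it, the Averaged Freshman Graduation Rate divides diplomas by an estimate of the incoming freshman class: the average of the grade 8, 9, and 10 enrollments from the years that class passed through those grades, which smooths out the grade 9 retention bulge. A minimal sketch with placeholder counts (not Kentucky figures):

```python
# Sketch of the Averaged Freshman Graduation Rate (AFGR).
# Diplomas are divided by an estimated freshman class: the average of the
# grade 8, 9, and 10 enrollments from the years the class passed through
# those grades. Averaging across three grades dampens the grade 9 bulge.
# All counts are placeholders, not actual Kentucky figures.

diplomas = 41_000             # diplomas awarded in the graduation year
grade_8_enrollment = 49_500   # that class's grade 8 count
grade_9_enrollment = 53_000   # that class's grade 9 count (swollen by repeaters)
grade_10_enrollment = 47_500  # that class's grade 10 count

estimated_freshman_class = (grade_8_enrollment + grade_9_enrollment + grade_10_enrollment) / 3
afgr = 100 * diplomas / estimated_freshman_class
print(f"AFGR: {afgr:.1f}%")  # ~82.0%
```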
We really won’t know what Kentucky’s true graduation rate is until the state gets the Infinite Campus data system on line (hopefully collecting accurate data) and then runs it for four years. Once that happens, we can do our first calculation using the National Governors’ Association’s graduation rate calculation, which the feds have mandated for eventual use in all states. Assuming someone doesn’t mess with the process (I recall reading that already happened in Texas with their student tracking system), then we will start to know the real situation with some accuracy.
Anyway, I think we are making some progress on graduation rates, but I am not sure if that is due to a creep in social promotion or a real educational advance. In any event, the rate of progress isn’t anywhere close to what your blog shows between 2006 and 2008. You simply grabbed the wrong data.
Fair enough. My point about ninth grade is unchanged, but I've updated the post above and shared graphs and a chart using the spring counts in another post.