Monday, September 29, 2014

School results coming soon: now with program review data

Friday, October 3rd, is now the scheduled date for 2014 accountability results to be released to the public, and the 2014 school report cards will include program reviews as an additional source of evidence about how students are being served.

Kentucky uses program reviews to check on the quality of students' learning opportunities in subjects that we no longer test, including arts and humanities, practical living and career studies, and writing of the sustained kind that is not easily measured by brief standardized assessments.

The Kentucky EdGuide on "Quality of Learning Programs" explains that:
[A] program review is defined as “a systematic method of analyzing components of an instructional program, including instructional practices, aligned and enacted curriculum, student work samples, formative and summative assessments, professional development and support services, and administrative support and monitoring.” 

Each program review looks at multiple aspects of a school’s program, using a rubric organized around standards for the program and “demonstrators” of strong quality on that standard. For each demonstrator, a school’s program can be scored 0 (no implementation), 1 (needs improvement), 2 (proficient), or 3 (distinguished), based on more detailed characteristics found in the rubric.
You can learn more about program reviews by downloading the EdGuide or by taking a look at the program review rubrics schools use to score themselves.
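To make the rubric structure concrete, here is a minimal sketch in Python. The demonstrator names and scores are invented, and the simple averaging rule is only an assumption for illustration, not KDE's official method for rolling demonstrator scores into a standard score:

```python
# Illustrative only: hypothetical demonstrator scores on the 0-3 rubric scale.
# The simple averaging rule below is an assumption for demonstration,
# not KDE's official method for combining demonstrator scores.

RUBRIC_LEVELS = {
    0: "no implementation",
    1: "needs improvement",
    2: "proficient",
    3: "distinguished",
}

# Hypothetical scores for one standard's demonstrators (names invented).
demonstrator_scores = {
    "student access": 3,
    "aligned and enacted curriculum": 2,
    "instructional strategies": 2,
}

standard_score = sum(demonstrator_scores.values()) / len(demonstrator_scores)
print(f"Standard score: {standard_score:.2f} on the 0-3 scale")
```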

Many schools' 2014 overall scores may look better now that program reviews are included. 

I say that because we already know how 2013 overall scores would have looked with program reviews factored in, and most scores definitely would have been higher.  In the chart below, you can see examples of the impact.   The lighter bars show the statewide overall scores for each level from the 2013 state report card (without program review data), and the darker ones show the Department of Education's calculation of the same overall scores with program review data included.  The revised versions look stronger because, in general, schools' program review scores are stronger than their state test scores and graduation rates.
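To see why the blend rises, think of the overall score as a weighted combination of component scores. Here is a minimal sketch of that arithmetic in Python, using invented component scores and weights rather than the actual weights in Kentucky's accountability model:

```python
# Illustrative weighted combination. The component scores and weights below
# are invented for demonstration; they are not Kentucky's actual
# accountability weights.

def overall(components):
    """Combine (score, weight) pairs into a single overall score."""
    return sum(score * weight for score, weight in components.values())

without_program_reviews = {
    "tests_and_other_measures": (60.0, 1.00),
}
with_program_reviews = {
    "tests_and_other_measures": (60.0, 0.77),  # hypothetical weight
    "program_reviews": (80.0, 0.23),           # hypothetical weight
}

print(overall(without_program_reviews))  # 60.0
print(overall(with_program_reviews))     # 64.6 -- higher, because the
                                         # program review component is stronger
```

When the program review component scores higher than the other components, any weighted blend of the two will come out above the test-and-graduation-only figure, which is the pattern the chart shows statewide.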

However, it is important to know that 2014 accountability will be an apples-to-apples comparison.  Program reviews were included when schools' 2014 annual measurable objectives were set, and they were included in defining the 2014 scores that will qualify a school for a particular percentile rank and accountability classification. 

Source note: the 2013 overall scores that include program review data can be found in the Accountability section of KDE's Open House portal, in a file that also shows the 2013-14 annual measurable objectives for improving on the 2012-13 results.

--Posted by Susan Perkins Weston


Sunday, September 28, 2014

Will better math tests be good enough?

PARCC and Smarter Balanced are the two multistate consortia working on new assessments for the Common Core State Standards. After four years of design and pilot testing and field testing, they're nearly ready for prime-time use--meaning administration to large groups of students for use in statewide accountability systems.  Since Kentucky hasn't chosen to use either assessment yet, I've been noticing news about both, including a rising debate about the ways they approach mathematics.

The consortia argue that their new versions are definitely better than existing multiple choice math tests because of new on-line options.  As an EdWeek report recently summarized:
Unlike previous state assessments, those being developed by the two federally funded consortia will include complex, multipart word problems that students will answer on screen. While some of those questions will provide built-in tools that allow students to put points on a graph or draw lines on a ready-made picture, other questions will ask them to write their answers in narrative form, using a keyboard.
But others argue that the computer tools still aren't close enough to the real work of using math to solve problems.  In the same EdWeek account, David Foster of the Silicon Valley Mathematics Initiative shared his concerns:
"I'm a mathematician, and I never solve problems by merely sitting at the keyboard. I have to take out paper and pencil and sketch and doodle and tinker around and draw charts," he said. "Of course, I use spreadsheets all the time, but I don't even start a spreadsheet until I know what I want to put in the cells.

"All Smarter Balanced and PARCC are going to look at is the final explanation that is written down," he said, "and if there's a flaw in the logic, there's no way to award kids for the work they really did and thought about."
Mr. Foster added: "I've played with the platform, and it makes me sick. And I've done it with problems I've written."
I suspect they're both right.  PARCC and Smarter Balanced are offering some big steps forward in how students answer math prompts, and yet they are also far from inviting students to use math in ways that are close to real-life applications on the job, in the home, or in civic life.

For the long-term education discussion, this debate turns yet another spotlight on an enduring puzzle: how can schools develop a balanced commitment to the skills that are easy to measure and the skills that matter at least as much but don't fit easily into standardized assessments?

Do note that Kentucky is not signed up to use either PARCC or Smarter Balanced.  Our current K-PREP assessments use methods that were in use years before we adopted Common Core, and any changes from that will occur when the Department of Education seeks new bids for our testing contracts.  The consortia will be eligible to submit proposals, in open competition with other companies that think they can provide the data we need.

--Posted by Susan Perkins Weston

Wednesday, September 24, 2014

International Benchmarks for Kentucky Scoring

If Kentucky students had taken the international TIMSS and PIRLS assessments in 2011, the American Institutes for Research estimates that:
  • 46% of Kentucky fourth graders would have reached the high benchmark in mathematics (in a system that gives scores of low, intermediate, high, and advanced)
  • 60% of Kentucky fourth graders would have reached the high benchmark in reading
  • 27% of Kentucky eighth graders would have reached the high benchmark in mathematics
AIR used a "chain linking" method to develop those estimates, which are presented in its new report on International Benchmarking: State and National Education Performance Standards.

The report also shares analysis of how Kentucky defined proficient work on the old KCCT assessment and the new K-PREP tests.  On the old test, Kentucky's proficient lined up with TIMSS and PIRLS scores at the intermediate level.  In our new system, launched in 2012, AIR concludes that Kentucky proficient lines up with the high score level on the international assessments.  That's an excellent step up for helping us understand how our students' learning compares to success rates around the globe!

--Posted by Susan Perkins Weston

Tuesday, September 23, 2014

Professional Growth and Effectiveness System: Some Basics

Across Kentucky, schools are moving rapidly to implement our new statewide approach to teaching quality.  The new Professional Growth and Effectiveness System will replace past evaluations and provide much deeper attention to feedback and support, so that individual teachers can grow steadily stronger in their craft.

The new approach will look at teaching from two different angles:
First, professional practice will matter, using evidence from multiple sources, including:
■ Observations of the teacher's work by administrators and peers
■ Student voice surveys
■ Professional growth plans and self-reflection
■ Possibly, additional district-determined sources of evidence.
That evidence will be used to identify each teacher’s practice as being at one of four levels-- exemplary, accomplished, developing, or ineffective practice....

Second, student growth will also matter, looking at how students improve from year to year in each subject. For most teachers, that evidence will all be gathered locally, using student growth goals, professional judgment, and district-defined rubrics. For those who teach reading and mathematics in grades 4-8, some evidence will be gathered that way and additional evidence will come from state assessments of those two subjects. Depending on the evidence, each teacher's student growth will be rated at one of three levels: high, expected, or low growth.
Those two sources of understanding will be combined to identify next steps for each teacher's further development as a professional.  You can learn more about the teacher system and the related system for principals from the Kentucky EdGuide on "Educator Growth and Effectiveness" (quoted above), and you can learn much more from the Kentucky Department of Education's PGES webpage.
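For a concrete picture of how the two angles fit together, here is a small sketch in Python. The practice and growth levels come from the description above; the rule that maps a (practice, growth) pair to a next step is purely hypothetical, standing in for the actual decision rules found in the EdGuide and KDE materials:

```python
# Sketch of combining the two PGES evidence angles. The mapping below is a
# hypothetical illustration, not the official PGES decision rules.

PRACTICE_LEVELS = ("ineffective", "developing", "accomplished", "exemplary")
GROWTH_LEVELS = ("low", "expected", "high")

def next_step(practice: str, growth: str) -> str:
    """Return an illustrative next step for one teacher's combined ratings."""
    if practice not in PRACTICE_LEVELS or growth not in GROWTH_LEVELS:
        raise ValueError("unknown rating")
    if practice in ("accomplished", "exemplary") and growth != "low":
        return "self-directed growth plan"                 # hypothetical label
    if practice == "ineffective" or growth == "low":
        return "directed growth plan with added support"   # hypothetical label
    return "guided growth plan"                            # hypothetical label

print(next_step("accomplished", "expected"))
print(next_step("developing", "low"))
```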

--Posted by Susan Perkins Weston

Sunday, September 21, 2014

Sweet results: Allen County AP work and college readiness

The Citizen-Times is sharing some great student results at Allen County-Scottsville High School.  In 2011, the school signed up with AdvanceKentucky--a systematic approach to engaging students in Advanced Placement work.  Early evidence on students enrolling at Western Kentucky University this fall is very impressive:

In the fall of 2012, almost 66 percent of AC-S graduates entering WKU had to take some form of remedial classes, because, at least in some subject areas, they weren’t academically prepared for collegiate-level work. That year, the state average was 54.2 percent.

A year later, things had changed little for AC-S, at 66 percent, though the state average had actually worsened, climbing to 63.4 percent.

But this fall marked the first incoming WKU freshman class to see AC-S students who had been through the entire three years of Advance AP courses. The change was dramatic: 90 percent of incoming AC-S graduates needed no remedial courses. As [Director of Instruction Rick Fisher] put it, “We’ve gone from only 30 percent who didn’t need remedial courses to only 10 percent who did need them.”

Thursday, September 18, 2014

KSBIT and millions of dollars being charged to Kentucky districts

The Kentucky School Boards Insurance Trust (KSBIT) used to offer school districts two self-insurance pools: one for workers' compensation insurance and one for property and liability insurance. 

For many years, the costs of participating seemed lower than the premiums charged by commercial carriers, and many Kentucky districts signed on.  "These self-insured pools allow school districts to combine their resources while sharing the risk," according to the KSBIT board of trustees.

The part about "sharing the risk," quoted above, has always been the catch.  If this kind of risk-sharing pool doesn't have enough money to pay expected claims, it can send the members an additional assessment to fill the gap--and KSBIT developed some big gaps.
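The mechanics of such an assessment can be pictured with a simple pro-rata sketch. The district names, premium figures, shortfall amount, and allocation rule below are all invented; the real KSBIT allocations were set by the Franklin Circuit Court and reflect each district's actual participation and claims history:

```python
# Illustrative pro-rata assessment for a self-insurance pool shortfall.
# All figures and the allocation rule are invented for demonstration.

shortfall = 500_000  # hypothetical gap between pool assets and expected claims

# Hypothetical historical premium contributions by member district.
premiums = {
    "District A": 400_000,
    "District B": 250_000,
    "District C": 350_000,
}

total_premiums = sum(premiums.values())
assessments = {
    district: shortfall * paid / total_premiums
    for district, paid in premiums.items()
}

for district, amount in assessments.items():
    print(f"{district}: ${amount:,.0f}")
# District A: $200,000   District B: $125,000   District C: $175,000
```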

The problems became very public in January 2013 and have been in the headlines pretty much ever since.  KSBIT no longer runs the pools, and Kentucky Employers' Mutual Insurance (KEMI) has taken over handling claims, but KSBIT's former members are still on the hook to contribute enough to cover claims for the years they participated in each pool.

In May, the Franklin Circuit Court entered orders that specify each participating district's required payments for the $37 million workers' comp gap and the $8.8 million property and liability hole.

Now districts are working out how to pay those shares off in varying ways, all of them painful.  For example, Fayette County will pay off its $3.1 million assessment over five years, and Madison County will pay $1.2 million over time as well.  Fleming County will pay $351,803 over 10 years, and Harlan Independent is working out plans to pay $258,728.

Unsurprisingly, many district leaders are concerned about how the hole got so deep and whether better choices by KSBIT leaders could have avoided these difficult new payments.  All reports seem to agree that the problems built up over multiple years.  I found this report on a briefing from Kentucky Commissioner of Insurance Sharon Clark helpful, but I still don't know enough to say much about how responsibility should be apportioned.  

Here's the question I most wish I understood: if KSBIT had charged the right amount every year, so that no district would be facing unexpected billing now, would it still have been a better deal than other insurance options?  Or, put another way, if districts could have known then what they know now, would they still have decided KSBIT was the best deal on offer at the time?

What I do know is that these payments are consuming resources I'd rather have available to serve kids now in Kentucky schools: the KSBIT collapse is definitely not good news!

--Posted by Susan Perkins Weston

Tuesday, September 16, 2014

Which AP tests do Kentucky students take and pass?

On Advanced Placement tests, scores of 3, 4, or 5 can qualify a student for college credit, placement in advanced courses, or both.  Monday, while posting on the Leaders and Laggards report,  I realized that the subjects where students earn those credits deserve closer attention.

So, below, two additional thoughts on AP test success in Kentucky.

First, a look at the major areas where 2013 public school students received successful scores, combining multiple tests in disciplinary clusters. The green shades identify science, math, and world languages, the subjects that Leaders and Laggards included in their economic competitiveness ratings.  The very small slice for world languages stands out as a weak result in the overall picture.

Second, a look at the top 12 tests where Kentucky students succeed, showing the number of students passing each test.  It isn't really a surprise to see the English tests at the top of this list, but it would definitely be good to see the science, math, and language numbers move up.

Source note: These numbers come from the page for "AP Program Participation and Performance Data 2013" at the College Board website.

--Posted by Susan Perkins Weston