
Grading for Skills, Not Scores

[Image: The usual way of categorizing things.]

I take great pride in my gradebook. It’s a kinda-fancy, multi-colored collection of Excel spreadsheets that I made myself (with training and inspiration from my friend and former colleague Dan Lyons, @dan_lyons). I tinker with it year after year to get it just right. I can enter a test score in the test scores section, a quiz score in the quiz scores section, and some other grade in the miscellaneous section. All the formulas are there to spit out the students’ averages to however many decimal places I want. It’s precise, it’s easy, and it’s organized.

But is it organized the right way? The basic underlying structure is just an imitation of my understanding of what a gradebook should look like: tests count for X percent, quizzes count for Y percent, and homework or class participation gets factored in to some significant (but not too significant) extent.
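For the spreadsheet-inclined, the arithmetic behind that structure is simple enough to sketch in a few lines of Python; the category weights and scores below are made up for illustration, not my actual ones:

# A minimal sketch of the weighted-average arithmetic the spreadsheet does.
# Weights and scores here are hypothetical.

def weighted_average(scores_by_category, weights):
    # Average each category's scores, then combine by category weight.
    total = 0.0
    for category, weight in weights.items():
        scores = scores_by_category.get(category, [])
        if scores:
            total += weight * sum(scores) / len(scores)
    return total

weights = {"tests": 0.5, "quizzes": 0.3, "participation": 0.2}
scores = {"tests": [82, 90], "quizzes": [75, 88, 91], "participation": [95]}
print(round(weighted_average(scores, weights), 2))  # one tidy number: 87.4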

During one of my classes at Teachers College last semester, a classmate told me about an idea he’d been playing around with: the skills-based gradebook. Why not keep track of the kinds of questions our students get wrong, the specific skills that keep tripping them up? For example, which is more useful for the math teacher and student: recording an 82% on the Unit 2 test, or recording that a student ran into trouble on the word problems section but did well with simplifying expressions? Of course, we might go over the test with the student afterwards and point out the difficulty with word problems, but it would likely be even more helpful to record that information and look for patterns throughout the year.
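Here’s a rough sketch of what such a record might look like in code; the student, the skill names, and the results are invented for illustration, and the record/mastery helpers are just hypothetical names:

# A rough sketch of a skills-based gradebook: each graded question is
# logged under the skill it assessed, instead of being folded into one
# overall score. Names and results here are invented for illustration.
from collections import defaultdict

gradebook = defaultdict(lambda: defaultdict(list))

def record(student, skill, correct):
    # Log one question's outcome (True/False) under a specific skill.
    gradebook[student][skill].append(correct)

def mastery(student):
    # Fraction correct per skill, so patterns stand out over the year.
    return {skill: sum(results) / len(results)
            for skill, results in gradebook[student].items()}

# The Unit 2 test, recorded by skill rather than as a single 82%:
for outcome in (False, False, True):
    record("Alex", "word problems", outcome)
for outcome in (True, True):
    record("Alex", "simplifying expressions", outcome)

print(mastery("Alex"))
# {'word problems': 0.333..., 'simplifying expressions': 1.0}

The point is the shape of the record, not the tool: the same structure fits just as easily into an Excel sheet with one column per skill.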

Paul Bambrick-Santoyo’s Driven by Data: A Practical Guide to Improve Instruction provides a helpful way to reconsider how we design our curricula and how we keep track of student performance. If one of our students scores a 75% on his test but misses most of his questions in the antonyms section, do we record that in our gradebooks, or do we simply record the score and tell him to study harder next time? By keeping careful track of the specific areas where our students are having trouble, we can offer guided, highly specific instruction to help them improve their performance and achieve mastery going forward. Bambrick-Santoyo offers a highly developed approach to “teaching to the test,” whereby teachers home in on the areas of weakness on one test to prepare their students for the next one. Skills-based mastery is the name of the game.

I have no doubt that many teachers already record student performance this way. Skills-based assessment may be one of the many areas in which primary-school teachers have much to teach those of us who have older students. I also have no doubt that it requires more work to keep such careful records. But the benefit for the student, and for the teacher’s effectiveness, could be huge.

If we take this idea one step further, how could such thinking about assessment affect our report cards? What if students received joint comments from the English and history teachers about the students’ ability to write effective thesis statements and clear topic sentences? Or if the foreign language and English (and math!) teachers joined forces to assess the students’ understanding of grammatical and linguistic structure?

Addressing, assessing, and recording skill strengths instead of simply recording grades on quizzes and tests…I’m still playing around with this basic idea and how I can incorporate it into my teaching. But I know that before I return to the classroom next year, I have an awful lot of work to do on my gradebook. Just when I thought I had it right, time to rethink…
