Sure. That's my job, after all: make them leave with stronger skills and deeper knowledge than they had walking in.
The conversation about how student growth will factor into our evaluations under TPEP continues to evolve, and we need to keep paying attention. I personally think OSPI has actually made some smart decisions (no offense, but how often do you hear people say that kind of thing?), particularly in emphasizing that student growth data must come from the right kind of assessment. As the debate continues to swirl, we need to think about what will and won't work in terms of student growth data that accurately reflects teacher performance.
Here's what will not work for measuring the growth I guide my students to achieve:
- MSP and HSPE scores. Why not? For one, I teach 9th grade English. Those tests happen long before and long after my time with my students. For another, it turns out my colleagues at our feeder middle schools are doing their jobs too well: an awful lot of my students come in already having scored high marks on the MSP. When a bunch of my students scored a "4" on the MSP-reading, does that mean a "3" (which, by the way, meets standard) on the HSPE is not growth? And what about the fact that said HSPE would be given about 10 months after I last had the chance to work with those students? Argh, too complicated.
- Okay, so the current state tests don't work...how about more tests? A waste, in my opinion. Testing is not what makes kids improve any more than a new ruler will make my 8-year-old grow taller. Measuring him more often won't make him grow either... and neither will punishing our household if he does grow but not at the same rate as his brothers or as the kids down the street. Besides, what if I measure in centimeters one time and inches another? A bunch of assessments that don't align with each other doesn't help at all...although inches first and then centimeters would certainly produce impressive growth data!
- What about MAP? Or SRI? Or some other cool data gathering machinery that the local rep for the textbook company swears he can get me a thousand site licenses for at a discount? Hmm. Click on the banner above and scroll down through some of the recent posts about MAP here on SfS.
So what's the solution? I don't know; let's ponder it.
You know what? In a dream world, we'd bring in a professional to do this difficult work. We would need to find someone highly qualified--maybe someone with a master's degree in Education, Teaching, or a similar field--and then put them in a room with the students for a while. Let that person either build or find standards-based formative assessments to diagnose the students' levels of proficiency. Then, let that person design instruction that guides that group of students toward higher levels of proficiency on that standard. After that, this person can administer an assessment to ascertain the degree of growth these students have achieved.
Wait just a minute there... I have a master's degree. I stand in front of a roomful of students every day. I have used formative assessments to diagnose their strengths, and I have designed instruction to guide them to the next level of proficiency. I've even administered assessments that helped me, the students, their parents, and my bosses analyze the growth my students have made toward the established standards!
Maybe, just maybe, we've found the solution. (Turns out, it's the same solution OSPI recommended.) Am I the professional who can choose, design, and implement the assessments that can actually provide evidence that my students are making meaningful growth toward a standard? And should I then be evaluated on my choices, their implementation, and the impact they have on my students' growth?
Sure. That's my job, after all.
I'm not just here to babysit between state tests.