All of us at CPS are thinking about what our teacher evaluations will be like next year. We will now be evaluated based on student performance. These evaluations determine who keeps jobs and which programs are cut. Of course, we have no clue what that process is, yet we must improve and develop curricula to survive it.
So many questions....
How will student performance be measured? What rubrics will be used? How do we measure data that is not only quantitative but also QUALITATIVE? How do we balance the numbers-driven goals of CPS with the experiential process we know works in the Art Studio? How can student performance be measured in a way that shows the VALUE of art teacher performance and our own effectiveness beyond student attendance, grades, and core subject test scores? I am thinking of music, theater, and dance teacher evaluations in addition to the visual arts.
We at CPS need to be sure that the way we are evaluated reflects the way we need to teach. Our curriculum will have to respond to the ways student performance will be measured, in addition to the way teacher performance will be measured. I don't know about you, but we are all nervous wrecks here at Curie. It is hard to develop a curriculum without the slightest idea of what those rubrics will be, not to mention working under threats about what happens if we (and our students) don't perform according to an unknown rubric.
So I have been asking questions about WHO is developing these student and teacher evaluations. That same day I got some answers. There is a group being formed by CPS to work on it. Cool, let's get involved! Let's make sure art, music, theater, and dance teachers are on that committee.
Of course I couldn't help but think about how SHoM can play into this process. I have had such success using this program for the first time this year, and I will continue to use it. I wrote to Kate asking how the data collection developed through SHoM could help in our quest to evaluate art students and teachers fairly. She asked me to put these questions on the blog.
So here you are. What will we do? I hope to be part of the conversation and not just a victim of what is decided for next year. I hope to be able to make use of what we have all learned this year for years to come. I hope to lessen my fear of the impending evaluation process, knowing that some of us from SHoM were part of the fair development of rubrics. We all want to measure our own effectiveness without sacrificing the experiential quality of our instruction. How can we do this? I put this question out there... help.