Dec 11, 2009

The Panopticon Project

This week, at the Secondary Coordinating Council meeting, we came up with a great idea. I called it the Panopticon Project, which is a bit of a joke. The Panopticon is a prison design in which every inmate is visible to the guards, and no one knows when they are being watched. Michel Foucault, in Discipline and Punish, famously used it to illustrate the gentle oppression of the modern age. But to improve, we have to make things visible – not to the guards, but to each other.

We were just talking about the data on program quality, frustrated at how unreliable the data is, how hard it is to read, how hard it is to get information across academic turf boundaries, and how it always arrives too late to do anything about it. So we thought it would be great if you could just see instantly what is taught in every class (without reading a 20-page syllabus), and what students have learned. We came up with an idea that I think is going to work really well for us. It is simple, the technology is there, and it is fun.

Imagine that in every class, students are asked to complete a short survey, "Ten things I have learned in this class," and the results become immediately available for viewing. The instructors will have to agree, of course, on what the ten main things are, and students will have to agree to be honest and objective. But this would provide a great snapshot of program design, expose gaps and overlaps, offer a glimpse of overall quality, and create a constant feedback loop for program coordinators, administrators, and faculty. OK, let's just imagine a web page like this. I even piloted the technology (that's how excited I got!), so click on those two live links to see what it might look like (Sidorkin-007: survey; results). Feel free to enter a few test answers and watch the results update.

Secondary PTEP
Columns: Spring 10 (Instructor, section), Summer 10, Fall 10

STEP 161
EDF 366: Sidorkin-007: survey; results | Bartelheim-001: survey; results | Trainor-003: survey; results | Allen-002: survey; results | Allen-009: survey; results
ET 249
STEP 262
EDSE 360
PSY 349
STEP 363
EED 402 Kraver: survey; results
THEA 385 Schuttler: survey; results
FL 341
SOSC 341, etc.
EDRD 340
ET 349
STEP 464


Of course, we would have to overcome anxieties and our traditions of secrecy, and accept a certain amount of data contamination. But the scheme would allow us to learn quickly, and to change quickly. For example, next semester we may realize that we need to ask a different set of questions, or that we need to change some of our methods and assignments. If I see that students learned about the law better in Wayne's class than in mine, I will go ask him how he does it. I think a tiny bit of public pressure is also needed to keep us working on constant improvement. Council members Mary Schuttler and Jeri Kraver agreed to pilot it in the Spring semester, and I am hoping the Social Foundations faculty will be able to pilot all EDF 366 and 370 courses as well.
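The pilot itself runs on an off-the-shelf survey tool, so nothing here depends on custom code. But the aggregation behind a "results" view is simple enough to sketch. Everything in this sketch is hypothetical: the item names and the `tally` function are illustrations, not part of the actual pilot.

```python
from collections import Counter

# The "ten main things" the instructors agreed on for one course
# (made-up examples; a real list would come from the instructors).
AGREED_ITEMS = [
    "lesson planning",
    "education law",
    "classroom management",
]

def tally(responses):
    """Each response is the list of items one student says they learned.
    Returns, for every agreed-upon item, the share of students who named it."""
    counts = Counter()
    for answer in responses:
        for item in set(answer):  # ignore duplicates within one response
            if item in AGREED_ITEMS:
                counts[item] += 1
    n = len(responses) or 1
    return {item: counts[item] / n for item in AGREED_ITEMS}
```

A results page would then just display these shares per course section, which is how gaps (items almost no one reports learning) and overlaps (items reported in several courses) would surface.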

I am sick and tired of bad data, of bad standards written by people who know nothing about real life; I am tired of compliance for the sake of compliance; I can't waste any more of my time on instruments and measures that are not useful. I want us to move to the Google age.

Dec 4, 2009

Confessions of a micromanager

Micromanagement is a bad thing. How on earth did I end up editing a bunch of handbooks and surveys, and answering dozens of emails a day about technical bugs? Surely not because I like to do everything myself, and not because I don't trust my colleagues and staff. Here is my story this semester:

Much of my summer prep work went into grant writing, so I found myself late in August scrambling to change the data collection systems for our PTEP programs. Because I was scrambling, I did not really have time to talk to coordinators and staff about what had changed and how the new processes work. Delegating responsibilities requires time for discussion and for training people, especially when a new technology is involved. None of that happened. The result is that we had many organizational and technological glitches (if you discuss a change a lot, and test extensively, less of this happens). But remember, I had not informed and trained other people to help with those glitches, so I ended up doing a lot of the troubleshooting myself: no one else knew how to help. This creates a vicious cycle: I run around plugging holes, and therefore have no time to catch up on information and training. The end of the semester came unexpectedly (who knew, right?), and I find myself in the same position again: rewriting the handbooks for the next semester, with no time to talk to others. Besides, a couple of unplanned problems came up, some very time-consuming, others less so. But then, such problems always do come up and should be budgeted for.

Could this have been avoided? I am not so sure. The cost of delaying the changes is also high. I think our new data collection system is a lot better than the old one, and it will become much better still once the kinks are worked out. It is almost completely paperless, gives us much better data much faster, and involves significantly less work for students, cooperating teachers, and supervisors. I also learned that if you delay a change for a semester, it ends up being delayed for three years. Why? Because if you don't do it during the Summer, you will surely miss December, and then something may come up the next Summer. Those are really the only two windows of opportunity for implementing changes. However, in the Summer very few faculty are around, and in December they all run around looking exhausted, and will shoot without warning if I call a meeting. The world we live in gives us no time to improve things, because we are too busy doing the things that need to be improved.

The lesson I have learned is that getting involved in just one project too many can have a chain-reaction effect on a whole number of other projects. I also learned that one can become an involuntary micromanager. I just need to get a grip and start planning how to get out from under this one.

For those of you who do not know, the new system is pretty simple. All PTEP programs (we still need to convert two more) collect the following data:

  1. Work Sample portfolios, submitted through iwebfolio; they all have rubrics that collect evaluation data. We also figured out a way for students to feed data back to iwebfolio, and to scan and upload the needed documents (mainly the Diverse Field Experience and the last lesson observation form).
  2. Standardized lesson observation forms: these are short, make sense to us, and incorporate different content knowledge areas.
  3. On-line Final Evaluation modules for cooperating teachers and supervisors, AND Exit Surveys for graduates; both on

It is not surprising, given our numbers, that many technical and communication bugs need to be eliminated before this simple scheme works. That has been my project for almost the entire semester.