Sunday, August 28, 2016

SLOs for a Real Education

Really interesting podcast (first in what is hopefully going to be a series) from Michael Wesch. I'm still processing what I think about the entire podcast (there were certainly parts that made me uncomfortable, which probably means it's something I need to think about more), but I wanted to pull out this quote about what real Student Learning Outcomes (SLOs) should look like (about 6:53 mark):
And we have to help them achieve all this within a bureaucratic structure that demands that we frame our goals in a few neat bullet points at the top of our syllabus in a section called: Student Learning Outcomes, often called SLOs. I've never been satisfied with these; they never reflect the complexity or necessity of a real education. If I were to write SLOs for a real education, they might be something like this:

Students will be able to:
  1. Ask questions that burn in their soul and take them farther than they ever thought possible.
  2. Open themselves up to others and new experiences, to challenge their taken-for-granted assumptions.
  3. Cross rivers of doubt and conquer mountains of fear to set themselves free.
I think this very nicely identifies the tension between the SLOs we are supposed to write (and achieve) and the ones that really matter. I know there are many who will read the above and completely dismiss them as late-night-college ramblings (which, indeed, they are), but I think we need to take the time to reexamine our "taken-for-granted assumptions".

Yes, there are more specific, down-to-earth learning outcomes for our courses that I think should be part of the discussion, but I think very few of those should be (or even can be) standardized for all students. These "late-night college ramblings", however, are the types of outcomes that I can support being a requirement (or at least a worthy goal) for all of our students.

So I wonder why it is that we shy away from discussions around outcomes such as these, and obsess over measuring how our students do on discrete, isolated skills that very few of them will ever need to actually use. Perhaps it's because we are afraid of what we will discover. As Wesch says (about 49:50 mark):
You can't just think your way into a new way of living; you have to live your way into a new way of thinking.

Tuesday, July 12, 2016

Algebra: First Semester Lesson Plans 16-17

As I indicated in my last post, I'll be teaching a section of Algebra again next year. Please read that post for all the context, but I'll just reiterate here that I like to make an overall comprehensive plan in advance, fully realizing that I'll have to adjust almost daily based on how it goes with students.

For anyone who's interested, here are my (draft) first semester lesson plans. My attempt to blog my lesson plans didn't end up generating any feedback (so I stopped), but if you have any feedback now, feel free to leave it in the comments or contact me directly.

Sunday, May 22, 2016

I'm Going to Have to MTBoS the Sh*t Out of This

This year I got the opportunity to start a Computer Science program at my school, which ended up meaning I got to teach two semester-long sections of Intro to Computer Science in both the fall and spring semesters (and teach myself Python last summer). The plan was that we were going to try to hire a real computer science teacher for 2016-17 but, since that didn't happen, I'll be teaching the Intro class again next year, along with an Advanced Python class and a Javascript class (and a colleague will thankfully be teaching a Java class) as we try to grow the program.

None of that preceding paragraph is really relevant to this post, other than to give some context as to why I'm asking for your help. Not that it needs context, because asking for your help is a good thing in and of itself, but I'm hoping to make you feel a little bit sorry for me and guilt you into helping. (Hey, it's worth a shot.) Because in addition to learning some more Python, and teaching myself Javascript, I'll also get the opportunity to teach a section of Algebra 1 again next year.

Now, the good news is, I don't have to teach myself Algebra 1, I sorta, kinda already know Algebra 1. Long-time readers of this blog may remember about 6 years ago when I got the opportunity to teach Algebra 1 after many years of being completely out of the classroom in my role as technology coordinator for my building. At that time I tried to blog my planning and did great for a couple of weeks and then petered out. This time I hope to do better, and to do a better job of tapping into the online community of teachers to get your help.

Six years ago I experimented with what's now almost universally called the "flipped classroom" (at the time there was still some debate about what to call it). While there are strong opinions on both sides of the flipped debate, I still feel like it was a worthwhile approach for me at the time because it allowed me to free up class time for better things than I could have done if I had to "cover" the material in class. I was trying to bridge the gap between the expectations of my school and the rest of the math department and where I was hoping to take my Algebra class, and flipping allowed me to do that to some extent.

Well, now it's six years later, and I haven't taught Algebra the last two, and some things have changed. My district has transitioned completely to a Common Core-based Algebra course, using Agile Mind as the "textbook." But, more importantly, I'm more willing to go even further away from the mainstream (at least what the mainstream is in my building), and the resources available to me are even better: even more teachers are blogging about math, Desmos has gotten even better, and technology in general has gotten easier for students to use in the classroom. I think I can do better than flipping and, while at least some of the lessons/activities I used in class before are still great, I think they can be improved on.

So I'm sitting here planning my summer (last official day of work is this Tuesday) and trying to figure out how best to structure learning a bit more about Python (I have a rough outline for my new advanced course, just need to make sure I know it well enough to teach it), teaching myself Javascript and planning that course, and planning my new-and-improved Algebra class. I'm worried because the easiest thing for me to do is to focus on the computer science stuff that is new to me, and to slack off on the Algebra planning, because I know I can just rely on what I've done before and do an okay job.

But I don't want to do an "okay" job, I want to do a really good job, and take advantage of all these great resources (and my additional willingness to diverge from the traditional in my building). As I was thinking about how to do this, a line from the movie The Martian kept going through my head:
I'm going to have to science the sh*t out of this.
So the way I'm framing planning for Algebra is,
I'm going to have to MTBoS the sh*t out of this.
For the uninitiated, MTBoS stands for the MathTwitterBlogosphere, which is the self-given name of all the math folks who are sharing their thinking, asking questions, challenging each other, and generally discussing ways to improve their math teaching. My hope is twofold: first, I'm going to tap into (borrow/steal) all the wonderful resources the MTBoS has collectively created and, second, that by publicly blogging my planning I will not only encourage feedback from the MTBoS, but guilt myself into not slacking off on the Algebra planning part of my summer.

We'll see how that goes. If you're interested in helping, I would appreciate it if you would follow along with the blog I set up for this (right now the only post is a cross-posting of this one, but beginning on Wednesday I will start in earnest - RSS, Email). Fair warning that there is at least a 50% chance that this will flame out but, if everything goes swimmingly, perhaps this will not only help me tremendously, but serve as a resource for other Algebra 1 teachers.

Student and Parent Evaluations of Me: Spring 2016

Similar to last semester, I asked parents and students to evaluate me at the end of the semester via a Google Form. I think it's always good to ask for this feedback, even though sometimes you're not sure how honest and accurate it is. Even so, I learn something from it each time.

I also think it's important to be transparent, so here are the links to their responses. These responses are verbatim, except for some slight editing where personal information was included. Both students and parents could choose to leave their name if they wished (totally optional), and some of the parent comments included their student's name - I edited all of that out.

Student Feedback (36 out of 45 completed the survey)

Parent Feedback (16 responses to the survey)

Thursday, May 12, 2016

A Defective Method

No system is perfect. And no school will ever be perfect. But there's a difference between not achieving perfection and purposely creating a system that you know won't work. My school currently has a system for "credit recovery" that is designed to fail.

Like just about every school, we have some students who struggle in our classes. For a small number of those students, we have a "credit recovery" system in place, where they work with an online learning platform to make up classes they have failed. I have a ton of problems with this, not the least of which is that it's completely designed around the idea of "recovering credit" and not around the idea of learning (or what the student even needs). But even with those concerns, I would be willing to give it a pass if it provided a viable way for these students to jump through the hoops, graduate and move on with their lives.

I haven't ever had any interaction with our system but, yesterday, I had the opportunity to help one of our students who was working on their Geometry class on the platform. I was a tad bit surprised when the problems I was helping her with involved the Law of Sines and the Law of Cosines. Back when I was a full-time math teacher we taught that in Trig, but I figured perhaps with the changes due to Common Core that too had moved down into Geometry.

When I had a chance to look later, however, I discovered that while it's an option in Geometry, it falls in the "+" category, which means it's "Additional mathematics that students should learn in order to take advanced courses such as calculus, advanced statistics, or discrete mathematics." That hardly seems appropriate for a student who is struggling in mathematics and is participating in our "credit recovery" option as sort of a last-chance.

But, again, I thought perhaps it was something we as a school had decided to include in our Geometry classes, in which case, while I still didn't think it was appropriate for this student, it would at least be consistent with our regular classes. So I went and talked to our Math Department, and we don't teach the Law of Sines and Cosines in our Geometry classes. Which means this struggling student, who is in our online-only, credit recovery option, is being asked to do more than the students in our regular, teacher-led classes.

But it gets worse. Because after the relatively straightforward Law of Sines and Law of Cosines problems (assuming that's not an oxymoron), she was presented with a problem something like the following. (Because it's in the online platform, I don't have access to it to see exactly what it said but, for reasons that will become clear in a moment, I feel relatively confident that this is essentially it.)
Using the defects method, which relationship represents the Law of Cosines if the measure of the included angle between the sides a and b of ΔABC is less than 90°?
Well, I read that a few times and was stumped. I had never heard of this "defects method." The student couldn't help me with what it was, so I asked her if we could go back and look at the "instruction" she had presumably had over this method previously on the platform. She said we couldn't because she was "locked out" now that she had finished that part. (I can't independently verify whether that is accurate, but she certainly thought it was.) So I googled "defects method Law of Cosines" . . . and found nothing.

Well, that's not entirely true. I found four or five links for it - all with various versions of that same problem that students had posted to various sites looking for answers (like this one). Unfortunately, I had a meeting to get to so I couldn't investigate further at that point, but later I spent more time googling and still came up with nothing. I did find something similar when talking about hyperbolic triangles (and I'm pretty sure even Common Core doesn't include that in high school Geometry), but nothing for 2D geometry. That night I asked on Twitter, and no one knew. And the next day I went in and asked our Geometry teachers, and they had never heard of it.

Now, none of that necessarily means it doesn't exist or that there perhaps wasn't some instruction in the online platform that would help explain it, but it does again make you wonder why it's being included in a credit recovery course for struggling math students. We don't cover it in our regular Geometry classes, none of the math teachers in the building (or who saw my tweet) have ever heard of it, and Google can't seem to find it either. Why in the world was this question there?

There are larger problems here, of course. How and why did my district select this platform? Who is overseeing the content and ensuring that students are actually getting content similar to the courses they are theoretically "recovering credit" for? Why do we think that students who struggled in a regular classroom, with a teacher and classmates to help them, are suddenly going to be successful as learners in a learn-on-your-own online platform (even if the platform wasn't serving up the wrong content)?

Clearly, this "credit recovery" option is not at all about what the students need. It's not about what they want or need to learn to be successful in their future, it's not even about them being successful right now. It's just a desperate attempt by the adults in our system to somehow, some way, get these students to pass our required courses. As I said earlier, as horrible as that sounds, given our current system, if it actually accomplished that then I'd be okay with looking the other way (while still vigorously arguing to change the system). But it doesn't. We're taking these students that we've already failed and setting them up to fail again.

I still don't know what the "defects method" is in relation to triangles and geometry, but I have a pretty good idea what a defective method looks like in practice. If "defective method of instruction" were a standard, we would "exceed expectations."

Sunday, May 01, 2016

Goals Gone Wild

Back in February my school district changed platforms for our websites. As a result, we had to make all the usual design decisions, figure out workflow, and move content over. As part of that process, I realized that we did not have a link to our Unified Improvement Plan (UIP), an annual improvement plan required by the Colorado Department of Education. I then asked for a copy and posted it on our website (pdf). And then I read it.

Our UIP has two goals (Priority Performance Challenges), and I thought it might be worthwhile to look at each one individually.

Goal 1: To improve writing skills building-wide
At first blush, this is a goal I can strongly support. I think writing is critically important for our students. It allows them to express and communicate ideas, interact with others and their ideas, and refine their own thinking through the writing process. I also appreciate how it says "building-wide", which implies that the adults in the building will be working on this as well.

Now, while I strongly support this, that doesn't mean I don't have suggestions on how to improve it. As much as I support "writing", I think that's way too limiting of a concept. As Dr. Richard Miller, the English Department Chair at Rutgers University, says,
To compose, and compose successfully in the 21st century, you have to not only excel at verbal expression, at written expression, you have to also excel in the use and manipulation of images. That's what it means to compose.
So I think a significant improvement in this goal would be to replace the word "writing" with "composition." It's not enough for our students to improve their "writing", but they have to be able to "consume and produce in the media forms of the day" (Jason Ohler - 1, 2, 3). Writing (in the traditional "text" sense), is still a hugely important component of that, and the writing process is also critical. But it's not enough. For students to communicate ideas, express opinions, interact with others and their ideas, and even refine their own thinking in 2016 (and beyond), they need not only text, but images, and sound, and video, and hyperlinks, and infographics, and Storifys, and . . . well, you get the idea.

So while I like the idea of improving our students' (and staff's) abilities in this area, I think we are missing the boat when we limit it to simply "writing." But as I read further in the UIP, I was dismayed to see the "Action Steps" we were going to take to try to achieve this goal.

While these are all well-intentioned, I have serious concerns about our conception of what "good writing" (or I would prefer "good composition") looks like. In the first section, I think it's great that we're making sure all students have technology (of course I would), that we will utilize PLC time to discuss writing strategies, and that we will "imbed" (sic) writing more frequently (although I'm not sure that's actually happening in many of our classes). But the emphasis on "Data Days" and working on "skills missed in common assessments" is a bit worrisome. That seems to place more emphasis on how we're doing as a school and on specific, school-defined "skills" as measured by common assessments, and less emphasis on developing each student as a writer. (Those don't have to be mutually exclusive, but I worry about the focus there.)

I also really like some of the ideas in the second section focused on Mindset, but the implementation of that third item, "alter grading practices," seems to be lacking. While some individual teachers have certainly done this, our grading practices across the faculty are very much not aligned with a growth mindset (1, 2).

It's the third section that really, really concerns me. While "SLO" has the word "Student" in there, I think that again the emphasis here is on "results" as viewed from the school perspective. The use of "common assessments" and "MAP data" almost necessitates a narrow focus on "academic" writing skills and less of an emphasis on the actual purpose of writing for our students. Again, these are not mutually exclusive, but where is the student, their ideas, and their reason for writing in any of this? If our students were writing for a purpose, about things they care about, with audiences that matter, then those "academic" aspects could help achieve their goals. But when we focus on the "academic" aspects and ignore the reason and purpose for writing, I don't think it works at all. It actually turns off and discourages our students from writing for themselves. It focuses on the "performance" of writing for an assignment, instead of the "purpose" of writing for oneself (and others).

Plus, by focusing on "common assessments" and "MAP data", we ignore something else Miller had to say,
That's writing in the 21st century. It's multiply authored, it's multiply produced, and that's where English is going.
Multiply authored. Multiply produced. We, of course, would call that cheating on a common assessment or on MAP testing.

Finally, I would point out that none of these implementation strategies seem to involve developing the adults in the building as better writers (composers).  There appears to be no effort to ask staff to compose on a regular basis, or participate in any "writing in the 21st century" as Miller puts it, or to improve their skills and abilities in this area. If we don't model for our students, then we are not only missing an opportunity, but are pretty darn hypocritical.

Goal 2: To decrease the number of students that opt out of testing
Okay, read that again.

Yep, that's what it says. 50% of our goals as a school for how to help our students learn and grow focus on getting more of them to take state-mandated tests. This totally flabbergasts me. In fact, I hesitated to write this post because this goal reflects so poorly on my school (and, selfishly, on me). This is not only not a worthy goal, it's flat-out embarrassing.

And then there's the way we are going to achieve this goal:
More "Data Days." More class time practicing standardized test items. More time spent trying to convince our students and their parents that these tests (and "test-taking practice") are helpful to them. Less time spent actually helping our students to learn and grow. (As a side note, because of a new state law, districts were required to formalize the opt-out procedure. Our district did so (pdf), but note the attempt to coerce the parents by making them feel guilty. Even worse, when parents did opt out, we sent them a follow-up letter trying to make them feel guilty again and encouraging them to change their minds.)

I think our UIP is a pretty good demonstration of "Goals Gone Wild":
  • The harmful side effects of goal setting are far more serious and systematic than prior work has acknowledged.
  • Goal setting harms organizations in systematic and predictable ways.
  • The use of goal setting can degrade employee performance, shift focus away from important but non-specified goals, harm interpersonal relationships, corrode organizational culture, and motivate risky and unethical behaviors.
  • In many situations, the damaging effects of goal setting outweigh its benefits.
  • Managers should ask specific questions to ascertain whether the harmful effects of goal setting outweigh the potential benefits.
It also is a pretty good example of Campbell's Law:
The more any quantitative social indicator (or even some qualitative indicator) is used for social decision-making, the more subject it will be to corruption pressures and the more apt it will be to distort and corrupt the social processes it is intended to monitor.
Even when Campbell's Law doesn't rise to the extremes that we saw in the cheating scandals in DC, Atlanta (and who got punished 1, 2) and elsewhere, it has a truly corrosive effect on the culture of learning (1, 2) in our schools.

I'm not questioning whether the folks who wrote our UIP were well-intentioned - they were. Or even whether they thought these goals would be good for kids - they did. But if we're going to be required to take the time to complete this plan, I think we should spend a lot more time thinking about this from our students' perspectives, about what it means to be a good writer (composer), about whether participation rates in state-mandated tests are a metric that is useful for our individual students, and about what it really means to be educated and literate in 2016 and beyond (1, 2, 3).

Tuesday, April 19, 2016

The True Cost of Testing: Part 2

My last post on The True Cost of Testing generated a couple of follow-up questions I thought I might address.

One of those questions was: what would we do to replace final exams? Don't we need some kind of assessment at the end of a semester/year? Maybe, maybe not, but if you do believe we need some kind of assessment at that point, I suggest an alternative. As I wrote a while back, I think we should replace final exams with conferencing with students. Meeting with them individually and actually talking with them to see what they know, what they are still struggling with, and what they would like to do next would be much more valuable for the student than a final exam.

Another question focused on what we should do with that freed-up time. We have about 175 instructional days with students but, as discussed in the last post, we lose about 17 or so to testing, which means we really have something more like 158. So one possibility is simply to give those days back to instruction. That effectively adds 11% (17/158) more days to our school year without actually increasing the number of days or spending any more money. That's an idea that everyone from Bernie to Donald could presumably support.

Alternatively, we could stick with 158 days. That would save us about 10% of our current budget, so that means we could hire 10% more teachers. At my school, that equates to about 11 more teachers, which could either translate to lower class sizes or additional offerings (or both). Which is better for students, 17 days of testing, or 158 days of instruction with 11 more teachers in the building?
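The back-of-the-envelope figures in the last two paragraphs are easy to verify; here's a quick sketch using only the numbers from the post (175 instructional days, 17 lost to testing):

```python
# Checking the "about 11% more instruction" and "about 10% of the budget" claims.
instructional_days = 175
testing_days = 17

remaining = instructional_days - testing_days      # days actually spent on instruction
gained = testing_days / remaining                  # extra instruction if we reclaim them
budget_share = testing_days / instructional_days   # share of the year (and budget) freed

print(f"{remaining} days; +{gained:.1%} instruction; {budget_share:.1%} of the year freed")
```

The 17/158 ratio works out to 10.8%, which the post rounds to 11%, and 17/175 is 9.7%, the "about 10%" used for the hiring estimate.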

Or perhaps we don't spend that saved money on additional teachers. Since we currently have 158 days of instruction that we're clearly satisfied with, then perhaps we use those 17 days differently. I'd suggest that the 2,150 students and 150 staff members participate in various forms of community service. Think what we could accomplish in our community with 2300 people, 17 days, and $2 million. Think also what the students could learn.

This undoubtedly would take many, many forms, some of them costing no money, just donated time, and others taking both time and some money, but let me give just one example. What if we worked with Habitat For Humanity? With $2 million, 17 days, and that many volunteers, I think we could easily build 10 houses for families in need in our community. That's per year. Just from my school. Not only would it provide desperately needed affordable housing, but think of all that the students would learn in the process. It could very much be an apprenticeship model, with students doing good while learning.

Even with 10 simultaneous houses going up, and even if students and teachers were split into 3 shifts a day at 6 hours each, that still wouldn't take all 2300 of us, so there still would be plenty of other volunteer opportunities going on at the same time for those 17 days. Maybe reading with students in elementary schools, maybe tending a community garden, maybe visiting seniors living in assisted living. And, obviously, we're only limited by our imagination in terms of finding activities that benefit the community while simultaneously teaching our students valuable skills. Part of the learning process would presumably be the students researching what the best use of that time might be.

So, once again, what would be the impact on the culture of learning in our schools? What would be the message we send to students (and teachers) of what we value and who we are serving?

Saturday, April 16, 2016

The True Cost of Testing

Next week my school is giving the state-mandated testing to our freshmen, sophomores and juniors. Colorado, like many states, has made some changes this year, and they are definitely an improvement. The overall amount of testing has decreased somewhat, and sophomores will no longer be taking the PARCC exam but instead will be taking the PSAT. While I'm not a fan of the PSAT and SAT either, they are at least somewhat useful to some students.

When the news was announced in December, the focus was primarily on three things: the reduced amount of testing, the switch from the ACT to the SAT, and the timing of the announcement. What was missing was much discussion of the merits of testing in general and the cost (both direct and indirect) of the testing. For this post, I'm going to just focus on cost.

I've so far been unable to find anywhere on the Colorado Department of Education site the cost to the state of these state-mandated tests. (I'm sure it's there somewhere, I just haven't found it yet.)  This article seems to indicate that the current ACT for juniors costs about $2.1 million a year, and they are budgeting $1.8 million for the sophomore exam, plus an additional $432,000 for juniors who want to take the writing portion. The cost for PARCC/CMAS (9th graders and 11th graders in science) is harder to figure out, but this article from about a year ago shows the state will pay Pearson about $27 million for PARCC and CMAS. If we assume that the high school portion of that is about 10% (very rough estimate), then add in another $2.7 million. So, somewhere around $7 million in direct costs to the state. That figure, of course, doesn't include the indirect costs of staffing, materials, time, etc., nor does it include the same types of indirect costs to school districts.
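Summing the figures above (with the admittedly rough assumption that high school is about 10% of the Pearson contract) gives the roughly $7 million estimate:

```python
# Rough direct costs to the state, in $ millions, from the figures in the post.
act_juniors = 2.1              # current ACT for juniors
psat_sophomores = 1.8          # budgeted sophomore exam
junior_writing_option = 0.432  # juniors opting into the writing portion
parcc_cmas_hs = 0.10 * 27      # assumed ~10% high-school share of the $27M contract

total = act_juniors + psat_sophomores + junior_writing_option + parcc_cmas_hs
print(f"~ ${total:.2f} million in direct state costs")
```

That lands at about $7.03 million, before any of the indirect costs to the state or to districts.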

But even what little focus there has been on that really quite large sum of money then ignores the opportunity costs that are ultimately paid by school districts (and kids). Let's look at my building as one example. Next week we will spend three days on state-mandated testing. While we run an abbreviated schedule in the afternoons, most folks will acknowledge that the classes held during that time are not optimal for learning. The students who took the tests in the morning are tired, and some students who do not have to take the tests in any given morning choose not to come to school just for the classes in the afternoon (definite surge in our absentee rate). Given all that, many teachers make the reasonable decision to limit their instruction during this time to less critical matters. Not that we don't try to make the time worthwhile (we definitely do), but it's tough to try to reach the same level of learning as in a typical day. So, for me, I consider those three days pretty much lost for instruction. (If you disagree, you can pro-rate the numbers I'm about to share accordingly.)

So what does this "cost"? Well, according to district budget documents, we spend $9597 per student per year (that's including federal, state, and local funds; the amount directly from the state is less than that). We have about 175 days of "instruction" (theoretically), so $9597 divided by 175 works out to just under $55 per day per student. Since we have roughly 2150 students at Arapahoe, a single day of instruction costs roughly $118,000. That means that for each and every day of instruction we choose to "give up," we are "forfeiting" that money. So each day of testing is costing us $118,000, or roughly $354,000 for the three days of state-mandated testing. (Keep in mind that does not include the pro-rated cost of the $7 million the state is directly paying, or the indirect costs to the state and especially the districts that I'm sure adds several million more.)

But, sticking to just the lost instructional time, we're now at $354,000 (just at my school). But there's more, of course. We currently choose to take a day of instruction in the fall to give the PSAT to all juniors. (Yes, despite the fact that we're now going to be giving it to all sophomores in April, we are still going to turn around and give it to all of them again in October when they're juniors. Why? National Merit.) So that choice means we're deciding to spend another $118,000 on testing. We're now up to $472,000 (just at my school).

But, of course, there's still more if you don't limit it to state-mandated testing. What other types of testing do we have at my school? Well, we give MAP testing in language arts and mathematics to our students. That's a bit harder to quantify in terms of cost, since we don't devote parts of entire school days to it. Instead, students are tested in their language arts and math classes (twice a year in 9th and 10th grade, just once a year in 11th I think). Making a very rough estimate again, I'll say that equates to about half a day per year per student, so $59,000. We're now up to $531,000 (just at my school).

We also have many students who take AP exams at the end of the school year. When students take an AP exam, they not only miss the 3 or so hours they are writing the exam, but they often miss the entire school day as they are pretty exhausted. There are certainly many arguments in favor of the usefulness, importance, and value of AP exams, but there are also arguments against. No matter which side you fall on, certainly those days are not available for instruction for those students. (And even for those students who are not taking an AP exam, teachers adjust what they are doing in class because so many students are missing due to the AP exams.) I don't really have the data to completely quantify this, but our students write close to 900 AP exams in a given year, so if we take 900 times $55 per day per student, that adds $49,500. We're now up to $580,500 (just at my school).

My school also requires final exams each semester. We devote four days each semester exclusively to final exams, so eight days throughout the school year. While there are certainly some folks who will argue that final exams are useful, necessary and important, there are also arguments that they are not. Whichever you believe, they certainly don't much resemble instruction. So eight days times $118,000 adds another $944,000, so we're up to $1,524,500 (just at my school). Since many teachers also take at least one day to review for the final exams, you could perhaps add in more here (although that review has some instructional value, so I'll leave that out for now).

Then you add in the individual tests that teachers give. This is even murkier territory, since I do believe assessment - when it is done well - is very valuable, and how teachers give these assessments varies tremendously. But certainly there are a fair number of teachers who give "unit" tests multiple times a year that take an entire class period. That's time that is no longer available for instruction, so there is an opportunity cost associated with it. Let's make a conservative estimate and say that each student loses 4 days a year cumulatively to these tests. That adds another $472,000, so we're up to $1,996,500.

Now, that's waaaaay too many digits of precision, so let's just say $2 million as a rough estimate. We spend $2 million a year on testing . . . just at my school.
Two. Million. Dollars.
We can debate my estimates and I'll freely admit that I'm just ballparking all of this, but at least it gives us a place to start the discussion. If we ignore the $7 million the state spends in direct costs, and if we ignore the additional millions the state and school districts spend in indirect costs, and just focus on what my high school spends on testing each year, $2 million is a good number to work with.
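For anyone who wants to check (or argue with) my running total, the whole back-of-the-envelope calculation fits in a few lines. Every number here is my own rough estimate from above (per-pupil funding, day counts, enrollment), not an audited figure:

```python
# Rough testing-cost estimate; all inputs are the post's ballpark numbers.
PER_STUDENT_PER_YEAR = 9597   # total per-pupil funding ($)
INSTRUCTIONAL_DAYS = 175
STUDENTS = 2150

student_day = PER_STUDENT_PER_YEAR / INSTRUCTIONAL_DAYS  # just under $55
day_cost = round(student_day * STUDENTS, -3)             # roughly $118,000

costs = {
    "state-mandated testing (3 days)":        3 * day_cost,            # $354,000
    "PSAT for juniors (1 day)":               1 * day_cost,            # $118,000
    "MAP testing (~1/2 day per student)":     0.5 * STUDENTS * 55,     # ~$59,000
    "AP exams (~900 student-days)":           900 * 55,                # $49,500
    "final exams (8 days)":                   8 * day_cost,            # $944,000
    "teacher unit tests (~4 days/student)":   4 * day_cost,            # $472,000
}

total = sum(costs.values())
for item, cost in costs.items():
    print(f"{item:42s} ${cost:>12,.0f}")
print(f"{'TOTAL':42s} ${total:>12,.0f}")  # right around $2 million
```

Swap in your own assumptions (fewer lost days for AP, no review-day cost, etc.) and the total moves, but it's hard to get it much below seven figures.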

That's the cost of testing.

Yet even that isn't the true cost of testing, or at least not the total true cost of testing. My daughter is a sophomore this year, and in her language arts and math classes they have been doing some practice PSAT items. They are not spending a lot of time on this, but they are spending some. And just because they are practicing for the PSAT doesn't automatically make it a poor use of time; the skills they are practicing may (or may not) be valuable.

But I think we have to acknowledge that in addition to the actual time spent testing, we are impacting what we do in our schools. Even if you believe those practice items are valuable, keep in mind that those items change each year as the tests change. We used to do CSAP practice items, then TCAP, then PARCC, then ACT, and now PSAT and SAT. While those are certainly related, each time the test changes we change the prep we do. I think it's awfully hard to argue the high ground here about how valuable these items are when they keep changing based on which test we're giving.

And it doesn't just influence those test prep items. We change what we do in our classes based on these tests. From major changes like adjusting the entire curriculum, to more minor changes like materials selection and the emphasis we place on different topics within that curriculum, the current test ends up driving a lot of what we do (even if we don't want to admit it). Again, that doesn't necessarily mean that the things we are doing are bad, but I think we need to be honest and acknowledge why we are doing them. The question we need to ask is what would we choose to do with our students in the absence of those tests? Instead of trying to do things better, we should do better things.

It's not just the time (and dollars) spent on actual testing, it's the impact on the culture of learning in our schools. It's the message we send to students (and teachers) of what we value and who we are serving.

That's the true cost of testing.

Thursday, March 31, 2016

Deadlines Are For Kids

Anyone else remember "Trix are for kids"? Well, I'm becoming more and more convinced that, today, deadlines are for kids.

I hear endless discussions about deadlines and due dates. About how we need to teach kids responsibility. And how they need to have "consequences" when they don't turn in their assignment on time. (Never mind that both the assignment and the deadline are often pretty arbitrary, but I digress.)

So, let me share just a few examples from the last couple of weeks in my neck of the woods.
  1. My school district is implementing a new unified password system. As part of that, staff members have to answer 5 challenge questions in advance so that, if they need to change their password in the future, they can do it themselves via a web interface. The district sent out very detailed instructions for how to do this, a process that takes less than 4 minutes. A couple of weeks later 40% of staff still hadn't done it, so they sent me a reminder to send out to my staff, which I did. The "due date" was March 18th. Two weeks later (including a week off for spring break), and I've had multiple staff members come up to me and say, "Sorry, I didn't get to that, can I still do that?"

  2. We're 12 weeks into the semester. In one particular class there are 3 grades in the grade book: Week 1 participation, Week 2 participation, and a grade from January 15th.

  3. In another class, they took a test on March 18th. No grade (and, more importantly, no feedback, as of yet). There are also three earlier quizzes that still aren't in the grade book.

  4. In yet another class, last grade is from February 28th.

    With all three of these classes the problem isn't so much the lack of grades (although that is still problematic when students are held accountable for their grade), but the lack of feedback. How can students learn from their work if they don't get timely feedback?

  5. We have a monthly newsletter for parents with articles that are submitted by many different staff members. It is created in Publisher, then converted to PDF, and posted on our website. (I know, I know. Monthly. PDF. But this is progress; keep in mind that until two years ago we printed and mailed 2400+ copies of this 15-20 page newsletter each month.) I'm not sure the newsletter has ever been done on time. And, pretty much every month, there are at least two or three corrections that have to be posted because of mistakes (typos, incorrect information, etc.).

  6. We're trying to get some new computer science courses approved in my district. Part of the process is to present the info to a committee in my building. The week before the meeting I sent all the information to them. Admittedly, there was a lot of information so, anticipating that might be too much, I specifically pointed them to the one (page-and-a-half) document they should definitely look at which summarized the courses, and then they could drill deeper if they really wanted to. I get to the meeting and it appears as though none of them have looked at it. They ask me to write the courses on the white board so they can see them.

Do I ever miss, or stretch, or forget about deadlines? Sure, of course I do. But I also don't enforce (often arbitrary) deadlines for students, then scold and punish them (including docking their grade) when they don't meet them. That's not to say that I don't think deadlines can have meaning or importance, but it means that I think we need to keep in mind the bigger picture of what we're doing here.

So my question is, for all the teachers who are represented in numbers 1-6 above, what are you going to do the next time a student misses a deadline?

Tuesday, January 05, 2016


I think sometimes we use words assuming they mean one thing when they actually mean another. Much like accountability, I think "tradition" is sometimes one of those words.

When schools use the word "tradition" they usually mean it in the sense of school pride, a kind of institutional memory of the things the school has done well and wants to continue to do well. But I worry that - in practice - our use of tradition ends up meaning something quite different from that.

Here's how Merriam-Webster defines tradition:
  1. a: an inherited, established, or customary pattern of thought, action, or behavior (as a religious practice or a social custom)

    b: a belief or story or a body of beliefs or stories relating to the past that are commonly accepted as historical though not verifiable
  2. the handing down of information, beliefs, and customs by word of mouth or by example from one generation to another without written instruction
  3. cultural continuity in social attitudes, customs, and institutions
  4. characteristic manner, method, or style
While there's nothing here that particularly conflicts with how schools define tradition, there's also much here that the school-ey definition ignores. Words like "inherited", "customary", "cultural continuity", and "customs" all share a common element: they are accepted without thought. It doesn't matter whether the idea is good, bad, or in-between; because it's "tradition," it's what we do. Too often tradition ends up being - in practice - translated into "because we've always done it this way."

I know that folks who like using the word tradition would protest that that's not what they mean. What they mean is all the "good" traditions that we have, all the things we're proud of and that we think define us. But the problem is that tradition is more than that. If there are certain traditions that we are referring to, we should be specific and enumerate them, because otherwise we are endorsing all the traditions that we have.

For example, my school is (justifiably) proud of our relationship with the Arapaho Nation. Since 1993, we have had on-going dialogue and interaction with the Arapaho on the Wind River Reservation in Wyoming, including taking a busload of our students up to the reservation, and a large contingent of Arapaho coming to our school for an assembly and to visit our classes. But we should also acknowledge that our "tradition" includes the 29 years prior to that when we didn't have the relationship, when our mascot was a caricature of an Indian (Native American) that we found out from the Arapaho in 1993 was actually closer to a Pawnee, and which (in addition to being offensive in and of itself) we used in offensive ways (like putting it on the floor of the gym, offensive chants, etc.). What if in 1993 we had argued that "tradition" supported keeping our original mascot? (Which is the argument that many others are currently making, from high schools across the country to the professional football team in Washington D.C.)

To be clear, we have many great traditions at my school and we have done many things well in the past. But that was the past; we should be asking ourselves what we should be doing for our students today, and tomorrow, and for what the world is going to look like when they are our age. As John Dewey famously said,
If we teach today’s students as we taught yesterday’s, we rob them of tomorrow.
That doesn't mean we just "throw everything out" or that we "don't respect our traditions". But it does mean that we can't blithely use the word tradition to unthinkingly continue on the way we've been going. It brings to mind the song from Fiddler on the Roof:
Hodel, oh Hodel, have I made a match for you.
He's handsome! He's young! All right, he's 62.
But he's a nice man, a good catch. True? True!
I promise you'll be happy. And even if you're not,
There's more to life than that. Don't ask me what!

(italics mine)
Don't ask me what. Don't question. Don't think about it for yourself. Don't spend the time to figure out what you truly value. Just do it because it's tradition.

We can do better.