‘The Mirage’ and Teacher Learning

Originally posted on my EdWeek Teacher blog, Road Trips in Education, on 8/10/15.


It’s back-to-school time, and teachers’ first days back often include some professional learning time, for better or for worse. In my case, some of that PD time and money will be invested in technology training in our district’s learning management system, but according to a recent policy report, that training won’t be a worthwhile investment unless I end up with a better teacher evaluation this year, or my students’ test scores go up.

The recent policy report on teachers’ professional development from TNTP has raised some eyebrows in the past week. (TNTP stands for The New Teacher Project, though they’ve wiped that name from most of their website and it appears nowhere in the report). It’s no surprise to teachers and many others in school systems that there’s a lot of ineffective or marginally effective professional development out there, but TNTP suggests in “The Mirage” that the problem is much worse, and more expensive, than we realized. I think everyone involved in education wants to see teacher learning improved, and TNTP is not off-base in some of its suggestions about improving school systems and the teaching profession. And yet, the overall tone of the report seems insulting, as it suggests that we’re currently wasting a lot of money in failed attempts to help teachers who mostly lack self-awareness, vision, ambition, and the capacity for improvement. They’re just trying to help, though, so we’re supposed to overlook the insult.

In TNTP’s study of three large, unnamed school districts and one unnamed charter management organization, the first detail that jumped out for me was the finding that these districts spend an average of $18,000 per teacher, per year, on teacher development.

I was having trouble imagining that kind of spending on professional development, so I started playing with the numbers a bit. For convenience’s sake, let’s think about a 180-day school year – a pretty close estimate in most cases, and it makes for some easy math. That’s a PD budget of $100 per day. $500 per week. Around $2,000 in a full month. It sounds like an outrageous amount. If you deliver a one-day training for 100 teachers on that budget, you have $10,000 to work with.
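
Just to check the arithmetic, here’s a quick back-of-the-envelope sketch in Python. It only re-runs the hypothetical numbers above; nothing in it comes from the report itself.

```python
# Back-of-the-envelope check of the per-teacher PD figures above (hypothetical numbers).
annual_pd_per_teacher = 18_000   # TNTP's reported average, per teacher per year
school_days = 180                # convenient approximation of a school year

per_day = annual_pd_per_teacher / school_days
per_week = per_day * 5
per_month = per_day * 20         # roughly a month of school days

print(f"Per day:   ${per_day:,.0f}")     # $100
print(f"Per week:  ${per_week:,.0f}")    # $500
print(f"Per month: ${per_month:,.0f}")   # $2,000

# A one-day training for 100 teachers at that daily rate:
print(f"One-day training, 100 teachers: ${per_day * 100:,.0f}")  # $10,000
```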

I work in a district that invests heavily in professional development. I tried a quick estimate, adding up a year’s worth of conference registrations and travel expenses, substitutes, my pay for district professional development days, and a generous $100/day for materials and incidental costs. For good measure, I threw in the costs of some available PD that I haven’t actually taken advantage of. I had trouble generating a number above $5,000, even rounding up at every step.

Clearly, they must have included spending I wasn’t thinking of as “teacher development,” so I took a closer look at the TNTP report. How did they come up with $18,000? They acknowledge that there are many ways you could calculate the costs of teacher development, and so they actually generated three estimates per organization, “ranging from the most conservative definition of teacher improvement spending to a broader approach that considered anything that could be interpreted as teacher improvement efforts in each district” (41). A chart near the back of the report offers some additional, though at times unclear, information. For my purposes, I’ve focused only on the low and mid-level estimates, since that’s the source of the $18,000 figure.

The key takeaway regarding TNTP’s methodology is that they include personnel costs (salary and benefits) for anyone who can be linked to teacher development; we’re talking about significant percentages of teacher and administrator compensation, along with the entire cost of instructional coaches, and anyone who trains or supports administrators or coaches. In describing what’s included in the lower and mid-range estimates, the report lists “School leader time for meetings with teachers for improvement (not evaluation-related); other school-based support staff time on direct teacher improvement efforts” (42) – in plain English, that seems to refer to what we pay administrators and coaches any time they’re talking to teachers about teaching. Another huge chunk comes in the form of “lanes spending” – the portion of teacher salaries that comes from movement across the salary scale. So if your base salary is $50,000 and you earn $54,000 due to post-graduate credits, there’s the first $4,000 of teacher development spending. Then, TNTP estimates that 10% of teacher contract time counts as teacher development time (9), so there’s the next $5,000. Now I see how it starts to add up. The appendix of the report is not detailed enough to reassure us that there’s no “double-dipping” – i.e., when they calculated the cost of teacher development as a percentage of contract time, did they include the “lane” spending, or only the base salary?
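
To see how the report’s broader definition might actually reach $18,000, here is a purely illustrative tally in Python. Every line item and dollar figure below is my own hypothetical guess at the kinds of categories TNTP describes, not a number taken from the report.

```python
# Hypothetical illustration of how a broad definition of "teacher development
# spending" could approach TNTP's $18,000 figure. None of these line items
# come from the report; they are placeholders for the categories it describes.
base_salary = 50_000

hypothetical_components = {
    "lane movement (post-graduate credit raise)": 4_000,           # $54,000 - $50,000
    "10% of contract time counted as development": 0.10 * base_salary,
    "share of instructional coach salary and benefits": 3_000,
    "administrator time spent on teacher improvement": 2_500,
    "evaluation-related time and systems (mid-level estimate)": 2_000,
    "workshops, conferences, substitutes, materials": 1_500,
}

# Note: if the "lane" increase is also part of the salary used for the 10% line,
# some of this spending would be counted twice -- the double-dipping question above.
total = sum(hypothetical_components.values())
for item, cost in hypothetical_components.items():
    print(f"{item:<55} ${cost:>8,.0f}")
print(f"{'TOTAL':<55} ${total:>8,.0f}")   # roughly $18,000
```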

At the mid-level cost estimate, TNTP also included “direct and indirect time related to teacher evaluation” (42). Unfortunately, teacher evaluation has long suffered from too much focus on compliance, and not enough on growth and development. So, while it makes sense in a theoretical, idealized way to include evaluation costs in professional development, I don’t know that this methodology will resonate with the actual experiences of most educators, at least after their first few years. And to clarify, “Indirect Personnel Spending represents staff that manage direct teacher improvement efforts or spend time providing strategic or operational support to teacher improvement efforts” (46, emphasis added). Since student test scores are now used for evaluative purposes in many districts, it would seem that “teacher development” costs in this report also include paying the district data analysts and assessment specialists who are working the VAM-voodoo behind the scenes. By the way, the report uses the words “indirect” or “indirectly” 26 times: is that thoroughness, or an attempt to inflate the numbers?

The TNTP mid-range cost estimate also includes teachers’ “Contracted time, survey time estimates for formal collaboration” (42). It would be nice if collaboration always produced teacher development. I suspect that, too often, it just means meetings to plan lessons and units, coordinate calendars and assessments, review materials and resources, etc. There’s also the cost of “teacher evaluation non-personnel expenditures” (42), meaning… print materials relating to evaluation? The principal’s iPad? Information and data management systems? All of the computers and office furnishings for central office staff who deal with teacher evaluation? 

Nancy Flanagan and Peter Greene were out ahead of me in critiquing “The Mirage,” and have done a fine job highlighting the problems in TNTP’s assumptions about teachers, teaching quality, and value-added measurement. If you’re interested, please read their posts – I won’t re-hash them here. However, I have one critique to add regarding the use of extreme extrapolations to dramatize a point. We saw the same method in the much-debated value-added research of Chetty et al.: if you want to make a small difference sound big, find ways to multiply it. Their study argued that an effective teacher could affect a student’s future earnings, but it sounds more impressive if you multiply the effect by every student in the class over a lifetime of earnings: that way, instead of percentages that sound pretty small, you’re talking about millions of dollars! Well, TNTP used limited data from a few districts over a short time period to find little impact from professional development, but rather than consider that their model might not be robust enough to capture teacher improvement and student learning, they extrapolated their very limited data another thirty-plus years to argue that the current situation is rather hopeless: “And unfortunately, it is likely almost impossible for the average teacher to become ‘highly effective’ in some key instructional skills, based on current growth rates. In one district, for example, it would take the average teacher 31 years–potentially an entire career–to become ‘highly effective’ at developing students’ higher-level understanding; it would take 33 years for the average teacher in another district to do so. And for a teacher in another district in their sixth year of teaching or beyond, it would be nearly impossible to reach ‘highly effective’ in skills like using questioning and discussion techniques and designing student assessments” (16). We’re never going to improve!
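
To make that multiplication trick concrete, here is a toy calculation in Python. The figures are invented purely for illustration; they are not Chetty et al.’s actual estimates.

```python
# Toy illustration of how multiplying a small per-student effect produces a
# headline-sized number. All figures here are invented, not from Chetty et al.
per_student_lifetime_gain_pct = 0.01       # a hypothetical 1% bump in lifetime earnings
hypothetical_lifetime_earnings = 1_000_000
class_size = 25
years_teaching = 30

per_student_gain = per_student_lifetime_gain_pct * hypothetical_lifetime_earnings
per_class_gain = per_student_gain * class_size
career_gain = per_class_gain * years_teaching

print(f"Per student: ${per_student_gain:,.0f}")   # $10,000
print(f"Per class:   ${per_class_gain:,.0f}")     # $250,000
print(f"Per career:  ${career_gain:,.0f}")        # $7,500,000 -- suddenly "millions"
```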

Or, maybe if we worked at charter schools…

“The fourth school system we studied is a midsize charter management organization (CMO) operating across several cities. This CMO takes a markedly different approach to teacher improvement than the other districts we studied. While they have not solved the problem of teacher development entirely–and given the CMO’s size, it is important to note our limited sample sizes here–their results seem promising, and point to several strategies other districts might consider as they reassess their efforts to help teachers improve” (30). The report goes on to say that the students at the unnamed charter schools have both higher raw test scores and higher value-added scores. Never mind the details about which schools and communities, the characteristics of the students or teachers, the magnitude of the difference, or whose data and formulas support those claims. Because this isn’t actual scholarship, and independent verification isn’t necessary.

And there is an agenda. Though TNTP is a non-profit organization, they are revenue-generating, and they’re selling solutions to the problems they identify; yes, states and districts can hire TNTP to improve teacher development. It’s also a familiar stance for TNTP to suggest that teacher training and preparation are broken. For example, in a prior report, TNTP reported the results of a survey of 117 “irreplaceable” teachers, and found that teacher preparation programs and school-based professional development had the weakest support among these decorated teachers when they were asked what made them better teachers: school-based PD had a 60% favorable rating, and teacher preparation had a 57% favorable rating. (For contrast, all teachers said their own teaching experience helped them improve, and over 90% said that observation of other teachers and direct feedback were also helpful.) Still, most of the “irreplaceables” did actually find value even in these least-valued kinds of training. TNTP’s authors described that 60% agreement this way: “Our respondents have not found formal professional development at their school especially helpful, either: 40 percent disagreed or strongly disagreed that it had helped them improve. Only teacher preparation programs drew higher rates of disagreement” (12). I can’t decide if that’s brazen editorial spin, or a complete lapse in reasoning. In either case, that’s who we’re dealing with.

Learning Forward, the leading national professional association for teacher learning and professional development, reacted to “The Mirage” by saying, essentially, we’re not the problem; we’re the solution. Other than individual teachers and bloggers, I haven’t yet seen any strongly negative reactions to this report, and there’s much in “The Mirage” that I think most educators could agree upon, regarding the importance of professional learning and the fact that we’re not doing it well enough. Yes, we do need more individualization, focus, coherence, and follow-through. Given TNTP’s history, though, I’m left with unsettling concerns about their faith in test scores as the key measure of quality teaching, and the report’s implication that most public school teachers and leaders don’t really know what they’re doing because they haven’t been given clear enough data, a vision of high expectations, or “metrics” to track their progress.
