Over the last few years, as LifeReady has taken root at McDonogh School, I’ve received lots of questions about what, in fact, LifeReady is and how we might measure its success. To clarify my own thinking and to answer those questions, I wrote this paper. It is very much a working paper — indeed, it still needs accurate footnotes and references, but I’ve tried to indicate all of my sources as I’ve drawn on them.
I’d love your feedback, either in the comment section or by direct message.
The question understandably and rightfully comes up: how do we know LifeReady teaching and learning is effective? How do we measure it? Why and how can we invest in LifeReady?
To answer these questions, we must first understand what LifeReady is and what it isn't. We must also confront the fact that few independent schools have ever tracked their impact reliably. We are an unregulated enterprise, and while we have terrific anecdotal evidence that we have most certainly had an impact on students, few or none of us have accurate, scientific measurements of teaching and learning. And then we must propose a way to collect meaningful data about teaching and learning in our school while protecting the autonomy that attracts and retains the best faculty in the world. The challenge for McDonogh — and for any great school — is to honor the best practices that have always distinguished the school while evolving to meet the demands of change and disruption.
Beware of the False Choice
LifeReady is often misunderstood, and, as has been evident in recent years, people sometimes make assumptions rather than ask questions about what it is. LifeReady is a plan — a public promise — to help McDonogh School investigate, design, maintain, and measure the strongest, most life-affecting academic program it can. McDonogh sees its program as having an impact for life, with college placement only one (though significant) event in that life. LifeReady is aspirational and inspirational. It holds school leadership accountable for maintaining the trust its families have always placed in the school while innovating in ways based on research, trial, feedback, and common sense.
A frequent misperception about LifeReady is that it stands in opposition to the kinds of teaching and learning that have characterized McDonogh School for nearly 150 years. This is patently false. Indeed, much of the work we've undertaken in the last several years honors strong pedagogical practices that have always been productive for student learning: deep discussion, projects, field work, writing, presentations, lab investigations, strong teacher-student relationships. But LifeReady has also allowed us to examine less-productive practices: rote learning, "transmission" teaching (i.e., the teacher talking, students only listening), recall testing, knowledge acquisition without context, and exclusively summative assessments.
How do we know that some approaches to teaching and learning are "less productive"? We have consulted the most recent research on learning and the brain. While this paper doesn't allow for a full review of that research, these studies — all of which share stable findings — review the literature and provide recommendations for classroom practice (cf. Willingham, 2009; Hardiman, 2012; Brown et al., 2014; Carey, 2014; Ambrose et al., 2010). We know, to cite one example, that passive learning — i.e., merely taking notes on information transmitted by a teacher — has a very short life-span: most of that information will be forgotten if it's not used (Brown et al., 2014). Students can memorize information well enough to do well on a test, but recall alone is not understanding. This doesn't seem such a surprising proposition, for most people's experience will confirm it. But education in the last two centuries has followed this assembly-line, "Taylorist" model (Davidson, 2017; Rose, 2016); for its time, it was an efficient delivery system for a school vision seeking to educate masses of young people. This mode of education isn't effective, however, when we measure students' understanding in a conceptual, long-term, and transferable way. Nor is it the kind of learning that futurist research identifies as most useful to students (cf. "Dancing with Robots"). LifeReady makes clear that deep, authentic learning is our objective as we guide students toward the core competencies of the Plan, and it has been leading our careful, innovative work so that we can be intentional about the practices that honor this vision.
We do know that the brain likes to learn when it is challenged with questions, real problems, and achievable outcomes (Willingham, 2009). The brain likes to think when it believes thinking will be fruitful. We also know, as Daniel T. Willingham makes clear in his critical study, Why Don’t Students Like School?, that we learn what we think about. “Memory is the residue of thought,” Willingham concludes. This claim — stable across the literature and axiomatic for McDonogh — has led us to invest time and resources in the frameworks and practices of Harvard Graduate School of Education’s research group, “Project Zero.” That group, over fifty years old and home to some of the most famous researchers in education, is producing bleeding-edge research and practice that is a perfect fit for LifeReady’s vision and for an immensely talented and successful faculty. This work has provided rich and sustained professional learning for our faculty, PreK-12, over the last two years and will continue to do so for the foreseeable future.
In addition to understanding the findings of brain research and studying the educational research out of Harvard and other research centers, McDonogh has done a good deal of futurist research in order to frame not only what we teach but how we teach, for students' skills, habits of mind, and behaviors will mean as much to their success as will their acquisition of knowledge. Analyses of future signals and trends make predictions about the kinds of people who will succeed in this century. If we look at just three pieces of research, we find striking similarities in those predictions. The National Network of Business and Industry's "Common Employability Skills" framework lists competencies in mathematics, reading, writing, oral communication, and problem solving (hallmarks of liberal arts learning, which will continue to be the core of McDonogh School's program of study). But this same framework lists planning and organizing, decision making, customer service, and technological understanding alongside more traditional school curricula. Teamwork, adaptability, and initiative are also equal players with these other competencies. A recent research paper prepared by the ACT and Joyce Foundations in conjunction with the Institute for the Future lists the following as essential to success in this century:
- Cross-Cultural Competency
- Social Intelligence
- Virtual Collaboration
- Novel and Adaptive Thinking
- Cognitive Load Management
- New Media Literacy
- Design Mindset
- Computational Thinking
The Association of American Colleges & Universities makes similar recommendations, which can be found in its L.E.A.P. Initiative research. Its Essential Learning Outcomes are described in the following insert:
- Knowledge of Human Cultures and the Physical and Natural World: focused by engagement with big questions, both contemporary and enduring
- Intellectual and Practical Skills: practiced extensively, across the curriculum, in the context of progressively more challenging problems, projects, and standards for performance
- Personal and Social Responsibility: anchored through active involvement with diverse communities and real-world challenges
- Integrative and Applied Learning: demonstrated through the application of knowledge, skills, and responsibilities to new settings and complex problems
What are we to make of these recommendations? First, you can see that the time-honored liberal arts program still stands as the core of what students should learn, and that has been — and will continue to be — true at McDonogh. But the competencies that extend beyond just core content are too consistent across these various sources to ignore. How do we teach teamwork, ethical reasoning, problem solving, and integrity?
The answer lies less in what we teach and more in how we teach — and how we assess the impact of our teaching. What we might call “traditional” teaching — i.e., the passive transmit-recall-test model — might expose students to information and ideas and procedures, but it doesn’t do much else, unfortunately, and the brain science argues against its long-term retention.
The good news is that as we re-frame our liberal arts curriculum with research-based frameworks and practices, we are able to deliver the same content but in ways that improve student retention, understanding, and transferability. Additionally, when students work in groups, tackle problems in teams, and engage in authentic assessments of understanding, we are able to have them practice the “soft skills” so consistently called for in futurist research. And so this re-framing of curriculum — this shift of pedagogical approach (i.e., how teachers teach) — isn’t a categorical break with the past; rather, it’s an evolution of practice that honors the best of what we’ve always held true in education as we find ways of preparing students for the world they will inherit.
But How Do We Know Our Impact?
First of all, we still assess students as we have in the past, and we still give letter grades for assignments. That hasn't changed. But quantitative evaluations of this sort, while easier for the teacher to grade, don't always tell us what students really understand. In other words, how do we know that a student understands and makes sense of the operation 2 + 2 = 4? She could certainly memorize this without really understanding it. And so we give different problems on a test: 3 + 5 = ?. Now, this way of testing can reveal a student's thinking. If she answers 8, then we have some indication of understanding. But most of us have experienced the kind of test for which we crammed information — definitions, dates, and so on — so that we could prove recall when, in fact, there was no indication that we understood those things. "A metaphor," one might write, "is a comparison that doesn't use 'like' or 'as'." While that is correct, is it evidence of understanding? In most assessments of this form, we simply do not know. Understanding is most powerfully revealed when we see students use knowledge in new ways and, ideally, new domains. When a student can study metaphor in a unit on poetry and can then use metaphor in a letter to persuade a senator to vote in a particular way on a bit of legislation — that's transfer and, therefore, evidence of understanding.
Two things are worth noting about this example. One might say, "Well, yes, I did transfer work just like that when I was in school. How is this new?" Precisely! Some of what we are calling "new" pedagogical modes are really the very best of what teachers often did instinctively. Not so radical at all. But too often these practices weren't intentional, and too often students' understandings were 1) assessed too late (say, on a unit test or final exam) or 2) not assessed at all — a matching test, for instance, doesn't give us real evidence of understanding: it tends to test recall only.
The work McDonogh has been doing with LifeReady, then, has been to steer teacher practice in the direction of teaching and assessing for understanding. How, we might ask, can we make more use of these kinds of higher-cognitive practices every day with students so that we are making sure they are learning deeply — a core tenet of LifeReady? What kinds of thinking do we want our students to do? How do we know that they’re doing it?
To get at these questions, teachers are continuously experimenting with their practice so that assessing isn’t only something that happens at the end of a unit but that can occur frequently during a unit or lesson. The more visible we can make student thinking and understanding day to day, the more precise can be our teaching and the deeper can be their understanding. This, then, is a far more intentional and rigorous approach to teaching and learning. It also takes care to retain the best of what we’ve always done.
What Are the Challenges to Assessment?
First of all, assessment has always been a flawed, imperfect practice in schools. The only way to really understand impact is through standardized testing in which randomized groups of children with the same preparation are tested under controlled conditions — and over time. Individual classroom assessments certainly don't meet these standards, and the ACT, AP, and SAT are coached, often rewarding those who can afford prep and punishing those who can't.
Independent schools are (mercifully) removed from federal and state regulations when it comes to academic programs, and so we don't have tests like New York State's Regents exams or Maryland's PARCC tests.
What does this mean? We have never had any reliable metric or data that measures the impact of our teaching on students. It doesn't exist. Now, we do have years of graduates and families who can certainly see the impact in their lives and in their children's lives. But we don't have more controlled, disinterested data.
LifeReady seeks to address this at the same time that it explores best practices for student learning in the 21st century. How?
We want to teach for understanding, and we want to be able to assess learning of the habits, skills, and other behaviors required of people for the future. This past school year (2017 – 18), the entire PreK-12 faculty worked in vertical teams to articulate the enduring understandings for every subject area in the school: mathematics, ELA, history, arts, library and media, science, wellness, physical education, Roots, world language. These understandings describe content and skills learning that will prepare students for additional work in each field but also prepare them to transfer these "ways of working/knowing" to other fields of study. (See Appendix 1 for an example of Enduring Understandings for English Language Arts.)
In the coming school year (2018 – 19), the PreK-12 faculty, under the guidance of LifeReady team facilitators, will pursue a single big question: now that we have declared, collectively, the understandings sought in every disciplinary field, what counts as evidence of true understanding? This will lead us to consider ways in which students move beyond merely recalling facts and knowledge so that we can get them to perform understanding. Again, performing understanding takes place whenever students practice their understanding by using skills and knowledge in new ways and in new areas. While "performance" connotes large-scale events, that connotation is misleading; creating a metaphor or an analogy for an idea, for instance, is a performance because it makes understanding visible and calls upon higher-order thinking skills. This kind of assessment practice is really no different from what we expect of athletes or musicians or dancers, who train, are coached, and then perform, whether in a contest on the field or in an exhibition hall.
How Do We Measure Achievement?
Before, during, and after learning takes place, we measure what students can do and understand in their learning. For example, in the lower school, students are asked to compose, say, a paragraph before a unit on writing begins; this provides valuable insight to a teacher, who can then adjust her approach to that student's needs. During the unit, the teacher doesn't merely describe what good writing is; she provides some modeling and then creates ample opportunity for students to practice new skills and understandings. At the conclusion of the unit, the student produces a product (an original piece of writing — a pure performance of understanding), which can then be measured against the original effort so that growth can be shown.
This kind of learning process should not seem new at all, for we engage in this cycle in just about everything in life: football, theatre, industry, scientific research, business. Indeed, KPIs — key performance indicators — are standard whenever new initiatives are measured; this applies to learning as well, for we want to know what those indicators are — i.e., what is the evidence that real, deep, durable learning has taken place?
What Do We Measure?
We measure the understandings we have declared are important for our students. If, for example, we believe that students should understand that "written and oral communication uses a variety of strategies to connect with a variety of audiences," then we must describe what those strategies may include, and we must describe varying levels of proficiency in using such strategies. We communicate these domains and proficiency levels through carefully written rubrics given to students before they embark on a project, and teachers use them to assess where students are in their growth.
How Should We Communicate Growth?
Right now, we communicate growth through report cards and through a few parent-teacher conferences a year; in this way, McDonogh is like most independent schools in the country. But we have the ability to communicate much more evidence of growth to parents and to students. We can do this by having students and teachers keep a portfolio of key performances of understandings (KPUs). These provide concrete evidence of achievement and indicators of places where a child needs to grow.
Currently, a few teams at McDonogh are experimenting with a portfolio system called Lessoncast. This is a customizable web-based platform that can use our PreK-12 alignment understandings, our rubrics, and our LifeReady competencies so that evidence of student learning can be exhibited beside these metrics. In the 2017 – 18 school year, the 4th grade began to use this in conjunction with the Passion Project, a student-generated investigation about something that really matters to them. This project necessarily crosscuts many disciplines and coordinates with the three core LifeReady competencies.
Few schools, if any, can establish their own longitudinal studies to measure real growth. And all external metrics (i.e., AP scores, SATs, etc.) produce data skewed by many factors. Traditional grades, too, lack sufficient explanation to communicate anything truly meaningful. Instead, a great school's best hope is to leverage the expertise of its faculty: to align and create a curriculum of rigor and distinction based on the understandings sought; to create a cache of genuine performance opportunities (aligned to the cognitive rigor indicated by Bloom, Webb, and Hess); to curate growth-oriented rubrics that teachers can share among themselves and with students; and to share out empirical evidence of each student's growth through portfolios and direct communication. We know how parents relish those moments when children dazzle on the field or stage — why wouldn't we seek to do the same in their classrooms?
Appendix 1: English Language Arts (ELA) Enduring Understandings

Students will understand that —