Back to the (educational) future?
In the late 1990s I was taking a graduate-level methodology class for teaching Social Studies. The class met at Roosevelt University’s Schaumburg (IL) campus, just outside Chicago. The instructor did a wonderful job of helping us understand the importance of making history come alive for our students. We learned about developing engaging lessons with clear themes that encourage student creativity. In general, it was one of the most useful and enjoyable graduate classes that I attended.
During one class, the professor shared with us a conversation that he had with an executive at nearby Motorola. The topic of education had come up between the two men, and the exec really laid into his professor friend. “You guys in education have it all wrong.” “How so?” asked the prof. “You teach everyone to work alone, that communicating and sharing ideas is cheating. Once these kids get into the workplace, we need to completely deprogram and retrain them to cooperate and collaborate, to work together effectively.” The professor, duly humbled, shared this powerful conversation with us. It was a lesson that I always remembered as I began my career in the field of education.
Since then, a term has been coined to reflect these objectives in education: “21st century education,” or “21st century learning.” The phrase refers to a new way of approaching education, with a greater emphasis on team building, problem solving, communication, and collaboration than what we saw historically. Instruction has largely shifted away from frontal, lecture-based teaching; modern classrooms are more fluid, placing greater weight on differentiation, student discovery, technology, and cooperative learning.
(As an aside, I think that the term “21st century learning” is presumptuous, if only because there is no way that any educational expert could reliably predict how the educational landscape will develop over the next eight-plus decades. Imagine if a professor had written, as late as 1915, during WWI, about “20th century education,” detailing many of the breakthrough techniques used by contemporary theorists and practitioners of the time. Would the theories and practices detailed therein reflect what educational best practice looked like in the late 1990s? Most likely they would not. Even the revolutionary frameworks of later, mid-century researchers had by then come under question, or at least undergone some form of posthumous revision.)
Without question, these “21st century” qualities are important and belong at the center of how we educate children. I do, however, worry about the implications of 21st century thinking and wonder whether we haven’t begun to throw out the baby with the bathwater.
Recently, I spoke with the principal of a small independent school. We were discussing possible professional development (PD) topics when he expressed a strong desire to “return to basics,” which I (correctly, it turns out) interpreted as an interest in helping his teachers focus less on educational bells and whistles and more on such staples as objective setting and checking for student understanding. A number of other principals with whom I have spoken, and for whom I deliver PD, have expressed similar sentiments in the past year or so. (In fact, my workshop on essential elements of instruction, which borrows largely from ideas developed by Madeline Hunter, is one of my most popular presentations for that exact reason.)
As I look around the field of education, and particularly at educational professional development conferences, I see a near-obsession with educational technology and other 21st century tools. We fly all over the country and beyond to learn all we can about the latest devices, programs, apps, and ideas for integration (such as flipped classrooms). We then rush to implement as much as we can in our schools and classrooms.
Such thinking can at times be misguided. I remember, in the days of Web 1.0, participating in a mini-course in Chicago on how to develop educational WebQuests, a concept then being pioneered by researchers at San Diego State University. The program (called WIT, the Web Institute for Teachers) was a required course for all Chicago Public Schools teachers. In fact, the enthusiasm for WebQuests was so strong that T1 lines were installed in schools throughout the city. At one point, the program coordinator and lead presenter told a group that I was part of that many of these resources were going to waste because the teachers had not effectively bought into the concept.
Of course, buy-in is not the only concern. Even those teachers who view themselves as #edtech junkies must first master the foundational theories and skills of effective instruction and classroom management if they and their students are to benefit from the novel concepts and tools that the burgeoning 21st century has to offer.
It is my strong belief that if we as an educational community are to take full advantage of our ever-evolving “21st century” tools, we first need to commit ourselves to sharpening our understanding and use of such “20th century” (or older) educational staples as Bloom’s taxonomy and objective setting. We need to relate well to children, to understand what makes them and their minds tick, to inspire them to do great things, and to be mindful that if their physiological needs are not met then, as Maslow taught us, they will likely not learn well.