Memory is a bread-and-butter topic in my academic field of cognitive psychology—something that we’ve been systematically researching for over a half century now and where important discoveries continue to happen. As many of my disciplinary colleagues have pointed out, memory theory is also an incredibly practical body of work, one that we can put to use in countless arenas, including college teaching.
Around 10 years ago, I began writing about the applications of memory theory to instruction, looking for insights that teachers can use to make course material memorable. Many such techniques have to do with tapping into the mind’s mechanisms for accomplishing the core purpose of memory: retaining information that is relevant to our goals and our survival and retrieving it in times and places where that information will be useful.
Taking advantage of these inborn cognitive mechanisms means being mindful of the fact that no one—our students included—is likely to retain information they don’t understand or that they simply don’t care about. It follows that contextualizing information in terms of importance or how it will be used in the immediate future is key for helping that knowledge stick in memory. When there’s an emotional charge attached to the material, that helps as well. After all, when something we’ve encountered evokes a strong emotion, that’s a strong predictor that it will be important at some point in the future.
No discussion of research-based tips for strengthening memory would be complete without considering the extensive work on a powerful phenomenon called retrieval practice. First observed in laboratory studies over 20 years ago, this quirk of the mind has to do with what happens when we pull something out of memory, as when taking a quiz or answering a question without referring to notes. It turns out that engaging in retrieval practice radically improves the chance of that information being retained in memory. Fortunately for teachers, researchers now know that this improvement translates out of laboratory studies and into realistic educational situations, ones involving complex materials and long periods of retention.
This is all useful guidance on how we can strengthen memory for course material. But it raises an important question: Why should we be worried about strengthening memory in the first place? After all, teachers who are passionately committed to developing higher-level thinking processes—which, let’s face it, ought to be all of us—tend to look on the whole idea of memory and memorization with skepticism.
One of the common reasons for this is some variation on the idea that teachers can focus on either low-level retention of facts or thinking skills but not both. The assumption is that high-value thinking skills—such as the ability to make inferences, think critically, and extend knowledge to new situations—are either not helped by or actively blocked by a focus on building factual knowledge. Especially if in our teaching we’ve relied on hierarchies of learning, taxonomies, and similar frameworks to prioritize what we do in the classroom, it may seem like we have to choose one or the other.
But is the memory-thinking tradeoff actually necessary? Given emerging research in the field, I’d argue that it is not. According to some of these new findings, memory and thinking are far more intertwined than many of us have been led to believe. For example, when students have formed a more solid base of knowledge—such as through retrieval practice—they are more able to engage in processes like inference and extension, not less.
From a strictly practical standpoint as well, for students to be able to effectively use what they know in fast-paced, real-world situations, that knowledge has to be solidly established. Just about any field of study or practice has a body of bedrock facts, knowledge that may exist online but that is highly impractical to stop and search for when you need it.
Questioning the wisdom of relying on Google instead of one’s own brain brings us to another set of assumptions worth investigating: the supposed ways in which technology affects our ability to remember and what these impacts mean for teaching and learning.
According to much of what’s been written in the popular press, the impacts of digital memory on biological memory are ominous, far-reaching, and something teachers ought to be fighting by any means necessary. Google might be “making us stupid,” in the words of one highly circulated article on the topic—a fear echoed in one large-scale survey about the impacts of technology on modern life. Of all the potential worries respondents cited, cognitive decrements topped the list.
The research literature does bear out some aspects of this worrisome narrative. Attention is a necessary precursor for forming almost any kind of new memory, and mobile technology in particular is notorious for generating all kinds of distractions, thereby disrupting memories before they can even begin to be established. There’s also a more subtle, but potentially far-reaching dynamic involving off-loading of memory to technology. Especially when we assume we will be able to access a particular fact online, we’re less likely to commit it to memory. It is as if the mind, constantly seeking efficiencies, refuses to put in the effort to remember when there is a reliable backup in the form of technology.
But these concerns are not the only side of the story. In general, the decrements technology inflicts on memory are temporary and limited. There’s little reason to believe that things like smartphones create lasting, widespread cognitive impacts, even when we use them more than we should. Even the offloading phenomenon doesn’t spread throughout all of memory but rather crops up selectively, tracking our beliefs about how available a given piece of information will be in the future.
Having an accurate and nuanced understanding of technology and memory does matter, especially for instructors. A revealing example comes from recent debates over allowing students to take class notes on laptops rather than by hand. After a study came out showing that laptop note-taking led to reduced recall of lecture material, the notion that laptops should be banned in classes spread like wildfire, even generating op-ed pieces in major newspapers. But this conclusion was both premature and only partially supported by the original study. Later studies failed to replicate some of its key findings, and even the original researchers stressed that it was the note-taking habits that rapid typing encourages, not the act of typing itself, that seemed to be the critical factor. All in all, this episode offers a cautionary tale that we would do well to remember before putting drastic measures into place based on limited evidence.
We should also keep in mind how technology can be a net positive, complementing and expanding what we can do with our biological memories. For one, it provides multiple ways to take advantage of retrieval practice—without turning class into an endless set of drills and exams. Gamified applications entice students into participating with a dose of friendly competition; self-quizzing applications let them set up customized decks of nearly infinite numbers of questions. Because these quiz applications are so well-suited to smartphones, they also encourage students to practice during small bits of downtime—a pattern that we know reinforces memory much better than the classic multi-hour cram session.
Another benefit ties to an important but frequently overlooked form of recall known as prospective memory. This type of memory is what we use when we put an intention on hold—as when remembering to stop off for an errand after work, take medication before a meal, or make an important phone call at a particular time. Contemporary life is saturated with things we need to remember to do in the future, and this is nowhere truer than in our students’ complicated lives and schedules.
Unfortunately for contemporary humans, prospective memory is extremely fallible, especially in cases where there is any kind of distraction or a significant time delay between forming an intention and carrying it out. Biological prospective memory may be terrible, but digital prospective memory is a superstar. And so, our students—and probably we instructors as well—would do well to use methods like calendar alerts, timers, and checklists to help us function in our complex and demanding lives.
These kinds of technology applications are good for our students, and teachers can strategically use them to free up time and space for developing the thinking skills that students so clearly need.
In sum, spending time building a base of knowledge isn’t something teachers should shy away from. And in the quest to do this, technology is neither unambiguously good nor consistently bad. Fortunately, memory research offers guidance in how to zero in on its benefits while managing its downsides. It’s something that every teacher in our technological age should reflect on, know about, and make part of their practice.
Anderson, J., & Rainie, L. (2018, April 17). The future of well-being in a tech-saturated world. Pew Research Center. https://www.pewresearch.org/internet/2018/04/17/the-future-of-well-being-in-a-tech-saturated-world
Carr, N. (2008, July/August). Is Google making us stupid? What the Internet is doing to our brains. The Atlantic. https://www.theatlantic.com/magazine/archive/2008/07/is-google-making-us-stupid/306868
Dismukes, R. K. (2012). Prospective memory in workplace and everyday situations. Current Directions in Psychological Science, 21(4), 215–220. https://doi.org/10.1177/0963721412447621
Dynarski, S. (2017, November 22). Laptops are great. But not during a lecture or a meeting. The New York Times. https://www.nytimes.com/2017/11/22/business/laptops-not-during-lecture-or-meeting.html
Karpicke, J. D., & Roediger, H. L., III. (2008). The critical importance of retrieval for learning. Science, 319(5865), 966–968. https://doi.org/10.1126/science.1152408
Lang, J. M. (2011, December 14). Teaching and human memory, part 2. The Chronicle of Higher Education. https://www.chronicle.com/article/teaching-and-human-memory-part-2
McDaniel, M. A., Roediger, H. L., & McDermott, K. B. (2007). Generalizing test-enhanced learning from the laboratory to the classroom. Psychonomic Bulletin & Review, 14(2), 200–206. https://doi.org/10.3758/BF03194052
Morehead, K., Dunlosky, J., & Rawson, K. A. (2019). How much mightier is the pen than the keyboard for note-taking? A replication and extension of Mueller and Oppenheimer (2014). Educational Psychology Review, 31, 1–28. https://doi.org/10.1007/s10648-019-09468-2
Mueller, P. A., & Oppenheimer, D. M. (2014). The pen is mightier than the keyboard: Advantages of longhand over laptop note taking. Psychological Science, 25(6), 1159–1168. https://doi.org/10.1177/0956797614524581
Siler, J., & Benjamin, A. S. (2019). Long-term inference and memory following retrieval practice. Memory & Cognition, 48, 645–654. https://doi.org/10.3758/s13421-019-00997-3
Sparrow, B., Liu, J., & Wegner, D. M. (2011). Google effects on memory: Cognitive consequences of having information at our fingertips. Science, 333(6043), 776–778. https://doi.org/10.1126/science.1207745
Michelle D. Miller, PhD, is a professor of psychological sciences and President’s Distinguished Teaching Fellow at Northern Arizona University. She is the author of Minds Online: Teaching Effectively with Technology (Harvard University Press, 2014) and Remembering and Forgetting in the Age of Technology: Teaching, Learning, and the Science of Memory in a Wired World (West Virginia University Press, 2022).