Why Your Memory Stinks: The Science of Remembering in the Internet Age
This article originally appeared on BrainUpFL.org, a former program of the Winter Park Health Foundation. Please continue to visit WellbeingNetwork.org for new content to fuel your intellectual pursuits and resources that support a healthy brain.
Take a second and think about the three most important people in your life.
Got ’em? Okay, now here’s a quiz: Do you remember all three of their phone numbers off the top of your head?
If you don’t, you’re not alone. Why waste brain space memorizing phone numbers when you can look them up on your cell phone whenever you want?
Not too long ago, we used to outsource information we didn’t know to friends and family. Instead of remembering the information ourselves, we’d remember who knows what. Dad knows how to change a tire; Rob knows all the baseball stats; Caroline knows how to get to Grandma’s house.
Now, we outsource memory to technology and the internet. The answer to any question in the world is only a Google search away. Our smartphones, our emails, WikiHow — they’ve all become a part of our external hard drive.
But relying more on the internet and less on our own memory isn’t just changing our lifestyles. It’s actually changing the structure of our brains.
We’ve changed the way we take in new information. Our attention spans have become shorter. The massive amounts of information we expose ourselves to force us to be more efficient about what we convert to long-term memories.
Has memory become obsolete thanks to the ubiquity of the internet? Let’s take a look at how our dependence on technology and the internet has affected our brains and how we think, learn, and remember.
How We Make Memories
To understand how technology is changing how we make memories, let’s take a quick look at how we make memories in the first place.
Every time you learn a fact or have an experience, this information enters your working memory, also known as your short-term memory.
Your working memory is a fragile place. A new piece of information lives in there for only a matter of seconds before it’s either forgotten, or it moves to your long-term memory system.
What determines its survival? Sometimes, it’s your own decision, whether conscious or unconscious: You decide whether the information is noteworthy or relevant enough to warrant becoming a long-term memory. Other times, a simple break in your attention can make you forget it.
Only when facts and experiences enter your long-term memory can you weave them into more complex, big-picture ideas — a process that’s a trademark of our depth of intelligence, argues WIRED’s Nicholas Carr.
It’s that jump from short-term to long-term memory that can be most profoundly affected by our digital lifestyle.
Why? For one, our working memories can get overloaded with more information than our brains can handle. For another, we’ve trained ourselves to trivialize the information we learn online in the first place.
When we go online to learn things, we often end up exposing ourselves to more information than our brain can possibly process and store.
Ever found yourself in one of those Wikipedia black holes? You go in looking up the name of a Russian prime minister and emerge, hours later, having read through the entire history of the Russian Revolution?
That leads to what’s called “cognitive overload.” It can also happen when you go online to look up the name of that Russian prime minister and get distracted. You read your emails, scroll through your Twitter feed, and skim through a few articles within the same time frame.
All this online activity is an interactive process that requires a lot of quick decision-making. That’s why neuroimaging studies have shown that frequent internet users exhibit extensive brain activity while actively surfing the internet.
But that’s not necessarily a good thing. All the skimming we do and the notifications we receive while spending time online can easily lead to cognitive overload. When the amount of information entering our working memory exceeds our ability to process and store it, we have trouble retaining that information in our long-term memory or drawing connections with other memories.
“Our ability to learn suffers, and our understanding remains weak,” writes Carr.
A Different Type of Memory
The 2011 study “Google Effects on Memory: Cognitive Consequences of Having Information at Our Fingertips” found that people who have access to search engines tend to remember fewer facts and less information overall because they know they can find the answer easily using the internet.
In other words, when faced with a question we don’t know the answer to, we’ve conditioned ourselves not to recall the information itself. We do not stretch our memories to figure out the answer. Instead, we are conditioned to remember how to find the answer using a search engine.
We still have to remember things; we’re just remembering a different type of thing. We’re remembering how to find the information. We remember best practices for online search queries. We memorize which websites are likely to have the best answers, and the clues that help us verify a reputable source. It’s a bit like the use of calculators in the classroom: Students are expected to do less rote memorization and are trained instead on how to find the answers to complex questions.
“The internet has become a primary form of external or transactive memory, where information is stored collectively outside ourselves,” reads the study.
However, that does mean we’ve trained our brains to treat online information as trivial and less worthy of our undivided attention. Each time we check email or Facebook or the news, we prepare ourselves for skimming, not for learning. In a way, we’ve conditioned ourselves to forget the information before we even read it. Our brains are less apt to focus, digest information, and convert it into our long-term memory. Instead, we have an increasing appetite for more stimuli.
Adapting to Our New Reality
The folks at Academic Earth put it best: “If the goal is to forge a creative mind through critical thinking, our Google amnesia may be problematic.” Like Carr said, the human ability to translate memories into complex thinking and analysis is part of what makes us uniquely intelligent.
But it’s not like we’re losing the ability to think critically altogether. Frequent internet usage is our new reality, and the answer isn’t to turn it off or blame the kids — it’s to adapt so we can lessen any negative impacts on converting facts and experiences to long-term memory.
Put simply, we’ll need to teach ourselves how to consciously prioritize information so we can deeply process the most important stuff.
But how? Just as we’ve trained ourselves to trivialize online information, we can also train ourselves to consciously commit information to memory.
Train Your Brain to Commit Information to Memory
Repetition is one way to remember things more easily. When you do or read something once, a neurological pathway is created in your brain. When you repeat that action and experience the same reward again, that neurological pathway gets a little bit thicker; and the next time, thicker still. The thicker that pathway gets, the more automatic recall becomes. That’s why re-reading important articles, for instance, can be a helpful way to process and store the information in them.
Another tip? Remove the interruptions that can break your attention and make you forget things stored in your short-term memory in the first place. That means closing your email and turning off notifications when you’re working. (Or even when you’re reading your favorite newspaper.)
The ubiquity of the internet — and its effects on the way we think and how our brains are wired — can be overwhelming at times. Not to mention, a little creepy.
“We become part of the internet in a way. We become part of the system and we end up trusting it,” said Daniel Wegner, the Harvard psychology professor who co-authored the “Google Effects on Memory” study.
So, how will our brains continue to develop as a result of technology and the internet? I’m sure we’ll see much more research on our dependence on (and even interdependence with) technology in the years ahead.