Memory in the Digital Age: Too Much, or Never Enough?

“Work that Ph.D.,” wrote Becky Lang, suggesting I respond to an article arguing that “we must remember to forget” in the digital age. In The Guardian, Stuart Jeffries profiles Viktor Mayer-Schönberger, whose forthcoming book Delete argues that we should program digital devices to forget most of what they record, lest we become trapped by our past. Too much memory, Mayer-Schönberger argues, can keep us from creating ourselves anew.

Becky’s right: I have some views on this.

First of all, this is a classic example of a now-we’ve-gone-too-far argument. Since the invention of cave painting, humans’ artificial memory capacity has been increasing. Sure, digital technology is a game-changer: our capacity has increased geometrically in the past couple of decades. But is the sky falling now, when it’s stayed firmly in place for the past several thousand years? I don’t think so.

Second, there’s no way in hell we’re going to start setting our devices to automatically forget things, as Mayer-Schönberger proposes. We’d like control of information regarding our own lives, but in most cases we’d like to keep that information within our grasp. If you want to hit delete, you should be able to—and it’s generally agreed best practice that you should be able to. If you ask Facebook to forget everything about you, it will. But we’re not going to accept automatic forgetting, because digital memory is too useful for reasons ranging from security to sentiment. It’s the chip-in-the-head scenario: it seems creepy to implant tracking chips in kids’ heads, but just watch it happen, because what if your kid was ever abducted? You know you’d want that tracking chip in there.

Mayer-Schönberger mentions the concept of “reputation bankruptcy”: the idea that we should be allowed to declare bankruptcy on our personal reputations and start from scratch with a clean Google slate, just as we can with money. But though there may be cases now where that seems necessary—maybe Harriet Schwenck is sick of being That Lady Who Got Elton John’s Flowers—society is already adjusting to our increasingly long-lived digital memory. Bill Clinton couldn’t admit to inhaling marijuana, but Barack Obama published an autobiography casually admitting that he’d done “blow,” and it wasn’t a big deal, even though Obama was elected only 16 years after Clinton.

When Facebook first started becoming ubiquitous, counselors and parents would wag their fingers at high schoolers and college students, warning them that an ill-advised kegstand pose could cost them a job. No one needs to be told to watch their Facebook photos anymore, but that’s not just because we’ve internalized our counselors’ advice. It’s simply not as big a deal anymore.

Both Becky and I maintain very, um, frank Twitter accounts and freely—hell, proudly—associate our names with this blog, which publishes posts like “A Guide to Fucking Hipster Girls” and “Things I Think While Simultaneously Giving and Receiving a Blow Job.” Our bosses are cool with it. True, we both work in creative jobs where pushing boundaries is encouraged (Becky’s in marketing, and I’m in journalism), but we’re also not too worried about burning uncrossed bridges. Maybe we should be, but we’re both betting that future employers will judge us based on the quality of our work and not on whether we’ve ever been associated with anything lewd or even illegal. It would be a different matter if we were documented committing murder or major fraud—but would you really want us to be able to declare reputation bankruptcy after that kind of thing?

That Ph.D. Becky mentioned is in sociology, and as a sociologist I’m biased in favor of more data. The more you know, the better your ability to understand how the world works so you can make informed decisions about it. “Those who cannot remember the past,” wrote George Santayana, “are condemned to repeat it.”

Finally, there’s a difference between knowing and…well, knowing. There’s a lot more information available about my life than there was about my dad’s when he was my age, but I’m not in any less need of help understanding who I am or deciding what I should do. Look at those thousands of hyper-documented teenagers who follow The Tangential on Tumblr—they’re live-blogging their lives, but when you look at their blogs, they’re basically wondering the same thing Dion asked in 1959: “Why must I be a teenager in love?” Like sociologists, they have more data than ever—but they’re still looking for theories to make sense of it. I have a hunch that putting expiration dates on those teens’ blog posts won’t bring them—or us—any closer to freedom, understanding, or happiness.

Think of Jim Carrey’s character in Eternal Sunshine of the Spotless Mind: having arranged to have his ex-girlfriend Clementine erased from his memory, he finds himself changing his mind as the procedure begins, clinging to a memory of a close moment he and Clementine shared. “Let me keep this moment!” he pleads to the scientist erasing his memory. “Just this one!”

You never know what that one moment will be: that one photo you’d pay a million dollars to have back, that one voicemail you’d cut off your ear to hear, that one day you’ll want to relive again and again. That’s why we’ll keep them. We’ll keep them all.

Jay Gabler

  • the idea that we should program devices to automatically forget information to preserve the “sanctity of forgetting” is cute and precious and really dumb. gables to the rescue.

  • Good post. Reminds me a little bit of a piece I wrote about post-mortem digital artifacts. http://tweetandmeet.com/digital-ghosts-dead-with-zero-footprint

  • H.G.

    I expected more from you on this topic. It doesn’t even scratch the surface of memory in a digitally social world. *thumbs down*

  • One huge leap we’re all asked to make in order to follow Mr. Mayer-Schönberger’s argument (and actually believe it might be necessary to have “permission devices” strapped around our necks to alert nearby cameras to the desired expiration date of one’s photographic likeness; shelf life of epic bro shots: 100 years) is to assume that other people somehow won’t be able to distinguish between an IRL actual human being, warts and all, and the image-heavy, homemade tabloid magazine of each of our lives that shows up online via social media.

    He’s basically saying that we need to micromanage our online presence because 10 years from now there’s going to be some HR employee creeping our Facebook profiles who won’t be able to look at a photo of you doing something unflattering—err, not in accordance with CV promotion—and say, “Yeah, I’ve dressed in drag at a party, too.” Or whatever. Seriously, empathy is a fucking basic human characteristic. People do dumb things all the time. Occasionally overlooking that fact is part of the give and take of our daily lives.

    I hate to say it, but this book’s about 10 years too late, and this puff piece in The Guardian is the worst kind of print-media fear-mongering, completely predictable in its attempt to create a market for something for which there is no need. Hint: the solution that resurrects our sacred and time-honored “ability to forget” isn’t the delete key. As M-S envisions it, it’s a fucking product. So there you have it.

    And speaking of bad ideas, is someone really trying to make the word ‘brainwave’ happen? As in, “Shelaigh had a brainwave and decided to check her old emails and look for the address.” Really? Brainwave? Is something wrong with ‘thought’ or ‘idea’? I guess I shouldn’t be surprised. We were all asked to do without such intellectual niceties in order to buy any of this article’s nonsense.

    Thanks, Jay Gabler. I won’t be bothering with this book.

    • An apropos aphorism: “OWN YOUR SHIT. Lindsay Lohan can’t untag herself from Us Weekly.” (John James Wallace)

  • Pingback: Revisiting “Eternal Sunshine of the Spotless Mind” | The Tangential