I’ve barely touched Twitter in the last 3 months — it isn’t just that I’m not tweeting, I’m not even checking it. This seemed to happen shortly after I read The Shallows by Nicholas Carr, another great book on humanity’s adolescent approach to the Internet at the moment. While the overall tone of the book was a little too “doomsday” for me, he had some fantastic ideas on the importance that immediacy has gained in the last few years.
24-hour news, Twitter feeds, Facebook statuses, news aggregators and near real-time search engines are pumping information at us at a pace we’ve never experienced before. As consumers, we all expect and demand this sort of information: we check Facebook expecting the latest news from our friends, visit the BBC News website for minute-by-minute updates, and so on. However, over time this has had the undesired effect that this immediacy (or at least the desire for it) is now pervasive.
How many people have noticed their attention span has significantly decreased over the last few years? How many people feel like they’re undergoing withdrawal symptoms when they leave their phone at home? How many of you actively crave your information fix?
I remember when I first heard Bin Laden had been killed, I was on a beach in Costa Rica with little to no wifi. I was genuinely put out that I couldn’t check the news from the beach to get the latest information. I have plenty of friends who practically sit on Facebook ensuring they don’t miss any major social announcement — in fact, they’ve even coined the phrase FOMO (Fear Of Missing Out).
I also noticed this ran deeply in my work life — Outlook beeps, toasts, flashes and notifies every time you get an e-mail (by default!) and I get a fair amount of e-mail. This has proved a constant distraction, despite the fact that probably 95% of the e-mail could be dealt with at a later time. Similarly, texts and IMs all add to the constant distraction.
So I decided to try a little experiment. Firstly, Outlook has been completely reined in so that any e-mail that comes into my inbox comes in completely silently. Next, phone alerts were turned off and Skype was silenced. Lastly, I made the conscious effort not to log into Twitter for a while, and I decided to completely ignore tools like Empire Avenue and Klout (which I’ve had reservations about anyway).
The result has been quite astounding. My productivity and attention span have soared and I’ve found myself completing complicated work in a fraction of the time it used to take me. What started as an obsessive Twitter habit evaporated practically overnight — my cravings to read the latest tweets gave way to only checking Twitter a few times a week. Admittedly, I’m finding the BBC News and Facebook habits much harder to kick, but they prove significantly less distracting, as tweets have a much shorter lifespan (minutes?) compared to Facebook posts and certainly BBC articles.
The effect of technology on our attention span concerns me more deeply than just being perturbed about our personal productivity. Our brains are remarkably adaptable, and will continue to rewire themselves throughout the course of our lifetimes (a concept called neuroplasticity). If one area of the brain becomes defunct, it will be requisitioned for another use over time — many believe “phantom limb” syndrome in amputees can be a by-product of this re-wiring. The deeper concern, then, is that as we become increasingly reliant on technology to provide us with answers, our memory will become increasingly transactive. We will lose our abilities to store and (more importantly) process and understand the vast quantities of information available to us. Indeed, studies are already showing we are losing our ability to store information but are gaining abilities to recall where it is available (BBC).
As a technologist, I genuinely feel that technology (for the most part) adds to and augments our humanity and our ability to function as a society. However, our adolescent treatment and application of our technologies may eventually suffocate our younger generations’ ability to learn introspective “skills”, leaving them without the mental faculties to think in the deep and creative ways that creating bleeding-edge technology and science demands. In essence, just as the brightest star burns quickest, the blistering pace of our technology and its effects both physical (neuroplasticity) and cultural (immediacy) will mean our subsequent generations won’t be able to continue to move the bar higher. We will have stunted our own growth.
So now who sounds like a doomsayer? I’ll caveat all this by saying that when I talk about “we” and “our”, I tend to refer to Western society. The BRIC bloc and other nations investing heavily in high-tech education will learn from any failings here as the superpower shift happens over the course of the next 30–50 years. I certainly don’t foresee the Internet being the downfall of humanity!
As with all technologies, these concerns are trends that have appeared in history before — Plato wrote dialogues discussing the impropriety of the written word and how dependence on the technology of the alphabet would alter a person’s mind, for the worse. The invention of the book and subsequently the printing press led to a massive proliferation of information that overwhelmed many. Similarly, since the advent of the Internet, we have all felt the effects of the vast, almost infinite, amount of information available to us at our fingertips.
Many of the themes here mirror my thoughts after reading Lanier’s You Are Not a Gadget, in that they revolve around our maturity to develop, consume and appreciate technology on a macroscopic, holistic level. I have no answers (yet!) on how Western societies may shift and curtail the current trend, but I would be interested to hear people’s thoughts in the comments below.