I’m reading a fascinating book right now by Gary Small and Gigi Vorgan titled iBrain: Surviving the Technological Alteration of the Modern Mind.
iBrain looks at some of the many ways that our technology is changing the neurocircuitry in our brain. Using clinical research gathered through rigorous studies and experiments, psychologists and neuroscientists are increasingly able to pinpoint exactly what happens to the brain when we become addicted to our new communication technologies.
One of the most fascinating things for me about the book is what it says about the loss of empathy implicated by the digital revolution. If the authors are to be believed, our increasing addiction to technological tools (and toys) is leading to a race of humans less adept at intuitive interpersonal skills, including empathy.
Here are some quotes from the book about some of the research that has been done on the relationship between the internet and empathy:
“Besides influencing how we think, digital technology is altering how we feel, how we behave, and the way in which our brains function. Although we are unaware of these changes in our neural circuitry or brain wiring, these alterations can become permanent with repetition. This evolutionary brain process has rapidly emerged over a single generation and may represent one of the most unexpected yet pivotal advances in human history. Perhaps not since Early Man first discovered how to use a tool has the human brain been affected so quickly and so dramatically….
As the brain evolves and shifts its focus toward new technological skills, it drifts away from fundamental social skills, such as reading facial expressions during conversation or grasping the emotional context of a subtle gesture. A Stanford University study found that for every hour we spend on our computers, traditional face-to-face interaction time with other people drops by nearly thirty minutes. With the weakening of the brain’s neural circuitry controlling human contact, our social interactions may become awkward, and we tend to misinterpret, and even miss, subtle nonverbal messages.” (p. 2)
“Our high-tech revolution has plunged us into a state of continuous partial attention, which software executive Linda Stone describes as continually staying busy: keeping tabs on everything while never truly focusing on anything…. When paying continuous partial attention, people may place their brains in a heightened state of stress. They no longer have time to reflect, contemplate, or make thoughtful decisions. Instead, they exist in a sense of constant crisis, on alert for a new contact or bit of exciting news or information at any moment. Once people get used to this state, they tend to thrive on the perpetual connectivity. It feeds their egos and sense of self-worth, and it becomes irresistible.” (p. 18)
“Initially, the daily blitz of data that bombards us can create a form of attention deficit, but our brains are able to adapt in a way that promotes rapid information processing…. While the brains of today’s Digital Natives are wiring up for rapid-fire cyber searches, the neural circuits that control the more traditional learning methods are neglected and gradually diminished. The pathways for human interaction and communication weaken as customary one-on-one people skills atrophy.” (p. 21)
“Dr. Robert McGivern and co-workers at San Diego State University have found that when kids enter adolescence, they struggle with the ability to recognize another person’s emotions. During the study, the teenage volunteers viewed faces demonstrating different emotional states. Compared with other age groups, eleven- and twelve-year-olds (the age when puberty typically kicks in) needed to take more time to identify the specific emotions expressed by the faces presented to them. It took longer for their frontal lobes to identify happy, angry, or sad faces, because of the pruning or trimming down of excess synaptic connections that occurs during puberty. However, once that pruning-down process is complete and the teenager matures to adulthood, expression recognition becomes faster and more efficient.
Scientists have pinpointed a specific region of the teenage brain that controls this tendency towards selfishness and a lack of empathy. Dr. Sarah-Jayne Blakemore of University College London used functional MRI scanning to study the brains of teenagers (eleven to seventeen years) and young adults (twenty-one to thirty-seven years) while they were asked to make everyday decisions, such as when and where to see a movie or go out to eat. The scientists found that teenagers, when making these choices, used a brain network in their temporal lobes (underneath the temples), while older volunteers used the prefrontal cortex, a region that processes how our decisions affect other people. Such differing neural circuitry may explain why teens are less able to appreciate how their decisions affect those around them.
Unfortunately, today’s obsession with computer technology and video gaming appears to be stunting frontal lobe development in many teenagers, impairing their social and reasoning abilities. If young people continue to mature in this fashion, their brain’s neural pathways may never catch up. It is possible that they could remain locked into a neural circuitry that stays at an immature and self-absorbed emotional level, right through adulthood…. Are we rearing a new generation with underdeveloped frontal lobes–a group of young people unable to learn, remember, feel, or control their impulses?” (pp. 30-32 and 39)
“Dr. Richard Haier and colleagues at the University of California, Irvine studied gender differences in the brain according to IQ scores. They found that a larger volume of grey matter (the neuronal cell bodies), distributed throughout the brains of men, was associated with higher IQ, whereas for women, higher IQ came from the brain’s white matter (the axons or wires that connect the cell bodies), concentrated in the frontal lobe. This centralized frontal intelligence processing in women is consistent with other studies showing that women are more sensitive to frontal brain trauma than men, and it explains a woman’s advantage in taking in the big picture of complex social situations…. Excessive exposure to digital technology may make the male brain more likely to exhibit autistic-like behaviors–poorer eye contact and less ability to make empathic connections.” (pp. 91-92)
“Empathy–the ability to imaginatively see things from another person’s perspective, understand the person’s feelings, and convey that understanding back to the other person–serves as the social glue that keeps people connected…. Recent neuroscience points to pathways in the brain that are necessary to hone interpersonal skills, empathic abilities, and effective personal instincts. In Digital Natives who have been raised on technology, these interpersonal neural pathways are often left unstimulated and underdeveloped. However, electronic overexposure leading to altered neural pathways and waning social skills can happen at any age…. Spending hours in front of the computer can atrophy the brain’s neural circuitry that controls recognition and interpretation of nonverbal communication–skills that are essential for both personal and professional success. Some studies suggest that these nonverbal signals constitute a higher proportion of what we communicate to other people than the actual words we speak.” (pp. 108, 117, and 124)
“Spending hours playing video games or working at a computer does little to bolster our empathic skills. Neuroimaging studies have identified the specific brain circuitry that controls empathy. Although this circuitry varies according to a person’s abilities, most of us can strengthen our empathic neural pathways and improve our skills through off-line training…. Having empathic role models and experiencing our own pain can help shape our understanding of other people’s feelings. The high-tech revolution, however, often detracts from these abilities. Although the content of an email or a text message may contain empathic feelings, the quality of the message is dramatically different from when it is expressed in person.” (pp. 133-134)