
Separation Anxiety

Now that there's no escaping the digital world, research is getting more serious about what happens to personalities that are incessantly on.

January/February 2011


Photo Illustration: William Duke

There are now roughly 2 billion Internet users worldwide. Five billion earthlings have cell phones. That scale of connectivity offers staggering power: In a few seconds, we can summon almost any fact, purchase a replacement hubcap or locate a cabin mate from those halcyon days at Camp Tewonga. We can call, email, text or chat online with our colleagues, friends and family just about anywhere. (In October, Mount Everest got 3G cell service.)

Yet, along with the power has come the feeling that digital devices have invaded our every waking moment. We've had to pass laws to get people off their cell phones while driving. Backlit iPads slither into our beds for midnight Words With Friends trysts. Sitcoms poke fun at breakfast tables where siblings text each other to ask that the butter be passed. (According to a Nielsen study, the average 13- to 17-year-old now deals with 3,339 texts a month.) We even buy new technology to cure new problems created by new technology: There's an iPhone app that uses the device's built-in camera to show the ground in front of a user as a backdrop on the keypad. "Have you ever tried calling someone while walking with your phone only to run into something because you can't see where you're going?" goes the sales pitch.

Stanford computer scientists and engineers have played a central role in developing the gadgets and software enabling all this, from semiconductors to networking equipment to GPS to Google. And now a growing number of researchers here and elsewhere are exploring the social and psychological consequences of virtual experience and digital incursion. They observe the blurring boundaries between real and virtual life, challenge the vaunted claims of multitasking, and ponder whether people need to establish technology-free zones. (Last year, enthusiasm for the "Sabbath Manifesto" project—whose creators advocate regularly unplugging from the Internet—spread rapidly via that very Internet.)

Sometimes these research threads seem at cross purposes. One Stanford professor, for example, argues that bringing the adrenaline-pumping ingredients of online game environments to the workplace could revolutionize productivity, while across campus a Stanford psychiatrist treats the harm from excessive immersion in cyberworlds. Here are some dispatches from the digital revolution—which seems to be a perpetual-motion generator of unintended consequences.

The Persona Electric

Psychiatrist Elias Aboujaoude, MS '98, MD '98, directs clinics for obsessive-compulsive disorders and impulse disorders at the School of Medicine. His patients battle all manner of compulsions, including, increasingly, online addictions that become so central in their lives that the line between real and virtual blurs. In his forthcoming book, Virtually You: The Dangerous Power of the e-Personality (Norton), Aboujaoude writes, "The flip side of enhanced productivity, expediency, and courage can be confusion, pain, and disorientation in the real world."

Aboujaoude was the lead author of a 2006 study—still the largest to date—on problematic Internet use. He found that between 4 and 14 percent of the population admitted that a preoccupation with being online was interfering in various ways with their relationships, financial health and other aspects of real life. Only four years later, the research—performed before Facebook caught fire and before smart phones became prevalent—feels as antiquated as the brick-sized portable phones glimpsed in 1980s movies.

Aboujaoude observes that, even at a subpathological level, time spent communicating electronically or plugged into web-based activities—what he calls virtual life—pushes people toward developing a separate e-personality that then bleeds back into their real life. "From a psychological perspective, something is happening to our identity. Something seems to be hijacking it each and every time we log on."

He says the e-personality is more impulse-driven and more narcissistic; it gives itself permission to explore or seek out more morbid subjects; it regresses to earlier developmental stages that are more about action without heed to consequences; and it has a more grandiose view of itself. "It used to be that some people would say, 'Well, I can be myself online.' But what's worrisome is that offline life is starting to be more like online life. We're becoming more impatient, more narcissistic, more regressed even when there is no browser in sight."

He's troubled at the rise of online communities in which individuals struggling with an array of serious illnesses and conditions—including anorexia, paranoia and depression—eschew therapeutic resources in favor of connecting with others who reinforce or promote dangerous, even deadly behaviors. (See sidebar.) He also worries about the illusion of intimacy these communities and other online relationships create between strangers. A study published in 2007 by researchers at the University of Texas School of Public Health showed that nearly one-third of adult women engaged in sexual activity during their first face-to-face meeting with men they had met online—and 77 percent of those did not use condoms—even though most had been clear in their online communications that they did not intend to meet to have sex and were wary about sexually transmitted diseases. Aboujaoude believes that because of the online communication, "they know—or think they know—virtually everything. They have seen the pictures, researched the company, Googled the ex-wife, and gotten a sense of the man's health history; little is left but to have sex."

Delusions of Productivity

Communication professor Clifford Nass became known in the 1990s for research about how people interact with machines—especially in anthropomorphic ways. (We like computer interactions in which we're told flattering things, for example.) He advised companies including Microsoft and BMW about how to make their technology more appealing by designing features that mimic human interaction. His recent book, written with Corina Yen, '06, MS '08, is The Man Who Lied to His Laptop: What Machines Teach Us About Human Relationships (Current).

Nass became intrigued with multitasking and how young people seemed to switch so effortlessly among online chats, cell phone calls and their homework, all the while listening to music. "I wanted to figure out why kids are so good at multitasking," he explains. "I was trying to figure out what magic they had."

His search for magic has given way to a growing conviction that, among multitaskers, productivity and efficiency are the real illusions. Nass and his colleagues published a paper in 2009 in the Proceedings of the National Academy of Sciences showing that heavy multitaskers are more susceptible to distraction and irrelevant information, and that they perform worse on tests designed to measure their ability to focus and to switch successfully among tasks.

He is increasingly asked by high-tech companies to do research that questions the policies and practices that have fostered multitasking among their workers. "The norm has become 'you must answer everybody's text or email right away because if people get immediate answers they can move ahead.' Well, that's fine if you're looking for answers from a Google search. If you keep asking Google questions, it doesn't bug Google." But for employees, "the cost of being constantly questioned is a real cost because there is a time limit to every day. In Silicon Valley you hire people who can think deeply and critically, but then you don't give them time to do that."

More broadly and quite markedly among Stanford students, Nass observes, "the notion of attention has changed radically. It's becoming perfectly OK to use media while we're interacting (in real life). That's an enormous change in the culture. Students will come into my office and not feel at all inhibited from texting while they're talking to me—until I stop them."

More recently Nass has been working on a study of 3,400 girls, ages 8 to 12, exploring such topics as their use of media and their face-to-face interactions, how frequently they multitask either alone or with friends, and their views of online vs. offline friends. The time frame is key: Studies show that girls' self-esteem at this age is a critical factor in how well they fare later in life. He's interested in what it means for preteen girls' development if they increasingly embrace texting, Facebook and other online ways of communicating in place of face-to-face interaction. "We worry whether you can learn to be social if you are not getting a great deal of practice reading faces and listening to voices," he says. "Online media remove the nuances of emotion and may make it seem that it is relatively unimportant when people interact with each other."

Enhance-able You

What if, on the other hand, you could enter a world where the faces you "read" have been engineered to enhance their connection to you? That's the kind of question studied in the Virtual Human Interaction Laboratory run by associate professor of communication Jeremy Bailenson. His lab has shown, for example, that a subject's attraction to a given political candidate can be enhanced by digitally blending aspects of the subject's face into the candidate's. In another study, Bailenson showed that a student listening to a lecture in a virtual classroom retained more information from the lecture if he or she sat in the "front row" where an avatar/lecturer appeared to make frequent eye contact with the student. Unlike in a real lecture hall, a multitude of cyberstudents could have the experience of sitting in the sweet spot.

One fascinating aspect of research on what people do in virtual worlds is that the sensors and cameras designed to make it all work capture incredible amounts of data. "Every single action is tracked 60 times per second," Bailenson explains. Thus are created enormous databases that scientists can probe for insights. In one study, for example, subjects' faces were monitored by a video camera as they performed particular tasks. Software analyzed the facial movements and later correlated that data to the subjects' performance on the tasks. The upshot was a program that could predict when people were about to make a mistake. It's not hard to imagine such a program being used to analyze the face of a fighter pilot, or a factory worker, to notice signs of fatigue, confusion or distraction—and intervene before a problem occurs.

Yet what would it do to our morale and stress levels to have a camera trained on us constantly? If such technology can predict mistakes, what about our likes and dislikes? Our hopes and fears? Virtual worlds have a "yin and yang to them," Bailenson acknowledges. "Virtual reality is like nuclear power—it can make energy or destroy nature."

But mostly he is bullish. In Infinite Reality: Avatars, New Worlds, Eternal Life, and the Dawn of the Virtual Revolution (HarperCollins, to be published in April), he predicts that avatars "are going to qualitatively change the way people interact socially. Avatars offer the possibility of doing something perfectly, of self-presenting much more effectively."

Communication professor Byron Reeves is eager to speed that possibility along. He and J. Leighton Read, a Silicon Valley venture capitalist, founded a company, Seriosity, to explore online games that they say can offer a new management approach to motivating employees. Their inspiration is the millions of people worldwide who have an avatar in an online multiplayer game.

The popular stereotype of online gamers may be teenage boys, but in fact the median age of players is 35 and the majority of players work full time. And that suggests to Reeves and Read that games offer powerful design elements that can be harnessed for much more than entertainment. In Total Engagement: Using Games and Virtual Worlds to Change the Way People Work and Businesses Compete (Harvard Business Press), they write that games such as World of Warcraft or Eve Online "require extraordinary teamwork, elaborate data analysis and strategy, the recruitment, evaluation, and retention of top players in multiperson guilds, the cooperation of people with complementary roles that require coordinated action, player innovations that come from everyone, and decision-making and leadership behavior that happens quickly and with transparent consequences."

Reeves and Read can envision a call-center employee who feels hemmed in by repetitive tasks and a soulless work environment. His job experience could be redesigned so that reporting for work is as simple as logging into a virtual world from his laptop. His contributions could be scored and constantly updated. Workers could "level up" just as in online games, by scoring sufficient points. Rewards could be virtual or real.

Reeves is not oblivious to the potential for unhealthy consequences from these games. "This is powerful stuff and it's easy to imagine consequences good and bad," he says. Total Engagement contains a chapter, "Danger," that warns about potential addiction to gaming and advises, "A new form of industrial hygiene, not yet well developed for the current tools of the information age, will need to be invented to help make sure these powerful technologies are used safely."

So who gets to determine what "safely" means? A doctor who understands the spectrum of addictive behaviors and knows that the urge to play a game sometimes becomes overwhelming—and leads to isolation and real-world dysfunction? Or a company executive who chooses the most seductive aspects of gaming in order to improve worker performance and the bottom line?

Psychiatrist Aboujaoude says that immersion in gaming runs the risk that a player begins to believe that behaviors acceptable in a game might also pass offline: Heavy gamers may develop an offline persona with the swagger and bravado of their avatars. "It also becomes easier to lose perspective on one's divergent priorities: the need to perform well as a favorite game character or as an accomplished player versus the need to function as a responsible adult. It's all one big life with one big 'cumulative' score, the faulty justification goes, and if we are breaking records in an online game, we may feel, in aggregate, responsible and productive enough, and thus allow for some gross negligence elsewhere in life."

Psychologist Stephanie Brown, director of The Addictions Institute in Menlo Park, notes that "the internal experience today is one of hyperanxiety" and that "there has been a devaluing of quiet thoughtfulness." She treats more and more families struggling with both children and parents who cannot tear themselves away from their devices. "Addictions happen when people are trying to control their emotional state. You find something that makes you feel better and then you want more of it, but then there is emptiness in the payoff. We're seeing that, overnight, the happy little soccer player becomes the addicted gamer on World of Warcraft."

Reeves counters, "The term addiction [when used with gaming] can cause trouble. Does it mean playing too long? What is too long?" While he knows that some people play too much, he believes that for many there are positive effects of extended play. He cites a study showing that teens who play multiplayer games have more friends and lower body mass indexes, and are more socially integrated.

Walter Greenleaf, PhD '88, is the founder of InWorld Solutions, a Palo Alto firm that uses virtual-reality techniques for cognitive and behavioral therapy. Among his clients, Greenleaf explains, are therapists who work with emotionally disturbed and violent minors. The patients use InWorld game interfaces to interact with a therapist during a cybersession. Such systems seem particularly good at providing a laboratory for situations where emotional or social intelligence is required, Greenleaf says. "We can use it to train doctors to deliver bad news more appropriately or to learn to interview rape victims. We can use it to train addicts to deal with difficult social situations, like going to a park where they are offered drugs. It's very hard to practice these kinds of social interactions."

No Substitute for Real

Ultimately, however, isn't it essential that humans experience life, well, live? David Levy, MS '74, PhD '80, studied computer science at Stanford, but he's also chosen some decidedly analog paths, including a diploma in bookbinding and calligraphy from the Roehampton Institute, London. A professor at the Information School of the University of Washington, he often speaks about the onslaught of digital demands on our time and what it does to our essential humanity. "The central problem is not the technology itself," Levy says. "Since the Industrial Revolution, we've been living out an economic system and a philosophy of life of more/faster/better. In the process, we've developed technology that enhances that program." The past couple of decades in particular have witnessed such a bloom of gadgets that he describes it as a "vast expansion in radical instrumentality." Unfortunately, "the faster we go, the more we overload what we can do, must do and should do. We lose the life-giving dimension of being in the moment."

Levy notes that today's world assumes "the way to success is accelerated interaction and access; however, questioning that assumption involves the way we look at what gives life meaning and value. That's not something you really go do a study about. It's why you do philosophy. I see myself as a philosopher of technology trying to frame what the problem is." (Nass agrees: "Studying chronic phenomena is very difficult. It's hard to study the stuff happening to everybody all the time.")

Levy says research on technology's unintended consequences is "sort of a hodgepodge" but "that's just the truth of where we are." He cites an awakening among essayists and journalists that he compares to the publication of Rachel Carson's Silent Spring, which warned of threats to the environment. Examples include Nicholas Carr, who argues that the digital onslaught that makes us better at skimming information is eroding our ability to concentrate and contemplate, and Kevin Kelly, who has suggested that the techno-selective Amish might have something to teach the wired world about the servant/master relationship between devices and their users.

Levy is also concerned that many people point at the younger generation and frame their use of technology as the problem. He and colleagues have been distributing questionnaires to college students at several universities, asking whether they worry that they spend too much time online or texting, whether they feel a need to slow down, and so on. "Across all different majors a very high proportion, over 90 percent, are saying yes. They care about it. They are much more articulate and concerned about what adults are concerned about, but nobody has been having a conversation with young people about these issues."

The challenge for young and old, it seems, is to keep refreshing the dividing line between real and virtual, cherishing unmediated spaces, and reminding ourselves of the difference between our important personal bonds and the poking connections we maintain with 622 Facebook friends. This may be one of the few challenges we face today where there isn't, as the saying goes, an app for that.


JOAN O'C. HAMILTON, '83, is a frequent contributor to Stanford.

