On the evening of November 9, having barely been awake to see the day, I took the subway to Sunset Park. My objective was to meet a friend at the arcade Next Level.
In size, Next Level resembles a hole-in-the-wall Chinese restaurant. It does indeed serve food — free fried chicken and shrimp were provided that night, and candy, soda, and energy drinks were available at a reasonable markup — but the sustenance it provides is mostly of a different nature. Much of Next Level’s space was devoted to brilliant banks of monitors hooked up to video-game consoles, and much of the remaining space was occupied by men in their 20s avidly facing them. It cost us $10 each to enter.
I had bonded with Leon, a graphic designer, musician, and Twitter magnate, over our shared viewership of online broadcasts of the Street Fighter tournaments held every Wednesday night at Next Level. It was his first time attending the venue in person and his first time entering the tournament. I wasn’t playing, but I wanted to see how he’d do, in part because I had taken to wondering more about video games lately — the nature of their appeal, their central logic, perhaps what they might illuminate about what had happened the night before. Like so many others, I had played video games eagerly since childhood, often to excess, to the point where the games we play become, necessarily, reflections of our being.
To the uninitiated, the figures are nothing if not staggering: 155 million Americans play video games, more than the number who voted in November’s presidential election. And they play them a lot: According to a variety of recent studies, more than 40 percent of Americans play at least three hours a week, 34 million play an average of 22 hours each week, 5 million hit 40 hours, and the average young American will have spent as many hours (roughly 10,000) playing video games by the time he or she turns 21 as in middle- and high-school classrooms combined. Which means that a niche activity confined a few decades ago to preadolescents and adolescents has become, increasingly, a cultural juggernaut for all races, genders, and ages. How had video games, over that time, ascended within American and world culture to a scale rivaling sports, film, and television? Like those other entertainments, video games offered an escape, of course. But what kind?
In 1993, the psychologist Peter D. Kramer published Listening to Prozac, asking what we could learn from the sudden mania for antidepressants in America. A few months before the election, an acquaintance had put the same question to me about video games: What do they give gamers that the real world doesn’t?
The first of the expert witnesses I had come to Next Level to speak with was the co-owner of the establishment. I didn’t know him personally, but I knew his name and face from online research, and I waited for an opportune moment to approach him. Eventually, it came. I haltingly asked if he’d be willing, sometime later that night, to talk about video games: what they were, what they meant, what their future might be — what they said, perhaps, about the larger world.
“Yes,” he replied. “But nothing about politics.”
In June, Erik Hurst, a professor at the University of Chicago’s Booth School of Business, delivered a graduation address and later wrote an essay in which he publicized statistics showing that, compared with the beginning of the millennium, working-class men in their 20s were on average working four fewer hours per week and playing video games for three more. As a demographic, they had replaced most of the lost work time with playtime spent gaming. How had this happened? Technology, through automation, had reduced the employment rate of these men by reducing demand for what Hurst referred to as “lower-skilled” labor. He proposed that by creating more vivid and engrossing gaming experiences, technology had also increased the subjective value of leisure relative to labor. He was alarmed by what this meant for those who chose to play video games and were not working: he cited the dire long-term prospects of these less-employed men; pointed to relative levels of financial instability, drug use, and suicide among this cohort; and connected them, speculatively, to “voting patterns for certain candidates in recent periods,” by which one doubts he meant Hillary Clinton.
But the most striking fact was not the grim futures of this presently unemployed group. It was their happy present — which he neglected to emphasize. The men whose experiences he described were not in any meaningful way despairing. In fact, the opposite. “If we go to surveys that track subjective well-being,” he wrote, “lower-skilled young men in 2014 reported being much happier on average than did lower-skilled men in the early 2000s. This increase in happiness is despite their employment rate falling by 10 percentage points and the increased propensity to be living in their parents’ basement.” The games were obviously a comforting distraction for those playing them. But they were also, it follows, giving players something, or some things, their lives could not.
The professor was nevertheless concerned. If young men were working less and playing video games more, they were losing access to valuable on-the-job skills that would help them stay employed into middle age and beyond. At the commencement, Hurst was not just speaking abstractly — and warning not just of the risk to the struggling working classes. In fact, his argument was most convincing when it returned to his home, and his son, who almost seemed to have inspired the whole inquiry. “He is allowed a couple of hours of video-game time on the weekend, when homework is done,” Hurst wrote. “However, if it were up to him, I have no doubt he would play video games 23 and a half hours per day. He told me so. If we didn’t ration video games, I am not sure he would ever eat. I am positive he wouldn’t shower.”
My freshman year, I lived next door to Y, a senior majoring in management science and engineering whose capacity to immerse himself in the logic of any game and master it could only be described as exceptional. (This skill wasn’t restricted to electronic games, either: He also played chess competitively.) Y was far and away the most intrepid gamer I’d ever met; he was also an unfailingly kind person. He schooled me in Starcraft, let me fiddle around on the PlayStation 2 he kept in his room while he worked or played on his PC. An older brother and oldest child, I had always wanted an older brother of my own, and in this regard, Y, tolerant and wise, was more or less ideal.
Then, two days before Thanksgiving, a game called World of Warcraft was released. The game didn’t inaugurate the genre of massively multiplayer online role-playing games (MMORPGs), but given its enormous and sustained success — augmented by various expansions, it continues to this day — it might as well have. Situated on the sprawling plains of cyberspace, the world of World of Warcraft was immense, colorful, and virtually unlimited. There were countless quests to complete, items to collect, weapons and supplies to purchase. It was only natural that Y would dive in headfirst.
This he did, but he didn’t come out. There was too much to absorb. He started skipping classes, staying up later and later. Before, I’d leave when it was time for him to sleep. Now, it seemed, the lights in his room were on at all hours. Soon he stopped attending class altogether, and soon after that he left campus without graduating. A year later, I learned from M, his friend who’d lived next door to me on the other side, that he was apparently working in a big-box store because his parents had made him; aside from that, he spent every waking hour in-game. Despite having begun my freshman year as he began his senior one, and despite my being delayed by a yearlong leave of absence, I ended up graduating two years ahead of him.
Y’s fine now, I think. He did finally graduate, and today he works as a data scientist. No doubt he’s earning what economists would term a higher-skilled salary. But for several years he was lost to the World, given over totally and willingly to a domain of meanings legible only to other players and valid only for him. Given his temperament and dedication, I feel comfortable saying that he wasn’t depressed. Depression feels like an absence of meaning; as long as he was immersed in the game, his life was saturated with it. He knew what to do, and I would bet that he was happy. The truth is that, odd as it might sound given his complete commitment to the game, I envy his experience as much as I fear it. For half a decade, it seems to me, he set a higher value on his in-game life than on his “real” life.
What did the game offer that the rest of the world could not? To begin with, games make sense, unlike life: As with all sports, digital or analog, there are ground rules that determine success (rules that, unlike society’s, are clear to all). Within a game, unlike in society, one’s purpose is directly recognized and never discounted. You are always a protagonist: Unlike with film and television, where one watches the acts of others, in a game one acts within the story oneself. And unlike sports, games no longer require one to leave the house to compete, explore, commune, exercise agency, or be happy; a single game can offer all of these at once. The environment of a game might be challenging, but in another sense it is literally designed for the player to succeed — or, in the case of multiplayer games, to have a fair chance at success. In those games, too, players typically begin in the same place, in public agreement about what counts for status and how to get it. In other words, games look like the perfect meritocracies we are taught from childhood to expect for ourselves but never actually find in adulthood.
And then there is the drug effect. In converting achievement into a reliable drug, games allow one to turn the rest of the world off to an unprecedented degree; gaming’s opiate-like trance can be delivered with greater immediacy only by, well, actual opiates. It’s probably no accident that, so far, the most lucid writing on the consciousness of gaming comes from Michael Clune, an academic and author best known for White Out, a memoir of his former heroin addiction. Clune is alert to the rhetoric and logic of the binge; he distinguishes between prosaic activities, where experience is readily rendered in words, and activities like gaming and drugs, where the intensity eclipses language. Games possess narratives with the power to seal themselves off from the narratives in the world beyond them. The gamer is driven by an array of hermetic incentives only partially and intermittently accessible from without, like the view over a nose-high wall.
In Tony Tulathimutte’s novel Private Citizens, the narrator describes the feeling near the end of a porn binge: he had “killed a week and didn’t know what to do with its corpse.” An equally memorable portrait of the binge comes from the singer Lana Del Rey, who rose to stardom in 2011 on the strength of a single titled “Video Games.” In the song, Del Rey’s lover plays video games; she undresses for him as he watches; later, she ends up gaming too. Pairing plush orchestration with a languid, serpentine delivery, the song evokes an atmosphere of calm, luxurious delight where fulfillment and artifice conspire to pacify and charm. The song doesn’t just cite video games; it sounds the way playing video games feels, at least at the dawn of the binge — a rapturous caving in.