I recently received a Facebook invitation for a massive Pokemon Go meetup in Chicago, despite not owning a compatible phone or being particularly interested in the franchise (yes, I’m that person). I declined the invitation without much thought, but the very next day the event’s absurdly high turnout was making its way onto my social media newsfeeds all on its own. Aside from the game’s commercial success, Pokemon Go has had some seemingly profound social implications, not the least of which are reports of players experiencing alleviated symptoms of depression and other mental health issues. Has the videogame industry finally delivered unto us the cure to all of society’s ills in the form of anti-sedentary augmented reality? No, not exactly. Though this is hardly the first time that videogames have been credited with real-world positive effects on a massive scale.
You may recall, a few years ago, when players openly took pride in collectively solving a complex problem involving an AIDS protein in a matter of weeks. Considering the implications of such scenarios, it seems that videogames, with their exceptional ability to motivate players, are capable of producing amazing results. However, they’re also capable of producing some not-so-amazing ones. If you spend a lot of time around frequent players, chances are that for every person you know who met their soulmate in EverQuest, you also know someone whose career stalled for months or years because of their playing habits, or whose social life has seriously suffered because they spend twelve hours a day playing League of Legends. Maybe that someone has even been you.
Players are often all too eager to sing the praises of videogames while dismissing much of the criticism that comes their way, and for somewhat understandable reasons. Videogames have been raked over the coals for decades by everyone from overzealous ministers to woefully uninformed news anchors. Beyond that, we’ve seen ongoing attempts to regulate the videogame industry in ways that are often excessive and unfair, not to mention irrational. As a result, players sometimes struggle to differentiate legitimate, factual scrutiny from the alarmist white noise that has surrounded the medium since its inception. This has been demonstrated with a myriad of social issues, including violence, sexism, and now mental health.
Despite what critics and pundits have said on either side of the debate, the simple truth is that videogames are neither inherently good nor bad; they are merely a tool, albeit an important one, capable of producing transformative, interactive experiences like nothing else. In that regard, it’s not surprising that they provoke such strong responses. People on all sides of the discussion seem to recognize videogames’ potential, one way or the other, but most are likely ignorant as to why that is. So let’s peel back a layer and examine what makes playing games so psychologically powerful.
Arguably the most important component is that playing videogames is one of the easiest and most reliable ways to achieve what psychologists refer to as ‘flow.’ ‘Flow’ is a trance-like state involving deep concentration that requires a delicate balance of challenge and skill, ultimately making an activity far more enjoyable and rewarding to the person performing it. What’s more, by inducing flow in the player, a game can effectively and deliberately captivate, diverting the player’s attention from other experiences — even powerful, biological ones.
Precisely because of this phenomenon, games have proven to be an especially potent method of pain reduction. A 2011 study conducted at the University of Washington in Seattle found that burn victims who played a virtual reality game called Snow World while receiving procedural medical care saw a significant decrease in their pain levels. Hypothesizing that the VR environment would act as a powerful distraction for patients, researchers used brain scans to corroborate patients’ reports of a 35-50% reduction in pain, suggesting a safer, and possibly more effective, alternative to the morphine usually prescribed. While virtual reality has been getting a lot of attention recently, with reports that VR has been used to help paraplegics regain partial functionality after spinal cord injuries, more conventional videogames have been put to use as well.
For example, researchers and developers worked together to create a series of games that help cancer patients adhere to what are often unpleasant and physically debilitating regimens of chemotherapy and antibiotics, improving their odds of remission. In terms of videogames’ explicit mental health applications, there is still plenty of room for additional research, but therapists have been using games to assist patients for some time now, with varying degrees of success. So far, playing games has been credited with decreased levels of depression, stress, and anxiety, as well as improved self-esteem and social skills. There are even ongoing studies examining the usefulness of Tetris in curbing the long-term effects of psychological trauma by disrupting the memory consolidation that takes place after a traumatic event.
The common threads among many of these medical applications of videogames are the audiovisual and visuospatial aspects that, compounded with games’ interactivity, make playing games uniquely engaging and can even improve certain cognitive functions. All of this is to say nothing of videogames’ ability to alter activity in various regions of the brain and sharply increase dopamine production. Dopamine is commonly associated with feelings of pleasure at a neurological level, and with drugs like nicotine and cocaine. It’s not hyperbole to say that playing games can change our perceptions of reality.
Outside of the doctor’s office, videogames have made their way into a number of social and institutional spaces as well. Teachers frequently take advantage of games like Minecraft: Education Edition, adding to the long list of games that have become commonplace in schools. Educators and researchers are still weighing the pros and cons of having videogames in the classroom, but there’s legitimate reason to believe that by providing the right kinds of stimulation and keeping students interested in the subject matter that they’re learning about, games can, at the very least, act as a helpful supplement to educational methods that already exist.
In a broader and perhaps more obvious sense, videogames can affect players’ social behavior. With many millions of people playing online games like Call of Duty and League of Legends, we’ve seen entirely new forms of social interaction influence players’ lives. Some have even claimed to experience ‘spiritual’ benefits from playing, like the Chinese World of Warcraft players who described their time in WoW as ‘a feeling of existence [they] cannot find in the real world.’ Similarly, another player told the story of how he was able to maintain a connection with his deceased father through his old Xbox, by challenging his father’s ‘ghost’ in a racing game they once played together.
However, while videogames have demonstrated their positive qualities many times over — as many players will gladly attest to — it’s important to keep in mind that games are the product of a multi-billion dollar industry that deals in neurochemical events and extensive behavioral conditioning. For many of the same reasons that they are able to affect players’ lives for the better, they can also affect them for the worse.
All in all, these outcomes aren’t as surprising as they may seem when you consider that game developers have long applied their understanding of psychology to make their games more ‘compelling,’ or, as many would more accurately put it, ‘addictive.’ Videogames’ addictiveness, fueled heavily by the massive dopamine releases that designers elicit through careful behavioral conditioning, has been built into games for financial reasons since the arcade days. That financial model is alive and well in today’s mobile game market, and while the AAA industry has been happy to criticize its mobile counterparts’ habit-forming mechanics, usually on professed ethical grounds, one has to wonder whether this is partly an attempt to deflect blame from mainstream PC and console games onto the medium’s well-established pariah.
But AAA developers are perfectly willing to use psychological hooks of their own. They often draw inspiration from Maslow’s ‘hierarchy of needs,’ for instance, when designing reward structures and core gameplay loops. In games, these hierarchies are designed to fulfill a variety of player ‘needs’ while simultaneously instilling a sense of still more need, often with the help of randomized rewards. The most notable example is World of Warcraft, which has become practically the poster child for what is now, even among game developers, commonly acknowledged as pathological game addiction.
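The pull of those randomized rewards comes from what behavioral psychologists call a variable-ratio reinforcement schedule: each attempt pays out with some fixed probability, so the next reward is always unpredictable — the pattern found most resistant to extinction. A minimal sketch of the idea (the 10% drop rate and the function name are illustrative, not taken from any actual game):

```python
import random

def roll_reward(drop_rate=0.1):
    """One 'loot drop' attempt under a variable-ratio schedule:
    pays out with fixed probability, so the next reward is
    always unpredictable."""
    return random.random() < drop_rate

# Count attempts until a reward lands; the unpredictable streaks
# of misses are what keep players pulling the lever one more time.
random.seed(42)  # fixed seed so the run is reproducible
attempts = 0
while not roll_reward():
    attempts += 1
attempts += 1  # count the winning attempt too
print(attempts)
```

Because payout odds are independent of past misses, a dry streak never makes the next drop more likely — but players reliably behave as if it does, which is precisely what makes the schedule so habit-forming.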
Some videogame advocates have made semantic arguments against the existence of videogame addiction, usually chalking compulsive play up to bad parenting. The reality, however, is that videogames can have complex neurological effects on our brains. So while there may be physiological distinctions between drug addiction and videogame addiction, the end results are strikingly similar.
We’ve seen a lot of good come from the videogame industry, but now that we understand the technical components and design philosophies that produce these outcomes, it’s time to start asking the hard questions. What are the implications of videogames being comparable in effectiveness to certain pharmaceuticals, and is that something we’re comfortable with? What are the ethical concerns around storing players’ digital representations, in the form of save data or user profiles, after they’ve stopped playing, especially in the case of death? Games being used to ‘motivate’ children to perform in ways they otherwise wouldn’t in a classroom seems benign enough, but what about when the same practices leave someone without a spouse or a job after being consumed by Fallout 4, or out hundreds of dollars on mobile games? The answers, much like the science itself, are likely complicated and to be taken with a grain of salt.