I WAS MAKING the news rounds today when I ran across this article at Wired talking about a new psychological study. The study supposedly established a link between violent video games and aggression. Well, my B.A. in Psych (a.k.a. “Just enough to get you into trouble” or, more pessimistically, “You want fries with that?”) tempted me into looking at the original article itself. I found some interesting things in it, and I’m going to discuss them here.
Let it be noted that I enjoy video games, so I’m probably a little biased here. Let it also be noted that I enjoy video games I’m sure would be characterized by the authors of the article as “violent,” such as Quake 3 or Unreal Tournament. Let it finally be noted that I have never struck another human being (or any other living thing, for that matter) in a violent fashion.
Additionally, I have never (in real life) shot anyone with any type of gun more harmful than a Super Soaker, including but not limited to handguns, machine guns, shotguns, rocket launchers, guns that shoot extremely long arcs of electricity, or a BFG. I will admit, however, that I find the idea of skeet shooting with a BFG somewhat entertaining.
High drama for a journal
Now that my own personal biases on the subject are out of the way, let’s examine the article. The introduction begins in extremely sensationalistic fashion for an article in a psychological journal, with the first sentence bringing up the tragedy in Columbine, Colorado. By the fourth sentence, the authors have implicated video games as a possible factor, citing anecdotal evidence that the perpetrators liked the “bloody, shoot-’em-up video game Doom. . . .”
The authors then quote an investigator associated with the Simon Wiesenthal Center who said that Eric Harris and Dylan Klebold were “playing out their game in God mode.” A dramatic quotation to be sure, and one attributed in the text to “Pooley, 1999.” The publication that this quotation was drawn from? Time magazine.
As one who spent a fair amount of time in college poring over psychological studies that introduced their subject matter in dry, starkly scientific terms, I find it highly unusual and somewhat suspect that the authors present, as implied fact, the unsupported opinion of someone who is quite probably a layman.
As evidence that Harris and Klebold were “playing out their game,” the authors point out that, for a class project, they made a videotape that was similar to the scenario they had constructed within the game, in which they “dress[ed] in trench coats, carr[ied] guns, and kill[ed] school athletes.” As everyone now knows, they then acted out the events on the videotape in real life.
Apparently, the authors want the reader to believe that this behavior stemmed from a desire to act out the video game. This is certainly one possible explanation. A more plausible one, however, is that Harris and Klebold were using increasingly realistic methods to act out their homicidal tendencies and, more specifically, their hatred of athletes. First there was the customization of the video game, then there was the videotape, and finally the massacre itself.
To posit that the video game was the genesis of the violence and not simply one step in an increasingly realistic path of expressing violent fantasies is short-sighted at best. That psychologists employed at universities would come to such a conclusion raises the question of whether it was manufactured to fit the opinions of the authors and/or to bolster what we will see are fairly thin results.
Correlation, causation, and confusion
Before we get to the studies proper, let’s run through a brief primer on correlation versus causation. A correlation is established when research reveals that one characteristic or tendency tends to occur alongside another. Suppose you undertake a study where you poll a group of students about their study habits, then compare that data to their GPAs. You might find that students who study for longer periods of time generally have higher grades than those who study less. Your study has revealed a correlation between study time and grades.
It is important to realize, however, that all you’ve really established here is that the two characteristics tend to be found together. You cannot make any reliable inference as to whether the presence of one characteristic causes the other to occur. Such a connection, called causation, can only be established through clinical experimentation.
A celebrated example of these principles is the ice cream-crime connection. Studies have shown that the crime rate typically increases in the summer months. Studies have also shown that the consumption of ice cream increases in the summer months. A study comparing ice cream consumption and crime rate might very well reveal that the two are correlated; the more ice cream consumed, the higher the crime rate tends to be. Having said that, it obviously cannot be inferred that eating ice cream causes people to commit more crime; nor can it be established that committing crimes causes people to crave ice cream. Such statements are an attempt at establishing causation, and the lunacy of such statements drives home the point: correlation does not equal causation.
Before discussing their own results, the authors cite several other studies that have explored the video game-aggression link on a correlational level. Three of the studies showed a correlation; one did not. The authors note, however, that none of these studies distinguished between a violent and non-violent video game. To quote the article, “[t]hus, none test the hypothesis that violent video games are uniquely associated with increased aggression.”
The authors also cover the existing experimental studies in this area. A few brief quotes:
The extant experimental studies of video games and aggression have yielded weak evidence also . . . Two additional experimental studies of violent video games and aggression found no effect of violence . . . . In sum, there is little experimental evidence that the violent content of video games can increase aggression in the immediate situation.
Four experimental studies cited in the article showed a possible link between violent video games and aggression. However, even the authors discount these studies, because “none of these studies can rule out the possibility that key variables such as excitement, difficulty, or enjoyment created the observed increase in aggression.” They then sum up the results of the previous studies as follows: “There are methodological shortcomings in many of these studies, which, when combined with the mixed results, suggest that there is little evidence that short-term exposure to violent video games increases aggression-related affect.”
The first study: squeezing a truism from a rock
The authors’ own studies begin with a survey covering six separate scales: irritability, aggression, delinquency, video game preferences, world view, and academic achievement. The first five scales were obtained via questionnaire, the last through each participant’s college GPA. All of the participants were college students.
There are some interesting results here. First, 91% of the students surveyed (88% of the females and 97% of the males) played video games. Of those surveyed who did play video games, one third played video games described by the authors as violent/aggressive. Interestingly, the authors seemed eager to classify “Super Mario Brothers” as a violent game, which would push the percentage to 44%. I will leave it to the reader to decide if Super Mario Brothers is in fact a violent video game.
I think the above results are interesting because of the percentages involved; if approximately 30% of all college students are playing violent video games, and video games cause aggressive behavior as the authors later suggest, where is the huge outpouring of violence on our college campuses?
The argument could be made that because the subjects in question are college students, they might be more intelligent, have had more opportunities, and thus be less prone to the supposed aggressive side effects of video games. Assuming that at least 30% of junior high and high school students play these same games (a fairly safe assumption, anecdotally speaking; after all, college students have beer!), however, there should be at least the same level of violent incidents in our middle and high schools. In terms of numbers, it’s just not there.
The article goes on to point out that VGV (the article’s acronym for Video Game Violence) “was positively and significantly related to aggressive behavior. . .” Additionally, “[v]iolent video game play and aggressive personality separately and jointly accounted for major portions of both aggressive behavior and nonaggressive delinquency.”
The authors then state “The positive association between violent video games and aggressive personality is consistent with a developmental model in which extensive exposure to violent video games (and other violent media) contributes to the creation of an aggressive personality.” Strong words, but remember, correlation does not equal causation.
Indeed, after stating the above, the authors admit that “[t]he cross-sectional nature of this study does not allow a strong test of this causal hypothesis”. The authors later state:
the correlational nature of Study 1 means that causal statements are risky at best. It could be that the obtained video game violence links to aggressive and nonaggressive delinquency are wholly due to the fact that highly aggressive individuals are especially attracted to violent video games.
Thus, the first study yields only this result: violent individuals play violent video games. Based on this study alone, it’s impossible to say whether “violent games make violent individuals” or simply “violent games attract violent individuals.” Without clinical experimental data to establish causation, the correlation means no more than the connection between crime and ice cream.
Study two: Shall we play a game?
Next up is the second study, a clinical trial. Because clinical trials take place with control and experimental groups, under controlled conditions, one may reliably derive causation from them. Of course, this assumes that the controlled conditions are all valid, and I have some issues with the conditions in the second study.
In order to understand the results, it is necessary to become familiar with the procedures and measures used during the experiment. The authors first had to choose video games to use in the experiment; Myst, Tetrix (a Tetris clone), Marathon, and Wolfenstein 3D were chosen as candidate games. Thirty-two students were invited to play each of the four games, then rate them on scales of difficulty, enjoyability, frustration, action speed, and excitement. Additionally, blood pressure, heart rate, and other physical measurements were taken while each of the four games was being played.
The purpose of this trial was to rule out factors other than violence as being responsible for aggressive behavior in the second phase of the experiment. A noble goal, but I began to doubt the validity of this phase when the authors revealed that “[t]he goal was best achieved by pairing of Myst and Wolfenstein 3D. . . There were also no differences on ratings of game difficulty, enjoyment, frustration, and action speed [between the two games]. However, Wolfenstein 3D was rated as more exciting than Myst. . .”
Anyone who has experience with these two games knows that they are worlds apart in terms of genre. Myst is an adventure game with “brain teaser” puzzles that does not rely on reaction time; indeed, one could leave a game of Myst running while out at a movie for a couple of hours, and nothing in the game would change in the interim. By contrast, Wolfenstein 3D is a first-person shooter, commonly referred to as a “twitch” game because of its predominant action focus and emphasis on quick reaction time. Abandoning the controls for a few seconds will probably result in the player losing the game.
The idea that two dissimilar games would score equally on a scale such as “action speed” troubles me greatly, and in my opinion makes the results of this phase suspect. Unfortunately, the study gives no specifics on the amount of time the participants in this phase spent playing each game. There is also no indication that the participants in this phase were told the rules of each game. The omission of these factors is troubling; an experiment comparing the speeds of a Corvette and a minivan might end similarly if the test involved driving twenty feet.
The lack of this information makes it difficult to be certain what the specific test conditions were, but the fact that Wolfenstein 3D was rated more exciting than Myst at least gives some hope for the validity of the game selections. However, the difference in rating occurred only among males. Given that males are much more likely than females to play action games like Wolfenstein 3D, it is possible that neither group was told the rules, but the males already knew them from having played the game (or similar games) before.
Additionally, the participant pool for this phase of the experiment was different from the main phase of the experiment. This fact tarnishes the validity of the results; the conclusions of the experiment rely on the two games matching on the above scales, but there is no proof that the participants in the main experiment have the same opinion on this matter as the participants in the game selection phase.
On the whole, that two such wildly dissimilar games were found to be identical on five of six scales is a concern, especially since this finding forms the foundation of the later findings.
Trials and tribulations
Once the games had been selected, the main part of the experiment got underway. Students took a personality survey that yielded a “Trait Irritability” score, and those with scores in the top 25% and bottom 25% were selected for the experiment. Each group was split in half, with one half playing Wolfenstein 3D and the other half playing Myst. Participants played their assigned game for fifteen minutes, then took a test to judge their “state hostility.” After a second fifteen-minute play period, they took another test to measure their cognitive aggression. This completed the first session.
One week later, the participants were called back for a second session, where they played one more fifteen-minute round of the same game. At this point, the participants were told they were taking part in a competitive reaction time test with another participant. Before each trial, participants set the duration and intensity for a burst of white noise; if they won that trial, the other participant would be given a “blast” of noise of the duration and intensity chosen by the participant. If the participant lost the trial, they would receive a blast of the duration and intensity set by their opponent.
The idea here is that louder and/or longer bursts of noise inflicted upon the other participant signified greater levels of aggression. In reality, the second participant was fictional; noise levels and durations were drawn from a predetermined list of values, so all participants got the same noises, just in randomized order. The noise values weren’t the only thing fixed in advance: although the participants didn’t know it, the test itself was rigged, with each participant winning 13 trials and losing 12.
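For the procedurally minded, the rigged design described above can be sketched in a few lines. The 13/12 win-loss split and the shuffled-but-identical noise values come from the article; the specific dB and duration figures below are invented for illustration:

```python
import random

# Hypothetical sketch of the rigged trial schedule: every participant
# faces the same 13 wins and 12 losses, and the same predetermined set
# of noise blasts, just in a shuffled order. The dB/second values here
# are made up; the article doesn't publish the actual list.
PRESET_BLASTS = [(55 + 2 * i, 1.0 + 0.25 * i) for i in range(12)]  # (dB, seconds)

def make_schedule(rng):
    outcomes = ["win"] * 13 + ["lose"] * 12
    rng.shuffle(outcomes)               # randomized order, fixed totals
    blasts = list(PRESET_BLASTS)
    rng.shuffle(blasts)                 # same blasts for everyone, shuffled
    schedule = []
    for outcome in outcomes:
        # the participant only receives a blast on trials they "lose"
        blast = blasts.pop() if outcome == "lose" else None
        schedule.append({"outcome": outcome, "blast": blast})
    return schedule

schedule = make_schedule(random.Random(0))
wins = sum(1 for t in schedule if t["outcome"] == "win")
print(f"{len(schedule)} trials: {wins} wins, {len(schedule) - wins} losses")
```

The point of the sketch is simply that the “competition” contributes no real variance: every participant experiences an identical distribution of outcomes and blasts, so any difference in the durations they dish out must come from the participants themselves.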
The results of each part of the test are somewhat interesting. The “state hostility” test did not support the authors’ hypothesis; the scores did not vary significantly at all based on whether the participant played a non-violent or violent game.
The cognitive aggression test did reveal a higher measured aggression from those who played the violent game than from those who played the non-violent game. The same test revealed higher aggression in male subjects. Interestingly, the participants’ irritability scores from the pre-experiment screening had no correlation with this test.
The cognitive aggression test measured the speed at which participants could read aloud words on a computer screen. According to the test, cognitive aggression was higher if the participant read aggressive words (such as “murder”) more quickly than non-aggressive words (such as “consider”). I will leave it to the reader to ponder how the speed at which an individual reads words with aggressive meanings may or may not translate into real world aggressive or violent behavior.
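The article does not give the exact scoring formula, but one plausible reading of “read aggressive words more quickly” is a simple difference score: mean reading time on neutral words minus mean reading time on aggressive words. The words and millisecond timings below are invented to show the arithmetic:

```python
import statistics

# Hypothetical scoring sketch -- the study's actual formula isn't given.
# A positive score means aggressive words were read faster than neutral
# ones. All reading times (in milliseconds) here are invented.
aggressive_ms = {"murder": 412, "assault": 430, "destroy": 405}
neutral_ms = {"consider": 455, "window": 447, "report": 462}

def aggression_score(aggressive, neutral):
    """Mean neutral reading time minus mean aggressive reading time."""
    return statistics.mean(neutral.values()) - statistics.mean(aggressive.values())

score = aggression_score(aggressive_ms, neutral_ms)
print(f"cognitive aggression score: {score:.1f} ms")  # prints 39.0 ms here
```

Even granting that such a score measures something real, a few dozen milliseconds of reading-speed difference is a long way from a thrown punch, which is the point of the paragraph above.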
The final test was the “noise test,” whose purpose was to measure aggressive behavior. As mentioned before, there were two factors to consider here: the intensity and duration of the noise that the participant inflicted on his or her imaginary “opponent.” For whatever reason, the intensity of the noise didn’t vary with any of the factors tested: gender, irritability score, and game type had no effect on the intensity of the sound chosen by the participant.
The duration of the sound did vary, however. As one might predict, participants tended to be more aggressive in general when setting the duration of the noise immediately following trials where they “lost” (i.e. were subjected to a burst of noise). This fact was taken into account by the experimenters.
Following a “win” trial, the only pattern was that females tended to be more aggressive than males, delivering longer noise blasts. The same was true after a “lose” trial; females delivered longer noise bursts than males. Additionally, subjects with high irritability scores delivered longer noise blasts after a “lose” trial than those with low scores. Finally, those who played Wolfenstein 3D delivered longer noise blasts than those who played Myst.
Leaping a chasm to conclusions
From here, the study takes a disturbing turn in its conclusions. First, the experimenters declare that the effect of video game type on noise duration is “most important”, despite the fact that their data seems to indicate game type has the weakest correlation of the three factors with significant results. By way of example, look at this graph from the article that compares the effects of irritability versus game type.
After looking at this graph, I came to the conclusion that in the worst possible case, the type of video game has less of an effect on a person’s aggressive tendencies than does his everyday, natural personality. The authors, on the other hand, see the graph as suggesting “a third possibility: The AP × VGV interaction in Study 1 may reflect a long-term bidirectional causality effect in which frequent playing of violent video games increases aggressiveness, which in turn increases the desire and actual playing of even more violent video games.” Given the clinical findings of this and prior studies, such a statement seems little more than rampant speculation in an attempt to justify a hypothesis not supported by the data.
The rest of the article is a rambling dissertation on the need for further study, with fearmongering statements that conflict with the findings of the study, such as “[v]iolent video games provide a forum for learning and practicing aggressive solutions to conflict situations.” The conclusion then makes unwarranted and unproven claims about the long-term effects of violent video games, despite the article’s focus on short-term effects that only marginally pan out.
In my opinion, the selection of the video games used in this trial is fundamentally flawed. That the group of participants who rated the games on the measured scales was not the same group that actually played them in the main experiment is cause enough to discount the results. That such disparate games as Myst and Wolfenstein 3D were judged so very similar to one another calls the selection process into serious question as well. Such fundamental problems, combined with the apparent bias and slant toward sensationalism in the authors’ writing, make the article better suited for Newsweek than for a psychological journal. I only hope the further research called for by the authors will be performed by someone more interested in valid results than in doggedly hanging onto a hypothesis that stands contrary to the data.