'Etchells writes eloquently ... A heartfelt defence of a demonised pastime' The Times

'Once in an age, a piece of culture comes along that feels like it was specifically created for you; the beats and words and ideas are there because it is your life the creator is describing. Lost In A Good Game is exactly that. It will touch your heart and mind. And even if Bowser, Chun-Li or Q-Bert weren't crucial parts of your youth, this is a flawless victory for everyone' Adam Rutherford

When Pete Etchells was 14, his father died from motor neurone disease. In order to cope, he immersed himself in a virtual world – first as an escape, but later to try to understand what had happened. Etchells is now a researcher into the psychological effects of video games, and was a co-author on a recent paper explaining why the WHO's plans to classify 'game addiction' as a danger to public health are based on bad science and (he thinks) are a bad idea. In this, his first book, he journeys through the history and development of video games – from Turing's chess machine to massively multiplayer online games like World of Warcraft – via scientific study, to investigate the highs and lows of playing and get to the bottom of our relationship with games: why we do it, and what they really mean to us. At the same time, Lost in a Good Game is a very unusual memoir of a writer coming to terms with his grief via virtual worlds, as he tries to work out where games – a relatively new technology – belong in popular culture.
For Malcolm
In your pocket, in your bag, on your desk, there is a window onto the entirety of human knowledge and understanding. For the vast majority of us, screens are now an inseparable part of everyday life. They connect us to our friends and family and bring us together with strangers. They are our work, our play – an inspiration and a source of distraction. And yet, despite their ubiquity, despite their familiarity, for many people screens are something to be wary – even scared – of. There are some scientists and intellectuals who will go so far as to say that screens are changing the way our brains work (which is true), and that this is a bad thing (which is false).
Using screens does change the way our brains work, but this is not an interesting point to make, because everything that you do changes your brain. Reading these words, right now – whether on a screen or on a page – is modifying the way that neurons connect to each other inside your head. That’s what happens every time we learn new things, make new memories, and remember old ones. The interesting question is: how do things change your brain? And, as far as the fledgling science of screens is concerned, the answer is complex, nuanced and woefully incomplete.
Nevertheless, spending lots of time parked in front of a computer or smartphone doesn’t seem to feel right on an almost instinctual level. This dissonance, between what we feel from past experience, and what we know from current scientific research, might go some way towards explaining the culture war that is currently being waged against screens in general, and video games in particular.
Here’s an example of what I mean. If I ask you to think of a stereotypical gamer, who comes to mind? The chances are that you’ll conjure up an image of a lone teenage boy with skin so pale it’s near-translucent, bathed in the blue-white glow of a computer monitor in a darkened bedroom, playing a shooter game – something like Call of Duty, or Fortnite, or Overwatch. And something about that scenario feels unhealthy and unnatural. If you have that image in mind, and someone comes along and announces that we should be encouraging kids to go and play outside more, mere common sense dictates that it would be silly for us to argue against such a position.
What little scientific evidence is available so far seems to suggest that it isn’t a zero-sum game, though – less time spent outdoors doesn’t simply correlate with more time spent playing video games. And on top of that, this is a false dichotomy, balanced precariously on a poor understanding of who actually plays video games. It turns out that the demographics of gamers run counter to our common (mis)conceptions. According to the Interactive Software Federation of Europe, across all age groups there are approximately equal numbers of men and women playing games. And perhaps more surprisingly, adults aged 45 and up are more likely to play than children aged six to fourteen.
There is an unspoken assumption here, of course: that while playing outdoors is a wholesome and healthy activity, playing video games is at best a meaningless waste of time, and at worst an unprecedented health risk. But the reality of the situation is far more complex. Video games are a creative medium, and they offer us unparalleled opportunities for exploring what it means to be human. Certainly, there is potential for them to be misused and abused. But they also offer us new ways to explore the world around us, our thoughts and feelings, our demons and aspirations. And of course, playing outside isn’t without its own risk, be it from air pollution, traffic on our roads, or ever-increasing concerns about ‘stranger danger’. The rose-tinted view that it is wholesome and healthy often seems to spring from a rather privileged assumption that ‘outside’ means ‘in the countryside’.
There are many different ways to understand video games. One of them is by looking at why we play. What is it that makes some of us want to spend time in those worlds? Different people play different games for different reasons. This is a book that will uncover those reasons and consider the effects that they have on us: as a society, as well as at a personal level.
But I also want to take a grander view of the history of games and look at their relationship with science. Video games were, of course, a product of scientific development; but now they themselves are starting to feed that very development. Science and gaming are locked together in a symbiotic relationship. Through examining that symbiosis I want to uncover a natural history of video games – a sense of where they came from, and how they have changed over the years.
Though I’ll try to maintain objectivity – certainly as far as the scientific research into the effects of gaming on the human brain goes – I ought to come clean and say that my own engagement in this field began for personal reasons. As far as my own relationship with video games goes, the story is a complicated one.
And, like many things, it starts with an end.
CHAPTER 1
There’s a landscape, not all that far away from here, that over the years I have come to know in intimate detail. It’s a frigid, desolate place where snow-crusted mountaintops give way to ravines scattered with pockets of hardened civilisation. To the east of where I’m camping, the sharp, rocky heads of a mountain range climb towards a purple and stormy sky. Far below, I can just about make out a frozen stretch of water, lined by trees that look, from this distance, like cake decorations dusted with icing sugar. I stare off into the fog. I’m in a dangerous place, but it’s one that I have come to associate with a certain serenity. It’s peaceful here. Quiet.
I’m waiting for dragons.
Back in my room, I pulled my legs up onto the chair, and reached for a mug of coffee as a lightning storm played out in the distance on the screen in front of me. The mug had been empty for several hours now, and I was left with only a drying brown halo of silt at the bottom. It was late, and I was tired, but I couldn’t sleep. I was in the first year of my PhD, but it wasn’t work that was bothering me. Today was an anniversary. I blinked as I carefully studied the screen.
No dragons yet.
It wasn’t just any dragon I was looking for. This particular one lived up to its epithet. ‘The Time-Lost Proto Drake’. What a name! In my mind, it evoked an image of an ancient monstrosity with vast wings of torn and mouldering yellow leather. But the name carried a double meaning – it was also one of the hardest things to find in World of Warcraft. You might spend weeks, months – even years, God forbid – tracing a path around the mountains in search of it, and only come across tantalising flecks of evidence reminding you that it’s there, but just out of reach; perhaps the old corpse of an instance of the beast that someone else got to first. Or you might be one of the infuriating ones, the lucky bastards who claim that they just ‘happened across it’ without even trying. A random-number generator masquerading as good fortune, or karma for that rare weapon you didn’t receive after killing that dungeon boss* last week. It was somewhere in this snowy landscape, an area called the Storm Peaks, and I was hoping I would be one of the lucky ones. It didn’t look like it was going to turn out that way though – this time around I’d been sitting there for over an hour, and so far, nothing.
In a way, I didn’t really care whether I actually saw the damn thing or not. This was all about distraction. I was imagining what it would really be like, sheltering on that ledge at the top of this rich fantasy world, watching other players fly by on gryphons, wyverns, and levitating mechanical heads. Trying to imagine what the dramatic storms overhead would actually sound like, feel like. Smell like. Some people get lost in a good book. I get lost in a good game. A message popped up in the chat window in the bottom corner of the screen. It was Dave, the leader of the small guild of which I was a member.
‘Seen it yet?’
‘Not a chance,’ I replied. ‘Not really looking though.’ His response flashed up after a moment. ‘Wanna do a dungeon instead?’ An invite to join a group popped up on my screen, and a few minutes later, we were off on an adventure someplace else to take down a monster and grab some loot. The Time-Lost Proto Drake would have to wait for another day.
World of Warcraft is one of the most – if not the most – successful ‘massively multiplayer online’ games, or MMOs, of all time. To the untrained eye, it’s an archetypal ‘violent video game’ – you create a character, grab a weapon, and jaunt off on various quests to smash things, ranging from fairly innocent low-level boars to terrifying Lovecraftian monstrosities. But the thing that gets lost in the overly-simplistic narratives you might see in the news about World of Warcraft being a violent game is that there’s more than one way to play it. Some people like to create an array of different characters, just for the experience: elf druids, human warriors, dwarven hunters, zombie warlocks. Or, you can play it as a purely competitive team game – two factions, the Alliance and the Horde, square off in anything from strategic ‘capture the flag’ style matches to all-out brawls. Or you can play it in true role-playing style, develop a character with a rich and lengthy history, and spend your time acting out a story on the grandest of scales. Sometimes, you find players who approach the game in a completely pacifist way, levelling their characters up solely by harvesting flowers and mining for ore. Other people devote the majority of their time to collecting riding mounts – animals like the Time-Lost Proto Drake that you can use to travel around the world. There are hundreds of them. I once spent three weeks – three weeks – wandering around a tiny mine in a distant corner of the game, collecting randomly-spawning eggs, just so that I could claim a Netherwing Drake mount. It became an obsession, and was completely worth the effort. The first time I took to the skies on it, it was beautiful – wings of iridescent purple that spanned the entire width of the screen made it difficult to see where I was flying, but I thought that it was a wonder to behold. It still takes pride of place in the ranking of my zoo’s worth of rides. In short, to simply call World of Warcraft a violent game is to miss the innumerable experiences that it has to offer.
This sense of freedom probably explains some of World of Warcraft’s runaway success. Video games like this provide us with the opportunity to experience the world (as well as other worlds) in a way that no other form of media really comes close to, in part because they are an inherently personal experience. In a 2013 radio essay coinciding with the centenary of Albert Camus’ birth, Naomi Alderman, the novelist and games designer, elaborates on why: ‘While all art forms can elicit powerful emotions,’ she says, ‘only games can make their audience feel the emotion of agency. A novel can make you feel sad, but only a game can make you feel guilty for your actions. A play can make you feel joyful, but only a game can make you feel proud of yourself. A movie can make you feel angry with a traitor, but only a game can make you feel personally betrayed.’
Alderman is talking about how games embody the principles of existentialism. Just as philosophers like Camus or Sartre suggested that, in a universe from which God has departed, we define our own meaning in life (that we are nothing but that which we make of ourselves) so too do games force us to define ourselves via a series of choices, to make decisions in order to achieve something; anything. MMOs like World of Warcraft encapsulate this idea beautifully. There is an overarching storyline, but you’re not required to participate in it if you don’t want to. It’s not a linear game. You’re free to do as much or as little as you choose – and from the point of view of the individual player, the possibilities are endless.
There are all sorts of reasons why people play video games, and there are all sorts of people who play them. Over the course of this book, I’ll explore these reasons and the scientific research that’s gone into understanding them. I should say at this point, though, that the scientific research comes with some heavy caveats. Video games research is only a budding area of science, sitting largely within psychology, which is itself still a relatively young discipline when compared to some of the ‘harder’ sciences like physics or chemistry. It’s made all the more complicated by two facts: firstly, technology develops at a faster rate than research can be conducted, which means that the methods used to study video games are often contentious. The second is that people are messy. Running psychological studies that involve human participants doing anything (let alone playing video games) is hard. They do things that you never anticipated: things that can break your experiments. They try to give you the answer they think you want. Some of the more annoying ones try to give you the answer they think you don’t want. And all of this put together means that there are as yet no universal or conclusive truths about what researchers do or do not know about the effects that video games have on us, or why people play them. Sorry to disappoint you so early on, but I promise that digging deeper into this state of affairs will give you a pretty good understanding of where we’re at in terms of the current state of psychological science. And hopefully, along the way, I’ll be able to dispel a few myths about the effects of games – and technology in general – that might make you worry less over some of the more hysterical headlines in the news about society as we know it being destroyed by your smartphone, or by Instagram-saturated millennials, or by whichever video game people are taking exception to this particular week.
Anyway, as I say, there is a plethora of reasons why people get into gaming. Some play purely to interact with other people. Some simply enjoy the level of escapism offered by complex and multifaceted digital worlds. Fundamentally, whatever our reasons for playing them, video games afford us a chance to learn something different, to explore somewhere new, and, potentially, find out something about ourselves. Reinforcing this point, Naomi Alderman suggests that ‘the game is the only form that actually places the audience on that existentialist stage, where we’re all forced to find out who we really are.’ In that sense, video game play is one of the most fundamentally important activities we can take part in.
For me though, there was a simpler reason that I was playing Warcraft that night, looking for that elusive dragon. I was playing to distract myself from the anniversary of my dad’s death.
When I think back to that day, I remember conflicting details. It’s like someone has taken an old jigsaw, removed half of the pieces, and thrown some other bits in from another set. The pieces look similar, the sky is the same shade of blue, and they almost fit – but not quite. That day, both of us ended up in hospital for different reasons. In the afternoon, I’d managed to injure my ankle playing football – gamer clichés aside, I’ve never been a natural athlete. I somehow made it back to my school’s main building, which was some walk (or in this case, hobble) from the football pitch, and one of my teachers tried to phone my dad. Someone came to pick me up and take me to hospital – my mum, I think, though my grandad must have been there too because I remember him looking ashen with worry. I remember thinking it odd that he would be so upset about something as trivial as a sprained ankle.
My mum drove me back home after my ankle had been patched up, and I remember my nan calling me on a black-brick monstrosity of a mobile phone: ‘Peter, your father’s died.’ Sprawled out on the back seat of the car, I started so much that I kicked the door with my duff foot. As it turned out, I’d misheard, but the real news wasn’t much better. Not dead. Dying. For the past two years, my dad had been slowly succumbing to the tidal onslaught of motor neurone disease. That morning, he’d taken a turn for the worse, and he was on his way to the very hospital I’d just spent the last four hours in. As the phone call ended, I saw an ambulance drive past, and hoped it was a coincidence.
Later, I’m at the hospital. There’s a family room; that’s where my mum, grandparents, my auntie and my uncle are. They’re arguing over silly little things, like who the doctor should be explaining the situation to. My dad’s in the next room over, on a ventilator, with strips of white tape covering his eyes. He’s not responsive. For the most part, neither am I. I’m just sitting there, in a wheelchair, alone in the corridor, looking and feeling a bit pathetic. As the argument starts, I quietly wheel myself out of the room to get away from the noise. Being lost with my own thoughts out here isn’t much better though. Why were they fighting? And what could anyone possibly hope to gain from being told what was going on? It was simple really: my dad’s dying, and he shouldn’t be. He’s only 45. We should be at home, having dinner, or watching a movie, or doing something, anything else. Anywhere but here. At some point, the doctor came and spoke to me, but the words didn’t really penetrate the cocoon of disbelief I’d wrapped myself in. At another point, he wheeled me into my dad’s room, so I could say goodbye.
Death in the virtual reality of a video game is an odd sort of thing. One life might end, but then time rewinds ever so slightly, and another alternative continuity pops up. (Interestingly, some physicists believe that the universe may actually work like this; that for every decision we make, a real alternative history branches away from us, another universe in which we made the opposite decision, or succeeded instead of failed, or got a negative result on that test instead of a positive one.) Each time, a new world opens up: one where you didn’t accidentally fall into a massive hole, or trigger a trap, or get shot by a sniper. I think that’s part of the reason why I like them so much – you get another chance. In a sense, death is robbed of its terrifying power. It’s not the inevitable end that we must all face at some point in our existence; instead, it’s a minor inconvenience. It’s a way of being told that you screwed up, pressed the wrong button. Ultimately, it’s all about failure.
People worry that games are melting our brains, or that they are turning generations of kids into social zombies, incapable of stringing a coherent sentence together in the name of enjoyable conversation. For a time, I wondered, too, if playing games to escape death was a bad thing. I worried that reinforcing this idea – that dying in a video game is, at its core, a commentary on failure – would make it spill over into the real world in some way. That I would start thinking that death is just about some sort of deficiency. Or, that it would somehow interrupt one of the stages of grief. Everyone knows them: denial, anger, bargaining, depression, acceptance. Maybe games override that last one. If you can delude yourself that death doesn’t really happen, that there’s always another chance, then you can never really accept when someone you love has truly gone forever. That’s probably a bad thing, right?
We all know this, deep down, but it doesn’t hurt to say it out loud: all that stuff about five universal stages of grief is a load of bollocks. The original idea came about in the 1960s as the brainchild of a Swiss psychiatrist, Elisabeth Kübler-Ross, who developed the model after working with a number of terminally-ill patients. If you look at the scientific research literature, there’s not much to back up the idea that people go through some sort of standardised or predictable pattern of dealing with loss. There’s never been a study that’s shown that these rigid stages of grief actually exist. Our emotions just aren’t granular or consistent in that way. Instead, whenever it lurches into our lives, I feel that death has a tendency to throw us into uncertainty: as a way of removing any pretentiousness or psychological veneers with which we might protect ourselves. The heuristics that we use to navigate through day-to-day life fall away, and we must truly and honestly respond to a situation that presents us with what seems like the most terrible of unknowns. In fact, everyone responds to bad news differently. So, rather than vying with the process of grieving for someone lost to the ravages of time, perhaps playing video games instead offers a way to deal with a situation that often seems to escape understanding, that defies any attempt at explanation. They might offer help with other things, too.
‘Do you think it helps? Do they draw people in?’ I ask.
I’m sitting in a stark, cream-coloured conference room somewhere in the heart of the National Exhibition Centre in Birmingham. On the far side of the room is a table laden with coffee, tea and biscuits, and as I sit on some hastily rearranged blue conference chairs next to Johnny Chiodini, a quiet stream of sleepy passers-by idle past us in search of caffeine.
‘I think it doesn’t so much draw people in, as it’s there for the people who already know’, he explains. ‘People who use games as a support mechanism do it because they’ve already been doing it, maybe without realising.’
Johnny Chiodini is the senior video producer for Eurogamer.net, a website which hosts articles and videos covering all aspects of video games journalism, from reviews through to features about games design, industry issues, and more. We began talking to each other on Twitter a couple of years previously, after he started a series of videos for the website, called Low Batteries, where he delivers a spoken essay set to scenes from various video games. Meeting him for the first time in person, I fully expected him to be a hardened, grizzled broadcast veteran with no time for stupid questions. Instead, he couldn’t have been more welcoming or amiable – someone with a genuine, infectious enthusiasm for video games, who could speak with equal ease about mental health. Which makes sense, really, because Low Batteries looks specifically at how the two interact with each other, with episodes covering topics such as how PTSD is portrayed in games, how the tired old trope of psychiatric units being a hotbed of horror is perpetuated in them, or how games are used by many as a coping mechanism for dealing with anxiety and depression.
‘I was going through a really bad period with depression. I’d been low for a couple of weeks and I was just trying to push through it,’ he explains. At the time, the Eurogamer video team were still very much finding their feet. Whereas nowadays they sit down on a Monday, plan out the entire week and have a clear idea what they’re producing each day, a few years earlier, Johnny explains, they were flying by the seat of their pants. ‘It was about 2pm and I’d done nothing,’ he says, ‘So I went for a shower to try and clear my head a bit, and I thought to myself “if I’m feeling so shitty and it’s all I can focus on, I might as well talk about it”.’ He came up with the name of the show in the shower, and the script just started falling out of him. The first episode was all done in one draft; it was quickly recorded and edited, and launched online. ‘I didn’t tell anyone I was doing it. Luckily, the feedback was just lovely. It really resonated with people.’
Talking about mental health online – never mind talking about your own mental health – has always been a risky business. It’s obviously a sensitive issue and, perhaps because of that, the chances are good that there will always be someone ready to pick a fight with you online, which can hurt even more if you end up getting trolled. So I was surprised to hear that the reception for Low Batteries had apparently been almost universally positive. ‘It was really, really overwhelming. Actually, I think it was characterised by an absence of people telling us to shut the fuck up,’ explains Johnny. For Eurogamer’s videos, it’s not always this way. ‘We could be doing the most innocuous coverage and people will tell us we’re awful and that we should die. But [for Low Batteries] there’s just been a complete lack of people coming out the woodwork to tell me I’m a dick. Which is refreshing.’
Perhaps one of the reasons that Low Batteries has been so well-received is that it bucks the trend, as far as online video or written content that tries to cross the mental health/video game divide goes. A lot of articles consist of experiential accounts about, for example, how playing Doom helped the writer with their depression, or how four weeks of Skyrim got them through a particularly difficult point in their lives. Not that I think there’s anything wrong with this, but individual accounts dwelling on a specific moment in time can only go so far in helping others. Low Batteries instead tries to take the long view. Rather than a reactive, retrospective account, it is proactive – each video provides a starting point for a discussion that can be carried on in the comments section, almost like an open forum. What makes Low Batteries special is that it provides a place where people who share an immediate common interest (that is, video games) can get together and discuss the sort of mental health issues that are both featured in the videos, and which they themselves might be affected by (and a lot of other people besides). It’s a nuance that often gets lost in the public narrative about video games, which can end up devolving into an argument about whether or not it’s the games that are causing the mental health issues in the first place. Games can help people to process grief, stress – anything that you’re going through. And yes, sometimes, they can become an all-consuming obsession – although the scientific evidence behind whether, and to what extent, we can become addicted to video games really isn’t all that convincing. I’ll talk about this in more detail later, but for the present purposes, the point is this: the discussion about video games shouldn’t be black or white, all or none. It shouldn’t be a debate where the only two positions we’re able to take are either that they’re perfectly fine and don’t have any effect on us in any way, or that they’re literally melting our brains. There is a vast grey area in between these two positions, and that’s where the true effects of games lie. In embracing that idea, it’s worth considering both the positive and negative experiences that games can afford us. Because in the end, games are imperfect things, made by (and for) imperfect beings. They are able to mirror and amplify both our foibles and virtues in ways that no other entertainment medium can possibly hope to emulate.
In World of Warcraft, in an area called Mulgore, there is a small village inhabited by ‘Tauren’, which are humanoid, cow-like characters. The village is surrounded by a grassy open plain, bordered on three sides by a narrow river. Sometimes, when the sun is setting over the hills in the distance, it’s pleasant just to sit there and watch the world go by. A bridge spans the water, and as you enter the village, you might be greeted by an old rancher called Ahab Wheathoof, pinning a notice to the totemic archway that signals dry land. If you talk to the character, you’ll be greeted with a simple quest – help him to find his beloved dog, Kyle. It doesn’t take long to complete, and it isn’t particularly taxing. When you finish it, nothing momentous happens. Ahab thanks you; you receive a token reward and carry on your way.
There’s more to that story than a simple quest, though. Ahab, along with the mission to find his lost dog, was designed by an eleven-year-old boy called Ezra Phoenix Chatterton. Ezra was an avid World of Warcraft player who had the chance to visit the game developer Blizzard Entertainment’s offices through the Make-A-Wish Foundation in 2007. During the visit, he recorded voiceover material for Ahab, and Kyle the missing pup was named after Ezra’s own dog. He also got to design a new crossbow for the game, and his character became the world’s first rider of a unique phoenix mount – a fitting touch, given his middle name.
Ezra died from a form of brain cancer in October 2008. After he had gone, droves of players made a pilgrimage to Mulgore to complete the quest that he’d designed. It was a simple homage, a way of dealing with a death that didn’t make any sense. Afterwards, Blizzard renamed another character in the game after him – Ezra Wheathoof – a timeless memorial for players to find, talk to, and reflect on. We might have a finite amount of time on this earth, but video games allow us to live multiple lives in a countless number of ways. A decade after he died, this virtual backwater still contains an echo of the things that Ezra had thought about and loved, and that mattered to him. And in some ways our digital lives grant us a certain type of immortality – we might go, but the characters we play never really die. At least, not until the servers are switched off. Even then they remain latent, digital possibilities.
As in the everyday world, death in video games is something multifaceted, complex and uncertain. In the most basic, mechanistic sense, it serves as a learning tool; when our characters die, it’s because we did something wrong. But we can fix that – we get another chance. There are no second chances in reality but, like games, perhaps death still acts as an opportunity for us to learn something about ourselves. Maybe death gives us a chance to find out who we really are. After all, the way we act in the face of oblivion is something that we can’t prepare for; it’s something that can only be experienced in that fleeting, indelible moment.
When the doctor closed the door to leave me alone with my dad, I didn’t know what to say. I felt embarrassed, of all things. Like I was in some ridiculous TV show, and that if I dared to say anything out loud to this wonderful person, comatose in front of me, the walls would give way to reveal an audience in the throes of laughter: What an idiot! He actually said something! So I sat there, for a while, silently holding his hand. Eventually, I managed to croak out a few words. I told him I loved him, and that I didn’t want him to go. He didn’t respond. At some point later – seconds? Minutes? – I wheeled myself out of the room and into the empty corridor, sobbing silently. I never saw him again.
It’s a memory that plays over and over in my mind, especially on the anniversary of his death. Time doesn’t dampen it; it just seems to bring all of these feelings more acutely into focus. So sometimes, I play games to try to forget. It doesn’t always work, though. A couple of hours after finishing the dungeon with Dave, I returned to the Storm Peaks, half-searching for the Time-Lost Proto Drake again. All those years later, I’d had more than enough time to think about what I should have actually said to my dad that night. All those years later, and I still had no understanding as to why I had felt so embarrassed. As I sat there, staring at the screen, I started to develop an increasingly grandiose script in my mind – all the things I should have said, and the way I should have said them to actually mean something. And maybe, if I’d said the right thing …
I thought I caught a flash of yellow flapping out in the distance: That’s the drake. Leaning forward in my chair, I forced my character on to get a closer look, forgetting the precarious position that he was in on the mountaintop. He dropped off the ledge, flailing, unable to stop me from directing him towards a rocky death. The screen turned to shades of grey as I respawned in a nearby graveyard. Fuck. Pulled out of the increasingly hysterical re-run of that night in my mind, I revived my hapless digital self, logged out, and tried to get some sleep.
You never truly die in a video game, you know. You always get a second chance.
* ‘Bosses’ in video games are usually primary antagonists or significant enemies that the character has to overcome, most often at the end of a particular level or section. They tend to be much more powerful than your average computer-controlled enemy, and often require the player to use a specific strategy to overcome them.
CHAPTER 2
When I first booked an appointment to rifle through the Science Museum’s collection of video games memorabilia, I fully expected to be deposited in a dusty old attic somewhere in South Kensington where I could get lost among ancient cardboard boxes and leather trunks. And while I’m sure that every museum must have an area like this, I instead find myself in the Dana Research Centre and Library, a state-of-the-art home for viewing the museum’s archival collections. Clean, quiet and uncluttered, the library is exactly what you would expect a place for examining rare and expensive scientific artefacts to look like. A window running the full left-hand side of the library looks out onto a grassy courtyard. Row upon row of wood-on-black-metal shelving houses decades of scientific ephemera. A student desperately clatters away at his keyboard on a communal desk, surrounded by dog-eared notebooks. I sit down at a workstation and glance over at the researcher opposite me. He’s studying a rather old-looking, leather-bound handwritten book. To save it from being handled too much, it is nestled in a miniature desktop beanbag. It is, I suspect, the comfiest book in London. The reason for my being here, though, sits rather more unceremoniously in a beige envelope on the desk in front of me. The museum curator carefully takes the document out, places it on top of the envelope, and quietly leaves me to my own devices for the next couple of hours.
The document is a pale blue handbook that, despite the odd grubby-looking fingerprint on the spine and leading edge, is in remarkably good condition for paper that is nearly 70 years old. Emblazoned on the top half of the cover are the words FASTER THAN THOUGHT, followed by the title: THE FERRANTI NIMROD DIGITAL COMPUTER. On the bottom half, the price is displayed – one shilling and sixpence – along with the logo for the 1951 Festival of Britain. Nimrod is thought to be one of the earliest-known prototypes of a computer that could play games, and this document is all that remains of it. Built by the British electrical engineering firm Ferranti, Nimrod was designed to allow attendees of the Festival (a national exhibition that was an attempt to instil Britain with a sense of revitalisation and progress in the aftermath of the Second World War) a chance to play the mathematical game ‘Nim’, against a rudimentary artificial intelligence. Nim is an ancient strategy game and one, I think it is fair to say, that’s fairly tedious. In the standard, real-world version of the game, two players start with a number of distinct heaps of objects – usually matches. On each turn, a player must remove a number of matches (it can be any number, but has to be at least one) from a single heap, and the aim of the game is to avoid being the player who has to take the last match. Nim is a game of some mathematical importance, but as an experience, well … Let’s just say that I don’t think it has much in terms of replayability value.
Nimrod used a bank of lights and a set of 32 buttons to represent the matches from the original game. Each button would switch an adjacent light off (the equivalent of removing a match from a heap), and the display was arranged into four evenly distributed rows, each corresponding to a heap of matches. In this version of the game, each ‘heap’ could contain a maximum of seven lights, with the eighth button giving players the option to remove the entire row. At the start of each game, a player (or demonstrator) first specified how many lights there were in each starting row by selecting between one and seven lights to be switched on. Once the display was set up, a player could then either battle a computer-controlled opponent, or watch as Nimrod simulated a game between two players. The computer was basically an advertisement: its entire purpose was to demonstrate the principles and possibilities that so-called ‘automatic digital computers’ could afford.
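The handbook’s promise of a ‘mathematically correct’ way to play rests on a classical result: the winning move in Nim can be worked out from the XOR – the so-called nim-sum – of the heap sizes. As a purely illustrative aside (the function below is my own minimal sketch in Python, not anything taken from the handbook or from Nimrod’s circuitry, and its name and interface are invented for the example), here is that strategy for the misère form of the game described above, in which whoever takes the last match loses:

from functools import reduce

def misere_nim_move(heaps):
    """Suggest a winning move in misere Nim (whoever takes the last match
    loses) as a pair (heap_index, matches_to_remove), or None if the
    position is already lost against perfect play."""
    big_heaps = [i for i, h in enumerate(heaps) if h > 1]

    if len(big_heaps) >= 2:
        # General case: identical to ordinary Nim -- move so that the XOR
        # ('nim-sum') of all the heap sizes becomes zero.
        nim_sum = reduce(lambda a, b: a ^ b, heaps, 0)
        if nim_sum == 0:
            return None
        for i, h in enumerate(heaps):
            if (h ^ nim_sum) < h:
                return (i, h - (h ^ nim_sum))

    if len(big_heaps) == 1:
        # Endgame with one big heap left: shrink it to 0 or 1 matches so
        # the opponent faces an odd number of single-match heaps.
        i = big_heaps[0]
        ones = sum(1 for h in heaps if h == 1)
        leave = 1 if ones % 2 == 0 else 0
        return (i, heaps[i] - leave)

    # Only single-match heaps remain: winnable only if there is an even
    # number of them.
    singles = [i for i, h in enumerate(heaps) if h == 1]
    if singles and len(singles) % 2 == 0:
        return (singles[0], 1)
    return None

# Example: with heaps of 3, 4 and 5 matches the suggested move is to take
# 2 matches from the first heap, leaving 1, 4 and 5 (a nim-sum of zero).
print(misere_nim_move([3, 4, 5]))   # -> (0, 2)

Nimrod, of course, arrived at the equivalent calculation in hardware rather than software; the sketch is simply a convenient way of seeing how little computation the ‘correct’ strategy actually demands.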
The machine itself was huge: twelve feet wide by five feet tall, with a depth of nine feet. As I read about it, my mind wanders a little and I try to imagine what it might have been like to be in the presence of something so new and alien. The accompanying handbook details how it worked and lays out the mathematical principles behind Nim, and it’s an interesting example of early written science communication for the public. It does get a little dense in places: the latter half is devoted to explaining the mathematical principles behind the algorithms used by Nimrod, as well as the mathematically correct way to play the game in order to win. Nevertheless, while the aim of Nimrod was to showcase the potential power of computing to an audience who could not yet conceive of the reach that such machines would eventually have, the computer also did something else: it demonstrated that even at these early beginnings, science and video games enjoyed a certain kind of symbiosis, with each growing from the other. The first computer games – depending on your definition of what that means – were not created by commercial games studios. Instead they can, in part, trace their roots back to the computational advances that emerged during and just after the Second World War. Tucked away in the middle of the handbook is a brief section on why, when Nimrod was created, the engineers involved decided to make it play a simple game:
‘It may appear that, in trying to make machines play games, we are wasting our time. This is not true, as the theory of games is extremely complex and a machine that can play a complex game can also be programmed to carry out very complex practical problems. It is interesting to note for example that the computation necessary to play Nim is very similar to that required to examine the economies of a country in which neither a state of monopoly nor of free trade exists.’
Nimrod’s creators, it would seem, were taking great pains to stave off the view that they were developing something frivolous or inconsequential. Yes, at face value this was a simple game. But game theory – which, in a basic sense, involves using mathematical models to describe and predict how decision-making agents interact with each other – has important applications in fields such as economics, political science and biology. By implementing this type of mathematics (albeit in a rudimentary form), the broader aim of Nimrod was to show that the computational processes that powered the machine had practical value beyond its being a mere plaything. Nimrod was not the first iteration of a machine that could play a game; nor was it even the first iteration of a machine that could play Nim – although it was likely the first computerised version. A patent had been filed for an earlier, electromechanical ‘Machine To Play Game Of Nim’ on 26 April 1940 by a nuclear physicist, Edward Condon, and his colleagues Gerald Tawney and Willard Derr of the Westinghouse Electric Company. That machine was demonstrated to crowds at the New York World’s Fair, which ran from April 1939 to October 1940. Throughout the 1940s, the first digital computers began to be developed, but the evolution of computers that could play games remained largely stagnant – other, darker things were preoccupying the idle hands of men. As part of an unpublished manifesto on artificial intelligence written in 1948, Alan Turing mused that games such as bridge, chess, poker and noughts and crosses would be ideal ‘branches of thought for the machine to exercise its powers in’. He would go on to develop a chess simulation called Turochamp, but this was never actually executed on a computer – instead Turing would resort to demonstrating in lectures how the programme would operate using pen and paper.
There are perhaps two other significant milestones that were reached in the history of video games in the late 1940s. The first was known as the ‘cathode-ray tube amusement device’ (CRTAD), and while not the most inspiring name for a gaming machine, it was arguably the first that incorporated the use of a real-time moving display to simulate a game. The device was patented on 14 December 1948 by Thomas Goldsmith Jr and Estle Ray Mann of Du Mont Laboratories, and where Nimrod can best be thought of as a computer game without a proper visual display, the CRTAD was really a game display without a computer – players could mechanically control a dot simulating an artillery shell to ‘attack’ targets represented by a piece of transparent plastic overlaid on the screen. The second milestone came in the form of a machine called Bertie the Brain, which debuted at the Canadian National Exhibition at the turn of the next decade, in August 1950. Built by the Canadian engineer Josef Kates of the company then known as Rogers Majestic, Bertie the Brain was a huge computer that allowed people to play a game of noughts and crosses on a static light display. While it didn’t incorporate any of the types of moving visual units that would probably allow it to be defined as a true video game, it nevertheless provided the first recorded implementation of a computerised game, the sort that Turing and others had only previously theorised about.
This is an admittedly disjointed history of the early days of computerised games, but it shows that it’s quite difficult to home in on an exact starting point. All of the machines I’ve talked about so far actualised a game in some way, but to call them ‘video games’ seems a bit of a stretch of the terminology and imagination. All of these devices were produced, by and large, in isolation, for a specific (almost single-use) purpose, and they probably didn’t have much influence on each other. Moreover, it is likely that the true ‘last universal common ancestor’, to borrow a phrase from evolutionary biology, was developed quietly in an unassuming university laboratory somewhere, discarded out of hand, and ultimately lost to the ravages of time. Nevertheless, the concepts behind these early machines would each, in their own little way, eventually pave the way towards what we would consider to be modern video games consoles.
After a couple of hours in the company of the Nimrod handbook, I make my way over to the Science Museum itself, dropping in through a back entrance. It’s just after Easter, and the place is overrun with hordes of children, either on school trips, or under the watchful eye of parents looking for a way to spend the last few exhausting days remaining of the spring holiday. I head up to the second floor, push open a giant set of doors and descend into a scene of organised chaos. For the past three years, the Science Museum has hosted Power Up, a temporary exhibit of the last half-century of video games. Sprawled across a giant hallway is pretty much every popular games console that you can think of. Towards the centre of the room are sixteen Xboxes arranged in a circle and a group of children huddle around as a spectacular Halo 3 multiplayer battle gets under way. A sign next to the area states that the game is recommended for children aged sixteen and over; none of these children meet that criterion, but all are apparently fine, all are having a brilliant time. Computers running Minecraft sit happily alongside a parent and child playing the original Secret of Monkey Island. Along the wall to my right is a living chronological timeline of consoles. It starts with a 1977 Binatone TV Master playing a cloned version of Pong, alongside a 1978 Atari VCS running a rather disjointed-looking imitation of Pac-Man. At the far end is the latest in gaming rigs – an Xbox One running Burnout: Paradise. As I wander along the line through the digital ages, I pause. A volunteer is patiently explaining to a small child how to load a game up on a 1982 ZX Spectrum, a console I had fleeting possession of in my younger years. I look on as the old, familiar multicoloured loading frame dances around a blank rectangle on the screen, while information is slowly drawn from the cassette tape. After what feels like an eternity, Chequered Flag by Psion Software – one of the first racing car simulators – finally appears, much to the disappointment of most of the onlookers. While it was a brilliant game of its time, the graphics now look horribly dated, with an uninviting dashboard showing a complex array of gauges and dials taking up the bottom half of the screen. Deep in the room, amid a sea of Mega Drives, Super Nintendos, BBC Micros and PlayStations, there’s a VR area where people are getting the chance to play the latest version of Battlezone, the original of which is often considered to be one of the first virtual reality games (and one that I will come back to later).
I make my way back to the start of the timeline, sit down at the Binatone TV Master and try my hand at the version of Pong on offer. The on-screen paddle is controlled by a small orange box wired up to the main console, the joystick loosened by decades’ worth of use. Moving the stick even slightly causes the paddle to dance frantically across the screen, making anything that resembles fine control of the game a distant possibility. It is an infuriating experience. But something else doesn’t sit right with the prominence that the console has been given in this chronology. The TV Master originally launched in 1976, and it was by no means the first mass-produced games console to penetrate the home market – that honour belongs to the 1972 Magnavox Odyssey, conspicuous by its absence here. After a couple of attempts at the game, I give up, idly musing about why this console – one that I hadn’t really come across before – was placed first in line, in this version of the history of video games. Despite this frustration, I left the Power Up exhibition with a sense of optimism about how games could bring people of all ages together, and a desire to see the original 1972 Odyssey in the flesh.
About a week or so after my trip to the Science Museum, I find myself in the middle of Nottingham’s bustling city centre. The National Videogame Arcade* blends in so well with the rest of the shops and cafes in the centre that I almost walk straight past it. Arcades, for me at least, are fortresses of neon and noise, defiantly standing against the onslaught of decay in crumbling British seaside resorts. In other times and other places, they are something different – the mega-arcades of Japan and South Korea are worlds unto their own; digital theme parks built within acres of concrete and colour. But the National Videogame Arcade, or NVA, is something different. The outward-facing facade is quietly serious, almost museum-like in appearance. This is not accidental.
As I walk in, I immediately get a warm, welcoming feeling in the pit of my stomach. It takes me a few moments to consciously realise why. Tucked in next to the entrance is a lone, matte-black arcade cabinet, playing the familiar bit-tone wakka-wakka sounds of Pac-Man. Billed as ‘the first permanent cultural centre for video games’ in the UK, the NVA is an idea that arose out of the GameCity Festival, which has been held annually in October since 2006. The NVA itself consists of four floors of interactive exhibits and video game displays, as well as hosting coding workshops, gaming competitions and tournaments, and talks throughout each month. The second floor is a maze of rooms housing every sort of game you can think of – from graphical adventures like 1983’s The Dark Crystal through to Rock Band and one of my own childhood favourites, the Teenage Mutant Ninja Turtles arcade game. I have very happy memories of spending endless summers playing it on the north coast of Wales, in an unassuming little village near the surfing centre of Abersoch. My grandparents had an old holiday home there that I would visit with my parents for a few days at a time, but the slow crawl of youthful days felt like months. When the sun started to set on our days at the beach, we would retreat back to a tired little pub at the top of a nearby hill. In the beer garden was a miniature arcade – nothing more spectacular than a small, grey Portakabin, but one crammed full of the best arcade machines of the time. As my parents sat with a drink, I would alternate between playing a few rounds of Turtles or Final Fight and messing around in a makeshift adventure play area that would likely be a modern-day health and safety nightmare. Returning to things that you once loved in your childhood can be a precarious business – happy memories of games you played when you were young can often be tarnished when viewed again through the jaded eyes of an adult. Happily, this does not happen with Turtles. As I stand there playing it in the NVA, I still get the familiar excitement that I did when I was seven years old. I don’t remember having seen the game through to completion at the time, which strikes me as odd now, since the entire experience only takes about twenty minutes from start to finish – although it probably helps that it’s free to play in the NVA, and that this time I can burn through about 50 lives in the process. As I finish off the last boss, the leaderboard appears on the screen, and with a secret sense of pride – my score is top of the rankings – I enter my initials. I check my watch. I still have a bit of free time. I start playing again, but this time I don’t pay so much attention to the game. The background noise of the arcade behind gives way and I become enshrouded in memories, like I’ve been transported to that Portakabin again, from all those years ago. For a second, I imagine hearing my dad calling for me to come and join everyone for a spot of dinner. Playing that game, feeling the joystick and the buttons under my hands, is a visceral experience – much like when a long-forgotten aroma hits your nose and triggers the memory of your first love, or your favourite holiday. Pulling myself away from my thoughts, and the game, I check my watch and head up to the third floor to find the real reason I’m here today.
In a glass case in a quiet room upstairs at the NVA, above the revelry of the arcade, sits an original, pristine Magnavox Odyssey. It was developed in the 1970s by the inventor Ralph Baer, one of many people who lay claim to the accolade: ‘father of video games’ (and perhaps the one who has the most right to the title). Born in Germany in 1922, Baer left the country just before the onset of the Second World War to join the United States’ war effort. Later on, he would join the defence contractor Sanders Associates as an electronic engineer, where he remained from 1956 until 1987. It was during this time that he started to develop the idea of a home games console, coming up with a series of prototypes in the late 1960s that would culminate in the first mass-market console in May 1972.
Baer was meticulous in his design efforts. In his 2005 book Videogames: In the Beginning, he outlines the series of incremental experiments that he conducted in the late 1960s, complete with handwritten schematics, which build up a picture of how the Odyssey was created. The first experiment had the simple (!) goal of developing a system to display a vertical line on-screen that could be manually controlled to move both horizontally and vertically. Progressive experiments added the ability for two players to control on-screen objects at the same time, right up to more complex paddle-and-ball games, colour displays, and even a schematic for a rudimentary light-gun game (‘Objective: Point gun at spot displayed on TV set and develop logic-level output if trigger is pulled at same moment’).† Admittedly, the final, commercial console – the Odyssey – appeared to roll back some of these more ambitious elements to deliver a relatively simple gaming experience (although the ‘shooting gallery’ game did come bundled with a light-gun peripheral controller, the first of its kind). The Odyssey couldn’t display colour or transmit sound, and had only enough processing power to generate four objects on the television screen: a vertical line that could denote a boundary to split the screen into two halves, and three square dots. Two of the dots could be manipulated by brick-like wired controllers, while the third was controlled by the machine. No power was left over to be able to generate the basic quality-of-life information that seems to be second-nature in games nowadays. There was no score counter or countdown timer, nor was there anything in the way of background graphics. Instead, the Odyssey shipped with dice and pencil and paper for keeping score and deciding who would go first, as well as a series of plastic overlays. These were colour transparencies that could be attached to a television screen in order to turn the scene into, say, a hockey pitch or a football field. It was almost as if the Odyssey served as a bridge between more traditional tabletop board games, and a new emerging breed of digital games consoles.
While seemingly simplistic, Baer’s creation was revolutionary. It marked the first time that people could actually control the action on a television screen, compared to the more passive experience of watching a broadcast programme. I stare at the Odyssey, sitting almost majestically in its glass trophy cabinet. This is a masterpiece, the very reason that we have an Xbox, PlayStation or Nintendo sitting in our homes right now. And yet, I can’t help but feel as though I haven’t quite found the first video game machine yet. Clearly it took Baer years to develop, but that still leaves a substantial period of time between Nimrod (in the early 1950s) and the first home games system (in the early 1970s). In order to trace what happened in those intervening years, it’s necessary to look back again at the symbiotic relationship between video games and science.
After the Festival of Britain in 1951, Nimrod was transported to the Berlin Industrial Show for the final three weeks of October that same year (although some accounts mention that it appeared at an event of the Society of Engineers in Toronto after Berlin). As with a number of machines of its time, the real purpose of Nimrod was never to function as a gaming device – it was created simply to demonstrate the principles behind computer science. And so the machine was eventually dismantled, and Ferranti moved on to other projects, never to return to the idea. But elsewhere in the UK and US throughout the early 1950s, other scientists were tinkering with the idea of games machines. In 1951 the mathematician and computer scientist Christopher Strachey, then a schoolmaster at Harrow, was introduced to a pilot version of the Automatic Computing Engine (ACE) – a computer based on Alan Turing’s designs – at the National Physical Laboratory. To familiarise himself with the computer he wrote a software programme that could play a game of draughts – thought to be one of the first instances of a computer game to run on a machine that wasn’t dedicated to that specific purpose. After a number of programming issues and problems with the hardware running at maximum capacity, Strachey finalised the game in 1952, at the encouragement of Turing himself. According to the video game historian Alexander Smith, talks given by Strachey inspired other scientists, such as IBM electrical engineer Arthur Samuel, to write similar programmes.
Another significant moment in gaming history came again in 1952, this time at the University of Cambridge. During the process of working towards a PhD in theoretical physics, a student called Alexander Shafto Douglas wrote software to run games of noughts and crosses on the university’s landmark Electronic Delay Storage Automatic Calculator, or EDSAC computer. The programme was notable because it was the first time a computer game had included graphical output to a CRT display. But it wasn’t until 1958 that a more complete first prototype of what could be considered to be a video game was made. That game was called Tennis for Two, and its creator was William Higinbotham.