Tyranny of the Masses

Why developers should be wary of tracking player behavior.

“Big Brother is watching; not in an effort to control you but rather to learn from you. You’re not just playing videogames anymore. You’re actively providing feedback about what parts you like and which you don’t. How you play could ultimately help shape the future of videogame design.”

BioWare is just one of numerous development studios and publishers that have begun collecting anonymous player data. No identifying information is tied to the information harvested, so you don’t have to worry about things being traced back to you. You’re just a data point amongst millions.

-Erik Budvig

Were you worried that Bioware was not being influenced enough by the thousands of reviews, forum posts, emails, and tweets it receives for every single one of its games? Are you looking for a more impersonal way of communicating your gaming experiences with developers? Are you too lazy to write them an email with your comments and complaints?

If you are any one of these unfortunate people, then worry not, my friends, for now you too can HELP SHAPE THE FUTURE OF GAME DESIGN. If, on the other hand, you’re one of those party poopers who still cares about antiquated 20th-century concepts like ‘privacy,’ then you may still rest easy knowing that “you’re just a data point amongst millions.” In other words, Bioware’s data-gathering efforts will somehow manage to give you unprecedented power while simultaneously making you an insignificant statistic. Makes perfect sense (statistically speaking, of course).

As you can probably tell, I’m more than a little baffled by the utopian declarations that have accompanied news of Bioware’s efforts to collect anonymous player data from Mass Effect 2. Gamers already influence the design decisions of mainstream developers  in a variety of ways, so it is simply absurd to imply that player tracking will somehow give “voice” to a previously disenfranchised demographic.

The question we should be asking is whether player tracking is good for Bioware and, consequently, good for people who want to continue playing ‘Bioware games.’ To be sure, I  absolutely get the appeal of collecting tracking data. Indeed, tracking a modest bit.ly link is pretty fun in itself (‘Ooooh look! I gots me 10 new clicks from Malaysia…clearly,  those guys know good Mario fan-art when they see it’), so I can’t even begin to imagine how great it must feel to be able to track one’s audience after years of working on a project as large and complex as ME2. But alas, the road to development hell is paved with good intentions, and I can’t help but worry that Bioware’s understandable desire to quantify player experiences might eventually backfire.

To understand the potential dangers of player tracking, we need to ask ourselves at least two questions: First, who gets to interpret the collected data? Second, how will this data influence the decision-making process of the interpreter?

The biggest brother

I don’t think it is too elitist to suggest that, as a general rule, artists do not produce their best work by worrying too much about the public’s (alleged) expectations. It was this overriding concern with fan service that made the completely arbitrary appearance of R2-D2 in The Phantom Menace seem like something other than a terrible idea. This deference to “the audience” is also the reason that Nintendo decided to follow up the Link of Wind Waker with a more “mature,” conventional, and far less interesting version of Link in Twilight Princess.

So even if Budvig’s claim that “Big Brother is watching” referred only to Bioware, that would be reason enough to worry. But alas, Bioware is not the only one watching, and it will not have the final say on how the data should be interpreted. For Bioware depends on a bigger, badder brother: a passive-aggressive patron that has never made a secret of its desire to take your lunch money and leave you in tears. I’m talking about the biggest brother of all, which is to say Bioware’s publisher and parent company, EA.

If you’re reading this, chances are you already know a thing or two about EA’s history. When Trip Hawkins founded the company in 1982, his idea was to create a “different” kind of videogame publisher. His goals were incredibly idealistic for the time: EA would foster creativity, approach videogames as an “artform,” and treat game developers with the ‘respect’ that ‘artists’ deserve. Even the name “Electronic Arts” was a self-conscious attempt to convey these founding principles. As Hawkins explained in a 2007 Gamasutra article about the company:

“The original name had been Amazin’ Software. But I wanted to recognize software as an art form….So, in October of 1982 I called a meeting of our first twelve employees and our outside marketing agency and we brainstormed and decided to change it to Electronic Arts.”

But then Hawkins left the company in 1991, leaving a former Johnson & Johnson executive in charge. This is the point at which Electronic Arts began to transform itself into a “serious” publisher, abandoning its founding principles in the process.

Eric-Jon Rossel Waugh continues the story:

No sooner was Hawkins out the door than the acquisitions (and Madden milking) began.

[...]

The pattern to these acquisitions, if not universal, is infamous: find a company that made a really popular game, acquire the company and its properties; then set the team on churning out sequel after sequel to the game in question. Sometimes, likely not by design, the staff leaves or burns out, or one of the products sells poorly; the studio is closed or subsumed. Of EA’s acquisitions, only Maxis is known for retaining its autonomy and culture within the EA corporate structure, the jewel in EA’s crown.

EA seemed to have abandoned all of its founding principles, adopting an attitude of rapid growth whatever the long-term cost and thereby setting a poor example for the rest of the industry.

And thus, the iconoclastic developer formerly known as Electronic Arts had completed its transformation into a faceless entity known simply as EA. Incidentally, EA was not the only multinational corporation to reduce its formerly descriptive name into an ambiguous acronym. According to Wikipedia, 1991 (the year of Hawkins’s departure) was also the year when Kentucky Fried Chicken changed its name to “KFC.” Conspiracy theorists claimed that the name-change came about because its genetically altered meat could no longer be considered “chicken.” Using that same conspiratorial logic, we may also say that Electronic Arts became “EA” once gamers realized that the company was no longer interested in promoting anything that could reasonably be called “art.”

Of course, my “theory” about EA’s name-change is a complete fabrication and, just like the KFC conspiracy theory, it is false. But that’s entirely beside the point. The real problem for both of these companies is that they put themselves in a position that makes such rumors seem plausible to begin with. If KFC’s chicken still looked unmistakably like chicken, then no one would’ve developed a conspiracy theory about its use of the acronym. Likewise, if EA hadn’t lost its way, perhaps there wouldn’t be a need for anyone to begin an article about the company by explaining “what the word ‘EA’ in ‘EA Games’ stands for.” (Notice how ‘EA’ is described as a “word,” not an acronym.)

(What’s the moral of this somewhat obtuse and certainly gratuitous analogy? Perhaps that you shouldn’t try to use KFC in analogies. They tend to drag on for a bit, as you may or may not have noticed while reading the previous two paragraphs. I certainly learned my lesson, so let’s move on to the present day shall we?)

It must be said that things have gotten better at EA since John Riccitiello became CEO in 2007. A recent profile of the company in the October issue of Edge details much of what has gone right under Riccitiello’s reign. First, he acknowledged that the company had grown too big and was releasing too many titles. He laid off staff and trimmed the release schedule. EA COO John Schappert is quoted saying that “A couple of years ago we shipped 67 titles; this year we’ll ship 36. Our goal is: ‘Let’s make sure the titles we’re going to make are great.’”

Under Riccitiello, EA expanded the EA Partners program and allegedly made efforts to improve the creative environment for the company’s in-house developers. By the end of 2007, EA also bought Bioware and made Bioware co-founder Ray Muzyka a Senior Vice President of EA’s RPG division. The long-term effects of this acquisition remain to be seen, however: it could improve the quality of EA’s overall RPG output, but it could just as easily result in a less focused and creative environment for Bioware’s own designers. Still, from EA’s point of view, this was a good step towards creating a more developer-friendly environment within the company (at least as far as RPGs go).

But in spite of these efforts, EA still has a lot to prove to gamers if it wants to become a respectable publisher once more. And make no mistake: “respectability” is the most that a publisher its size can ever hope to achieve. A publicly traded publisher like EA cannot hope to be “loved” or admired by gamers or developers. Such adulation is reserved for studios (and the occasional first-party developer, e.g. Nintendo). This is because gamers recognize that publishers are not in the business of creating games; they’re in the business of making money. As such, their loyalties ultimately reside with shareholders, not gamers. Sure, EA wants gamers to be happy–and it spends quite a bit of money trying to figure out what gamers want–but gamers are simply a means to achieving the return-on-investment that shareholders expect from the company. Riccitiello, after all, was not brought to EA in order to rekindle its creative spirit, but rather to sell more games and help the company regain its once-dominant position in the industry (it was putting out mediocre titles long before its sales began to flounder, that’s for sure).

To that end, Riccitiello has sought to diversify the company even as he tries to improve the quality of its “core” games. EA has made serious inroads into the casual games market and continues to experiment with “free-to-play” titles like Battlefield Heroes. These initiatives may very well be necessary for EA to keep pace with changes in the industry brought on by the internet, but they also serve EA’s long-term financial interests for slightly different reasons: namely, they will make the company less dependent on the sort of “core” gamer who takes games seriously, pays attention to reviews, and complains loudly when a game fails to meet her expectations. By targeting the lowest common denominator, EA (and most other major publishers) is building a mass audience that doesn’t know much about videogames, doesn’t read game reviews and, most importantly, doesn’t expect games to be more than a 10-minute distraction to help pass the time at airports and doctor’s offices. Unlike traditional gamers, these people are not asking for a five-course meal, and they certainly won’t get critical if you overcook the meat. All they want is a bite-sized piece of digital chocolate to get them through the day, and that’s a much easier business to manage.

Allow me to illustrate this point with one last quote from Edge’s recent profile of EA:

Battlefield Heroes attracted mediocre reviews and was heavily criticised earlier this year when the payment model was changed, making it almost impossible to progress through the game without paying for new weapons. “The perception was:  ‘Oh, EA has fucked this up, we’re never going to play again,’” says Patrick Soderlund, SVP and group general manager of EA Games’ FPS and driving titles. “But funnily enough, when we changed the way you pay, we had more players, and the game is now profitable.”

Funnily enough indeed. Hilarious, in fact. But like many great jokes, there is a sad truth beneath the laughter. The truth is that EA did fuck up. It released a pretty bad game, then made it less of a game by charging real-world money for the privilege of getting better at it. Somehow, this change for the worse attracted an audience and now the game is profitable, which is all EA cared about to begin with. Hence the dark, pathetic laughter over at EA. Note to Battlefield Heroes players: EA is not laughing with you; it’s laughing at you.

Not that there is anything intrinsically wrong with this attitude. After all, it’s their job, indeed their ‘responsibility,’ to make a profit for shareholders, and to do it as efficiently and painlessly as possible. And like any cunning politician, EA knows that the best way to achieve this is to alienate as few people as possible. This is the sort of corporate attitude that led to EA’s recent decision to eliminate playable Taliban characters in Medal of Honor (a cowardly move that is problematic for several reasons, which Ian Bogost already analyzed quite brilliantly in this essay). As a general rule, then, this strategy requires videogame publishers to pander to the lowest common denominator while simultaneously pretending to care about “taking the medium to another level,” in order to make the game seem less threatening to casual gamers without alienating the traditional World War II ‘modern warfare’ audience.

Again, my point here is not to single out EA or vilify videogame publishers in general. I simply wish to note the basic fact (obvious to everyone in the music or film industries, but surprisingly absent from many gaming discussions) that videogame publishers are fundamentally different from both videogame developers and serious videogamers. Whereas we believe that a videogame is an end in itself, publishers use it as a means to profitability. This makes them a completely different animal: their priorities are different, their expectations of games are different, and their outlook on “success” will also be different most of the time. Not evil, just different in ways that often run counter to the interests of the medium.

This brings us, finally, to the issue of player tracking. As Bioware’s publisher and parent company, EA’s interpretation of the data collected for Mass Effect 2 is ultimately the only one that matters. So how will it read this data, and to what end? How will its interpretation differ from Bioware’s?


‘We deal with numbers’

Did you know that the videogame industry already has a system in place that mathematically determines the quality of every new release? It’s called “Metacritic,” perhaps you’ve heard of it? Like player tracking, Metacritic reduces incredibly complex subjective experiences into numerical values. But unlike player tracking, Metacritic does not derive its numbers from isolated in-game “events.” In fact, Metacritic scores are not really “derived” from anywhere – they are assigned by the site’s staff, whose  impressive qualifications include reading “a lot of reviews.”

You know how it works: the site monitors a wide variety of gaming publications (including sources as diverse as the New York Times, Eurogamer Italy, Eurogamer Spain, and Eurogamer Plain), reads their reviews for you, and then provides you with a convenient numerical summary of what each reviewer thought, on a scale of 0 to 100. It doesn’t matter what scale you grade games on, or even whether you grade them at all: as long as you call it a ‘review,’ Metacritic will attempt to assign a numerical value to it. Yes, Metacritic will literally convert another publication’s words into a 100-point scale, even if that publication makes a conscious decision to exclude grades from its reviews. It then takes a “weighted” average of these scores in order to produce a “metascore,” a number that purports to reflect the overall ‘quality’ of a work without resorting to silly “qualitative concepts like art and emotion.”
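
(For the numerically inclined, here is how trivial that arithmetic is. The little Python sketch below converts each review score to a 0–100 value and then takes a weighted average; the conversion rule and the per-outlet weights are invented for the example, since Metacritic keeps its real methodology secret.)

    # A toy version of the "weighted average" described above. The conversion
    # rule and the per-outlet weights are made up for illustration; Metacritic's
    # actual methodology is not public.

    def to_percent(score, scale_max):
        """Convert a review score to a 0-100 value."""
        return 100.0 * score / scale_max

    def metascore(reviews):
        """reviews: a list of (score, scale_max, weight) tuples."""
        total_weight = sum(weight for _, _, weight in reviews)
        weighted_sum = sum(to_percent(score, scale_max) * weight
                           for score, scale_max, weight in reviews)
        return round(weighted_sum / total_weight)

    # Three hypothetical reviews: 9/10, 4/5, and 70/100, with the second
    # outlet counted as twice as "respected" as the others.
    print(metascore([(9, 10, 1.0), (4, 5, 2.0), (70, 100, 1.0)]))  # prints 80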

“If only all of life were like that!” says the Metacritic website, surely echoing the sentiments of socially awkward statisticians the world over. Luckily, life isn’t ‘like that’ – it’s just too rich and complex to be reduced to a number. The same goes for videogames – but try telling that to publishers like EA, who now rely on the site to evaluate the output of their development teams. As a developer told Michael Abbott recently, a particularly low Metacritic score means “people lose their jobs.”

The problems with using Metacritic as some kind of arbiter between publishers and developers are well-known by now, so I won’t bore you with more details. The bigger and more complex question is the one first posed by Stephen Totilo in 2008: namely, “why would a development studio ever tolerate publishers setting up deals like that?” Why indeed. Why would a developer ever agree to risk their very livelihood on Metacritic’s subjective impression of a ‘critical consensus’ which, by definition, consists of nothing more than an aggregate of the various individual subjectivities monitored on the site?

While it is true that most smaller studios have no choice in the matter, part of the blame must be placed on the development community itself for acquiescing to such a draconian system in the first place. Indeed, many developers direct their anger at individual critics (for depressing their metascore) while remaining deeply ambivalent towards the system that gave those critics such power. Others, like designer Soren Johnson (incidentally, a very talented guy who writes the excellent Designer Notes blog), seem to regard Metacritic as a necessary evil of sorts:

What should executives do if they want to objectively raise the quality bar at their companies? They certainly don’t have enough time to play and judge their games for themselves. Even if they did, they would invariably overvalue their own tastes and opinions. Should they instead rely on their own internal play-testers? Trust the word of the developers? Simply listen to the market? I’ve been in the industry for ten years now, and when I started, the only objective measuring stick we had for “quality” was sales. Is that really what we want to return to?

I’ve argued before that it is impossible to objectively determine (much less raise) the quality of a product that can only be experienced subjectively. Even Johnson seems to recognize this when he notes that publishers who play their own games would “invariably overvalue their own tastes and opinions.” Well, duh: of course our own tastes and opinions will be central to any activity that requires us to taste and opine – that’s just common sense. Still, why exactly would this be a problem? Publishers are the ones financing the product after all; why, then, shouldn’t it reflect their taste? By Johnson’s logic, we should also worry about a restaurant owner who hires a chef “just because” he enjoys the person’s cooking.

(His last point – that sales are not a good way to measure quality – is more compelling, and I’m certainly not one to argue against it. But sales do have one thing going for them: actual objectivity. True, sales may not be a reliable measure of a game’s quality, but at least they’re an objective measure of something, which is more than we can say for metascores.)


I don’t want to keep harping on about the role that developers have played in allowing such a flawed system to be used against them because, as I mentioned earlier, most simply don’t have a choice. If forced to choose between placing their royalties at the mercy of Metacritic and not making the game at all, most studios will understandably go with the former option. In this respect, they’re like the proverbial starving musician who signs a record deal without reading the contract only to discover, years later, that she was screwed by the label.

So perhaps we should have addressed Totilo’s question to publishers instead. Why do publishers rely so much on Metacritic anyway? What is it about the site that publishers find so attractive and useful when dealing with studios? Here, Johnson gives us a hint when he notes that publishers “don’t have enough time to play and judge their games for themselves.” But it’s not just that they don’t have the time, it’s that they don’t have the skill or the know-how to play and pass judgment on games. See, executives are numbers people. They like and respect numbers, and have very little patience for the ambiguities of art, language, and criticism (this is also why many business leaders are so contemptuous of “fancy political rhetoric”).

Thus, Metacritic seems like an ideal solution to many publishers: it reduces numerous qualitative opinions into a single number and, since that number is determined by an outside party with its own (secretive) methodology, it carries an illusion of objectivity. But don’t take my word for it; here’s game industry marketer Bruce Everiss (italics mine):

I have used [Game Rankings] countless times as a tool to help in my work. Most notably to prove to the directors of Codemasters that their game quality was slipping in comparison to their direct competitors.

Then in 2001 Metacritic came along and changed the world. Firstly they convert all the review scores into percentages, then they average them to come up with one figure. (They also weight the average so more respected reviewers have more influence.) This single figure to represent a game is a very powerful thing and everybody in the industry is far more aware now of game Metacritics than they ever were of individual review scores, they have become the standard benchmark for the industry.

Of course, the only way that a site like Game Rankings or Metacritic can actually “prove” anything is if we buy into their methodologies, and since Metacritic keeps its methodology (including its “weight” system) a secret, it seems odd that Everiss would use those numbers to prove anything. But we’ve already been over that. The point is that those numbers are being used as if they were the final word on a game’s quality, even though their reliability is questionable at best. In short, Metacritic scores have empowered publishers to make decisions over “quality” in spite of knowing that they lack the gaming knowledge to do so, and they do this by drawing specific conclusions from numbers that at best provide us with nothing more than one person’s impressionistic assessment of a critical “consensus.”

Seeing as most developers actually take the time to read individual reviews of their games–and are therefore better suited to put Metacritic scores in their proper context–it is hard to see how this publisher-dominated metascore system benefits anyone other than the publishers themselves. At the very least, it has empowered publishers by giving them a greater say over areas of development that typically belong to developers.

My fear is that player tracking will eventually create the same situation: statistics that were meant to aid developers when starting work on their next game may be taken out of context by publishers, who will then use these numbers as “proof” of what audiences want from future games. The result will be even less risk-taking than we currently see in the games industry. The era of focus-group-tested games will slowly give way to an age of mathematically tailored experiences, targeting the lowest common denominator with unprecedented precision.

Personally, that’s not a future I want to see – but then again, I’m not a numbers guy.

GamePub Story

Let’s pretend for a second that I am an EA executive. It’s my job to supervise an external development team that is currently working on a Mass Effect spin-off project scheduled to be released before Bioware finishes work on Mass Effect 3. Being a responsible executive, I’ve decided to do some research before my next formal meeting with the studio in charge of the spin-off. Of course, this research doesn’t involve me actually playing the game–hell no. By research I mean reading up on Metacritic scores, development schedules, sales numbers, and – you guessed it – Mass Effect 2 player data.


Some of the data really startles me. I focus on two data points in particular:

  • 80% of players played as a male Shepard?! (I wonder how much time and money was spent developing the female Shepard’s character models, dialogue options, and voice-acting.)
  • 80% of players chose to play as a Soldier? More than every other class combined?! (Was our investment in developing the 5 other classes worth the other 20%? And of that 20%, how many belong to the 50% of hardcore fans who imported their saves from the first game?)

So I jot those stats down on my Blackberry and head out to meet the developers in person.

When I get there, the news is not good. The project manager tells me that the team is six months behind schedule and significantly over budget. “This is our first attempt to make a game within the Mass Effect universe, but we’re confident that our next ME spin-off will take less time to complete,” they explain. “All we need is an additional six to ten months to deliver a product worthy of the Mass Effect brand.”

But I don’t want to hear that. Having reviewed my company’s release schedule prior to the meeting, I know that Mass Effect 3 is scheduled to release in late 2012 or early 2013, and since the whole point of releasing this spin-off is to satiate the fans while they wait for the third installment of the main series, we simply can’t afford to push it back another six months. “Sorry,” I tell them, “the game simply must be released in 2011–we’re going to have to figure out a way to make this work. I could perhaps get you some more funding, but we also need to figure out how to cut costs on your end–otherwise, our request for more money won’t be too well-received at corporate headquarters.” This prompts the creatives sitting at the table to roll their eyes at me.

“So then… what do you propose we cut? Any ideas?” asks one of them with a hefty dose of sarcasm.

“As a matter of fact, I have some suggestions right here on my Blackberry. For instance, how much time and money would we save if we axed the option of playing a female character altogether?”

“Well, that would certainly help us quite a bit, though I’m not sure it’s enough; besides, giving players a choice of gender is a Bioware tradition, and we want to do them justice with this game.”

“Of course we do,” I answer, “we want to be as faithful to the series as possible, but remember that this game will not form part of the main series; since it is only a side story, I think we can persuade the guys at Bioware to let us limit the game to a male protagonist just this once, especially if you give him a compelling back-story.” After showing them the Mass Effect 2 player statistics and some further discussion, I manage to convince them that this is a good idea, a task that was probably made easier by the fact that–surprise plot-twist ahead!–there were no women present at the meeting. But that is still not enough to bring development back on track, so I move on to my next idea, which proves to be far more controversial.

“What!?!?! You want us to axe every other class in the game!? You really want us to limit players to the role of soldier??? That is simply unacceptable,” says a visibly angry lead designer. “No. That’s just not happening, and I don’t care what statistic you show me…the class system is part of the game’s legacy–hardcore gamers expect this from us, there is just no way we can risk alienating them like this. Trust me, you’ll get a backlash from the dedicated fans, and that won’t be good for any of us.”

“Besides,” adds the project manager, “we’re already pretty far along in the design of the various classes, so that wouldn’t really cut costs as much as you’d think.”

“But would it cut development time?” I ask.

“Yes, maybe. But we’d still need additional funding, so it would be a waste of resources to simply abandon something that our team has been working on for months.” Faced with this impasse, I lean back on my chair and close my eyes for a second. The room stays silent until, suddenly, a little light bulb goes off in my head.

“What about this,” I tell them. “What if we save the other classes for DLC? That way, we can postpone developing them for the time being, get additional ‘DLC funding’ to finish the classes at a later date, and earn additional income from the hardcore players, who are the only ones interested in playing them in the first place.” Once I show them that a whopping 80% of players chose to play as a soldier in ME2, and that our company has committed itself to prioritizing the hardcore-gamer cash cow that is DLC, the team grudgingly goes along with my brilliant idea. And so the meeting comes to a close; I express my gratitude towards the development team and assure them that “my bosses at EA will be very grateful for your understanding, and grateful to know that we already have promising DLC content in development!”

“Now back to work, guys…”

Visibility is a trap

Think of this as a kind of Nietzschean parable: the point is not so much to provide a faithful account of the future as it is to speculate and warn about what could happen if we continue down this path. As we have seen, publishers and developers have profoundly different ways of looking at the world, and this creates the possibility of conflict when it comes to interpreting player data. Developers may look at a statistic such as “80 percent of players chose the soldier” and see it as an eminently solvable problem of menu design and presentation, but publishers could just as easily seize on that as a justification to cut costs or–worse–to make additional money off of the dedicated fan who is willing to pay for DLC. The worst-case scenario is that developers will end up losing such arguments more often than not, and we the audience will end up settling for lesser games.

Trust me, dear friends and developers, I get why you would be excited by the prospect of using new technology to learn more about your audience. Why wouldn’t you be? But please be careful about how you collect such data; be careful about who you share it with; and for goodness’ sake, be ready to defend your findings in front of the people who pay your bills, lest you end up in another meta-prison of your own making.

In short, beware of unintended consequences, and always remember Foucault’s prophetic warning: “visibility is a trap.”

Videogame vs. video game, cont.

Mark J.P. Wolf weighs in on the videogame/video game debate in the opening chapter of The Video Game Explosion: A History From Pong to PlayStation and Beyond:

What exactly constitutes a “video game”? Although the term seems simple enough, its usage has varied a great deal over the years and from place to place. We might start by noting the two criteria present in the name itself: its status as a “game” and its use of “video” technology. These two aspects of video games may be the reason why one finds both “video game” (two words) and “videogame” (one word) in use: considered as a game, “video game” is consistent with “board game” and “card game,” whereas if one considers it as another type of video technology, then “videogame” is consistent with terms like “videotape” and “videodisc.” Terms like “computer games” and “electronic games” are also sometimes used synonymously with “video games,” but distinctions between them can be made. “Electronic games” and “computer games” both do not require any visuals, while “video games” would not require a microprocessor (or whatever one wanted to define as being essential to being referred to as a “computer”). Thus, a board game like Stop Thief (1979), for example, which has a handheld computer that makes sounds that relate to game play on the board, could be considered a computer game, but not a video game. More of these kinds of games exist than games that involve video but not a computer, making “video games” the more exclusive term. The term “video games” is also more accurate in regard to what kinds of games are meant when the term is used in common parlance, and so it will be the term used here.

It’s clear why Wolf would choose to say “video game” instead of electronic or computer games, but it seems to me that he never really explains why this is preferable to writing “videogame” as one word. Moreover, he mischaracterizes the motivation behind those who consciously choose to treat “videogames” as one word. The reason I write videogame is not because I consider them to be a “different type of video technology.” That would imply that I am giving priority to the video element of videogames at the expense of their gaming roots. But that’s not the case at all. I don’t consider videogames to be a new type of video technology, I consider them to be a new type of technology, period. It is a technology and a medium composed of two preexisting mediums–i.e., games and video–but one which remains irreducible to either one; more precisely, I don’t think of the videogame as a new form of video or a new form of game, but rather as an entirely new form in and of itself, one with distinct characteristics and powers of expression.

Simply put, both games and video are essential precursors to the modern videogame, but if you believe that the medium is more than the sum of its parts, then it follows that we give it a name all its own. The advantage of using videogame as one word, then, is that it acknowledges and transcends these precursors in one fell swoop. It is a new word, a made-up word, but one that is very clearly anchored in the medium’s roots.

This is a surprisingly divisive issue, but I find it endlessly fascinating. For more points of view, check out this impromptu debate that took place over at Gameology 2.0 in 2006. Perhaps my favorite comment in that thread is the one by videogame critic/scholar Ian Bogost, who wrote in support of the one word spelling. (I only discovered this recently through his twitter feed–wish I had read it prior to my last post on the subject!) Bogost:

I use the term “videogame” for rhetorical reasons. Separating the words, in my opinion, suggests that videogames are merely games with some video screen or computer attached. But, I believe that videogames are fundamentally a computational medium, not just the extension of a medium like board or role-playing games (although there is also a genealogy there). I think that closing the space, in part, helps consolidate this concept. Personally, I’m only interested in gaming as it relates to computation. That doesn’t mean I don’t think gambling or board games or whatnot are useful, it just means that they are not my primary focus.

As for the argument that “videogame” implies video display…I don’t really care. I’m more interested in common usage, and the fact is that people use “videogame” to refer to the kinds of artifacts I want to talk about. I think video qua television screen is a vestigial effect of the arcade era and nobody is really confused about it.

For the same reason I abhor terms like “interactive entertainment.” I think inventing terms like this is a bit like trying to rename film or photography. More precise terms are more dangerous because they will lead to fragmentation. Jane McGonigal and I have had inconclusive conversations about whether ARGs and other so-called “big games” are videogames. I contend that they are, if they make significant use of computation (so, Cruel 2 B Kind, the game she and I created, is a videogame for me!). “Videogame” is a fine equivalent for “film” if we’d just stop worrying about it so much. And forcing the term into broader usage will help expand the medium much more than making up new words for each sub-type.

(Image by Bill Mudron)

[Image: Noby Noby Boy]

Videogame, no space

Why are they called video games?  Okay, stupid question.  We all know why they are called video games, but why do we still call them video games?  The term is so literal.  It is a game that you play on a video screen.  Is that the best we can come up with?  Why don’t we come up with something cool like….. Visual Attack Challenge Activities?  Or VACAs for short.  I realize it’s kind of wordy but it was just off the top of my head.  Give me a break. –Shelby Coulter

Recently, I’ve started calling them videogames–no space–in spite of constant objections from my spellchecker, for many of the same reasons alluded to in that piece. A videogame is not simply a game on video, it’s a distinct form. Accordingly, I write videogame as a single word in an attempt to acknowledge gaming’s roots without undermining its claim of being a unique expressive medium in its own right.

Chess, now on video! = video game

[Image] = videogame

Now, this is not to say that there is no such thing as a video(_space_)game. When you play Chess or Monopoly online, you’re playing a video game; when you play Solitaire on your computer, you’re also playing a video game. Those are games, plain and simple. They’ve just been transported into a video monitor for your convenience.

So, if online chess is a “video_game,” then what do we call something like Noby Noby Boy? It obviously uses video and, clearly, it’s meant to be played with. But many gamers remain suspicious of it: “Sure,” they say, “Noby Noby Boy might have all those things, but is it really a game?”

Alas, that question, interesting as it is to debate, is (in this specific case) largely beside the point. For Noby Noby Boy is neither a game nor a video game, and it was never meant to be.

It is, however, a fantastic videogame, no space—and that’s something else entirely.


[Resonance Machine] Welcome to gamespace

Ever get the feeling you’re playing some vast and useless game whose goal you don’t know and whose rules you can’t remember? Ever get the fierce desire to quit, to resign, to forfeit, only to discover there’s no umpire, no referee, no regulator to whom you can announce your capitulation? Ever get the vague dread that while you have no choice but to play the game, you can’t win it, can’t know the score, or who keeps it? Ever suspect that you don’t even know who your real opponent might be? Ever get mad over the obvious fact that the dice are loaded, the deck stacked, the table rigged and the fix is in? Welcome to gamespace. It’s everywhere, this atopian arena, this speculation sport. No pain no gain. No guts no glory. Give it your best shot. There’s no second place. Winner take all. Here’s a heads up: In gamespace, even if you know the deal, are a player, have got game, you will notice, all the same, that the game has got you. Welcome to the thunderdome. Welcome to the terrordome. Welcome to the greatest game of all. Welcome to the playoffs, the big league, the masters, the only game in town. You are a gamer whether you like it or not, now that we live in a gamespace that is everywhere and nowhere. As Microsoft says: Where do you want to go today? You can go anywhere in gamespace but you can never leave it.

-McKenzie Wark, Gamer Theory 2.0 [001].

Version 1.1 of this “networked book” can be read online for free.

Panopticon Sketch by Robert Spahr.


If you build it, they will come

I just got my copy of the inaugural issue of Kill Screen in the mail. It comes in a smaller journal-sized format which looks great and fits quite nicely on a regular bookshelf. The decision to release it in this format is a good one. It gives the magazine an air of permanence, as if to say: “this is not a regular magazine; magazines are disposable, this is not; magazines belong in the magazine pile, this belongs in a public library, you know, next to that Daedalus…”

I’ll comment on the magazine’s content once I get the chance to read it. But for now, I encourage you to visit their site and grab your own copy. There is nothing like this magazine in the videogame world: a magazine geared not towards consumers or fans, but towards those chosen few who see games as a powerful mode of expression with serious cultural implications. The first issue features contributions from some of the smartest peeps in the business, including journalists Leigh Alexander and L. B. Jeffries, and game design mega-legend Peter Molyneux, creator of the mega-legendary Populous and, more recently, the Fable series. Okay, I’m going to end this sales pitch right here. I’ll try to post on the articles when I’m done reading.

Why Gaming Veterans are Flocking to Social Gaming

Ashley Davis reports from the Game Developers Conference:

The names Brenda Brathwaite, Brian Reynolds, Noah Falstein and Steve Meretzky may not be instantly recognizable, but their gamemaking pedigrees are more than enough proof of their service to core gamers. Meretzky created, among other things, the Hitchhiker’s Guide to the Galaxy computer game alongside Douglas Adams. Brathwaite started her career in the videogame industry by working on the Wizardry series. Reynolds worked with Sid Meier on the Civilization series, and Falstein was one of the earliest team members at Lucasfilm Games (now LucasArts).

People like these are the ones you would least expect to have hopped onto the Facebook game train, but during this GDC panel, they took the time to explain why they’ve done exactly that. Hit the jump for a summary of their talk; their motives may surprise you.