In February, Eurogamer's Dan Whitehead wrote the preview of Resident Evil 5, a game that had been under no small amount of scrutiny for what appeared to be blatantly racist imagery in its trailers. He noted:
One of the first things you see in the game, seconds after taking control of Chris Redfield, is a gang of African men brutally beating something in a sack. Animal or human, it's never revealed, but these are not infected Majini. There are no red bloodshot eyes. These are ordinary Africans, who stop and stare at you menacingly as you approach. Since the Majini are not undead corpses, and are capable of driving vehicles, handling weapons and even using guns, it makes the line between the infected monsters and African civilians uncomfortably vague. Where Africans are concerned, the game seems to be suggesting, bloodthirsty savagery just comes with the territory.

Later on, there's a cut-scene of a white blonde woman being dragged off, screaming, by black men. When you attempt to rescue her, she's been turned and must be killed. If this has any relevance to the story it's not apparent in the first three chapters, and it plays so blatantly into the old cliches of the dangerous "dark continent" and the primitive lust of its inhabitants that you'd swear the game was written in the 1920s. That Sheva [the game's African co-protagonist] neatly fits the approved Hollywood model of the light-skinned black heroine, and talks more like Lara Croft than her thickly-accented foes, merely compounds the problem rather than easing it. There are even more outrageous and outdated images to be found later in the game, stuff that I was honestly surprised to see in 2009, but Capcom has specifically asked that details of these scenes remain under wraps for now, whether for these reasons we don't know.

There will be plenty of people who refuse to see anything untoward in this material. "It wasn't racist when the enemies were Spanish in Resident Evil 4," goes the argument, but then the Spanish don't have the baggage of being stereotyped as subhuman animals for the past two hundred years. It's perfectly possible to use Africa as the setting for a powerful and troubling horror story, but when you're applying the concept of people being turned into savage monsters onto an actual ethnic group that has long been misrepresented as savage monsters, it's hard to see how elements of race weren't going to be a factor.

All it will take is for one mainstream media outlet to show the heroic Chris Redfield stamping on the face of a black woman, splattering her skull, and the controversy over Manhunt 2 will seem quaint by comparison. If we're going to accept this sort of imagery in games then questions are going to be asked, these questions will have merit, and we're going to need a more convincing answer than "lol it's just a game."

Whitehead's comments were welcome: from a game journalism industry that too often acts as cheerleader instead of gadfly, they represented someone willing to point out both racism and the shallow terms on which the debate has typically been conducted--in a preview, no less, usually the most vile and sycophantic of press vehicles!
Unfortunately, Eurogamer's actual review of the game, posted today, was not written by Whitehead, and it contains no mention whatsoever of the racism he noted. In fact, it hardly even mentions the African setting at all, or the nature of the antagonists, devoting most of its column inches to gameplay mechanics and comparisons to RE4. Sample line: "...Resi 5 embraces the action element without concession. Whether it goes too far, of course, will be a matter of serious discourse." Oh, is that where the 'serious discourse' is heading these days?
(On a side note, when you're reading through a review expecting some kind of racial commentary and not finding it, tech terms like "reskinned" take on a whole new meaning, as does the story-related phrase "viral shenanigans.")
There are perfectly valid reasons why Whitehead might not have been assigned the RE5 review--he may not have been in editorial rotation, wasn't interested, or had other matters on his plate. That said, there's really no excuse for Eurogamer to have dodged the issue completely. Any editor worth their salt should have looked at the piece and asked where the follow-up analysis was (especially since it's only 2 pages long, one page shorter than Whitehead's preview). It's also surprising from a revenue perspective, given that EG is ad-supported and the preview drew a great deal of incoming coverage from aggregators like Joystiq. Given those points, the absence of commentary on racism in the review raises questions: Did Capcom complain? Did advertisers threaten to pull out? Did Eurogamer chicken out? Or did they simply drop the ball?
Eurogamer's failure is most depressing, I suspect, because many of the progressive voices in the gaming community had hoped for better from them, based on the preview and the strength of their writing stable as a whole. A recognition that critical questions have merit, and that by extension serious analysis is possible (and desirable), is something that's been sorely lacking in mainstream industry coverage--both in general and with regard to this game in particular. EG had a very real chance to provide some actual 'serious discourse' and yet chose not to do so. Is it any wonder that the mainstream gaming press can't be taken seriously, when even its better examples behave this way?
Clearly, Mirror's Edge isn't my favorite game this year. But that's not to say it was all bad. As Brinstar has detailed, the main character, Faith, embodies a number of positive traits. Sadly, I think she also highlights the real problem of designing "franchise" characters: one isn't enough.
Here are the good points: Faith is an Asian-American woman, but her race and gender aren't mentioned in or relevant to the story at all. She dresses sensibly, given that she's in a pretty athletic career, and her body shape also reflects that without being overtly sexualized. She's stubborn, but not snarky (there's nothing worse than a "clever" character written by not-so-clever writers). Perhaps, you might be tempted to say, she's still a little rough around the edges.
The problem is that there might be one decent character there, but there's really no-one else for her to interact with, so those rough edges are never really sharpened. Everyone else in the game is dishwater-dull, from the other messengers to the cops to Faith's sister. The same goes for the dystopian setting--it's vague to the point of nonexistence. That's bad for the players, but it's worse for the protagonist: without anything to push back against, Faith doesn't have much ground to define herself. She has no points on which to take a stand, except for the most basic (I think most of us are anti-framing and anti-betrayal).
There are two trends in game writing at the moment: strongly-defined protagonists, a la Jade from BG&E, or mute stand-ins like Samus Aran. In the latter case, the surrounding world has to be made as interesting as possible. It's not a coincidence that the Metroid Prime titles tag everything in sight with scannable text, or that Half-Life 2 devotes so much work to giving the Combine little bits of "business," like their constant radio chatter and introductory set pieces. On the other hand, if you're going to make the protagonist an actual character, you can probably get away with a less defined world (BG&E's setting does what it's supposed to do, and not much else), but you'd better have someone for that main character to talk to, and their actions had better be strongly tied to concrete, interesting motivations. Jade, for example, is constantly interacting with her companions, and she clearly has strong opinions about each of them. Faith has neither--both her world and her friends are generic--and as a result, she herself is uninteresting.
It's a shame, because as I played Mirror's Edge I was reminded strongly of William Gibson's Virtual Light. Like the game, Gibson's book concerns a city messenger and outsider who gets tangled up in a class struggle. But Chevette Washington (Gibson's protagonist) is surrounded by interesting people: Rydell the reluctant rentacop, Sammy Sal and Bunny the bike messengers, and the eccentrics living on the ruined Golden Gate bridge. Chevette is not only defined for the reader by her interactions with these characters, she kicks off the plot herself when she impulsively steals the titular virtual light glasses from a sleazeball partygoer--at every step of the way, we're learning something about her. Faith never even displays that much initiative: her story really begins when her sister is framed, and she spends the rest of the game reacting to events.
Storytelling in games is like the weather: everyone complains about it, but nobody does much of anything about it. Progress has been slow, but (as opposed to other media) it's often shaped by the technology and culture surrounding it. One of the advantages I see from widespread console multiplayer is that it may build support for ensemble casts, as opposed to mascot characters. Gears of War, for example, is nobody's idea of a well-rounded drama, but its characters are inarguably much more lively than Faith is. Mirror's Edge gets caught on the wrong side of this trend for a variety of reasons: the first-person perspective, emphasis on time trials, and a primary mechanic of player-vs.-environment. I'm not sure that better characters would have saved the game entirely--it's got plenty of its own issues, as I've noted--but they probably would have made its failure a lot less aggravating.
In his follow-up to January's round table, which invited participants to reinvent literature as a game, Corvus has asked us to take someone else's proposed design and elaborate on it, dispensing with strict ties to the original literary source but carrying on the themes and rules inside.
If I hadn't put this off until the last possible day, I would have actually written the Flash version of "l(a" sketched out at Discount Thoughts. Instead, I want to take a closer look at Nerje's Super God Delusion 64 at Ludic Thoughts, which is a riff on Dawkins's book of (almost) the same title.
To summarize: in the design laid out by Nerje, the game is a kind of Animal Crossing filled with both believers and secularists, where players are rewarded for acts of skepticism and science. The game also regularly states that a secret score is being kept for the player's actions--but in a final twist, the end of the game is simply a blank, and the only reward is the feeling of accomplishment. (I am, of course, already a sucker for bizarre Animal Crossing variations.)
It's a fun idea, but the problem with making a game that satirizes religion is that it's easy to be betrayed by the medium. Of course there's no God in your software, players might respond, you programmed it that way! In fact, aren't you a kind of Intelligent Designer for the whole scenario? Perhaps we would be better served by setting our sights a little lower, at the behavior of religion instead of its belief system--and in doing so, we may be able to make the original point, albeit more indirectly.
I propose changing the player's role in the game and adding a new influence: Dungeon Keeper (we'll also change the title of the game to reflect this--I like Tithe, personally). In this version, the player character arrives in town as the seed of a nascent religion. After setting up a small house/worship center, your task is to grow your flock and your influence over it.
There are two methods for attracting believers. The first is where the Animal Crossing influence remains: being social, trading letters, learning about the community, and performing favors to gain good will. The second, and more powerful, method is to increase the drawing power of your church by adding "attractions" to it. You might start out, for example, with some bargain-basement artifacts, like a magic translating hat or a moldy sandwich shaped vaguely like a saint. Followers who are impressed by a display will donate funds (cha-CHING! goes the animation), which can be used to upgrade further: a state-of-the-art sound system, Creationism Museum wing, or even visits by higher religious authorities in funny hats. The tone of this should be exaggerated and gently satirical--not mean-spirited, but targeted at the extremes of modern superstition and their tendency toward graphic spectacle.
Players can also create their own attractions, using a combination of Little Big Planet-style sandbox and some lightweight graphical scripting. Solutions that play on physics and statistical misjudgement will be particularly effective in growing the flock. Don't expect that the other religious communities in town will take your expansion lying down, though: they'll also begin ramping up their efforts in order to hold onto their members and possibly steal yours. At higher levels of gameplay, a simplified political simulation is even mixed in, giving the ability to form alliances and allowing you to champion rule modifiers that will benefit your organization over the others.
The idea, as I see it, is not to champion secularism directly. Rather, it's to satirize the materialistic and commercial aspects of religion in America. In his 2007 book Shopping for God, marketing expert James Twitchell noted the many ways that branding and advertising have become a part of American belief--at root, perhaps, because this country has always had a unique "marketplace" for religion, although Twitchell himself does not point this out. American churches work hard to maintain their base, using strategies as simple as the now-ubiquitous church sign or as encompassing as the megachurch (or as disturbing as the Jesus Junk described in Daniel Radosh's Rapture Ready, which includes "Testamints" and a smiling cross).
At their most basic, video games provide an ideal vehicle for satire of fundamentalist American belief: they're rigidly rule-bound, arbitrarily-constructed, and market-driven. It is difficult to directly critique faith (particularly moderate, relatively harmless faith) given such a system, but easy to mock a worldview that admits no ambiguity or rationalism. By moving from the original's sandbox to a design that puts the player in the position of church leader, we limit the message a bit, but we also sharpen its aim at a target that arguably needs more deflating than the broad concept of God itself.
Who else wants to talk?
Imagine that someone invents a machine that makes omelettes: brilliantly-colored, spicy omelettes made with breathtaking speed. Taken by its combination of verve and simplicity, you order the machine. But when it arrives, to your dismay, you discover that the omelette-making process is actually fraught with danger--80% of the time, due to a misstep in the instructions, it sets your kitchen on fire. Also, for some reason, the manufacturer has added a mode for making breakfast sausage instead. The machine is a very poor sausage-maker, but it keeps getting stuck in sausage-making mode, and until the sausage is successfully cooked you can't get back to the omelettes (and the kitchen fires, which are starting to lower your enthusiasm somewhat for the whole idea of breakfast).
Mirror's Edge is this omelette-maker. It's filled with absolutely gorgeous visual design, presenting parkour from the first-person perspective. Except that it doesn't work, about half the time. The controls are overly touchy, especially strafing, and the context-sensitive options aren't nearly sensitive enough. Worse, the part of the game that's really fun--the running, in between falling--is interrupted regularly with fight scenes. Often, you can't run from the fights, because the soldiers are very good shots and the escape routes are (intentionally) via slow and exposed pipe-climbing. It's like someone on the design team said "We've really got something here, with the running part of the game. Let's make sure to take it away from the player on a regular basis."
There's always a lot of comparison to Prince of Persia whenever a game tries free running and acrobatics, and with good reason, since it did it best. But people often take the wrong lessons from this, citing the "rewind" function that largely canceled out dying-as-punishment. That wasn't the genius of the gameplay, however: what made it really good was in fact the inaccuracy of the controls, the way that the Prince would do the right thing as long as you hit a button with something close to the right timing. PoP realized that the fun wasn't in being a precise platformer, but in the simple thrill of directing a complicated flow of leaps, grabs, and wall-runs around the game's carefully-crafted spaces.
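One common way to build that kind of slack into a platformer is an input buffer: button presses are remembered for a fraction of a second, and an action opportunity that arrives within the window still counts. Here's a minimal Python sketch of the idea--the timing numbers are invented for illustration, not taken from any actual game's code:

```python
from collections import deque

class ForgivingInput:
    """A buffer of recent button presses. An action opportunity that
    arrives within `window` seconds of a press still consumes it, so
    slightly-early input succeeds instead of being thrown away."""

    def __init__(self, window=0.2):
        self.window = window
        self.presses = deque()

    def press(self, timestamp):
        self.presses.append(timestamp)

    def try_action(self, timestamp):
        # Discard presses that are too old to count anymore.
        while self.presses and timestamp - self.presses[0] > self.window:
            self.presses.popleft()
        if self.presses:
            self.presses.popleft()  # consume the buffered press
            return True
        return False

player = ForgivingInput()
player.press(1.00)                 # the player jumps a hair early...
grabbed = player.try_action(1.15)  # ...the ledge arrives 150ms later: grab
missed = player.try_action(1.16)   # the press has already been used
```

The effect is that the player's intent wins out over frame-perfect precision--which is exactly the forgiveness that makes directing that flow of leaps and wall-runs feel good.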
It's strange, actually, that a title with such aggressive visual editing as Mirror's Edge--it's practically monochromatic--would have such weak editing on the interaction side. Eliminate combat from the mix, and you've cut the game down to basically two buttons, up and down. Get rid of strafing while you're at it, since all it does is let me swerve off catwalks by accident, and make it work more like the first Metroid Prime games (which also took a third-person gameplay conceit and moved it into first-person). Those changes would work the level designers a bit harder, but the end result is leaner, more focused gameplay.
What it all comes down to, really, is that when asked to make a choice between realism and fun, Mirror's Edge chooses the former. No doubt, in real life, Faith would be riddled with bullets almost instantly, and so in the game, she is. As a result, the player is discouraged from approaching situations with speed and daring, because it's a process of trial-and-error fatalities made worse by Faith's clumsiness. By contrast, it would be highly unrealistic for players to be able to sprint through a gauntlet of enemy fire, bullets whizzing by but rarely breaking the flow of action--unrealistic, but much more rewarding. As it is, the game just feels unfair: it gives you the tools to do one thing fairly well, and then punishes you for trying to use them.
Like everyone else who's tried it, I was completely charmed by World of Goo, finishing it in about a week. What surprised me about it was that it seemed familiar: although I don't know if this was their inspiration at all, WoG's gameplay is basically a force-directed node graph, plus gravity and very clever level design.
The term "force-directed node graph" is kind of wonky. You probably know it better from Visual Thesaurus, or the 6pli del.icio.us tag browser. It's a method of taking a semantic web of interconnected nodes, then allowing it to self-organize (instead of placing the nodes manually) by A) making them repel each other while B) applying elastic limits to the connections between them. It is a lot of fun to mess with. I could drag nodes in one of these graphs all day long, watching them spasm and then reassemble themselves into a kind of order.
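The algorithm is simple enough to sketch in a few dozen lines of Python. This is a hypothetical, minimal implementation--not the actual code behind Visual Thesaurus, 6pli, or World of Goo: every pair of nodes repels with an inverse-square force (A), every edge acts as a spring pulling its endpoints toward a rest length (B), and positions are nudged along the net force until they settle.

```python
import math
import random

def force_layout(nodes, edges, steps=500, repulsion=1.0, spring=0.05, rest=1.0):
    """Settle a graph into shape: nodes repel each other (A), while
    edges act as springs with a preferred rest length (B)."""
    random.seed(42)  # fixed seed so the layout is reproducible
    pos = {n: [random.uniform(-1, 1), random.uniform(-1, 1)] for n in nodes}
    for _ in range(steps):
        force = {n: [0.0, 0.0] for n in nodes}
        # (A) inverse-square repulsion between every pair of nodes
        for i, a in enumerate(nodes):
            for b in nodes[i + 1:]:
                dx = pos[a][0] - pos[b][0]
                dy = pos[a][1] - pos[b][1]
                d = math.hypot(dx, dy) or 1e-9
                f = repulsion / (d * d)
                force[a][0] += f * dx / d; force[a][1] += f * dy / d
                force[b][0] -= f * dx / d; force[b][1] -= f * dy / d
        # (B) spring force along each edge, pulling toward the rest length
        for a, b in edges:
            dx = pos[b][0] - pos[a][0]
            dy = pos[b][1] - pos[a][1]
            d = math.hypot(dx, dy) or 1e-9
            f = spring * (d - rest)
            force[a][0] += f * dx / d; force[a][1] += f * dy / d
            force[b][0] -= f * dx / d; force[b][1] -= f * dy / d
        # Step each node along its net force, clamped for stability.
        for n in nodes:
            fx, fy = force[n]
            mag = math.hypot(fx, fy)
            if mag > 1.0:
                fx, fy = fx / mag, fy / mag
            pos[n][0] += 0.1 * fx
            pos[n][1] += 0.1 * fy
    return pos

# A tiny thesaurus-style web: one hub word linked to three neighbors.
layout = force_layout(["word", "syn1", "syn2", "syn3"],
                      [("word", "syn1"), ("word", "syn2"), ("word", "syn3")])
```

Dragging a node, in this model, amounts to overriding its position for a few frames and letting the loop pull everything else back into order--that's the spasm-and-reassemble effect.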
I don't know that World of Goo takes its inspiration from these kinds of node graphs--the idea isn't exactly revolutionary--but I certainly think that the simple enjoyment of adding nodes and watching them shift in response is a part of the game's appeal. It's got me thinking about other simple pleasures, and wondering if they, too, could be made into games: stuff like throwing a cursor across screens with a trackball, zooming in and out of Google maps, or playing with the 3D formula graphing on my old TI calculator. It's the kind of thing that's mindlessly rewarding, and that data visualizations are increasingly good at creating.
Which raises a second question: as we're increasingly confronted with data, how will visualization crossbreed with gaming, so that either the games become more reactive, or the graphs become more entertaining? How does it change our relationship with data--and what that data represents--when it's primarily presented to us through software toys?
In his post on a short experience in World of Warcraft, PeterB hits on something fairly profound:
Throughout the parts of the game that I've seen, never once while in-game have I had to sit and wait for a "Loading..." screen. If you have to descend into a cave to search for loot, it flows smoothly from the outer world. Fly across the ocean to another continent, and you watch the scenery below you as your griffin beats his wings beneath you. Surely there is some sort of loading or paging going on under the hood, but the user never feels it.

I describe this achievement as 'technical', but its impact on the immersiveness of the game can't be understated. Like so many other people, I have a short attention span. "Loading" screens do more than provide entertainment while the computer gets work done, they provide a cognitive break. When I'm playing a game and a load screen appears, more often than not I will look away. Maybe I'll go get a cup of tea, or pause the game, or check my email. World of Warcraft doesn't have these cognitive breaks, except for those that the player makes for him or herself by retreating to a safe place. The end result (at least for me) is a sort of tunnel vision composed of equal parts concentration and fatigue. You eventually look up and find that several hours have passed, and you hadn't noticed.

If this sounds very familiar to you, maybe you've been playing Geometry Wars 2. I certainly have. Despite promising myself that I'd stop trying to beat a pesky leaderboard score, I wasn't able to kick the habit. The thing about GW2 is that it's really, really easy to spend a relatively long time chasing high scores in it, partly because the gameplay is very good, but additionally because restarting a level is practically instant. I can be playing Pacifism, run into an enemy, and before I've finished yelling at the game I'm already back at the start of the level. Just hammering the A button--which, helpfully, is not used for anything else in GW2--runs the user through the menu as fast as they can thumb. There's no death animation. There's no menu lag. There's nothing, in other words, to provide the "cognitive break" that Peter's discussing above. Instead, the game is constantly rewarding players with stimulation. Combined with the quick start-up of XBox Arcade titles, this means I end up playing a lot more Geometry Wars than I probably intend to do, because it's easy to get into it and surprisingly hard to get out.
You can, in fact, judge how likely I am to stick with any given game by determining how quickly and effectively it reloads after I die. I was astonished by reviewers who punished the new Prince of Persia for simply eliminating death-by-falling: that's exactly what I want! Hurl me directly back into the action, don't make me sit through a non-game sequence first! We can even take this further: the less I am punished for any failure, the more likely I'll keep playing. That doesn't mean the game is easier--feel free to make tasks difficult. But when I fail, I don't want to have to replay large chunks in order to reach that point again. I'm an adult, I understand: the failure itself is punishment enough. Anything else is just kind of rubbing it in.
Let's take this even another step, outside gaming: the less my workflow on any given task is disrupted by either failure or success, the more progress I find I can make. For example, I used to do my audio work at the Bank in Pro Tools. Unlike a lot of people, I really like Pro Tools. It has a fantastically well-designed toolkit for patching and editing audio (one day, I'll write a post about how the connection routing of audio software is possibly its most crucial feature). As a result of this incredibly flexible routing matrix, bouncing audio from multiple tracks into a single mixdown track is a joy. There's just one problem: partly as a consequence of that design, Pro Tools can only bounce in real-time. So while the user experience of mixing is very pleasant, it involves a lot of sitting around and waiting for the audio to play through the mixing bus. During that time, I tended to get distracted--or, on long projects, even leave the room to work on something else.
Nowadays I do my audio work in Cubase or Sonar, neither of which is anywhere near as graceful as Pro Tools. Bouncing a track in these apps requires 1) soloing the tracks in question, 2) running a mixdown command to export the mix to a file, and 3) importing the newly-created file to its own track. Both Cubase and Sonar kind of apologetically include options during mixdown to automate this process, but it still feels clumsy compared to the Pro Tools mixer. The advantage they have, however, is that these packages can bounce audio as fast as the computer can process it, usually far faster than realtime. As a result, I don't enjoy my new Cubase workflow nearly as much as I enjoyed editing at the Bank, but on many projects it has made me much more productive, and not just because non-realtime bouncing is technically faster. There's no "cognitive break" during which time I would be tempted to multitask.
I think there are two items of note here. The first is the degree to which gaming often associates punishment (including death, which barely deserves the name) with wasted time. It's the accepted method of "charging" a player for failure--either take away their time during an animation/reload/restart cycle, or force them to spend substantial time recovering lost ground, or both. This actually strikes me as particularly perverse, given that the audience has grown older, and has less spare time to spend. There are plenty of currencies that could be used punitively in design: loss of experience, equipment, or even simple mockery. And yet we return, over and over again, to design decisions (no quicksave, sparse respawn points, long menu trees) that make failure above all a lengthy and slow process.
Second, I think it's kind of funny that--even though gamers are often considered part of a "multitasking generation"--one of the most important factors in a game's addictive potential is its determination to keep the user focused on a single task for as long as possible. You'd think, if the trend were really so pronounced, that the most successful tools and entertainment would be those that work around a multitasked mindset, not one of constant obsession. It's almost like that kind of generation-gap jargon were just some kind of nonsense buzzword invented by would-be social critics.
Choosing the literary subject for an imaginary game adaptation in this month's fantastic Round Table topic was difficult, particularly since there are so many great games in fiction that could be adapted. In the end, though, one book caught my imagination more than any of the other options: China Mieville's Iron Council.
Probably the most overtly political of his novels and a New Weird take on the Western, Iron Council returns again and again to the theme of plans that spin into unpredictable motion from hidden beginnings. The "Iron Council" of the title, for example, is a train that becomes its own autonomous society after a crew mutiny, and travels across the landscape on recycled tracks. The parts of the book set in the city of New Crobuzon cover plots within plots, each of which actually serves a very different purpose from its outward intent. And indeed, it's not for nothing that one of Iron Council's central characters (failed messiah Judah Low) is a golemist, who creates lumbering simulacrums of life from whatever materials are at hand.
Mieville's other books would probably make great RPG supplements--something Mieville has probably already considered, since he's an avowed D&D geek--but that's the easy way out. Iron Council, on the other hand, has the vivid central image of the Council itself, which thunders out into the frontier aimlessly before being called back to the city to support a populist rebellion. This concept of a train that charts its own destination, to me, cries out for a physical analog. So, while I'm not a game designer and will not be going into specifics, I'd love to set this up as a board game--but one where the path is created during play, by the players.
Before the start of the game, the board is an empty cardboard frame, which the players will fill with hexagonal tiles as play continues. In one corner, a tile showing a cityscape is pre-placed--this is New Crobuzon, where the game begins and ends. Also before the first turn, each player is issued a set of components: a large Role card, a pile of board hexes, and a set of Intercession cards. Finally, there's a single playing piece: the Iron Council itself, which is used to keep track of the end of the path (this isn't technically necessary, since usually the path doesn't double back on itself, but it's handy and a nice visual touch).
In theory, Iron Council: The Game (or ICTG, for the sake of expedience) is won by returning the Council to New Crobuzon successfully: everybody wants that to happen. But each player's Role card, representing a character from the book, dictates a certain set of conditions (time frame as represented by tiles on the board, cards in play, and position of other players) for that particular player to "win the game." For example, a player who draws Ann-Hari, the prostitute who becomes a revolutionary leader, wins if A) the Council remains intact and B) returns before the Mayor can crush the Toro rebellion, but not before C) a certain number of Intercession event cards with her name on them are brought into play. Role cards also come with a special ability, unique to each role, that's spelled out on the card: Judah Low can play Golem tokens to bolster the Iron Council's position on Intercessions, Weather Wrightby can look through other players' cards once per game, and Qurabin can permanently reduce his hand size by one to counter some events.
Each turn, players go around the circle laying down hex pieces to guide the track being laid for the Iron Council. The pieces have a picture of a track on them (either straight or curving to a different hex side), and the track has to form a contiguous line, although it can "overlay" old tracks if the path curves back on itself. After placing the track, each player can play a card from their hand, with varying effects depending on the card and sometimes which Role card the player was assigned. Track tiles are also tagged with a number, which is used as a random number generator for certain cards.
Here are some sample Intercession cards:
Iron Council is a rich story covering a wide set of characters and locations away from the perpetual train--surely a video game could tell its story far better? Perhaps, but two caveats make the narrower focus of the board game more appropriate. First, Mieville's imagery would be, I think, ill-served by fixing it into polygons. Take his description of the Bounty Man, for example, or the creation of the time golem:
It could not always clearly be seen. The crude rips in the temporal from which the golem was made gave it edges like facets, an opalescence of injured time. From some angles the train was hard to see, or hard to think of, or difficult to remember, instant to instant. But it was unmoving.

You know how that gets translated into a game engine: some translucent polygons, a volumetric fog, and a stylized blur effect. I can see that in my head, and the pleasure of the prose is lost.
Second, the group dynamic of a boardgame makes it more suited to the spirit of the novel, if not the letter. Mieville says in an interview with The Believer:
...one of the things that I think as a socialist is that there is absolutely nothing wrong with humans wanting to intervene in the world, wanting to exploit the world, wanting to change the world, wanting to bend the world to their will. What goes wrong for me is not that people want to do that, but that they do it under conditions of capitalism, which they don't control.

The interaction between players isn't formalized in a board game the way it would probably be in an electronic program. Under what conditions will they choose to operate? And more importantly, how could the game make them think about those conditions? I don't know for certain that my game would do it--but I doubt that video games, the mechanics of which tend to be steeped in capitalism, would have a chance. And it certainly couldn't compete with the assembly of a physical board in the same way, a process that evokes the spirit of intervention and exploration that Mieville's trying to portray.
Who else wants to talk?
Or: My New Year's Resolutions for Gaming Only, Because I Don't Follow the Other Ones (As If I'm Going To Follow These), 2009:
It took a broken console for me to work out exactly why playing shooters on a thumbstick gives me hives.
With the XBox out of commission, I went back and finished Darwinia, Introversion's charmingly odd RTS. Darwinia uses a kind of FPS-like control system: the mouse moves a cursor around the screen, rotating to keep it close to the center of the view pane, while movement is controlled using the standard WASD (or in my case, WAXD) keys. In its perspective, the game reminds me of Black and White, but without that game's idiotic mouse-only policy. Remember movement in B&W? In order to travel somewhere, instead of using a perfectly reasonable autoscroll, players had to click-and-drag, like moving Google Maps, but without the ease of use or search function. Doing that for an hour at a time was an exercise in repetitive stress injury.
Darwinia, being far more sensible than B&W, uses the same basic principle that shooters use for movement and selection/aiming: it creates a direct connection between the mouse's physical movement and the onscreen change in view arc. This works because computer users have been training for it during the entire life of the GUI. When the mouse moves a certain amount, the cursor moves correspondingly (allowing for some natural acceleration). In 3D space, the entire view moves instead of the cursor, but the relationship between physical change and virtual shift is preserved.
Compare to aiming with a thumbstick. Now, if you want a certain amount of change, you can't move the corresponding amount with your hand. Instead, you have to hold the stick in the desired direction for a variable length of time, then ease it back into position as you reach the target. If the target is moving, you can't follow its movement directly. You have to match its vector, both in direction and in amount (scaled to the bounds of the joystick).
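The difference between the two schemes can be sketched in a few lines of code. This is my own illustration, not anything from Darwinia or an actual engine: the mouse is position control (view change proportional to this frame's physical displacement), while the stick is rate control (deflection sets a turn velocity that has to be integrated over time). The sensitivity and turn-rate numbers are arbitrary.

```python
def mouse_view_update(view_yaw, mouse_dx, sensitivity=0.02):
    """Position control: the change in view angle is directly
    proportional to how far the mouse physically moved this frame."""
    return view_yaw + mouse_dx * sensitivity

def stick_view_update(view_yaw, stick_x, max_rate=180.0, dt=1.0 / 60):
    """Rate control: stick deflection (-1.0 to 1.0) sets a turn
    *velocity*, so reaching a target angle means holding the stick
    for the right length of time, then easing it back to center."""
    return view_yaw + stick_x * max_rate * dt
```

With the mouse, a 90-degree turn is one motion of the hand; with the stick, it's holding full deflection for exactly half a second at 180 degrees per second, which is why tracking a moving target means matching its vector rather than its position.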
Is there a way to solve this, and to make console shooters less tank-like? Probably not. You can't link movement directly to thumbstick position, because there's no way to reset the view center (you can't pick up and move the stick to its new position like a mouse). One fascinating idea I've seen is to replace the thumbstick with a trackball--as a long-time Logitech Marble user and RSI victim, I heartily approve of this idea. It will, of course, never happen, even though it would be tremendously awesome.
But short of reinventing the hardware, which no-one but Nintendo seems interested in, designers can at least minimize the annoyance. I noted, while I had a working XBox, that I found Gears of War much less fiddly than most shooters on the platform, probably because its emphasis on cover lowers the importance of precise aim. Gears gives much higher priority to movement, where consoles have an advantage in analog control, for getting behind cover and spraying suppressive fire. It also uses the cover mechanic as a way to guide players into a two-level stick sensitivity--when popping out for aimed shots, the view zooms in to make up for the stick's imprecise movement. Finally, the art design in Gears strongly supports the "feel" of its control: tank-like aiming seems natural given the hulking, ungainly build of Fenix and the other characters, in a way that it feels unnatural for most nimble FPS protagonists.
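The two-level sensitivity idea is simple enough to sketch. This is a guess at the general shape of the mechanic, not Gears of War's actual tuning; the function name and the rate values are mine.

```python
def turn_rate(stick_x, aiming, hip_rate=240.0, aim_rate=80.0):
    """Sketch of a two-level stick sensitivity: hip-fire uses a fast
    turn rate for movement and suppression, while popping out to aim
    switches to a slower rate (paired with a zoomed-in view), so the
    imprecise stick can still land precise shots. Rates are
    illustrative, in degrees per second."""
    rate = aim_rate if aiming else hip_rate
    return stick_x * rate
```

The zoom matters as much as the rate change: magnifying the view makes the same angular error smaller on screen, so the slow mode feels precise rather than sluggish.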
The best argument I've seen for why mouse support hasn't been added to the XBox, given the USB ports that could obviously support it, is that it would segment the player population: mouse users would have a clear advantage over the others, an advantage they would have effectively gotten by paying for it. It's unbalancing to give players with more money a leg up, and I can see why they want to avoid it. But when I'm playing the single-player campaign at home, I'd like to be able to do it in comfort instead of fighting constantly with the controls. Of course, this is a microcosm of the entire console-vs.-computer debate--my preference for an adaptable, hackable platform explains why I identify as a PC gamer in the first place.
Oh, how nice! Look what Microsoft got me for the holidays: a broken XBox.
No, really. You shouldn't have.