July 14, 2016

Filed under: gaming»perspective

Emu Nation

It's hard to hear news of Nintendo creating a tiny, $60 NES package and not think of Frank Cifaldi's provocative GDC talk on emulation. Cifaldi, who works on game remastering and preservation (most recently on a Mega Man collection), covers a wide span of really interesting industry backstory, but his presentation is mostly infamous for the following quote:

The virtual console is nothing but emulations of Nintendo games. And in fact, if you were to download Super Mario Brothers on the Wii Virtual Console...

[shows a screenshot of two identical hex filedumps]

So on the left there is a ROM that I downloaded from a ROM site of Super Mario Brothers. It's the same file that's been there since... it's got a timestamp on it of 1996. On the right is Nintendo's Virtual Console version of Super Mario Brothers. I want you to pay particular attention to the hex values that I've highlighted here.

[the highlighted sections are identical]

That is what's called an iNES header. An iNES header is a header format developed by amateur software emulators in the 90's. What's that doing in a Nintendo product? I would posit that Nintendo downloaded Super Mario Brothers from the internet and sold it back to you.
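For the curious, the header Cifaldi is pointing at is trivial to recognize in code. Here's a minimal sketch, with the field layout taken from the community iNES spec and a made-up header for illustration:

```python
# Sketch of inspecting an iNES header (the 16-byte block Cifaldi highlights).
# Field layout per the de-facto iNES spec: bytes 0-3 are the magic "NES\x1a";
# byte 4 is PRG ROM size in 16 KB units; byte 5 is CHR ROM size in 8 KB units;
# bytes 6-7 are flags, with the mapper number split across their high nibbles.

def parse_ines_header(data: bytes) -> dict:
    if len(data) < 16 or data[:4] != b"NES\x1a":
        raise ValueError("not an iNES file")
    return {
        "prg_rom_kb": data[4] * 16,
        "chr_rom_kb": data[5] * 8,
        "mapper": (data[6] >> 4) | (data[7] & 0xF0),
    }

# A hypothetical header: 32 KB of PRG ROM, 8 KB of CHR ROM, mapper 0.
# This is emulator scaffolding -- it has no business in a dump taken
# straight from a cartridge.
header = bytes([0x4E, 0x45, 0x53, 0x1A, 2, 1, 0, 0] + [0] * 8)
print(parse_ines_header(header))
```

The point of the exercise: the header exists purely so that emulators know how to load the file, which is exactly why its presence in an official Nintendo product is so telling.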

As Cifaldi notes, while the industry has held a strong official anti-emulation stance for years, Nintendo in particular has turned emulation into a regular revenue stream. In fact, Nintendo has used scaremongering about emulation to monopolize the market for any games that were published on its old consoles. In this case, the miniature NES coming to market in November is almost certainly running an emulator inside its little plastic casing. It's not that they're opposed to emulation, so much as they're opposed to emulation they can't milk for cash.

To fully understand how demented this has become, consider the case of Yoshi's Island, which is one of the greatest platformers of the 16-bit era. I am terrible at platformers but I love this game so much that I've bought it at least three times: once in the Gameboy Advance port, once on the Virtual Console, and once as an actual SNES cartridge back when Belle and I lived in Arlington. Nintendo made money on at least two of those copies. But now that we've sold our Wii, if I want to play Yoshi's Island again, I would still have to give Nintendo more money, even though I have owned three legitimate copies of the game. Or I could grab a ROM and an emulator, which seems infinitely more likely.

By contrast, I recently bought a copy of Doom, because I'd never played through the second and third episodes. It ran me about $5 on Steam, and consists of the original WAD files, the game executable, and a preconfigured version of DOSBox that hosts it. I immediately went and installed Chocolate Doom to run the game fullscreen with better sound support. If I want to play Doom on my phone, or on my Chromebook, or whatever, I won't have to buy it again. I'll just copy the WAD. And since I got it from Steam, I'll basically have a copy on any future computers, too.

(Episode 1 is definitely the best of the three, incidentally.)

Emulation is also at the core of the Internet Archive's groundbreaking work to preserve digital history. They've preserved thousands of games and pieces of software via browser ports of MAME, MESS, and DOSBox. That means I can load up a copy of Broderbund Print Shop and relive summer at my grandmother's house, if I want. But I can also pull up the Canon Cat, a legendary and extremely rare experiment from one of the original Macintosh UI designers, and see what a radically different kind of computing might look like. There's literally no way I would ever get to experience that other than through emulation.

The funny thing about demonizing emulation is that we're increasingly entering an era of digital entertainment that may be unpreservable with or without it. Modern games are updated over the network, plugged into remote servers, and (on mobile and new consoles) distributed through secured, mostly-inaccessible package managers on operating systems with no tradition of backward compatibility. It may be impossible, 20 years from now, to play a contemporary iOS or Android game, similar to the way that Blizzard themselves can't recreate a decade-old version of World of Warcraft.

By locking software up the way that Nintendo (and other game/device companies) have done, as a single-platform binary and not as a reusable data file, we're effectively removing it from history. Maybe in a lot of cases, that's fine — in his presentation, Cifaldi refers offhand to working on a mobile Sharknado tie-in that's no longer available, which is not exactly a loss for the ages. But at least some of it has to be worth preserving, in the same way even bad films can have lessons for directors and historians. The Canon Cat was not a great computer, but I can still learn from it.

I'm all for keeping Nintendo profitable. I like the idea that they're producing their own multi-cart NES reproduction, instead of leaving it to third-party pirates, if only because I expect their version will be slicker and better-engineered for the long haul. But the time has come to stop letting them simultaneously re-sell the same ROM to us in different formats, while insisting that emulation is solely the concern of pirates and thieves.

November 21, 2012

Filed under: gaming»perspective

Bundled Up

The fourth Humble Bundle for Android is wrapping up today: if you like games and charity, it's a ridiculously good deal, even if you don't own an Android device--everything works on Windows, Mac, and Linux as well. Although it turns the Nexus 4 into a toasty little space heater, it would be worth it just to get Waking Mars, the loopy botany platformer I've been playing for a couple of days now.

If nothing else, I like that the Humble Bundle proves that it's still feasible to sell software the old-fashioned way: by putting up a website and taking orders yourself. Digital retailers like Steam or the various mobile platform stores are all well and good (the Bundle comes with Steam keys, which I usually use to actually download the games), but a lot of my favorite gaming memories come from this kind of ad-hoc distribution. I don't want to see it die, and I think it would be bad for independent developers if it did.

In the last few months, people like Valve's Gabe Newell and Mojang's Markus Persson have raised concerns about where Windows is going. Since the PC has been the site of a lot of really interesting experimentation and independent development over the last few years, Microsoft's plan to shut down distribution of Metro-style applications on Windows 8, except through a centralized store that they own, is troubling. At the same time, a lot of people have criticized that perspective, saying that these worries are overblown and alarmist.

There may be some truth to that. But I think the fact that the Humble Bundle is, across the three or four mobile platforms in popular use, only available on Android should tell us something. Why is that? Probably because Google's OS is the only one where developers can handle their own distribution and updates, without having to get approval from the platform owner or fork over a 30% surcharge. That fact should make critics of Newell and Persson think twice. Can the Humble Bundle (one of the most successful and interesting experiments since the shareware catalogs I had in the 80s) and similar sales survive once traditional computing moves to a closed distribution model? It looks to me like the answer is no.

July 23, 2012

Filed under: gaming»perspective

Progressive Scan

When I started thinking about blogging again, after an unintentional break, I realized that I'd been doing this, almost continuously, for more than seven years now. That's a long time. Although it was tempting to let it lie fallow, I figured it would be a shame after such a long run--and besides, I do like writing here, especially now that most of the readers (such as they were) are gone.

When I turned Mile Zero into a blog, way back in the day, one of the main things that I wrote about was gaming--specifically, gaming culture. That wasn't all I wrote about, but it was something I was interested in, and there was a whole community of great gaming blogs I could join. Gaming culture had plenty to write about, because it was (and is) a problematic place dominated by emotional children and shameless hacks pretending to be journalists. If I took on those issues, even in a tiny way, I hoped it could help--and it was a good distraction from an office job I wasn't thrilled about and a freelance career that probably wasn't headed anywhere either.

A few years later I got a job at CQ as a "Multimedia Web Producer." Nobody at CQ knew what that was supposed to mean, so gradually I turned myself into the newsroom's go-to person for interactive journalism. I loved my job, and the time and energy I put into it (not to mention the strict editorial policy of non-partisanship) meant I cut back on blogging. I also threw myself into dancing, which I think took me by surprise as much as anyone else, particularly once I joined Urban Artistry. And I went on a bit of an information diet, angry with the low quality/high volume approach of most gaming and tech sites. When I got a chance to write here, usually once a week, the spread of subjects had become more random than ever.

So here we are, seven years (and almost two months dark) later. Sure, this was never really a gaming blog. But I did write about gaming, particularly the sexism, racism, and classism I saw there, and I hoped it could get better. Has it?

Well, kind of better. I mean, it's still awful, isn't it? Sometimes it just seems like the exploitation gets more subtle over time. Tomb Raider pops back up, for example, but now Lara Croft's proportions are less exaggerated--and she's being threatened with sexual assault so players can feel protective toward her. One step forward, two steps off a cliff marked "Seriously, guys, what on earth were you thinking?"

At the other end of the malevolence spectrum, I just finished Driver: San Francisco. Loved it: it's funny, well-balanced, filled with homage to classic car movies and TV (including constant callbacks to its obvious inspiration, Life on Mars). But even though it's a game where the main character is never playable outside a car, even though it's set in a world where the solution to every crime involves vehicular damage, even though the physical make-up of the hero is literally of absolutely no consequence whatsoever... you're still playing as John "Incredibly Generic White Dude With An Incredibly Generic White Dude's Name" Tanner. You could not possibly challenge fewer conventions than Driver:SF, which these days is not so much actively frustrating as it is wearying.

That said, I think there's hope. When I look at something like Anita Sarkeesian's Tropes Vs. Women project on Kickstarter, which went from zero to troll-ridden to ridiculously over-funded in a matter of hours, it kind of blows me away. Seven years ago, would Sarkeesian's videos have gotten that much support? Would they have gotten sympathetic attention from the corporate blogs? Would they have been picked up across a wide range of non-gaming media? I feel like no, they wouldn't have. And while tools like Kickstarter have made it a lot easier for small projects like this to get the funding they need, I suspect that changes in the culture have also made a big difference.

More importantly, it's not just one culture anymore, if it ever was. Communities don't just grow by getting bigger, they also grow as new circles intersect with theirs on the Venn diagram. You see this everywhere: look at the way that music fans start out as a small, particular group, and then as the artist gets bigger, different people begin to attach--sometimes for very different reasons, which may eventually drive the original fans away. The reasons why I love the Black Keys (their early, filthy-sounding recordings from Akron) are not the reasons that new fans probably love them, but we all end up at the same concerts together.

When I was studying intercultural communication in college, the term for these meshed sub-populations was "co-culture." I didn't care for the term then, but now it seems appropriate. Gaming is bigger than it was seven years ago, and it's no longer accurate--or seen as desirable--to say that the "real" gamers are the angry 14-year-olds with a chip on their shoulder about girls and minorities. This space can (and does) support more than that: from Troy Goodfellow's series on science and national characters in gaming, to The Border House providing a critical examination of character and plot, to rhetorically-stunning games like Auntie Pixelante's dys4ia. These are not all the same voices I was reading and responding to seven years ago, but they are stronger and louder and more influential. That's fantastic.

I'll probably never refocus here to the degree I did when I was writing a post or more a day, because being a single-issue blogger (or a single-issue anything) has never been interesting to me. But I'm thrilled other people are doing good work with it. As a gamer, the same way that other people might be movie buffs or music snobs, I want to see it grow and change so that I'll be exposed to new and interesting perspectives. I don't want to see it stagnate. While progress is slow, I think it's being made. Let's hope in another seven years, I can look back and say the same.

June 2, 2011

Filed under: gaming»perspective

The Backlog

There are three consoles stacked behind our TV. They're the retro platforms: my SuperNES and Dreamcast, and Belle's PS2. I think they're hooked up, but I can't honestly remember, because I rarely turn them on or dig out any of the games that go with them. They just sit back there, collecting dust and gradually turning yellow in the sun, like little boxes of electric guilt. I'm almost starting to hate them.

Most people probably have a media backlog of some kind: books they haven't gotten around to reading, movies they haven't had time to watch, music they can't give the attention it might deserve. But I think gamers have it worst of all, for two reasons. First, the length of the average game, especially older games, is a huge obstacle to completion. Second, there's a lot of hassle involved for anything going back more than a generation.

Belle and I are trying to reduce our physical footprint, so having to keep older consoles around "just in case" grates, but emulation's a mixed bag even when it works. Worse, I have a really difficult time tossing old games that I haven't finished: how could I get rid of Virtual On, Chu Chu Rocket, or Yoshi's Island? Those are classics! I'm also prone to imagine unlikely scenarios during which I'll finish a game or two--my favorite is probably "oh, I'll play that when I get sick one day" as if I were in grade school, a plan that ignores the fact that I'm basically a workaholic. If I'm sick enough to stay home, I'm probably too ill to do anything but lie in bed and moan incoherently.

Having realized that I have a problem, one solution is simply to attack it strategically--if I can only decide what that strategy would be. Should I work backward, from newest to oldest? Or start from the SuperNES and go forward through each platform, gradually qualifying each one for storage? Clearly, the "play at random" approach is not narrowing my collection with any great success.

There is, however, another option, and ultimately it's probably for the better: to simply accept that the backlog is not a moral duty. I don't have to play everything. I think gaming culture is very bad about this: the fact that many gamers grew up with certain titles lends them a nostalgic credibility that they probably don't entirely deserve. And frankly, if the titles I'm considering were that compelling, I wouldn't have to force myself to go back and play them.

I'm hardly the only gamer I know caught between practicality and sentiment. The one plan that would unify both would be digital distribution on a neutral platform--the current XBox and Wii emulations fall far short of this, since they just lock my classic games to a slightly newer console. I'd love to see a kind of "recycling" program, where I could return old cartridges in exchange for discounts on legitimate ports or emulations on a service like Steam or Impulse. After all, even without the trade-in value, I sometimes buy Steam copies of games I already own just because I know they'll then be available to me, forever, without taking up any physical space.

Game publishers probably won't go for that plan. I can hardly blame them: the remake business, just as with the "high-def remaster" film business, is no doubt a profit machine for them. But I don't think it'll last forever. Just as I buy fewer movies these days, since I'd rather stream from Netflix or rent from Amazon Digital, the writing is probably on the wall for buying software in boxes. That won't eliminate the backlog--but it'll certainly clear up the space behind my TV set.

Our cat will be thrilled.

December 8, 2010

Filed under: gaming»perspective


One of Belle's favorite hobbies is to take a personality test (such as the Myers-Briggs) once every couple of months. She makes me take the same test, and then she reads our results aloud. The description for her type never explicitly says "finds personality test results comforting," but it probably should. I'm skeptical of the whole thing, frankly, but then someone with my personality type would be.

I found myself thinking about profiles after having a conversation with a friend about the appeal of Diablo (or lack thereof). I understand the theory behind the Diablo formula--combining the random reward schedule of an MMO with a sense of punctuated but constant improvement--but games based on this structure (Torchlight, Borderlands) leave me almost entirely unmoved.

For better or worse, game design increasingly leverages psychological trickery to keep players interested. I think Jonathan Blow convincingly argues that this kind of manipulation is ethically suspect, and that it displays a lack of respect for the player as a human being. But perhaps it's also an explanation for why Diablo doesn't click for me while other people obsess over it: we've got different personality profiles.

I think a Myers-Briggs-style profile for game design is kind of a funny idea. So as a thought exercise, here's a quick list I threw together of personality types, focused mainly on psychological exploits common in game design. I figure most people--and most games--have a mix of these, just in larger or smaller proportions. Some of them may even overlap a little.

  • Completionist: Enjoys the feeling of finishing all items in a list of goals. Probably has sub-categories, depending on the type of task required (story-related, achievement-based, simple collection).
    Prototypical games: Pokemon, Crackdown, Donkey Kong Country
  • Storyteller: Enjoys the creation of emergent stories, particularly in sandbox-type games. These can range from actual narratives, to the construction of Rube Goldberg-like scenarios within the rules and physics of the game world, or simply games that offer "great moments" during regular play.
    Prototypical games: The Sims, Minecraft, Deus Ex
  • Audience: Enjoys playing through a linear story. Would, if all else were equal, be just as happy watching a really good movie.
    Prototypical games: JRPGs, Metal Gear Solid, Resident Evil, adventure games
  • Explorer: Enjoys finding new locations and entities inside the game world. Prefers expanse and novelty to either realism or deep-but-restricted scenarios.
    Prototypical games: Metroid, Castlevania, Fallout
  • Grinder: Enjoys the process of slowly improving an avatar, either by levelling up or obtaining new equipment, or both. Becomes invested in the game as a long-term product of effort, creating an artifact that's a source of pride.
    Prototypical games: World of Warcraft, Diablo, Borderlands
  • Mechanic: Enjoys figuring out, then mastering, the underlying gameplay system, even if that limits the overall scope of the game. Prefers rulesets that are predictable, and "levelling the player" over increasing an avatar's stats.
    Prototypical games: Street Fighter, Team Fortress 2, Legend of Zelda
  • Munchkin: Like a mechanic, but with an emphasis on learning how to break/exploit the system. Ranges from people who read David Sirlin's "Playing to Win" and loved every word, to people who just like turning on cheat codes in GTA and seeing how far they can get in the tank.
    Prototypical games: sandbox games, broken games, broken sandbox games, Marvel Vs. Capcom
  • Competitor: Enjoys being ranked against other players, either AI or human. Likes the interplay of competition and cooperation, and prefers "winning" to simply "finishing."
    Prototypical games: Halo, Geometry Wars, Defcon
  • Partier: Enjoys playing with other players, particularly in single-couch co-op. More interested in an enjoyable play session than "winning" the game.
    Prototypical games: Rock Band, Mario Party, Wii Sports
  • Thinker: Enjoys making comprehensive strategic decisions, often at a slower pace. Not necessarily a wargamer, but often is.
    Prototypical games: Advance Wars, Defense Grid, Age of Empires
  • Buttonmasher: Enjoys reflex-based games that offer a lot of rapid stimulation. Not necessarily a shooter fan, but often is.
    Prototypical games: Ikaruga, Super Mario, Demon's Souls

There's probably a good way to simplify these, or sort them into a series of binaries or groups, if you wanted to make it more like a legitimate personality quiz. Still, looking over this list, I do feel like it's better at describing my own tastes than a simple list of genres. I think I rank high for Audience, Mechanic, and Buttonmasher, and low for Storyteller, Completionist, and Grinder--makes sense for someone who loves story-driven FPS and action-RPGs, but generally dislikes open-world games and dungeon crawlers.

Such a list certainly helps to describe how I approach any given title: concentrating more on getting through the narrative and learning the quirks of the system, less on grabbing all the achievements or experimenting with the environment. I almost wish reviewers ranked themselves on a system like this--it'd make it a lot easier to sort out whether my priorities sync with theirs.

In general, I agree with Blow: the move toward psychological manipulation as a part of game design is at best something to be approached with great caution. At worst, it's actually dangerous--leading to the kinds of con-artistry and unhealthy addiction seen in Farmville and (to a lesser extent) WoW. I don't think we can eliminate these techniques entirely, because they're part of what makes gaming unique and potentially powerful. But it would probably be a good idea to understand them better, and package them in a way that people can easily learn to be aware of them, similar to the ways that we teach kids about advertising appeals now. After all, as other sectors adopt "gamification," industry-standard psychological manipulation is only going to get more widespread.

December 9, 2009

Filed under: gaming»perspective


I'm thrilled, personally, to see actual actors doing voice and motion work for video games, after years of Resident Evil-style butchery. Not to mention that it's nice to see Sam Witwer (Crashdown from BSG) getting work as the Apprentice in Star Wars: The Force Unleashed, or Kristen Bell (Veronica Mars) taking a bit part for the Assassin's Creed games. But friends, I have to say: the weird, digital versions of these actors used onscreen are freaking me out.

We have truly reached the point of the uncanny valley in terms of real-time 3D, which is kind of impressive if you think about it. Or horrifying, if you try to play these games, and are interrupted at regular intervals by dialog from cartilage-lipped, empty-eyed mannequins. It's actually made worse by the fact that you know how these actors are supposed to look, giving rise to macabre, Lecter-esque theories to explain the discrepancies between their real-life and virtual appearance. Don't get me wrong--I'm glad that we've reached the point that such a high level of technical power is available. I'm just thinking it would be nice to be more selective about how it's used.

The problem reminds me of movie special effects after computer graphics really hit their stride--say, around the time I was in high school, and George Lucas decided to muck around with the look of the original Star Wars trilogy, perhaps concerned that they lacked the shiny, disjointed feel of the prequels. In one scene, for example, he added a computer-generated Jabba the Hutt getting sassed by Han Solo, even though it contributed nothing to the film apart from a sense of floaty unreality.

The thing is, there wasn't anything wrong with the original effects in Star Wars. They've held up surprisingly well--better than Lucas's CG replacements. The same goes for films like Star Trek II or Alien or The Thing. Even though the effects aren't exactly what we'd call "realistic," they don't kill the suspension of disbelief--and they're surprisingly charming, in a way that today's effortless CG creations are not. Scale models and people in rubber suits have a weight to them that I, personally, miss greatly (the most recent Indiana Jones movie comes to mind). When the old techniques are used--Tarantino's Death Proof, for example, or Guillermo Del Toro's creaturescapes--the results have an urgency and honesty that's refreshing.

Back in videogameland, it amazes me that no one looks at their cutscenes during development and asks themselves "is there a better way? Is the newest really the best?" At one point, right when CD-ROM became mainstream, it looked like composite video with real human actors might be the future, a la Wing Commander. Somehow, it didn't happen (fear of Mark Hamill, maybe? Psychological scarring from Sewer Shark?). But when you're watching robot Kristen Bell shudder through a cutscene in Assassin's Creed, it's hard not to wish that you could just watch the real Bell, even through a cheesy green screen.

Or, at the very least, it'd be nice if more developers would try alternatives instead of pushing ahead with a character look that they're just not pulling off. I have had harsh words for Mirror's Edge--I believe I compared it to a flammable kitchen appliance--but the developers' decision to create animated interstitial movies instead of realtime rendering was a bold and interesting choice, particularly since the game actually boasted very well-crafted and animated character models. In-engine cutscenes may have been a great bullet-point when we made the transition to hardware-based 3D, but now that novelty has passed. We've worked for years to get to the uncanny valley: it's time to find a way out.

August 13, 2009

Filed under: gaming»perspective

The Future is Non-upgradeable

This was a submission for the 2009 Call for Writers at Gamers With Jobs. It wasn't a good match for them, but I still enjoy parts of it, so I'm posting it here.

The PC has been killed so many times, they put a revolving door on the coffin. It's been declared deceased so often, the death certificate has scorch marks from the copier. The tech community has buried it so deep, Australia's complaining about the tunnelling's environmental damage. And yet, somehow, it's still around. As someone who has long identified with the PC, I find this strangely comforting. When they start writing editorials about the great shape the industry's in, I'll start worrying.

Of course, the constant hum about its mortality masks the traditional role of the PC as a weathervane for gaming elsewhere. For example, Sean Sands' "Don't call it a comeback" points out that big-budget titles may make up a smaller portion of the platform's future library--what's that but XBLA and WiiWare writ large? Likewise, the shift to digital distribution over Steam, Impulse, and other services puts the PC at the forefront of an industry-wide trend away from physical media. It's the open nature of PC development--its generativity, as author Jonathan Zittrain would say--that lets it lead the pack this way.

But in another sense, PC gaming is changing on a more basic, demographic level. The platform itself is evolving, and gaming will have to evolve with it. I'm referring, of course, to the gradual rise of the laptop and netbook as computing platforms. Like it or not, both of these hardware configurations are increasingly common, carry serious implications for developers, and have already begun to influence this corner of the industry.

In 2007, desktop sales dropped by 4%, while laptop sales rose by 21%. The gap has no doubt widened since then--indeed, laptops outsold desktops in 2008 for the first time ever. Those kinds of numbers aren't broken out by profession or use, unfortunately, so we don't know how many gamers specifically have moved to portable. But it's not outrageous to assume that the trend holds for the general gaming population, particularly as we gamers get older and want a computer to pull double-duty for work and play.

When the time came to replace my own aging, hand-built tower PC, I ended up going with a Lenovo Thinkpad. It has a discrete graphics card, putting it roughly in the midrange of portable rendering power between the poor suckers with integrated Intel chips and those city-block-sized, SLI-capable monsters from Asus and Alienware. For a two-year-old business notebook, the Thinkpad is pretty good at gaming: it runs Half-Life 2 and Team Fortress 2 acceptably--if not extravagantly--well, which was my priority when I bought it. Fallout 3 also plays well enough that I can't complain about the graphics (the controls, on the other hand...). Not everything fares so well: Crysis was a slideshow (literally: "What I Did On My Summer Vacation--Visited Beautiful North Korea, Fought Aliens, and Choked People With My Terrifying Robot Fetish Suit"). But then, people with small nuclear reactors under their desks had trouble running Crysis when it first came out. I don't let it get me down.

The salient point is not that the hardware's a bit behind the cutting edge. It's that, as a laptop, it's mostly not upgradeable--at least, not in the parts that really count for 3D rendering, like the graphics card. My laptop will never run so-called AAA titles, no matter what I do. This doesn't mean that I've stopped gaming, or buying games. But it does mean that my purchasing dollars tend to go to companies that will support a somewhat more lenient range of hardware when it comes to their software. I end up buying from companies like Stardock or Valve--developers that still target graphics cards from a few generations back. Or I've found a new interest in the indie scene--games like World of Goo or Introversion's back catalog. When all else fails, I'm catching up on older titles I never played, like the original Fallout games. In a way, the laptop hardware lag has been a gift.

The PC gaming industry's become accustomed to being the top performer in the rendering game for a while now, and that will no doubt continue among the niche of hardcore enthusiasts. But anyone who wants to actually be profitable in this space would do well to keep us laptop gamers in mind. You think World of Warcraft's success isn't at least partly due to its generous system requirements? The shift might even be a blessing in disguise: notebooks, while still diverse compared to console hardware, are notably more standardized than desktop systems--a complaint that developers have had against the platform for years.

It will be painful for a while, as publishers and developers adjust to the new reality of notebook gaming, but ultimately I think we'll be better for it. From constraint often comes inspiration. That's true in other media, and I think it will be true in gaming as well. So feel free to cheer for the end of PC gaming--after all, it's not going anywhere.

July 16, 2009

Filed under: gaming»perspective


"You are crippled," says Fallout 3.

"Huh," says I. "That seems a little tone-deaf."

When you get shot in the leg, or you fall off a building, or a mole rat eats your hand (or all of the above) in Fallout 3, the damage gets broken out into one of six body zones. Take enough damage, and there's an appropriate penalty (loss of accuracy, slower movement, etc.) as well as an amusing pained expression on your in-game HUD. I'm okay with all that. But the word "crippled" took me aback. I'm not terribly savvy when it comes to persons with disabilities, but it strikes me as a particularly loaded term--I certainly would have avoided it when I was writing for the World Bank.

I'm not trying to cast Bethesda as insensitive bigots. The terminology (used in the classic RPG tradition of Capitalized Status Conditions like Confuse, Sleep, or Haste) appears to be a holdover from the older Fallout titles. They're using it as an adjective, not as a noun ("You are crippled" instead of "You are a cripple"), which makes some difference. And it's not like they did something outrageously stupid, like putting Africans in grass skirts and wooden masks or something. Still, while they were updating everything else about those previous games--putting them on a new engine and all--they must have thought about the language they were using. I wonder why they decided "crippled" was the best choice.

It's not even, from a writer's perspective, a particularly flavorful word. A thesaurus search finds several alternatives with more punch, including "wrecked," "maimed," and "mangled" ("vitiate" is also amusing, but probably too obscure). Individual terms for specific injuries would be even better: "You are hobbled." "You are concussed." "You are defenestrated." And here I've always taken such good care of my fenestrates.

In any case, it's easier to nitpick someone else's hard work than it is to figure out what it means, and I'm still not sure how I feel about Fallout's status condition. Is it significant? How does it relate to the game's subject, as well as its underlying mechanics? Can it tell us something about portrayals of disability and normality in media? Or is "crippled" just a writing decision that rubs me the wrong way?

June 23, 2009

Filed under: gaming»perspective

Rocked Out

Rock Band was actually the reason that we bought the Xbox. Belle and I have a soft spot for gimmicky party attractions. Somehow, we forgot that we also have a neurotic, overprotective pit bull mutt. They don't really mix, and we kept putting off our plans. This weekend, we finally bit the bullet, boarded the dog, and brought the noise.

Watching people play for the first time, particularly people who are not A) incredibly extroverted or B) experienced gamers, was interesting. They were usually put on the drums, under the reasonable logic that hitting things is fun, and everyone was pretty much on Easy, because failing a song is not fun (the primacy of fun may be a debated topic in design circles, but when people are drinking it's not really optional). When a song first started, the newbie would wear an expression of utter panic--hitting the pad too late, bewildered by the number of notes coming in, only using one stick--and then, all of a sudden, there'd be this ah-hah! moment and they'd get it.

That jump from dread to drumming is so quick, in fact, that I've been trying to figure out the cause in the couple of days since. My best guess is that it comes from the realization that you're not just hitting buttons when they cross the bottom of the screen, but playing in time with the music--the onscreen action is actually kind of a miscue. Once new players make that conceptual leap, the rest is a cakewalk. Which raises the question: the "highway o' notes" approach has become so standard that experienced gamers don't question it, but could it be the weakest part of the modern rhythm game? How else could we visualize a musical score without resorting to actual notation?
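The miscue is easy to see if you sketch out the math behind a note highway: the scrolling visual is just a linear function of the audio clock, while the actual judgment is a timing window around the note's beat. Here's a toy illustration--the function names, scroll speed, and window size are all my own invention, not pulled from any real rhythm game engine:

```python
def note_screen_y(note_time, now, scroll_speed=400.0, strike_y=0.0):
    """Where a note sits on the highway: a pure function of the audio
    clock, built so the note crosses the strike line exactly at note_time."""
    return strike_y + (note_time - now) * scroll_speed

def judge_hit(hit_time, note_time, window=0.100):
    """The real test is audible, not visible: a hit counts only if it
    lands within the timing window (in seconds) around the note's beat."""
    return abs(hit_time - note_time) <= window

# A note due at 2.0s sits 400 pixels up the highway one second early,
# and reaches the strike line right on the beat.
print(note_screen_y(2.0, 1.0))  # 400.0
print(note_screen_y(2.0, 2.0))  # 0.0
print(judge_hit(2.03, 2.0))     # True: close enough to the beat
print(judge_hit(2.25, 2.0))     # False: visually near, musically late
```

The visual is derived from the music, not the other way around--which is exactly why players click once they stop watching and start listening.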

Once they sat down and got the hang of things, I think people enjoyed themselves. But there's certainly a karaoke factor--nobody wants to be the first to act like an idiot in front of everyone. You have to have a few Judas goats get things started with a couple of songs--the cheesier the better--before people will start to jump in. And even so, I think reports of the game's universal appeal may be a little overstated. And that's okay: it's a party, not an enforced Rock Band prison camp.

Not yet, anyway. I'm thinking of training Wallace to be the Fun Enforcer. If he's so set on biting people, we might as well channel it into a useful direction. And snarling madly at the end of a short leash while I shriek "more fun! MORE FUN!" sounds like a good party starter. For me, at least.

May 5, 2009

Filed under: gaming»perspective

Brave New Wargame

John Robb's Brave New War basically confirms a suspicion that I've had for some time now: that so-called "fourth-generation" warfare is really just the military catching up to its nonviolent counterparts. Robb's book serves as a useful summary of 4GW thought, incorporating examples from Iraq and elsewhere. In short, it amounts to the realization that straightforward military conflict--soldiers firing guns directly at other soldiers--is no longer the predominant threat. Instead, Robb says, the goal of "global guerrillas" is to disrupt the enemy economically, psychologically, and logistically. None of this would be a surprise to, say, the Danes under Nazi occupation, the Ruhrkampf resisters of 1923, or the organizers of the American civil rights movement. The violence of the methods listed by Robb may be different, but the underlying philosophy is very similar.

This is kind of satisfying as an advocate for nonviolence, but it's also interesting as a gamer. There's a whole genre of shooters and strategy titles that are based around the ideas of third-generation warfare: get better equipment than the other guy, then go beat the crap out of him. I could be wrong, as I'm not an expert on the strategy/RTS genre, but I can't think of a single popular title that isn't firmly rooted in that idea (tower defense games might come the closest).

Not that I'm saying that shooters should necessarily be following up-to-date strategic doctrine. Or that they should be anything near realistic. I like a good me-against-the-world shooter as much as the next guy. But even if you don't believe that gaming can influence cognitive approaches--and I go back and forth on that point--the lack of progress does seem a shame, for two reasons. First, because 4GW is more interesting: it's about finding weak points and undermining legitimacy, the kind of min-max problem that munchkin-style gamers have salivated over for years. Robb says that knocking out 1% of high-load nodes could make up to 40% of our electrical grid go dark. Can you imagine the GameFAQs entry for that? Or the feeling of accomplishment when it's figured out?
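Robb's point about high-load nodes is the classic scale-free network result: hub-heavy networks shrug off random failures but collapse under targeted attack. A stdlib-only toy sketch makes the mechanic concrete--the graph, the growth rule, and the function names here are mine, purely for illustration, not Robb's model or his numbers:

```python
import random
from collections import defaultdict, deque

def largest_component(nodes, edges):
    """Size of the biggest connected component, via breadth-first search."""
    adj = defaultdict(set)
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    seen, best = set(), 0
    for start in nodes:
        if start in seen:
            continue
        seen.add(start)
        queue, size = deque([start]), 0
        while queue:
            n = queue.popleft()
            size += 1
            for m in adj[n]:
                if m not in seen:
                    seen.add(m)
                    queue.append(m)
        best = max(best, size)
    return best

def attack_hubs(nodes, edges, fraction):
    """Knock out the highest-degree nodes, return what survives."""
    degree = defaultdict(int)
    for a, b in edges:
        degree[a] += 1
        degree[b] += 1
    k = max(1, int(len(nodes) * fraction))
    doomed = set(sorted(nodes, key=lambda n: -degree[n])[:k])
    kept = [n for n in nodes if n not in doomed]
    survivors = [(a, b) for a, b in edges if a not in doomed and b not in doomed]
    return largest_component(kept, survivors)

# Grow a hub-heavy network by preferential attachment: each new node
# links to an endpoint of a random existing edge (rich get richer).
random.seed(1)
nodes = list(range(500))
edges = [(0, 1)]
for n in range(2, 500):
    edges.append((n, random.choice(random.choice(edges))))

print(largest_component(nodes, edges))  # 500: the whole grid is connected
print(attack_hubs(nodes, edges, 0.01))  # giant component after losing 1% of hubs
```

Targeting just the few best-connected nodes fragments the network, while losing the same number of random nodes would barely register--which is the min-max problem hiding inside Robb's statistic.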

Second, it's less violent (and more parallelizable). The violence thing is not just me being squeamish. I can't be the only person to have noticed that as consoles and PCs have gotten more powerful, one of the primary uses for that power is to enhance violence: zone-specific injury, ragdoll physics, more on-screen enemies, bloodsprays, etc. It's kind of morbid, frankly. Surely there's more challenge (and gameplay) in modeling the network of relationships between infrastructure and population--and it might be easier to scale that kind of modeling, in a world where concurrency is the new dominant programming paradigm. Easier on the art team, too.

Of course, if you do believe that games are educational experiences, perhaps this is not the education that we want: how to sabotage a developed society? Creepy. But then, if you believe that, you should already be worried about the lessons that third-generation wargames are teaching. The strategies of current military titles are largely generalizable only to other military applications, and they carry the implicit message that coordinated force is a valid way to resolve international conflict. At the very least, games that address weaknesses in community resilience and redundancy can also be applied to sustainability and our economic situation (at the extremes, the green movement and the paranoid survivalists become strikingly similar), to name just two of the networks that increasingly define our world. More importantly, it's a view of the world that stresses interdependence and complexity over unilateral force. I can't help but see that as a (slight) improvement.
