
October 23, 2009

Filed under: gaming»roundtable

What Sharp Teeth

Tale of Tales' The Path tells an old story: a girl dressed in red walks through the woods to an elderly relative's house. The path through the forest begins at the edge of a paved road, with a large city off in the distance. It ends at a bridge crossing the moat-like lake around the grandmother's cottage. Your choice, as a player, is to either proceed directly to the end of the path, or to wander off into the woods in search of novelty (and, ultimately, The Wolf). In either case, a significant piece of the storytelling and gameplay takes place after the "end" has been reached--the denouement, as Corvus puts it in this month's Round Table.

The Path features a lot of really... interesting gameplay choices, but one that stands out for me is the control scheme. It's the essence of minimalism: the only keys are for turning and movement. To interact with a scene in the forest, you simply stop near it--the girl will move into position and begin the scene, but you can cancel just by choosing to move again. Combined with the translucent, dreamlike fog, the effect is a feeling of inevitability. While the game warns you not to leave the path, the real story only happens when you abandon its purposeful motion for something interrupted and inconsistent--it only advances when you stop.

Where it gets interesting is when you finish the game, either by going straight to the end of the path or by finding the "Wolf" (metaphorically speaking--it's something different for each of the characters, but each time it deposits them outside the cottage in a state of visible pain, limping to shelter from the sudden downpour). At that point, the girl opens the gate, crosses the bridge, and enters the house.

Now we're in the denouement. The game switches to a first-person view, and the controls don't seem to respond. After a few moments, you work it out: pressing any of the movement keys will move a single step along a predetermined path through the house while a wolf growls and barks somewhere out of view. As you tap a key repeatedly, your trip through the house detours into different rooms along the way, depending on the encounters found in the forest, and ends in an unsettling sequence of flashbacks related to each girl's Wolf. (If you didn't find the metaphorical Wolf in the forest, you'll end up in the grandmother's bedroom instead, with a literal beast staring at you from the corner. This is considered failure.)

Although it's tempting to stay in one place in the house and give yourself time to recover, remaining motionless causes the screen to darken and the wolf sounds to become louder and more aggressive--it's extremely unnerving, and I've never actually managed to stand still long enough to find out what happens after that. So now the dynamic has changed, even though the gameplay remains similar: elements from the forest are recontextualized inside the house, but now stopping is a source of dread and movement is... well, not rewarded, exactly. Less uncomfortable, I guess. It also mimics a kind of nightmare logic: no matter what direction you try to go, your viewpoint drifts grimly forward.
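Reduced to pseudocode, the sleight of hand is almost embarrassingly simple. What follows is just my guess at the logic--Tale of Tales' actual code is surely different, and the room names and numbers here are invented--but it captures how little agency the scheme really grants:

    # A guess at the cottage walk's logic, not Tale of Tales' actual code.
    # Every movement key collapses into the same "advance one step" action,
    # and standing still is what gets punished.

    WAYPOINTS = ["hallway", "sitting room", "kitchen", "stairs", "bedroom"]

    def cottage_walk(inputs):
        """inputs: one entry per tick--a key name, or None for standing still."""
        step = 0
        idle_ticks = 0
        for key in inputs:
            if key is not None and step < len(WAYPOINTS):
                # Which key you press is irrelevant: all input maps to a
                # single step along a predetermined path.
                print(f"You drift forward into the {WAYPOINTS[step]}.")
                step += 1
                idle_ticks = 0
            else:
                # Hesitation darkens the screen and angers the wolf.
                idle_ticks += 1
                darkness = min(100, idle_ticks * 10)
                print(f"The growling gets louder... (screen {darkness}% dark)")

    # Mashing different keys produces the same grim, forward drift:
    cottage_walk(["w", "a", None, None, "s", "d", "w"])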

As a game, The Path is a distinct oddity, but I generally like it, and one of the reasons is this two-act, post-'victory' structure it's got going. In a way, the cottage tour is really nothing more than a twisted version of the Mega Happy Ending that concludes most JRPGs and Nintendo games, revisiting each character and location encountered during the game as a form of wrap-up. But Tale of Tales uses a few audio cues and a simple gameplay change to turn a linear cutscene into something a little scary, with a lot more implied agency than actually exists. I'm not entirely sure what it means--I'm not sure I'm supposed to--I only know that the combination of structure and interaction makes for a pretty unforgettable experience.

September 10, 2009

Filed under: gaming»software»rock_band

Beatlemania

OF COURSE we bought Beatles: Rock Band the day it came out. Belle is a full-fledged Beatlemaniac. Her calendar has eight days a week and when I say goodbye, she says hello--it makes our home life confusing, but you can't fault her taste.

The game was obviously created by equally-intense fans, which comes into play in interesting ways. Not being a real Beatles listener myself, a few things leapt out at me:

  • If you lose a song, it just fades to black. Nobody's going to boo the lads from Liverpool off stage in this game.
  • The whammy bar is disabled on guitar and bass. It still affects the note trail, and as far as I can tell it still increases your score, but there's no audible effect. There will be no dive-bombing during "I Am The Walrus," no matter how much you want to.
  • Especially since, in the more production-heavy songs from their studio days, the guitar controller is actually playing strings and synth parts.
  • The songs are arranged chronologically, and have Harmonix's usual care with note placement (for the most part--"Birthday" and "Come Together" are going to bug anyone who's played them on a real instrument, but there's nothing they could do about that). This means it's a great process of discovery for someone who loves the band, but the difficulty level is all over the map. These are easily some of the most difficult Rock Band songs I've seen--not just in the Guitar Hero showers-of-notes sense, but because they're complex songs to begin with. The game provides a helpful "difficulty meter" when choosing a song: pay attention to it. They're not joking around.
  • I love some of the achievements, particularly "Authenticity": "Finish any song as a bassist with Lefty Mode turned on, hitting at least 50% of the notes." They should really make you sing at the same time--Sir Paul is a monster bass player.

We played for about three hours last night, beating every stage but one, and only stopped when my hands started to cramp up. I think Belle likes it.

September 8, 2009

Filed under: gaming»software»arkham

Intensive Treatment

Some three hours into Batman: Arkham Asylum, we are introduced to the Lunatic after someone opens all the cell doors on the island. The Lunatic is a shambling, almost-skeletal enemy dressed in a straitjacket. His attack consists entirely of leaping onto Batman's back and thrashing wildly until thrown to the ground and knocked out with a blow to the head. Because that's therapy, superhero-style: brutally beating the mentally ill senseless with your heavy, armored fists.

I'm not the only person who has found this a little unsettling: Justin Keverne calls it "the intellectual and social equivalent of bumfighting," and Travis Megill follows up with a post discussing the stigmatization of mental illness perpetuated by the game, and recommends using it as a consciousness-raising opportunity. Both make some great points.

One of the things that I like about Batman as a character is how plainly ambiguous he has become. Other superheroes may be able to perpetuate the myth of vigilante justice, but after The Dark Knight Returns (a barely-disguised John Bircher fantasy styled after Red Dawn), The Dark Knight (the film, which bears little resemblance to the comic plot-wise but touches on many of the same themes), and (most importantly) Alan Moore's The Killing Joke, it's difficult to imagine an interpretation of the character that isn't a damaged, near-fascist personality locked in a feedback cycle with equally-psychotic "supervillains." Calling the modern Batman a hero is hilarious.

So while his treatment of the "Lunatic" enemy is unsettling, I could almost believe that it's purposefully so. Likewise the depiction of the asylum itself: while the game never explicitly comes out and says so, this is clearly not an enlightened institution (and never was, as the hidden story items make clear). The inmates are locked into tiny, solitary cells and effectively left to rot. The guards are vicious, unpleasant people, and the doctors are using their patients as experimental subjects. In Killer Croc's case, they've just dumped him into the sewers, dropping rotten meat to feed him. The warden is a political animal more concerned for his career than for those under his care. It's like something out of Nellie Bly's undercover reporting on the Blackwell's Island asylum. No doubt anyone sentenced to Arkham would emerge more damaged than when they entered, and many are sent there as much from a desire to remove the undesirables as to rehabilitate them.

Indeed, one thing I found interesting, particularly while listening to the "interview tapes" scattered throughout the game, is the degree to which several of the inmates are not insane at all. Killer Croc, for example, is violent and dangerous, but he shows no signs of being disconnected from reality: the outside world really does see him as a monster, and Croc merely reacts accordingly. Poison Ivy has entirely valid reasons to identify more with plants than humans--she's half-plant herself. And the Joker, as voiced by Mark Hamill, has never seemed crazy to me--sociopathic, perhaps, but no more so than many mobsters and criminals. It did not surprise me to find out that Paul Dini, the writer for both The Animated Series and Arkham Asylum, has written a story titled "Case Study" that frames the Joker as an entirely sane criminal using a deranged persona to pursue a vendetta against Batman.

Regardless, I have two reactions to the Lunatic. First, as Megill points out, to focus on the individual inmates (such as the Lunatic) or the institution is to overlook the game's overarching view of mental illness, which is firmly rooted in unsubtle stereotypes. In its universe, disorders aren't a continuum of mental function, but a strict sane/insane dichotomy. This isn't necessarily Arkham Asylum's fault--it's derived directly from the comics themselves, which have always treated insanity as a shortcut directly to wearing tights and planning crimes centered on random concepts. ("Calendar Man?" Really?) Of the offenses perpetrated on our pop-cultural psyches by Marvel and DC, I rank this one relatively low on the list, but it's good to see it noticed when it pops up.

Second, the game's unsympathetic portrayal of the asylum itself doesn't really excuse its dehumanized view of the patients, or Batman's enforcement of the status quo (he beats the inmates, but frees the crooked administrators to return to their jobs). It's one thing to say that Bruce Wayne is an anti-hero at best, but another to watch him blithely ignore the conditions around him. This is where Batman sends people, remember, after he's caught them. And it's not like he doesn't know about Arkham's policies: he's on the island enough to have built a fully-equipped Batcave there. Talk about your bad neighborhoods. If that's not indicative of the unhealthy relationship the "Caped Crusader" has with his foes, I don't know what is.

August 21, 2009

Filed under: gaming»software»castlevania

Dis-ordered

I never played the original Castlevania on its original platform in its original era. I only got around to it when they released it on GBA. So I think my opinion's unclouded by nostalgia when I say that, with reservations but in general, I like it.

In the pantheon of retro classics, Castlevania slots in right next to Ninja Gaiden. Both are sidescrollers emphasizing close combat (as opposed to Mario-style hopping), with health bars and a rudimentary power-up system. Castlevania has better secondary weapons. Ninja Gaiden has better level design, and is probably the superior title overall--the flow of its levels is pure 8-bit choreography. Either way, they're simple games. Over the years, Ninja Gaiden has stayed fairly simple. Castlevania has not.

Which brings us to Order of Ecclesia, the most recent side-scrolling title in the series. It's not that OoE is a bad game, so much as it is way more complicated than it needs to be.

I'm giving up on the game about seven levels in, having gotten through the first four bosses or so. I'm doing so because the level design (which is awful, having largely abandoned the intricate "Metroid-vania" style of navigation) has begun throwing in enemies that completely wreck the difficulty curve (specifically, the demonic gravediggers). The options available seem to be either learning an attack pattern that's not particularly enjoyable, or improving my character. Since neither appeals, I'm ditching it.

"Improving my character"--what a fun turn of phrase that is, as if Castlevania were Emily Post and Buddhism mixed together. What it really means is going out and either leveling-up (a long, painful process left over from RPGs that I thought we had largely abandoned in the civilized world) or tediously killing the same enemies over and over again until they drop a more powerful weapon. It's all the worst parts of World of Warcraft, but without a sense of humor!

This complication doesn't have any particular justification for its existence. Its only point is to add a pseudo-cerebral tint to an otherwise fluffy and unredeemable arcade experience, something it has in common with the vapid plots that Konami insists on jamming in there, as if I really cared. "We're not just engaging your reflexes," OoE defensively protests, "we're engaging your mind!" Yeah: because making me constantly interrupt play to struggle through a poorly-designed menu system, all to find the collectible weapon that will harm this particular recycled sprite from the last seven Castlevania titles is certainly a challenge that will stretch my capacity for non-linear thinking, isn't it? Give me a break. These games are basically mental Diet Coke. The least they could do is have the dignity to act like it.

Let's make a deal, video games: you don't make me grind for a frakking sidescroller, and I won't sell you on eBay.

August 13, 2009

Filed under: gaming»perspective

The Future is Non-upgradeable

This was a submission for the 2009 Call for Writers at Gamers With Jobs. It wasn't a good match for them, but I still enjoy parts of it, so I'm posting it here.

The PC has been killed so many times, they put a revolving door on the coffin. It's been declared deceased so often, the death certificate has scorch marks from the copier. The tech community has buried it so deep, Australia's complaining about the tunnelling's environmental damage. And yet, somehow, it's still around. As someone who has long identified with the PC, I find this strangely comforting. When they start writing editorials about the great shape the industry's in, I'll start worrying.

Of course, the constant hum about its mortality masks the traditional role of the PC as a weathervane for gaming elsewhere. For example, Sean Sands' "Don't call it a comeback" points out that big-budget titles may make up a smaller portion of the platform's future library--what's that but XBLA and WiiWare writ large? Likewise, the shift to digital distribution over Steam, Impulse, and other services puts the PC at the forefront of an industry-wide trend away from physical media. It's the open nature of PC development--its generativity, as author Jonathan Zittrain would say--that lets it lead the pack this way.

But in another sense, PC gaming is changing on a more basic, demographic level. The platform itself is evolving, and gaming will have to evolve with it. I'm referring, of course, to the gradual rise of the laptop and netbook as computing platforms. Like it or not, both of these hardware configurations are increasingly common, carry serious implications for developers, and have already begun to influence this corner of the industry.

In 2007, desktop sales dropped by 4%, while laptop sales rose by 21%. The gap has no doubt widened since then--indeed, laptops outsold desktops in 2008 for the first time ever. Those kinds of numbers aren't broken out by profession or use, unfortunately, so we don't know how many gamers specifically have moved to portable. But it's not outrageous to assume the trend holds for the gaming population as well, particularly as we gamers get older and want a computer to pull double-duty for work and play.

When the time came to replace my own aging, hand-built tower PC, I ended up going with a Lenovo Thinkpad. It has a discrete graphics card, putting it roughly in the midrange of portable rendering power between the poor suckers with integrated Intel chips and those city-block-sized, SLI-capable monsters from Asus and Alienware. For a two-year-old business notebook, the Thinkpad is pretty good at gaming: it runs Half-Life 2 and Team Fortress 2 acceptably--if not extravagantly--well, which was my priority when I bought it. Fallout 3 also plays well enough that I can't complain about the graphics (the controls, on the other hand...). Not everything fares so well: Crysis was a slideshow (literally: "What I Did On My Summer Vacation--Visited Beautiful North Korea, Fought Aliens, and Choked People With My Terrifying Robot Fetish Suit"). But then, people with small nuclear reactors under their desks had trouble running Crysis when it first came out. I don't let it get me down.

The salient point is not that the hardware's a bit behind the cutting edge. It's that, as a laptop, it's mostly not upgradeable--at least, not in the parts that really count for 3D rendering, like the graphics card. My laptop will never run so-called AAA titles, no matter what I do. This doesn't mean that I've stopped gaming, or buying games. But it does mean that my purchasing dollars tend to go to companies whose software supports a somewhat more lenient range of hardware--developers like Stardock or Valve, which still target graphics cards from a few generations back. I've also found a new interest in the indie scene--games like World of Goo or Introversion's back catalog. When all else fails, I'm catching up on older titles I never played, like the original Fallout games. In a way, the laptop hardware lag has been a gift.

The PC gaming industry has grown accustomed to being the top performer in the rendering game, and that will no doubt continue among the niche of hardcore enthusiasts. But anyone who wants to actually be profitable in this space would do well to keep us laptop gamers in mind. You think World of Warcraft's success isn't at least partly due to its generous system requirements? The shift might even be a blessing in disguise: notebooks, while still diverse compared to console hardware, are notably more standardized than desktop systems--addressing a complaint that developers have had about the platform for years.

It will be painful for a while, as publishers and developers adjust to the new reality of notebook gaming, but ultimately I think we'll be better for it. From constraint often comes inspiration. That's true in other media, and I think it will be true in gaming as well. So feel free to cheer for the end of PC gaming--after all, it's not going anywhere.

July 16, 2009

Filed under: gaming»perspective

Wordplay

"You are crippled," says Fallout 3.

"Huh," says I. "That seems a little tone-deaf."

When you get shot in the leg, or you fall off a building, or a mole rat eats your hand (or all of the above) in Fallout 3, the damage gets broken out into one of six body zones. Take enough damage, and there's an appropriate penalty (loss of accuracy, slower movement, etc.) as well as an amusing pained expression on your in-game HUD. I'm okay with all that. But the word "crippled" took me aback. I'm not terribly savvy when it comes to persons with disabilities, but it strikes me as a particularly loaded term--I certainly would have avoided it when I was writing for the World Bank.

I'm not trying to cast Bethesda as insensitive bigots. The terminology (used in the classic RPG tradition of Capitalized Status Conditions like Confuse, Sleep, or Haste) appears to be a holdover from the older Fallout titles. They're using it as an adjective, and not as a noun ("You are crippled" instead of "You are a cripple"), which makes some difference. And it's not like they did something outrageously stupid, like putting Africans in grass skirts and wooden masks or something. Still, you'd think that while they were updating everything else about those previous games in the series, putting them on a new engine and everything, they must have thought about the language they were using. I wonder why they thought "crippled" was the best choice.

It's not even, from a writer's perspective, a particularly flavorful word. A thesaurus search finds several alternatives with more punch, including "wrecked," "maimed," and "mangled" ("vitiate" is also amusing, but probably too obscure). Individual terms for specific injuries would be even better: "You are hobbled." "You are concussed." "You are defenestrated." And here I've always taken such good care of my fenestrates.

In any case, it's easier to nitpick someone else's hard work than it is to figure out what it means, and I'm still not sure how I feel about Fallout's status condition. Is it significant? How does it relate to the game's subject, as well as its underlying mechanics? Can it tell us something about portrayals of disability and normality in media? Or is "crippled" just a writing decision that rubs me the wrong way?

July 2, 2009

Filed under: gaming»software»chrono_trigger

Do the Time Warp Again

I have probably started four or five games of Chrono Trigger, across four or five different computers (I didn't own an SNES at the time), and never gotten past the Prehistoric Era segment. So while Square's habit of re-releasing its entire classic catalog every time a new platform reaches critical mass may seem grating and money-grubbing (probably because it is), it is sometimes valuable. The DS port of Chrono Trigger is the first time I stand a chance of finishing it. This is probably because, like a lot of adult gamers, I use portable games as a way of multitasking. It's something I can pull out if a movie or TV show starts to drag but I still want to see the end, as well as a time-killer during the inevitable Metro delays. And while I've always got games loaded on a smartphone of some kind these days, it's rarely as satisfying as the experience on an actual handheld console--not to mention that the DS's battery life is far better. So compared to the emulated versions, I've gotten much farther this time around.

Chrono Trigger is almost fifteen years old now, which is pretty amazing if you think about it. It's held up well. More than that: I'd argue that it's better than most anything Square's put out during the intervening years, on either portable or home console. Mostly this is because it's such a lean design: unlike the excesses the company developed in the 32-bit era, there are no collectible card games (FF8) or watersports (FF10) that you have to learn to navigate, and the battle system is relatively simple. It feels like this left the development team free to concentrate on the worldbuilding: the result is a series of rich, often comical time periods linked to each other by a decidedly quirky kind of causality. Great characters, as well, although I'm not the biggest fan of the art style.

Although the game isn't truly non-linear, it's impressive how well it fakes it. Shuffle the party as much as you want, and they'll all still have appropriate dialog choices (some more appropriate than others, granted). Halfway through, it opens up a whole bunch of sidequests that players can approach in any order. The experience is still basically guided at every step, but in a way that feels empowering and entirely in sync with the time travel theme: at practically any point in the game, players can jump straight to the final boss, although they'll probably get creamed if they haven't done at least a few of the optional missions.

If anything has not aged well about the game, it's the mechanics of the battle system--more specifically, the endlessly frustrating menu options that must be navigated under pressure. At the time, this was how RPGs worked--hell, it's how a lot of them still work today. For a short time, Square seemed to chafe a bit under that convention: FF6 (released a year before Chrono Trigger) supplemented the menus with oddball conventions like Sabin's Street Fighter-esque combos, while Super Mario RPG went to a far more manageable system of assigning different actions to the largely-unused face buttons. Then the PlayStation rolled around, and the company apparently gave up on control innovation and concentrated on putting elaborate CG movies in between boring menus.

In the meantime, it cannot be stressed enough how annoying Chrono Trigger's menus are, especially since by default they let enemies continue to attack while you try to find the right $%!@-ing Dual Tech. I particularly love hunting for a single healing item through a vast inventory list using a tiny little window, during which time monsters have probably managed to kill the character I wanted to heal in the first place. Once upon a time, this was called "adding tension," but looking back on it, it's a lot like trying to solve sudoku while someone shoots you with a BB gun: a synthesis of tedium and tension that I could personally do very much without. The DS port of Chrono Trigger "solves" this problem by making the same menus into big, touch-friendly targets, which utterly fails to help. It may feel like a blow to your hardcore gamer cred, but I'd recommend switching from "Active" to "Wait" mode instead.

All in all, though, Chrono Trigger's an example of Doing It Right. If everything Square had made was this good, as opposed to, say, every Final Fantasy except for 6, it probably wouldn't be so galling to see them regurgitate the whole lot every time a marketable piece of hardware came out. There's even a theoretical justification for their actions: even more than other digital artifacts, console games age badly as the march to new platforms and formats makes them difficult--or even impossible--to play as they should be played, and clearly emulation doesn't always cut it. In theory, I don't begrudge the company for reselling the classics, even if it is just locking them to a newer block of soon-to-be-obsolete hardware. In practice, however, the only thing worse than watching them repackage both the good and the awful is watching all of it sell like hotcakes.

June 23, 2009

Filed under: gaming»perspective

Rocked Out

Rock Band was actually the reason that we bought the Xbox. Belle and I have a soft spot for gimmicky party attractions. Somehow, we forgot that we also have a neurotic, overprotective pit bull mutt. They don't really mix, and we kept putting off our plans. This weekend, we finally bit the bullet, boarded the dog, and brought the noise.

Watching people play for the first time, particularly people who are not A) incredibly extroverted or B) experienced gamers, was interesting. They were usually put on the drums, under the reasonable logic that hitting things is fun, and everyone was pretty much on Easy, because failing a song is not fun (the primacy of fun may be a debated topic in design circles, but when people are drinking it's not really an option). When the song first started, the newbie would wear an expression of utter panic--hitting the pad too late, bewildered by the number of notes coming in, only using one stick--and then, all of a sudden, there'd be this ah-hah! moment and they'd get it.

The speed of that jump from dread to drumming is so quick, in fact, that I've been trying to figure out the cause in the couple of days since. My best guess is that it comes from the realization that you're not just hitting buttons when they cross the bottom of the screen, but that you're playing in time with the music--the onscreen action is actually kind of a miscue. Once new players make that conceptual leap, the rest is a cakewalk. Which raises the question: the "highway o' notes" approach has become so standard that experienced gamers don't question it, but could it be the weakest part of the modern rhythm game? How else could we visualize a musical score without resorting to actual notation?
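To put the "miscue" idea in concrete terms, here's a toy sketch--mine, not Harmonix's actual code, with an invented hit window--of how a rhythm game might judge a hit. The comparison is between your input time and the note's time in the music; the note's position on screen is just a visualization of that clock:

    HIT_WINDOW = 0.1  # seconds of slop on either side of the beat (invented value)

    def judge_hit(input_time, note_times):
        """Return the matched note's time, or None for a miss."""
        for note_time in note_times:
            if abs(input_time - note_time) <= HIT_WINDOW:
                return note_time
        return None

    # The drummer who "gets it" has stopped watching the note highway and
    # started playing along with the song itself:
    notes = [0.0, 0.5, 1.0, 1.5]   # quarter notes at 120 BPM
    print(judge_hit(0.48, notes))  # 0.5 -- within the window, a hit
    print(judge_hit(0.3, notes))   # None -- a miss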

Once they sat down and got the hang of things, I think people enjoyed themselves. But there's certainly a karaoke factor--nobody wants to be the first to act like an idiot in front of everyone. You have to have a few Judas goats get things started with a couple of songs--the cheesier the better--before people will start to jump in. And even so, I think reports of the game's universal appeal may be a little presumptuous. And that's okay: it's a party, not an enforced Rock Band prison camp.

Not yet, anyway. I'm thinking of training Wallace to be the Fun Enforcer. If he's so set on biting people, we might as well channel it into a useful direction. And snarling madly at the end of a short leash while I shriek "more fun! MORE FUN!" sounds like a good party starter. For me, at least.

June 5, 2009

Filed under: gaming»software

Disinterested Parties

Like most people, I tend to write about games when I either hate them or love them. But in keeping with my new year's resolutions, there are also games I've stopped playing because I just can't bring myself to care about them.

  • Disgaea DS: A critical darling, mostly for the writing, which is (unlike most J-RPGs) wildly slapstick and genuinely funny. Unfortunately, it's wrapped around a game that I just don't find terribly interesting, centering as it does on a single tedious mechanic (ganging up on enemies, then leveling up). Has been replaced by: the slapstick humor of getting the cat riled up and watching her ineffectually attempt to maul a dog five times her size.
  • Zeno Clash: Critics love this one too, and I can see why: the art direction and storyline are a stunning, surrealist treat. Being a child of the eighties, it reminds me of the weirder items in the Jim Henson catalog. The game itself seems fine--repetitive, but fine. My main problem is just how sluggish everything feels. Blocking, pulling back for a punch, dodging--there's a maddening lag between pressing the key and actually taking action, and as a result I find myself frustrated as enemies easily work around my defenses. Has been replaced by: forcing Belle to watch classic science fiction movies Terminator 2 and Wrath of Khan, then sluggishly dodging her snarky comments.
  • Uplink: This is the game that apparently gave Introversion the funding and confidence to create Darwinia and DEFCON, so I'm grateful to Uplink, but that doesn't mean I have to like it. Purportedly a game about Hollywood-style "hacking," as far as I can tell it's actually about clicking on menus. And if I wanted to wander aimlessly around a cryptic, mouse-centric interface, I'd run OS X in a virtual machine. Has been replaced by: OS X running in a virtual machine.

May 14, 2009

Filed under: gaming»hardware»failure

Lack of Drive

The Xbox is broken. Again. Nicely done, Microsoft. Just in time for my week off.

At least it's not another Red Ring of Death. In fact, it's something more frustrating: the disc drive has gone bad. Since we probably use it for playing DVDs as much as (or more than) for playing games, it kind of puts a cramp in our entertainment options. The only other DVD player hooked up to the TV is the PS2, which was apparently designed by utter sadists--there's one button on the controller that, for some inexplicable reason, stops the movie instantly. This wouldn't be so bad if it weren't located right next to the button for selecting menu options, or if Sony didn't feel like labeling the controls using cross-region hieroglyphs. Invariably, Belle and I spend fifteen minutes restarting whatever we want to watch after getting the two confused.

Adding insult to injury, the Xbox is less than a month out of warranty, so I'll have to pay for the repair. It's a testament to the quality of the software that I'm actually going to do so, instead of donating my game library to charity and sitting out the rest of this console generation. But I'm giving it one more shot, for three reasons:

  1. It's more expensive to replace it with other devices. The repair is $100, which is what we'd pay for one of those lovely Roku Netflix boxes to get the streaming functionality I enjoy so much. And we'd still need a DVD player that's not reminiscent of an Ikea assembly process, preferably one that upscales. All told, we'd probably have to spend another $70 on that.
  2. We've got an investment in it. I could probably live without a lot of the games, but we're still bullish on Rock Band parties. We may have to drug Wallace heavily, but one day we swear it'll happen.
  3. Getting rid of stuff is a hassle. Giving up the Xbox means finding a home for all the games, the extra hardware, recycling the broken console, etc. Preferably, it means doing so without helping GameStop or some other retailer line their pockets on used merchandise. It makes me tired just thinking about it. As annoying as Microsoft's repair process might be, it's generally a pretty smooth ride. I guess they've had enough practice.

That said, this is strike two, Xbox. Don't think I won't replace you with a lava lamp and a Betamax deck if it happens again.
