Some three hours into Batman: Arkham Asylum, we are introduced to the Lunatic after someone opens all the cell doors on the island. The Lunatic is a shambling, almost-skeletal enemy dressed in a straitjacket. His attack consists entirely of leaping onto Batman's back and thrashing wildly until thrown to the ground and knocked out with a blow to the head. Because that's therapy, superhero-style: brutally beating the mentally ill senseless with your heavy, armored fists.
I'm not the only person who has found this a little unsettling: Justin Keverne calls it "the intellectual and social equivalent of bumfighting," and Travis Megill follows up with a post discussing the stigmatization of mental illness perpetuated by the game, and recommends using it as a consciousness-raising opportunity. Both make some great points.
One of the things that I like about Batman as a character is how openly ambiguous he has become. Other superheroes may be able to perpetuate the myth of vigilante justice, but after The Dark Knight Returns (a barely-disguised John Bircher fantasy styled after Red Dawn), The Dark Knight (the film, which bears little resemblance, plot-wise, to the comic but touches on many of the same themes), and (most importantly) Alan Moore's The Killing Joke, it's difficult to imagine an interpretation of the character that isn't a damaged, near-fascist personality locked in a feedback cycle with equally-psychotic "supervillains." Calling the modern Batman a hero is hilarious.
So while his treatment of the "Lunatic" enemy is unsettling, I could almost believe that it's purposefully so. Likewise the depiction of the asylum itself: while the game never explicitly comes out to say so, this is clearly not an enlightened institution (and never was, as the hidden story items make clear). The inmates are locked into tiny, solitary cells and effectively left to rot. The guards are vicious, unpleasant people, and the doctors are using their patients as experimental subjects. In Killer Croc's case, they've just dumped him into the sewers, dropping rotten meat to feed him. The warden is a political animal more concerned for his career than for those under his care. It's like something out of Nellie Bly's undercover reporting on the Blackwell's Island asylum. No doubt anyone sentenced to Arkham would emerge more damaged than when they entered, and many are sent there as much from a desire to remove the undesirables as to rehabilitate them.
Indeed, one thing I found interesting, particularly while listening to the "interview tapes" scattered throughout the game, is the degree to which several of the inmates are not insane at all. Killer Croc, for example, is violent and dangerous, but he shows no signs of being disconnected from reality: the outside world really does see him as a monster, and Croc merely reacts accordingly. Poison Ivy has entirely valid reasons to identify more with plants than humans--she's half-plant herself. And the Joker, as voiced by Mark Hamill, has never seemed crazy to me--sociopathic, perhaps, but no more so than many mobsters and criminals. It did not surprise me to find out that Paul Dini, the writer for both The Animated Series and Arkham Asylum, has written a story titled "Case Study" that frames the Joker as an entirely sane criminal using a deranged persona to pursue a vendetta against Batman.
Regardless, I have two reactions to the Lunatic. First, as Megill points out, focusing on the individual inmates (such as the Lunatic) or the institution overlooks the game's overarching view of mental illness, which is firmly rooted in unsubtle stereotypes. In its universe, disorders aren't a continuum of mental function, but a strict sane/insane dichotomy. This isn't necessarily Arkham Asylum's fault--it's derived directly from the comics themselves, which have always treated insanity as a shortcut straight to wearing tights and planning crimes centered on random concepts. ("Calendar Man?" Really?) Among the offenses perpetuated in our pop-cultural psyches by Marvel and DC, I rank this one relatively low, but it's good to see it noticed when it pops up.
Second, the game's unsympathetic portrayal of the asylum doesn't really excuse its dehumanized view of the patients, or Batman's enforcement of the status quo (he beats the inmates, but frees the crooked administrators to return to their jobs). It's one thing to say that Bruce Wayne is an anti-hero at best, but another to watch him blithely ignore the conditions around him. This is where Batman sends people, remember, after he's caught them. And it's not like he doesn't know about Arkham's policies: he's on the island often enough to have built a fully-equipped Batcave there. Talk about your bad neighborhoods. If that's not indicative of the unhealthy relationship the "Caped Crusader" has with his foes, I don't know what is.
I never played the original Castlevania on its original platform in its original era. I only got around to it when they released it on GBA. So I think my opinion's unclouded by nostalgia when I say that, with reservations but in general, I like it.
In the pantheon of retro classics, Castlevania slots in right next to Ninja Gaiden. Both are sidescrollers emphasizing close combat (as opposed to Mario-style hopping), with health bars and a rudimentary power-up system. Castlevania has better secondary weapons. Ninja Gaiden has better level design, and is probably the superior title overall--the flow of its levels is pure 8-bit choreography. Either way, they're simple games. Over the years, Ninja Gaiden has stayed fairly simple. Castlevania has not.
Which brings us to Order of Ecclesia, the most recent side-scrolling title in the series. It's not that OoE is a bad game, so much as it is way more complicated than it needs to be.
I'm giving up on the game about seven levels in, having gotten through the first four bosses or so. I'm doing so because the level design (which is awful, having largely abandoned the intricate "Metroid-vania" style of navigation) has begun throwing in enemies that completely wreck the difficulty curve (specifically, the demonic gravediggers). The options available seem to be either learning an attack pattern that's not particularly enjoyable, or improving my character. Since neither appeals, I'm ditching it.
"Improving my character"--what a fun turn of phrase that is, as if Castlevania were Emily Post and Buddhism mixed together. What it really means is going out and either leveling-up (a long, painful process left over from RPGs that I thought we had largely abandoned in the civilized world) or tediously killing the same enemies over and over again until they drop a more powerful weapon. It's all the worst parts of World of Warcraft, but without a sense of humor!
This complication doesn't have any particular justification for its existence. Its only point is to add a pseudo-cerebral tint to an otherwise fluffy and unredeemable arcade experience, something it has in common with the vapid plots that Konami insists on jamming in there, as if I really cared. "We're not just engaging your reflexes," OoE defensively protests, "we're engaging your mind!" Yeah: because making me constantly interrupt play to struggle through a poorly-designed menu system, all to find the collectible weapon that will harm this particular recycled sprite from the last seven Castlevania titles is certainly a challenge that will stretch my capacity for non-linear thinking, isn't it? Give me a break. These games are basically mental Diet Coke. The least they could do is have the dignity to act like it.
Let's make a deal, video games: you don't make me grind for a frakking sidescroller, and I won't sell you on eBay.
This was a submission for the 2009 Call for Writers at Gamers With Jobs. It wasn't a good match for them, but I still enjoy parts of it, so I'm posting it here.
The PC has been killed so many times, they put a revolving door on the coffin. It's been declared deceased so often, the death certificate has scorch marks from the copier. The tech community has buried it so deep, Australia's complaining about the tunnelling's environmental damage. And yet, somehow, it's still around. As someone who has long identified with the PC, I find this strangely comforting. When they start writing editorials about the great shape the industry's in, I'll start worrying.
Of course, the constant hum about its mortality masks the traditional role of the PC as a weathervane for gaming elsewhere. For example, Sean Sands's "Don't call it a comeback" points out that big-budget titles may make up a smaller portion of the platform's future library--what's that but XBLA and WiiWare writ large? Likewise, the shift to digital distribution over Steam, Impulse, and other services puts the PC at the forefront of an industry-wide trend away from physical media. It's the open nature of PC development--its generativity, as author Jonathan Zittrain would say--that lets it lead the pack this way.
But in another sense, PC gaming is changing on a more basic, demographic level. The platform itself is evolving, and gaming will have to evolve with it. I'm referring, of course, to the gradual rise of the laptop and netbook as computing platforms. Like it or not, both of these hardware configurations are increasingly common, carry serious implications for developers, and have already begun to influence this corner of the industry.
In 2007, desktop sales dropped by 4%, while laptop sales rose by 21%. The gap has no doubt widened since then, due in part to the rise of the netbook niche--indeed, in 2008 laptops outsold desktops for the first time ever. Those kinds of numbers aren't broken out by profession or use, unfortunately, so we don't know how many gamers specifically have gone portable. But it's not outrageous to assume the trend holds for the gaming population too, particularly as we gamers get older and want a computer to pull double duty for work and play.
When the time came to replace my own aging, hand-built tower PC, I ended up going with a Lenovo Thinkpad. It has a discrete graphics card, putting it roughly in the midrange of portable rendering power, between the poor suckers with integrated Intel chips and those city-block-sized, SLI-capable monsters from Asus and Alienware. For a two-year-old business notebook, the Thinkpad is pretty good at gaming: it runs Half-Life 2 and Team Fortress 2 acceptably--if not extravagantly--well, which was my priority when I bought it. Fallout 3 also plays well enough that I can't complain about the graphics (the controls, on the other hand...). Not everything fares so well: Crysis was a slideshow (literally: "What I Did On My Summer Vacation--Visited Beautiful North Korea, Fought Aliens, and Choked People With My Terrifying Robot Fetish Suit"). But then, people with small nuclear reactors under their desks had trouble running Crysis when it first came out. I don't let it get me down.
The salient point is not that the hardware's a bit behind the cutting edge. It's that, as a laptop, it's mostly not upgradeable--at least, not in the parts that really count for 3D rendering, like the graphics card. My laptop will never run so-called AAA titles, no matter what I do. This doesn't mean that I've stopped gaming, or buying games. But it does mean that my purchasing dollars tend to go to companies that will support a somewhat more lenient range of hardware when it comes to their software. I end up buying from companies like Stardock or Valve--developers that still target graphics cards from a few generations back. Or I've found a new interest in the indie scene--games like World of Goo or Introversion's back catalog. When all else fails, I'm catching up on older titles I never played, like the original Fallout games. In a way, the laptop hardware lag has been a gift.
The PC gaming industry has grown accustomed to being the top performer in the rendering game, and that will no doubt continue among the niche of hardcore enthusiasts. But anyone who wants to actually be profitable in this space would do well to keep us laptop gamers in mind. You think World of Warcraft's success isn't at least partly due to its generous system requirements? The shift might even be a blessing in disguise: notebooks, while still diverse compared to console hardware, are notably more standardized than desktop systems--fragmentation being a complaint developers have had against the platform for years.
It will be painful for a while, as publishers and developers adjust to the new reality of notebook gaming, but ultimately I think we'll be better for it. From constraint often comes inspiration. That's true in other media, and I think it will be true in gaming as well. So feel free to cheer for the end of PC gaming--after all, it's not going anywhere.
"You are crippled," says Fallout 3.
"Huh," says I. "That seems a little tone-deaf."
When you get shot in the leg, or you fall off a building, or a mole rat eats your hand (or all of the above) in Fallout 3, the damage gets broken out into one of six body zones. Take enough damage, and there's an appropriate penalty (loss of accuracy, slower movement, etc.) as well as an amusing pained expression on your in-game HUD. I'm okay with all that. But the word "crippled" took me aback. I'm not terribly savvy when it comes to persons with disabilities, but it strikes me as a particularly loaded term--I certainly would have avoided it when I was writing for the World Bank.
I'm not trying to cast Bethesda as insensitive bigots. The terminology (used in the classic RPG tradition of Capitalized Status Conditions like Confuse, Sleep, or Haste) appears to be a holdover from the older Fallout titles. They're using it as an adjective, not a noun ("You are crippled" instead of "You are a cripple"), which makes some difference. And it's not like they did something outrageously stupid, like putting Africans in grass skirts and wooden masks or something. Still, you'd think that while they were updating everything else about those previous games in the series, putting them on a new engine and everything, they must have thought about the language they were using. I wonder why they thought "crippled" was the best choice.
It's not even, from a writer's perspective, a particularly flavorful word. A thesaurus search finds several alternatives with more punch, including "wrecked," "maimed," and "mangled" ("vitiated" is also amusing, but probably too obscure). Individual terms for specific injuries would be even better: "You are hobbled." "You are concussed." "You are defenestrated." And here I've always taken such good care of my fenestrates.
In any case, it's easier to nitpick someone else's hard work than it is to figure out what it means, and I'm still not sure how I feel about Fallout's status condition. Is it significant? How does it relate to the game's subject, as well as its underlying mechanics? Can it tell us something about portrayals of disability and normality in media? Or is "crippled" just a writing decision that rubs me the wrong way?
I have probably started four or five games of Chrono Trigger, across four or five different computers (I didn't own an SNES at the time), and never gotten past the Prehistoric Era segment. So while Square's habit of re-releasing its entire classic catalog every time a new platform reaches critical mass may seem grating and money-grubbing (probably because it is), it is sometimes valuable. The DS port of Chrono Trigger is the first time I stand a chance of finishing it. This is probably because, like a lot of adult gamers, I use portable games as a way of multitasking. It's something I can pull out if a movie or TV show starts to drag but I still want to see the end, as well as a time-killer during the inevitable Metro delays. And while I've always got games loaded on a smartphone of some kind these days, it's rarely as satisfying as the experience on an actual console--not to mention that the battery life is far better. So compared to the emulated versions, I've gotten much farther this time around.
Chrono Trigger is almost fifteen years old now, which is pretty amazing if you think about it. It's held up well. More than that: I'd argue that it's better than most anything Square's put out during the intervening years, on either portable or home console. Mostly this is because it's such a lean design: unlike the excesses the company developed in the 32-bit era, there are no collectible card games (FF8) or watersports (FF10) that you have to learn to navigate, and the battle system is relatively simple. It feels like this left the development team free to concentrate on the worldbuilding: the result is a series of rich, often comical time periods linked to each other by a decidedly quirky kind of causality. Great characters, as well, although I'm not the biggest fan of the art style.
The game isn't truly non-linear, but it's impressive how well it fakes it. Shuffle the party as much as you want; they'll all still have appropriate dialog choices (some more appropriate than others, granted). Halfway through, it opens up a whole bunch of sidequests that players can approach in any order. The experience is still basically guided at every step, but in a way that feels empowering and entirely in sync with the time-travel theme: at practically any point in the game, players can jump straight to the final boss, although they'll probably get creamed if they haven't done at least a few of the optional missions.
If anything about the game has not aged well, it's the mechanics of the battle system--more specifically, the endlessly frustrating menus that must be navigated under pressure. At the time, this was how RPGs worked--hell, it's how a lot of them still work today. For a short time, Square seemed to chafe a bit under that convention: FF6 (released a year before Chrono Trigger) supplemented the menus with oddball conventions like Sabin's Street Fighter-esque combos, while Super Mario RPG went to a far more manageable system of assigning different actions to the largely-unused face buttons. Then the PlayStation rolled around, and the company apparently gave up on control innovation and concentrated on putting elaborate CG movies in between boring menus.
In the meantime, it cannot be stressed enough how annoying Chrono Trigger's menus are, especially since by default they let enemies continue to attack while you try to find the right $%!@-ing Dual Tech. I particularly love hunting for a single healing item through a vast inventory list using a tiny little window, during which time monsters have probably managed to kill the character I wanted to heal in the first place. Once upon a time, this was called "adding tension," but looking back on it, it's a lot like trying to solve sudoku while someone shoots you with a BB gun: a synthesis of tedium and tension that I could personally do very much without. The DS port of Chrono Trigger "solves" this problem by making the same menus into big, touch-friendly targets, which utterly fails to help. It may feel like a blow to your hardcore gamer cred, but I'd recommend switching from "Active" to "Wait" mode instead.
All in all, though, Chrono Trigger's an example of Doing It Right. If everything Square had made was this good, as opposed to, say, every Final Fantasy except for 6, it probably wouldn't be so galling to see them regurgitate the whole lot every time a marketable piece of hardware came out. There's even a theoretical justification for their actions: even more than other digital artifacts, console games age badly, as the march to new platforms and formats makes them difficult--or even impossible--to play as they should be played, and clearly emulation doesn't always cut it. In theory, I don't begrudge the company reselling the classics, even if it is just locking them to a newer block of soon-to-be-obsolete hardware. In practice, however, the only thing worse than watching them repackage both the good and the awful is watching all of it sell like hotcakes.
Rock Band was actually the reason that we bought the Xbox. Belle and I have a soft spot for gimmicky party attractions. Somehow, we forgot that we also have a neurotic, overprotective pit bull mutt. They don't really mix, and we kept putting off our plans. This weekend, we finally bit the bullet, boarded the dog, and brought the noise.
Watching people play for the first time, particularly people who are not A) incredibly extroverted or B) experienced gamers, was interesting. They were usually put on the drums, under the reasonable logic that hitting things is fun, and everyone was pretty much on Easy, because failing a song is not fun (the primacy of fun may be a debated topic in design circles, but when people are drinking it's not really optional). When a song first started, the newbie would wear an expression of utter panic--hitting the pad too late, bewildered by the number of notes coming in, only using one stick--and then, all of a sudden, there'd be this ah-hah! moment and they'd get it.
The speed of that jump from dread to drumming is so quick, in fact, that I've spent the couple of days since trying to figure out the cause. My best guess is that it comes from the realization that you're not just hitting buttons when they cross the bottom of the screen, but playing in time with the music--the onscreen action is actually kind of a miscue. Once new players make that conceptual leap, the rest is a cakewalk. Which raises the question: the "highway o' notes" approach has become so standard that experienced gamers don't question it, but could it be the weakest part of the modern rhythm game? How else could we visualize a musical score without resorting to actual notation?
Once they sat down and got the hang of things, I think people enjoyed themselves. But there's certainly a karaoke factor--nobody wants to be the first to act like an idiot in front of everyone. You have to have a few Judas goats get things started with a couple of songs--the cheesier the better--before people will start to jump in. And even so, I think reports of the game's universal appeal may be a little presumptuous. And that's okay: it's a party, not an enforced Rock Band prison camp.
Not yet, anyway. I'm thinking of training Wallace to be the Fun Enforcer. If he's so set on biting people, we might as well channel it into a useful direction. And snarling madly at the end of a short leash while I shriek "more fun! MORE FUN!" sounds like a good party starter. For me, at least.
Like most people, I tend to write about games when I either hate them or love them. But in keeping with my new year's resolutions, there are also games I've stopped playing because I just can't bring myself to care about them.
The Xbox is broken. Again. Nicely done, Microsoft. Just in time for my week off.
At least it's not another Red Ring of Death. In fact, it's something more frustrating: the disc drive has gone bad. Since we probably use it for playing DVDs as much as (or more than) for playing games, it puts a crimp in our entertainment options. The only other DVD player hooked up to the TV is the PS2, which was apparently designed by utter sadists--there's one button on the controller that, for some inexplicable reason, stops the movie instantly. This wouldn't be so bad if it weren't located right next to the button for selecting menu options, or if Sony didn't feel like labeling the controls using cross-region hieroglyphs. Invariably, Belle and I spend fifteen minutes restarting whatever we want to watch after getting the two confused.
Adding insult to injury, the Xbox is less than a month out of warranty, so I'll have to pay for the repair. It's a testimony to the quality of the software that I'm actually going to do so, instead of donating my game library to charity and sitting out the rest of this console generation. But for three reasons, I'm giving it one more shot:
That said, this is strike two, Xbox. Don't think I won't replace you with a lava lamp and a Betamax deck if it happens again.
John Robb's Brave New War basically confirms a suspicion that I've had for some time now: that so-called "fourth-generation" warfare is really just the military catching up to its nonviolent counterparts. Robb's book serves as a useful summary of 4GW thought, incorporating examples from Iraq and elsewhere. In short, it amounts to the realization that straightforward military conflict--soldiers firing guns directly at other soldiers--is no longer the predominant threat. Instead, Robb says, the goal of "global guerrillas" is to disrupt the enemy economically, psychologically, and logistically. None of this would be a surprise to, say, the Danish under Nazi rule, or the organizers of the Ruhrkampf in 1923, or the organizers of the American civil rights movement. The violence of the methods listed by Robb may be different, but the underlying philosophy is very similar.
This is kind of satisfying as an advocate for nonviolence, but it's also interesting as a gamer. There's a whole genre of shooters and strategy titles that are based around the ideas of third-generation warfare: get better equipment than the other guy, then go beat the crap out of him. I could be wrong, as I'm not an expert on the strategy/RTS genre, but I can't think of a single popular title that isn't firmly rooted in that idea (tower defense games might come the closest).
Not that I'm saying that shooters should necessarily follow up-to-date strategic doctrine. Or that they should be anything near realistic. I like a good me-against-the-world shooter as much as the next guy. But even if you don't believe that gaming can influence cognitive approaches--and I go back and forth on that point--the lack of progress does seem a shame, for two reasons. First, because 4GW is more interesting: it's about finding weak points and undermining legitimacy, the kind of min-max problem that munchkin-style gamers have salivated over for years. Robb says that knocking out 1% of high-load nodes could make up to 40% of our electrical grid go dark. Can you imagine the GameFAQs entry for that? Or the feeling of accomplishment when it's figured out?
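Out of curiosity, the kind of network fragility Robb describes is easy to sketch in code. Here's a toy graph--the node names and topology are invented for illustration and have nothing to do with any real grid--where knocking out a single high-load "hub" collapses the largest connected component:

```python
from collections import defaultdict, deque

# Toy "grid": two clusters joined by a single high-load hub node.
# Everything here is illustrative, not Robb's data.
edges = [
    ("a1", "a2"), ("a2", "a3"), ("a3", "a1"),   # cluster A
    ("b1", "b2"), ("b2", "b3"), ("b3", "b1"),   # cluster B
    ("a1", "hub"), ("b1", "hub"),               # hub bridges the clusters
]

def neighbors(edge_list):
    """Build an undirected adjacency map."""
    adj = defaultdict(set)
    for u, v in edge_list:
        adj[u].add(v)
        adj[v].add(u)
    return adj

def largest_component(edge_list, removed=()):
    """Size of the largest connected component after removing nodes (BFS)."""
    adj = neighbors(edge_list)
    nodes = set(adj) - set(removed)
    seen, best = set(), 0
    for start in nodes:
        if start in seen:
            continue
        queue, size = deque([start]), 0
        seen.add(start)
        while queue:
            n = queue.popleft()
            size += 1
            for m in adj[n]:
                if m in nodes and m not in seen:
                    seen.add(m)
                    queue.append(m)
        best = max(best, size)
    return best

print(largest_component(edges))                   # all 7 nodes connected
print(largest_component(edges, removed={"hub"}))  # drops to 3: lights out
```

The "gameplay" here would be the inverse problem: given a budget of node removals, which choice does the most damage--or, for the defender, where does redundancy buy the most resilience?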
Second, it's less violent (and more parallelizable). The violence thing is not just me being squeamish. I can't be the only person to have noticed that as consoles and PCs have gotten more powerful, one of the primary uses for that power is to enhance violence: zone-specific injury, ragdoll physics, more on-screen enemies, bloodsprays, etc. It's kind of morbid, frankly. Surely there's more challenge (and gameplay) in modeling the network of relationships between infrastructure and population--and it might be easier to scale that kind of modeling, in a world where concurrency is the new dominant programming paradigm. Easier on the art team, too.
Of course, if you do believe that games are educational experiences, perhaps this is not the education that we want: how to sabotage a developed society? Creepy. But then, if you believe that, you should already be worried about the lessons that third-generation wargames are teaching. The strategies of current military titles are largely generalizable only to other military applications, and they carry the implicit message that coordinated force is a valid solution in international conflict resolution. At the very least, games that address weaknesses in community resilience and redundancy can also be applied to sustainability and our economic situation (at the extremes, the green movement and the paranoid survivalists become strikingly similar), to name just two of the networks that increasingly define our world. More importantly, it's a view of the world that stresses interdependence and complexity over unilateral force. I can't help but see that as a (slight) improvement.
The fact that the Korg DS-10 exists in the first place is testament to something. I don't know what that something is, exactly. But while music software on game consoles is hardly new--LSDJ, the NES MIDI cart, C64 SIDs, and Mario Paint all spring to mind--the DS-10 is the first program that I'm aware of that A) has the stamp of an actual music technology company, B) is not wrapped in a game or "art project" of some kind, and C) requires no greymarket hardware or hacking to work. That makes it kind of special, to my mind.
The danger in evaluating these kinds of unexpected niche products is the sharp whiplash of expectations: it's too easy to get carried away by the novelty of it all--or conversely, to be upset that it isn't the second coming. The truth, as always, is in the middle there somewhere. Once you figure out what it's not trying to do, there's a lot to be excited about.
The most impressive part of the cart, by far, is the synthesizer package. Each sequencer part gets two monophonic synths modeled on the Korg MS-10, plus four drum voices (these are programmed the same way as the primary voices, but they get "frozen" into samples before playback). Both synths are virtual analog units with two oscillators (each with triangle, saw, square, and noise waveforms), an envelope generator, filter (with low/high/bandpass modes), and an impressive modulation patchbay with its own LFO. (If that's gibberish to you, the DS-10 homepage has an impressive set of synth tutorials that you can watch.) Each synth can be fully automated in the step sequencer, and you can play them live using either an onscreen keyboard or a Kaoss pad interface (for the world's cheapest Theremin).
It had been a while since I'd messed with an analog-style synth, and I'd forgotten how much fun it is. Everything reacts in real-time as you twist knobs and flip switches using the stylus, and the interface has a lot of well-considered design choices. A particularly nice touch is the modulation section, which lets you stretch little yellow cords between the various input/output jacks of the patch bay. I'm not really a discerning synth tone maven, but the sounds seemed perfectly workable to me. You're not going to fool anyone into thinking you've got a Moog in your pocket, but it's hardly an NES, either. If there's anything I wish they'd added, it would be the ability to play the synths using the hardware buttons, maybe with a Band Brothers-style control scheme. As it is, the L and R triggers swap between the top and bottom screens, the d-pad moves around the signal path, and the X button is a play-pause control. That seems kind of like a waste, particularly since the sequencing itself is limited in frustrating ways.
It's not the style of it that bothers me--I like both step sequencers and trackers--it's the way that it's structured. Here's how it works: at the top level, a Song is built out of 16 Parts (capitalization Korg's). Each Part is a setup containing the settings for all six synth voices, plus a sequence of up to 16 steps for each voice. You can copy Parts from one slot to another, and you can save and load your synth/drum voices between Parts. What you can't do is control any automation across Parts, or separate the instrument sequences from each other. If you want to combine the drum pattern from one Part with the synth pattern from another, you're going to have to copy one Part to a new slot, then manually recreate the pieces from the other--there's no mix-and-match ability here. In that light, those 16 Part slots start to look pretty thin, particularly if you want a melody or chord progression longer than 16 steps. Also, you'd better have your patches all set before you start sequencing--even if they're loaded from the same synth patch, changes in the voices of one Part don't apply to the others. Tweak that string sound in one, and you'll have to manually copy the change to every other Part.
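To make the structural complaint concrete, here's the Song/Part arrangement as I understand it from using the cart, sketched as data structures (the class and method names are mine, not Korg's). Because a Part owns both its patches and all its patterns, cloning a whole Part is the only copy operation available:

```python
from dataclasses import dataclass, field
from copy import deepcopy

@dataclass
class Part:
    """One Part bundles patch settings AND all six step sequences."""
    voices: dict = field(default_factory=dict)    # synth/drum patch settings
    patterns: dict = field(default_factory=dict)  # voice name -> up to 16 steps

@dataclass
class Song:
    """A Song is just 16 Part slots."""
    parts: list = field(default_factory=lambda: [Part() for _ in range(16)])

    def copy_part(self, src, dst):
        # The one move the UI gives you: clone an entire Part, patches,
        # drum pattern, synth lines and all. There is no copy_pattern().
        self.parts[dst] = deepcopy(self.parts[src])

song = Song()
song.parts[0].patterns["drums"] = [1, 0, 1, 0] * 4
song.copy_part(0, 1)
# To pair Part 0's drums with Part 3's synth line, you have to clone one
# Part wholesale and re-enter the other's sequence by hand.
```

A `copy_pattern(src_part, dst_part, voice)` operation would have cost Korg almost nothing and doubled the usefulness of those 16 slots; its absence is the whole complaint above in one missing method.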
But to hold this against the DS-10 is almost certainly a mistake. This isn't really a composition package like Reason. It's a groovebox, powered by a pretty decent virtual analog synth sim. And while you could probably write a song on it somehow, I wouldn't recommend it, any more than I would recommend trying to compose on an 808. What you could do, and easily, is use the DS-10 to fill out live instrumentation (see: the recent Yeah Yeah Yeahs performances), or as accompaniment while writing songs on a less-restricted instrument. In a niche like that, it performs admirably--in fact, it arguably punches far above its weight class as far as cost and ease-of-use. It's a genuine musical tool for less than $40, running on cheap, durable, battery-powered hardware. What's not to love about that?