Oh, man. Where to start with Chris Anderson and Michael Wolff's dreadful "The Web Is Dead"? With the hilarious self-congratulatory tone, which treats a misguided 1997 article on push technology (by the equally-clueless Kevin Kelly) as some kind of hidden triumph? With the gimmicky, print-mimicking two-column layout? How about the eye-burning, white-on-red text treatment? Or should we begin with the obvious carnival-barker pitch: the fact that Anderson, who just launched a Wired iPad application that mimics his print publication, and who (according to the NY Times and former employees) has a bit of an ongoing feud with Wired.com, really wants you to stop thinking of the browser as a destination?
Yes, Anderson has an agenda. That doesn't make him automatically wrong. But it's going to take a lot more than this weaksauce article to make him right. As I noted in my long, exhausting look at Anderson's Free, his MO is to make a bold, headline-grabbing statement, then backpedal from it almost immediately. He does not abandon that strategy here, as this section from the end of the piece shows:
...what is actually emerging is not quite the bleak future of the Internet that Zittrain envisioned. It is only the future of the commercial content side of the digital economy. Ecommerce continues to thrive on the Web, and no company is going to shut its Web site as an information resource. More important, the great virtue of today's Web is that so much of it is noncommercial. The wide-open Web of peer production, the so-called generative Web where everyone is free to create what they want, continues to thrive, driven by the nonmonetary incentives of expression, attention, reputation, and the like. But the notion of the Web as the ultimate marketplace for digital delivery is now in doubt.

Right: so the web's not actually dead. It's just that you can't directly make money off of it, except for all the people who do. Pause for a second, if you will, to enjoy the irony: the man who wrote an entire book about how the web's economies of "attention, reputation, and the like" would pay for an entire real-world economy of free products is now bemoaning the lack of a direct payment option for web content.
Wolff's half of the article (it's the part in the glaring red column), meanwhile, is a protracted slap-fight with a straw man: it turns out that the web didn't change everything, and people will use it to sell traditional media in new ways, like streaming music and movies! Wolff doesn't mention anyone who actually claimed that the web would have "transformative effects," or explain why streaming is not in and of itself fairly transformative, or say what those other transformative effects would be--probably because the hyperbole he's trying to counter was encouraged in no small part by (where else?) Wired magazine. It's a silly argument, and I don't see any reason to spend much time on it.
But let's take a moment to address Anderson's main point, such as it is: that the open web is being absorbed into a collection of "apps" and APIs which are, apparently, not open. This being Chris Anderson, he's rolled a lot of extraneous material into this argument (quality of service, voice over IP, an incredibly misleading graph of bandwidth usage, railroad monopolies), but it's padding at best (and disingenuous at worst: why, for example, are "e-mail" and VPNs grouped with closed, proprietary networks?). At the heart of his argument, however, is an artificial distinction between "the Web" and "the Internet."
At the application layer, the open Internet has always been a fiction. It was only because we confused the Web with the Net that we didn't see it. The rise of machine-to-machine communications - iPhone apps talking to Twitter APIs - is all about control. Every API comes with terms of service, and Twitter, Amazon.com, Google, or any other company can control the use as they will. We are choosing a new form of QoS: custom applications that just work, thanks to cached content and local code. Every time you pick an iPhone app instead of a Web site, you are voting with your finger: A better experience is worth paying for, either in cash or in implicit acceptance of a non-Web standard.

"We" confused the two? Who's this "we," Kemosabe? Anderson seems to think that the web never had Terms of Service, when they've been around on sites like Yahoo and Flickr for ages. He seems to think that the only APIs in existence are the commercial ones from Twitter or Amazon. And, strangest of all, he seems to be ignoring the foundation on which those APIs are built--the HTTP/JSON standards that came from (and continue to exist because of) the web browser. There's a reason, after all, that Twitter clients are built not only on the desktop, but also through web portals like Seesmic and Brizzly--because they all speak the language of the web. The resurgence of native applications is not the death of the web app: it's part of a re-balancing process, as we learn what works in a browser, and what doesn't.
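To make that argument concrete, consider what a "native" Twitter-style client actually does when it talks to a service. The sketch below is illustrative only--the endpoint and the types are invented, not any real company's API--but the shape of the exchange is the point: it's an HTTP request returning JSON, the same exchange a page of in-browser JavaScript would make.

```typescript
// Illustrative only: a made-up status service, not any real provider's API.
interface Status {
  id: string;
  user: string;
  text: string;
}

async function fetchTimeline(user: string): Promise<Status[]> {
  // Same verb, same headers, same JSON payload that in-browser code would use.
  const response = await fetch(
    `https://api.example.com/statuses?user=${encodeURIComponent(user)}`,
    { headers: { Accept: "application/json" } }
  );
  if (!response.ok) {
    throw new Error(`HTTP ${response.status}`);
  }
  return (await response.json()) as Status[];
}

// Whether this runs in a desktop client, a phone app, or a browser tab,
// the transport and the format are web standards all the way down.
fetchTimeline("example_user").then((statuses) =>
  statuses.forEach((s) => console.log(`${s.user}: ${s.text}`))
);
```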
Ultimately, Anderson doesn't present a clear picture of what he thinks the "web" is, or why it's different from the Internet. It's not user content, because he admits that blogging and Facebook are doing just fine. He presents little evidence that browser apps are dying, or that the HTTP-based APIs used by mobile apps are somehow incompatible with them. He ignores the fact that many of those mobile apps are actually based around standard, open web services. And he seems to have utterly dismissed the real revolution in mobile operating systems like iPhone and Android: the inclusion of a serious, desktop-class browser. Oh, right--the browser, that program that launches when you click on a link from your Twitter application, or from your Facebook feed, or via Google Reader. How can the web be dead when it's interconnected with everything?
You can watch Anderson try to dodge around this in his debate with Tim O'Reilly and John Battelle. "It's all Internet," O'Reilly rightly says. "Advertising business models have always only been a small part of the picture, and have always gotten way too much attention." Generously, O'Reilly doesn't take the obvious jab: that one of the loudest voices pitching advertising as an industry savior has been Chris Anderson himself. Apparently, it didn't work out so well.
Is the web really a separate layer from the way we use the Internet? Is it really dead? No, far from it: we have more power than ever to collect and leverage the resources that the web makes available to us, whether in a browser, on a server, or via a native client. The most interesting development of "Web 2.0" has been to duplicate at the machine level what people did at the social level with static sites, webrings, and blogs: learn to interoperate, interlink, and synthesize from each other. That's how you end up with modern web services that can combine Google Maps, Twitter, and personal data into useful mashups like Ushahidi, Seesmic, and any number of one-off journalism projects. No, the web's not dead. Sadly, we can't say the same about Chris Anderson's writing career.
At the DICE 2010 conference, a guy named Jesse Schell gave a speech about bringing reward systems from gaming (achievements, trophies, etc.) into real life as a motivational system. You've probably seen it--if you haven't, you can watch it and read designer David Sirlin's comments here.
Essentially, Schell lays out a future where there's a system that awards "points" for everyday tasks, ranging from the benign (brushing your teeth, using public transit) to the insidious (buying products, taking part in marketing schemes). Sometimes these points mean something (tax breaks, discounts), and sometimes they don't (see also: XBox GamerPoints). You can argue, as Jane McGonigal does, that this can be beneficial, especially if it leads to better personal motivational tools. I tend more towards the Sirlin viewpoint--that it's essentially a dystopia, especially once the Farmville folks latch onto it.
(The reason that I think it's inevitably dystopian, besides the obvious unease around the panopticon model, is that a reward system would inevitably be networked. And if it's networked and exploitable, you'll end up with griefers, of either the corporate spam variety or the regular 4chan kind. It's interesting, with Facebook grafting itself more and more onto the rest of the Internet, that social games have not already started using the tools of alternate reality gaming--ARGs--to pull people in anyway. Their ability to do so was probably delayed by the enormous outcry over debacles like Facebook's Beacon, but it's only a matter of time.)
That said, as an online journalist, I also found the idea a little intriguing (and I'm thinking about adding it to my own sites). Because here's the thing: news websites have a funding problem, and more specifically a motivation problem. As a largely ad-funded industry, we've resorted to all kinds of user-unfriendly strategies in order to increase imperfect (but ad network-endorsed) metrics like pageviews, including artificial pagination and interstitial ads. The common thread through all these measures is that they enforce certain behaviors (view n number of pages per visit, increase time on site by x seconds) via the publishing system, against the reader's will. It feels dishonest--from many points of view, it is dishonest. And journalism, as much or more than any other profession, can't survive the impression of dishonesty.
An achievement system, while obviously manipulative, is not dishonest. The rules for each achievement are laid out ahead of time--that's what makes them work--as are the rewards that accompany them. It doesn't have to be mandatory: I rarely complete all achievements for an XBox game, although I know people who do. More importantly, an achievement system is a way of suggesting cultural norms or desired behavior: the achievements for Mirror's Edge, for example, reward players for stringing together several of the game's movement combos. Half Life 2 encourages players to use the Gravity Gun and the environment in creative ways. You can beat either one without getting these achievements, but these rewards signal ways that the designers would like you to approach the journey.
And journalism--along with almost all Big Content providers--is struggling with the problems of establishing cultural norms. This includes the debate over allowing comments (with some papers attempting paid, non-anonymous comment sections in order to clean things up), user-generated content (CNN's iReport, various search-driven reporting schemes), and at heart, the position and perception of a newspaper in its community, whatever that might be. It's not your local paper anymore, necessarily. So what is it to you? Why do you care? Why come back?
Achievements might kill multiple birds with one stone. They provide a way to moderate users (similar to Slashdot's karma) and segregate access based on demonstrated good behavior. They create a relationship between readers and their reading material. They link well with social networks like Facebook and Twitter. And most importantly, they give people a reason to spend time on the site--one that's transparently artificial, a little goofy, and can be aligned with the editorial vision of the organization (and not just with the will of advertisers). You'd have several categories of achievements, each intended to drive a particular aspect of site use: social networking, content consumption, community engagement, and random amusements.
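If that sounds abstract, here's a minimal sketch of what such a system might look like under the hood. Everything in it is hypothetical--the category names, the fields, the thresholds--but it shows the part that matters: the rules are explicit, inspectable, and tied to editorial goals rather than buried in the publishing system.

```typescript
// Hypothetical sketch of a news-site achievement system; names and rules are invented.
type Category = "social" | "consumption" | "community" | "amusement";

interface ReaderActivity {
  articlesRead: number;
  commentsPosted: number;
  abuseFlags: number;
  sharesToSocial: number;
}

interface Achievement {
  id: string;
  category: Category;
  description: string; // shown to the reader up front--that's the transparency
  unlocked: (a: ReaderActivity) => boolean;
}

const achievements: Achievement[] = [
  {
    id: "beat-reporter",
    category: "consumption",
    description: "Read 50 articles",
    unlocked: (a) => a.articlesRead >= 50,
  },
  {
    id: "good-neighbor",
    category: "community",
    description: "Post 10 comments without drawing an abuse flag",
    unlocked: (a) => a.commentsPosted >= 10 && a.abuseFlags === 0,
  },
  {
    id: "town-crier",
    category: "social",
    description: "Share 5 stories to a social network",
    unlocked: (a) => a.sharesToSocial >= 5,
  },
];

// Run after any reader action; unlike pageview tricks, the criteria are public.
function checkUnlocks(activity: ReaderActivity): Achievement[] {
  return achievements.filter((ach) => ach.unlocked(activity));
}
```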
Here's a shallow sampling of possible News Achievements I could see (try to imagine the unlocked blip before each one):
Is this a little ridiculous? Sure. But is it better than a lot of our existing strategies for adapting journalism to the online world? I think it might be. Despite the changes in the news landscape, we still tend to think of our audiences as passive eyeballs that we need to forcibly direct. The most effective Internet media sites, I suspect, will be the ones that treat their audiences as willing participants. And while not everyone has noble intentions, the news is not the worst place to start leveraging the psychological lessons of gaming.
The Internet has many virtues (and no small number of vices), but its most surprising effect has been the way it has made research both easy and addictive. While you have to be critical of what you read, of course, at no other time in our history has it been easier to scarf down information like a big bowl of knowledge-flavored ramen.
But this is mostly useful for certain types of knowledge--intellectual, abstract data. For example, when I was in high school I decided to learn how to play the harmonica, which is not a skillset that you can really pick up from written description (although I certainly spent enough time on the HARP-L list, just in case). Likewise, I may have mentioned my recent interest in breakdancing--you can watch a lot of videos and read a lot of forum posts, but I think that's a relatively ineffective way to learn. I don't mean to say that online communities for these activities are useless, because they have value in other ways. But for concrete tasks, you can't beat physical instruction.
So anyway, I'm kind of intrigued by Kinect (and, to a lesser extent, the Playstation Move/Eye or the Wii remote/balance board combinations). We have been working for a while now toward a world where we can query the Internet's store of information based on a macro-level location in space and time, via smartphones. Inventions like Google's local search, and to a lesser degree Foursquare or Yelp, add geographic location to human input. Kinect and its brethren, on the other hand, are attempts to turn the perspective around: interaction based on the topology of the user's body itself.
These early attempts are primitive. They'll be used in crude ways, for gaming and parlor tricks, and they'll have limitations like Kinect's inability to handle prone positions and relatively low resolution. But think of the potential here--one that's only hinted at in Harmonix's Dance Central. Among other things, real motion interfaces are a first step toward extending the tremendous communication and educational value of the Internet out into the realm of physical movement. Imagine an educational program for athletic skills that could see your movements, compare them to a model, and tell you how to correct them--or a video chat session with a teacher who could walk "around" to critique your technique in 3D space. Even if it were non-interactive, this could have real advantages--I'd love to have a clean motion-capture of Vic Wooten's slap bass technique to study in slow motion. And surely there are commercial applications, like virtual dressing rooms or telepresence tourism.
Thanks to some literal handwaving, the vision of motion control since Minority Report has been to provide a fancy, grand gestural control mechanism for data manipulation--because there's a problem we've all had, right? In much the same way, the current focus on camera-view augmented reality ignores its real, current applications in relatively dull location-sensitive mapping, probably because most critics are more interested in the human-machine interface than the way these new technologies shape our culture. But surely we should have learned by now: in the age of networked communication, it's the mundane social uses--chatting, teaching, and sharing--where innovation will get really interesting.
Part of a series looking back on my first year of breakdancing.
I competed in forensics--college-level public speaking competitions--for two years. It was a tremendous influence on my life. I learned a great deal about writing, about working with other people, and about confidence. But one of the best lessons I learned from it came from failure at a national tournament.
At GMU, where I competed, the team was fiercely competitive--to an unhealthy degree, in my opinion. Reach the national finals, and you got your name on the team room wall, which was basically the highest honor they offered. I was eliminated at the semi-finals my second year at nationals, but I went to watch the final rounds, and I was struck by something: yes, most of the finalists were better than I was, but the gap was not tremendous. To reach their level, I probably would have to spend another two years polishing my skills and adapting to some of the community's odd, inbred speaking tics--two years spent on diminishing returns, at the expense of pretty much everything else in my life.
What, I asked myself, was I really hoping to accomplish? Was I here for a handful of plastic stick-on letters in a GMU office building, or did I want to learn about rhetoric? There were other reasons that I left the team--a bad relationship with a teammate, wanting to branch out into other parts of the college experience, the desire to sleep in past 5 AM on the weekends--but that moment was key. That was when I realized that you can choose what to get out of an experience, and that those lessons could be very different from the intended deliverables.
I mention this, not just because I'm a former speech geek who sees most experiences through the lens of three-point structure, but because that realization has been a big part of my perspective on breaking. At 27, I was older than most beginners, and there are portions of the dance's daredevil side that I'll probably never master. As such, I'll almost certainly never win a battle. But that doesn't mean someone like me can't learn a lot from b-boying, even while acknowledging that the dance's competitive spirit is a driving force behind its development. So when making a list of what I've learned, I want to avoid simply listing off a set of moves and freezes, or complaining about all the things I'm still very bad at, and discuss the less obvious, personal lessons instead.
But you know what? It turns out that this doesn't have to be the case. There's no reason that your self-image has to be set in stone, or that you can't go out and meet new people and do new things. Perhaps this is obvious, but to me it was refreshing. As someone who has always hated the idea of "finding one's self" anyway, I've loved how b-boying has grabbed my life by the corners and shaken it a little.
Life's too short to take myself seriously. Besides, I'm a gamer as well as a musician. I regularly position myself in front of a television and A) twitch a lot, B) play pretend instruments with other people, C) stomp on a mat in time with music, or D) all of the above. Looking stupid can be a lot of fun. Worrying about embarrassment, not so much.
This is no small amount of what good teachers pass on to their students, whether in higher education or at a studio. It's what autodidacts often lack: the ability to prioritize and discriminate how they learn, and to acquire knowledge systematically (the same is true of conspiracy theorists, not coincidentally). The self-taught learners I know tend to suffer from this: they spend a lot of time running down dark alleys and backtracking, because they never learned how to learn. Every time I go back to the classroom, I learn a little bit about the metacognitive process, and breaking has been no different.
But maybe most of all, b-boying has reminded me that there are no shortcuts to self-improvement. When I find myself faced with a new task, I'm always tempted to look for a trick, some quick fix that'll let me master it. I think that mentality served me well early in life, but it became a bad habit. There are good and bad methods for learning, but the real improvement in my dancing (and elsewhere) has come when I stopped spending my time looking for shortcuts, and took the hard way instead. I have a lot of work left to do. I'd better get back to practicing.
Part of a series looking back on my first year of breakdancing.
Learning the techniques of breaking, whether in a classroom or an informal group, is only half of b-boying. The other half is the cypher--the group circle where breakers dance, at a jam or a battle. That's where the competitive aspect of the dance and large portions of the surrounding culture are realized. As local MC Gorilla Will is fond of saying, you're not really a b-boy or b-girl if you don't cypher. This is also the reason that my friends and coworkers rarely see me dance: b-boys and b-girls don't exactly have recitals. But for a newcomer, finding events to attend in the first place has not always been as easy as you'd think.
Spend enough time on the Internet, and you naturally begin to expect that any offline hobby community--bassists, knitters, fitness instructors, etc.--will have a corresponding centralized online presence that you can tap into, especially given the prevalence of free tools like maps, calendars, and forums. This doesn't seem to be the case for b-boying. With the caveat that I may be missing something entirely, as far as I can tell the breaking community communicates sort of under-the-radar. Events are publicized through word-of-mouth, through social networks like Facebook, and via flyers at other jams. If you're not already networked with other dancers, in other words, it may be hard to break in. As much as anything else, I think, that's the value of local classes: they give newbies a start on building the necessary connections. On one level, this obscurity is intensely frustrating, but it's also got an allure to it. It's a friendly, open underground, but an underground nonetheless.
But let's say you've made it, finally, to a typical DC-area battle event. If it's indoors, you're probably looking at a large, single room of some kind--a gymnasium, a church, or a community center. The DJ is down at one end, with an MC nearby calling out instructions and organizing the battles, which take place in a large circle close to the DJ stand. The battles are usually organized in a loose tournament structure, with prelim rounds followed by a single-elimination tree. An event can take a long time--eight hours, for some events I've attended, especially if there are lots of entrants in large team battles. In between competition rounds, there are usually periods of freestyle dancing, with circles forming up spontaneously around the room.
(It would be easy to read meaning into the many symbolic circles available at a jam: the cyphers on the dance floor, the vinyl records spun against each other to create loops of musical time, and the fluid rotation of footwork and power moves. Sometimes, as part of the dance's rich mythology, these relationships are made explicit. For example, check out this group routine by Ichigeki at the 2005 Battle of the Year competition, which combines all those circles into a single, show-stopping performance.)
Breaking is incredibly competitive, so it's funny to watch the interactions between crews during a battle. They'll toss out rude gestures, taunt the opposing dancers, and generally project an air of (over)confidence. Dancers are judged, in part, on how much spirit they bring to the battle, and how expressive their presentation is. The "character" of a b-boy or b-girl isn't always in-your-face--some of my favorites, like Toyz, may spend pretty much the entire battle just goofing around--but aggression is definitely the dominant mode. And yet at the end of a round, with some exceptions, everyone shakes hands or exchanges embraces. The burns are just for show.
In much the same way, I'm always amused by the contrast between the visual appearance of a jam and its sonic character. As a gathering of (mostly) minority youth wearing baggy clothes and making rude gestures, it's a cultural conservative's worst nightmare. And yet the patron saint of breaking is none other than American icon James Brown, and its musical touchstones are old-school funk, soul, and rock tracks like Babe Ruth's The Mexican or the Jimmy Castor Bunch's It's Just Begun--the kinds of records that DJ Kool Herc spun in the '70s. I don't think it's a coincidence that many b-boys and b-girls, especially the older dancers, regard themselves as partial guardians of "real" hip-hop, dating back to the days when it first emerged from Bronx street parties. The idea that breaking is a key element of an empowering urban movement still rings true in the cypher.
In his scholarly study of the dance, Foundation: B-Boys, B-Girls, and Hip-Hop Culture in New York, Joseph Schloss writes:
A cypher can be "built" virtually anywhere at any time: all that is required is a group of dancers. It does not require a stage, an audience, a roof, a dance floor, or even a designated block of time. The cypher's very informality and transience are part of its power; it appears when and where it is needed, then melts away. Rhetorically, it is often referred to as "the" cypher, rather than "a" cypher, which suggests that all cyphers are, in some abstract way, connected. B-boys and b-girls view the cypher with an almost mystical reverence, befitting its status as the most authentic, challenging, and raw environment for b-boying.
There's a lot to unpack in Schloss's chapter on the relationship of breaking to its physical location, but I like this passage in particular. Even with my limited experience, it captures the way that the cypher is not just a place where b-boying takes place, but an integral part of the dance's identity: you can't have real breaking without jams to break at, and you can't be a b-boy or b-girl without cyphering. The cypher is a microcosm of both the dance itself and the social movement it represents. Like b-boying, it creates a dialog of both competition and collaboration. And like hip-hop, it's a way for practitioners to impose a new interpretation onto their surroundings--to remix the environment, effectively, into a space of their own.
First of a series looking back on my first year of breakdancing.
People often ask me why I started breakdancing. "Spite," I usually reply, because if there's one thing I've learned in life, it's how to set up an attention-getting device.
In early 2009, a friend of mine in the non-profit sector invited me to a book club run by a group of D-list conservative pundits and professional think-tank employees (average age: 65 million years. Like the dinosaurs. They were old, is what I'm saying here). It was exactly as awful as it sounds. On the other hand, they served free pizza and it gave me stories to tell. Still, by the last meeting, I was fed up with the discussions, with the topic ("civic religion," which made me feel like Groucho Marx: whatever it is, I'm against it), and with the majority of the participants. So before it wrapped up, I decided to pick a fight.
For the last class, in addition to discussing an Ursula Le Guin short story, the organizer told us that we'd round out the experience by singing patriotic music as a group. When my turn came, I said that I hadn't prepared any particular songs--that, in fact, I found most patriotic music to be saccharine and hokey. Instead, I noted that when I thought of music that was uniquely American, what came to mind were jazz and hip-hop: they're both musical forms birthed here (instead of derived from another country's folk music), they both emphasize individual expression within a collaborative structure, and most importantly, they define value in terms of improvisation and invention. All of which struck me as a pretty good description of the American national character, for better or worse.
From the room's dead-eyed stare, followed by its loud denunciation of my ideas, my parentage, and possibly my genetic material (for those members of the room that believed in that new-fangled "DNA" invention), you'd have thought I'd suggested replacing the national anthem with "Big Pimpin'." The rest of the meeting was pretty much derailed: petty revenge achieved! But the irony of it was that while I had argued sincerely, I wasn't really a jazz or hip-hop fan. I generally disliked the former, and never really listened to the latter. After I left the group, that kept bothering me. If I was serious about my argument, I thought, I really ought to put my money where my mouth was and do something about my near-total ignorance of hip-hop. A little bit later, I signed up for my first dance class at Joy of Motion in Bethesda.
I like telling this story for a couple of reasons. One is that I think it's genuinely amusing, and explains how a sedentary rock-and-roll type (read: suburban white boy) like me ended up dancing to hip-hop. But another is that it reminds me that there's no such thing as a bad motivation. I started b-boying because I needed to get more exercise, because I wanted to meet new people, and because it was part of a cultural tradition I wanted to learn more about. But yes, it was also partly out of spite. And that's okay.
Now granted, there are an awful lot of people out there who fuel their worldly interactions with spite, to no positive effect. You know these people: they're the ones who don't understand why certain words are off limits to their particular demographic, or who get upset when they need to press a button to continue an automated phone call in English. I've never really understood that, just as I don't understand people who, when they accidentally step on someone's feelings in a conversation, can't simply apologize and move on (seriously guys, it doesn't cost you anything to say you're sorry even if you're really not). Nobody would say that those are healthy expressions of conflict. Is it possible I've learned the wrong lesson?
The difference, I hope, between those cases and my own comes from the target for that anger. Striking out at other people from spite? Not productive, not cool--and yet, something that many people (including myself) do all too often. What I aimed to do instead was to direct my energies toward myself, using them to kick-start my self-improvement. The resulting experiences with breakdancing have been almost entirely positive: I'm in better shape, I've made new friends and discovered new music, and it's a great conversation starter. I've got lots of reasons that I'm going to stick with it. And yet, none of this would have happened in the first place if I hadn't gotten annoyed at a group of cranky old hip-hop haters. It's like the old saying: living well (or dancing badly) is the best revenge.
I woke up this morning to hear that Robert Byrd had died, and that I needed to fix the timeline we had made of his life. As I logged into the VPN, the screen went oddly pink and blue, like Doom's old Hall of Mirrors effect if you cheated your way out of a valid sector. Then it went black, and then it refused to boot to anything but an external monitor in Psychedelic Snow VGA Mode. The video card, it would seem, is fried. Luckily, I'm just barely within the three-year extended service plan (good until August!), so Lenovo is sending a box and will fix it for no extra cost. But I'll be without hardware for probably three to five days.
I've been on a vaguely weekly schedule here for a while now, so I figure going quiet for another week won't shock anyone too much. I had planned, starting today or tomorrow, to write my one-year look back at b-boying, but it looks like that will have to wait. I'll also have to hold off on playing through more of Planescape: Torment, which is too bad since it was just starting to get pretty good, and I'm looking forward to rambling a little on its relationship with death and meta-gaming when I get a chance. And finally, I hope I've gotten the database bugs straightened out on NPR's client now, because it'll be a lot harder to debug and fix them on my lunch hour at work. Still, it's not all bad: forced breaks like this are no doubt good for my tendonitis, and maybe it'll give me some extra incentive to drill footwork for Crafty Bastards if the heat lets up. Here's to a productive week, and a speedy return of my soon-to-be-repaired Thinkpad.
I'd like to take this opportunity to apologize to the users of NPR's Android application, whose playlists crashed after last week's update. That was my fault--I wrote a 2 where there should have been a 3, or maybe a < where there should have been a <=. Either way, I'm sorry I broke your application, and a fix is on the way.
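(For the non-programmers wondering how one character can matter that much, here's a generic sketch--invented for illustration, not the actual app code--of how a boundary check that stops one item short, or runs one item long, turns a working playlist into a crash.)

```typescript
// Generic illustration of an off-by-one bounds check; not the actual NPR app code.
// A playlist with three tracks has valid indices 0, 1, and 2.
const playlist = ["Story one", "Story two", "Story three"];

function nextTrack(current: number): string {
  const next = current + 1;
  // The right comparison depends on what the bound means: against the last
  // index you want <=, against the length you want <. Pick the wrong one (or
  // write a 2 where a 3 belongs) and you either skip the last track or read
  // past the end, get undefined, and crash whatever expected a string.
  const lastIndex = playlist.length - 1;
  if (next <= lastIndex) {
    return playlist[next];
  }
  return playlist[0]; // wrap back to the start instead of running off the end
}

console.log(nextTrack(1)); // "Story three"
console.log(nextTrack(2)); // wraps to "Story one"
```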
I'd also like to apologize to baby freezes. Lately I have been leaving them out of my breaking runs, and if they had feelings, I bet they'd be hurt. But I can explain! See, if you mess up a shoulder freeze (the only other footwork freeze I know), it's a big, dramatic mistake. It looks difficult--you're balancing upside-down on your shoulder! In theory, you're not supposed to get credit for tough moves you don't land, but I find that people (particularly non-dancers) can respect them. Whereas, if you mess up a baby freeze, it just looks like you curled up in a ball and fell over. From a risk management perspective, it's a no-brainer. Sorry, baby freezes.
While I'm at it, I'd like to apologize to Stieg Larsson, whose book "The Girl Who Kicked the Hornets' Nest" I was unable to finish because there's only so much Swedish hospital intrigue a man can take. Also, there are about 700 billion characters, and they all have ridiculous Swedish names like Torsten Edklinth and Gunnar Bjork (only with an umlaut, a diacritical mark that I find personally offensive and for which I will not bother to look up the HTML entity, no matter what the New Yorker says). Unfortunately, Mr. Larsson is deceased, and cannot accept my apologies, but that's never stopped me before.
Finally, I'd like to apologize to the general public for the Twilight series, both the books and the movies. I'm not responsible for them in any way, of course. But someone needs to apologize, and nobody actually involved with the production of these glitter-drenched grotesques seems likely to do so. It might as well be me.
It's been a little over six months since the last time I looked over my Kindle reading list. During that time, Amazon and the publishing industry got into an enormous brawl, books were pulled, books were restored at higher prices, and as a result my reading habits may have slowed a little. I've glanced from time to time at other reading hardware, and I've used my phone to run through a few titles from Feedbooks, but the e-ink and the selection on the Kindle are still a powerful combination. It's still, for now, my favorite way to read.
So here are the highlights:
Joe Abercrombie gets shelved under "fantasy," but it's hard to imagine anything less like the pastel-colored glow of the typical genre entry. His influences are more in line with Fritz Leiber and Steven Brust, possibly crossed with Terry Pratchett's gift for writing characters who are both sympathetic and completely oblivious. I started with Best Served Cold, a Seven Samurai-like revenge plot that spirals unpredictably into darker territory with every step, and later worked my way through the First Law trilogy, which is somewhat more epic. These are not cheerful books--their main characters include a berserker, a torturer, and a woman who swears vengeance after being thrown off a mountain--but they've got depth and humor, characters who can (and often do) choose badly with realistic consequences, and not an elf in sight. It's a refreshing combination.
At the other end of the meta-genre viciousness spectrum is Lev Grossman's The Magicians, a thinly-veiled critique of both Harry Potter and the Chronicles of Narnia. Grossman's protagonist, Quentin Coldwater, heads off to a secret magical academy, spurred by his love for a Narnia-like children's series named "Fillory and Further." Yet the magic turns out to be decidedly un-magical, graduation leaves him mired in ennui and boredom, and Quentin himself is not particularly talented or admirable. In many ways, it's a book about how badly the unexamined expectations of magical thinking have primed Grossman's characters for adult life, and the difficulty of learning to accept a difficult and ambiguous reality. And yet, while I appreciated the book's psychological perspective, something about it still rubbed me the wrong way--which is probably the point, honestly.
Ian McDonald's River of Gods came highly recommended, and it's easy to see why: set in a near-future India where the new stars of Bollywood are entirely virtual and AI is illegal, it's a complicated mess of intertwining plotlines strongly reminiscent of early William Gibson. And if it's not completely coherent, or if it telegraphs its surprises a bit early, it moves with enough constant momentum that neither flaw is completely jarring. I like McDonald's globalized perspective, too--it's nice to read a sci-fi book where the protagonists aren't all white people from LA--and if I didn't rush out and download the rest of his catalog, I've certainly flagged it as promising.
I read Everyman, by Philip Roth, for the PEN/Faulkner book series this year (it was an award winner in 2006, I believe). I'd be very curious to know what other books were up for the award that year, because this one is awful. It's as if someone decided to write a terrible parody of a Philip Roth novel--in which a vain, sexually-obsessed, self-hating Jew obsesses over an endless list of sicknesses, both real and imagined--and then, to add insult to injury, got Roth himself to write it.
The problem with describing The Coyote Kings of the Space Age Bachelor Pad, by Minister Faust, is that it invariably sounds a lot more fun than it actually is. I mean, this is a book about a part-psychic graduate student and dishwasher who's swept up in an intergalactic drug operation with his mad scientist roommate, and in which each character gets introduced via a D&D-style character sheet. Shades of Buckaroo Banzai, it's certainly got style to spare, but some of the stylistic tics edge toward reader-hostile mania: several chapters (each of which is in first-person dialect) are nigh-unreadable, the plot is unclear, and parts of it meander interminably in between enormous dumps of exposition. You could charitably call it "uneven," but I have to say it didn't leave me feeling particularly charitable. Ian Tregillis's Bitter Seeds has a similar problem: psychic Nazi experiments vs. British occult blood magic? Sounds awesome, almost completely fails to deliver.
Horns is a kind of surreal detective novel, I guess. It's about a man who wakes up one day with devil horns growing out of his head, and anyone who sees them starts telling him their deepest secrets, a kind of ambiguous "gift" that he tries to use to uncover the truth behind his ex-girlfriend's murder. Author Joe Hill gradually lets the horns expose all kinds of queasy awfulness in the ways that people hide their real feelings from each other--and from themselves--in a small town. But does it work as a story? I'm not sure. At some point, earlier than expected, the murder gets resolved, and it becomes more of a slowly-paced thriller. Still, Hill wraps things up nicely without sugar-coating his characters, and if the horns aren't ever exactly explained... well, maybe we shouldn't want the secrets behind everything after all.
On the non-fiction front, Sarah Ellison's The War at the Wall Street Journal has garnered rave reviews from Slate and the Columbia Journalism Review, so my expectations may have been too high going in. I expected more details of how News Corp's acquisition has changed one of the country's most prestigious papers for the worse. And I got some of that, eventually, after endless chapters about internal politics among the Journal's former owners, the Bancroft family. It takes two-thirds of the book to get to any details of the paper's changing newsroom, and then it proves disappointingly light on dirt (or, for that matter, outrage). This is, in other words, pretty much the book you'd expect from a former WSJ business reporter on the acquisition--but I don't think I'll be alone in saying I hoped for more.
Finally, N.K. Jemisin's The Hundred Thousand Kingdoms: the child of an estranged royal heir is called back to the capital, where the tyrannical rulers of, yes, a hundred thousand kingdoms hold onto power by keeping their ex-gods as slaves. In its focus on politics and control, not to mention the shackled djinn-like servants, Jemisin's debut reminds me of Daniel Abraham's "Long Price" books in the best possible way. It's also got a lovely use of narrative voice from an African-American author who doesn't shy away from racial diversity in her worldbuilding. Perhaps the ending is a bit deus ex machina, but I think it's earned. My understanding is that there's a follow-up on the way, and I'm eager to see where Jemisin will try to go from here.