
November 13, 2012

Filed under: journalism»new_media»data_driven

Nate Silver: Not a Witch

In retrospect, Joe Scarborough must be pretty thrilled he never took Nate Silver's $1,000 bet on the outcome of the election. Silver's statistical model went 50 for 50 on the states, and came close to the precise number of electoral votes, even as Scarborough insisted that the presidential campaign was a tossup. In doing so, Silver became an inadvertent hero to people who (unlike Joe Scarborough) are not bad at math, inspiring a New Yorker humor article and a Twitter joke tag ("#drunknatesilver", who only attends the 50% of weddings that don't end in divorce).

There are two interesting things about this. The first is the somewhat amusing fact that Silver's statistical model, strictly speaking, isn't actually that sophisticated. That's not to take anything away from the hard work and mathematical skill it took to create the model, or (probably more importantly) Silver's ability to write clearly and intelligently about it. I couldn't do it myself. But when it comes down to it, FiveThirtyEight's methodology is just to track state polls, weigh them against past results, and organize the output (you can find a detailed--and quite readable--explanation of the entire methodology here). If nobody did this before, it wasn't because the idea was an unthinkable revolution or the product of novel information technology. It was because nobody could be bothered to figure out how.

The second interesting thing about Silver's predictions is how furiously the pundits railed against them. Scarborough was the most visible, but Politico's Dylan Byers took a few potshots himself, calling Silver a possible "one-term celebrity." You can almost smell the sour grapes rising from Byers' piece, which presents Silver's math on one side and David Brooks on the other. It says a lot about Byers that he quoted Brooks, the rodent-like New York Times columnist best known for a series of empty-headed books about "the American character," instead of contacting a single statistician for comment.

Why was Politico so keen on pulling down Silver's model? Andrew Beaujon at Poynter wrote that the difference was in journalism's distaste for the unknown--that reporters hate writing about things they can't know. There's an element of truth to that sentiment, but in this case I suspect it's exactly wrong: Politico attacked because its business model is based entirely on the cultivation of uncertainty. A world where authority derives from more than the loudest megaphone is a bad world for their business model.

Let's review, just for a second, how Politico (and a whole host of online, right-leaning opinion journals that followed in its wake) actually work. The oft-repeated motto, coming from Gabriel Sherman's 2009 profile, is "win the morning"--meaning, Politico wants to break controversial stories early in order to work its brand into the cable and blog chatter for the rest of the day. Everything else--accuracy, depth, other journalistic virtues--comes second to speed and infectiousness.

To that end, a lot of people cite Mike Allen's Playbook, a gossipy e-mail compendium of aggregated fluff and nonsense, as the exemplar of the Politico model. Every morning and throughout the day, the paper unleashes a steady stream of short, insider-y stories. It's a rumor mill, in other words, one that's interested in politics over policy--but most of all, it's interested in Politico. Because if these stories get people talking, Politico gets mentioned, and that increases the brand's value to advertisers and sources.

(There is, by the way, no small amount of irony in the news industry's complaints about "aggregators" online, given the long presence of newsletters like Playbook around DC. Everyone has one of these mobile-friendly link factories, and has for years. CQ's is Behind the Lines, and when I first started there it was sent to editors as a monstrous Word document, filled with blue-underlined hyperlink text, early every morning for rebroadcast. Remember this the next time some publisher starts complaining about Gawker "stealing" their stories.)

Politico's motivations are blatant, but they're not substantially different from those of any number of talking heads on cable news, which has a 24-hour news hole to fill. Just as the paper wants people talking about Politico to keep revenue flowing, pundits want to be branded as commentators on every topic under the sun so they can stay in the public eye as much as possible. In a sane universe, David Brooks wouldn't be trusted to run a frozen yoghurt stand, because he knows nothing about anything. Expertise--the idea that speaking knowledgeably requires study, sometimes in non-trivial amounts--is a threat to this entire industry (probably not a serious threat, but then they're not known for underreaction).

Election journalism has been a godsend to punditry precisely because it is so chaotic: who can say what will happen, unless you are a Very Important Person with a Trusted Name and a whole host of connections? Accountability has not traditionally been a concern, and because elections hinge on any number of complicated policy questions, nothing is out of bounds for the political pundit. No matter how many times William Kristol and Megan McArdle are wrong on a wide range of important issues, they will never be fired (let's not even start on poor Tom Friedman, a man whose career consists of endlessly sorting the wheat from the chaff and then throwing away the wheat). But FiveThirtyEight undermines that entire arrangement by saying that there is a level of rigor to politics, that you can be wrong, and that accountability is important.

The optimistic take on this disruption is, as Nieman Journalism Lab's Jonathan Stray argues, that specialist experts will become more common in journalism, including in horse race election coverage. I'm not optimistic, personally, because I think the current state of political commentary owes as much to industry nepotism as it does to public opinion, and because I think political data is prone to intentional obfuscation. But it's a nice thought.

The real positive takeaway, I think, is that Brooks, Byers, Scarborough, and other people of little substance took such a strong public stance against Silver. By all means, let's have an open conversation about who was wrong in predicting this election--and whose track record is better. Let's talk about how often Silver is right, and how often that compares to everyone calling him (as Brooks did) "a wizard" whose predictions were "not possible." Let's talk about accountability, and expertise, and whether we should expect better. I suspect Silver's happy to have that talk. Are his accusers?

November 1, 2012

Filed under: tech»web

Node Win

As I've been teaching Advanced Web Development at SCCC this quarter, my role is often to be the person dropping in with little hints of workflow technique that the students will find helpful (if not essential) when they get out into real development positions. "You could use LESS to make your CSS simpler," I say, with the zeal of an infomercial pitchman. Or: "it will be a lot easier for your team to collaborate if you're working off the same Git repo."

I'm teaching at a community college, so most of my students are not wealthy, and they're not using expensive computers to do their work. I see a lot of cheap, flimsy-looking laptops. Almost everyone's on Windows, because that's what cheap computers run when you buy them from Best Buy. My suggestion that a Linux VM would be a handy thing to have is usually met with puzzled disbelief.

This makes my students different from the sleek, high-profile web developers doing a lot of open-source work. The difference is both cultural (they're being taught PHP and ASP.net, which are deeply unsexy) and technological. If you've been to a meetup or a conference lately, you've probably noticed that everyone's sporting almost exactly the same setup: as far as the wider front-end web community is concerned, if you're not carrying a newish MacBook or a Thinkpad (running Ubuntu, no doubt), you might as well not exist.

You can see some of this in Rebecca Murphey's otherwise excellent post, A Baseline for Front End Developers, which lists a ton of great resources and then sadly notes:

If you're on Windows, I don't begin to know how to help you, aside from suggesting Cygwin. Right or wrong, participating in the open-source front-end developer community is materially more difficult on a Windows machine. On the bright side, MacBook Airs are cheap, powerful, and ridiculously portable, and there's always Ubuntu or another *nix.

Murphey isn't trying to be mean (I think it's remarkable that she even thought about Windows when assembling her list--a lot of people wouldn't), but for my students a MacBook Air probably isn't cheap, no matter what its price-to-performance ratio might be. It could be twice, or even three times, the cost of their current laptop (assuming they have one--I have some students who don't even have computers, believe it or not). And while it's not actually that hard to set up many of the basic workflow tools on Windows (MinGW is a lifesaver), or to set up a Linux VM, it's clearly not considered important by a lot of open source coders--Murphey doesn't even know how to start!

This is why I'm thrilled about Node.js, which added a Windows version about a year ago. Increasingly, the kinds of tools that make web development qualitatively more pleasant--LESS, RequireJS, Grunt, Yeoman, Mocha, etc.--are written in pure JavaScript using Node. If you bring that to Windows, you also bring a huge amount of tooling to people you weren't able to reach before. Now those people are not only better developers, but they're potential contributors (which, in open source, is basically the difference between a live project and a dead one). Between Node.js, and Github creating a user-friendly Git client for the platform, it's a lot easier for students with lower incomes to keep up with the state of the art.
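To make that concrete, here's a rough sketch of a Grunt configuration for compiling LESS to CSS. The file names are made up, and the exact grunt-contrib-less options may differ from what I've written here--check the plugin docs--but the point stands: it's plain JavaScript, so it runs the same under Windows as anywhere else Node does.

    module.exports = function(grunt) {
      grunt.initConfig({
        less: {
          // compile less/site.less into css/site.css
          dev: {
            files: { "css/site.css": "less/site.less" }
          }
        }
      });
      // load the LESS plugin and run it by default
      grunt.loadNpmTasks("grunt-contrib-less");
      grunt.registerTask("default", ["less"]);
    };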

I'm not wild about the stereotype that "front-end" means a Mac and a funny haircut, personally. It bothers me that, as a web developer, I'm "supposed" to be using one platform or another--isn't the best thing about rich internet applications the fact that we don't have to take sides? Isn't a diverse web community stronger? I think we have a responsibility to increase access to technology and to the Internet, not focus our efforts solely on a privileged few.

We should be worried when any monoculture (technological or otherwise) takes over an industry, and exclusive tools or customs can serve as warning signs. So even though I don't love Node's API, I love that it's a web language being used to build web tools. It means that JavaScript is our bedrock, as Alex Russell once noted. That's what we build on. If being a well-prepared front-end developer becomes A) more cross-platform and B) more consistent from top to bottom, then my students aren't left out, no matter what their background. And that makes me increasingly happy.

October 24, 2012

Filed under: fiction»reviews»banks_iain

The Hydrogen Sonata

I believe there are two kinds of Iain Banks readers: those who are in it for the plot, and those who are looking for spectacle. Banks does both tremendously well, but hardly ever in the same book, which means reviews are invariably split between people who thought his most recent novel was amazing and those who found it merely very good.

I tend towards plot, myself. I think Banks is at his best when he keeps the scale small, and finds ways to twist and undermine his setting of high-tech, post-scarcity, socialist space dwellers, the Culture. Nobody does huge, mind-boggling scenes like him, but at those galaxy-spanning scales (and when starring the near-omniscient AIs that run the Culture) it's hard to feel like there's much at stake. My favorites, like Matter or Player of Games, combine the large and the small convincingly, hanging the outcome of huge events on the shoulders of fallible, comprehensible characters.

But for his last two books, Banks has tended more towards the huge-explosions-in-strange-places side of things. 2010's Surface Detail spun up a war in virtual Hells that spilled into reality, and now (with The Hydrogen Sonata), he's taken a look at a civilization trying to reach closure, even while long-kept secrets keep pushing up into the light.

I re-read Surface Detail this week, and I like it a bit more than I did the first time around. I still think it suffers from a lack of agency surrounding too many of its characters, who end up simply as pawns being ferried around to each major plot point, but I'll admit that those characters are charming, and the idea of the Hells--virtual worlds set up to punish people even after religion is technically obsolete--is thornier than it first appears.

The Hydrogen Sonata has a lot of the same issues: the events of its plot, while fascinating, are ultimately of dubious importance, and it's not entirely clear if any of the characters actually have real influence on anything that happens. But to its credit, the events of THS are so diverting, you almost don't care. This is Iain Banks doing spectacle at a level he hasn't really tried since Excession, and to a surprising degree it works. It's widescreen science fiction, and he's clearly having fun writing it.

The book opens as the Gzilt, one of the original co-founders (but not members) of the Culture, have decided to leave the material plane and "sublime" to a higher order of existence. Just as they're counting down, however, representatives of another sublimed civilization contact a Gzilt ship, hinting that they may have planted the seeds of Gzilt religion eons ago (and thus prevented them from joining the Culture when they had the chance). This sets off turmoil in the local government, and a gang of Culture ships recruits one former Gzilt military officer, named Vyr Cossont, to hunt down the oldest living survivor of the civilization's founding for a first-hand account of events.

There's not much actual mystery to be had here--Banks telegraphs how things are going to end up pretty quickly. But the fun is in the oversized set pieces being tossed around one after another, from the "Girdlecity" (a giant, elevated metropolis wrapped all the way around a planet's equator) to the hapless group of insects who conduct bee-like dances with their spacecraft while waiting to scavenge on the remains of the sublimed worlds. There's a Last Party being thrown by one rich Gzilt before the subliming that continually tops itself in extravagance. I was also tickled by Cossont's quest to play the titular composition on an instrument called the "Antagonistic Undecagonstring," which means she ends up lugging a bulky and inconvenient music case around the galaxy despite herself (as a bassist, I sympathize).

But while it's enjoyable enough, playing with these toys that Banks assembles, it's hard to shake the feeling that it's all a bit lightweight. The Culture has been set up in these books as tremendously powerful, almost omnipotent--it's run, if that could be said of decentralized anarchosocialists, by AI Minds at the helm of massive, powerful starships, far outclassing any of the other civilizations in the book. When there's a question of how events will turn out, it often reduces to "can ship X reach destination Y in an amount of time defined by the author?" which is not very dramatically satisfying. Like Excession, my least favorite Culture book, much of The Hydrogen Sonata takes place in catty infodumps between the Minds--these can be funny, but they can also read like you've wandered into someone else's e-mail thread by mistake.

Still, die-hard Culture fans like me will take what we can get--even if I'd rather see more plot and less spectacle. Books like The Hydrogen Sonata flesh out a rich, funny, dark universe that Banks has been building for 25 (!) years now. It's good to visit, if only to point and enjoy the sights.

October 18, 2012

Filed under: meta»announce

idclev

I went back and forth on a number of ways to write this up, and eventually decided to just keep it simple: on Monday, I'll be starting a new position on the web team at ArenaNet, developers of Guild Wars 2.

It's funny: I've never actually played Guild Wars or its sequel. They're not really my bag (although I guess I'll have to spend some time in them, now). But ArenaNet, like all MMO developers, generates a terrific amount of data from its simulated world, and I find that potentially fascinating. Along with typical web development (and non-typical--GW2 uses an embedded WebKit view for a number of in-game functions), I'll hopefully be taking a crack at ways to expose and visualize that data for players. I'm looking forward to it.

October 14, 2012

Filed under: tech»coding

Repo Man

Although I've had a GitHub account for a while, I didn't really use it much until last week, when I taught my Seattle Central students how to use version control for project sharing. That lesson was the first chance I'd had to play with GitHub's Windows client (although I develop most of my web code on Linux, I do a lot of JavaScript work on my Thinkpad in Windows). It seemed like a good time to clean up my account and create a few new repositories for projects I'm working on, some of which other people might find interesting.

Code

Code is my personal JavaScript utility belt--I use it for throwing together quick projects, when I don't want to hunt down a real library for any given task. So it provides a grab-bag of functionality: Futures/Promises, an extremely limited template system, shims for Function.bind and Base64 encoding, basic object/array utilities, and a couple of useful mixins. I started building this at CQ and took it with me because it was just too handy to lose. While I doubt anyone else will be using this instead of something like Underscore, I wanted to put it in a repo so that I can keep a history when I start removing unused portions or experimenting.
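To give a flavor of what's inside, here's the general shape of a Function.bind shim (a minimal sketch, not necessarily Code's exact implementation--among other things, a full polyfill also has to handle the new operator, which this skips):

    if (!Function.prototype.bind) {
      Function.prototype.bind = function(context) {
        var fn = this;
        var bound = Array.prototype.slice.call(arguments, 1);
        return function() {
          // call the original with a fixed context, prepending any bound arguments
          var args = bound.concat(Array.prototype.slice.call(arguments));
          return fn.apply(context, args);
        };
      };
    }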

Grue

Grue is undoubtedly a cooler project: it's a small library for quickly building text games like Zork. I originally wanted to port Inform7 to JavaScript, but found myself stymied by A) Inform7's bizarre, English-like syntax, and B) its elaborate rule-matching system. The fact that there's no source to study for the Inform7 compiler (not to mention that it actually compiles down to an older version of Inform, and from there to z-machine bytecode) puts this way beyond my "hobby project" threshold.

Instead, I tried to think about how to bring the best parts of Inform7, like its declarative syntax and simple object hierarchy, to JavaScript. Rather than connecting items directly to each other, Grue provides a "Bag" collection type that can be queried by object property--whether something is portable, or flammable, or contains a certain keyword--using a CSS3-like syntax. Objects also come with built-in getter/setter functions that can be "proxied" to temporarily override a value, or change it based on world conditions.
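In practice, querying a Bag looks something like the following sketch. Fair warning: this is hypothetical--the names and exact selector syntax are illustrative, not necessarily what's in the actual Grue source.

    // hypothetical: game objects as plain objects with properties
    var lantern = { name: "brass lantern", portable: true, keywords: "brass light" };
    var rug = { name: "oriental rug", portable: false };

    var room = new Bag();
    room.add(lantern, rug);

    // CSS3-style attribute queries against object properties
    var portable = room.query("[portable=true]");   // [lantern]
    var brass = room.query("[keywords~=brass]");    // [lantern]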

It takes about 30 lines of Grue code to write the opening scene of Zork I, which is not quite as concise as Inform7, but it's pretty close. I figure Grue is about halfway done--I still need to add more vocabulary, regional rulesets, and some additional types (Regions, Doors, Devices, etc)--but it's close enough to start dogfooding it. Feel free to pull the repo and open "index.html" to see what I've gotten so far.

KeepassDroid

My fork of KeepassDroid exists entirely to scratch a particular itch: I like the Android port of Keepass, but I find its UI to be functional, at best. The fonts are often too small, and forms end up underneath the virtual keyboard more often than not. So I've changed the view styles, and some of the layout XML, just for my own use (there's a .zip with the compiled application package, in case anyone else is interested). The main project doesn't seem interested in my changes, which is fine by me, but it does mean that every now and then I have to merge in changes from trunk if I want mine up to date. Increasingly, I don't bother.

Urban Artistry

And then there's one project I've been working on that's not located on GitHub, but went live this weekend. Ever since I started maintaining the web presence for Urban Artistry, the site has been a mess of PHP files accreted since the company first went online. There was an abortive attempt to move to WordPress in 2010, but it never got anywhere, and it would have used the same theme that someone once described as "a bit like a dark nightclub."

When UA went fully non-profit in the state of Maryland, and asked me to be on the board, one of my goals was to turn the site into something that would be a bit more appealing to the typical grant donor. The new site is intended to do exactly that: my design takes its cues from the UA logo with a lightweight, modern feel. The site is also responsive across three sizes--phone, tablet/netbook, and desktop--and since it's built on WordPress, it's easy for other members of the company to log in and make changes if they need to do so. I'm pretty happy with how things turned out, but the design was the easy part: content is much harder, and that's what we're tackling next.

October 3, 2012

Filed under: tech»coding

Teachable Moments

When you're on top of the world, it's the perfect time to start kicking the little people who lifted you up. At least, that's the only conclusion I can draw from Bret Victor's newest post on teaching code. After he did his presentation on "Inventing on Principle" a while back, the tech community went nuts for Victor's (admittedly impressive) visualization work and approach to live programming. This admiration culminated in Khan Academy's computer science curriculum, which integrates a live Processing environment very similar to Victor's demos. In response, he's written a long post bashing the crap out of it.

Instead, he has a plan to redesign programming itself in a new, education-oriented direction. I'm generally a fan of Victor's presentation work (his touch-based animation UX is phenomenal), but I find that his ideas for teaching tend to be impractical when they're examined closely, and I suspect that's the case here. I don't think it's a coincidence that Victor doesn't seem to spend a lot of time asking if anyone else has solved these problems. A little research should have turned up that someone already wrote the language he's proposing: Scratch.

Scratch isn't terribly pretty--it's designed for kids, and it shows--but it provides almost everything Victor claims he wants. Variables are provided in context, with an instant visual environment that lets users examine routines at any time. The syntax is all drag-and-drop, with clear indications of what is nested where, and there's a stepping debug mode that visually walks through the code and provides output for any variables in use. And as much as Victor wants to push the comparison to "pushing paint," Scratch's sprite-based palette is probably as good as that'll get for programming. That no mainstream programming languages have followed its lead doesn't necessarily indicate anything, but should at least give Victor pause.

In his essay, however, Scratch is nowhere to be found. Victor draws on four other programming paradigms to critique Processing: Logo, Smalltalk, Hypercard, and Rocky's Boots. To say that these references are dated is, perhaps, the least of their sins (although it does feel like Victor's research consisted of "stuff I remember from when I was a kid"). The problem is that they feel like four random things he likes, instead of coherent options for how to structure a learning program. They couldn't possibly be farther from each other, which suggests that these lessons are not easy to integrate. Moreover, using Logo as a contrast to Processing is ironic, since the latter's drawing instructions are strikingly similar (I typically use the Logo turtle to introduce people to canvas graphics). And in Smalltalk's case, the syntax he's applauding is deceptively complicated for beginners (even I find the message rules a little confusing).

Meanwhile, where are the examples that aren't twenty years old? The field hasn't stood still since the Apple IIGS, but you wouldn't know it from Victor's essay. Scratch is the most well-known educational programming environment, but there's no shortage of others, from the game-oriented (Kodu, Game Maker) to actual games (The Incredible Machine, SpaceChem). Where's the mention of the vibrant mod community (UnrealScript is many a coder's first language, and I've had several students whose introduction to coding was writing Lua scripts for World of Warcraft)? Like his Braid-inspired live coding demonstration, Victor's essay gives the impression that he's proposing some incredible innovation only by ignoring entire industries working on and around these problems. It's unclear whether he thinks they're not worth examining, or if he just can't be bothered to use Google.

There's also a question of whether these essays solve problems for anyone but Bret Victor. His obsession with visual programming and feedback is all well and good, but it ignores the large class of non-visual problems and learning styles that exist. As a result, it's nearly all untested, as far as I can tell, whereas its polar opposite (Zed Shaw's Learn Code the Hard Way) has a huge stream of actual users offering feedback and experience.

Let me clarify, in case it seems like I'm simply blaming Victor for failing to completely reinvent computing in his spare time. These essays repeatedly return to visualization as the method of feedback: visualization of time, visualization of data, and code that itself performs visualization. Unfortunately, there's an entire field of programming where a graphical representation is either impossible or misleading (how much of web programming is just pushing strings around, after all?).

Frankly, in actual programming, it's counterproductive to try to examine every value by stepping through the code: if I reach that point, I've already failed all other approaches. My goal when teaching is explicitly not for students to try to predict every value, but to think of programming as designing a process that will be fed values. In this, it's similar to this Quora answer on what it's like to be an advanced mathematician:

Your intuitive thinking about a problem is productive and usefully structured, wasting little time on being aimlessly puzzled. For example, when answering a question about a high-dimensional space (e.g., whether a certain kind of rotation of a five-dimensional object has a "fixed point" which does not move during the rotation), you do not spend much time straining to visualize those things that do not have obvious analogues in two and three dimensions. (Violating this principle is a huge source of frustration for beginning maths students who don't know that they shouldn't be straining to visualize things for which they don't seem to have the visualizing machinery.) Instead... When trying to understand a new thing, you automatically focus on very simple examples that are easy to think about, and then you leverage intuition about the examples into more impressive insights.

"Show the data" is a fine mantra when it comes to news graphics, but it's not really helpful when coding. Beginning coders should be learning to think in terms of data and code structure, not trying to out-calculate the computer. Working with exact, line-by-line values is a distraction at best--and yet it's the primary focus of Victor's proposed learning language, precisely because he's so graphically-focused. The idea that the goals of a visualization (to communicate data clearly) and the goals of a visualization programmer (to transform that data into graphics via abstraction) are diametrically opposed does not seem to have occurred to him. This is kind of shocking to me: as a data journalist, my goal is to use the computer to reduce the number of individual values I have to see at any time. A programming language that swamps me in detail is exactly what I don't want.

I'm glad that people are pushing the state of tech education forward. But changing the way people learn is not something you can do in isolation. It requires practical research, hands-on experience, and experimentation. It saddens me that Victor, who has some genuinely good feedback for Khan Academy in this essay, insists on framing his ideas as grandiose proclamations instead of practical, full-fledged experiments. I don't know that I could honestly say which would be worse: if his ideas were imitated or if they were ignored. Victor, it seems, can't decide either.

September 25, 2012

Filed under: tech»web

DOM If You Don't

I've noticed, as browsers have gotten better (and the pace of improvement has picked up) that there's an increasingly vocal group of front-end developers crusading against libraries like jQuery, in favor of raw JavaScript coding. Granted, most of these are the fanatical comp.lang.javascript types who have been wearing tinfoil anti-jQuery hats for years. But the argument is intriguing: do we really need to include 30KB of script on every page as browsers implement dev-friendly features like querySelectorAll? Could we get away with writing "pure" JavaScript, especially on mobile where every kilobyte counts?

It's a good question. But I suspect that the answer, for most developers, will continue to be "no, use jQuery or Dojo." There are a lot of good reasons for this--including the fact that they deliver quite a bit more than just DOM navigation these days--but the primary reason, as usual, is simple: no matter how they claim to have changed, browser developers still hate you.

Let's take a simple example. I'd like to find all the file inputs in a document and add a listener to them for HTML5 uploads. In jQuery, of course, this is a beautifully short one-liner, thanks to the way it operates over collections:

    $('input[type=file]').on('change', onChange);

Doing this in "naked" JavaScript is markedly more verbose, to the point where I'm forced to break it into several lines for readability (and it's still kind of a slog):

    var inputs = document.querySelectorAll('input[type=file]');
    inputs.forEach(function(element) {
      element.addEventListener('change', onChange);
    });

Except, of course, it doesn't actually work: like all of the document methods, querySelectorAll doesn't return an array with JavaScript methods like slice, map, or forEach. Instead, it returns a NodeList object, which is array-like (meaning it's numerically-indexed and has a length property). Want to do anything other than a length check or an iterative loop over that list? Better break out your prototypes to convert it to a real JavaScript array:

    var inputs = document.querySelectorAll('input[type=file]');
    inputs = Array.prototype.slice.call(inputs);
    inputs.forEach(function(element) {
      element.addEventListener('change', onChange);
    });

Oh, yeah. That's elegant. Imagine writing all this boilerplate every time you want to do anything with multiple elements from the page (then imagine trying to train inexperienced teammates on what Array.prototype.slice is doing). No doubt you'd end up writing yourself a helper function to abstract all this away, followed by similar functions for bulk-editing CSS styles or performing animations. Congratulations, you've just reinvented jQuery!
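For the record, that reinvention starts small--something like this sketch (the $$ name is just a convention borrowed from browser consoles, not any particular library):

    // query the document, returning a real array instead of a NodeList
    var $$ = function(selector, context) {
      var list = (context || document).querySelectorAll(selector);
      return Array.prototype.slice.call(list);
    };

    $$('input[type=file]').forEach(function(element) {
      element.addEventListener('change', onChange);
    });

One helper leads to another, and pretty soon that 30KB of jQuery looks like a bargain.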

All this could be fixable if browsers returned native JavaScript objects in response to JavaScript calls. That would be the logical, sensible thing to do. They don't, because those calls were originally specified as "live" queries (a feature that no developer has ever actually wanted, and which exists to implement the now-obsolete DOM-0 document collections), and so the list they return is a thin wrapper over a native host object. Even though querySelector and querySelectorAll are not live, and even though we knew by the time they were implemented that this was an issue, they still return wrappers around host objects with the same impedance mismatch.

Malice or incompetence? Who knows? It looks to me like the people developing browser standards are too close to their rendering engines, and not close enough to real-world JavaScript development. I think it's useful for illustration purposes to compare the output of running a DOM query in Firebug vs. the built-in Firefox Web Console. The former gives you a readable, selector-based line of elements. The latter gives you an incomprehensible line of gibberish or a flat "[Object HTMLDivElement]", which when clicked will produce a clunky tree menu of its contents. Calling this useless is an insult to dead-end interfaces everywhere--and yet that's what Mozilla seems to think is sufficient to compete with Firebug and the Chrome dev tools.

At any given time, there are at least three standards committees fighting over how to ruin JavaScript: there's ECMA TC-39 (syntax), W3C (DOM standards), and WHATWG (HTML5). There are people working on making JavaScript more like Python or CoffeeScript, and people working on more "semantic" tags than you can shake a stick at. But the real problems with JavaScript--the reasons that everyone includes that 30KB of jQuery.js--do not have anything to do with braces or function keywords or <aside> tags. The problem is that the DOM is a horrible API, from elements to events, filled with boilerplate and spring-loaded bear traps. The only people willing to take a run at a DOM 2.0 require you to use a whole new language (which might almost be worth it).

So in the meantime, for the vast majority of front-end developers (myself included), jQuery and other libraries have become that replacement DOM API. Even if they didn't provide a host of other useful utility functions (jQuery's Deferred and Callback objects being my new favorites), it's still just too frustrating to write against the raw browser interface. And library authors recognize this: you can see it in jQuery's planned removal of IE 6, 7, and 8 support in version 2.0. With the worst of the cross-compatibility issues clearing up, libraries can concentrate on writing APIs to make browsers more pleasant to develop in. After all, somebody has to do it.

September 18, 2012

Filed under: gaming»software»blendo

Nuevos Aires, 1960

When I bought a new computer a little while back, I figured it would be a chance to play some of the Steam/GOG.com games that I bought while they were on sale, knowing that my laptop couldn't handle them. And in one or two cases, it is. But for the most part, I spent last week's small amount of gaming time buried back in a trio of titles from Blendo--a one-man shop that's becoming my favorite indie developer.

Blendo (AKA Brendon Chung) is best known right now for Thirty Flights of Loving, sequel to his absurdist spy short Gravity Bone. It's a funny, cinematic little nugget of first-person narrative. It's also about seven minutes long. I'm not sure it was worth the $5 asking price, but Chung's definitely playing with some ideas here that are worth rewarding.

Besides, he had my good will starting from my first minutes playing Flotilla last year. This was a game that I'd wanted, but somehow had not been able to find: full 3D space tactics within a randomly-generated campaign. The missions themselves are tense, slow-moving affairs set to classical piano pieces, while the overworld screens are Blendo's typically jazzy blend of surrealism (Rastafarian pirate cats, defanged space yeti, and wandering Greek goddesses appear along your journey) and procedural storytelling (decisions along the way are assembled into an illustrated ship's log). The combination of the two should be dissonant, but instead the funny bits serve as a nice break between the tense turn-by-turn sections.

And then there's Atom Zombie Smasher, which is the most unbalanced and most compelling of the three. It's basically a tower defense game, which means I should hate it, and yet somehow I really don't. It's ridiculously unfair--sometimes you get an overwhelming mix of units for a stage, and sometimes you just get barricades and mines, meaning that I tend to win or lose the whole game depending on which two units are randomly assigned in the first stages--and yet tremendously addictive. Maybe that's just the surf guitar talking.

Ultimately, I think what charms the most about these is that they almost remind me of board games in their approach to design and replayability. Even though they're radically different genres, Blendo's stuff shares a common sensibility in the way that they construct stories out of small vignettes and procedural generation. Each takes, at most, an evening to play completely through, and yet there's plenty of detail and reward for digging in. They continue to surprise players outside of all proportion to their actual size. There aren't a lot of people making games in this space--it's all either bite-sized casual fare or sprawling epics. Chung's genius is making the former feel, if only for a little while, like the latter.

September 13, 2012

Filed under: music»performance»dance

Style Wars

On Google Plus, for no reason other than it seemed like a good idea at the time, I've started writing posts about dance videos on YouTube--either footage of events I've been to, or cool examples that I've seen other dancers posting elsewhere. It's a fun chance to introduce people to urban dance culture, as well as good mental practice.

In DC, and particularly as a part of Urban Artistry, I got used to seeing people switch between styles regularly--and in some cases to unexpected combinations, like African or waacking. Here, that seems less common: there are a lot of b-boys and a lot of dubstep poppers, and most of them stick to their particular specialty. Which is a shame, because I'm starting to realize, both from watching events live and on YouTube, that a great all-styles battle is pretty much my favorite kind.

What makes all-styles battles so great? I think it's a combination of factors:

  • They can be surprising. I've been to long b-boy battles, and by the thirtieth crew it doesn't matter what they do: the crowd is dead from watching so many windmills, jackhammers, and flares. In all-styles battles, by comparison, the moves are probably less strenuous, but you never know what's coming next.
  • They're more musical. The tricks of a given style take on more importance in single-style battles, but when two dancers come from completely different traditions, what they have in common is dancing, not power moves or acrobatics. That makes these match-ups a lot more inviting for newcomers, too.
  • They're more challenging. You never know what music someone will be dancing to. At some jams, I've seen the contestants allowed to pick their own genre, but most of the time it's just whatever the DJ feels like playing--and I think they use these as opportunities to dig out some real oddball tunes. If nothing else, watching some poor hapless dubstepper forced to dance to disco music is always amusing.

Dancers often talk about movement as a conversation. But battling across styles, especially when all the participants have more than one skillset, is a great way to literalize that. Take this clip from Northwest Sweet 16 a few months ago, where both crews trade exchanges back and forth, often beginning their turn in one genre as a response and then transitioning to another as a challenge. It's a ridiculously good show from a group of talented Vancouver dancers.

I wish there were more jams like that. Just as diverse workplaces are more productive outside the cypher, all-styles battles are great inspiration for dancers of all types inside it. I'm trying to take that lesson to heart, and keep learning new dances (even ones I don't really like). A couple of weeks ago I took a class in waacking (best Wikipedia article ever). I have no hope of ever being good at it, but learning how its distinctive arm movements work gave me a ton of new ideas for strutting and locking (and it was a great workout). Specialization would have robbed me of that experience.

I'm always amazed by people who say that you should focus solely on a single style. Almost everywhere in my life, learning in one area has enriched and illuminated others, and dance is no exception. For that matter, if I weren't trying to stay open to new influences, I wouldn't have started b-boying in the first place. Whether you're a dancer, a writer, a coder, or whatever, it's almost always a good idea to take a moment once in a while and do something strange and uncomfortable. It might be inspiring, or a surprising mix with what you already know. You might like it.

September 7, 2012

Filed under: seattle»tourism

Sponsored by Your Local Tourism Board

Or: Things to Do in Seattle When You're Dead

We've gotten settled in, been escorted around, and taken a few visitors around the city ourselves. This is my checklist for cool places to visit in Seattle when we have guests. It is by no means exhaustive--we've only been here about a year--but it's a start.

Go to these places, do these things:
  • The Chihuly Glass Gardens in the shadow of the Space Needle don't sound very exciting. Blown glass: whee. I think I expected paperweights and bubbles, but it turned out to be a genuinely amazing experience. Chihuly's work ranges from vase-like structures based on woven baskets to intricately-detailed sea creatures to abstract explosions of color and form. The exhibition in the garden, which sets the sculpture among natural textures and colors, just caps things off.
  • Working at the World Bank spoiled me for chocolate, because people would bring back really great candy from around the world. If you're in the same boat, you should definitely take the Theo Chocolate factory tour in Fremont, where you can try some very good chocolate and discover that the raw cacao nibs are honestly kind of disgusting.
  • Who doesn't go to the library on their vacation--not before, but during? But the Seattle Public Library's central branch is a huge, constantly-surprising place. An enormous glass structure 11 stories high, it contains whole floors painted in vivid, shocking colors. There's a theater, a huge skylit reading room, and a ton of interesting art. Besides, it's downtown, just a short walk from all the more typical tourist fare, like the Underground Tour and Pike Place Market.
Skip these things and go eat at Paseo instead:
  • The Space Needle: it's just a very tall building. If you've never been anywhere with tall buildings, maybe that's thrilling. If it's a clear day and you can see Rainier, it's pretty cool. But it's not $14 cool.
  • The outside of the Experience Music Project, with its flowing sheets of anodized metal, is supposed to look like a smashed guitar (it really doesn't). Inside, it was supposed to be a rock and roll museum, but they couldn't quite get that to work out, so they added a permanent science fiction wing. The whole thing is kind of vaguely embarrassing, honestly, especially if you come from a city like DC where the disappointing museums have the courtesy to be free.
  • Forks, Washington. The best part of Forks is that it keeps all the sparkly vampire drama way over on the other side of both the Puget Sound and a large national park.
