
November 23, 2010

Filed under: tech»activism

The Console Model Is a Regressive Tax on Creativity

This weekend I dropped in on the second day of the 2010 DC PubCamp, an "unconference" aimed at public media professionals. I went for the mobile app session, where I got to meet another NPR Android developer and listen to an extremely belligerent nerd needlessly browbeat a bunch of hapless, confused program managers. But I stuck around after lunch for a session on video gaming for marginalized communities, hosted by Latoya Peterson of Racialicious. You can see the slides and Latoya's extensive source list here.

The presentation got sidetracked a bit early on during a discussion of internet cafes and gender in Asia, but for me the most interesting part was when Latoya began talking about the Glitch Game Testers, a project run by Betsy DiSalvo at the Georgia Institute of Technology. DiSalvo's program aims to figure out why there are so few African Americans, and specifically African American men, in the tech industry, and to encourage those kids to engage more with technology.

The researchers found several differences between the play patterns of DiSalvo's black students and those of their white peers: the minority gamers began playing younger, tended to play more with their families and less online, viewed gaming as a competition, and were less likely to use mods or hacks. These differences in play (which, as Latoya noted, are not simply racial or cultural, but also class-based) result in part from the constraints of gaming on a console. After all, a console is one shared family resource hooked up to another (the television), meaning that kids can't sit and mess with it on their own for hours. Consoles don't easily run homebrew code, so they don't encourage experimentation with programming.

Granted, some of these constraints apply to a PC as well. I didn't have my own computer until high school, and my parents didn't really want me coding on the family Gateway. But I didn't have to leave the computer when someone else wanted to watch television, and I was always aware (for better or worse, in those DOS/Win 3.1 days of boot disks and EMS/XMS memory) that the computer was a hackable, user-modifiable device. Clearly, that was a big advantage for me later on in life. In contrast, console gamers generally don't learn to think of software as mutable--as something they themselves could experiment with and eventually make a career at.

It's hopelessly reductionist, of course, to say that consoles cause the digital divide, or that they're even a major causal factor compared to problems of poverty, lack of role models, and education. But I think it's hard to argue that the console model--locked-down, walled-garden systems running single-purpose code--doesn't contribute to the problem. And it's worrisome that the big new computing paradigm (mobile) seems determined to follow the console path.

Set aside questions of distribution and sideloading just for the sake of argument, and consider only the means of development. As far as I'm aware, no handheld since the original DragonBall-powered PalmOS devices has allowed users to write a first-class application (i.e., one given equal placement in the shell and full or nearly-full OS access) on the device itself. At the very least, you need another device--a real, open computer--to compile for the target machine, which may be burden enough for many people. In some cases, you may also need to pay a yearly fee and submit a lot of financial paperwork to the manufacturer in order to get a digitally-signed certificate.

I think it's safe to say that this is not an arrangement that favors marginalized communities. It wouldn't have been favorable to me as a kid, and I come from a relatively advantaged background. In terms of both opportunity cost and real-world cost, the modern smartphone platform is not a place where poor would-be developers can start developing their skills. As smartphones become more and more the way people interact with computers and the Internet, a trend like this would largely restrict self-taught tech skills to the white and the wealthy.

The one wild-card against this is the web. We're reaching the point where all platforms ship with a decent, hackable runtime in the form of an HTML4/5-compatible browser. That's a promising entry point: I don't personally think it's as accessible as native code, and there are still barriers to entry like hosting costs, but JS/HTML is a valid "first platform" these days. Or it could be, with one addition: the all-important "view source" option. Without that, the browser's just another read-only delivery channel.
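To make that concrete, here's a rough sketch of what a complete "first program" can look like on the web: a single made-up file, written in any text editor, opened by double-clicking it, and fully inspectable by anyone who hits "view source." The page and its contents are just placeholders, but the point is that nothing here requires a second machine, an SDK, or a signing certificate.

<!DOCTYPE html>
<html>
  <head>
    <title>My first page</title>
  </head>
  <body>
    <button id="go">Click me</button>
    <script>
      // When the button is clicked, add a line of text to the page.
      // No compiler, no developer fee, no approval process--just save
      // the file and reload the browser.
      document.getElementById("go").addEventListener("click", function () {
        var line = document.createElement("p");
        line.textContent = "Hello, world!";
        document.body.appendChild(line);
      });
    </script>
  </body>
</html>

And because the whole thing travels as plain text, anyone who stumbles across a page like it can pull it apart, change it, and see what happens--which is exactly the kind of tinkering the console model shuts down.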

I think it's self-evident why we should care about having more minorities and marginalized groups working in technology. It's definitely important to have them in the newsroom, to ensure that we're not missing stories from a lack of diversity in viewpoint. And while it's true that a huge portion of the solution will come from non-technological angles, we shouldn't ignore how the technology itself is written in ways that reinforce discrimination. Closed code and operating systems that are actively hostile to hacking are a problem for all of us, but they hit marginalized communities the hardest, by eliminating avenues for bootstrapping and self-education. The console model is a regressive tax on user opportunity. Let's cut it out.
