this space intentionally left blank

July 27, 2011

Filed under: culture»pop»comics

Original Recipe

It's a big year for superhero movies. I wouldn't say it's a good year, but it's certainly been very big, and for better or worse there's more on the way. And you know what that means: origin stories for everybody!

The origin story seemingly defines the comic book flick, for reasons I simply can't understand. The assumption seems to be that the most interesting thing about the title character is "how they got superpowers." This despite the fact that most superhero backstories are either silly or tedious, falling into two main categories: it's either Dude Invents Gadgets or Dude Is Given Power Through Unlikely Means (only men get origin stories in the movies, possibly because women superheroes are relegated to supporting members of ensemble casts in the X-Men series). And then comes the training montage! Whee.

Here's the mindboggling part: the second movie in every superhero franchise is almost always the best one, precisely because it doesn't have the baggage of the origin story dragging it down. The sequel can ask the interesting questions raised by the premise (both general--what do I *do* with this power?--and specific--what do I do with *this* power?). Meanwhile, viewers who skip the first movie in a franchise aren't going to miss anything important that can't be recapped in a few lines of dialog anyway.

Spiderman 2 had a chance to engage with Peter Parker's double life because it didn't have to waste time on spider-bites and pro wrestling. The Dark Knight could explore the implications of vigilantism because it skipped an hour and a half of inserting bat-tab A into bat-slot B. The second X-Men movie is generally considered the best of the three--maybe because it could jump straight to a team dynamic instead of being Wolverine Tours The X-Mansion.

The exception that proves the rule, of course, is the Iron Man franchise, mainly because watching Robert Downey Jr. goof around for a couple of hours (the first film) is infinitely more fun than watching computer-animated Iron Man suits beat each other up (the second).

But the origin story is so entrenched at this point that it's become part of the money-making strategy: the Marvel movies are all origin stories for themselves, but they're also extended prequels to the inevitable Avengers team movie (which, let's be honest here, is going to be terrible). Spiderman is being rebooted, not because there was anything wrong with Raimi's version, but because it has to be integrated into the marketing plan (Peter Parker's parents are now SHIELD agents, a twist which adds absolutely nothing to the character). We'll get to watch the same movie for 80% of its running time, but with the kid from The Social Network in the starring role, and in over-priced 3D.

No one will ever let me make a comic book movie, since I'd probably turn the whole thing into a sprawling experiment in genre deconstruction, ending with the heroes buried under criminal charges. But if I somehow found myself at the helm of, say, the Authority movie, I'd start in medias res and take the first left turn I could find, because origin stories are boring and they're lazy. They presume that the audience A) needs the premise and character relationships explained slowly to them, and B) cares more about comic continuity than any sane person actually could. Why show, these movies ask, when you can tell in excruciating detail?

Yet there's a reason that the best parts of X-Men: First Class are the scenes where Magneto systematically chases down the Nazis that killed his family, so much so that everyone leaves the theater wishing that the movie had actually been two hours of Magneto: Suave Nazi Hunter. Nobody cares where a superhero comes from. We care what the character does with that power. That's the misunderstood genius of Spiderman's mission statement: the drama isn't in the "great power," it's in the "great responsibility." If only superhero flicks tried to live up to that, every once in a while. They'd still mostly be horrible, probably, but they'd fail in more interesting ways.

July 13, 2011

Filed under: culture»internet

N'est pas un blog

There's one quirk in the publishing process that has always driven me crazy (what, just one?). When we add stories to the main news section of the site, our CMS requires a separate entry for the teaser, headline, and any related links or images. Inside the building, they call these entries--each individual one, mind you--"blogs." Every time I hear it ("I'm going to write a blog for the new debt story." "Can you update the blog to add that link?" "Let's blog a new photo in the top blog BLOG BLOG BLOG.") I want to grind my teeth into little featureless nubs. Which I will call "blogs," because why not? It's not like words mean something!

Breathe. Calm. Find my happy place: Puppies. Sandwiches. Empty spreadsheets. Anyway...

After Google+ launched, something similar happened. People looked at the service, which combines Twitter's asymmetric-follow model with Facebook's rich content stream, and apparently thought to themselves "hey, this thing could be my new blog." Either they redirected their entire domain to their G+ profile (most notably Digg founder Kevin Rose) or (more commonly) they used G+ to write the kind of long-form posts that have traditionally been the province of blogs, whether home-grown or hosted on a platform like Wordpress or Blogger (similar long-form content creation has been attempted on Facebook, but it never really took off). I'm not a fan of this, obviously. It seems to rest on a real misconception of what blogging is, how it has developed, and where its strengths lie.

In 2011, it's past time we understood blog culture. The practice of blogging is at least a decade old now. I realized the other day that I've been doing it here for more than seven years, and I was relatively late to the party. So while I typically hate people who draw large categorical distinctions between, say, "bloggers" and "journalists" almost as much as I hate calling our ledes "blogs," it's not wrong to say that there is a different flavor to the way I publish here, compared to either standalone pieces or social network status updates. I think a lot of it comes down to the surrounding context.

A blog post is not an independent document in the way that (for example) newspaper stories on the same page would be. It's part of a larger dialog with the writer's surroundings, be those people or events. Most of the innovations in blogging--permalinks, comments, blogrolls, trackbacks, and organization-by-tagging, to name a few--revolve around exploring that dialog, implicitly or explicitly. When I write a post, it's informed by many of the posts that came before, by the audience that I expect to read it, and the direction I'm trying to take the blog as a cohesive work-in-progress.

Social networks have some of these aspects: they create dialog, obviously, and they allow sharing and permalinks. But social networks like Facebook and Google+ are not persistent or cohesive the way that a blog is. When you add a status update or whatever they're calling it these days, it's an ephemeral part of your lifestream (to use a now-unfashionable term), alongside all kinds of other activity from across your connections. Unlike a blog, those status updates are not a purposeful body of work. They're a rolling description of you, accreted from the detritus of what you do and what you like. Which is a useful thing to have, but a distinctly different experience from writing a blog.

My blog exists separately from me, while my social media profile is a view of me. That doesn't mean that they're not both valuable ways of interacting with the world. Social networks are great for retaining a kind of "situational awareness" of what my friends are doing, and to maintain a basic connection with them. It's like small talk: it doesn't replace real interaction, but it keeps us from becoming strangers between visits. Blogging, on the other hand, is where I feel like I can dig in and engage mentally. I don't have to worry about being rude by taking over someone's stream, or getting hidden behind a filter. It's a space that's all mine to use, controlled by me, and expressly used for my own purposes. A blog is a place to be a little selfish.

From a technical perspective, a blog is the more curated experience. When someone writes a blog entry, it gets published in a standard, exportable format via RSS. It lives in a database (or in my case, a filesystem) that can be edited and moved. Writing on a blog is property that you own and control from the start. Writing on a social network, although possible to extract through various APIs or Google's Data Liberation Front, is not under your control in the same way. That may not matter to you now, but one day, if you decide that you want to preserve those words--if you think your writing could become a book, or you want to give your favorite entries to a loved one, or if you just want to keep them for your own satisfaction--a blog is probably a better option.

There are places of overlap, I think. By relaxing the character constraints, Google+ makes it possible to at least present more complex thoughts than Twitter, and it's a better writing experience than Facebook is. But when people say that they're planning on using Google+ as a blog, I can't help but think that what they really mean is "I didn't really want to blog anyway." I'm glad they've found a solution that works for them: not everyone is cut out to be a blogger. Some days I don't feel like it myself. But when I look back on writing here, on how I feel like I could develop a voice and indulge my obsessions, I wouldn't give this up for all the fancy circles in the world.

March 30, 2011

Filed under: culture»internet

Tone Matters

Last week Gina Trapani wrote an insightful post on Smarterware about designers, women, and hostility in open source, and how she's applied those lessons to her social-networking scraper ThinkUp by welcoming non-coders and contributors from a diverse range of backgrounds. And Trapani's been working around those kinds of problems for a while now: as the founding editor of the Lifehacker blog, she created a space for tech-minded people that was a breath of fresh air. From the post:

At Lifehacker, my original vision was to create a new kind of tech blog, one that wasn't yet another "boys worshipping tech toys" site, one that was helpful, friendly, and welcoming versus snarky, sensational, and cutting. (That was no small task in the Gawker-verse, and I learned much in the process.) ...

I learned something important about creating a productive online community: leaders set the tone by example. It's simple, really. When someone you don't know shows up on the mailing list or in IRC, you break out the welcome wagon, let them know you're happy they're here, show them around the place, help them with their question or problem, and let them know how they can give back to the community. Once you and your community leaders do that a few times, something magical happens: the newbie who you welcomed just a few weeks ago starts welcoming new folks, and the virtuous cycle continues.

One of the things that has struck me, as I've paid more attention to the tech news ecosystem online, is how rare that attitude really is. Trapani alludes to the general tone of the Gawker properties, which start at "snark" and work down, but (thanks to imitation and diffusion of former Gawker writers) most of the other big tech blogs sound pretty much the same, which is one of the major reasons that I dropped them from my usual reading habits and blocked them in my browser.

The typical industry blogger persona is aggressive, awestruck (both as irony and as an expression of genuine, uncritical neophilia), and uncompromising to other viewpoints. When they break their pose of cynicism, they usually rave in extreme superlatives, as though it simply wouldn't do to be quietly amused by something. The tone is, in other words, exactly what you'd expect from a group of young men who arrested their development at a precocious seventeen years old--and I say this as someone who is not entirely innocent of similar sins, as a trip through the archives here would show.

For as long as I can remember, Lifehacker has done something different. Along with Make and Hackaday, its writers were less interested in tossing off snarky burns at the expense of the day's news, and more focused on appreciating the efforts of their communities. Even though Trapani has moved on, it remains an easy-going, welcoming read. I think they deserve kudos for that. It's certainly a style I'm trying to imitate more--to highlight the positive as much as I call out the negative.

When it comes down to it, I think each of us has to ask ourselves whose future we'd prefer to realize. If I were forced to live in a world represented by TechCrunch, or one built by Lady Ada, I know which one would feel more welcoming. The latter values knowledge, in my opinion, while the former values commercial product--like the difference between learning to cook and reviewing frozen dinners, the result is probably less polished, but ultimately more nutritious.

Tone has consequences beyond self-realization: Lifehacker and Make, in particular, have really encouraged me to take a closer look at open-source hardware and software in a positive way, just because their bloggers are relentlessly cheerful, low-key advocates for those communities. As Trapani noted, that kind of appeal can be a virtuous cycle. A diverse, positive tech community should be more likely to apply its energy to projects that reflect its membership, instead of an endless supply of hipster social networks and expensive new hardware. I'm glad there are writers like her out there, trying to make that a reality.

February 15, 2011

Filed under: culture»corporate


About five years ago, I designed my own business cards. On the front, they had my contact info and a stamp of my name in Mandarin Chinese that I'd gotten in Xi'an. On the back, there was a QR code containing a vCard of everything on the front, which was supposed to show people that I was way ahead of my time (this was, in fact, so far ahead of its time that nobody was ever able to scan one of the stupid things until last year).

Anyway, of the 500 or so cards I had printed, I probably still have 450 of them sitting on a shelf at home. Partly this is because I wasn't nearly as big on actual networking as I was on having a cool business card, but it's also a function of where the world is going: nobody keeps a Rolodex full of cards anymore, and our address "books" live behind a touchscreen or on an Exchange server. And while I may have been a bit hasty in adopting them, these days QR codes and digital tags are everywhere. Machine-readable data has invaded the everyday world.

So this year, I'm taking an admittedly small risk and calling it: now is the time to ditch physical business cards. It'll save paper and money, reduce clutter and littering at conferences, and best of all, it'll genuinely put you on the cutting edge of digital networking.

Now you may be saying to yourself: sure, Thomas can do this--he's a bona fide misanthrope--but how can regular people get away with it? Good question. Here are a few easy strategies for going cardless:

  • Install a QR code generator on your smartphone: If you're an Android user, you probably have one already--it comes with Google's stellar Barcode Reader application. Pick a contact, choose "Share" from the menu, and presto: a code pops up that someone can scan right off the screen. I'm sure other platforms have something similar. You'll need to create an address book entry for yourself--something old-school Palm users will remember from the days of IR contact beaming.
  • Upload a vCard: I tried this the other day, and it's surprisingly nifty. Give someone the URL to the card, and it'll download to their phone or computer and prompt to be added to their address book. Once you've got it working, you could even use a link shortener to make a custom (trackable) URL, and just update the .vcf file when your details change. You'll need a server you can control, because it needs to send the content-type as "text/x-vcard" and the content-disposition as "attachment" for Android and Blackberry phones to understand it correctly (an .htaccess file even lets you set the card as the 'index' for a directory). iPhones are, perhaps unsurprisingly, slightly less cooperative.
  • Once again, own your name: I can't repeat this enough. Of course, it helps if your name is relatively uncommon, and even then you never know when an ad agency will try to steal your thunder. But owning a searchable, easy-to-remember domain is a great way to present yourself, not to mention a fine place to host a copy of your QR code and your vCard file.
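
If you're serving the vCard from Apache, the header tweaks described in that second bullet boil down to a few lines of .htaccess. This is a minimal sketch, assuming mod_headers is enabled; the filename card.vcf and the directory layout are placeholders for your own setup:

```apache
# Send .vcf files with the MIME type Android and BlackBerry expect
AddType text/x-vcard .vcf

# Force a download/import prompt instead of displaying the file inline
# (requires mod_headers)
<FilesMatch "\.vcf$">
  Header set Content-Disposition "attachment; filename=card.vcf"
</FilesMatch>

# Serve the card as the directory's default "index" page,
# so a short URL like example.com/card/ delivers it directly
DirectoryIndex card.vcf
```
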

In a few years, this'll all probably seem like old hat. That's why it's important to jump on the card-less trend now, so we can look down our noses at the luddites handing out paper slips (and manually copying them into their computers) while we still can.

I kid, of course. Seriously, though: set aside the snobbery, the savings in money and paper, the confetti of unwanted cards after a professional meetup, and the chance to demonstrate your new media credentials. Won't it feel good just to not have to carry around a stack of disposable business cards wherever you go? I feel lighter already.

January 25, 2011

Filed under: culture»internet

Blink and You'll Miss It

I tell everyone they should have Firebug or its equivalent installed, and know how to use it. It's invaluable when you're designing a page and want to test something, or do a little in-page scripting. You can examine a page's source for ideas, or to discover hidden items. But most importantly, you can use it to fix your stupid, unreadable, over-styled web page.

The development of HTML5 means that browsers have gotten more powerful, more elaborate, and more interactive. It also means that they can be annoying in new and subtle ways. Back in the day, page authors used <blink> and <marquee> to create eye-catching elements on their flat gray canvas. Nowadays, thanks to pre-made CMS templates, the web superficially looks better, but it's not necessarily easier to read. Take three examples:

  • Text shadow: There's nothing wrong with a little text shadow. It's a classy effect for giving headlines a little pop. But I've noticed a lot of people lately using text-shadow everywhere, which makes my eyes cross trying to focus on mysteriously blurry text. Why is my browser anti-aliasing broken? Oh, it's not, you just hate your audience.

    Even worse are the people who have realized you can give the shadow an offset of zero pixels. If the shadow is dark, this ends up looking like the page got wet and all the ink has run. If it's a lighter shadow, you've got a poor man's text glow. Remember how classy text glow was when you used it on everything in Photoshop? Nobody else does either.

  • Custom fonts: I sympathize with font nerds on the Internet. It must have been painful, being restricted to the built-in browser fonts. On the other hand, those were the defaults for a reason. Now every wannabe font nerd on the Internet can finally use their own Very Special Typeface, many of which are more "fancy" than "legible." But why stop there? Why not customize the letter-spacing and line-height while you're at it?

    I'm not an expert in typesetting or anything, but the effect of these changes--besides sometimes giving Comic Sans a run for its ugly font money--is to throw me out of my browsing groove, and force me to re-acquire a grip on the text with every link to a custom page. If I'm not expecting it, and the font is almost the same as a system font, it looks like a display error. Either way, it's jarring, and it breaks the feeling that the Internet is a common space. Eventually, we'll all get used to it, but for now I hate your custom fonts.

  • "Pixel-perfect" layouts: Once upon a time, we had no real positioning tools in HTML. Oh, there was <table>, but building a layout using tables is like trying to design a travel brochure in Excel. And then someone invented CSS, and between "float" and "position: absolute" they ushered in a whole new era of web design. A new, fragile era, one that was vulnerable to a whole host of CSS sins. If I had a dime for every page I've visited where content overflowed the elaborate layout and got hidden behind an ad or a sidebar, I'd be making money in a very strange way. Setting "display: none" on offending elements has become an unfortunately common part of my browsing experience. It's enough to make me wish for the days when every page was a simple, boring, single black-on-grey column of text.
  • Rebinding key events: Seriously, IGN? Taking away the spacebar and the page-up/down keys for no apparent reason so I can't scroll the page on the rare occasions that I'm linked to your increasingly-awful content farm? You're why we can't have nice things.
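
The shadow effects above are literally one-liners, which may explain their overuse. Here's a quick sketch of the two zero-offset variants I described (the class names are my own invention):

```css
/* Zero-offset dark shadow: the "page got wet and the ink ran" look */
.wet-ink {
  text-shadow: 0 0 4px #000;
}

/* Zero-offset light shadow: a poor man's text glow */
.budget-glow {
  color: #444;
  text-shadow: 0 0 6px #fff;
}
```
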

It's no wonder, in an environment like this, that style-stripping bookmarklets like Readability caused such a sensation. There's a fine line between interactive design and overdesign, and designers are crossing it as fast as they can. All I ask, people, is that you think before getting clever with your CSS and your scripts. Ask yourself: "if someone else simulated this effect using, say, a static image, would I still think it looked good? Or would I ask them what Geocities neighborhood they're from?" Take a deep breath. And then put down the stylesheet, and let us read in peace.

January 11, 2011

Filed under: culture»internet

Good Grief

Tim Ferriss was a real-world griefer before real-world griefing was cool. Before Anonymous was putting epileptic kids into seizures, DDOSing the Church of Scientology, and harassing teenage girls for no good reason whatsoever, Ferriss (through sheer force of narcissism) had already begun gaming whatever system he could get his hands on. And now he writes books about it. The question you should be asking yourself, as you read this tongue-in-cheek New York Times review of Ferriss's "four-hour workout" book is, did he write this to actually teach people his idiosyncratic health plan? Or (more likely) is it just the newest way Ferriss has decided to grief the world, via the NYT bestseller list?

Griefing, of course, is the process of exploiting the rules of an online community to make its members miserable. Griefers are the people who join your team in online games, and then do everything possible to sabotage your efforts. It's a malevolent version of the "munchkin" from old-school RPGs, where a player tries to find loopholes in the rules, except that griefers aren't playing to win--they're playing to get a reaction, which is much easier. The key is in the balance--a griefer or munchkin is looking to maximize impact while minimizing effort. That's basically what Ferriss is doing: he power-games various external achievements, like kickboxing or tango, not for their own sake, but to boost his own self-promotional profile.

The problem with writing about reputation griefers like this guy is, for them, there really is no such thing as bad publicity. They want you to hate them, as long as it boosts their search ranking. And there are an awful lot of people out there following similar career plans--maybe not as aggressively, almost certainly not as successfully, but they're certainly trying. They may not realize that they're griefing, but they are. Affiliate marketers? Griefing. Social networking 'gurus' who primarily seem to be networking themselves? Griefing. SEO consultants? Totally griefing.

Like a zen student being hit with a stick, I achieved enlightenment once I looked at the situation this way: it's the Internet equivalent of being a celebrity for celebrity's sake. Or, perhaps more accurately, griefing provides a useful framework for understanding and responding to pointless celebrities elsewhere. Maybe this is one way that the Internet, for all its frustrations and backwardness and self-inflicted suffering, can make us better people.

The one thing I've learned, from years of "Something Is Wrong On The Internet," is that the key to dealing with griefers--whether it's a game of Counterstrike, Tim Ferriss, or the vast array of pundits and shock jocks--is right there in the name. They benefit from getting under your skin, when you treat them as serious business instead of something to be laughed off. As Belle and I often say to each other, you can always recognize people who are new to the dark side of the Internet's ever-flowing river of commentary by the gravity they assign to J. Random Poster. We laugh a little, because we remember when we felt that way (sometimes we still do), before we learned: it takes two people to get trolled. Don't let them give you grief.

May 26, 2010

Filed under: culture»internet

The Classics

Most book-lovers, I think, have a shelf devoted to their favorite books. It's always half-empty, because those are also the books they lend out when someone asks for a recommendation--oh, you haven't read something by X? Here you go. I love that shelf, even if I rarely lend books: it's where the private activity of reading becomes a shared experience, either through borrowing or via representation: these are the books that have deeply affected me. Maybe they'll affect you, too.

Likewise, there is writing on the Internet that is classic: essays, articles, and fiction that get linked and re-linked over time, in defiance of the conventional wisdom that online writing is transient or short-lived. The Classics are a personal call: what goes on your mental shelf of great online writing won't be the same as mine, and that's okay. This post is a collection of the items that I consider must-reads, accumulated over years of surfing. As I dig stuff out of my memory, I'll keep adding more.

  • Flathead, by Matt Taibbi: the definitive critique of Thomas Friedman. Everything you need to know about one of the world's worst pundits.
  • Creating The Innocent Killer, by John Kessel: an evisceration of Orson Scott Card's manipulative, deceptive Ender novels.
  • Host, by David Foster Wallace: a lengthy profile of radio host John Ziegler, not to mention a brilliant example of hypertextual footnotes.
  • The Zompist Phrasebook, edited by Mark Rosenfelder: the last language phrasebook you'll ever need.
  • The Rhetoric of the Hyperlink, by Venkatesh Rao: essential reading for people going from print to online.
  • The Grim Meathook Future, by Josh Ellis: there are problems with this essay if it's taken straight, as Ellis found out when he presented it to an international hackers' conference. But as a short, punchy antidote to techno-utopian thinking--a way to puncture the silicon valley bubble--I think it's invaluable.

May 11, 2010

Filed under: culture»internet


So, you're thinking about deleting your Facebook account. Good for you and your crafty sense of civil libertarianism! But where will you find a replacement for its omnipresent life-streaming functionality? It's too bad that there isn't a turnkey self-publishing solution available to you.

I kid, of course, as a Cranky Old Internet Personality. But it's been obvious to me, for about a year now, that Facebook's been heading for the same mental niche as blogging. Of course, they're doing so by way of imitating Twitter, which is itself basically blogging for people who are frightened by large text boxes. The activity stream is just an RSS aggregator--one that only works for Facebook accounts. Both services are essentially taking the foundational elements of a blog--a CMS, a feed, a simple form of trackbacks and commenting--and turning them into something that Grandma can use. And all you have to do is let them harvest and monetize your data any way they can, in increasingly invasive ways.

Now, that aspect of Facebook has never particularly bothered me, since I've got an Internet shadow the size of Wyoming anyway, and (more importantly) because I've largely kept control of it on my own terms. There's not really anything on Facebook that isn't already public on Mile Zero or my portfolio site. Facebook's sneaky descent into opt-out publicity mode didn't exactly surprise me, either: what did you expect from a site that was both free to users and simultaneously an obvious, massive infrastructure expense? You'd have to be pretty oblivious to think they weren't going to exploit their users when the time came to find an actual business model--oblivious, or Chris Anderson. But I repeat myself.

That said, I can understand why people are upset about Facebook, since most probably don't think that carefully about the service's agenda, and were mainly joining to keep in touch with their friends. The entry price also probably helped to disarm them: "free" has a way of short-circuiting a person's critical thought process. Anderson was right about that, at least, even if he didn't follow the next logical step: the first people to take advantage of a psychological exploit are the scammers and con artists. And when the exploit involves something abstract (like privacy) instead of something concrete (like money), it becomes a lot easier for the scam to justify itself, both to its victims and its perpetrators.

Researcher danah boyd has written extensively about privacy and social networking, and she's observed something that maybe only became obvious when privacy was scaled up to Internet sizes: our concept of privacy is not so much about specific bits of data or territory as about our control over the situations involving them. In "Privacy and Publicity in the Context of Big Data" she writes:

It's about a collective understanding of a social situation's boundaries and knowing how to operate within them. In other words, it's about having control over a situation. It's about understanding the audience and knowing how far information will flow. It's about trusting the people, the situating, and the context. People seek privacy so that they can make themselves vulnerable in order to gain something: personal support, knowledge, friendship, etc.

People feel as though their privacy has been violated when their expectations are shattered. This classically happens when a person shares something that wasn't meant to be shared. This is what makes trust an essential part of privacy. People trust each other to maintain the collectively understood sense of privacy and they feel violated when their friends share things that weren't meant to be shared.

Understanding the context is not just about understanding the audience. It's also about understanding the environment. Just as people trust each other, they also trust the physical setting. And they blame the architecture when they feel as though they were duped. Consider the phrase "these walls have ears" which dates back to at least Chaucer. The phrase highlights how people blame the architecture when it obscures their ability to properly interpret a context.

Consider this in light of grumblings about Facebook's approach to privacy. The core privacy challenge is that people believe that they understand the context in which they are operating; they get upset when they feel as though the context has been destabilized. They get upset and blame the technology.

This is why it's mistaken to claim that "our conception of privacy has changed" in the Internet age. Private information has always been shared out with relative indiscretion: how else would people hold their Jell-o parties or whatever else they did back in the olden days of our collective nostalgia? Those addresses and invitations weren't going to spread themselves. The difference is that those people had a reasonable expectation of the context in which their personal information would be shared: that it would be confined to their friends, that it would be used for a specific purpose, and that what was said there would confine itself--mostly--to the social circle being invited.

Facebook's problem isn't just that the scale of a "slip of the tongue" has been magnified exponentially. It's also that they keep shifting the context. One day, a user might assume that the joke group they joined ("1 Million Readers Against Footnotes") will only be shared with their friends, and the next day it's been published by default to everyone's newsfeed. If you now imagine that the personal tidbit in question was something politically- or personally-sensitive, such as a discussion board for dissidents or marginalized groups, it's easy to see how discomforting that would be. People like me who started with the implicit assumption that Facebook wasn't secure (and the privilege to find alternatives) are fine, but those who looked to it as a safe space or a support network feel betrayed. And rightfully so.

So now that programmers are looking at replacing Facebook with a decentralized solution, like the Diaspora project, I think there's a real chance that they're missing the point. These projects tend to focus on the channels and the hosting: Diaspora, for example, wants to build Seeds and encrypt communication between them using PGP, as if we were all spies in a National Treasure movie or something. Not to mention that it's pretty funny when the "decentralized" alternative to Facebook ends up putting everyone on the same server-based CMS. Meanwhile, the most important part of social networks is not their foolproof security or their clean design--if it were, nobody would have ever used MySpace or Twitter. No, the key is their ability to construct context via user relationships.

Here's my not-so-radical idea: instead of trying to reinvent the Facebook wheel from scratch, why not create this as a social filter plugin (or even better, a standard service on sites like Posterous and Tumblr) for all the major publishing platforms? Base it off RSS with some form of secure authentication (OpenID would seem a natural fit), coupled with some dead-simple aggregation services and an easy migration path (OPML), and let a thousand interoperable flowers bloom. Facebook's been stealing inspiration from blogging for long enough now. Instead of creating a complicated open-source clone, let's improve the platforms we've already got--the ones that really give power back to individuals.
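To make the "social filter" idea a little more concrete, here's a minimal sketch of what such a plugin might do at its core: given a requester whose identity has already been verified upstream (say, by an OpenID provider), return only the feed entries that requester's relationship to the author entitles them to see. All of the names, groups, and entries below are hypothetical illustrations, not any real platform's API.

```python
# Hypothetical "social filter" for a blog feed: the publishing platform
# stays a plain blog, and a thin layer decides which entries each
# authenticated reader may see. Identity verification (e.g. via OpenID)
# is assumed to have happened before this code runs.

# Which groups each known reader belongs to (illustrative data).
FRIEND_GROUPS = {
    "alice": {"friends", "book-club"},
    "bob": {"friends"},
}

# Feed entries, each tagged with an intended audience.
ENTRIES = [
    {"title": "Public post", "audience": "public"},
    {"title": "Friends-only post", "audience": "friends"},
    {"title": "Book club notes", "audience": "book-club"},
]

def filtered_feed(requester):
    """Return the entries `requester` is allowed to see.

    Public entries are visible to everyone, including strangers;
    everything else requires membership in the matching group.
    """
    groups = FRIEND_GROUPS.get(requester, set())
    return [entry for entry in ENTRIES
            if entry["audience"] == "public" or entry["audience"] in groups]
```

The point of the sketch is that the hard part isn't hosting or encryption; it's the relationship data that maps readers to audiences. Everything else (RSS for the entries, OPML for migration) is plumbing that already exists.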

February 22, 2010

Filed under: culture»america»race_and_class


I don't remember where I was, the first time I heard the name Malcolm X. I remember that I was maybe 8 years old, growing up in Lexington, Kentucky. It was a mostly African-American neighborhood, so it could have been anywhere, really. I think I remember being confused by the 'X'--how could that be a last name? How did he sign forms or documents? And as someone who fumed at the end of every class roll call and official ceremony, I wondered: why didn't he pick a letter closer to the start of the alphabet?

At that age, of course, history is a pretty boring topic, but I don't remember learning about Malcolm X in class. I don't think I ever really discussed him with my parents, either. He was a cipher, a vaguely sinister one for some reason (maybe the name, maybe not). It wasn't until college, when I took a class on social movements and persuasion, that I learned more about the man: his militance within the Nation of Islam, his pilgrimage to Mecca, and the change in his thinking as a result. It was a revelation, a whole part of the civil rights story that I'd never learned about--and I was simultaneously ashamed that I'd never bothered to find out about it on my own.

A couple of years ago, I finally got around to reading his autobiography, and was struck all over again. It's a fascinating story: told to Alex Haley during a time when Malcolm X was himself undergoing a serious self-examination, it's a chronicle of transformation on both explicit and implicit levels. He was an extraordinarily complicated person, undoubtedly flawed but capable of tremendous insight and intelligence. The book makes clear that his assassination was truly one of the great tragedies of the civil rights movement.

Yesterday was the anniversary of the assassination of Malcolm X, and of course February is Black History Month, so I've found myself thinking about this a lot lately. The thing about Black History Month is that it's a misnomer: for US citizens, Black history is our history. The fallout from slavery, segregation, and the struggle for civil rights still echoes through our society in ways that we stumble to articulate. Nobody, to my mind, represents that complex truth more than Malcolm X.

July 28, 2009

Filed under: culture»pop»trendspotting


Last in a series.

Part 5: Is this the right room for an argument? No, this is abuse.

I will admit this: it took some nerve for Chris Anderson to write the last few chapters of Free. Oh, parts of it are harmless enough: a comparison to old theories of competitive economics, for example, or advice to embrace digital waste. One is obvious padding, and the other is a random plate of leftovers from when this book was a magazine article. But they're nothing compared to the chronicle of chutzpah that is most of the closing section.

It's hard for me to imagine, for example, writing a section on post-scarcity economies in science fiction--much less thanking an intern in the acknowledgements for actually reading the books and regurgitating summaries/analysis to me. I would have a hard time assembling, with a straight face, a chapter on reputation and gift economies that includes a study putting reputation at the very bottom of the motives for Wikipedia contributions. And it almost seems like a practical joke to write "China and Brazil Are The Frontiers of Free. What Can We Learn from Them?"--a chapter which states (without citation, evidence, or any appreciation for irony) that Chinese students are basically Confucian-trained plagiarists, that piracy is the cause of China's growing number of millionaires, and that generic drug prices in Brazil are an endorsement of "free." Seriously?

Of his closing chapters, the only one that's probably worth engaging at length is #14, in which Anderson attempts to answer the criticisms that have been levelled against his argument since its debut. To be fair, these are real arguments, many of which I've raised here. Whether you find his responses convincing will probably depend on your reaction to previous chapters. If you found him less than rigorous before (and I certainly did, what with the Wikipedia, plagiarism, technical misconceptions, and sloppy definitions), his counter-arguments won't change your mind. What frankly surprised me, looking back at the chapter, was how few challenges he actually answered in his replies. In the interests of space and sanity, I'm summarizing them here in the form of the classic Shorter format:

  • "There ain't no such thing as a free lunch." - The low cost of individual transactions allows me to ignore their nontrivial aggregate costs.
  • "Free always has hidden costs/free is a trick." - The fact that I have just spent 13 chapters detailing the methods through which the costs of free products are defrayed should not be seen as trickery, but as magic.
  • "The Internet isn't really free because you're paying for access." - Cost of entry is only important if it matters to me, a wealthy, white, American male.
  • "Free is just about advertising (and there's a limit to that)." - Let's pretend I didn't just spend an entire book talking about web advertising. What were my other two models again?
  • "Free means more ads, and that means less privacy." - If we just act like our privacy is worthless, maybe it'll come true!
  • "No cost = no value." - Instead of being upset that piracy and competition from unsustainable business models is destroying your income, be glad that someone linked to you! (Alternate bonus shorter: It's your fault that people are stealing your intellectual property.)
  • "Free undermines innovation." - I am not going to address the question of drug research, and you can't make me.
  • "Depleted oceans, filthy public toilets, and global warming are the real cost of free." - Despite evidence to the contrary, not to mention my repeated efforts to extend my theory into the real world, externalities do not exist for digital goods.
  • "Free encourages piracy." - This straw man may look sturdy, but watch me knock it down! (Bonus Hyperbole Watch: Anderson once again compares "free" to gravity, as if it were an immutable physical law.)
  • "You can't compete with free." - Despite the fact that at least half of my examples are demonstrated money pits, you should emulate them.
  • "I gave away my stuff and didn't make much money!" - Maybe you're just stupid. Did you think of that?
  • "Free drives out professionals in favor of amateurs, at a cost to quality." - If society has to suffer for my ideology, so be it. (Note: I'm somewhat reluctant to joke about this one, since it's a response to noted anti-Internet troll Andrew Keen, and is thus probably another straw man. I do like his insinuation that journalists will simply be paid less--many journalists may choose now to scoff derisively and wonder if they've invented numbers that low.)

But as the saying goes, Anderson saves the best for last. In his "coda," he writes (location 3486):
...when the markets recovered and we looked back, we found to our surprise that it was practically impossible to see the effect of the crash on the growth of the Internet. It had continued to spread, just as before, with hardly a dip as the public markets cratered.
Wait a second: is he seriously claiming that the dot-com bubble was going to crash the Internet itself? As usual, it's unclear--he continues straight into a paragraph about the netbook market, where low-cost (but not free) computers are loaded primarily with the low-cost (but not free) Windows XP, and will soon be preloaded with the not-so-low-cost Windows 7. Our current financial crisis, he argues, will push people even more into the economics of free content, even though it will make it more difficult for businesses to embrace it as a model. And he closes his book, literally in the last three paragraphs, by running through a list of web startups (including Twitter, YouTube, Digg, and Facebook) that have failed to generate a profitable revenue stream. He concludes (location 3899): Free is not enough. It also has to be matched with Paid. Just as King Gillette's free razors only made business sense paired with expensive blades, so will today's Web entrepreneurs have to invent not just products that people love but also those that they will pay for. free [sic] may be the best price, but it can't be the only one.
Translation: it is a tale told by an idiot, full of sound and fury, signifying nothing.

In Conclusion

To quote Weezer: Why bother? Everyone knows these "airport books," as Anil Dash calls them, are terrible--why take the time to engage with one in such detail? Why spend a whole week on it? Why not just ignore Anderson and his rampaging ego?

To answer, I can only say that sadly, many people do not bring nearly enough skepticism to the table, especially when faced with the mighty publicity machine behind Anderson's work. His first "revelation," the so-called long tail, has run afoul of evidence time and time again--and this has not stopped it from being advocated as sound business strategy. I have had it quoted at me, and I expect the same to happen with Free. So at the very least, it's therapeutic to work through the book and marshal my thoughts on it.

But I have to confess to a less rational motive. For some time now, I've been bothered by the way that "Web 2.0" movements encourage us to commercialize ourselves, or at least to quantify that value: what's your traffic? Your PageRank? Your follower count? How much could you make with AdSense on your blog? How can you turn your readers into money? How much are you worth as a reader? We are all in the business now, it seems, of selling ourselves to the world--based on a set of values which I find, if not suspect, then at least highly artificial.

When we talk about a non-professional attention economy, or a "reputational" economy, what we're doing--partly, at least--is putting a price tag on ourselves, and on each other. It disturbs me to think about community this way. Call me a crazy hippie, but the people that I've met while writing here, to me, are not commodities to be traded. And I like to think that (while I attempt to minimize its harm to my career) I don't write this blog for a commercial benefit, monetary or otherwise. If nobody read it (a state of affairs blessedly close to the truth), I'd still write here, just for the pleasure of doing so.

Anderson is not to blame for the marketing of the reputational economy, but he's one of its strongest proponents. His Free is, in many ways, an attempt to lay out a blueprint for the monetization of your attention and spare time. Speaking personally, I don't appreciate the effort. He's offering Free. I say, keep the change.
