this space intentionally left blank

July 27, 2011

Filed under: culture»pop»comics

Original Recipe

It's a big year for superhero movies. I wouldn't say it's a good year, but it's certainly been very big, and for better or worse there's more on the way. And you know what that means: origin stories for everybody!

The origin story seemingly defines the comic book flick, for reasons I simply can't understand. The assumption seems to be that the most interesting thing about the title character is "how they got superpowers." This despite the fact that most superhero backstories are either silly or tedious, falling into two main categories: it's either Dude Invents Gadgets or Dude Is Given Power Through Unlikely Means (only men get origin stories in the movies, possibly because women superheroes are relegated to supporting roles in ensemble casts like the X-Men series). And then comes the training montage! Whee.

Here's the mind-boggling part: the second movie in every superhero franchise is almost always the best one, precisely because it doesn't have the baggage of the origin story dragging it down. The sequel can ask the interesting questions raised by the premise (both general--what should anyone do with this kind of power?--and specific--what should this particular hero do with it?). Meanwhile, viewers who skip the first movie in a franchise aren't going to miss anything important that can't be recapped in a few lines of dialog anyway.

Spider-Man 2 had a chance to engage with Peter Parker's double life because it didn't have to waste time on spider bites and pro wrestling. The Dark Knight could explore the implications of vigilantism because it skipped an hour and a half of inserting bat-tab A into bat-slot B. The second X-Men movie is generally considered the best of the three--maybe because it could jump straight to a team dynamic instead of being Wolverine Tours The X-Mansion.

The exception that proves the rule, of course, is the Iron Man franchise, mainly because watching Robert Downey Jr. goof around for a couple of hours (the first film) is infinitely more fun than watching computer-animated Iron Man suits beat each other up (the second).

But the origin story is so entrenched at this point that it's become part of the money-making strategy: the Marvel movies are all origin stories for themselves, but they're also extended prequels to the inevitable Avengers team movie (which, let's be honest here, is going to be terrible). Spider-Man is being rebooted, not because there was anything wrong with Raimi's version, but because it has to be integrated into the marketing plan (Peter Parker's parents are now SHIELD agents, a twist which adds absolutely nothing to the character). We'll get to watch the same movie for 80% of its running time, but with the kid from The Social Network in the starring role, and in overpriced 3D.

No one will ever let me make a comic book movie, since I'd probably turn the whole thing into a sprawling experiment in genre deconstruction, ending with the heroes buried under criminal charges. But if I somehow found myself at the helm of, say, the Authority movie, I'd start in medias res and take the first left turn I could find, because origin stories are boring and they're lazy. They presume that the audience A) needs the premise and character relationships explained slowly to them, and B) cares more about comic continuity than any sane person actually could. Why show, these movies ask, when you can tell in excruciating detail?

Yet there's a reason that the best parts of X-Men: First Class are the scenes where Magneto systematically chases down the Nazis who killed his family, so much so that everyone leaves the theater wishing that the movie had actually been two hours of Magneto: Suave Nazi Hunter. Nobody cares where a superhero comes from. We care what the character does with that power. That's the misunderstood genius of Spider-Man's mission statement: the drama isn't in the "great power," it's in the "great responsibility." If only superhero flicks tried to live up to that every once in a while. They'd still mostly be horrible, probably, but they'd fail in more interesting ways.

July 13, 2011

Filed under: culture»internet

N'est pas un blog

There's one quirk in the CQ.com publishing process that has always driven me crazy (what, just one?). When we add stories to the main news section of the site, our CMS requires a separate entry for the teaser, headline, and any related links or images. Inside the building, they call these entries--each individual one, mind you--"blogs." Every time I hear it ("I'm going to write a blog for the new debt story." "Can you update the blog to add that link?" "Let's blog a new photo in the top blog BLOG BLOG BLOG.") I want to grind my teeth into little featureless nubs. Which I will call "blogs," because why not? It's not like words mean something!

Breathe. Calm. Find my happy place: Puppies. Sandwiches. Empty spreadsheets. Anyway...

After Google+ launched, something similar happened. People looked at the service, which combines Twitter's asymmetric-follow model with Facebook's rich content stream, and apparently thought to themselves "hey, this thing could be my new blog." Either they redirect their entire domain to their G+ profile (most notably Digg founder Kevin Rose) or, more commonly, they use G+ to write the kind of long-form posts that have traditionally been the province of blogs, whether home-grown or hosted on a platform like WordPress or Blogger (similar long-form writing has been attempted on Facebook, but it never really took off). I'm not a fan of this, obviously. It seems to rest on a real misconception of what blogging is, how it has developed, and where its strengths lie.

By 2011, we ought to understand blog culture: the practice is at least a decade old now, and I realized the other day that I've been doing it here for more than seven years, even though I was relatively late to the party. So while I typically hate people who draw large categorical distinctions between, say, "bloggers" and "journalists" almost as much as I hate calling our ledes "blogs," it's not wrong to say that there is a different flavor to the way I publish here, compared to either standalone pieces or social network status updates. I think a lot of it comes down to the surrounding context.

A blog post is not an independent document in the way that (for example) newspaper stories on the same page would be. It's part of a larger dialog with the writer's surroundings, be those people or events. Most of the innovations in blogging--permalinks, comments, blogrolls, trackbacks, and organization-by-tagging, to name a few--revolve around exploring that dialog, implicitly or explicitly. When I write a post, it's informed by many of the posts that came before, by the audience that I expect to read it, and by the direction I'm trying to take the blog as a cohesive work in progress.

Social networks have some of these aspects: they create dialog, obviously, and they allow sharing and permalinks. But social networks like Facebook and Google+ are not persistent or cohesive the way that a blog is. When you add a status update or whatever they're calling it these days, it's an ephemeral part of your lifestream (to use a now-unfashionable term), alongside all kinds of other activity from across your connections. Unlike a blog, those status updates are not a purposeful body of work. They're a rolling description of you, accreted from the detritus of what you do and what you like. Which is a useful thing to have, but a distinctly different experience from writing a blog.

My blog exists separately from me, while my social media profile is a view of me. That doesn't mean that they're not both valuable ways of interacting with the world. Social networks are great for retaining a kind of "situational awareness" of what my friends are doing, and for maintaining a basic connection with them. It's like small talk: it doesn't replace real interaction, but it keeps us from becoming strangers between visits. Blogging, on the other hand, is where I feel like I can dig in and engage mentally. I don't have to worry about being rude by taking over someone's stream, or getting hidden behind a filter. It's a space that's all mine to use, controlled by me, and expressly used for my own purposes. A blog is a place to be a little selfish.

From a technical perspective, a blog is the more curated experience. When someone writes a blog entry, it gets published in a standard, exportable format via RSS. It lives in a database (or in my case, a filesystem) that can be edited and moved. Writing on a blog starts from a position of ownership: it's property that you control. Writing on a social network, although possible to extract through various APIs or Google's Data Liberation Front, is not under your control in the same way. That may not matter to you now, but one day, if you decide that you want to preserve those words--if you think your writing could become a book, or you want to give your favorite entries to a loved one, or if you just want to keep them for your own satisfaction--a blog is probably a better option.

There are places of overlap, I think. By relaxing the character constraints, Google+ makes it possible to at least present more complex thoughts than Twitter allows, and it's a better writing experience than Facebook is. But when people say that they're planning on using Google+ as a blog, I can't help but think that what they really mean is "I didn't really want to blog anyway." I'm glad they've found a solution that works for them: not everyone is cut out to be a blogger. Some days I don't feel like it myself. But when I look back on writing here, on how I feel like I could develop a voice and indulge my obsessions, I wouldn't give this up for all the fancy circles in the world.

March 30, 2011

Filed under: culture»internet

Tone Matters

Last week Gina Trapani wrote an insightful post on Smarterware about designers, women, and hostility in open source, including how she has applied those lessons to ThinkUp, her social-networking scraper application, by welcoming non-coders and contributors from a diverse range of backgrounds. Trapani has been working on those kinds of problems for a while now: as the founding editor of the Lifehacker blog, she created a space for tech-minded people that was a breath of fresh air. From the post:

At Lifehacker, my original vision was to create a new kind of tech blog, one that wasn't yet another "boys worshipping tech toys" site, one that was helpful, friendly, and welcoming versus snarky, sensational, and cutting. (That was no small task in the Gawker-verse, and I learned much in the process.) ...

I learned something important about creating a productive online community: leaders set the tone by example. It's simple, really. When someone you don't know shows up on the mailing list or in IRC, you break out the welcome wagon, let them know you're happy they're here, show them around the place, help them with their question or problem, and let them know how they can give back to the community. Once you and your community leaders do that a few times, something magical happens: the newbie who you welcomed just a few weeks ago starts welcoming new folks, and the virtuous cycle continues.

One of the things that has struck me, as I've paid more attention to the tech news ecosystem online, is how rare that attitude really is. Trapani alludes to the general tone of the Gawker properties, which start at "snark" and work down, but (thanks to imitation and diffusion of former Gawker writers) most of the other big tech blogs sound pretty much the same, which is one of the major reasons that I dropped them from my usual reading habits and blocked them in my browser.

The typical industry blogger persona is aggressive, awestruck (both as irony and as an expression of genuine, uncritical neophilia), and uncompromising toward other viewpoints. When they break their pose of cynicism, they usually rave in extreme superlatives, as though it simply wouldn't do to be quietly amused by something. The tone is, in other words, exactly what you'd expect from a group of young men who arrested their development at a precocious seventeen years old--and I say this as someone who is not entirely innocent of similar sins, as a trip through the archives here would show.

For as long as I can remember, Lifehacker has done something different. Along with Make and Hackaday, its writers have been less interested in tossing off snarky burns at the expense of the day's news and more focused on appreciating the efforts of their communities. Even though Trapani has moved on, it remains an easy-going, welcoming read. I think they deserve kudos for that. It's certainly a style I'm trying to imitate more--to highlight the positive as much as I call out the negative.

When it comes down to it, I think we each have to ask ourselves whose future we'd prefer to realize. If I were forced to live in a world represented by TechCrunch, or one built by Lady Ada, I know which one would feel more welcoming. The latter values knowledge, in my opinion, while the former values commercial product--like the difference between learning to cook and reviewing frozen dinners, the result is probably less polished, but ultimately more nutritious.

Tone has consequences beyond self-realization: Lifehacker and Make, in particular, have really encouraged me to take a closer look at open-source hardware and software in a positive way, just because their bloggers are relentlessly cheerful, low-key advocates for those communities. As Trapani noted, that kind of appeal can be a virtuous cycle. A diverse, positive tech community should be more likely to apply its energy to projects that reflect its membership, instead of an endless supply of hipster social networks and expensive new hardware. I'm glad there are writers like her out there, trying to make that a reality.

February 15, 2011

Filed under: culture»corporate

Deal

About five years ago, I designed my own business cards. On the front, they had my contact info and a stamp of my name in Mandarin Chinese that I'd gotten in Xi'an. On the back, there was a QR code containing a vCard of everything on the front, which was supposed to show people that I was way ahead of my time (this was, in fact, so far ahead of its time that nobody was ever able to scan one of the stupid things until last year).

Anyway, of the 500 or so cards I had printed, I probably still have 450 of them sitting on a shelf at home. Partly this is because I wasn't nearly as big on actual networking as I was on having a cool business card, but it's also a function of where the world is going: nobody keeps a Rolodex full of cards anymore, and our address "books" live behind a touchscreen or on an Exchange server. And while I may have been a bit hasty in adopting them, these days QR codes and digital tags are everywhere. Machine-readable data has invaded the everyday world.

So this year, I'm taking an admittedly small risk and calling it: now is the time to ditch physical business cards. It'll save paper and money, reduce clutter and littering at conferences, and best of all, it'll genuinely put you on the cutting edge of digital networking.

Now you may be saying to yourself: sure, Thomas can do this--he's a bona fide misanthrope--but how can regular people get away with it? Good question. Here are a few easy strategies for going cardless:

  • Install a QR code generator on your smartphone: If you're an Android user, you probably have one already--it comes with Google's stellar Barcode Reader application. Pick a contact, choose "Share" from the menu, and presto: a code pops up that someone can scan right off the screen. I'm sure other platforms have something similar. You'll need to create an address book entry for yourself--something old-school Palm users will remember from the days of IR contact beaming.
  • Upload a vCard: I tried this the other day, and it's surprisingly nifty. Give someone the URL to the card, and it'll download to their phone or computer and prompt to be added to their address book. Once you've got it working, you could even use a link shortener like bit.ly to make a custom (trackable) URL, and just update the .vcf file when your details change. You'll need a server you can control, because it needs to send the content-type as "text/x-vcard" and the content-disposition as "attachment" for Android and BlackBerry phones to understand it correctly (an .htaccess file even lets you set the card as the 'index' for a directory); there's a rough sketch of serving a card with those headers after this list. iPhones are, perhaps unsurprisingly, slightly less cooperative.
  • Once again, own your name: I can't repeat this enough. Of course, it helps if your name is relatively uncommon, and even then you never know when an ad agency will try to steal your thunder. But owning a searchable, easy-to-remember domain is a great way to present yourself, not to mention a fine place to host a copy of your QR code and your vCard file.
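
For the curious, here's roughly what that second bullet boils down to. This is a minimal sketch rather than the exact setup described above--the post assumes an Apache server and an .htaccess file, while this uses Python's standard library to send the same headers--and the card contents, filename, and port are made up for illustration:

```python
# Minimal sketch: serve a vCard with the Content-Type and
# Content-Disposition headers described above, so Android and
# BlackBerry phones treat it as a downloadable contact.
from http.server import BaseHTTPRequestHandler, HTTPServer

# A hypothetical card -- swap in your own details.
VCARD = "\r\n".join([
    "BEGIN:VCARD",
    "VERSION:3.0",
    "FN:Jane Example",
    "EMAIL:jane@example.com",
    "URL:http://example.com",
    "END:VCARD",
]).encode("utf-8")

class VCardHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Serve the card for any path, standing in for a DirectoryIndex.
        self.send_response(200)
        self.send_header("Content-Type", "text/x-vcard")
        self.send_header("Content-Disposition", 'attachment; filename="card.vcf"')
        self.send_header("Content-Length", str(len(VCARD)))
        self.end_headers()
        self.wfile.write(VCARD)

if __name__ == "__main__":
    HTTPServer(("", 8000), VCardHandler).serve_forever()
```

Point your shortened URL at wherever the card lives--a static file with the right headers or a tiny script like this one--and updating your contact info is just a matter of editing the card.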

In a few years, this'll all probably seem like old hat. That's why it's important to jump on the cardless trend now, so we can look down our noses at the Luddites handing out paper slips (and manually copying them into their computers) while we still can.

I kid, of course. Seriously, though: set aside the snobbery, the savings in money and paper, the confetti of unwanted cards after a professional meetup, and the chance to demonstrate your new media credentials. Won't it feel good just to not have to carry around a stack of disposable business cards wherever you go? I feel lighter already.

January 25, 2011

Filed under: culture»internet

Blink and You'll Miss It

I tell everyone they should have Firebug or its equivalent installed, and know how to use it. It's invaluable for testing changes while designing a page, for doing a little in-page scripting, and for examining the source for ideas or hidden items. But most importantly, they can use it to fix your stupid, unreadable, over-styled web page.

The development of HTML5 means that browsers have gotten more powerful, more elaborate, and more interactive. It also means that they can be annoying in new and subtle ways. Back in the day, page authors used <blink> and <marquee> to create eye-catching elements on their flat gray canvas. Nowadays, thanks to pre-made CMS templates, the web looks superficially better, but it's not necessarily easier to read. Take a few examples:

  • Text shadow: There's nothing wrong with a little text shadow. It's a classy effect for giving headlines a little pop. But I've noticed a lot of people lately using text-shadow everywhere, which makes my eyes cross trying to focus on mysteriously blurry text. Why is my browser anti-aliasing broken? Oh, it's not, you just hate your audience.

    Even worse are the people who have realized you can give the shadow an offset of zero pixels. If the shadow is dark, this ends up looking like the page got wet and all the ink has run. If it's a lighter shadow, you've got a poor man's text glow. Remember how classy text glow was when you used it on everything in Photoshop? Nobody else does either.

  • Custom fonts: I sympathize with font nerds on the Internet. It must have been painful, being restricted to the built-in browser fonts. On the other hand, those were the defaults for a reason. Now every wannabe font nerd on the Internet can finally use their own Very Special Typeface, many of which are more "fancy" than "legible." But why stop there? Why not customize the letter-spacing and line-height while you're at it?

    I'm not an expert in typesetting or anything, but the effect of these changes--besides sometimes giving Comic Sans a run for its ugly font money--is to throw me out of my browsing groove, and force me to re-acquire a grip on the text with every link to a custom page. If I'm not expecting it, and the font is almost the same as a system font, it looks like a display error. Either way, it's jarring, and it breaks the feeling that the Internet is a common space. Eventually, we'll all get used to it, but for now I hate your custom fonts.

  • "Pixel-perfect" layouts: Once upon a time, we had no real positioning tools in HTML. Oh, there was <table>, but building a layout using tables is like trying to design a travel brochure in Excel. And then someone invented CSS, and between "float" and "position: absolute" they ushered in a whole new era of web design. A new, fragile era, one that was vulnerable to a whole host of CSS sins. If I had a dime for every page I've visited where content overflowed the elaborate layout and got hidden behind an ad or a sidebar, I'd be making money in a very strange way. Setting "display: none" on offending elements has become an unfortunately common part of my browsing experience. It's enough to make me wish for the days when every page was a simple, boring, single black-on-grey column of text.
  • Rebinding key events: Seriously, IGN? Taking away the spacebar and the page-up/down keys for no apparent reason so I can't scroll the page on the rare occasions that I'm linked to your increasingly-awful content farm? You're why we can't have nice things.

It's no wonder, in an environment like this, that style-stripping bookmarklets like Readability caused such a sensation. There's a fine line between interactive design and overdesign, and designers are crossing it as fast as they can. All I ask, people, is that you think before getting clever with your CSS and your scripts. Ask yourself: "If someone else simulated this effect using, say, a static image, would I still think it looked good? Or would I ask them what GeoCities neighborhood they're from?" Take a deep breath. And then put down the stylesheet, and let us read in peace.

January 11, 2011

Filed under: culture»internet

Good Grief

Tim Ferriss was a real-world griefer before real-world griefing was cool. Before Anonymous was putting epileptic kids into seizures, DDoSing the Church of Scientology, and harassing teenage girls for no good reason whatsoever, Ferriss (through sheer force of narcissism) had already begun gaming whatever system he could get his hands on. And now he writes books about it. The question you should be asking yourself, as you read this tongue-in-cheek New York Times review of Ferriss's "four-hour workout" book, is this: did he write it to actually teach people his idiosyncratic health plan? Or (more likely) is it just the newest way Ferriss has decided to grief the world, via the NYT bestseller list?

Griefing, of course, is the process of exploiting the rules of an online community to make its members miserable. Griefers are the people who join your team in online games, and then do everything possible to sabotage your efforts. It's a malevolent version of the "munchkin" from old-school RPGs, where a player tries to find loopholes in the rules, except that griefers aren't playing to win--they're playing to get a reaction, which is much easier. The key is in the balance--a griefer or munchkin is looking to maximize impact while minimizing effort. That's basically what Ferriss is doing: he power-games various external achievements, like kickboxing or tango, not for their own sake, but to boost his own self-promotional profile.

The problem with writing about reputation griefers like this guy is, for them, there really is no such thing as bad publicity. They want you to hate them, as long as it boosts their search ranking. And there are an awful lot of people out there following similar career plans--maybe not as aggressively, almost certainly not as successfully, but they're certainly trying. They may not realize that they're griefing, but they are. Affiliate marketers? Griefing. Social networking 'gurus' who primarily seem to be networking themselves? Griefing. SEO consultants? Totally griefing.

Like a zen student being hit with a stick, I achieved enlightenment once I looked at the situation this way: it's the Internet equivalent of being a celebrity for celebrity's sake. Or, perhaps more accurately, griefing provides a useful framework for understanding and responding to pointless celebrities elsewhere. Maybe this is one way that the Internet, for all its frustrations and backwardness and self-inflicted suffering, can make us better people.

The one thing I've learned, from years of "Something Is Wrong On The Internet," is that the key to dealing with griefers--whether it's a game of Counter-Strike, Tim Ferriss, or the vast array of pundits and shock jocks--is right there in the name. They benefit from getting under your skin--from being treated as serious business instead of something to be laughed off. As Belle and I often say to each other, you can always recognize people who are new to the dark side of the Internet's ever-flowing river of commentary by the gravity they assign to J. Random Poster. We laugh a little, because we remember when we felt that way (sometimes we still do), before we learned: it takes two people to get trolled. Don't let them give you grief.

May 26, 2010

Filed under: culture»internet

The Classics

Most book-lovers, I think, have a shelf devoted to their favorite books. It's always half-empty, because those are also the books they lend out when someone asks for a recommendation--oh, you haven't read something by X? Here you go. I love that shelf, even if I rarely lend books: it's where the private activity of reading becomes a shared experience, either through borrowing or simply through representation--these are the books that have deeply affected me. Maybe they'll affect you, too.

Likewise, there is writing on the Internet that is classic: essays, articles, and fiction that get linked and re-linked over time, in defiance of the conventional wisdom that online writing is transient or short-lived. The Classics are a personal call: what goes on your mental shelf of great online writing won't be the same as mine, and that's okay. This post is a collection of the items that I consider must-reads, accumulated over years of surfing. As I dig stuff out of my memory, I'll keep adding more.

  • Flathead, by Matt Taibbi: the definitive critique of Thomas Friedman. Everything you need to know about one of the world's worst pundits.
  • Creating The Innocent Killer, by John Kessel: an evisceration of Orson Scott Card's manipulative, deceptive Ender novels.
  • Host, by David Foster Wallace: a lengthy profile of radio host John Ziegler, not to mention a brilliant example of hypertextual footnotes.
  • The Zompist Phrasebook, edited by Mark Rosenfelder: the last language phrasebook you'll ever need.
  • The Rhetoric of the Hyperlink, by Venkatesh Rao: essential reading for people going from print to online.
  • The Grim Meathook Future, by Josh Ellis: there are problems with this essay if it's taken straight, as Ellis found out when he presented it to an international hackers' conference. But as a short, punchy antidote to techno-utopian thinking--a way to puncture the Silicon Valley bubble--I think it's invaluable.

May 11, 2010

Filed under: culture»internet

Face/Off

So, you're thinking about deleting your Facebook account. Good for you and your crafty sense of civil libertarianism! But where will you find a replacement for its omnipresent life-streaming functionality? It's too bad that there isn't a turnkey self-publishing solution available to you.

I kid, of course, as a Cranky Old Internet Personality. But it's been obvious to me, for about a year now, that Facebook's been heading for the same mental niche as blogging. Of course, they're doing so by way of imitating Twitter, which is itself basically blogging for people who are frightened by large text boxes. The activity stream is just an RSS aggregator--one that only works for Facebook accounts. Both services are essentially taking the foundational elements of a blog--a CMS, a feed, a simple form of trackbacks and commenting--and turning them into something that Grandma can use. And all you have to do is let them harvest and monetize your data any way they can, in increasingly invasive ways.

Now, that aspect of Facebook has never particularly bothered me, since I've got an Internet shadow the size of Wyoming anyway, and (more importantly) because I've largely kept control of it on my own terms. There's not really anything on Facebook that isn't already public on Mile Zero or my portfolio site. Facebook's sneaky descent into opt-out publicity mode didn't exactly surprise me, either: what did you expect from a site that was both free to users and simultaneously an obvious, massive infrastructure expense? You'd have to be pretty oblivious to think they weren't going to exploit their users when the time came to find an actual business model--oblivious, or Chris Anderson. But I repeat myself.

That said, I can understand why people are upset about Facebook, since most probably don't think that carefully about the service's agenda, and were mainly joining to keep in touch with their friends. The entry price also probably helped to disarm them: "free" has a way of short-circuiting a person's critical thought process. Anderson was right about that, at least, even if he didn't follow the next logical step: the first people to take advantage of a psychological exploit are the scammers and con artists. And when the exploit involves something abstract (like privacy) instead of something concrete (like money), it becomes a lot easier for the scam to justify itself, both to its victims and its perpetrators.

Researcher danah boyd has written extensively about privacy and social networking, and she's observed something that maybe only became obvious when privacy was scaled up to Internet sizes: our concept of privacy is not so much about specific bits of data or territory as about our control over the situations involving them. In "Privacy and Publicity in the Context of Big Data" she writes:

It's about a collective understanding of a social situation's boundaries and knowing how to operate within them. In other words, it's about having control over a situation. It's about understanding the audience and knowing how far information will flow. It's about trusting the people, the situation, and the context. People seek privacy so that they can make themselves vulnerable in order to gain something: personal support, knowledge, friendship, etc.

People feel as though their privacy has been violated when their expectations are shattered. This classically happens when a person shares something that wasn't meant to be shared. This is what makes trust an essential part of privacy. People trust each other to maintain the collectively understood sense of privacy and they feel violated when their friends share things that weren't meant to be shared.

Understanding the context is not just about understanding the audience. It's also about understanding the environment. Just as people trust each other, they also trust the physical setting. And they blame the architecture when they feel as though they were duped. Consider the phrase "these walls have ears" which dates back to at least Chaucer. The phrase highlights how people blame the architecture when it obscures their ability to properly interpret a context.

Consider this in light of grumblings about Facebook's approach to privacy. The core privacy challenge is that people believe that they understand the context in which they are operating; they get upset when they feel as though the context has been destabilized. They get upset and blame the technology.

This is why it's mistaken to claim that "our conception of privacy has changed" in the Internet age. Private information has always been shared out with relative indiscretion: how else would people hold their Jell-O parties or whatever else they did back in the olden days of our collective nostalgia? Those addresses and invitations weren't going to spread themselves. The difference is that those people had a reasonable expectation of the context in which their personal information would be shared: that it would be confined to their friends, that it would be used for a specific purpose, and that what was said there would confine itself--mostly--to the social circle being invited.

Facebook's problem isn't just that the scale of a "slip of the tongue" has been magnified exponentially. It's also that they keep shifting the context. One day, a user might assume that the joke group they joined ("1 Million Readers Against Footnotes") will only be shared with their friends, and the next day it's been published by default to everyone's newsfeed. If you now imagine that the personal tidbit in question was something politically or personally sensitive, such as a discussion board for dissidents or marginalized groups, it's easy to see how discomforting that would be. People like me, who started with the implicit assumption that Facebook wasn't secure (and had the privilege to find alternatives), are fine, but those who looked to it as a safe space or a support network feel betrayed. And rightfully so.

So now that programmers are looking at replacing Facebook with a decentralized solution, like the Diaspora project, I think there's a real chance that they're missing the point. These projects tend to focus on the channels and the hosting: Diaspora, for example, wants to build Seeds and encrypt communication between them using PGP, as if we were all spies in a National Treasure movie or something. Not to mention that it's pretty funny when the "decentralized" alternative to Facebook ends up putting everyone on the same server-based CMS. Meanwhile, the most important part of social networks is not their foolproof security or their clean design--if it were, nobody would have ever used MySpace or Twitter. No, the key is their ability to construct context via user relationships.

Here's my not-so-radical idea: instead of trying to reinvent the Facebook wheel from scratch, why not create this as a social filter plugin (or even better, a standard service on sites like Posterous and Tumblr) for all the major publishing platforms? Base it off RSS with some form of secure authentication (OpenID would seem a natural fit), coupled with some dead-simple aggregation services and an easy migration path (OPML), and let a thousand interoperable flowers bloom. Facebook's been stealing inspiration from blogging for long enough now. Instead of creating a complicated open-source clone, let's improve the platforms we've already got--the ones that really give power back to individuals.
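
To make that a little more concrete, here's a very rough sketch of the aggregation half of the idea: pull a few friends' feeds and merge them into one reverse-chronological stream, the way a Facebook or Google+ news feed does. It's a toy under loose assumptions--it uses the third-party feedparser package, the feed URLs are placeholders, and the parts that would make it a real social filter (OpenID authentication, OPML import, per-friend filtering) are left out entirely:

```python
# Toy aggregator for the "social filter" idea: merge several friends'
# RSS/Atom feeds into a single reverse-chronological stream.
# Authentication and OPML import are deliberately omitted.
import time

import feedparser  # third-party: pip install feedparser

# Placeholder feed URLs -- stand-ins for your friends' blogs.
FRIEND_FEEDS = [
    "http://example.com/friend-one/rss",
    "http://example.org/friend-two/atom.xml",
]

def merged_stream(urls):
    entries = []
    for url in urls:
        feed = feedparser.parse(url)
        source = feed.feed.get("title", url)
        for entry in feed.entries:
            # Not every feed supplies a published date; fall back as needed.
            stamp = entry.get("published_parsed") or entry.get("updated_parsed")
            if stamp:
                entries.append((stamp, source, entry))
    # Newest first, like any activity stream.
    entries.sort(key=lambda item: item[0], reverse=True)
    return entries

if __name__ == "__main__":
    for stamp, source, entry in merged_stream(FRIEND_FEEDS):
        date = time.strftime("%Y-%m-%d", stamp)
        print(date, source, "-", entry.get("title", "(untitled)"))
```

None of that is the hard part; the hard part, as argued above, is the context--deciding who counts as a "friend" and what they're allowed to see.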
