
May 11, 2010

Filed under: culture»internet

Face/Off

So, you're thinking about deleting your Facebook account. Good for you and your crafty sense of civil libertarianism! But where will you find a replacement for its omnipresent life-streaming functionality? It's too bad that there isn't a turnkey self-publishing solution available to you.

I kid, of course, as a Cranky Old Internet Personality. But it's been obvious to me, for about a year now, that Facebook's been heading for the same mental niche as blogging. Of course, they're doing so by way of imitating Twitter, which is itself basically blogging for people who are frightened by large text boxes. The activity stream is just an RSS aggregator--one that only works for Facebook accounts. Both services are essentially taking the foundational elements of a blog--a CMS, a feed, a simple form of trackbacks and commenting--and turning them into something that Grandma can use. And all you have to do is let them harvest and monetize your data any way they can, in increasingly invasive ways.

Now, that aspect of Facebook has never particularly bothered me, since I've got an Internet shadow the size of Wyoming anyway, and (more importantly) because I've largely kept control of it on my own terms. There's not really anything on Facebook that isn't already public on Mile Zero or my portfolio site. Facebook's sneaky descent into opt-out publicity mode didn't exactly surprise me, either: what did you expect from a site that was both free to users and an obvious, massive infrastructure expense? You'd have to be pretty oblivious to think they weren't going to exploit their users when the time came to find an actual business model--oblivious, or Chris Anderson. But I repeat myself.

That said, I can understand why people are upset about Facebook, since most probably don't think that carefully about the service's agenda, and were mainly joining to keep in touch with their friends. The entry price also probably helped to disarm them: "free" has a way of short-circuiting a person's critical thought process. Anderson was right about that, at least, even if he didn't follow the next logical step: the first people to take advantage of a psychological exploit are the scammers and con artists. And when the exploit involves something abstract (like privacy) instead of something concrete (like money), it becomes a lot easier for the scam to justify itself, both to its victims and its perpetrators.

Researcher danah boyd has written extensively about privacy and social networking, and she's observed something that maybe only became obvious once privacy was scaled up to Internet size: our concept of privacy is not so much about specific bits of data or territory as about our control over the situations involving them. In "Privacy and Publicity in the Context of Big Data" she writes:

It's about a collective understanding of a social situation's boundaries and knowing how to operate within them. In other words, it's about having control over a situation. It's about understanding the audience and knowing how far information will flow. It's about trusting the people, the situation, and the context. People seek privacy so that they can make themselves vulnerable in order to gain something: personal support, knowledge, friendship, etc.

People feel as though their privacy has been violated when their expectations are shattered. This classically happens when a person shares something that wasn't meant to be shared. This is what makes trust an essential part of privacy. People trust each other to maintain the collectively understood sense of privacy and they feel violated when their friends share things that weren't meant to be shared.

Understanding the context is not just about understanding the audience. It's also about understanding the environment. Just as people trust each other, they also trust the physical setting. And they blame the architecture when they feel as though they were duped. Consider the phrase "these walls have ears" which dates back to at least Chaucer. The phrase highlights how people blame the architecture when it obscures their ability to properly interpret a context.

Consider this in light of grumblings about Facebook's approach to privacy. The core privacy challenge is that people believe that they understand the context in which they are operating; they get upset when they feel as though the context has been destabilized. They get upset and blame the technology.

This is why it's mistaken to claim that "our conception of privacy has changed" in the Internet age. Private information has always been shared out with relative indiscretion: how else would people hold their Jell-o parties or whatever else they did back in the olden days of our collective nostalgia? Those addresses and invitations weren't going to spread themselves. The difference is that those people had a reasonable expectation of the context in which their personal information would be shared: that it would be confined to their friends, that it would be used for a specific purpose, and that what was said there would confine itself--mostly--to the social circle being invited.

Facebook's problem isn't just that the scale of a "slip of the tongue" has been magnified exponentially. It's also that they keep shifting the context. One day, a user might assume that the joke group they joined ("1 Million Readers Against Footnotes") will only be shared with their friends, and the next day it's been published by default to everyone's newsfeed. If you now imagine that the personal tidbit in question was something politically or personally sensitive, such as a discussion board for dissidents or marginalized groups, it's easy to see how discomforting that would be. People like me, who started with the implicit assumption that Facebook wasn't secure (and who had the privilege to find alternatives), are fine, but those who looked to it as a safe space or a support network feel betrayed. And rightfully so.

So now that programmers are looking at replacing Facebook with a decentralized solution, like the Diaspora project, I think there's a real chance that they're missing the point. These projects tend to focus on the channels and the hosting: Diaspora, for example, wants to build Seeds and encrypt communication between them using PGP, as if we were all spies in a National Treasure movie or something. Not to mention that it's pretty funny when the "decentralized" alternative to Facebook ends up putting everyone on the same server-based CMS. Meanwhile, the most important part of social networks is not their foolproof security or their clean design--if it were, nobody would have ever used MySpace or Twitter. No, the key is their ability to construct context via user relationships.

Here's my not-so-radical idea: instead of trying to reinvent the Facebook wheel from scratch, why not build this as a social filter plugin for all the major publishing platforms (or, even better, a standard service on hosted sites like Posterous and Tumblr)? Base it on RSS with some form of secure authentication (OpenID would seem a natural fit), couple it with some dead-simple aggregation services and an easy migration path (OPML), and let a thousand interoperable flowers bloom. Facebook's been stealing inspiration from blogging for long enough now. Instead of creating a complicated open-source clone, let's improve the platforms we've already got--the ones that really give power back to individuals.
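
To make that a little more concrete, here's a back-of-the-envelope sketch of the aggregation half of the idea: treat the friends list as an OPML file of feed subscriptions and fold every feed into a single reverse-chronological stream. This is only an illustration--the "friends.opml" filename is my own placeholder, and the hand-waving comment about private feeds is exactly where the OpenID layer would still have to be built.

    # Sketch only: merge an OPML subscription list into one "activity stream."
    # Uses the third-party feedparser library (pip install feedparser).
    import xml.etree.ElementTree as ET
    from time import mktime

    import feedparser


    def feed_urls(opml_path):
        """Pull every feed URL out of an OPML subscription file."""
        tree = ET.parse(opml_path)
        return [node.attrib["xmlUrl"]
                for node in tree.iter("outline")
                if "xmlUrl" in node.attrib]


    def activity_stream(opml_path, limit=50):
        """Fetch each feed and merge the entries, newest first."""
        entries = []
        for url in feed_urls(opml_path):
            feed = feedparser.parse(url)  # a private feed would need auth here
            for entry in feed.entries:
                stamp = entry.get("published_parsed") or entry.get("updated_parsed")
                if stamp:
                    entries.append((mktime(stamp), feed.feed.get("title", url), entry))
        entries.sort(key=lambda item: item[0], reverse=True)
        return entries[:limit]


    if __name__ == "__main__":
        # "friends.opml" stands in for whatever subscription list you export
        for _, source, entry in activity_stream("friends.opml"):
            print(source, "-", entry.get("title", "(untitled)"), entry.get("link", ""))

The plumbing is the easy part, because every blog platform already publishes a feed; the hard part--and the part worth solving--is deciding, per feed and per reader, who gets to see what.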

June 25, 2009

Filed under: culture»internet

Own Your Name

When Facebook recently announced that users would be getting their own human-readable usernames and corresponding URLs, Anil Dash linked back to his 2002 piece, Privacy through Identity Control:

...if you do a simple Google search on my name, what do you get? This site.

I own my name. I am the first, and definitive, source of information on me.

One of the biggest benefits of that reality is that I now have control. The information I choose to reveal on my site sets the biggest boundaries for my privacy on the web. Granted, I'll never have total control. But look at most people, especially novice Internet users, who are concerned with privacy. They're fighting a losing battle, trying to prevent their personal information from being available on the web at all. If you recognize that it's going to happen, your best bet is to choose how, when, and where it shows up.

It was good advice then, and it's good advice now. It's especially good advice for people in my field, new media and online journalism. Own your name: buy the domain, set up a simple splash page or a set of redirection links, or go all out and create a rarely-updated work portfolio. But leaving your Internet shadow up to chance is simply not an option for us anymore.

Here's an example: This week, I got an e-mail in my work inbox from someone who wants to work for us. Well, actually, he's interested in "pitching ideas for new online projects," and he has "a Logline Synopsis and a variety of treatments ready to send upon request." What he doesn't provide is links to any past work, or any hints as to what he wants to do. That's his first mistake: this isn't Hollywood, it's the Internet. We don't want your pitches, we want links and examples, and anyone who doesn't understand that probably isn't someone with whom we want to build online projects.

But it's possible, for very small values of possible, that someone who is aware of all Internet traditions would forget about the humble link, or would be wary of releasing their revolutionary ideas into the wild without keeping them under tight control. So I did what any prospective employer would have done: typed the applicant's name into Google.

The very first link--I kid you not, the first and only link for this guy's name--was a YouTube entry labeled "demo reel" by a username very similar to the applicant's e-mail address. Contained inside were five minutes of poorly-cut, VHS-quality video seemingly from a college TV station, focusing mainly on fratboy humor like asking groups of girls embarrassing sexual questions and being punched in the groin (not at the same time, unfortunately). As far as the Internet is concerned, that's Applicant X's identity. Think he'll get any response on his pitches for "new online projects?"

If you work in a fairly traditional job, or even a low-intensity information technology job, a minimal online presence--maybe even through something like a LinkedIn or Facebook URL--is probably fine. But if, like me, your job is to make digital content (of any variety) specifically for the Internet, you need to do more than that. You need to own your name.

June 10, 2009

Filed under: culture»internet»excession

It Ain't Broke

"You're a tinkerer," the IT guy says to me.

This is not entirely a compliment. I've just been describing how I had to hard-reset my phone yesterday, after a botched process involving root access, the application caches, and the Android marketplace. It was entirely my own fault, mind you, and completely predictable. Almost a week between purchase and the first reformat? For me, that is superhuman restraint.

The IT guy would probably appreciate this more if he didn't spend his workday cleaning up other people's computer messes, to the point where it's not terribly amusing any more. But he's not having to clean up mine, so instead he just tells me that I'm a tinkerer, in the same tone of voice that most people would say "oh, you're a chemical weapons engineer" or "oh, you have rabies." That's interesting, the tone says, maybe you could tell me more about it from a little further away.

I don't mind. I'm reminded of something Lance Mannion wrote about his Uncle Merlin and the "tinker unit" a couple of years back:

Changing a light bulb, caulking a window, nailing down a loose floorboard on the deck, hanging a picture---these are all acts of puttering.

Tinkering is the self-directed, small but skillful, not necessarily necessary work of actual home repair and improvement. There's an experimental quality to tinkering, as well. When you sit down---or kneel down, squat down, or lie down and crawl under something---to tinker, you don't always know exactly what you're going to do. You're going to try something to see if it does the trick.

Tinkering includes the possibility of using a screwdriver, a wrench, or a pair of pliers, possibly even a voltage meter, and preferably all four. To putter, you might need a screwdriver, but usually you can get the job done with a hammer or a paintbrush.

If you go out to the garage to spray some WD-40 on the tracks of your squeaky garage door, you're puttering. If you install a new automatic garage door opener, you're tinkering.

Changing the oil on your car is a putter. Installing new belts and hoses, especially if the car doesn't really need new belts and hoses yet, is a tinker.

Pouring a new garage floor or rebuilding the car's engine are serious jobs that the words tinker and putter don't begin to describe.

I just changed the filter on our furnace. That was a putter.

But the furnace has been a bit balky the last couple of days and even refused to kick on last night until I went downstairs to tinker with it. I checked the filter, saw that I'd need to change it in the morning---Note: The label on the filter says 30 Day Filter and it means what it says---but for the moment all I could do was pluck dust off it and shake dirt out of it. I put new duct tape around the joints on the outtake pipes. Tripped the circuit breaker a few times. Heard a small, sad click and then an ominous and disheartening silence from the furnace. Went upstairs to re-read the troubleshooting guide in the manual. Heard the burners ignite at last, closed the manual, and went to bed, congratulating myself on a job well done.

That was tinkering.

He's talking about home repair and I'm talking about a kind of generalized electronic interference, but they're the same thing. It's the "not necessarily necessary" part that links them. Tinkering is less about problems, more about projects and potential.

Affinity for tinkering is one way to sort the population, I think. Some people get it, some people don't. Belle is one of the ones who doesn't. She has learned to dread those times when a home purchase suggestion is met with the response "oh, we could just make one of those." She also watches with amusement when I find a new project--such as, a couple of weeks ago, when I decided to make a case for my old phone, since the one I'd been using was falling apart. I wanted one of those magnetic cases, but the ones for Blackberries are too short, and the ones that aren't too short are so wide that the phone would slide back and forth and drive me batty.

No problem, I said, and I dragged her to the fabric store, where I bought some jean rivets. Then I found one of the too-short cases online for a couple of bucks (plus shipping and handling, still a deal!), snipped the leather clasp in two, and used the rivets and a part of the old case to extend it just far enough to close around the Nokia. It was my first time riveting something. I really enjoyed it, and said so. Belle rolled her eyes at me.

To some extent, I can understand where she's coming from, since I've been there myself. My family also tends to be hands-on, which makes me suspect that it may be an inherited (or at least acquired) trait, and it's certainly a lot less fun to be involved in someone else's tinkering. Which is not to say that it holds no rewards: my dad recently sold one of his kayaks, and the buyer specifically requested the one with the nose art.

My goal lately has not been to eliminate tinkering, but to make sure it's channeled in productive directions. For example, one of my regular projects has been upgrading the video drivers on my laptop--I'm always seduced by the thought of a few more frames per second, or a slightly-smoother game of Team Fortress 2. Invariably, this turns out to be a mistake: while the early Lenovo drivers might have been a bit buggy, at this point they've pretty much caught up to the hacked releases, and all I get for my trouble is a long night of restoring backups and rebooting. Better just to leave it alone, or at least find less tedious things to disrupt.

The nice thing about digital tinkering, as opposed to the home infrastructure kind, is that there are ways nowadays to make sure that all you lose is time. That's part of the reason I love mobile platforms and virtual machines: in both cases, mess something up and all you've lost is less than an hour, most of which is just restoring from the default image. If only there were a way to say the same for our apartment, since then I wouldn't have a large packet of rivets, a Dremel tool, a box of half-disassembled guitar pedals, and several yards of unused vinyl lying around.

Or maybe I just need the right project for them. Any ideas?

March 23, 2009

Filed under: culture»internet

Boom

I've never particularly cared for Kevin Kelly, but the man's outdone himself this time. In a post quoted at Global Guerrillas, he writes that "we are all collapsitarians these days" because progress is boring, so we all secretly hope that civilization will break down.

Yeah. Wait, what?

There are two kinds of really stupid reasoning going on here. The first is that he opens the post with a chart of Google Trends for "collapse" and "depression," both of which have spiked since mid-2007. Friedman-like, Kelly reads a lot into that spike, a trend which could be more simply explained by the financial markets, you know, collapsing, and by the fact that there are only so many ways journalists can describe a market breakdown before they start to hit the more obscure parts of the thesaurus. It doesn't mean that the world's population suddenly became infatuated with dystopia. But then, you don't get a reputation as a tech visionary by using common sense.

Hence Kelly's second mistake, in which he decides that these brand-new "collapsitarians" come in six varieties, including luddites, anti-globalists, and conservationists. I say that these are brand-new, because Kelly writes that their existence is "surprising." Why it's surprising, I have no idea. None of the ideologies named began in mid-2007. None of them have been particularly altered by the financial crash, although I imagine the anti-globalization crowd is feeling pretty smug. Why is it surprising? Particularly to Kelly, a person who has been (according to Paulina Borsook's Cyberselfish) a pretty hard-core Christian, the existence of apocalyptic or end-times movements should be familiar, historically if not personally. Does the Great Awakening ring any bells?

Now, you may ask why we need to worry about Kelly, who to the outside observer just looks like another geek with odd-looking facial hair (seriously: his headshot seems to have been taken right before he went out to churn some butter, raise a barn, and perhaps sell some fine quilts to Pennsylvania tourists). But of course, as a former editor of Wired and a figure of some standing online (albeit much diminished), Kelly acts as a kind of weathervane for the flakier parts of Internet culture. While those with a healthier viewpoint have begun to think multi-generationally, Kelly represents the people for whom a future without shiny jetpacks and nanotech is unbearably boring. That outlook is a kind of extremism we can't afford.

In many ways, we've already moved beyond our previously-imagined futures. I remember reading William Gibson's Virtual Light in high school, which includes a passage about trucks running on cooking oil that smell like fried chicken, and thinking "Huh. That'd be pretty wild." This weekend I went back to my university for a forensics reunion and ate at a brand-new cafeteria, where all the cooking oil is recycled into bio-diesel. That may not be jetpacks, but how can you say it's not fascinating? What kind of person can look at the dilemmas we face, as well as the solutions we're creating, and not be excited--indeed, who would look forward to destruction instead of inspiration?

"Collapsitarianism" is, at its most basic, a kind of tantrum: you didn't get exactly what you wanted, so you'd rather tear it all down. I'm sorry that you picked the wrong future, guys. But the sign of an actual adult is that they recognize when circumstances have changed, and adapt to them. The process of solving ecological and social problems is going to be very exciting. There's going to be plenty of wizardry to go around without crying that the world looks more like Herbert than Heinlein.

Perhaps the root problem is that we continue to make a distinction between present and future, as if there were a solid break between the two. There's not, of course. The future is just an extension of where we are now. Ironically, this is part of the point of the Long Now Foundation, on whose board Kelly sits. But where the Long Now decries a culture in which "people in 1996 actually refer to the year 2000 as 'the future'", I think we should close the gap tighter. We need to get used to the idea of the future as connected and intertwined with modern times--we already live in the future, in other words. By placing ourselves on the arc of history, instead of imagining it vaguely in front of us, it's easier to spur ourselves to action. It certainly beats waiting for the collapse.

February 11, 2009

Filed under: culture»internet

Singularity U

Don't look now, but higher education just got higher:

Singularity University derives its name from Dr. Ray Kurzweil's book "The Singularity is Near." The term Singularity refers to a theoretical future point of unprecedented advancement caused by the accelerating development of various technologies including biotechnology, nanotechnology, artificial intelligence, robotics and genetics.

Singularity University makes no predictions about the exact effects of these technologies on humanity; rather, our mission is to facilitate the understanding, prediction and development of these technologies and how they can best be harnessed to address humanity's grand challenges. The University has uniquely chosen to focus on such a group of exponentially growing technologies, in an interdisciplinary course structure. These accelerating technologies, both alone and when applied together, have the potential to address many of the world's grand challenges.

The other diploma mills are kicking themselves for not thinking of this sooner. Being able to charge $25K to rehash Moravec and talk about how robots will eradicate world hunger? Sign me up!

In all seriousness, though, the real disappointment is that there's an actual niche to be filled, and Singularity University misses it by a mile. After all, we live in a time when technology has had incredible consequences for the way we live, and the future we create together: climate change, I suspect, is going to radically alter the tone of innovation going forward (if it hasn't already--see the recent emphasis on green datacenters and the carbon cost of Google searches). But SU can't even devote an entire course to this one area: it gets a minor part of the "Energy and Ecological Systems" section, about equal to the amount devoted to space travel and (tellingly) venture capitalism.

Indeed, the entire curriculum is comical. A path for futurism, in a university named for the event after which technological change becomes impossible to predict? And more importantly, an interdisciplinary program that breaks its studies down into technological disciplines like "Nanotechnology" and "AI & Robotics?" That's a total conceptual failure. Worldchanging's Jamais Cascio gets it right in the comments for his reaction post when he writes:

I proposed the social-centered structure for a few reasons, but they all come down to moving away from the unidirectional technological change -> social change model that seems so commonplace in emerging tech/Singularity discussions.

Implicit in a structure that focuses on particular technology categories is a "here's why this tech is nifty/dangerous" approach. By focusing instead on areas of impact, I'm pushing a "here's an important issue, let's see how the different techs get woven through" model. Both may talk about technologies and society, but the tech-centered structure looks for cause and effect, while the social-centered structure looks for interactions.

Singularity University is distinctly oriented toward a method of thinking where technology leads to social change--unsurprising, since that's much of the appeal of singularitarianism itself. But technology isn't created or used in a vacuum. Look at development, for example: fifty years of the IBRD trying to build open markets via top-down structural adjustment loans, completely blindsided by microfinance and the ability to transfer cellular minutes. Terrorists in Mumbai are using Blackberries and information technology to coordinate their attacks. Not to mention the rise of the netroots as a political organization that has not only shaped the electoral process, but altered the policies (open government, net neutrality, creative commons) that it demands.

These innovations are not stories of emerging technology with direct, predictable outcomes. They're all rooted deeply in the social and cultural context in which they evolved, and they trade ideas across non-contiguous domains--who would have thought that Daily Kos would borrow community management methods from Slashdot, for example? And yet Singularity University seems to have put together its mission without considering these kinds of Black Swans: invent X technology, they seem to be saying, and Y or Z social impact will follow (or can be guided by visionaries) in a linear fashion. It's a predictive viewpoint straight out of Heinlein-era science fiction, and frankly it's irresponsible. Even assuming that the institution really does foster "the development of exponentially advancing technologies" (if such a thing is at all desirable), it's an act of phenomenal hubris to think that those same leaders will be the ones to "apply, focus and guide these tools" (quotes directly from the SU mission statement).

We could spend all day picking out the inconsistencies and missteps in the SU plan, like the fact that their "international" university has a faculty that's so very white and American. But the wider point remains: at a time when the cost of intellectual overconfidence has been driven home economically and ecologically, Singularity University wants to charge CEOs and government leaders $25,000 to tell them that they're in control of the future. For an academic institution, that's a pretty big lesson they seem to have missed.

February 2, 2009

Filed under: culture»internet»excession

Time Out

Like a lot of people, I have a hard time leaving well enough alone when it comes to Internet argumentation. And the Internet being what it is, there's a lot of argument out there. Tech forums, political blogs, the extremists who got my e-mail address from Ars and decided to add me to their lunatic press release list... the available incendiary material is endless. And in some ways, that's a good thing: I believe strongly that the Internet's soup of ideas and opinion, debated rationally, can be a great place to learn and explore.

That said, it can also be stressful, and possibly hazardous. I find it way too easy to get into a cycle of comment/refresh/comment, fuming the entire time--and even when I think I've pulled myself away, there's a certain compulsion to check the laptop and keep the cycle going. Indeed, it's probably a good idea that few people read this: I've had the experience of hostile commenters here, and it wasn't worth the stress.

If you have this problem as well--and in my experience, most moderately-opinionated people can fall into this behavior online--it might be helpful to mandate a cooldown period. This weekend, when I found myself starting to obsess a little over a minor point of disagreement, I took a deep breath and then installed BlockSite in Firefox to keep myself away from the site getting me worked up. After a couple of days, I unblocked it--but by that point, I'd gotten some emotional distance. It's not artificial self-control, merely assisted.
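
The same trick doesn't even require a browser extension, if you're willing to get your hands dirty: a few lines pointed at the hosts file will do it. Consider this a rough sketch rather than a recommendation--the 48-hour default and the file path are arbitrary, it needs administrator rights, and a determined person can undo it in seconds (which is fine, since the point is friction, not a padlock).

    # Sketch of "assisted self-control": point a domain at localhost in the
    # hosts file, tagged with an expiry, then sweep out entries whose
    # cooling-off period has passed.
    from datetime import datetime, timedelta

    HOSTS = "/etc/hosts"  # on Windows: C:\Windows\System32\drivers\etc\hosts
    MARKER = "# cooldown until "
    STAMP = "%Y-%m-%d %H:%M"


    def block(domain, hours=48):
        """Add a blocking entry that records when it should be lifted."""
        expiry = datetime.now() + timedelta(hours=hours)
        with open(HOSTS, "a") as hosts:
            hosts.write(f"127.0.0.1 {domain} {MARKER}{expiry.strftime(STAMP)}\n")


    def unblock_expired():
        """Remove any cooldown entries whose time is up."""
        now = datetime.now()
        with open(HOSTS) as hosts:
            lines = hosts.readlines()

        def still_active(line):
            if MARKER not in line:
                return True  # not one of ours; leave it alone
            stamp = line.split(MARKER, 1)[1].strip()
            return datetime.strptime(stamp, STAMP) > now

        with open(HOSTS, "w") as hosts:
            hosts.writelines(line for line in lines if still_active(line))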

There are people who believe that the problem with the Web (and, to some extent, the problem with modern life in general) is that it's too fast, too much, and too easy. I don't really agree with that. The way I see it, if technology gives us tools for wreaking havoc in one way or another, it also gives us tools to keep the situation under control. Just because progress makes something possible doesn't make it inevitable. Indeed, while it's not always possible to take personal responsibility for the excesses of technology, this is one of those cases. And since I enjoy the advantages that progress brings, I don't think a little augmented restraint on my part is too much to ask--particularly when it improves my own emotional health as well. Better to have that choice, and not use it, than never to have the choice at all.

January 27, 2009

Filed under: culture»internet

A Merry Life, and a Short One

Apparently it's in Wired's lease or something that every year they have to write another article about crazy libertarians who think they can recreate Galt's Gulch on a floating ocean city (this one's the grandson of Milton Friedman, of all people). I assume it's next to the clause requiring Wired to dedicate a certain amount of annual space to Ray Kurzweil, since in both cases there's never anything new, or even remotely feasible, to write about without a legally-binding reason to do so.

The recurrence of Rapture is usually a good opportunity to link to China Mieville's dissection of the Freedom Ship just last year. But of course, this year the seas have been uncommonly dramatic, thanks to the rise of Somali piracy. It's hard not to daydream, so long as there could be no actual chance of it happening, of Patri Friedman's Seasteading Institute drifting into the waters surrounding the Horn of Africa--only to be confronted by a very real example of weak governance and market dysfunction, via the business end of a second-hand AK-47.

Linking the two together, both Mieville and Johann Hari (who wrote "You Are Being Lied To About Pirates" regarding the Somalian situation) reference the work of Marcus Rediker when discussing their respective sea communities. Rediker wrote The Many-Headed Hydra in 2002 with Peter Linebaugh, examining the lineage of revolutionary thought (including pirates) in the Atlantic, then he followed it in 2005 with Villains of All Nations, which focused on the "golden age" of piracy. Both books highlight the radical political experimentation that arose in the pirate communities of the 17th and 18th centuries: pirates elected their captains, shared their booty equally, and lived in a roughly egalitarian society that was multiethnic, multicultural, and even relatively subversive in its opportunities for women. It was also fervently anti-state, with pirates declaring that they were "from the sea." Before any of the libertarian sailors-to-be consider appropriating this legacy, however, they'd probably do well to remember that the pirates of the Atlantic were primarily opposed to--and spawned from--the rudimentary free market being established as the new social order of the day. They often turned to piracy in reaction to brutal treatment at the hands of their merchant captains, and took great pleasure in destroying property and sinking merchant ships. The pirates were, in other words, early anti-globalization activists of the most violent kind.

As Hari points out, the parallels to the events off the coast of Somalia are striking. In the absence of a functioning national coast guard, the country has become prey for European commercial ships that fish illegally in Somali waters--or worse, use them as a dumping ground for toxic waste. Hari writes: "This is the context in which the 'pirates' have emerged. Somalian fishermen took speedboats to try to dissuade the dumpers and trawlers, or at least levy a 'tax' on them. They call themselves the Volunteer Coastguard of Somalia - and ordinary Somalis agree. The independent Somalian news site WardheerNews found 70 per cent 'strongly supported the piracy as a form of national defence'."

This is not to defend the actions of pirates, but it is instructive to consider the full picture--and to appreciate the use of the terminology. Both then and now, as Rediker says in his introduction for Villains, sea piracy represents "a terror of the weak against the strong." A similar motivation exists for enterprises like the Seasteading Institute, but their definition of the "weak" and the "strong" is very different.

Given those facts, one striking thought after reading these histories is how the use of the word "pirate" in an intellectual property context is an utter debasement of the term. The golden age pirates existed in opposition to the exploitative labor practices and social structure of the day. The Somali pirates exist in opposition to environmental and commercial exploitation. But what exploitation does the software pirate, or the music pirate, oppose? The harsh world of having to pay for goods? Ironically, referring to IP theft as "piracy" serves the interest of both sides. For the thieves, it glorifies their actions by association with a glamorous history of rebellion. For the commercial interests, it distracts from legitimate issues of intellectual property, like sampling and fair use.

I have a solution, of course. We're going to need a very large boat...

December 18, 2008

Filed under: culture»internet

The Red Rose

I have a soft spot for web sites that maintain the classic "Geocities circa 1999" look and feel. Belle and I both started building HTML back in those days, when marbled backgrounds and marquee tags were the hottest technologies going. We remember them fondly, even if they look funny now. It's Gen-Y kitsch.

My favorite example of this is Bass Northwest, Seattle's premier boutique bass dealer. Great store, and host to a fine collection of animated horizontal rules and fake 3D text in .GIF form (always a nice touch in an age of slow mobile Internet access). It's not that they've forgotten about it--the stock is constantly updated online. It's just that they have better things to do than to learn anything other than the H1 and P tags. Honestly, I respect that.

But my boss has, today, forever "won" this particular contest. Behold: The Red Rose Inn and Suites of lovely Plant City, FL, where her high school reunion will be held next summer. Words cannot adequately describe it. As she says, "It's like every single thing that I ever made fun of in high school came slamming back in one big pink opera-gloved fell swoop." I highly recommend the virtual tour on the left rail.

I'm really tempted to come up with a pre-2000 CSS flavor now, complete with beveled borders on all the div tags, a la Netscape Navigator 3.0.

What's your favorite Web dinosaur? Anyone else miss those old table-based tar pits?

December 16, 2008

Filed under: culture»internet

All of Us Sidekicks

On Friday, someone named Aaron Swartz pointed out the obvious: that user-friendly smartphones had existed before a certain computer company entered the market, but were ignored for cultural reasons.

But, of course, neither minorities nor schoolchildren rule the world, so the Sidekick has been written out of history. 2007 was the first time anyone had thought to give a smartphone a decent UI, or a web browser, or an over-the-air application store. Well, at least it was the first time anyone thought to tell white people.

Shockingly (or not), hipster gadget bloggers like Joel Johnson overreacted, posturing like they'd been accused of Klan membership for owning an iPhone. There's no reason to reprint his rant, since it's just embarrassing (particularly that Johnson has room to accuse anyone of being "mouthbreathingly turtlenecked"), but I did find this graf interesting:

Sociologically and culturally it was a trend of note. While I'm positive that consumer electronic choice often breaks down differently over cultural and economic lines, phones are one of the few items that we commonly can observe on the street. But to presume that there is some sort aspect of ignorance by "white people" in passing the Sidekick by -- especially when "white people" almost certainly means "working adults" -- is exactly the sort of goofy injection of race into an argument that drives me crazy.

In other words, Johnson recognizes that there are different class issues (and make no mistake, Swartz's point is really more about class than race) related to technology adoption--and then he goes ahead and dismisses those issues anyway, preferring to get hung up on the phrase "white people" and insist that only the features he finds useful (as an upper/middle class, white hipster) are the ones that count "holistically" in a great smartphone.

Now, no-one's going to argue that tech blogs are a source of enlightened social discourse. But I think there is a deeper issue here that we--let's flatter ourselves as fairly cutting-edge folks--could stand to examine, and that's how class informs both the technology we use and how we use it.

Take SMS/MMS, for example. I hate getting SMS, personally. I grew up at a time when none of my friends had cell phones, but we were always within 30 feet of a computer with e-mail access. I never used text messages on my old phone, and now that I have a phone that does e-mail, I still don't. All I see when I get an SMS is $.10 extra on my phone bill and a technology that's a bit kludgy and error-prone. The iPhone doesn't even really support MMS--a fact that's led a lot of fans to dismiss the technology out of hand.

But of course, some really exciting stuff is being done with SMS. Systems like FrontlineSMS, which give activists powerful tools for organizing and communicating, and the Obama campaign's announcement database are fascinating solutions built on top of it. And even if that weren't the case, it's still snobbish of me to look down on SMS. After all, what is Twitter but SMS blogging--in other words, SMS/MMS for rich white people?

This is one of the reasons that I love reading blogs like Afrigadget: it's a way to get outside of my comfort zone and see people fixing problems using technology that I would have wrongly dismissed as "unsophisticated" or clumsy. And perhaps they are, but here's the thing: they're also cheap, widespread, and egalitarian, not just outside the US but inside as well. As someone whose job is to get news and information out to different audiences using technology, I need to keep principles of resource constraint in mind. Even tech-savvy people can find themselves in a situation where a text message--or a similarly unglamorous medium--is the most effective way to make a query or get an announcement.

The digital divide isn't just physical or economic. It's also a function of class culture, particularly since the people who are getting the venture capital funding are from the upper levels of that culture. They're the people who don't think MMS has any particular use, who have shiny smartphones with really nice web browsers, who see a future in rich Internet applications and high-bandwidth multimedia. As Dan Lyons wrote while attending one Web 2.0 conference:

My first reaction was that in the greater scheme of things (economy in free fall, war in Iraq, global warming, energy crisis, not to mention the old reliables like cancer and poverty and AIDS, etc.) this challenge of finding a good restaurant seems like a fairly trivial and unimportant problem for our big geek brains to be trying to solve. If I were funding these guys I might go home scratching my head about what those kids are doing with all of my millions. Maybe there is a point to what they're doing, but honestly, what great problem are these companies trying to solve? Sitting there watching this spectacle - watching these guys unable to simply explain what they do and how they are going to make a business out of it - it was staggering to think that someone has entrusted these people with very large sums of money. But someone has. I weep for those people.

Unlike Lyons, I don't weep for the money men, the venture capitalists, or the eager young startups. My sympathy is with the rest of us. Because eventually, a lot of us are going to end up on the wrong end of the curve. And when we do, I'd like to think that there's going to be more waiting for us than the scorn of the in-crowd at BBG and Gizmodo.

October 24, 2008

Filed under: culture»internet»excession

On the Grid

I guarantee you, right at this moment, someone is trying to figure out how to display ads, or porn, or ads for porn, based on your current latitude, longitude, and browser history. "Location, location, location" is not just a cliched real-estate slogan anymore. It's the guiding principle behind a whole slew of web startups and new technologies: location-aware browsers, geotagged photos, RFIDs, and who knows how many budding social networks--and those are the least annoying ones. Expect the Beltway to be getting a lot of shipments from the makers of "Margaret Thatcher Gone Wild."

But those applications, like most Web 2.0 startups, are trying so hard to be groundbreaking that they're missing the point. It's not that there aren't legitimate commercial uses for that data, or that those uses won't be seen as essential one day. It's that there's a more intrinsic, human aspect to location awareness, and it has the potential to be culture-changing at a level that's a lot more profound than just virtual graffiti and inventory maintenance.

When Belle and I have travelled in the past, we've often planned our days carefully. And by "we," I mean Belle. It's not so much that she likes planning (although I suspect she does), but probably more that I'm really bad at it, and someone usually has to do it since we rely on public transportation at our destination. That means writing down at least a couple of transit routes, and carrying a lot of maps. It's stressful, especially if (for example) someone read a map wrong in Paris once by accident, resulting in a 30-block trek to a restaurant that the guidebook neglected to note was closed, and then that same person might have made a wrong turn in Chicago once with similar results, and now his girlfriend HAS to second-guess his map-reading skills constantly even though those were ISOLATED INCIDENTS, BELLE, GIVE IT A REST ALREADY.

For the second half of this trip to the Pacific Northwest, we did things a little differently. I've got a smartphone, and Belle's been using a Samsung Instinct, which has a GPS and a number of smartphone-ish features. We'd pick a few things to do each day, then figure out routes and detours dynamically via the data connection. The difference was night and day, and better by orders of magnitude.

As a side note, not to sound like a shill, but I cannot say enough good things about the Google Maps app on S60. It's fast and smart, which you'd expect from a search company, and the more recent versions have integrated public transit directions that worked flawlessly for us in Portland. Given the limitations of cellular triangulation-based location-finding, it still requires a little map-reading and common sense, but that's a small price to pay to never look at a bus map again.

Having location information instantly available did more than just make it easy to get from place to place. We stopped worrying about missing a stop on the bus--just keep Maps open and check to see when the blue circle gets close to the end of the purple line. It was still possible to get lost, but it didn't provoke feelings of helplessness anymore. Likewise, we could still spontaneously make little discoveries as we walked--Belle stopped me by chance at a local music shop that happened to sell the Z. Vex boutique effects pedals I've been coveting for years--but it was actually less stressful to just wander around because we could always at least find out where we were, relatively, if not exactly how to get back to where we started.

I suspect sometimes that the core engine of human psychology is a tiny, churning knot of doubt. Hidden deep under layers of rationalization and ego, there's something constantly in need of reassurance: "What's going on? What time is it? Where am I? Who are these people, and why are they staring at me like that? What happened to my pants?" As tool-using mammals, people instinctively gravitate to answers for those questions. The first step in soothing the internal doubt mechanism is to invent a device that answers its queries, like clocks and watches did for time. But the second step is to create a standard for those devices, allowing them to be universal, like Greenwich Mean Time. Universality means familiarity means comfort. When everyone agrees on the time, it becomes possible to order our lives and interactions from a common reference point, which is not only convenient but also psychologically pleasing.

Yet cell phones were, as others have observed, a disruptive technology for timekeeping etiquette--not because they ruined our ability to plan, but because they decoupled it from the burden of scheduling everything in advance. People no longer coordinate their schedules in such depth, but negotiate them around the less flexible portions of the day. When Belle and I met up with Corvus and Rachel at Powell's in Portland, we didn't bother to set a precise time or part of the bookstore to meet. We just agreed to call when we got there, coordinating on the fly. In general, plans are more vague, and yet everyone's still comfortable with that because the overall level of uncertainty is lower.

Location awareness, I think, has the potential to take that kind of ad-hoc social improvisation even further. Because if you can always figure out where you are, and the others in your group can do the same, meeting places become much more nebulous things. In that situation, any place that meets the necessary criteria for the task at hand--a place to talk, say, and maybe get coffee or other social lubricants--can be determined, shared, and navigated easily. The need for it to be familiar or known beforehand is eased, because when you're always plugged into your physical location every place is a little bit familiar.

One of the revelations I had at the World Bank, during our street-numbering education project, was that not all cities have a systematic method of describing location by street. People in developed countries take for granted the ability to navigate using a series of concrete, standardized instructions instead of searching for landmarks and fumbling with relative distances. Perhaps location services will alter the way we look at street mapping the same way that cell phones have blurred our mapping of time. Or maybe they won't. All I know is that it's gone right to the top of my packing list for my next trip.
