Last Friday, I gave a short presentation for a workshop run by the SCCC Byte Club called "Technical Interview Mastery for Women." Despite the name, it was attended by both men and women. Most of my advice was non-gender specific, anyway: I wanted to encourage people to interview productively by taking into account the perspective from the other side of the table, and seeing the process more as a dialog instead of a confrontation.
Still, during the question and answer period, several people asked about being women in the interview process. Given that my co-presenter has many years more experience being a woman, I deferred to her whenever possible, but I did chime in when the conversation turned to interaction styles. One participant said she was ignored if she wasn't assertive enough, but was then considered unpleasant if she stuck up for herself--what could she do about this?
It's one thing, I said, to suggest ways that women should adapt their communications for a male-dominated workplace--that kind of pragmatic code-switching may well do the trick. But I think it's unfair to put all the burden on women to adapt to men. There needs to be a way to remind men that it's their responsibility to act reasonably.
The problem is that it's often difficult to have that conversation without falling afoul of the same double-standard that says women in the workplace shouldn't be too loud. Complaining about sexism tends to raise hackles--meaning that the offending statement not only goes uncorrected, but dialog gets shut down. I don't know that I have any good solutions to that, but I suggested finding ways to phrase the issue akin to Jay Smooth's presentations on How To Tell People They Sound Racist. I like to think that most people aren't trying to be sexist, they're just not very self-aware. This may be a faulty assumption.
My talk at the workshop was specifically about interviewing, but obviously this is an issue that goes beyond hiring. Something is happening between the classroom and the workplace that causes this disparity. We have a word for this--sexism--regardless of the specific mechanics. And I would love to have more discussions of those specifics, but it's like climate change: every time there's a decent conversation in a public forum about solutions, it gets derailed by people who insist loudly that they don't think there's a problem in the first place.
That said, assuming that people just don't realize when they've done something wrong, there are doubtless ways to address the topic without defensiveness. If the description "sexist" derails, I'm personally happy to use other terms, like "unprofessional" or "rude"--I'm just embarrassed that I (and others) need to resort to euphemism. We need to change the culture around this discussion--to make it clear that we (both men and women) take this seriously, including respectful responses to criticism. We can do better, and I'd like to be able to tell future workshops that we're trying.
Last year, Google spent most of its I/O conference keynote talking about hardware: Android, Glass, and tablets. This year, someone seems to have reminded Google that they're a web company, since most of the new announcements were all running in a browser, and in many cases (like the photo editing and WebGL maps) pushing the envelope for what's possible. As much as I like Android, I'm really happy to see the web getting some love.
There's been a drumbeat for several years now, particularly as smartphones got more powerful, to move away from web apps, and Google's focus on Android lent credence to that perspective. A conventional wisdom has emerged: web apps were a misstep, but we're past that now, and it'll be all native from here on out. I can't disagree with that more, and Google's clearly staking its claim as well.
The reason the web wins (such that anything will) is not, ultimately, because of its elegance or its purity (it's not big on either) but because of its ubiquity. The browser is the worst cross-platform API except for all the other ones, and (more importantly) it offers persistence. I can turn on any computer with an Internet connection and have near-instant access to files and applications without installing anything or worrying about compatibility. Every computer is my computer on the web.
For context, there was a time in my high school years when Java was on fire. As a cross-platform language with a network-savvy runtime, it was going to revive thin clients: I remember talking to people about the idea that I could log into any computer and load my desktop (with all my software) over the Internet connection. There wouldn't be any point to having your own dedicated hardware in a world like that, because you'd just grab whatever was handy and use it as a host. It was going to be like living in a William Gibson novel.
I don't believe that native programs will ever entirely go away. But I do think we see web applications spreading their tentacles over time, because if something is possible in the browser--if it's a decent user experience, plus it has the web's advantages of instant, no-install launch and sharing across devices--there's not much point in keeping it native. It's better to have your e-mail on any device. It's better for me to do presentations from a browser, instead of carrying a Powerpoint file around. It's better to keep my RSS reader in the cloud, instead of tying its state to individual machines. As browsers improve, this will be true of more and more applications, just as it was true of the Java applets that web technology replaced.
Google and I disagree about where those applications should be hosted, of course. Google thinks it should run them (which for many people is perfectly okay), and I want to run them myself. But that's a difference of degree, not principle. We both think the basic foundation--an open, hackable, portable web--is an important priority.
I like to look at it in terms of "design fiction"--the dramatic endpoint that proponents of each approach are aiming to achieve. With native apps, devices themselves are valuable, because native code is heavy: it takes time to install, it stores data locally, and it's probably locked to a given OS or architecture. Web apps don't give us the same immediate power, but their ultimate goal is a world where your local hardware doesn't matter--walk up to any web-capable surface, and your applications are there. Software in the web-centric viewpoint follows you, not your stuff. There are lots of reasons why I'm bullish on the web, but that particular vision is, for me, the most compelling one.
Soul Society is here again, and so am I. If you're in the DC area this weekend, check it out.
There's a common complaint about the Bioshock games, which is that they're not very good shooters. People writing about Bioshock Infinite tend to mention this, saying that the story is interesting and the writing is sharp but the actual game is poor. And this is true: it's not a very good first-person shooter, and it's arguably much worse than its predecessors. But the implication of most of these comments, from Kotaku's essay on its violence to Brainy Gamer's naming it the "apotheosis of FPS," is that Infinite is bad in many ways because it's a first-person shooter--that it's shackled to its point of view. In the process, Infinite has become a sort of stand-in for the whole genre, from Call of Duty to Halo.
I sympathize with the people who feel like the game's violence is incoherent (it is), and who are sick of the whole console-inspired manshooting genre. But I love shooters, and it bugs me a little to see them saddled with the burden of everything that's wrong with American media.
Set aside Infinite's themes and its apparent belief that the best superpower is the ability to literally generate plot holes--when we say that it's not a good FPS, what does that mean? What is it, mechanically, that separates a good shooter from a bad one? I'm not a designer, but as an avid FPS player, there are basically three rules that Infinite breaks.
First of all, the enemy progression can't be just about "bigger lifebars." A good shooter increases difficulty by forcing players to change their patterns because they're not able to rely on the same rote strategy. Halo, for all its flaws, gets this right: few of its enemies are actually "tough," but each of them has a different method of avoiding damage, and a different weapon style. By throwing in different combinations, players are forced to change up their tactics for each encounter, or even at multiple points during the encounter. Almost all of Infinite's enemies, on the other hand, are the same walking tanks, with similar (dim-witted) behaviors and hitscan weaponry. I never had to change my approach, only the amount of ammo I used.
Along those lines, weapons need strengths and weaknesses. Each one should have a situation where they feel thrillingly powerful, as well as a larger set of situations where they're relatively useless. This doesn't have to conflict with a limited inventory--I loved Crysis 2's sniper rifle, spending the entire game sneaking between cover positions in stealth mode, but it was always paired with a strong close-in gun for when I was overrun. A good game forces you to change weapons for reasons other than "out of ammunition." Infinite's close-range weapons feel identical, and its sniper rifle is rarely useful, since a single shot alerts everyone to your position.
Finally, not every fight can simply be about shooting. Most shooters are actually about navigating space and territory, and the shooting becomes a way of altering the priorities for movement. Do you take cover, or dodge in the open? Do you need more range, or need to close on an enemy? The original Bioshock made the interplay between the environment and your abilities one of its most compelling features: electrifying pools of water, setting fire to flammable objects, flinging scenery around with telekinesis. But at the very least, you need an objective from time to time with more complexity than "kill everything," both as a player and in terms of narrative.
Bioshock Infinite has, in all seriousness, no period I can remember when my objective was not reduced to "kill everything." Combined with a bland arsenal and blander enemies, this makes it a tedious game, but it also puts it at complete odds with its characters. The writing in Infinite is unusually good for a shooter, but it's hard not to notice that Elizabeth freaks out (rightfully) during one of Booker's murderous rampages, comes to a cheery acceptance of it a few minutes later, and then spends the rest of the game tossing helpful items to you under fire. That's writing that makes both the narrative and the mechanics worse, by drawing attention to the worst parts of both.
It's not the only shooter with those flaws--people just had higher expectations for it. The average FPS is badly written, and it's a favorite genre for warmongering propaganda pieces. But that's true of many games, and yet we don't see pieces talking about the "apotheosis of platformers," or talking about RTS as though they're emblematic of wider ills just because Starcraft II is kind of a mess. And there's still interesting stuff being done in the genre: Portal and Thirty Flights of Loving come to mind. To say that FPS have reached their limits, ironically, seems like a pretty limited perspective.
A couple of years ago, I spent the money for a subscription to Ars Technica, because I really liked their Anonymous/HBGary reporting, and wanted the full RSS feeds. Along with that, every now and then they'll send out a message about a coupon or special offer, which is how I ended up with a free account on App.net, the for-pay Twitter clone. Then I forgot about it, because the last thing I need is a way to find more people that annoy me.
And then someone linked me to this blog post, which made my week. It's a pitch for App.net in the most overwrought, let-them-eat-cake way. I'm going to excerpt a bit, but you should click through: it's better when you can just soak up the majesty of the whole thing:
The difference between a public and a private golf course is so profound that it's hard to play a public course after being a member of a private course. It's like flying coach your entire life, and then getting a first class seat on Asiana — it's damned hard to go back.
That's the difference between Twitter and App.net to me. Twitter is the public golf course, the coach seat. It's where everyone is, and that's exactly the problem. App.net is where a few people that are invested in the product, its direction, and the overall health of the service, go to socialize online.
[paragraph of awkward self-promotion removed]
Welcome to the first-class Twitter experience.
I actually don't know if I could write a parody of upper-class snobbery that good. If you hold your hand up to the screen, you can almost feel the warmth of his self-regard--but not too close! They don't let just anyone into this country club, you know.
Seriously, though: while I've amused myself endlessly trying to come up with even-less-relatable metaphors for things ("Twitter is the black truffle, as opposed to the finer white truffles I eat at my summer home in Tuscany"), one random doofus with a blog is not cause for comment. Silly as it is, that post made me reconsider the way I look at internet advertising and ownership--if only to avoid agreeing with him.
In general, I'm not a big fan of advertising or ad-supported services. On Android, I usually buy apps instead of using the free versions, and I believe that people should own their content on the Internet. But let's be realistic: most people will not pay for their own server or software, and many people can't--whether because they don't have the money, or because they don't have access to the infrastructure (bank account, credit card, etc.) that's required. Owning your stuff on the internet is both a privilege and a visible signifier of that privilege.
This creates hierarchies between users, and even non-savvy people pick up on that. When Instagram finally decided to release an Android client, the moaning from a number of users about "those people" invading their clean, tasteful, iPhone-only service was a sight to behold. The irony of Instagram snobbery is that the company was only valuable because of its huge audience. It only got that userbase because it was free. Therein lies the catch-22 of these kinds of services: the scale that makes them useful and valuable also makes them profoundly expensive to run. Subscription-based or self-hosted business models are more sustainable, but they're never going to get as big.
Meanwhile, the technical people who think they could do something about these problems--"a few people that are invested in the product, its direction, and the overall health of the service"--are off building their own special first-class seating. Not that I think they'll make it, personally--it's a perpetual tragedy that the people threatening to Go Galt never do, since that would require them to stop bothering the rest of us.
I often see people expressing distaste for ad-supported sites with the oft-quoted line "you're not the customer, you're the product." That's nice when you have the option of paying for your own e-mail, and running your own blog, in the same way that minimalism looks awfully nice when you have the credit rating to afford it. People without money have to live with clutter. If we're interested in an internet that offers opportunity to everyone, we have to accept a more forgiving view of ad-supported business, and focus on how to make it safer for people who have no other option. Otherwise we're just congratulating each other on getting into the country club.
Normally, I try to have something written and posted here by Wednesday night each week, because I feel like that's the minimum I can write and still call myself a blogger. This week, unfortunately, between writing my textbook (highly recommended!) and trudging through Bioshock Infinite (not at all recommended!), my right wrist is probably in the worst shape it's been in for about five years now. To recover, I'm giving myself the week off from computers outside of work.
I figure you don't really need to know this, but if I write it up here, I'm more likely to stick to it.
While I'm complaining, my knees hurt and these kids won't stay off my lawn.
Last week, Iain Banks announced that he has terminal cancer, with probably a year remaining to live. He'll hopefully see the publication of one more book, The Quarry, before he goes.
Banks has long been one of my favorite authors, to the point that our living room bookshelves have several units devoted entirely to his work. I even had Belle bring me back paperbacks of his literary fiction from a trip to England, since those are still hard to find on this side of the pond. I'm tremendously saddened that he's doing so poorly, and I hope his plans to enjoy his remaining time as much as possible are a success.
If you've never really read any of Banks' work, and you'd like to see what the fuss is about now, where should you start? The answer seems to be fairly personal--especially within the science fiction genre, opinions often differ wildly on which books are better. This is my take, sorted between the two genres (literary and SF) that Banks called home.
These "Academic Freedom Act" laws seem like a very good idea to me, but I wonder if we're taking them far enough. If the Discovery Institute and all manner of right-wing think tanks want to Teach the Controversy, why limit ourselves to evolution and climate change? With that in mind, I've assembled a new school curriculum that (finally!) acknowledges the complicated world beyond "facts" and "truth."
Social Studies: Students will learn about the checks and balances built into our democratic way of life, of course. But we shouldn't leave them ignorant of competing theories, such as David Icke's "lizard oligarchy," in case the queen of England really does turn out to be a giant space reptile bent on world domination. As high school seniors, students will also spend the semester learning about Ayn Rand's theory of radical selfishness, in the hopes that it will keep them from reading Atlas Shrugged in college and becoming insufferably tedious for about a year and a half.
History: Move over, eurocentric history! Take cover, afrocentric and multicultural history! Under new management, history class will approach the hard questions of the past with an open mind toward alternate theories. For example: did the Holocaust really happen, or is it just the invention of a shadowy cabal working behind the scenes of our financial and entertainment industries? You know who I'm talking about.
Physical Education: Gym class doesn't change, but students who get sick will now be told that their humours are out of balance, and will be bled by on-site leeches. Coaches also have the option of blaming vaccines when the football team loses.
English: Given the predominance of "literacy" in the early grades, students will spend the second half of their primary education learning how to communicate pre-verbally, mostly by pointing and grunting. For many teenagers, this won't be much of a change. The curriculum will culminate with a trip to a local quarry, where the students will attempt to recreate the Lascaux cave paintings, thus teaching them the valuable life lesson that art is hard so why try anyway?
Math: I tried to think of something funny about math, and then I remembered that we still teach kids about "imaginary" numbers, and to add insult to injury we do so very badly. Math is weird, y'all.
Foreign Languages: One word: Esperanto. Ironically, in Esperanto, this is actually twelve words. It's the language of the future, people. William Shatner did a whole movie in Esperanto once. I've got a good feeling about this one.
There are lots of tools for doing diffs between two source files, but I'm not aware of any source control system (save Perforce, which we use at ArenaNet) that does a timeline view of all revisions since a file was first checked in, and none that store the entire revision history in a single, web-friendly format. This is a shame, because my goal for several parts of the textbook is to be able to "replay" the process of writing a script, to show how it develops from a few lines of simple code into larger and more functional units like functions and prototypes. It's possible that someone else has done something like this, but a cursory Google search couldn't turn it up, so I made my own.
You don't have to write these files by hand, which is good, because they can get pretty nightmarish. Instead, I've written an authoring tool for putting in multiple revisions (or importing them, using the HTML5 file API), commenting them, and exporting them. Using Ace means the editor is friendly and includes source-highlighting, which is great. You also don't have to worry about writing an output parser: the TLPlayer module is not quite complete, but it's done enough to wire it up to a UI and let people flip through the file, with new lines highlighted in the output.
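To make the idea concrete, here's a rough sketch of the shape such a file and player might take. To be clear, this is a simplified illustration of my own, not the actual serialization or the TLPlayer code--I'm assuming a plain JSON-ish structure with one entry per revision, and a naive line-by-line comparison for the highlighting--but the gist is similar: each revision carries a comment and a full snapshot of the script at that point.

    // A simplified sketch (not the real timelapse format or TLPlayer code):
    // an ordered list of revisions, each with a comment and the full text
    // of the script as it stood at that point. All names here are hypothetical.
    var timelapse = {
      title: "Building a function, step by step",
      revisions: [
        {
          comment: "Start with a single hard-coded roll",
          code: "var roll = Math.floor(Math.random() * 6) + 1;"
        },
        {
          comment: "Generalize it into a function",
          code: "var roll = function(sides) {\n  return Math.floor(Math.random() * sides) + 1;\n};"
        }
      ]
    };

    // Given a revision index, return its lines, flagging anything that
    // didn't appear in the previous revision so a player can highlight it.
    var getRevision = function(tl, index) {
      var current = tl.revisions[index].code.split("\n");
      var previous = index > 0 ? tl.revisions[index - 1].code.split("\n") : [];
      return current.map(function(line) {
        return { text: line, isNew: previous.indexOf(line) === -1 };
      });
    };

    // getRevision(timelapse, 1) tags the new function body as isNew: true,
    // while unchanged lines come back with isNew: false.

In a sketch like this, storing a full snapshot per revision keeps the player trivial at the cost of some file size; storing real diffs would be smaller but fiddlier to reconstruct.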
If you'd like to see a demo, I've started using it for the chapter on writing functions. My goal is to put at least one timelapse at the end of each chapter, so that readers can see the subject matter being used to build at least one piece of real-world code. By doing these as revision histories, I'm hoping to avoid the common textbook "dump a huge source example into the chapter" syndrome. I know when I see that, my eyes glaze over--I don't see any reason it's any different for my students.
Although I don't have a license on the textbook files yet (they'll probably be MIT-licensed in the near future), you're welcome to use these two modules for your own projects, and feel free to submit patches (the serialization, in particular, could probably use some love from someone with a stronger parsing background). I'd love to see if this is useful for anyone else, and I'm hoping it will help make this textbook project much friendlier to new developers.
They always want the writer to work for nothing. And the problem is that there's so goddamn many writers who have no idea that they're supposed to be paid every time they do something, they do it for nothing! ... I get so angry about this, because you're undercut by all the amateurs. It's the amateurs who make it tough for the professionals, because when you act professional, these people are so used to getting it for nothing, and for mooching...
Last week, Nate Thayer wrote a well-linked post about being asked to write for The Atlantic for free--well, for "exposure," which is free in a funny hat. It's gotten a lot of attention in the journalism community, including a good piece on the economics of web-scale journalism by Atlantic editor Alexis Madrigal.
I read this kind of stuff and think that I have never been happier to find a niche within journalism that makes me marketable. I mean, not that marketable: I had to switch industries when I moved out of DC, after all. But inside the beltway, I didn't have to freelance anymore, and I would have had plenty of options if I decided to leave CQ and head somewhere else. Data journalism was good to me, and I can't imagine having to go back to the scramble of being just a writer again.
But beneath that relief, I feel angry. And the fact that Madrigal can write a well-reasoned piece about why they're asking people to write for free doesn't make me any less angry. The fact that Ta-Nehisi Coates, who I respect greatly, can write about how writing for free launched the best part of his career, doesn't make me feel any less annoyed. I'm getting older but I'm still punk enough that when someone tells me the system is keeping us down, my response isn't to say, "well, I guess that's just how it is." The system needs to change.
Let's be clear: I don't expect writers to make a lot of money. They never have. People don't get into journalism because they expect to be rich. But writing--serious writing, not just randomly blogging on your pet peeves like I do here 90% of the time--is hard work. The long-form pieces that I've done have been drawn-out, time-consuming affairs: research, interviews, collecting notes, writing, rewriting, editing, trimming, and rewriting again. People think that writing is easy, but it's not, and it should be a paid job. (Even when it's not paid, it's not easy: I've been editing this post for three days now.)
As Ellison says, when publications can get the work for free, it makes it really hard to be paid for your writing. I'm not sure I'd phrase it with the same antipathy for "amateurs" (let's be clear: Ellison is a terrifying human being that I happen to agree with in this particular case), but it's certainly true that the glut of people willing to write for free causes a serious problem for those of us who write (or have written) for a living. They're scabs, in the union sense: they take work that should be paid, and drive down the cost of labor (see also: unpaid musicians).
All in all, the creative landscape is starting to look more toxic than it's been in our lifetimes: artists with million-dollar checks in their pockets are telling other artists that they shouldn't expect to get paid; publications are telling writers that they shouldn't expect to get paid, either; and meanwhile everyone wonders why we can't get more diversity in the creative ranks. One obvious way to reverse media's glut of wealthy white people would be to stop making it so that almost no one but wealthy white people can afford to get into media. But in the age of dramatic newsroom layoffs and folding publications, nobody wants to hear that.

When your publishing model depends on people writing for free, there are a lot of people who aren't going to get published. I couldn't afford internships during college, meaning that I had a hard time breaking in--but I was still relatively lucky. I worked in office jobs with flexible hours and understanding bosses. If I wanted to take an early lunch break in order to do a phone interview, I could. I had evenings free to work on writing and research. I could take jobs that paid 10¢ a word, because I had a day job to fall back on. A lot of people don't have that chance, including a disproportionate number of minorities.
It adds insult to injury when you look at some of the people who are published precisely because they could afford internships and writing for free. Sure, it's wrong to base an argument on a few highly-visible outliers. But it's hard not to be a little furious to see the NYT sending good money to Tom Friedman (the obvious travesty), or Roger Cohen, or David Brooks when the industry claims it can't offer new writers recompense. It burns to see The Atlantic insisting that paying people isn't sustainable when they gave Megan McArdle (a hack's hack if there ever was one) a career for years, not to mention running propaganda for the Church of Scientology. If you're going to claim that you're trying as hard as you can to uphold a long-standing journalistic legacy in tough economic times, you'd better make sure your hands are clean before you hold them out in supplication.
I am skeptical, personally, of claims that the industry as a whole can't afford to pay writers. I have heard newsroom financials and profit margins, both for my own employer and for others. The news is no longer a business that prints money, but it remains profitable, as far as I can tell--if not as profitable as management would often like. Perhaps that's not true of The Atlantic: I don't know the details of their balance sheet, although this 2010 NYT article says they made "a tidy profit of $1.8 million this year" and this 2012 article credits them with three years of profitability. That's an impressive bankroll for someone who claims they don't have the budget to pay writers for feature work.
That said, let's accept that I am not an industry expert. It's entirely possible that I'm wrong, and these are desperate times for publications. I can't solve this problem for them. But I can choose a place to stand on my end. I don't work for free, unless it's explicitly for myself under terms that I completely control (i.e., this blog and the others that I fail to maintain as diligently), the same way that I don't take gigs from paying musicians just because I like playing in front of an audience.
Coates may defend working for free, because it got him a guest spot at the publication where he now works. But to me, the most important part of the story is that he got that spot on the strength of his blogging, which drew the attention of other writers and editors. You want exposure? There's nothing wrong with making it for yourself. Please start a blog, and hustle for it like crazy. But don't let other people tell you that it's the same as a paycheck--especially when they're not working for "exposure." They're on salary.
Is there a chance, as with Coates and so many others, that exposure could lead to better gigs? Sure, the same way that a musician might get discovered while playing folk covers at a Potbelly sandwich shop. But it's a lottery, and pointing to successful writers who came up that way ignores the order of magnitude more who wrote for exposure and promptly sank into obscurity. You can't pay your rent with publicity, and you never could. We're professionals, and we should demand to be treated that way.