this space intentionally left blank

April 29, 2016

Filed under: journalism»education

Reporting with Python

This month, I'm teaching a class at the University of Washington on reporting with Python. This seems like an odd match for me, since I hardly ever work with Python, but I wanted to do a class that was more journalism-focused (as opposed to the front-end development that I normally teach), and teaching first-time programmers how to do data analysis in Node just isn't realistic. If you're interested in following along, the repository with the class materials is located here.

I'm not the Times' data reporter, so I don't get to do this kind of analysis often, but I always really enjoy it when I do. The danger when planning a class on a fun topic is that it's easy to over-stuff the curriculum in my eagerness to cover the techniques that I think are particularly interesting. To fight that impulse, I typically make a list of material I want to cover, then cut it in half, then think about cutting it in half again. As a result, there's a lot of stuff that didn't make it in — SQL and web scraping chief among them.

What's left, however, is a pretty solid base for reporters who are interested in starting to use code to generate and explore stories. Last week, we cleaned and searched 1,000 text files for a string, and this week we'll look at doing analysis on CSV files. In the final session, I'm planning on taking a deep dive into regular expressions: so much of reporting is based around interrogating text files, and the nice thing about an education in regex is that it will travel into almost any programming language (as well as being useful for many command line tools like grep or sed).
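To give a sense of what that looks like in practice, here's a minimal sketch of the kind of search we did in that first session, written in Python. The folder name and search term are placeholders for illustration, not the actual class data.

    import re
    from pathlib import Path

    # Hypothetical example: scan a folder of text files and print every
    # line that matches a pattern. The folder and pattern are made up.
    pattern = re.compile(r"noise complaint", re.IGNORECASE)

    for path in sorted(Path("documents").glob("*.txt")):
        with open(path, encoding="utf-8") as f:
            for line_number, line in enumerate(f, start=1):
                if pattern.search(line):
                    print(f"{path.name}:{line_number}: {line.strip()}")

A dozen lines like these will chew through a thousand files faster than anyone could even open them by hand, which is really the whole pitch of the class.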

If I can get anything across in this class, I'm hoping to leave students with an understanding of just how big digital scale can be, and how important it is to have tools for handling it. I was talking one night with one of the Girl Develop It organizers, who works for a local analytics company. Whereas millions of rows of data is a pretty big deal for me, for her it's a couple of hours on a Saturday — she's working at a whole other order of magnitude. I wouldn't even know where to start.

Right now, most record requests and data dumps operate more at my scale. A list of all animal imports/exports in the US for the last ten years is about 7 million records, for example. That's approachable with Python (although you'd be better off learning some SQL for the heavy lifting), but it's past the point where Excel is useful, and it certainly couldn't be explored by hand. If you can't code, or you don't have access to someone who does, you can't write that story.
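For CSV files at that size, the main trick is to stream the rows one at a time instead of loading the whole file at once. A rough sketch, with an invented filename and column name, might look something like this:

    import csv
    from collections import Counter

    # Hypothetical sketch: tally a huge CSV row by row so it never has to
    # fit in memory. The filename and column name are invented; a real
    # records request would come with its own schema.
    counts = Counter()

    with open("animal_imports.csv", newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            counts[row["species"]] += 1

    for species, total in counts.most_common(10):
        print(species, total)

It's not fast by database standards, but it will get you answers from a few million rows while Excel is still trying to open the file.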

At some point, the leaks and government records that reporters pore over may grow to a larger kind of scale (leaks, certainly; government data will likely stay aggregated as long as there are privacy concerns). When that happens, reporters will have to develop the kinds of skills that I don't have. We already see hints of this in the tremendous tooling and coordination required for investigating the Panama Papers. But in the meantime, I think it's tremendously important that students learn how to automate data work at a basic level, and I'm really excited that this class will introduce them to it.

March 3, 2016

Filed under: journalism»industry

Spotlit

Judging by my peers, it's possible that I'm the only journalist in America who didn't absolutely love Spotlight. I thought it was a serviceable movie, but when it comes to this year's Best Picture award I still harbor a fantasy that there's an Oscar waiting in Valhalla, shiny and chrome, for Fury Road (or for Creed, if push came to shove).

But I'm not upset to see Spotlight win, either. The movie may have been underwhelming for me, but its subject deserves all the attention it gets (whether or not, as former NYT designer Khoi Vinh wonders, the Globe fully capitalizes on it). My only real concern is that soon it'll be mostly valuable as a historical document, with the kind of deep reporting that it portrays either dying or dead.

To recap: Spotlight centers on the Boston Globe's investigation into the Catholic Church's pedophilia scandals in the 1990s — and specifically, into how the church covered up for abusive priests by moving them around or assigning them to useless "rehabilitation" sessions. The paper not only proved that the church was aware of the problem, but also demonstrated that it was far more common than anyone suspected. It's one of the most important, influential works of journalism in modern memory, done by a local newsroom.

It's also a story of successful data journalism, which I feel is all too rare: while my industry niche likes to talk itself up, our track record is shorter than many of us like to admit. The data in question isn't complex — the team used spreadsheets and data entry, not scripting languages or visualizations — but it represents long hours of carefully entering, cleaning, and checking data to discover priests who were shuffled out of public view after reports of abuse. Matt Carroll, the team's "data geek," writes about that experience here, including notes on what he'd do differently now.

So it's very cool to see the film getting acclaim. At the same time, it's a love letter to an increasingly small part of the news industry. Investigative teams are rare these days, and many local papers don't have them anymore. We're lucky that we still have them at the Seattle Times — it's one of the things I really like about working there.

Why do investigative teams vanish? They're expensive, for one thing: a team may spend months, or even a year, working on a story. They may need legal help to pursue evidence, or legal protection once a story is published. And investigative stories are not huge traffic winners, certainly not proportional to the effort they take. They're one of the things newsrooms do on principle, and when budgets get tight, those principles often start to look more negotiable than they used to.

In this void, there are still a few national publishers pursuing investigations, both among the startups (Buzzfeed, which partnered on our mobile home stories) and the non-profits (ProPublica and the Marshall Project). I'm a big fan of the work they're doing. Still, they're spread thin trying to cover the whole country, or a particular topic, leaving a lot of shadows at the local level that could use a little sun.

It's nice to imagine that the success of Spotlight the movie will lead to a resurgence in funding for Spotlight the investigative department, and others like it. I suspect that's wishful thinking, though. In the end, that Oscar isn't going to pay for more reporters or editors. If even Hollywood glamor can't get reporters and editors funded, can anything?

February 5, 2016

Filed under: journalism»articles

Catch-up: 2015

The last thing I'd written about here was the paper's investigation into police shootings, so let's take this chance to wander through the rest of 2015.

In October, after a Minnesota dentist shot Cecil the lion and made himself temporarily infamous, one of our reporters put in a records request for all historical animal imports into the USA. The resulting story involved querying through seven and a half million rows of data to find out what we import, and how Paul Allen's Initiative 1401 (which banned the resale of several species of animal trophies) would affect these imports (answer: hardly at all). We also got to do some fun visualizations for it.

In November, my teammate Audrey worked with the Seattle Sketcher to create a voiced history of Ravensdale, a boomtown destroyed after a mining accident. In general, audio slideshows aren't hugely successful online, but I think this one was a really pleasant experience, and analytics indicate that a lot of people listened to it.

Every year, during the Seahawks season, the paper does a series of "paper hawks" — foldable paper dolls for players on the team. The last one is blank, so people can put in their own faces. To make things interesting, I put together a paper hawk web app that could use a camera to take a picture of the reader, do all the customization in the page (including changing skin tones and hair color), and then print it out. This was an interesting project in part because the API I used (getUserMedia) is restricted to HTTPS in Chrome. To make it work, we moved all of our projects to secure domains, which was a great test case for encrypting additional content at the paper.

For MLK Day, my team revived the Seattle Times' tribute to the great man, which was originally published twenty years ago (and had been last updated in 2011). The new version is responsive and easier to update, so that each year we can add more information to it. It's fitting, of course, that the paper has a page just for Dr. King, since they were a major part of the campaign to rename King County in his honor back in 1995. It's pretty cool to keep that tradition going.

Finally, just this week, we published a Pacific NW Magazine story on modern dating, with an interactive "mini-documentary" that I built with our video team. Based on your answers, it generates a custom playlist from the interviews that we recorded. We were inspired by this great piece done by the Washington Post on "the N word." I really enjoyed putting the interactions and animation together, but honestly, most of the credit goes to our video team, and my work was just the window dressing.

These are just the major interactives, of course. All told, we built 84 projects of all sizes last year, not including various small pages built by the producers using our app template. That's a pretty good rate of production for a two-developer team. Here's to a busy 2016!

January 21, 2016

Filed under: journalism»professional

Unconferencing

How do we level up data journalists? In a few months, we'll have a new digital/data intern at the Times, and so I've been asking myself this question quite a bit, especially in light of our team's efforts to recruit diverse candidates. There are a lot of students and young journalists out there with a little bit of training, but no idea where to go from there: how do we get them across the gap to where they're capable of working on a newsroom development team? There's a catch-22 at work here: it's especially tough for aspiring news devs to get a job without experience, but they can't get experience without the job.

One strategy I've often heard is that young people should attend industry conferences as a way to learn from experienced journalists and build connections. Myself, I'm skeptical of this. Conferences have never really been a part of my professional life. We didn't go to them at CQ, and I never got a chance to go to GDC when I worked in the game industry. After I was hired at the paper, I got to go to SND2015 and Write the Docs, and this year I'm heading to NICAR, SRCCON, and (possibly) CascadiaJS. It's possible I really hate myself.

Visiting conferences is rewarding, but it's also exhausting, expensive, and a huge time-sink. And while host organizations often work to mitigate that through scholarships and grants to disadvantaged communities, it's still a big ask for neophytes. Even if I weren't skeptical of the benefits conferences actually bring, I think it's hard to argue that we don't need better, more accessible solutions.

The way I see it, there are three things that you get out of a conference as a young person:

  1. Mentorship
  2. Training
  3. Exposure to developing industry trends

Of the three, the first is the hardest to duplicate, and yet it's the most crucial. Networks are powerful in this industry, and you can practically watch them develop before your eyes if you look closely: young people who catch a break early with the right people, and find themselves quickly elevated with opportunities to work on well-known teams, fill industry panels, and write insipid Nieman Lab think-pieces on the future of news. Then we all end up competing over hiring those same six people, which I don't really think is healthy.

Ironically, this is something I want to discuss with other newsrooms at the conferences this year, before I retreat into my Seattle cave for the rest of my natural life. But I'm also starting a personal initiative to make myself available for "remote mentorship," and asking other people to do so. If you're in news and would like to join, feel free to add yourself to the sheet, and I'll share it with students or other people who get in touch!

October 14, 2015

Filed under: journalism»industry

AMPed up

This morning, you can read my opinions (along with those of three other newsroom developers) on AMP, Google's proposed ultra-fast publishing format. I'm the most optimistic of the four, even though I wouldn't say that I'm enthusiastic. I think it's an interesting format, and possibly a kick in the pants for the business side of the industry.

In the last question of the interview, I talk a little bit about how I don't think site performance is a topic of actual discussion for product managers at news organizations, and as a result speed is still not a priority for them. What I didn't get in, but wish I had, is that I'm not sure they're wrong about that. Certainly, performance is important and third party code has run rampant on mobile pages. But is that really what's killing us?

I think it's worth remembering that this whole conversation started, in part, because Facebook decided that they want to be a publisher. Of course, nobody with a firm grasp on reality would think that handing full control of all their content over to Facebook is a good idea, so Zuckerberg's posse needed to create an incentive. Instant Articles ensued: in a burst of publicity, Facebook announced that the web was "slow" (with a lot of highly suspect numbers quantifying that slowness) and proposed their publication system as a way to speed it up.

Since in general we like nothing more than talking about how awful our industry is, journalists leapt to join in: why yes, now that you mention it, look how slow our sites are! Clearly, that's the problem (and not, say, the fact that Facebook holds our referral traffic hostage). It's the same reaction the industry has every time Apple releases a new device — cue exhaustive (and exhausting) ruminations on how to create compelling smartwatch content. Yuck.

This is not the first time that Facebook has created panic around the open web in order to make its social racket seem more appealing. In 2011, Anil Dash wrote his infamous post Facebook is gaslighting the web, documenting their practice of putting scary warnings on outgoing links while privileging their (short-lived) "seamless sharing" program. I think we should be careful about accepting their premises, even when they seem to jibe with the larger conversations around web technology.

Which brings us back to the question: should we care that news sites are slow?

My thought is that from a technical side, we should obviously care. Everyone on the web cares about speed. It has a proven effect on things like purchases and on-site time. It's an important metric, and one we should absolutely take seriously. But from a product standpoint, is it the most important thing? No. It's a Product X, and Product X will not save journalism (that post is from 2010, and sure enough, I think I've linked to it once a year since). It's easier to pitch a silver bullet than to admit the harder truth: that the key to our success is putting out journalism that is good enough that people will pay for it, one way or another.

It's possible, unfortunately, that there is no general-audience journalism good enough to make people pay for it anymore. And in that case, we are all doomed, with the possible exception of the NYT and whatever hipster media startups can get Comcast to cough up $200 million in funding for. So it goes. But if we're going to be doomed, I'd rather be honest about why that is. It's not because we're slow. It's not because the ads are horrible. It's because our readers didn't think what we put out was important enough to pay for. That's enough of a tragedy on its own.

October 1, 2015

Filed under: journalism»investigation

Shielded by the law

This weekend, The Seattle Times released our investigation into Washington's "evil intent" law, which makes it almost impossible to prosecute police officers for the use of deadly force: Shielded by law. This was a great project to work on, and definitely an issue I'm proud we could bring to a wider audience. The source code for it is available here.

This project lives on a page entirely outside of our CMS, in order to support the outstanding trailer video that our photo department put together, which plays at the end of the animation if your device supports inline video (sorry, iOS). We didn't want to jump directly into the video, because A) it has sound, and B) many readers might find its visuals disturbing. Using our common dot motif to pull people in first, and then giving them a choice of watching the video, seemed like a nice strategy. The animation is all done in the DOM, mostly just by adding classes on a timer — the only time that JS touches individual elements is to randomize the fade-in for the dots. Loading the page without JavaScript shows the final image of the animation, via a handy no-js class that the scripts remove before starting playback.

One of our interesting experiments in this story was the use of embedded quiz questions, asking people to test their preconceived notions of police shootings. Originally we intended to scatter these throughout the story to grab readers' attention, but a section on the numerical results of the investigation ended up spoiling the answers. Instead, we moved them to a solid block before that section, and it's been well-received. The interactive graphics were actually also a relatively last-minute addition: originally, we were just going to re-run the print graphics, but exposing all the data in a responsive way was just too useful to pass up.

Probably the most technically advanced part of the page is the audio transcript from the 1985 state senate hearing on the law. As the audio plays, the transcript auto-advances and highlights the current line. It also displays a photo of the speaker from the hearing, to help readers get an idea of the players involved. Clicking on the transcript scrubs the audio to the correct spot. We don't do a lot of audio work here, unfortunately, but I think having an interface that's friendly to readers and listeners alike is a really nice touch, and something I do want to take advantage of on future projects. We built it to generate the data from standard subtitle files, so it should be easy to revisit.
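As a rough illustration of that conversion step (this is a hypothetical Python sketch with invented filenames, not the script we actually used), the idea is just to read the subtitle cues, turn the timestamps into seconds, and write out a list the player can step through:

    import json
    import re

    # Illustrative sketch only: convert SRT-style subtitle cues into a
    # JSON list of {start, end, text} objects for a transcript player.
    CUE = re.compile(
        r"(\d+:\d+:\d+[.,]\d+)\s*-->\s*(\d+:\d+:\d+[.,]\d+)\s*\n(.*?)(?:\n\n|\Z)",
        re.S,
    )

    def to_seconds(stamp):
        hours, minutes, seconds = stamp.replace(",", ".").split(":")
        return int(hours) * 3600 + int(minutes) * 60 + float(seconds)

    with open("hearing.srt", encoding="utf-8") as f:
        raw = f.read()

    cues = [
        {"start": to_seconds(a), "end": to_seconds(b), "text": " ".join(text.split())}
        for a, b, text in CUE.findall(raw)
    ]

    with open("transcript.json", "w", encoding="utf-8") as out:
        json.dump(cues, out, indent=2)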

Lastly, one of the most important parts of the story is the least flashy: the table in the "by the numbers" section for deadly force rates by race/ethnicity. We had worked for a while with this information presented the same as the other trivia questions, via clickable dots, but found that the part we really wanted to stress (the relative rates of death proportional to the general population) didn't stand out as much as we wanted. We brainstormed through a few different alternate visualizations, including stacked bars and nested pie charts, but in the end it was just clearest to build a table.

Like Rodney Dangerfield, they may get no respect, but a well-designed table can often be the simplest, easiest way to get a point across. The question then is, what's a well-designed table? Personally, I think there's a whole post in that question — how you order the columns, effective sorting/filtering, and how to add extra features (embedded sparklines, detail expansion, and tree views) that add information without confusing readers. One day, maybe I'll write it. But in the meantime, if you're working on a similar project and can't quite figure out how to present your information, there's no shame in using a table if it serves the story.

September 15, 2015

Filed under: journalism»industry

Value Ad

Welcome to the block party:

The math is even starker for smaller publications and individual bloggers, who rely more heavily on display advertising—and who have already been battered by shifts in the advertising market; some longtime professional bloggers, like Heather Armstrong, have given up writing their blogs full-time. The Awl's publisher Michael Macher told me that "the percentage of the network’s revenue that is blockable by adblocking technology hovers around seventy-five to eighty-five percent." Currently, readers use an ad blocker on around twenty-five percent of all pageviews. Nicole Cliffe, one of the founders of The Toast, said that "adblocker is brutal for us. And people always break out the 'Subscribers model! I donate twenty bucks a year!' thing but it doesn’t add up."

I'm finding myself thinking about adblocking a lot this week, and about publishing platforms. I spend a lot of time thinking about this in general, because I enjoy working for a Seattle newspaper and I would like it to still be here (in one form or another) fifteen years from now (at least), something which was never guaranteed but looks noticeably more tenuous these days. And the upcoming launch of easy, widespread mobile ad-block software is a big part of that.

Bad apples

You can't say that the ad industry has not done anything to deserve this, because of course they have. Online advertising has always been the place where incompetent programming and delusional management meet in a nexus of terrible. You're not a bad person if you work in ads, but you work for a bad business and in all seriousness I will help you go work somewhere else if you get in touch with me. Contact info is on the right.

The problems that advertising causes for web pages are well documented. Ads slow pages down. They're heavy and disruptive. They cause security risks and drive-by hacks. There is a strong argument that a lot of the (admittedly welcome) improvements in web programming technique comes from having to work around these issues: lazy-loading content, async scripts, module systems that can't be stomped by leaky ad code globals.

As a side note, in these discussions, one of the big elephants in the room is that Google (and Facebook, and Apple, and Twitter) are all ad companies. Which is true, but it's true in the way that we might say that insects are a good source of protein — you're still not going to sell me a grasshopper sandwich. Lumping Google in with the average fly-by-night agency may be technically correct, but anyone who has interacted with regular ad code will tell you that the two are miles apart. If Google were actually the people writing the ads you see on an average media site, we probably wouldn't be having this discussion.

Well, we might. Apple might still have decided to stick their thumb in Google's eye out of pure spite, because they're a nasty little gang of capitalists, and that's kind of what they do. But it doesn't matter, because the really smart people at Google aren't writing actual ads. They write very elegant, high-performing auction software that distributes other people's horrible, horrible code, thus undermining quite a bit of their moral high ground. It's a little hard to get mad at readers who want to run content blockers or Greasemonkey scripts or whatever. Of course you want to block these ads! Who wouldn't?

Disruption and its discontents

We have a bad habit in the news industry, which is that we have no faith in our ability to run a business, even though we speculate on it endlessly. Allison Hantschel has been writing posts like this for literally a decade now as a result. One word for the embrace of clear management-led self-sabotage is "trusting." Another word is "suckers."

Newsrooms are very good at grilling other organizations about their plans, and very bad at interrogating our own, in part because we're supposed to have a "wall" between the business and editorial sides of the enterprise. These days that wall is often porous, but the tradition is still there. So when the business half of a paper tells editors and reporters that running obnoxious ads is necessary, we don't often push back, even though we don't want to run them any more than readers want to see them.

This is an explanation, not an excuse. That said, it is inescapably true that the business models we chose, as an industry, are not proving to be as solid as they once were — and it is worth remembering that journalism really was (and in many cases, still is) wildly profitable. Craigslist killed off the classifieds, and content blockers will probably suck all the profit out of the banner-ad revenue stream. Ironically, the one strategy that's still surprisingly sound is printing the previous day's events into a complicated stack of folded paper and selling it for a buck or two. It's not a growth industry, but it seems to be relatively disruption-proof so far. Nobody seems very clear on how to take that model online, though, except by digitizing old people a la Kurzweil and counting on them to pay for content (probably a long shot).

The thing about Silicon Valley's lust for disruption is that, absent any principles other than a libertarian belief in market power, it tends to just recentralize or recreate the pre-disruption problems. So instead of having a corrupt taxi bureaucracy, now we have a corrupt Uber oligarchy, where half the cars you see in the app are fake and they're probably selling your ride history to data merchants in Russia for pennies on the dollar. You don't have to like the taxi system to think that this is kind of a bum deal. Similarly, you don't have to be a fan of advertising, or of advertising-supported journalism, to think that the inevitable outcomes of blocking display ads will range from bad to worse.

Personally, I think it's healthy to feel wildly uneasy with this entire dynamic, in which tech companies decide to target one bad actor and inflict collateral damage on an entire industry with a nonchalant wave of their hand. I think it's normal to believe that publishers are getting what they deserve for decades of bad management, and still feel like wiping them out is overkill. It's reasonable to think we should have control over the experience as users, while also arguing that media companies need to pay the bills somehow. But then, I'm not exactly disinterested, myself.

Brought to you by everybody, and nobody

I have a post that's been incubating for about two months now, about riot grrrl and open source. I started thinking about it when I watched The Punk Singer, a shockingly-good documentary about Bikini Kill frontwoman Kathleen Hanna. And the story of the whole movement that she founded (along with a number of other influential women) is fascinating, because it's based on an entire ethic of self-publication and self-determination. They didn't like the commercial media that they had, so they made new media of their own and taught people how to do the same. To me, that's how open source should feel: undermining centralized power and giving the means of production back to the people.

But there's another way of looking at that, which is to say that riot grrrl zines never changed much of anything and the old open web got lost in the shuffle. We can romanticize both of them as much as we want, but at the end of the day they weren't capable of surviving against moneyed interests, and no amount of self-mythologizing is going to change that. That doesn't mean we should give up, but we need to be realistic about the gap between "should be" and "is," because we're in the middle of it now: readers should pay for journalism; they actively don't want to do so.

Our grim meathook media future

Here's one difficult truth: if you are a reporter, editor, or other news human in the year of our lord 2015, your fate is almost certainly on the web. The New York Times and the hot youth flavor of the day (Vox, Vice, Buzzfeed) may get invited into Instant Articles or Apple News, but everyone else is on their own. App-only publications have been tried, and failed, even with the force of Rupert Murdoch behind them. That leaves the web as the place where a diverse, free press can exist, especially once those print revenues finally dry up.

Here's another: the web is always going to grapple with hostile ads, because it's a platform built on remixing and embedding third-party content. The same things that let advertisers abuse your mobile connection also allow us to host comments via Disqus, or embed media from Twitter or Youtube, or create neat interactive features. Open platforms are messier, which is part of why they grow so effectively, and also why they have a hard time competing with closed, curated platforms. Nobody's going to make it easy for us.

Between those two difficult truths is a spectrum of uncomfortable options, ranging from paywalls to subscriptions to (most likely) bankruptcy. As Casey Johnston says in the Awl piece that opens this post, the likely outcome is the rapid eradication of many sites that currently scrape by on Doubleclick revenue. The small and the quirky are going to take the hit here, even if they're not so small: The Dissolve was shuttered earlier this year, despite a pretty impressive stable of contributors and support, and they won't be the last.

In the very long term, we all die alone. I hesitate to make any other predictions. But I suspect that the eventual fallout of these changes is the hollowing-out of the American media: two big national papers at the top; a horde of niche publications clinging, white-knuckled, to subsistence at the bottom; and not very much in the middle except the non-profits who have opted out of the entire rat race. That this arrangement parallels our national economic inequality is probably not a coincidence, but we're long past the point where anyone wants to hear a systemic critique. Will your favorite publication survive? It's time to spin the wheel and find out.

July 22, 2015

Filed under: journalism»industry

Covering letters

It's a low bar to clear, but I think I can honestly say that journalism has a better diversity record than tech. If there were a newsroom the size of Facebook, chances are high it would have hired more than 7 black people last year. But that doesn't mean we can't do better. And if we're going to talk about hiring in journalism, we need to talk about interns.

NPR's visuals team has decided to try making internships more diverse, by being transparent about their requirements. Basically, they want to be clear about the expectations around cover letters and interviewing, so that people from non-privileged backgrounds know to prepare for them. I know and like several members of the team there, so I'm going to give them the benefit of the doubt when they say that there's more to come, but as a diversity program this seems a bit thin.

Firstly, a post on a little-trafficked blog is not exactly a high-visibility broadcast (said post isn't linked from any of the open internship positions as far as I could tell). It's easy for people to miss. More importantly, if the team is finding that cover letters and interviews are excluding good candidates, maybe the point should be to change the way that those are evaluated (or drop them entirely). Perhaps cover letters are not a great criterion for picking interns, or the way you're looking at them is biased in some way.

My own thoughts on this are complicated, not least because I see the playing field being artificially manipulated from all sides. I'm always amazed when I teach workshops at UW and hear that students may be on their fourth or fifth internship. They're behaving rationally — a lot of journalism careers are founded on student internships — but it's still bizarre to think that the path to a newsroom job might require literally years of unpaid or low-paying labor. If nothing else, there are a lot of people for whom that's just not an option.

Perhaps this is why, as CJR noted in a just-published report, minority journalists aren't finding jobs at rates proportional to graduation. In fact, minorities who graduated with a degree in journalism were 17% less likely to find a print journalism job than their white counterparts, versus only a 2% difference in advertising. As Alex Williams states:

Overall, only 49 percent of minority graduates that specialized in print or broadcasting found a full-time job, compared to 66 percent of white graduates. These staggering job placement figures help explain the low number of minority journalists. The number of minorities graduating from journalism programs and applying for jobs doesn’t seem to be the problem after all. The problem is that these candidates are not being hired.

I think the lessons from this are two-fold. First, I think we should be better about spreading internships out to a wider range of students. That's partly about selecting more diverse candidates, but it's also about turning down candidates who have already interned several times over in favor of those who need more of a boost. Internships are about experience, but they're also a way of pre-selecting who we want in the newsrooms of the future by burnishing their resumes. It's great to see NPR taking some responsibility, small or not, for their role in the pipeline. Hopefully other organizations will follow suit.

Additionally, maybe we should be less interested in internships as hiring criteria in the first place. Although my corner of the field is a little atypical, many of the best digital journalists I know didn't enter the field through a traditional career path (myself included). If our goal is to diversify our newsrooms, being accepting of a variety of different backgrounds and experiences is part of how we get there. So a candidate didn't have an internship. So what? Can they write? Can they edit? Can they code?

I often worry about over-stressing credentials in journalism. Sure, it helps separate the wheat from the chaff, but it also brushes over the fact that what we do just isn't that hard. We go places, talk to people, and then write it down and give it to other people to read. You don't need a degree for that (as Michael Lewis aptly chronicled more than 20 years ago), and you shouldn't necessarily need an internship. As a community, we mourned the passing of David Carr, but we haven't learned the lessons he taught to writers like Ta-Nehisi Coates, about hiring "knuckleheads" and molding them into the industry we want to be. And until we do, we will still struggle to find newsrooms that reflect modern American diversity.

June 8, 2015

Filed under: journalism»professional

Paper Anniversary

It's ironic, I guess, that I was so busy at the Seattle Times a couple of weeks ago that I forgot to write about my one-year anniversary here. Anyway: it's been quite a year! I've done real estate visualizations, provided an overview of Oso Valley development, and covered the Washington state elections. I did much of the development on our major investigative pieces, Loaded with Lead and Sell Block (not to mention graphs and narrative interactives for the Warren Buffett mobile home investigation). I made a Seahawks fan map so good that the team outright stole it for themselves. For the local architecture buffs, I worked on a building quiz, and for the beer fans I helped build the landing page for our Brew with Us project. Want to know where the May Day protests went? I built a map for that. And this is just the big stuff.

In addition to the externally-facing development, I've been working on building tools that are used by the rest of the newsroom. I think our news app scaffolding is as good as anyone's in this business. We're leading the industry in custom element development, with responsive frames, Leaflet maps, and more. The watermarking tool I made on my second day is still in use, and will probably outlive me entirely at this point.

I have always had a low threshold for boredom, a character flaw that's led to overpacking for every trip I've ever taken and a general inability to read literary fiction. I love working in a newsroom for many reasons, but one of the greatest has always been that I am rarely bored here, and when it does happen, it never lasts more than two weeks. I cannot recommend this job highly enough for technical people who want to have an impact, or journalists who want to break out of a single beat. Working at the Seattle Times has been the most fun I've had at a job in a long time. I can't wait to see what the next year brings.

"As I look back over a misspent life, I find myself more and more convinced that I had more fun doing news reporting than in any other enterprise. It is really the life of kings."

--H.L. Mencken

May 20, 2015

Filed under: journalism»industry

Instant Noodles

Like all of Facebook's attempts to absorb the news industry, their new Instant Articles will probably follow a predictable timeline, and it basically looks like this:

  • 2015: Facebook introduces Instant Articles, in which a few media partners push their content directly into Facebook's servers, and (in the iPhone app only) it gets rendered without leaving the application. "Content," in this case, even includes the publisher's own ad and tracking systems.
  • 2016: The program expands to other publishers, albeit possibly with a few more "refinements" (read: restrictions) on what those publishers are allowed to do. It becomes fashionable in the newsroom to harass me about it.
  • Late 2016: Once Instant Articles gets some traction, Facebook finds a way to sabotage or undercut it. Either they'll introduce more restrictions on allowable features, or they'll lower the frequency at which the posts appear and charge newsrooms to "promote" them (or both — why take half measures?).
  • 2017: Noting that the magical promised ad dollars have not materialized (or are eaten up by tithes back to The Algorithm), media organizations start quietly reducing their Instant Article publishing rate. Jeff Jarvis writes a sad editorial about it.
  • 2018: Claiming it was an "educational" experiment, Facebook shuts down the program. Rumors begin circulating about its VR news platform, in which the New York Times will publish for Oculus Rift.

Instant Articles is not the first time Facebook has tried to take over the web, and it won't be the last. They're very bad at it, probably because they're the original kings of empty promises: working with Facebook is a constant stream of exasperation, until either you realize that they're incapable of maintaining a stable API/business relationship, or you slit your wrists. They've done it to game developers (goodbye, Farmville), to other newsrooms (remember Washington Post Social Reader?), and to anyone else who's tried to build on the various Facebook "platforms."

Lots of people have written very smart reactions to the Instant Articles announcement — I'm partial to Josh Marshall's behind-the-scenes take, John Herrman's spiral of bemused horror, and Zeynep Tufekci's reminder that Facebook cannot be trusted to engage honestly with its role as gatekeeper.

It's probably more fun to engage with the self-proclaimed "controversial" opinions, like this profoundly dumb thought-leadering from MG Siegler:

With Instant Articles, Facebook has not only done a 180 from what Mark Zuckerberg has called the company's biggest mistake, they've now done another lap just to prove a point.

They did a 180, and then took a lap, so... they ran the race backwards, which is a good thing? Somewhere, Tom Friedman feels a twinge of jealousy.

Not only is the web not fast enough for apps, it's not fast enough for text either. And you know what, they're right.

"They're right" that an app loading pre-cached text can be faster than a web browser downloading that same text from the network, yes. Apparently our plan now is just to restrict your reading material to what Facebook can download ahead of time. I hope you like Upworthy lists.

Though, in a way, Facebook itself really is just a web browser. It's just a different, newfangled one for a new era. A mobile era.

A different, newfangled web browser that only goes to Facebook, apparently. Who would want to read anything else? In the future, all websites are Facebook. (Ironically, according to the Instant Articles FAQ, they're fed from HTML anyway, so they're not even really that "new." But it's probably too much to expect Siegler to do research.)

Siegler's not the only person I've seen celebrating Facebook's move as an end to the open web (by which we mean HTML/CSS/JavaScript), although he's certainly one of the most gleeful (he also thinks Facebook should shut down its website entirely, in case you were wondering about the general quality of his business advice). Of course, you'll notice that these hot takes are not themselves published to Facebook, or to a native app somewhere. If that were the case, no one would have heard of them. They get posted to the web, where they can get linked and shared across social media, and read regardless of platform or hardware.

Even without bringing in ideology, the "native apps instead of the web" idea faces a tremendous number of problems once you think about it for more than thirty seconds. How do new publications like The Toast or FiveThirtyEight get traction when you have to manually download them from an app store to read them? If they get popular through the web first, why bother transitioning to native? Nobody makes "reader" apps for desktops and laptops, so what happens to them? Does anyone really want to write long-form on Facebook, a service that only recently added an "edit post" button? Who cares: punditry is hard, let's go shopping!

It's easy to pick on shallow people who think Instant Articles represent a grand utopian state, but I'd also like to celebrate people who are actually building in the opposite direction. This weekend, I went to a Knight-Mozilla code convening in Portland, which included a ticket to the Write the Docs convention. I'm not a documentation writer, really, so most of the conference went right past me. But the keynote on the second day was by Ward Cunningham, inventor of the wiki, and it was a fascinating look at what it would really look like to reinvent the web.

For the past few years, Cunningham's been working on "federated" wikis, which store content on multiple servers instead of using a single database. If you link to another person's wiki page and you want to change the content, you fork it a la GitHub, and edit the new local copy (which remembers its origin) right there in your browser. You can also drag-and-drop content into a new page, if you want to merge text from multiple sources. It's pretty neat. The talk isn't online, but he did another presentation at New Relic that covers similar material.

Parts of Cunningham's pitch can sound kind of crankish, although I'm sure I would have said the same thing for the original wiki. But other parts are really interesting, such as the idea of creating a forkable attribution trail for data and reporting. Federated wikis are another attempt to decentralize and diversify the Internet, instead of walling it up behind a corporation's control. And a lot of it is inspired by the main insight that wikis had in the first place: on a wiki, you create a page by first creating a hyperlink to it, then following that link.

As a result, even though users don't directly type HTML into the window, this form of authorship is profoundly of the web, and it's the kind of thing that's never going to exist in a native application somewhere. The fact that Cunningham can experiment with adding new markup features in JavaScript — and even turn a browser into a new kind of hypertext reader, with a different interface paradigm — is what the web platform does best. Like water, it can flow, or it can crash.

And that's why it's ultimately ridiculous to act like some pre-cached news articles are the herald of a new media age. What the web gives us — a freedom for anyone to publish to everyone, a wildly cross-platform programming environment, a rich multimedia container where your plain-text article can live right next to my complex news app — is not going to be superseded by a bunch of native apps, and certainly not by Facebook. Instant Articles won't even be the future of news. Future of the web? Give me a break.
