Mile Zero http://www.milezero.org/index.php this space intentionally left blank en Bloxsom on PHP Emu Nation http://www.milezero.org/index.php/gaming/perspective/emu_nation.html It's hard to hear news of Nintendo creating a tiny, $60 NES package and not think of Frank Cifaldi's provocative <a href="http://www.gdcvault.com/play/1023470/-It-s-Just-Emulation">GDC talk on emulation</a>. Cifaldi, who works on game remastering and preservation (most recently on a <i>Mega Man</i> collection), covers a wide span of really interesting industry backstory, but his presentation is mostly infamous for the following quote: <blockquote> <p> The virtual console is nothing but emulations of Nintendo games. And in fact, if you were to download Super Mario Brothers on the Wii Virtual Console... <p> <i>[shows a screenshot of two identical hex file dumps]</i> <p> So on the left there is a ROM that I downloaded from a ROM site of Super Mario Brothers. It's the same file that's been there since... it's got a timestamp on it of 1996. On the right is Nintendo's Virtual Console version of Super Mario Brothers. I want you to pay particular attention to the hex values that I've highlighted here. <p> <i>[the highlighted sections are identical]</i> <p> That is what's called an iNES header. An iNES header is a header format developed by amateur software emulators in the '90s. What's that doing in a Nintendo product? I would posit that <b>Nintendo downloaded Super Mario Brothers from the internet and sold it back to you.</b> </blockquote> <p> As Cifaldi notes, while the industry has maintained a strong official anti-emulation stance for years, Nintendo in particular has also turned emulation into a regular revenue stream. In fact, Nintendo has used scaremongering about emulation to monopolize the market for any games that were published on its old consoles. In this case, the miniature NES coming to market in November is almost certainly running an emulator inside its little plastic casing. 
It's not that they're opposed to emulation, so much as they're opposed to emulation that they can't milk for cash. <p> To fully understand how demented this has become, consider the case of <i>Yoshi's Island</i>, which is one of the greatest platformers of the 16-bit era. I am terrible at platformers but I love this game so much that I've bought it at least three times: once in the Game Boy Advance port, once on the Virtual Console, and once as an actual SNES cartridge back when Belle and I lived in Arlington. Nintendo made money on at least two of those copies. But now that we've sold our Wii, if I want to play <i>Yoshi's Island</i> again, even though I have owned three legitimate copies of the game, I would still have to give Nintendo more money. Or I could grab a ROM and an emulator, which seems infinitely more likely. <p> By contrast, I recently bought a copy of <i>Doom</i>, because I'd never played through the last two episodes. It ran me about $5 on Steam, and consists of the original WAD files, the game executable, and a preconfigured version of DOSBox that hosts it. I immediately went and installed <a href="http://chocolate-doom.org">Chocolate Doom</a> to run the game fullscreen with better sound support. If I want to play <i>Doom</i> on my phone, or on my Chromebook, or whatever, I won't have to buy it again. I'll just copy the WAD. And since I got it from Steam, I'll basically have a copy on any future computers, too. <p> (Episode 1 is definitely the best of the three, incidentally.) <p> Emulation is also at the core of the Internet Archive's groundbreaking work to preserve digital history. They've preserved thousands of games and pieces of software via browser ports of MAME, MESS, and DOSBox. That means I can load up a copy of <a href="https://archive.org/details/msdos_broderbund_print_shop">Broderbund Print Shop</a> and relive summer at my grandmother's house, if I want. 
But I can also pull up the <a href="https://archive.org/details/canoncat">Canon Cat</a>, a legendary and extremely rare experiment from one of the original Macintosh UI designers, and see what a radically different kind of computing might look like. There's literally no other way I would ever get to experience that, other than emulating it. <p> The funny thing about demonizing emulation is that we're entering an era of digital entertainment that may be unpreservable with or without it. Modern games are updated over the network, plugged into remote servers, and (on mobile and new consoles) distributed through secured, mostly-inaccessible package managers on operating systems with no tradition of backward compatibility. It may be impossible, 20 years from now, to play a contemporary iOS or Android game, similar to the way that Blizzard themselves <a href="http://www.gamasutra.com/view/news/274750/To_run_WoW_legacy_servers_Blizzard_must_reverseengineer_its_own_game.php">can't recreate a decade-old version of <i>World of Warcraft</i></a>. <p> By locking software up the way that Nintendo (and other game/device companies) have done, as single-platform binaries and not as reusable data files, we're effectively removing those games from history. Maybe in a lot of cases, that's fine &mdash; in his presentation, Cifaldi refers offhand to working on a mobile <i>Sharknado</i> tie-in that's no longer available, which is not exactly a loss for the ages. But at least some of it has to be worth preserving, in the same way even bad films can have lessons for directors and historians. The Canon Cat was not a great computer, but I can still learn from it. <p> I'm all for keeping Nintendo profitable. I like the idea that they're producing their own multi-cart NES reproduction, instead of leaving it to third-party pirates, if only because I expect their version will be slicker and better-engineered for the long haul. 
But the time has come to stop letting them simultaneously re-sell the same ROM to us in different formats, while insisting that emulation is solely the concern of pirates and thieves. Thu, 14 Jul 2016 20:04:29 -0700 http://www.milezero.org/index.php/gaming/perspective/emu_nation.html/gaming/perspective Under our skin http://www.milezero.org/index.php/culture/america/race_and_class/under_our_skin.html This week, we've launched a major project at the Times on the words people use when talking about race in America. <a href="http://projects.seattletimes.com/2016/under-our-skin/">Under our skin</a> was spearheaded by a small group of journalists after the paper came under fire for some bungled coverage. I think they did a great job &mdash; the subjects are well-chosen, the editing is top-notch, and we're trying to supplement it with guest essays and carefully-curated comments (as opposed to our usual all-or-nothing approach to moderation). I mostly watched from the sidelines on this one, as our resident expert on forcing Brightcove video to behave in a somewhat-acceptable manner, and it was really fascinating watching it take shape. Mon, 20 Jun 2016 13:56:25 -0700 http://www.milezero.org/index.php/culture/america/race_and_class/under_our_skin.html/culture/america/race_and_class Speaking schedule, 2016 http://www.milezero.org/index.php/random/personal/speaking_schedule_2016.html After NICAR, I wasn't really sure I ever wanted to go to any conferences ever again &mdash; the travel, the hassle, the expense... who needs it? But I am also apparently unable to moderate my extracurricular activities in any way, even after leaving a part-time teaching gig, so: I'm happy to announce that I'll be speaking at a couple of professional conferences this summer, albeit about very different topics. <p> First up, I'll be facilitating a session at SRCCON in Portland about <a href="http://srccon.org/sessions/#proposal-312019">designing humane news sites</a>. 
This is something I've been thinking about for a while now, mostly with regards to bots and "conversational UI" fads, but also as the debate around ads has gotten louder, and the ads themselves have gotten worse (<a href="https://rewire.news/article/2016/05/25/anti-choice-groups-deploy-smartphone-surveillance-target-abortion-minded-women-clinic-visits/">see also</a>). I'm hoping to talk about the ways that we can build both individual interactives and content management systems so that we can minimize the amount of accidental harm that we do to our readers, and retain their trust. <p> My second talk will be at <a href="http://2016.cascadiafest.org/speakers/">CascadiaFest</a> in beautiful Semiahmoo, WA. I'll be speaking on how we've been using custom elements in production at the Times, and encouraging people to build their own. The speaker list at Cascadia is completely bonkers: I'll be sharing a stage with people who I've been following for years, including Rebecca Murphey, Nolan Lawson, and Marcy Sutton. It's a real honor to be included, and I've been nervously rewriting my slides ever since I got in. <p> Of course, by the end of the summer, I may never want to speak publicly again &mdash; I may burn my laptop in a viking funeral and move to Montana, where I can join our <a href="http://www.seattletimes.com/seattle-news/editor-kathy-best-leaving-the-seattle-times/">departing editor</a> in some kind of backwoods hermit colony. But for right now, it feels a lot like the best parts of teaching (getting to show people cool stuff and inspire them to build more) without the worst parts (grading, the school administration). Thu, 26 May 2016 10:30:38 -0700 http://www.milezero.org/index.php/random/personal/speaking_schedule_2016.html/random/personal Behind the Times http://www.milezero.org/index.php/tech/web/behind_the_times.html The paper recently launched a new native app. I can't say I'm thrilled about that, but nobody made me CEO. 
Still, the technical approach it takes is "interesting:" its backing API converts articles into a linear stream of blocks, each of which is then hand-rendered in the app. That's the plan, at least: at this time, it doesn't support non-text inline content at all. As a result, a lot of our more creative digital content doesn't appear in the app, or is distorted when it does appear. <p> The justification given for this decision was speed, with the implicit statement being that a webview would be inherently too slow to use. But is that true? I can't resist a challenge, and it seemed like a great opportunity to test out some new web features I haven't used much, so I decided to try building a client. You can find the code <a href="https://github.com/thomaswilburn/seatimes-chrome">here</a>. It's currently structured as a Chrome app, but that's just to get around the CORS limit since our API doesn't have the Access-Control-Allow-Origin headers added. <p> The app uses a technique that's been popularized by Nolan Lawson's <a href="http://www.pocketjavascript.com/blog/2015/11/23/introducing-pokedex-org">Pokedex.org</a>, in which almost all of the time-consuming code runs in a Web Worker, and the main thread just handles capturing UI events and re-rendering. I started out with the worker process handling <a href="https://github.com/thomaswilburn/seatimes-chrome/blob/master/src/js/worker/seatimes.js">network and caching in IndexedDB</a> (the poor man's Service Worker), and then expanded it to do <a href="https://github.com/thomaswilburn/seatimes-chrome/blob/master/src/js/worker/sanitize.js">HTML sanitization as well</a>. There's probably other stuff I could move in, but honestly I think it's at a good balance now. <p> By putting all this stuff into a second script that runs independently, it frees up the browser to maintain a smooth frame rate in animations and UI response. 
It's not just the fact that I'm doing work elsewhere, but also that there's hardly any garbage collection on the main thread, which means no halting while the JavaScript VM cleans up. I thought building an app this way would be difficult, but it turns out to be mostly similar to writing any page that uses a lot of AJAX &mdash; <a href="https://github.com/thomaswilburn/seatimes-chrome/blob/master/src/js/worker/routes.js">structure the worker as a "server"</a> and the patterns are pretty much the same. <p> The other new technology that I learned for this project is <a href="http://mithril.js.org">Mithril</a>, a virtual DOM framework that my old coworkers at ArenaNet rave about. I'm not using much of its MVC architecture, but its view rendering code is great at gradually updating the page as the worker sends back new data: I can generate the initial article list using just the titles that come from one network endpoint, and then <a href="https://github.com/thomaswilburn/seatimes-chrome/blob/master/src/js/ui/sectionView.js#L63">add the thumbnails that I get from a second, lower-priority request</a>. Readers get a faster feed of stories, and I don't have to manually synchronize the DOM with the new data. <p> The metrics from this version of the app are (unsurprisingly) pretty good! The biggest slowdown is the network, which would also be a problem in native code: loading the article list for a section requires one request to get the article IDs, and then one request for each article in that section (up to 21 in total). That takes a while &mdash; about a second, on average. On the other hand, it means we have every article cached by the time that the user can choose something to read, which means the time for requesting and loading an individual article hovers around 150ms on my Chromebook. <p> That's not to say that there aren't problems, although I think they're manageable. 
For one thing, the worker and app bundles are way too big right now (700KB and 200KB, respectively), in part because they're pulling in a bunch of big NPM modules to do their processing. These should be lazy-loaded for speed as much as possible: we don't need HTML parsing right away, for example, and deferring it would cut a good 500KB off of the worker's initial size. Every kilobyte of script is roughly 1ms of load time on a mobile device, so spreading that out will drastically speed up the app's startup time. <p> As an interesting side note, we could cut almost all of that weight if the <var>document.implementation</var> object was available in Web Workers. Weir, for example, does all its parsing and sanitization <a href="https://github.com/thomaswilburn/Weir/blob/master/public/js/Service.Sanitize.js#L30">in an inert document</a>. Unfortunately, the DOM isn't thread-safe, so nothing related to <var>document</var> is available outside the main process, and I suspect a serious sanitization pass would blow past our frame budget anyway. Oh well: <var>htmlparser2</var> and friends it is. <p> Ironically, the other big issue is mostly a result of packaging this up as a Chrome app. While that lets me talk to the CMS without having CORS support, it also comes with a fearsome content security policy. The app shell can't directly load images or fonts from the network, so we have to load article thumbnails through JavaScript manually instead. Within Chrome's <var>&lt;webview&gt;</var> tag, we have the opposite problem: the webview can't load anything from the app, and it has a weird protocol location when loaded from a data URL, so all relative links have to be rewritten. It's not insurmountable, but you have to be pretty comfortable with the way browsers work to figure it out, and the debugging can get a little hairy. <p> So there you have it: a web app that performs like native, but includes support for features like DocumentCloud embeds or interactive HTML graphs. 
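<p> If the worker-as-"server" idea sounds abstract, here's roughly what it looks like in practice. To be clear, this is a sketch with invented route names and message shapes, not code from the actual repo:

```javascript
//Worker side: messages dispatch to a table of route handlers,
//just like URL routes on a real server (names are illustrative).
var routes = {
  getArticle: function(params) {
    //in the real app, this would hit the network and the IndexedDB cache
    return { id: params.id, title: "Headline" };
  }
};

var handle = function(msg) {
  //echo the request id back so the caller can match the reply
  return { id: msg.id, data: routes[msg.route](msg.params) };
};
//in the worker itself, this is wired up as:
//  onmessage = function(e) { postMessage(handle(e.data)); };

//Main-thread side: wrap postMessage in a tiny RPC client, so talking
//to the worker feels just like making AJAX calls to a server.
var makeClient = function(worker) {
  var counter = 0;
  var pending = {};
  worker.onmessage = function(e) {
    var callback = pending[e.data.id];
    delete pending[e.data.id];
    callback(e.data.data);
  };
  return function(route, params, callback) {
    pending[counter] = callback;
    worker.postMessage({ id: counter++, route: route, params: params });
  };
};
```

<p> From there, the UI code just calls something like <var>client("getArticle", { id: 7 }, render)</var> and repaints when the data comes back; everything slow happens off the main thread.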
At the very least, I think you could use this to advocate for a hybrid native/web client on your news site. But there's a strong argument to be made that this could be your <i>only</i> app: add a Service Worker and (in Chrome and Firefox) it could load instantly and work offline after the first visit. It would even get a home screen icon and push notification support. I think the possibilities for <a href="https://addyosmani.com/blog/getting-started-with-progressive-web-apps/">progressive web apps</a> in the news industry are really exciting, and building this client makes me think it's doable without a huge amount of extra work. Tue, 10 May 2016 15:24:10 -0700 http://www.milezero.org/index.php/tech/web/behind_the_times.html/tech/web Reporting with Python http://www.milezero.org/index.php/journalism/education/reporting_with_python.html This month, I'm teaching a class at the University of Washington on reporting with Python. This seems like an odd match for me, since I hardly ever work with Python, but I wanted to do a class that was more journalism-focused (as opposed to the front-end development that I normally teach) and teaching first-time programmers how to do data analysis in Node just isn't realistic. If you're interested in following along, the repository with the class materials is located <a href="https://github.com/thomaswilburn/reporting-with-python">here</a>. <p> I'm not the Times' data reporter, so I don't get to do this kind of analysis often, but I always really enjoy it when I do. The danger when planning a class on a fun topic is that it's easy to over-stuff the curriculum in my eagerness to cover the techniques that I think are particularly interesting. To fight that impulse, I typically make a list of material I want to cover, then cut it in half, then think about cutting it in half again. As a result, there's a lot of stuff that didn't make it in &mdash; SQL and web scraping chief among them. 
<p> What's left, however, is a pretty solid base for reporters who are interested in starting to use code to generate and explore stories. Last week, we cleaned and searched 1,000 text files for a string, and this week we'll look at doing analysis on CSV files. In the final session, I'm planning on taking a deep dive into regular expressions: so much of reporting is based around interrogating text files, and the nice thing about an education in regex is that it will travel into almost any programming language (as well as being useful for many command line tools like grep or sed). <p> If I can get anything across in this class, I'm hoping to leave students with an understanding of just how big digital scale can be, and how important it is to have tools for handling it. I was talking one night with one of the Girl Develop It organizers, who works for a local analytics company. Whereas millions of rows of data is a pretty big deal for me, for her it's a couple of hours on a Saturday &mdash; she's working at a whole other order of magnitude. I wouldn't even know where to start. <p> Right now, most record requests and data dumps operate more at my scale. A list of <a href="http://www.seattletimes.com/seattle-news/environment/thousands-of-exotic-animals-are-shipped-through-seattle-each-year/">all animal imports/exports in the US for the last ten years</a> is about 7 million records, for example. That's approachable with Python, although you'd be better off learning some SQL for the heavy lifting, but it's past the point where Excel is useful, and it certainly couldn't be explored by hand. If you can't code, or you don't have access to someone who does, you can't write that story. <p> At some point, the leaks and government records that reporters pore over may grow to a larger kind of scale (leaks, certainly; government data will likely stay aggregated as long as there are privacy concerns). 
When that happens, reporters will have to develop the kinds of skills that I don't have. We already see hints of this in the tremendous tooling and coordination required for investigating <a href="https://source.opennews.org/en-US/articles/people-and-tech-behind-panama-papers/">the Panama papers</a>. But in the meantime, I think it's tremendously important that students learn how to automate data at a basic level, and I'm really excited that this class will introduce them to it. Fri, 29 Apr 2016 10:04:06 -0700 http://www.milezero.org/index.php/journalism/education/reporting_with_python.html/journalism/education Calculated Amalgamation http://www.milezero.org/index.php/tech/coding/calculated_amalgamation.html In a fit of nostalgia, I've been trying to get my hands on a TI-82 calculator for a few weeks now. TI BASIC was probably the first programming language in which I actually wrote significant amounts of code: although a few years later I'd start working in C for PalmOS and Windows CE, I have a lot of memories of trying to squeeze programs for speed and size during slow class periods. While I keep checking Goodwill for spares, there are plenty of TI calculator emulation apps, so I grabbed one and loaded up a TI-82 ROM to see what I've retained. <p> Actually, TI BASIC is <i>really</i> weird. Things I had forgotten: <ul> <li> You can type in all-caps text if you want, but most of the time you don't, because all of the programming keywords (<var>If</var>, <var>Else</var>, <var>While</var>, etc.) are actually single "character" glyphs that you insert from a menu. <li> In fact, pretty much the only code that's typed manually are variable names, of which you get 26 (one for each letter). There are also six arrays (max length 99), five two-dimensional matrices (limited by memory), and a handful of state variables you can abuse if you really need more. Everything is global. 
<li> Variables aren't stored using <var>=</var>, which is reserved for testing, but with a left-to-right arrow operator: <var>value &rarr; dest</var>. I imagine this clears up a lot of ambiguity in the parser. <li> Of course, if you're processing data linearly, you can do a lot without explicit variables, because the result of any statement gets stored in <var>Ans</var>. So you can chain a lot of operations together as long as you just keep operating on the output of the previous line. <li> There's no debugger, but you can hit the On key to break at any time, and either quit or jump to the current line. <li> You can call other programs and they do return after calling, but there are no function definitions or return values other than <var>Ans</var> (remember, everything is global). There is GOTO, but it apparently causes memory leaks when used (thanks, Dijkstra!). </ul> <p> I'd romanticized it over time &mdash; the self-contained hardware, the variable-juggling, the 1-bit graphics on a 96x64 screen. Even today, I'm kind of bizarrely fascinated by this environment, which feels like the world's most cumbersome register VM. But loading up the emulator, it's obvious why I never actually finished any of my projects: TI BASIC is legitimately a terrible way to work. <p> In retrospect, it's obviously a scripting language for a plotting library, and not the game development environment I wanted it to be when I was trying to build Wolf3D clones. You're supposed to write simple macros in TI BASIC, not full-sized applications. But as a bored kid, it was a great playground, and the limitations of the platform (including its molasses-slow interpreter) made simple problems into brainteasers (it's almost literally the challenge behind <i>TIS-100</i>). <p> These days, the kids have it way better than I did. A micro:bit is cheaper and syncs with a phone or computer. A Raspberry Pi is a real computer of its own, as is the average smartphone. 
And a laptop or Chromebook with a browser is miles more productive than a TI-82 could ever be. On the other hand, they probably can't sneak any of those into their trig classes and get away with it. And maybe that's for the best &mdash; look how I turned out! Thu, 14 Apr 2016 22:48:16 -0700 http://www.milezero.org/index.php/tech/coding/calculated_amalgamation.html/tech/coding ES6 in anger http://www.milezero.org/index.php/tech/web/es6_in_anger.html One of the (many) advantages of running Seattle Times interactives on an entirely different tech stack from the rest of the paper is that we can use new web features as quickly as we can train ourselves on them. And because each news app ships with an isolated set of dependencies, it's easy to experiment. We've been using a lot of new ES6 features as standard for more than a year now, and this seems like a good time to talk about how to use them effectively. <p> <h4>The Good</h4> <p> Surprisingly (to me at least), the single most useful ES6 feature has been arrow functions. The key to using them well is to restrict them only to one-liners, which you'd think would limit their usefulness. Instead, it frees you up to write much more readable JavaScript, especially in array processing. As soon as it breaks to a second line (or seems like it might do so in the future), I switch to writing regular function statements. <pre><code> //It's easy to filter and map: var result = list.filter(d => d.id).map(d => d.value); //Better querySelectorAll with the spread operator: var $ = s => [...document.querySelectorAll(s)]; //Fast event logging: map.on("click", e => console.log(e.latlng)); //Better styling with template strings: var translate = (x, y) => `translate(${x}px, ${y}px);`; </code></pre> <p> Template strings are the second biggest win, especially as above, where they're combined with arrow functions to create text snippets. 
Having a multiline string in JS is very useful, and being able to insert arbitrary values makes building dynamic popups or CSS styles enormously simpler. I love writing template strings for quick chunks of templating, or embedding readable SQL in my Node apps. <p> Despite the name, template strings aren't real templates: they can't handle loops, they don't really do interpolation, and the interface for using "tagged" strings is cumbersome. If you're writing very long template strings (say, more than five lines), it's probably a sign that you need to switch to something like Handlebars or EJS. I have yet to see a "templating language" built on tagged strings that didn't seem like a wildly frustrating experience, and despite the industry's shift toward embedded DSLs like React's JSX, there is a benefit to keeping different types of code in different files (if only for widespread syntax highlighting). <p> The last features I've really embraced are destructuring and shorthand object literals. They're mostly valuable for cleanup, since all they do is cut down on repetition. But they're pleasant to use, especially when parsing text and interacting with CommonJS modules. <pre><code> //Splitting dates is much nicer now: var [year, month, day] = dateString.split(/\/|-/); //Or getting substrings out of a regex match: var re = /(\w{3})mlb_(\w{3})mlb_(\d+)/; var [match, away, home, index] = gameString.match(re); //Exporting from a module can be simpler: var x = "a"; var y = "b"; module.exports = { x, y }; //And imports are cleaner: var { x } = require("module"); </code></pre> <h4>The bad</h4> <p> I've tried to like ES6 classes and modules, and it's possible that one day they're going to be really great, but right now they're not terribly friendly. Classes are just syntactic sugar around ES5 prototypes &mdash; although they look like Java-esque <var>class</var> statements, they're still going to act in surprising ways for developers who are used to traditional inheritance. 
And for JavaScript programmers who understand how the language actually works, class definitions boast a weird, comma-less syntax that's <i>sort of</i> like the new object literal syntax, but far enough off that it keeps tripping me up. <p> The turning point for the new <var>class</var> keyword will be when the related, un-polyfillable features make their way into browsers &mdash; I'm thinking mainly of the new Symbols that serve as feature flags and the ability to extend Array and other built-ins. Until that time, I don't really see the appeal, but on the other hand I've developed a general aversion to traditional object-oriented programming, so I'm probably not the best person to ask. <p> Modules also have some nice features from a technical standpoint, but there's just no reason to use them over CommonJS right now, especially since we're already compiling our applications during the build process (and you have to do that, because browser support is basically nil). The parts that are really interesting to me about the module system &mdash; namely, the configurable loader system &mdash; aren't even fully specified yet. <h4>New discoveries</h4> <p> Most of what we use on the Times' interactive team is restricted to portions of ES6 that can be transpiled by Babel, so there are a lot of features (proxies, for example) that I don't have any experience using. In a Node environment, however, I've had a chance to use some of those features on the server. When I was writing our <a href="https://github.com/seattletimes/mlb-scraper/">MLB scraper</a>, I took the opportunity to try out generators for the first time. <p> Generators are borrowed liberally from Python, and they're basically constructors for custom iterable sequences. You can use them to make normal objects respond to language-level iteration (i.e., <var>for ... of</var> and the spread operator), but you can also define sequences that don't correspond to anything in particular. 
In my case, I created a generator for the calendar months that the scraper loads from the API, which (when hooked up to the command line flags) lets users restart an MLB download from a later time period: <pre><code> //feed this a starting year and month var monthGen = function*(year, month) { while (year < 2016) { yield { year, month }; month++; if (month > 12) { month = 1; year++; } } }; //generate a sequence from 2008 to 2016 var months = [...monthGen(2008, 1)]; </code></pre> <p> That's a really nice code pattern for creating arbitrary lists, and it opens up a lot of doors for JavaScript developers. I've been reading and writing a bit more Python lately, and it's been amazing to see how much a simple pattern like this, applied language-wide, can really contribute to its ergonomics. Instead of the Stream object that's common in Node, Python often uses generators and iteration for common tasks, like reading a file line-by-line or processing a data pipeline. As a result, I suspect most new Python programmers need to survey a lot less intellectual surface area to get up and running, even while the guts underneath are probably less elegant for advanced users. <p> It surprised me that I was so impressed with generators, since I haven't particularly liked Python very much in the past. But in reading the <a href="http://chimera.labs.oreilly.com/books/1230000000393/index.html">Cookbook</a> to prep for a UW class in Python, I've realized that the two languages are actually closer together than I'd thought, and getting closer. Python's <var>class</var> implementation is actually prototypical behind the scenes, and its use of duck typing for built-in language features (such as the <a href="https://docs.python.org/2/reference/datamodel.html#context-managers"><var>with</var> statement</a>) bears a strong resemblance to the work being done on JavaScript Promises (a.k.a. "then-ables") and iterator protocols. 
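<p> The iterator protocol is a good illustration of that convergence: any object that exposes a <var>[Symbol.iterator]</var> method (and a generator method provides one for free) plugs directly into <var>for ... of</var> and the spread operator, which is exactly the kind of duck typing Python has always leaned on. A quick sketch, using an invented roster object rather than anything from the scraper:

```javascript
//Any plain object can opt into language-level iteration by defining
//a Symbol.iterator method; a generator is the easiest way to write one.
var roster = {
  players: ["Cano", "Cruz", "Seager"],
  *[Symbol.iterator]() {
    //yield* delegates to the array's own built-in iterator
    yield* this.players;
  }
};

//Spread and for...of now work on the object itself:
var names = [...roster]; // ["Cano", "Cruz", "Seager"]
for (var player of roster) {
  console.log(player);
}
```

<p> No base class, no interface declaration: if it quacks like an iterable, the language treats it as one.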
<p> It's easy to be resistant to change, especially when it's at the level of a language (computer or otherwise). I've been critical of a lot of the decisions made in ES6 in the past, but these are positive additions on the whole. It's also exciting, as someone who has been working in JavaScript at a deep level, to find that it has new tricks, and to stretch my brain a little integrating them into my vocabulary. It's good for all of us to be newcomers every so often, so that we don't get too full of ourselves. Tue, 22 Mar 2016 18:43:02 -0700 http://www.milezero.org/index.php/tech/web/es6_in_anger.html/tech/web Seventy-two http://www.milezero.org/index.php/politics/national/executive/seventy_two.html Since it is election season, when I ran out of library books last week I decided to re-read Hunter S. Thompson's <i>Fear and Loathing: On the Campaign Trail '72</i>, as I do every four years or so. Surprisingly, I don't appear to have written about it here, even though it's one of my favorite books, and the reason I got into journalism in the first place. <p> <i>On the Campaign Trail</i> is always a relevant text, but it feels particularly apt this year. In the middle of the Trump presidential run, the book's passage on the original populist rabble-rouser, George Wallace, could have been written yesterday if you just swap some names &mdash; not to mention the whirlwind chaos of the primaries and a convention battle. On the other hand, with writing this good, there's really no wrong time to bring it up. <p> Even before his death in 2005, when most people thought about Thompson, what usually came to mind was wild indulgence: drugs, guns, and "bat country." Ironically, <i>On the Campaign Trail</i> makes the strong case that his best writing was powerfully controlled and focused, not loose and hedonistic: the first two-thirds of the book (or even the first third) contain his finest work. 
After the Democratic National Convention, and the resulting breakdowns in Thompson's health, the analysis remains sharp but the writing never reaches those heights again. <p> That's not to say that the book is without moments of depravity &mdash; his account of accidentally unleashing a drunken yahoo on the Muskie whistlestop tour is still a classic, not to mention the extended threat to chop the big toes off the McGovern political director &mdash; but it's never random or undirected. For Thompson, wild fabrication is the only way to bring readers into the surreal world of a political race. His genius is that it actually works. <p> Despite all that, if <i>On the Campaign Trail</i> has a legacy, it's not the craziness, the drugs, or even the politics. The core of the book is two warring impulses that drive Thompson at every turn: sympathy for the voters who pull the levers of democracy, and simultaneously a deep distrust of the kind of people that they reliably elect. The union of the two is the fuel behind his best writing. Or, as he puts it: <blockquote> <p> The highways are full of good mottos. But T.S. Eliot put them all in a sack when he coughed up that line about... what was it? Have these Dangerous Drugs fucked my memory? Maybe so. But I think it went something like this: <p> "Between the Idea and the Reality... Falls the Shadow." <p> The Shadow? I could almost <i>smell</i> the bastard behind me when I made the last turn into Manchester. It was late Tuesday night, and tomorrow's schedule was calm. All the candidates had zipped off to Florida &mdash; except for Sam Yorty, and I didn't feel ready for that. <p> The next day, around noon, I drove down to Boston. The only hitchhiker I saw was an eighteen-year-old kid with long black hair who was going to Reading &mdash; or "Redding," as he said it &mdash; but when I asked him who he planned to vote for in the election he looked at me like I'd said something crazy. <p> "What election?" he asked.
<p> "Never mind," I said. "I was only kidding." </blockquote> Wed, 16 Mar 2016 22:19:43 -0700 http://www.milezero.org/index.php/politics/national/executive/seventy_two.html/politics/national/executive Spotlit http://www.milezero.org/index.php/journalism/industry/spotlit.html Judging by my peers, it's possible that I'm the only journalist in America who didn't absolutely love <i>Spotlight</i>. I thought it was a serviceable movie, but when it comes to this year's Best Picture award I still harbor a fantasy that there's an Oscar waiting in Valhalla, shiny and chrome, for <i>Fury Road</i> (or for <i>Creed</i>, if push came to shove). <p> But I'm not upset to see <i>Spotlight</i> win, either. The movie may have been underwhelming for me, but its subject deserves all the attention it gets (whether or not, as former NYT designer Khoi Vinh wonders, <a href="http://www.subtraction.com/2016/03/01/is-the-boston-globe-doing-enough-with-spotlights-oscar-win/">the Globe fully capitalizes on it</a>). My only real concern is that soon it'll be mostly valuable as a historical document, with the kind of deep reporting that it portrays either dying or dead. <p> To recap: <i>Spotlight</i> centers on the Boston Globe's investigation into the Catholic Church's pedophilia scandals in the 1990s &mdash; and specifically, into how the church covered up for abusive priests by moving them around or assigning them to useless "rehabilitation" sessions. The paper not only proved that the church was aware of the problem, but also demonstrated that it was far more common than anyone suspected. It's one of the most important, influential works of journalism in modern memory, done by a local newsroom. <p> It's also a story of successful data journalism, which is genuinely rare: while my industry niche likes to talk itself up, our track record is shorter than many of us like to admit.
The data in question isn't complex &mdash; the team used spreadsheets and data entry, not scripting languages or visualizations &mdash; but it represents long hours of carefully entering, cleaning, and checking data to discover priests that were shuffled out of public view after reports of abuse. Matt Carroll, the team's "data geek," writes about that experience <a href="https://medium.com/3-to-read/spotlight-the-movie-a-personal-view-f0fa39900afc">here</a>, including notes on what he'd do differently now. <p> So it's very cool to see the film getting acclaim. At the same time, it's a love letter to an increasingly small part of the news industry. Investigative teams are rare these days, and many local papers don't have them anymore. We're lucky that we still have them at the Seattle Times &mdash; it's one of the things I really like about working there. <p> Why do investigative teams vanish? They're expensive, for one thing: a team may spend months, or even a year working on a story. They may need legal help to pursue evidence, or legal protection once a story is published. And investigative stories are not huge traffic winners, certainly not proportional to the effort they take. They're one of the things newsrooms do on principle, and when budget gets tight, those principles often start to look more negotiable than they used to. <p> In this void, there are still a few national publishers pursuing investigations, both among the startups (Buzzfeed, which partnered on our mobile home stories) and the non-profits (Pro Publica and the Marshall Project). I'm a big fan of the work they're doing. Still, they're spread thin trying to cover the whole country, or a particular topic, leaving a lot of shadows at the local level that could use a little sun. <p> It's nice to imagine that the success of <i>Spotlight</i> the movie will lead to a resurgence in funding for Spotlight the investigative department, and others like them. I suspect that's wishful thinking, though. 
In the end, that Oscar isn't going to pay for more reporters or editors. If even Hollywood glamor can't get reporters and editors funded, can anything? Thu, 03 Mar 2016 21:04:25 -0800 http://www.milezero.org/index.php/journalism/industry/spotlit.html/journalism/industry Excelsior? http://www.milezero.org/index.php/culture/pop/comics/excelsior.html Marvel Comics has a digital subscription service called "Marvel Unlimited" that's basically Netflix for their comics: access to most of their archives online for ten bucks a month or so. I decided to give it a shot after Ta-Nehisi Coates kept singing its praises. I buy a few trades a year, but don't always keep them on my shelf, and I figured this was a good chance to go trolling through a few classics that aren't collected in print anymore. <p> Is it worth it? Well, usually. It turns out that Marvel's back catalog is hardly immune to Sturgeon's Law: most of it is crap. It doesn't help that it's almost all superhero-flavored, which is fine in small doses but starts to feel a little ridiculous when you're exposed to literally thousands of titles and they've all got capes: really, this is all you have? Sure, it's Marvel and that's what they do, but knowing that there's a <a href="https://en.wikipedia.org/wiki/Fun_Home">broad</a> <a href="https://en.wikipedia.org/wiki/Lumberjanes">range</a> of <a href="https://en.wikipedia.org/wiki/Seconds_%28comics%29">other</a> <a href="https://en.wikipedia.org/wiki/Saga_%28comic_book%29">stories</a> being told in this medium makes their genre limitations feel all the more jarring. <p> Marvel's other bad habit, which only seems to have gotten worse as far as I can tell, is the "special events" that make it impossible to just read through a single storyline. For example, trying to read through the new X-Men titles is an exercise in frustration, since they keep being interrupted or pre-empted by crossover stories from other books. 
As a way to sell comics to a hardcore faithful, it probably works pretty well. But to a relative newcomer, it's disorienting and irritating, as though a medical drama came crashing into your favorite sitcoms at random intervals. <p> As a result, it's not surprising that my favorite series to read so far have been either standalone humor titles or oddball takes on the genre. Dan Slott's 2004 run on <i>She-Hulk</i> (often referred to as "Single Green Female") is more legal workplace drama than anything else, and while it sometimes gets too clever with the meta-humor, it delivers a nice, funny, self-contained story arc. Ditto for <i>The Superior Foes of Spider-Man</i>, which ran for 17 issues and follows a set of petty, incompetent super-thieves who get in way over their heads. <i>X-Men: Legacy</i> is another short storyline focusing on Charles Xavier's son, David, who has some legitimate disagreements with his "peaceful" father's violent vigilante organization. With its frequent trips into psychic psychedelia, it makes a great case for the infinite effects budget that comics so rarely exploit. <p> On the other side of the coin, I went trolling through Walt Simonson's tenure on <i>Thor</i>, which ran back in the 1980's and often gets mentioned as a stellar example of classic comics writing. It's pretty good! But it's also a decidedly weird artifact: while there's overlap with the rest of the Marvel universe from time to time, most of the story is a kind of bonkers faux-Norse legend, with characters taking oaths of honor, pursuing doomed love, and striking off on various quests. The most impressive thing, from a modern perspective, is how many storylines it manages to juggle per issue. There's an A, B, C, and sometimes even a D plot, all playing out in 30-page chunks. <p> But by far my favorite discovery has been the original reason I signed up: Priest's late-90's <i>Black Panther</i>, which is a really fascinating, thought-provoking bit of work.
While parts of the art and dialog have not aged gracefully, a lot of it continues to feel very current, both in terms of subject matter and storytelling. <p> From the very start, and throughout the rest of the book, Priest emphasizes that T'Challa (the titular Panther) is not just a vigilante out to fight crime, like other superheroes. He's the king of a country &mdash; a legitimate state power with an entirely different set of priorities and concerns. To drive that point home, Priest frames the narrative as a series of progress reports from the US liaison to T'Challa, Everett Ross, a move that turns out to be an elegant narrative hat trick: <ul> <li> Being a white State Department functionary, Ross can explain the political element of the books and serve as an audience surrogate for the largely white readership. <li> He's useful as comic relief, which is good, since the story arcs themselves revolve around political coups and international sovereignty, and can get a little byzantine. <li> He's <b>a terrible narrator</b>, which sets up a running gag where each issue starts disastrously <i>in medias res</i> and then unshuffles itself as Ross is forced to double back and explain the situation. </ul> <p> It's a comic book, so of course there are goofy action scenes, and much like the current crop of comic-inspired movies, these rarely rise above "vaguely interesting." But when I think back to the most memorable pages, it's mostly the quieter or more subversive scenes. Most of the real plot happens in dialog: negotiations between the Panther and other governments, discussions of succession and history, sarcastic asides that mock the standard superhero schtick. Along the way, Priest is happy to extend a scene for either pathos or awkward humor, to undercut his own pretension, or let characters react to <i>The Black Panther</i>'s quietly revolutionary core &mdash; an African nation that's portrayed as a technological superpower of its own.
As Coates says, when talking about his own plans to write for the character: <blockquote> <p> It's obviously not the case, but T'Challa &mdash; the Black Panther and mythical ruler of Wakanda &mdash; has always struck me as the product of the black nationalist dream, a walking revocation of white supremacist myth. T'Challa isn't just a superhero in the physical sense, he is one of the smartest people in the world, ruling the most advanced civilization on the planet. Wakanda's status as ever-independent seems to eerily parallel Ethiopia's history as well as its place in the broader black imagination. Maybe it's only me, but I can't read Jason Aaron's superb "<a href="http://comicvine.gamespot.com/black-panther-39-see-wakanda-and-die-part-1/4000-134635/">See Wakanda And Die</a>" and not think of <a href="https://en.wikipedia.org/wiki/Battle_of_Adwa">Adowa</a>. <p> Comic book creators, like all story-tellers, get great mileage out of myth and history. But given the society we live in, some people's myths are privileged over others. Some of that is changing, no doubt. In the more recent incarnations of T'Challa you can see Christopher Priest invoking the language of the Hausa or Reginald Hudlin employing the legacy of colonialism. These were shrewd artistic decisions, rooted in the fact that anyone writing Black Panther enjoys an immediate, if paradoxical, advantage: the black diaspora is terra incognita for much of the world. What does the broader world really know of Adowa? Of Nanny and Cudjoe? Of the Maji-Maji rebellion? Of Legba and Oshun? Of Shine? Of High John The Conqueror? T'Challa's writers have always enjoyed access to a rich and under-utilized pool of allusion and invocation. </blockquote> <p> It's a proudly Afrocentric (and Afrofuturist) book, way ahead of its time, and put out by a major comics publisher. I imagine there are a lot of people for whom these throwaway, cheaply-printed comics were profound experiences when they were young.
It's hard to imagine how much of that material can translate through to the eventual movie version, even when directed by a thoughtful and talented filmmaker like Ryan Coogler. But kids who go looking for the originals after they see it in theaters are in for a real surprise. Wed, 17 Feb 2016 21:23:53 -0800 http://www.milezero.org/index.php/culture/pop/comics/excelsior.html/culture/pop/comics