This summer I've started abusing the e-book lending program from the Seattle Public Library. The process itself is kind of stupid (why do I need to be on a waiting list for a digital file? Why can't I download it over the cell network?) but it's cheap and the selection's not bad. Unfortunately, the library doesn't keep a separate list of what you've borrowed, and I went and cleaned up my Amazon list, so I'm having to write part of this from memory. Truly, these are awful times.
For a long time, I didn't particularly have any feelings one way or the other about Connie Willis. She wrote that one depressing book about the black plague, is about all I could tell you. Then I read To Say Nothing of the Dog, which is probably the funniest time-travel book I've found (granted, not a terribly hilarious genre), and figured I'd give her another chance (the library has a ton of Willis available). Passage was probably the first title I picked up. A book about the science (and pseudo-science) of near-death experiences, it has all of the hallmarks of her work: sympathetic characters trapped in a quagmire of bureaucracy, chaos, and cheerful incompetence--in this case, a hospital filled with quirky patients and ever-shifting construction. I don't think I reacted quite as strongly as Jo Walton to the twist ending, but I sympathize. It's a funny book, but (as is also often the case with Willis) maddeningly paced. I would probably recommend it anyway.
Bellwether is also oddly paced, but probably funnier and without the existential angst. It's about a scientist working for the HyTech corporation, trying to find out what causes fads, while fending off waves of trendy management policies and a disastrously bad office assistant. In a last-ditch effort to keep funding, she pairs up with a chaos theory mathematician using sheep for experimental subjects, even though neither of them knows the first thing about sheep. Willis has a gift for running jokes in dialog that she uses here, as every character has their own competing obsessions running roughshod over everyone else's. It's not an unpredictable book, which is funny for a story about chaos theory, but the enjoyment is in the journey.
Baratunde Thurston's How to Be Black is ostensibly a satirical how-to guide, but really it's a memoir. In between chapters like "How to Be The (Next) Black President" and "How to Speak for All Black People", Thurston (a former web editor for The Onion) writes about what it was like to grow up in DC as a black kid in a militant vegetarian, pro-black household, attend private school with the politically-connected, and finally head off to Harvard. Honestly, I could have done with more of these stories, which are funny and glib in a self-deprecating way, and fewer of the guidebook chapters, which start to feel like filler. At 272 pages, the book was a perfect library read: short enough to get through without endangering late fees on anything else, funny enough I didn't mind the length, and cheap enough I didn't feel cheated that some of those 272 pages were filler.
In addition to the library, I've also spent the summer reading through books that were downloaded to my Kindle literally years ago, but that I'd never gotten around to reading--mostly from book giveaways, before the publishers decided the real path to e-book success was to charge way too much for them. This means a number of terrible mystery novels and some decidedly mediocre fantasy. But one book stands out for being more bizarre than anything else in my backlog.
Flash, by L.E. Modesitt, is the simple story of a man named Jonat who consults on product placement in a not-quite-dystopian future, where ad jingles often include subliminal harmonics to create brand identification. Except he's also an ex-Special Ops soldier with a bunch of cybernetic enhancements that somehow the government just forgot to turn off. Hired to do some political consulting that goes vaguely wrong, Jonat finds himself on the wrong end of an enormous corporate conspiracy. This is the point where most protagonists would find some way to expose the malfeasance and cleverly maneuver their enemies into harmlessness. Jonat, on the other hand, responds with a bizarre rampage of assassination and murder. Despite all evidence, the book seems convinced that Jonat is a fine, upstanding person--after a couple of bombings, shootings, and fatal traffic accidents, there's a moderately happy ending, in which he starts dating the emancipated clone body of a police AI.
"Strange" is perhaps not even the right word for it. It's as though Heinlein decided to start writing knockoffs of The Bourne Identity. I almost think you should read it.
My third quarter of teaching Intro to Programming and Intro to JavaScript at SCCC ends today. Over the last eight months, I've learned a bit about what gives new programmers trouble, and how to teach around those problems. I definitely wouldn't suggest JavaScript as a first language unless the students already know HTML and CSS very well--otherwise, they're learning three languages at once, and that's more than a little overwhelming.
Outside of class I've also started to daydream about ways to teach the basics of programming, not just for web development, but in general. I think a simpler (but still dynamic) language is probably best for beginners--even though I think its whitespace model is insane, Python (or its cheery little cousin Ruby) would probably be a good option. It's got a good standard library, not a lot of extraneous punctuation or syntax, and it would get people to indent their code (which, for some reason I can't understand, is like pulling teeth no matter how many times you show people that it makes the code more readable). More importantly, it's straightforward imperative code--none of the crazy functional malarkey that emerges, marmot-like, whenever JavaScript starts doing any kind of user interaction or timing. And people actually use Python, so it's not like you're teaching them Lisp or something.
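To show what I mean by that malarkey: even a simple countdown drags in callbacks and closures the moment timing gets involved. Here's a minimal sketch (the function name and numbers are mine, purely for illustration):

```javascript
// A countdown that prints one number per second. Even this tiny
// program requires passing an anonymous function to setTimeout.
function countdown(n) {
  if (n === 0) return;  // base case: stop at zero
  console.log(n);
  // The function below is a closure over n--a concept beginners
  // hit long before they're comfortable with functions at all.
  setTimeout(function() {
    countdown(n - 1);
  }, 1000);
}

countdown(5); // prints 5, 4, 3, 2, 1
```

That anonymous function is perfectly idiomatic JavaScript, but it's a lot to absorb when you're still shaky on what a function even is. The Python equivalent is a loop with a sleep call, and nobody blinks.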
But let's pretend we weren't bound by the idea of "teach skills that are directly marketable"--i.e., let's pretend I'm not working for a community college (note: there's nothing wrong with teaching marketable skills, it's just a thought exercise). What's a good way to introduce people to the basic problems of programming, like looping and conditionals and syntax?
What about assembly?
Let's go ahead and get some reasons out of the way as to why you shouldn't teach programming with assembly. First, it offers no abstractions or standard libraries, so students won't be learning any immediately useful skills they can take outside the classroom. It's architecture-specific, meaning they probably can't even take it from one computer to another. And of course, assembly is not friendly. It doesn't have nice, easy-to-type keywords like "print" and "echo" and "document.getElementById" (okay, so that's not all bad).
What you gain, given the right choice of architecture, is simplicity. Assembly does not give you syntax for loops, or for functions. It doesn't give you structures. You get some memory, a few registers to serve as variables, and some very basic control structures. It's like BASIC, but without all the user-friendliness. Students who have trouble keeping track of what their loops are doing, or how functions work, might respond better to the simple Turing tape-like flow of assembly.
But note that huge caveat: given the right choice of architecture. Ideally, you want a very short instruction set, so students don't have to learn very much. That probably rules out x86. The 6502 has a reasonably small set of opcodes, but they're organized in that hilarious table of addressing modes and operand combinations, which is kind of crazy. The 68000 looks like a bigger version of the 6502 to me. Chip-8 might work, but I don't really like the sprite system; I'd rather just have text output. What we need is an artificial VM designed for teaching--something that's minimal but fun to work with.
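Just to sketch what "minimal but fun" might look like, here's a toy register machine in JavaScript--the instruction names and encoding are invented for illustration, not any real architecture:

```javascript
// A toy VM for teaching: two registers, a program counter, and a
// conditional jump as the only control structure.
function run(program) {
  var reg = { A: 0, B: 0 }; // a few registers to serve as variables
  var pc = 0;               // program counter: which line runs next
  while (pc < program.length) {
    var op = program[pc];
    switch (op[0]) {
      case "set":   reg[op[1]] = op[2];       pc++; break;
      case "add":   reg[op[1]] += op[2];      pc++; break;
      case "print": console.log(reg[op[1]]);  pc++; break;
      // "jlt reg, value, address": jump if the register is less
      // than the value--loops fall out of this one instruction.
      case "jlt": pc = reg[op[1]] < op[2] ? op[3] : pc + 1; break;
    }
  }
}

// Count from 0 to 4: no loop syntax, just a jump backwards.
run([
  ["set", "A", 0],    // 0: A = 0
  ["print", "A"],     // 1: output A
  ["add", "A", 1],    // 2: A = A + 1
  ["jlt", "A", 5, 1]  // 3: if A < 5, go back to instruction 1
]);
```

There's no while or for anywhere in that program: the loop emerges from a single conditional jump, which is exactly the mental model I'd want students to build.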
That's why I'm really interested in 0x10c, the space exploration game that's in development by Notch, the creator of Minecraft. The game will include an emulated CPU to run ship functions and other software, and it's programmed in a very simple, friendly form of assembly. There are a ton of people already writing tools for it, including virtual screen access--and the game's not even out yet. It's going to be a really great tool for teaching some people how computers work at a low level (in a simplified model, of course).
I'm not the first person to wonder if assembler could make a decent teaching tool. And I'm not convinced it actually would--it's certainly not going to do anything but confuse and frustrate my web development students. But I do think programming instruction requires a language that, against some conventional wisdom, isn't too high level. JavaScript's functions are great, but they're too abstract to translate well for most students--this and the DOM form a serious conceptual barrier to entry. On the other hand, students need to feel effective to keep their attention, meaning that it shouldn't take 50 lines of opcodes to print "hello, world." That flaw may make assembly itself ineffective, but thinking about what it does teach effectively may give us a better frame of reference for teaching with other toolkits.
The Thinkpad Hardware Maintenance Manual is kind of amazing. It is specifically laid out to walk someone through the process of disassembling a laptop, all the way down to the pads on the motherboard, and including (so helpful!) the types of screws used at each step of the way. You can fix pretty much anything on a Thinkpad with the HMM and the right spare parts. I know this now because I have been through the entire thing forward and backward.
About two months ago, I fried my motherboard (literally) by replacing the cooling fan with one meant for a laptop with integrated graphics, instead of the Nvidia chip that was actually installed. At 105°C, it wasn't long before the chip melted itself down into a puddle of silicon and coltan. I was able to eke out a few more days of use by turning off anything that could possibly require acceleration--Flash, videos of any kind, Aero Glass--but soon enough it just gave up on me completely.
What's frustrating about this is that you can't simply replace the shiny patch of metal that used to be a GPU on a laptop. Even though Lenovo refers to the Nvidia option as "discrete" graphics, it's actually soldered to the motherboard (perhaps they mean it won't gossip about you, which is probably true, because in my experience Nvidia's mobile chips don't live long enough to learn anything embarrassing anyway). So your only repair option is to replace the entire system board: about $600 with parts and labor, assuming you take it to some place where they know what they're doing, which is not a guarantee. Not that I'm apparently any better, but at least I only destroy my own property.
Still, given the comprehensive hardware maintenance manual, a used motherboard from eBay, and a spare pill case to hold the screws, fixing the laptop turned out to be no big deal--I even converted it to the Intel graphics chip, which is less likely to overheat since it's got all the horsepower of a Nintendo Super FX chip. The experience has left me impressed with Lenovo's engineering efforts. The construction is solid without "cheating"--it's honestly pretty easy to open up, and there's no glue and no clips that only the manufacturer can remove or replace. As an inveterate tinkerer, I believe everything should be built this way.
But in that, at least, Lenovo and I are in the minority. Increasingly (led by Apple), laptops and other computing devices are sealed boxes with few or no user-serviceable parts. Want to upgrade your RAM or hard drive? Too bad, it's soldered in place. Break the screen? Better send it to the factory (or not, I guess). If a video card burns out on its own (as the Nvidia 140M chips did, repeatedly, even before I started tinkering with the cooling system) or another hardware problem occurs, they don't want you to fix it. They want you to buy another, most likely throwing the old one away. Even the Thinkpad, when it's outpaced by more demanding software, can only be upgraded so far. That means it also has a limited lifespan as a donation to a school or needful friend.
So in addition to fixing my old laptop, I'm moving away from a dependence on tightly-coupled, all-in-one computing, and back to modular desktop units. I bought a Maingear F131 PC (the most basic model), which I can upgrade or repair piece by piece (I already started by adding a wireless PCI card--Maingear doesn't sell those directly). It's better for the environment, better for my wallet, and it represents a vote against wasteful manufacturing. My goal is to make it the last whole computer (but not the last hardware) I'll buy for a long, long time.
Should you do the same? I think so. Manufacturers are moving to the integrated device model because it's cheaper for them and consumers don't seem to care. Change the latter, and we can change the direction of the industry. And I think you should care: in addition to the problem of e-waste, user-replaceable components help to support an entire cottage industry of small repair shops, consultants, and boutique builders--the mom-and-pop businesses of the tech world. Moving to one-piece technology will kill those, just as it killed television, vacuum, and radio repair shops.
You can argue, I'm sure, that many people do not want to learn how to install or repair their own hardware. That's true, and they shouldn't have to. But I also deeply believe that user-serviceable and easy-to-use are not mutually exclusive qualities, nor do they require users to learn everything there is to know about a computer. We don't expect everyone to know how to rebuild a car engine, but I think most people would also agree that a thriving industry for aftermarket repair parts and service is a good thing--who hasn't at least had a new radio installed? Who wants to be forced into a dealership every time a filter needs to be changed?
It is tempting to see the trend toward disposable devices as one with the trend toward walled gardens in software--a way to convert users from one-time payers to a constant revenue stream. I don't actually think most companies set out to be that sinister. But it's obvious that they also won't do the right thing unless we hold their feet to the fire. The time has come to put our money where our mouths are, and insist that the environment is not a suitable sacrifice for a few extra millimeters shaved off a product spec sheet.
In a move guaranteed to bring every troll of a certain age crashing into the comments section, an NPR All Songs Considered intern wrote this month about listening to Public Enemy's It Takes A Nation of Millions to Hold Us Back for the first time, comparing it unfavorably to (of all people) Drake. I can sympathize, because I too am still new to a lot of classic hip-hop, and I too do not always think it lives up to its reputation. On the other hand, even I don't go around kicking the whole Internet in the shins these days.
Surprisingly, in the kind of serendipity that sometimes rescues online slapfights like this, ?uestlove from the Roots dropped into the comments alongside the vitriol and lent some balanced advice on putting those recordings into context. He also wrote, in a follow-up on Twitter:
Man, That NPR/PE piece isn't bothering me as much as the position of both sides: youngins (I don't listen to music older than me) oldies (I cry for this generation) so you got one side that is dismissive to learning, we got another side dismissive on how to teach. Which leads to that "hip hop on trial" clip in which I spoke about the absence of sampling in hip hop is killing interest in music in general. At its worst sampling is a gateway drug to music you forgot about (listen to "talking all that jazz" by stet).
As a rock musician, I didn't get sampling for a long time, because I didn't really understand the relationship between listeners and producers that samples create. It didn't become clear to me until I started hanging out with the dancers in Urban Artistry, many of whom are also DJs or ferociously dedicated music fans, and realized that their knowledge of music was incredibly deep in part because they were listening to sampled music. What seemed like a lazy way to construct songs disguises an incredibly active listening experience.
That's why I love ?uestlove's commentary, because now that I listen to a lot more hip-hop I catch myself doing exactly what he describes: listening with one ear tuned to the present, and one to the past. Looking up the origins of a beat is a great way to discover classic tunes that I missed, or that I was too young to hear when they were first released. Recognizing a sample sometimes reveals a sly in-joke for a song, or a link somewhere else by virtue of shared DNA. It's not that other genres of music don't do the same thing--jazz musicians do this with riffs, and when I started learning bass, there was a whole canon I was expected to learn, from Pastorius to Prestia--but I guess I like the irony of it: here's the drummer for the greatest hip-hop "live band" justifiably lamenting the lack of sampling because it removes context and discoverability from the music.
A common lament among historians like Jeff Chang or Joseph Schloss is that hip-hop's history is largely apocryphal. It's an oral tradition: even in the dance community, moves like the CC or the Skeeter Rabbit are named after their creators as a way of maintaining continuity. Far from disrespecting the original artists, hip-hop music uses samples to put them in a privileged position. Knowing where the sample originates--the song, the record, the artist--marks a fan as someone who's doing their homework, in the tradition of DJs "digging in the crates" for new records to play. The future challenge for both historians and participants in hip-hop is to walk a fine line: preserving the culture without disrupting either its innovative spirit or its built-in mechanisms of respect.
When I started thinking about blogging again, after an unintentional break, I realized that I'd been doing this, almost continuously, for more than seven years now. That's a long time. Although it was tempting to let it lie fallow, I figured it would be a shame after such a long run--and besides, I do like writing here, especially now that most of the readers (such as they were) are gone.
When I turned Mile Zero into a blog, way back in the day, one of the main things that I wrote about was gaming--specifically, gaming culture. That wasn't all I wrote about, but it was something I was interested in, and there was a whole community of great gaming blogs I could join. Gaming culture had plenty to write about, because it was (and is) a problematic place dominated by emotional children and shameless hacks pretending to be journalists. If I took on those issues, even in a tiny way, I hoped it could help--and it was a good distraction from an office job I wasn't thrilled about and a freelance career that probably wasn't headed anywhere either.
A few years later I got a job at CQ as a "Multimedia Web Producer." Nobody at CQ knew what that was supposed to mean, so gradually I turned myself into the newsroom's go-to person for interactive journalism. I loved my job, and the time and energy I put into it (not to mention the strict editorial policy of non-partisanship) meant I cut back on blogging. I also threw myself into dancing, which I think took me by surprise as much as anyone else, particularly once I joined Urban Artistry. And I went on a bit of an information diet, angry with the low quality/high volume approach of most gaming and tech sites. When I got a chance to write here, usually once a week, the spread of subjects had become more random than ever.
So here we are, seven years (and almost two months dark) later. Sure, this was never really a gaming blog. But I did write about gaming, particularly the sexism, racism, and classism I saw there, and I hoped it could get better. Has it?
Well, kind of better. I mean, it's still awful, isn't it? Sometimes it just seems like the exploitation gets more subtle over time. Tomb Raider pops back up, for example, but now Lara Croft's proportions are less exaggerated--and she's being threatened with sexual assault so players can feel protective toward her. One step forward, two steps off a cliff marked "Seriously, guys, what on earth were you thinking?"
At the other end of the malevolence spectrum, I just finished Driver: San Francisco. Loved it: it's funny, well-balanced, filled with homage to classic car movies and TV (including constant callbacks to its obvious inspiration, Life on Mars). But even though it's a game where the main character is never playable outside a car, even though it's set in a world where the solution to every crime involves vehicular damage, even though the physical make-up of the hero is literally of absolutely no consequence whatsoever... you're still playing as John "Incredibly Generic White Dude With An Incredibly Generic White Dude's Name" Tanner. You could not possibly challenge fewer conventions than Driver:SF, which these days is not so much actively frustrating as it is wearying.
That said, I think there's hope. When I look at something like Anita Sarkeesian's Tropes Vs. Women project on Kickstarter, which went from zero to troll-ridden to ridiculously over-funded in a matter of hours, it kind of blows me away. Seven years ago, would Sarkeesian's project have gotten that much support? Would it have gotten sympathetic attention from the corporate blogs? Would it have been picked up across a wide range of non-gaming media? I feel like no, it wouldn't have. And while tools like Kickstarter have made it a lot easier for small projects like this to get the funding they need, I suspect that changes in the culture have also made a big difference.
More importantly, it's not just one culture anymore, if it ever was. Communities don't just grow by getting bigger; they also grow when new circles intersect their Venn diagram. You see this everywhere: look at the way music fans start out as a small, particular group, and then, as the artist gets bigger, different people begin to attach--sometimes for very different reasons, which may eventually drive the original fans away. The reasons I love the Black Keys (their early, filthy-sounding recordings from Akron) are not the reasons new fans probably love them, but we all end up at the same concerts together.
When I was studying intercultural communication in college, the term for these meshed sub-populations was "co-culture." I didn't care for the term then, but now it seems appropriate. Gaming is bigger than it was seven years ago, and it's no longer accurate--or seen as desirable--to say that the "real" gamers are the angry 14-year-olds with a chip on their shoulder about girls and minorities. This space can (and does) support more than that: from Troy Goodfellow's series on science and national characters in gaming, to The Border House providing a critical examination of character and plot, to rhetorically stunning games like Auntie Pixelante's dys4ia. These are not all the same voices I was reading and responding to seven years ago, but they are stronger and louder and more influential. That's fantastic.
I'll probably never refocus here to the degree I did when I was writing a post or more a day, because being a single-issue blogger (or a single-issue anything) has never been interesting to me. But I'm thrilled other people are doing good work with it. As a gamer, the same way that other people might be movie buffs or music snobs, I want to see it grow and change so that I'll be exposed to new and interesting perspectives. I don't want to see it stagnate. While progress is slow, I think it's being made. Let's hope in another seven years, I can look back and say the same.
Did you know? If you buy a fan for a laptop, you should make sure to get the right one for the machine, so your video card doesn't start shutting down at 105°C (hopefully before causing permanent--and expensive--damage).
Back in a bit.
I've recently been recommending Zed Shaw's Learn X the Hard Way books for learning a variety of computer languages. I find these books, and the educational theory behind them, kind of fascinating. Shaw himself encourages other people to fork his project for new languages, and provides some advice on its structure:
The way to think of the book's structure is the first half gets them strong, the second half gets them skills. In the first half they're just doing push-ups and sit-ups and getting used to your language's basic syntax and symbols. In the second half they use this strength and grounding in the basics to start learning more advanced techniques and concepts, then apply them to real problems.

This is not a way that I particularly like to think about learning--my least favorite part of high school was doing drills in class. But in retrospect, what I hated was drilling things I already understood. I despised sentence diagramming because I already understood grammar--I didn't need to draw arrows above the subject-verb-object relationship. When it comes to new problems, I actually spend a lot of time on simple, repetitive practice, what Shaw calls "getting strong." The same seems to be true of most of my students, and that realization has dramatically changed the way I teach this quarter.
When I started learning bass, someone recommended a really good book on bass technique--not a book of songs, or a book on music theory, but just step-by-step foundation on how to hold the instrument and pull the strings without causing long-term physical harm. I spent hours just running through the most basic exercises: plucking strings with two fingers, then playing across strings, then muting unplayed strings. It was tedious, but whenever I play a looping arrangement (where unmuted strings would create "drone notes") and sing simultaneously, I'm glad I spent the time.
I was reminded of this again the other night at dance practice, while talking to one of the other poppers about traveling. When I first started, I remember asking the teachers in class how they were able to combine movement across the floor with their isolations and waves--when I tried, it was too hard to keep both motions in my head, and one of them would collapse into awkward spasming. I wanted a "secret"--some kind of special technique that would let me skip the hard work. But that shortcut didn't exist: gradually, I learned to travel while dancing only by working hard on each component, individually and repetitively.
I'm proud this quarter that the examples, in-class exercises, and homework I've assigned for Intro to JavaScript have all been "real" work--my students have learned how to make basic slideshows, to filter and display data from CQ and the World Bank, and even how to imitate Paint.exe using canvas. I hope that they'll be able to leave the class and talk about what they've done in interviews, or use them as jumping-off places for other projects. But more importantly, I'm spending each class with students typing along with me, giving them feedback and drilling the basic skills of reading and writing code. They're learning the hard way, and so far, it seems to be working.
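For the curious, the heart of the Paint.exe exercise boils down to something like this minimal sketch (the markup and ids are my own; it assumes a page containing a <canvas id="paint"> element):

```javascript
// Freehand drawing on a canvas: press the mouse button to draw,
// release to stop. Assumes <canvas id="paint"> exists on the page.
var canvas = document.getElementById("paint");
var context = canvas.getContext("2d");
var drawing = false;

canvas.addEventListener("mousedown", function(e) {
  drawing = true;
  context.beginPath();
  context.moveTo(e.offsetX, e.offsetY);
});

canvas.addEventListener("mousemove", function(e) {
  if (!drawing) return;  // only draw while the button is held down
  context.lineTo(e.offsetX, e.offsetY);
  context.stroke();
});

canvas.addEventListener("mouseup", function() {
  drawing = false;
});
```

Three event listeners and a flag: short enough to type along with in class, but it exercises variables, functions, and conditionals all at once.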
The problem with writing a book about trains is that it hands your critics a healthy arsenal of cheap metaphors to use in reviews (see also: Atlas Shrugged). Do we say that Railsea goes off the tracks a bit? That it doesn't really make it into station? Or indeed, that it never really gets up a good head of steam? Screw the puns. Let's just say it's not really up to par. This isn't to say that Railsea is bad, but it has a lot to live up to. Mieville has already written a better book about trains (Iron Council), a superior story about oceanfaring (The Scar), and a much more inventive YA novel (Un Lun Dun). Where does that leave Railsea? It's readable, even captivating at times, but ultimately a bit of a trifle.
Other readers have called this "Moby Dick with moles," but that's not quite true. The book is set on a planet where hunters, pirates, and scavengers roam an "ocean" of train tracks while avoiding dangerously-outsized ferrets, earwigs, and burrowing owls, and Mieville does invoke Melville: train captains in this society each grow obsessed with a particular animal, including one who hunts a great white mole named Mocker-Jack. But these are just spice, thrown in as mood-setters. The vast majority of the book is actually about a moletrain doctor's assistant named Sham, who finds a memory card that leads to the end of the titular railsea, and kicks off a chase for the rumored riches located there.
Railsea is filled with clever authorial touches, like the use of the ampersand instead of "and" (there is an in-text reason) or an extended meditation on the ways that stories are themselves on rails, particularly in science fiction. Always respectful of genre, Mieville throws in passing references to Aubrey and Maturin, Robinson Crusoe, and Roadside Picnic (watch for the mention of a "Strugatski triskele"). These touches add interest to what is otherwise a pretty limp narrative: Sham spends most of his trip passively wandering up to more interesting stories, until the inevitable character-growth moment. This is a book that's better to read as a critic than as a reader, but even there, it's not subtle: the layered, rich symbolism of Weavers and golems is missing, although I'll admit to enjoying the authorial asides that draw attention to the text's own lumpy pace.
Where Railsea redeems itself is in Mieville's writing, which is still (love it or hate it) an incredibly distinctive prose style, and its straight-faced embrace of the ridiculous. He gives only the slightest indication that his setting--with its savage naked mole rats, rail captains with mandatory artificial limbs, and carriages pulled by rhinocerii--is completely preposterous. Mieville has always written worlds that piled unlikelihood on improbability atop impossibility, but here he occasionally winks at us, as in this section on the theology of trees and railway ties:
Of all the philosophers' answers, three stand out as least unlikely.
— Wood & wood are, in fact, appearances notwithstanding, different things.
— Trees are creations of a devil that delights in confusing us.
— Trees are the ghosts of ties, their gnarled & twisted & dreamlike echoes born when parts of the railsea are damaged & destroyed. Transubstantiated matter.
All other suggestions are deeply eccentric. One of these three is most likely true. Which you believe is up to you.
All gripes about the book aside, I find that completely charming. This mischievous voice makes Railsea the kind of book that's almost begging to be read aloud. And if, in the end, the twists in this tall tale are a bit straighter than you might expect, I suspect it's still worth the price of the ticket.
The sound design, as usual for Nintendo, is instantly recognizable. It makes this kind of phased, dopplered hissing sound, a parody of "something going very fast." You can hear it coming up from behind a few seconds before it hits, or when it passes someone else in splitscreen mode. The latter is the really frustrating scenario: you know you're going to be knocked out of the race--the only question is, when?
The blue shell is the reason I can't play Mario Kart anymore. Belle and I started playing on the Wii again a couple of weeks ago, and for the most part I enjoy it. The boosts are toned down so that snaking can't be abused like it was in the DS version, the tracks are decent with few outright stinkers, and I like the addition of motorcycles (even if they're unplayable with the Classic Controller). In multiplayer, I couldn't care less: if I get knocked out and lose to Belle, it's all in good fun. But then I tried unlocking new characters in grand prix mode, and the blue shell completely ruins that.
The thing about Mario Kart is that it's balanced via progressive taxation. Everybody in the race gets items, but the better you're doing (right at that moment) the worse those items generally are. If you're in the lead, you only get items that let you maintain that lead (but not increase it), like banana peels or fake item boxes. If you're in the back of the pack, you get items that let you jump up in line, like the star or bullet. And the game heavily incentivizes using those items quickly instead of hoarding them--a number of the other power-ups will cause you to lose anything you're holding when they hit you. It's actually an extremely clever set of interlocking mechanics, all designed to keep races unpredictable.
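In code terms, the mechanic works something like this sketch (the item lists and position cutoffs are invented--Nintendo's real probability tables aren't public):

```javascript
// Position-weighted item selection: the better your position
// right now, the weaker the items you can draw.
var itemTables = {
  leaders:    ["banana peel", "fake item box", "green shell"],
  midpack:    ["red shell", "mushroom", "golden mushroom"],
  stragglers: ["star", "bullet bill", "lightning"]
};

function rollItem(position, racerCount) {
  var table;
  if (position <= racerCount / 4) {
    table = itemTables.leaders;     // front of the pack: defense only
  } else if (position <= (3 * racerCount) / 4) {
    table = itemTables.midpack;     // middle: modest boosts
  } else {
    table = itemTables.stragglers;  // back: catch-up items
  }
  return table[Math.floor(Math.random() * table.length)];
}

console.log(rollItem(1, 12));  // leader draws only defensive junk
console.log(rollItem(12, 12)); // last place draws catch-up items
```

Every item in those tables nudges the pack back together--every item, that is, except one.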
But the blue shell breaks that pattern. It doesn't give you a boost (even implicitly, by punishing everyone else, the way that the lightning does). In fact, if you're in the back of the pack, it probably doesn't help you at all--the second and third place racers are just going to shuffle up in position. Using a blue shell in Mario Kart has one goal, and one goal only: to ruin the third lap for the best racer on the track. It's subsidized griefing.
Worst. Power-up. Ever.