
February 11, 2009

Filed under: culture»internet

Singularity U

Don't look now, but higher education just got higher:

Singularity University derives its name from Dr. Ray Kurzweil's book "The Singularity is Near." The term Singularity refers to a theoretical future point of unprecedented advancement caused by the accelerating development of various technologies including biotechnology, nanotechnology, artificial intelligence, robotics and genetics.

Singularity University makes no predictions about the exact effects of these technologies on humanity; rather, our mission is to facilitate the understanding, prediction and development of these technologies and how they can best be harnessed to address humanity's grand challenges. The University has uniquely chosen to focus on such a group of exponentially growing technologies, in an interdisciplinary course structure. These accelerating technologies, both alone and when applied together, have the potential to address many of the world's grand challenges.

The other diploma mills are kicking themselves for not thinking of this sooner. Being able to charge $25K to rehash Moravec and talk about how robots will eradicate world hunger? Sign me up!

In all seriousness, though, the real disappointment is that there's an actual niche to be filled, and Singularity University misses it by a mile. After all, we live in a time when technology has had incredible consequences for the way we live and the future we create together: climate change, I suspect, is going to radically alter the tone of innovation going forward (if it hasn't already--see the recent emphasis on green datacenters and the carbon cost of Google searches). But SU can't even devote an entire course to this one area: it gets a minor part of the "Energy and Ecological Systems" section, about equal to the amount devoted to space travel and (tellingly) venture capitalism.

Indeed, the entire curriculum is comical. A track for futurism, in a university named for the event after which technological change becomes impossible to predict? And more importantly, an interdisciplinary program that breaks its studies down into technological disciplines like "Nanotechnology" and "AI & Robotics?" That's a total conceptual failure. Worldchanging's Jamais Cascio gets it right in the comments for his reaction post when he writes:

I proposed the social-centered structure for a few reasons, but they all come down to moving away from the unidirectional technological change -> social change model that seems so commonplace in emerging tech/Singularity discussions.

Implicit in a structure that focuses on particular technology categories is a "here's why this tech is nifty/dangerous" approach. By focusing instead on areas of impact, I'm pushing a "here's an important issue, let's see how the different techs get woven through" model. Both may talk about technologies and society, but the tech-centered structure looks for cause and effect, while the social-centered structure looks for interactions.

Singularity University is distinctly oriented toward a method of thinking where technology leads to social change--unsurprising, since that's much of the appeal of singulatarianism itself. But technology isn't created or used in a vacuum. Look at development, for example: fifty years of the IBRD trying to build open markets via top-down structural adjustment loans, completely blindsided by microfinance and the ability to transfer cellular minutes. Terrorists in Mumbai are using Blackberries and information technology to coordinate their attacks. Not to mention the rise of the netroots as a political organization that has not only shaped the electoral process, but altered the policies (open government, net neutrality, creative commons) that it demands.

These innovations are not stories of emerging technology with direct, predictable outcomes. They're all rooted deeply in the social and cultural context in which they evolved, and they trade ideas across non-contiguous domains--who would have thought that Daily Kos would borrow community management methods from Slashdot, for example? And yet Singularity University seems to have put together its mission without considering these kinds of Black Swans: invent X technology, they seem to be saying, and Y or Z social impact will follow (or can be guided by visionaries) in a linear fashion. It's a predictive viewpoint straight out of Heinlein-era science fiction, and frankly it's irresponsible. Even assuming that the institution really does foster "the development of exponentially advancing technologies" (if such a thing is at all desirable), it's an act of phenomenal hubris to think that those same leaders will be the ones to "apply, focus and guide these tools" (quotes directly from the SU mission statement).

We could spend all day picking out the inconsistencies and missteps in the SU plan, like the fact that their "international" university has a faculty that's so very white and American. But the wider point remains: at a time when the cost of intellectual overconfidence has been driven home economically and ecologically, Singularity University wants to charge CEOs and government leaders $25,000 to tell them that they're in control of the future. For an academic institution, that's a pretty big lesson they seem to have missed.
