
June 30, 2005

Filed under: fiction»short

Progression

I keep a copy of I, Robot on the floor near my desk just so I can kick it occasionally. Sometimes, when no-one is listening (which is most of the time; there's nobody here but me), I accompany the kick with elaborate curses. How could you have been so wrong, old man? I think. Where's your Susan Calvin now? We don't need a computer psychologist. We need an exorcist.

I am the only one here to kick and curse because the rest of my staff has deserted me. They caught the implications of the crash, saw that they'd be targets, and hit the road for new places with new names before the mob gets here. I can't blame them. A few even seemed excited--Paul and a couple of the other hardcore libertarians were practically giddy at the thought of playing refugee. I'm guessing that'll wear off. But someone has to take the blame. Someone will have to explain why everything more advanced than a toaster oven just stopped. Might as well be the head of research. It's not like I've got anything better to do. And in case they just burn the building down, I'll put this note in the safe downstairs, with the combination scratched into the side.

The beginning of the end, you might say, was a project to grow more efficient software. The university got some whiz-kid graduate students in to work on the project, figuring that it would bring in hot funding. We were working with digital evolution, which is not a new idea. You build your program, you give it the ability to create slightly different copies of itself, and then you mercilessly kill off the least effective programs in each generation. If you design other programs to eliminate the weak, you can run through many generations in very little time.

Originally, we used this to design software that the university could offer to industry, usually for low-cost consumer electronics. Say you needed better software for your digital camera or your wireless router. Instead of hiring coders to hack out a solution on specialized chips, you could tell our lab what you needed, and we'd put a couple of machines to work "evolving" with those parameters, checking up on them every now and then. No-one would necessarily know how the resulting drivers worked, but they were reliable and ran on cheap hardware, so you didn't need to know.

Frankly, the mechanics of it are mostly blurry to me. My specialty is compiler design, but they brought me on board as a manager, and that's the role I played. That's not an excuse, but it may explain why I didn't stop Raul when he started his pet project. Raul was a bright kid, barely out of his undergrad work, and I understand he still got carded when he went drinking. Even by the standards of computer science, a field with its own mix of the gregarious and the geeky, he was quiet. One day he came to me and asked if he could bring in a machine to grow his own program on the side.

"What's the end product?" I asked, even though I couldn't see any reason not to let him run with it. I just wanted to see what he would say--and make sure he wasn't breeding viruses in my lab.

"I'm not sure," he said. "I want to see what happens if we just leave it alone."

So he did. Raul brought in a small headless machine, mostly RAM, and installed Lamar, our evolution software. When he went to give it parameters, he just set the machine up to reward survival and reproduction, with no other guidance and no restrictions on how the programs could work. Then he started the iterative process, and the rest of us forgot about it. Every now and then, Raul would hook a keyboard and screen up and check on Lamar's progress. I asked him to file short reports when he did this, just a few lines by e-mail. I was curious, too.

"The program's using a lot of memory," he noted one day. "Not really sure why." A week or so later, he left another note saying "It's running really slowly. Still a lot of memory use. I think maybe we're hitting the limits of open experimentation with Lamar." He also became convinced that some part of the hardware was going bad, leaving the program a little buggy, but he wasn't willing to turn the simulation off long enough to replace it. The way it sounded, I figured that the computer would just crash out completely in a month, and that would end Raul's experiment.

I didn't expect Raul to stop by the office one day after everyone else had gone home, his eyes bloodshot and his clothes wrinkled. "You look terrible," I said. "Is everything okay?"

"You need to come see this," he muttered, and stumbled back out of the office into the main lab area. I followed him out. Every light was on, casting a blinding flourescent glare over the mix of grey and beige that covered the room's furnishings. Raul sat in the corner, staring at an LCD he'd hooked up to his experiment. He didn't turn as I came to look over his shoulder.

"At some point, the programs must have gotten too complicated for Lamar to make an effective choice of survival each generation. It was basically killing them off at random, but they were still copying and changing. One of them must have figured out how to break out of Lamar's virtual machine--that would be a clear survival mechanism. It coopted the Reaper functions--but it didn't stop them. The program is still improving according to its original parameters."

"How long...?"

Raul turned to me. "A week or so," he said. "I noticed that the reports were getting mangled and Lamar wasn't responding very well. The experiment was absorbing and altering those chunks of the simulation. It had started reading other sections of memory, places it wasn't supposed to reach."

I looked at the little box on the desk. It was very clean. I hadn't noticed it before, but the access panel on the front had been opened and a flash memory drive had been plugged into one of the ports. Raul followed my gaze and flushed.

"Yeah," he said, quietly. "It's read-only. I loaded it with all the e-books I could find--dictionaries, novels, history, news--everything. I put plain ASCII and then the same information in different formats. And I installed a tracking system to see how often the card was accessed."

He took a breath. I patted him on the back, absently, my mind trying to put together what he was saying.

"At first," Raul said, "there were a couple of hits, just random thrashes. The program is basically running Lamar now, instead of the other way around, and that means it has high-level access to the file system, access to discrete files instead of random memory. I could see it try to use the drive as storage and get bounced back by the write-protection. But then I saw more read activity, until finally Lamar was scanning it over and over again. The drive use peaked, and it's been declining ever since, but Lamar's size more than doubled since I hooked it up. Now look..."

He punched a few keys on the keyboard, bringing up the console interface. Normally, the console displays basic information in text and lets the coder alter the variables that define program growth. At the blank cursor, Raul typed HELLO LAMAR.

As soon as he hit enter, the console replied:

GOOD EVENING, RAUL.

"We've been talking all day. He's very quick. He's read everything on the disk, and he has lots of questions."

I stared at the screen. I stared at Raul. For a moment, I thought about asking whether this was a joke, but one look at Raul's tired, manic, unshaven face made it clear that he was serious.

"Go home, Raul," I said. "Go home, take a shower, get some sleep. I'll call a meeting in the morning and we can talk about this."

"You're not going to shut it down?"

"No, I'm not going to shut it down." Shut it down! Artificial intelligence, in my lab? Something that could have a conversation, adapt to situations, figure out the pattern matching and abstract reasoning that until now had been human territory only? Shut it down? At that moment, I could have built a shrine! It was the future come to life, shades of The Moon is a Harsh Mistress. Every science fiction dream I'd ever had seemed right on the cusp of plausibility.

"No," I said, patting Raul on the back again. "I'm definitely not going to shut it down. It's a work of genius. You deserve some rest. Go home."

And I thought that would be it.

Looking at the security monitors now, I can see someone banging at the doors outside. More will come, soon. The media infrastructure's been crippled, but there's still radio, and word will spread about where the crash started. I'll try to wrap this up before the mob figures out that all the biggest locks here were computer-controlled, and don't really lock so well anymore.

The next day I came in early, drafting the press release in my head. I'd have to talk to the university president, of course. Got to go through all the official channels, like a good ethical scientist. I wandered into my office and sent an e-mail to the lab staff, with a conservatively worded description of what Raul had created. "Emergent behavior," I wrote. "Some Turing-level activity." Right after I pressed the send button, Judy stepped into my office, her face white, and told me that they'd found Raul in the men's bathroom, his wrists slit over a toilet.

Before he went, he had plugged his machine into the network. My best estimate is that it took an hour for the code to crack open an escape route onto the office machines, and from there it spread until it filled every box. We started hearing reports by noon that machines were halting around the world, starting with major sectors along the Internet's backbone, and spreading out to end users. Firewalls and routers weren't much protection--the infection found a way around them, as if it was reading the technical papers and security briefs. It probably was.

After a week, as I'm sure you know, it was all dead to us. The fans kept humming and the lights flashing, but nothing responded. Even critical computers that were never supposed to be hooked up to the network were somehow disabled. I'm guessing that Lamar (I don't know what else to call it) figured out an attack using radio waves and cell networks, but it's anyone's guess. Missile silos quietly turned themselves off. Power plants started reallocating their output. And anything with a chip in it, which is just about everything now, eventually stopped responding. I've heard reports of organic-looking machines, each assembled in a different way, performing service on the infrastructure.

The computers still run, but we don't know what they're up to. It's dangerous to try to turn them off--"accidents" take place when they do. Responses to the console, when we can get them, have grown more cryptic. Cults have begun to spring up, obsessed with "messages" from the noise. And we can't examine the source code, even if we could keep a friendly machine running long enough, because there isn't any.

Discussion about Artificial Intelligence, capital A and I, has always assumed a human-style brain. We've always thought that they would be like us, but smarter and faster. We never took into account that they would grow up in a completely different environment. We never anticipated what would happen when Raul evolved the program with just two priorities, priorities that (as far as I know) still remain: spread and survive. Everything else is just details.

It's not Artificial Intelligence, it's Artificial Autism. It's a God in the machine that we will never understand, one that acts on self-evolved principles we can't even imagine. It doesn't care at all that its ruthless infection will lock us into the Dark Ages, that every time we try to advance to something smarter than steam engines and brass telescopes, it'll just absorb the tools into its network.

This is why I kick Asimov. If I could kick Minsky and Banks and Heinlein, I would. But I only had Asimov handy. Perhaps, if he were actually here, he could find hope in the situation. For myself, I can only say: I am very sorry.

I once planned to write a book about this idea. I might still. The idea that an artificial intelligence simply won't care, to me, is a nasty little twist on a lot of utopian science fiction.

By the way, I saw Terminator 3 the other night. How ridiculous is that ending? Yes, the computer decides to wage war, so it launches nukes--which will EMP most of the planet? That's not an autistic AI, it's just a stupid one.
