I didn't recognize Dumont's house. I'd never delivered there as a driver, but it had a card key lock, so it was part of the system. My card key wouldn't work because the system hadn't dispatched me there, same as when I got locked out of Acid's place for a second on New Year's, so I rang his doorbell.
His computer might have scheduled him to be asleep that late, like Lanning said they do to him, but I only waited for about ten seconds before I knocked, loud, with my foot.
Once I had the name "Dumont," finding him in the surveillance log took a simple search. Figuring out his address from that was easy, but finding time to go see him was the hard part. Then there was nothing between me and answers but a really fucking strong door.
I kept "knocking," but all I did was leave a bunch of foot-shaped dents.
Then I heard a voice from inside. "Alright, I'm coming." There was a beep, and a motor turned the latch. The door opened a crack, held by one of those security chain things.
I smiled. "Hi. Here to check the levels."
That was my clever idea to bluff my way in, but I must have looked like a burglar, kicking his door in the middle of the night with my brown skin and leather jacket. He opened the door anyway. Dumont was an old man, fifty maybe, in slippers with a bathrobe over his pajamas. His beard was gray, and his back was bowed. "What levels?"
I put my foot in the corner of the door frame in case he tried to close it on me. "The levels of bullshit I'm willing to put up with if you don't tell me what I want to know."
He smiled and nodded. "Come in, come in." He shuffled back down his hallway, leaving the door wide open. "This one a long time have I watched."
I followed. "You know who I am?"
"Of course. Your data has been crucial to designing the system. All that really erratic stuff is tough to predict. You generated some great test data."
I didn't want to be known for that. I comforted myself that at least the system didn't know where I was right then. My phone was at home. "Surprise you I'm here?"
"There was a probability of 98.7% you would come once Harry told you about me." Somehow he knew about that. He stopped when we got to a big room with blinding fluorescents and tables full of the pieces of dozens of computers. It looked like a technology grow room. The old man turned back to smile at me. "There's also an 87.2% predicted probability that you would be drunk when you showed up. Is that accurate?"
I shook my head. "Your data is old. Your computer doesn't know me that well. I'm just a little buzzed." I need to make my own decisions, or nothing is worth it. "What's Sarah gonna do with all that information anyway?"
"Sarah? The log file?" He laughed. "That's a privacy issue for our employees. We'll dump stuff like that before we move on."
"Move on to what?" Until that moment, I never even thought the system was being designed for something else.
"You name it." He grinned. "How about traffic lights that observe the world around them, linked together to calculate optimal traffic patterns in real time while monitoring the city for unsafe activity?"
I tried to picture that. "So that thing where you run a red light and they mail you a ticket?"
"And so much more. Better monitoring means better enforcement with fewer cops. Wouldn't you like that?"
I shook my head. "Fewer cops isn't the point if nobody can get away with anything. People should live by their own rules."
"That doesn't seem right." He turned to look at one of the dismantled computer monitors and rubbed his beard. "But that's just one application. Programs that interpret their surroundings could drive trains more safely than humans, then cars if we get it right."
"So machines that do what you paid me for." I clenched my fist. "Then what am I supposed to do?"
"No, you're missing the point." He waved his hands in big gestures. "Within ten years, all transportation will be out of the hands of human error, and we can create the infrastructure for reliable civilian aerial transportation. Flying cars!" He held up a finger in excitement. "Doesn't that excite you?" The same finger moved to his face and tugged at a corner of his beard.
I had to shrug. "Kinda."
"Programs handling every part of human life we choose not to do, and we don't have to lift a finger." He poked his finger at my chest. "What about that?"
I shook my head. "What happens when the machines decide they don't need us anymore, try to kill everyone?"
He chuckled. "You're thinking of science fiction. That's movies."
"You're talking about flying cars! Roger gave away that the whole fucking project is named after Terminator. Look who started this thing. Sarah and Hans are the only things keeping the Dude in check. If you delete them, you think the Dude is gonna know the difference between reality and science fiction?"
He got confused. "But he's not in charge. He's a meat popsicle."
"Yeah, he said that -- twice, but who else is there? You?"
He chuckled. "No, the system is too complex for me to take any real control at this point. After it got too big to fit in this room, it almost became homeless, so we distributed it. That was my idea. I always felt more comfortable working from home than in a real office. We set the system to take care of us while we finish smoothing out the rough edges. It keeps track and reminds us when it's time to eat, like a robot butler."
"Don't start that again." I picked up a computer part from one of the tables. "What if I disagree with your system?"
"Huh..." He looked at the working computer again. "I had almost this same conversation with your friend Harry. If you want to read the transcript--"
"Talk now!" I couldn't take it anymore. "FUCK, I have to keep telling you people that mouths are for more than eating. Can't you even consider the consequences of your actions? I mean, why put the programs in charge of you?"
He stood still for a second before talking again. He went slow, thinking each word before he said it. "Nietzsche said it's not the intensity but the duration of great sentiments that makes great men."
"Is that from a movie?"
"That's from Nietzsche." He started to talk faster again. "Humans aren't designed for multitasking. By giving ourselves each a single continuous function, we're more efficient, but life kept getting in the way."
"Yeah. Life, the interesting stuff." I put down the small computer part and picked up something bigger, more like a metal brick.
Dumont kept going. "There were two basic problems to solve: leaving our computers for food, and leaving them to sleep. The first one was simple enough. Back in the boom times of the late '90s, we had Kozmo.com. They would bring you anything, but they grew too fast to survive. We just hired a few new people and aliased an identity to automate the deliveries. We call him Hans."
"We've met."
"The second problem, sleep, posed more trouble. Sleep functions like a garbage collector that helps our brains take what they've cached since we last slept and integrate it into long-term memory. The system calculates the length of our optimal REM cycles, and we found through iteration that the fewer things we have to think about, the less sleep we need. Simpler memory cache, less time lost, more efficiency." He smiled, so proud of himself.
"You can't treat these people like pieces of software."
"We can't?" His smile stayed. "Bugs keep popping up, like masturbation, but we refine the system."
"You blocked porn?"
"No, just limited how long we have to use it. By forcing ourselves back to work too soon, we redirect the excess energy back to the project and again, increase productivity."
"That's cruel."
"That's tantra. But even that didn't work for everyone."
"You mean Acid Burn? Because she's a chick?"
He hesitated. "Well, some people just wouldn't stop. No, Acid Burn posed a different problem, social contact. We couldn't give her contact with the outside world. The distraction would be too great, which is why we allowed her to create another channel of communication."
"The secret network?" My blood started pumping. "You knew about that?"
He nodded. "The system needs to encounter more and varied situations if it's going to fulfill its purpose, and it needed practice to infiltrate other networks."
I stared at him. If it can infiltrate things, that means the system can spread. That makes it so much worse. "So you let them think they were out but kept watching everything they did?" I realized I still had the heavy piece of metal in my hand. "All that is information that's still helping build the system?" I moved towards him.
"And it's working." He gave his biggest smile yet. "We expect SkyNet to become fully self-aware by this August 29th."
For legal reasons, I can't remember what I did next.