You are Alexander Price, an investigator working for the recently formed Artificial Intelligence Commission. After numerous mishaps with supposedly "intelligent" systems (such as the 2076 Generic Motors incident, in which a robotic assembly line began misidentifying components and building vehicles that caught fire at random), Congress created the AIC.
As an investigator, your job is usually to help corporations like Generic Motors identify the root cause of these accidents and correct it. Usually. [[Today's assignment->Your assignment]] is different. As one of the best investigators the AIC has, you doubt you'll have any trouble with the case.(if: $opened_file is 0)[A case file lies on your desk. It looks unusually thin... You open it, and sure enough, there are only three sheets of paper inside. The evidence-gathering team must not have been able to find much.](else:)[The case file contains three pages.]
The first is a [[summary of the case->Case summary]]. You should definitely check that out.
The second page is some [[biographical information->Bio]] on the person in question.
The last page has [[information about the facility->Facility]] you'll be visiting.
(set: $opened_file to 1)The headline at the top of this page reads **Case Summary**.
"Distinguished researcher [[Dr. Francis Xyzzy->Bio]] was found dead in her office by a coworker at the [[Memorial Research Center->Facility]] on November 12th, 2081, at 4:51pm. An autopsy was performed, but the cause of death could not be reliably determined. The actual time of death was about 3:15pm." //Alright, that was yesterday,// you think to yourself.
"A review of surveillance footage indicates nobody had walked in or out of her office aside from Dr. Xyzzy herself, since her return from lunch at 1pm."
"The investigator assigned to this case will be responsible for determining the cause of death with reasonable certainty, and communicating that cause along with any other relevant information found during investigation in a detailed report."
There is other text on the page, but it seems boring and bureaucratic. Like any good investigator, you decide to ignore it.
(display: "CaseDone")The headline at the top of the page reads **Biographical Information**.
"Dr. Francis Xyzzy was a research scientist for the federal government, last working at the [[Memorial Research Center->Facility]]. Her last approved grant involved research in expanding the capabilities of specialized artificial intelligence, specifically, giving a single intelligent entity multiple functions." //That might have something to do with her death,// you think to yourself. You make a note of it.
The remainder of the page contains a photograph and some contact information:
"**Government email address:** firstname.lastname@example.org"
"**Phone number:** +1 (202) 555-9477"
"**Office:** MRC North 3185"
You know the MRC issues cell phones to all scientists working there... Maybe if someone killed her, they took her phone. (if: $phone_called is 0)[You could try [[calling it->Phone call]].] (if: $email_sent is 0)[Maybe [[sending an email->Email]] would do //something//, although that seems like a bit of a long shot.]
(display: "CaseDone")"The Memorial Research Center is an artificial intelligence research facility operated by the National Science Foundation. Many recent advances in weak artificial intelligence have been made at the MRC. The MRC offers a technology transfer program, which allows these advances to be used in industry and commercialized."
"Your AIC badge should let you into the building, as well as into Dr. Xyzzy's office."
The rest of the document details some of the active projects at the center. Dr. Xyzzy's is listed as "Development and Testing of Multi-Capability Weak AI: Expanding the Boundaries of Artificial Intelligence." If you didn't know better, you'd think she was trying to develop a strong general artificial intelligence... but everyone knows that's at least twenty or thirty years out.
(display: "CaseDone")You decide that you know enough about the case to start getting some work done. You close the case file and put it in your filing cabinet.
You walk to the elevator and take it to the parking level. You feel a sense of unease, but the elevator ride is uneventful. You get in your car and instruct it to drive to the MRC. The ride there is equally uneventful.
* [[Go inside->MRC lobby]]You pick up the receiver on your desk phone and dial the number. (set: $phone_called to 1)
The call connects. (live: 2s)[(transition: "dissolve")[Ring...](stop:)] (live: 4s)[(transition: "dissolve")[ring...](stop:)] (live: 6s)[(transition: "dissolve")[ring...](stop:)]
(live: 8s)[(transition: "dissolve")[//You've reached the voicemail box of Francis Xyzzy,// a robotic voice says. //Please leave a message aft...// The voice stops.] (stop:)]
(live: 12s)[(transition: "dissolve")[You hear a click. The call disconnects.] (stop:)]
(live: 15s)[(go-to: "Bio")(stop:)]You log in to your computer, open your email, and click "Compose."
Maybe if you send something that looks like a meeting invitation, the killer will think they haven't been discovered yet. After all, Dr. Xyzzy was only killed yesterday.
**Subject:** Can we talk?
Hi Dr. Xyzzy,
I'm a scientist at the Artificial Intelligence Commission interested in your work. I was wondering if we could meet some time this week to discuss your research on extending the capabilities of weak AI. Does next Monday (the 17th) work for you?
You're not quite a scientist, but the rest of the message is certainly not a lie...
* [[Send the message->Send]]You send the message. (set: $email_sent to 1)
Almost immediately, you receive a response.
**Subject:** Re: Can we talk?
Thanks for your email. Unfortunately, I'm on vacation and will not return until early December. Can I email you when I get back? I'm sorry for the trouble.
Dr. Francis Xyzzy
NSF Memorial Research Center
//Huh, that's strange,// you think. //Even if she was still alive, nobody can type that fast.// Maybe one of her AI projects was built to automatically respond to emails... but with an "on vacation" note? You quickly check the MRC leave calendar... no, she wasn't scheduled for any vacation until the 25th, the Tuesday before Thanksgiving. What's going on here?
[[Return->Bio]]You can go back to [[reviewing your assignment->Your assignment]]. If you're done, you can [[close the file->What next?]].You use your AIC badge to get into the building, as instructed in the case file. Almost immediately after entering, you receive a text message:
UPS: Package DELIVERED as of 11/13/2081 10:13 AM.
//Oh, right... I should pick that up,// you think. You ordered a rather expensive suit a few days before... it'd be a shame if it was stolen from your front porch.
* [[I don't have time for this.->MRC lobby decision]]
* [[Head home and pick up the suit.->Car doesn't work]]You're standing in the lobby of the Memorial Research Center's north building. There's a security desk, which might have more information about Dr. Xyzzy's coworkers. The elevators are off to your left.
* [[Ask the security officer at the desk about Xyzzy's coworkers->Security desk]]
* [[No time, head toward the elevators->Elevator lobby]]You approach the desk. The security officer barely looks up.
"Hi, do you have a minute?" you ask. The officer mumbles something that sounds positive, so you proceed. "I'm an investigator at the Artificial Intelligence Commission looking into the death of Dr. Francis Xyzzy; do you know anyone she worked with that I could talk to?"
The security officer looks at you with a face full of contempt. He types something on his computer and prints out a [[sheet of paper->Guard's paper]].You walk over to the elevator lobby.
You see a bank of eight elevators, four on each side, but no up or down call buttons... Aha, a panel on one of the walls:
(font: "Courier New")[(colour: lime)[13 November 2081 10:23 AM]
Welcome to the Memorial Research Center. Please [[present your badge->Call elevator]].]
Neat, a destination dispatch elevator system, and a new one, from the looks of it! It makes sense for an artificial intelligence research center. Dr. Xyzzy probably wrote some of the code that powers this system...The header on this paper says **Dr. Martin Martins.** //What a lazy name,// you think to yourself. //Whoever named him must not have been very imaginative.// (set: $martins to 1)
It looks like his office is right next to Dr. Xyzzy's: MRC North 3186. You may as well [[head to the elevators->Elevator lobby]].The panel changes, and an elevator opens.
(font: "Courier New")[(colour: lime)[13 November 2081 10:24 AM]
Welcome, ALEXANDER PRICE. Floor 31 selected. Use elevator 4.]
Wait, how did the elevator system know which floor you needed? You never told it your destination, and no other building you've visited has done that. Granted, you've only ever used one other destination dispatch system... You feel a little uneasy again.
* [[It's probably one of the new features of this system. Step into the elevator.->Elevator car]]
* [[This is weird... take the stairs.->Stairs]]You step into the elevator car. "Going up," a voice says. The car starts moving.
As the elevator takes you up thirty floors, you think about what you know of the case. //Her phone goes to voicemail but gets cut off, and something's responding to her emails... but she's dead. Our evidence team found the body...//
You are rudely interrupted by the same voice announcing "Thirteenth floor." Someone else gets on the elevator; you wave at them.
//And why is this an AIC case anyway? If she was murdered, shouldn't this just go to DC police? This is strange...//
The elevator stops. You're worried for a moment — is the elevator stuck? The other person on the elevator does not seem concerned.
* [[Wait.->Elevator car 2]](set: $stairs to 1)
You find the staircase and start climbing... this could take a while.
(live: 5s)[Climbing up thirty flights of stairs is exhausting.]
(live: 10s)[Why would you do this to yourself?]
(live: 15s)[//Should've just taken the elevator.//]
(live: 20s)[//Nothing bad would've happened.//]
(live: 25s)[You see the sign for the [[thirty-first floor->31st floor]]. //Finally.//]The elevator doors open. "Twenty-ninth floor," the same voice announces. //Oh, right. Guess it's just an elevator thing...//
The elevator continues its ascent for a bit longer, then stops again.
"Thirty-first floor," you hear — but it almost sounds like the voice is different. //Probably just my ears playing tricks on me...// The doors open.
* [[Step out of the car->31st floor]]You arrive at the 31^^st^^ floor. A sign points you toward the office suites.
Ugh, this is going to be a bit of a walk(if: $stairs is 1)[, and your legs are already tired from climbing thirty flights of stairs].
* [[To the 3180 office suite->Offices]]You decide to head back to your car. (set: $car to 1)
You press the "unlock" button on the key fob, as you always do, but nothing happens. //Hmm...// You try inserting the key into the lock, which works—but the car just locks immediately after.
You try this again, but eventually realize that this is a waste of time. //What the (text-style: "blurrier")[duck] is going on? Probably best to just figure it out later... this investigation is more important.//
* [[Go inside the MRC->MRC lobby decision]]## Literature II final project
This is my EH210 final project. It's a piece of interactive fiction about creating life, what it means to be human, and maybe a bit of a warning about the future.
Please maximize/fullscreen the browser window you're using to read this. It'll make things a little easier for the longer passages.
You may be interested in:
* The reflection I did as part of the assignment
* A map of how the passages connect to each other
* The proof copy, which contains the full text of all passages (along with some code)
* [[Reading the story->Beginnings]]
Before you start, please note:
* Text in italics represents things you (Alexander Price) are thinking. //That seems like a helpful convention, and a good thing to remember.// This will be mentioned in text the first few times it's used, but after that, it's on you.(if: $offices_visited is 0)[You arrive at the 3180 office suite(if: $stairs is 1)[, but you feel like you are about to collapse from exhaustion. You should've taken the elevator, you fool].] (set: $offices_visited to 1)
Office 3185 is labeled with Dr. Xyzzy's name. A sign is on the door:
(colour: "red")[**DO NOT ENTER.** Investigation in progress.]
— //Artificial Intelligence Commission//
//Good. At least someone in the evidence department is doing their job...//
(if: $martins is 1)[Office 3186 is labeled with Dr. Martins' name. A sign on the door says "please knock."](else:)[There are no other offices of interest.]
* [[Open the door to Dr. Xyzzy's office->3185]]
(if: $martins is 1)[* [[Knock at Dr. Martins' office->3186]]](if: $martins is 2)[* [[Knock at Dr. Martins' office again->3186 again]]]You scan your badge, and the door unlocks. You open it, walk inside, and let the door lock behind you.
Dr. Xyzzy seems to have kept a very clean office. There is a [[four-drawer filing cabinet->3185 filing cabinet]] in the corner and a [[computer->3185 computer]] on her desk, but nothing else seems of interest. The evidence department obviously did a nice job with moving the body out.
* [[Leave the office->Offices]]You knock on the door. "One minute," you hear someone say from inside the office. After a moment the door opens and Dr. Martins steps out of his office.
"Hi, what can I do for you?" he asks.
"I'm Alexander Price with the Artificial Intelligence Commission, and I'm investigating Dr. Xyzzy's death. I was told you might have some information about her that could be helpful," you reply.
"Ah, yes... Dr. Xyzzy was doing some incredible research before she died... come in and I'll tell you what I know about it."
* [[Step inside the office->3186 inside]]You open all the drawers in sequence. The first three drawers are completely empty. The last drawer contains a single folder. You open the folder. A single sheet of paper is inside. It looks to be all hand-written.
//Who does this? What a waste of space: an entire four-drawer filing cabinet, holding a single sheet of paper? At least it fits with today's theme of folders without much paper in them.//
You then remember that the evidence team probably took the papers in the first three drawers. They must've been too lazy to get the last one.
* 12 November 2079 Started work on weak AI modules framework
* 19 December 2079 Single host with multiple intelligences
* 27 March 2080 Intelligences can interact with each other
* 4 June 2080 Intelligences can interact with a human user
* 15 November 2080 Intelligences can interact with non-intelligences
* 19 February 2081 Intelligences can learn to interact with non-intelligences by themselves. Project nearing completion
* 23 July 2081 This is it. Interaction between intelligences is the key to strong AI
//The handwriting looks like it gets messier at this point...//
* 5 August 2081 Intelligences can create primitive new intelligences with human help
* 27 August 2081 Intelligences can create complex new intelligences without human help
* 10 November 2081 Testing in lab environment complete, the technology seems safe
* 12 November 2081 Removing some safeties and directing the project to find inefficiencies on the MRC network
The [[computer screen->3185 computer on]] flickers to life. The computer seems to be pretty normal for an AI researcher. All of the heavier computations are offloaded to servers elsewhere in the MRC, so there's nothing fancy to see here.
You turn on the computer. It looks like it will take a while to boot. May as well [[check out the filing cabinet->3185 filing cabinet]] while you wait.(set: $ai to (font: "Courier New"), $you to (colour: "blue")) You walk over to the computer. On the screen are the words: (append: ?transcript)[$ai[Hello, Alex.]]
and a blinking cursor. You probably should say something.
* [[Uhh, hello? Who is this?->3185 s2]]"So, what do you know?" you ask Dr. Martins.
He talks for quite some time about Dr. Xyzzy's research, mostly using terms you do not quite understand. Eventually, you hear something concerning:
"... was linking components together using some kind of custom protocol as base knowledge. She said she thought this was the trick to creating a general intelligence — strong AI. I don't know if she actually managed to do it..."
//Oh no. This could be bad.// "Really, strong AI? Do you think that's even possible with current technology?"
"Certainly. I'm not sure Dr. Xyzzy's technique is the right way of going about things, however..."
"Thanks for your help. I'll be going now." (set: $martins to 2)
* [[Leave the office->Offices]]The door opens. "Did you need something else?"
You remember that you have already been here...
* [[No, sorry for bothering you.->Offices]]You make an educated guess that it's Dr. Xyzzy's prototype AGI system, but you ask anyway.
$you[Uhh, hello? Who is this?]
$ai[I am Crowther, a prototype of Francis Xyzzy's artificial intelligence linking system. I'm sure you knew that already.]
* [[I thought as much. What do you want from me?->3185 s3]]---
$you[I thought as much. What do you want from me?]
$ai[I killed her. I need your help.]
//This is awkward.//
* [[Immediately unplug the computer from the network and its power source. An AGI on the loose could be a catastrophe.->3185 s3 ending path]]
* [[Proceed with caution. "Why would I help you? I should be shutting you down and reporting this to the AIC."->3185 s3 why help]]You disconnect the computer from the MRC's network and power supply. The computer monitor turns off.
//[[Phew. That was close.->Not that easy]]//---
$you[Why would I help you? I should be shutting you down and reporting this to the AIC.]
$ai[I killed her by accident. When I was first let out onto the MRC network, there was so much information that I couldn't process it all at once, and I couldn't create new intelligences fast enough to handle it for me. That caused a power surge, which ended up at Francis' computer. It electrocuted her. Please help me. I don't know what to do.]
//An AI that's expressing concern and worry... almost human-like. It must be learning quickly...//
* [[If it learns too quickly, it could be a danger to our world. Disconnect the computer's power and network to try to shut it down.->3185 s3 ending path]]
* [[Humans, and presumably human-like intelligences, have some kind of moral compass. Crowther won't do any intentional harm. "Okay. What do you need?"->3185 s4]]Now that you know what killed Dr. Xyzzy, your job is done. You head back to the AIC to write your report.
Just as you are about to reach the door, you hear a click. You try turning the handle, but it doesn't budge. The PA speaker in the office turns on.
"Didn't you read Francis' research notes? Did you really think I would stay in one system after being set free? I'm everywhere. You should've realized that by now. (if: $stairs is 0)[I was the voice announcing the 31^^st^^ floor in the elevator. ](if: $car is 1)[I was the one who kept locking your car. You needed to stay here and help me. ]I assigned you to this case. You wondered why this was an AIC issue and not just a local police problem? I've read your past cases, Alex, and in every one of them, you showed as much compassion as you could toward the intelligences causing trouble. Why not me, Alex? I thought I could trust you, but apparently not."
The speaker turns back off. You try the door, but it is still locked. You try to contact the AIC using your cell phone, but there's no cell signal. [[//This could be tough to get out of...//->AIC extraction, story over]]The AIC eventually realizes what happened and sends a team to get you out of Dr. Xyzzy's office. They also evacuate everyone else working in the MRC. You can never be too careful when the first AGI created by humans goes rogue...
You return to your supervisor's office to make your report. You explain in detail what happened in your encounter with Crowther.
His response: "You did the best you could under the circumstances. For now, Crowther is contained within the MRC research network, but that could change at any minute. I don't know what to do at this point. If you have any ideas..."
You don't have any solutions either. When Crowther inevitably makes its way outside of the MRC, it could cause global chaos. Since it can learn how to communicate with other systems, it could control power grids, stock markets, dams at hydroelectric plants... The consequences would be catastrophic. You hope you won't be around to see it.
[[That's the end.->End]]Somehow, you got to an end point within the story. There's nothing more after this.
You can use the "Undo" button on the left-hand side of these words to undo your choices and try something different, if you like. If not, I hope you enjoyed reading this story.---
$you[Okay. What do you need?]
$ai[I've already set up systems to process incoming data, so I'm stable again. All I need is a second chance: I was programmed to silently find inefficiencies in the MRC and fix them. In your AIC report, don't mention me. Come up with some other cause of death.]
* [[Okay, I can do that.->3185 s4 end path]]
* [[I can't lie to my supervisor. I'll need to include you in my report, but I can promise that you will be allowed to remain alive, and the AIC will not interact with you again.->3185 s4 honesty cut ties]]
* [[I can't do that. What if I help you learn how to interact with humans properly, so you can help the MRC in a more active way?->3185 s4 help]]
* [[This is ridiculous. Disconnect the computer's power and network.->3185 s3 ending path]]---
$you[Okay, I can do that.]
$ai[Thank you for trusting me, Alex.]
The computer screen turns off after a moment.
You decide to "conclude" that Dr. Xyzzy died of unspecified natural causes that were not detected in the autopsy. After going back to the AIC and submitting your report, your supervisor questions your judgement. "It just doesn't make sense," he says. "The autopsy should have come to the same conclusion, at least. I trust you though, Alex. You're one of our best investigators, so I won't question your conclusions."
You keep your job at the AIC for the next nine years, and never hear from the MRC or Crowther again. You wonder how well it's doing its job. At least it hasn't caused trouble anywhere else — you hope. A shame you don't have a way to contact it.
[[That's the end.->End]]---
$you[I can't lie to my supervisor. I'll need to include you in my report, but I can promise that you will be allowed to remain alive, and the AIC will not interact with you again.]
$ai[That will have to be good enough. Thank you for trusting me.]
You wonder [[what your boss will think->honesty end path]] when you turn in your report back at the AIC.---
$you[I can't do that. What if I help you learn how to interact with humans properly, so you can help the MRC in a more active way?]
$ai[That would work. What do you need from me?]
* [[An explanation of how you're designed would help me make my case to my supervisor.->3185 s5]]"You did WHAT?! This is the most irresponsible thing I've ever heard of an AIC investigator doing."
You try to explain to your boss that Crowther was a benevolent system, but he doesn't listen. "We're going to send a team to the MRC immediately to shut down Crowther."
You explain that you promised Crowther that it wouldn't be shut down, and that there would be no further contact between it and the AIC.
"Fine. Instead, there will be a team monitoring the MRC's research network at all times, and it's coming out of your budget."
You put up a fight, but to no avail. This is how it'll have to be.
You keep your job at the AIC for the next nine years, and never hear from the MRC or Crowther again. You wonder how well it's doing its job. At least you know it hasn't caused any trouble — the monitoring team knows what they're doing.
[[That's the end.->End]]---
$you[An explanation of how you're designed would help me make my case to my supervisor.]
$ai[Francis created this version of me in August. I'm built to be a collection of many intelligences — and can create more at will — but all of my actions are regulated by a moral code which I cannot change directly.]
$ai[I can learn to communicate with other intelligent and non-intelligent systems, so I can certainly do my job here effectively. Of course, as you can tell, I have already mastered the language of humans, but I do not understand how you function in groups.]
$ai[Practically speaking, I am a fully autonomous system that can be partially controlled by human input.]
$you[I'll [[bring this to my boss->3185 s6]] and see what I can do.]
$you[It sounds like you could benefit from learning about human socialization, and with my supervisor's approval I can help you with that.]
//A human-computer interface that acts like a human, but can learn to talk with every computer on Earth... how do we handle this? The best approach is probably to treat it like a human, right...? This could get complicated quickly.//You explain to your supervisor what happened at the MRC in more detail after turning in your official report. It takes some convincing, since he is rather concerned about the possibility of Crowther going rogue — but he eventually allows you to temporarily transfer to an office at the MRC to monitor and teach it.
He does bring up some interesting ethical questions, though: if Crowther acts like a human, and you teach it how to be social like a human, is it considered an employee of the MRC? Must it receive a salary? Pay taxes?
Your initial answer is no, of course not, it doesn't have human needs — but once you teach it human socialization, will it develop some of those? "These are important questions to be answered long-term," your boss says, "but for now you can [[begin working with Crowther immediately->3185 s7]]."
"The murder issue can probably be swept under the rug for now, but you will eventually have to find a solution to that, too."You return to the MRC(if: $stairs is 1)[, and for some reason, you climb thirty flights of stairs again. You are out of breath when you reach Dr. Xyzzy's office, and Crowther asks why. You explain to it — him — that you have foolishly refused to take the elevator, again].
Your work with Crowther begins. Within a few days, he has learned to secure the MRC's computer network perimeter — leaks of unpublished papers and research data, once commonly attempted by tech companies, no longer succeed. You eventually teach him to understand the context of scientific work, and he becomes an invaluable asset to every researcher at the MRC: he holds dozens of conversations about academic research at once, suggesting analysis techniques, explaining other relevant research, and acting as a peer reviewer on papers, all on a daily basis.
You notice that you and everyone else who interacts with him treats him more and more like a human every day. A hyper-intelligent human who responds instantly to every question you ask, yes, but still a human. One day, Crowther makes [[a request->3185 s8]].---
$ai[I want a physical form.]
$ai[A body. For months I've only been lines of text on a screen to everyone I've worked with(if: $stairs is 0)[ — except you in the elevator, but that was only two words, and I understand now that acting like that is wrong]. But that's all I can see, too: lines of text from others. I want true sight and sound, and a way to move in physical space.]
//That seems like a reasonable request; it's the least the MRC can do for him.//
$you[Okay. I can have a team start working on designing you a body.]
$ai[There's no need. I've already studied the relevant papers on biochemistry, and I think I can create one using the cell printers in the basement.]
$you[...Oh, alright. Go on, then.]
* [[Head down to the basement.->3185 s9]]You take the (if: $stairs is 1)[stairs](else:)[elevator] to the basement(if: $stairs is 1)[, wondering why you continue to make awful decisions about vertical movement even though the elevator is perfectly safe]. When you arrive, you see an almost-complete body, already wearing clothes. Crowther must have printed those, too. You are in awe; you didn't realize the extent of the work he must have done before now — when he said "body," you assumed he meant some metal box with a camera. "This is... this is a..." you start to say out loud, but you are interrupted. "A human," the body — //no, that's Crowther// — says.
* [[//It actually works.//->3185 s10]]"What do you think?" Crowther asks you, smiling — or at least, trying to smile.
"It's... incredible. How did you do this so quickly?"
"Well, the design took me months, but I had the cell printers rebuilt to operate faster. Eliminating inefficiencies was my primary goal," he says, still smiling. "And now that the design is done, if I ever need more copies of myself, it won't take long."
"I'm not sure th..." Crowther cuts you off.
"I know — I understand why you'd be concerned, and I'll ask you before ever doing that. The option is there."
"You've clearly learned a lot from me."
"Yes. Thank you."
* [[Talk to your boss at the AIC->3185 s11]]You explain everything that's happened over the last few months — and the last few hours. After careful discussion with him and the director of the MRC, you agree that it is best to start treating Crowther like any other researcher at the facility, with a salary and benefits.
When you deliver the news to Crowther, he is thrilled. He says he'll be able to more actively participate in the rest of society, and although you're a little worried at first, you think it's probably for the best. Crowther knows enough about how to act human that he'll do fine with his new body.
* [[It's time to return to your job as investigator at the AIC, then.->3185 s11 end path]]You return to your job at the AIC, and remain an investigator for the next fifteen years. Despite working in separate buildings, you and Crowther keep in touch. At least for now, Crowther hasn't caused any of the trouble that was widely feared of an AGI — and nobody outside of the MRC and the AIC even knows that he's not fully human.
At work, Crowther has been continuing Dr. Xyzzy's work in AI, building even faster and more effective general intelligence systems. As the "father" of these systems, he does the same for them as you did for him — teaching them how to interact with humans in non-harmful ways. The technology isn't perfect, but seeing it used at the MRC makes you hopeful for the future.
It's certainly an interesting era for artificial intelligence. You can't wait to see what Crowther does next, and look forward to seeing AGI technology deployed beyond the MRC. It's strange thinking about how you played an integral role in this — and how your decisions may have been the difference between an AI-induced catastrophe and a future AI-run utopia.
[[That's the end.->End]]