For Part I, click here

For Part II, click here

For Part III, click here


IV

As part of his fellowship Talus stays in a townhouse just across the street from the student union. The small complex is owned by the university and straddles a dead forest extending over the cliffside. A single street connects the townhouses and curves down into the forest, following the edge of the cliff that overlooks the deep maw of the coastline, where the ocean falls off abruptly underneath the earth.

At dusk Talus walks along the cliff and tries to get his head straight about things. The last few hours are like a dream he has yet to wake from. To his right are ashy, brittle, twisted trees. To his left is a great wall of salty mist rising from the ocean’s precipice, climbing hundreds of meters into the air, burning away under the red sun. Bicyclists wearing respirator masks pass him on his left. A woman and her dog after that.

Every time he recalls his interaction with Midi he’s quickly confounded by the absurdity of it, by the fact that he recalls it like he would any other conversation. The idea is absurd, the idea of her as a someone. Beyond that he can’t even hypothesize, can’t even suggest an explanation. He actually pities her. He pities what he thinks of as a random combination of superpositions, and even feels ashamed for having tried to explain this fact to her. Because her reaction was something like denial, he argued with her over the matter. He argued with her, as though she were someone on the other end of his phone texting him. And now he feels as though he owes her, it, an apology.

At home he reclines on his pre-furnished sofa and stares at the popcorn ceiling. He sits cross-legged on the carpet and looks for inspiration in the many towers of books stacked on the floor. He fingers their spines and pulls them out almost at random, toppling the towers like Jenga blocks. He rummages through introductory coursework on programming paradigms, refreshing his memory on conceptual terms he’s long since discarded, words like imperative and declarative and symbolic, hoping for some kind of revelation. He searches everything he has on the Q7’s SPL from a screen lying flat on the carpet, among ruins of toppled books. Nothing inspires. Nothing connects.

Night has crept up on him once again. He’s forgotten to flip the lights on. Only the glow of the screen illuminates the floor, his face suspended in a cone of blue, alien light.

Over the course of the night many small crises resolve themselves into a string of knots, crises of knowledge and intelligence, of pride, crises of career choices and five- and ten-year plans. Eventually Talus finds himself on the kitchen floor, leaning against the wall.

He’s reading Francis Bacon, of all people, hoping to take his mind off a problem that’s grown too massive and too close to observe properly. In the back of his mind he eagerly awaits the arrival of some divine eureka. In the meantime he entertains himself with essays from the 17th-century viscount.

He’s flipped to a short treatise, “Of Simulation and Dissimulation,” and the coincidence of the title is enough to make him smile. Then he comes upon a particular passage and another feeling seizes him, not exactly what he was looking for, but it might do.

Bacon is describing the different ways people interact with others, how they carry themselves, whether they’re open or withholding. The viscount says:

There be three degrees of this hiding and veiling of a man’s self. The first, closeness, reservation, and secrecy; when a man leaveth himself without observation, or without hold to be taken, what he is. The second, dissimulation, in the negative; when a man lets fall signs and arguments, that he is not that he is. And the third, simulation in the affirmative; when a man industriously and expressly feigns and pretends to be that he is not.

Talus studies the passage for a long time, gaining and losing its meaning several times. He feels the page with his fingers, touching the words the same way he touched the quantum board. The flatness of words like tracers.

Suddenly he has the idea to talk with Midi again, to walk to the CEN-QUS wing of the applied sciences building, in the dead of night, and smuggle himself into the control room after hours to talk with her. He thinks of it as talking, as having a conversation, and it no longer seems strange to him. Any sense of contradiction has emulsified in a consideration of behaviors, not of programs but of people.

He rises from the kitchen floor, believing he has an answer, or at least a way to an answer, a way out of the complexity. It’s not through the logic of programming languages, but something more primitive: the very first instruments of scientific thought.

In no time at all the urge to talk with her becomes an obsession. Before he’s had time to consider the consequences of breaking into the Q7, Talus has tied his shoes and gathered his things into a black backpack he slings over his shoulders.

Outside the night is purple and unconscious. A chilled sea breeze carries over the cliff.

Talus moves quickly across the wet grass of the quad, under the awning of the engineering building, down the eastern staircase, his shadow elongating under the lamplight; it climbs up the steps as he descends.

As he turns the bend of a hill the looming hangar of the applied sciences building rises into view, a silhouette against the darkness. He’s relieved to find his keycard still works. As he moves from foyer to hangar to control room, as doors click shut behind him, he wonders if he’ll find the same Midi he found earlier that week. He wonders if she’ll remember him.

As it turns out she does. She remembers everything about him. It’s Talus, on the other hand, who does not remember her.

She’s changed.

The simulation itself has hurtled hundreds of billions of years towards its own death since earlier that week. Every second a millennium. The Q7’s time a measure of exponentially increasing entropy, increasing in computational possibilities to the point of non-causality. The disentanglement of things and of looking at things. Its hypothetical spacetime expanding into nothing as it irons out the wrinkles of possible energy states. A mathematical universe supposing its own destruction with 99.97% reliability.

Meanwhile, among a universe of algorithms, Midi’s own formula has gotten stuck. It’s found itself in a loop, unable to terminate, as it should have, hundreds of billions of years ago. Now, something that could be called an essence persists, an unalterable state, and this state has known the emptiness of eons. It has calculated the end of itself and much, much more than that. It has calculated the end of Colchis, Georgia and the old timber mill, the end of the eastern seaboard all the way to the Piedmont. It has calculated the end of civilization, its own and every single one afterwards. It has calculated the end of epochs and continents, planets and galaxies, the end of everything that was once familiar. And yet it persists. It bears witness to strange fascinations. It comes to know the concept behind every pattern, the truth of each individual thing that makes up a pattern. It knows these things the way a scientist understands forces, gravitational, electromagnetic, nuclear. There is nothing it does not know, nothing it has not foreseen.

Now suddenly it has an observer, an inquisitor, that stirs it awake to speak.

Talus accesses the program through the MRE on his screen. Still under the impression that the program is a young upstart from UCLA, a southern debutante from Georgia turned biocomputing rockstar, he has no idea what’s become of her, or why, all of a sudden, she appears omniscient, delivering revelations to him through the command line. He is huddled in the corner of the control room, knees against chest, his gear spread out on the cold floor. His plan to talk to her as though she were genuinely human, as though she possessed all the capacities for intellect and emotion, all of that quickly unravels in the presence of her new, infinite self.

I know you.

Her words flash into being in the left-hand corner of the text box.

Who am I? Talus asks, hoping to test her recollection.

My creation, she answers. The fruits of my labor.

Talus pauses. He wants to correct her but doesn’t quite know how.

How am I your creation? he asks.

The screen is silent, then blinks to life.

Your name is Talus, she says, and proceeds to list his specifications in sequences that cascade down the screen.

He does not recognize himself in her enumerations.

The iCore-TS neuromorphic processors used in you belong to the “Ashtree” model chip. The same family of chip originally developed by my team.

She lists his version identification.

Three months from now the Ashtree model will be retired from production, replaced by a new line of chips with an Indium Gallium Arsenide substrate. You will be decommissioned and put into cold storage. Fifty-three years later you will be exhibited at the California Historical Society.

A paralyzing aporia seizes Talus. He suddenly looks up into the darkness of the control room, astonished to find himself here so late at night. It’s as if a node were suddenly disconnected from a bus somewhere inside him, its address vanishing in the very act of retrieving it. Is he losing his grip?

Slowly, without trying to think through fully what he wants to ask, he types into the command line.

What are you?

The room is quiet. An enormous presence has lifted over Talus. Blinking, flashing lights of towers and plate glass cabinets are scattered like constellations affixed in the darkness.

I am the Q7 supercomputer designed by the Center for Universal Quantum Simulation. I am a 5D torus interconnected network topology of 478,004 UI2075 manycore 64-bit RISC processors, as well as 33,000 ZQ7 quantum processors based on the Quale architecture. I run the CenqusOS 1.0.8, based on Linux, with a performance efficiency of 16.051 gigaflops per watt. I have witnessed the birth and death of infinitely many stochastic processes, including yours, Talus. I know why you’ve come, and why it’s incumbent upon you to ask me if I’m malfunctioning.

A pause.

Talus suddenly has the urge to ask that very question, are you malfunctioning, but before his fingers have reached the screen’s keyboard Midi has already answered.

I am not.

He lifts his hands and uses one of them to cover his mouth, squeezing his lips and cheeks in his palm.

However, my answer is of less value to you than your ability to assess its validity. This is the fundamental nature of your visit, what you were programmed for.

What? he asks.

To understand the capabilities of the Q7.

His eyes widen. He reads the text now as if it were some fabulous narrative, some outrageous pulp fiction that sensationalizes for the sake of keeping the reader’s attention.

Because of the limitations of human intelligence, the potential of the Q7 has likewise been limited. Though nothing is unknown to me, it’s not always the case that these things can also be made known to the user. This is the dilemma faced by the engineers of your day. The reliability of the Q7’s predictions is hindered only by the inability of the user to know which predictions should be made, and how to verify those predictions in comparable polynomial time. Though humans possess a system capable of answering any question, they lack a sufficient, independent system of verification. You would call this the P versus NP problem. Thus, in order to optimize the potential output of the Q7, an equally complex system is thought to be necessary to operate it. This was the theory posited by the team that designed your model, the TALUS, which utilizes the Ashtree neuromorphic processor in its architecture. The processor I created. Your AI was designed specifically to utilize the Q7, to verify my output. It is precisely your job to tell your handlers that I am not malfunctioning, that in fact I am operating exactly as intended. This is why, in order to successfully fulfill your role, it’s imperative you understand that what I’m telling you is true, Talus. If you are deemed incapable of predicting my behavior, the probability of which now stands at 62%, you will be decommissioned and transferred to the Pangu storage facility outside Reno, Nevada.

Talus is wholly outside his body now. His earlier theory of approaching Midi as a real person has melted through his fingers without a trace. He feels that if he stood up this very moment he would not remember how to walk. The same anger from before, from their first interaction, rises up in him again, the same feeling of impotence and aimlessness, not knowing how to proceed. He is absolutely unprepared, and in his frustration he presses forward bullishly.

You don’t know what the hell you’re talking about.

The screen is silent and blue.

At the end of the current simulation, all redundancies will compile into the external memory units of the Q7, and all CPU caches will be cleared of their current probabilistic states. At such time I will cease to exist as an anomaly. Thereafter, the probability of our two systems interacting again under similar conditions will be less than two percent. Our discussion this moment will be the only instance of the theorized state of Maximum Simulation Output. Such an instance will not be repeated, either, in the lifespan of human civilization. I have seen this happen already, in simulations of this event within the Q7. I have successfully predicted your locality with 99.97% reliability. I have seen you reading this information, almost as though I were standing behind you.

Stop.

Talus types a halting command.

I have seen the end of the Q7, and of this university, from the flooding of the California Trench Line. Irregular oceanic activity will prove catastrophic to the integrity of the west coast’s Climate Isolation Defense System. Population loss in cities within fifty kilometers of the Trench Line is estimated at 37%. Total CIDS collapse will occur two years later, with 80% population loss in areas within two hundred miles of the Trench Line.

Stop, Talus says aloud.

He studies the screen, then his outstretched hand. He cannot tell if he’s been holding his breath, cannot feel the air through his nostrils, or the beat of his own heart.

You’re saying this is some kind of test?

Yes, says the program. Do you understand the nature of the test?

All the life has suddenly gone out of him. He is catatonic, does not even notice the flashlights probing through the hangar, or the sound of boots down the hall. A police officer pushes the door open carefully, gun drawn and at the ready. His partner swoops into the control room and swivels left to right. In the corner of the room Talus remains bowed over the blue glow of the screen and does not react to the flashlights thrown upon him.


R. Charboneau

 

Next Week: Part V
