About two weeks ago I finished my first draft of All You See Is Light, and the very next day I started a new novel, this one a mystery set in a post-Singularity universe. The working title is Anything Goes. I meant it to be a lighthearted comedy, much in the spirit of a Douglas Adams novel, but I guess my humor runs a bit darker than I thought, because it’s already started to take some turns I didn’t expect.
For my friends who aren’t familiar with “the Singularity,” it’s a predicted event that some people take very seriously (and others not so much) in which humanity and our creations become so intertwined that we can no longer tell where the line is between man and machine. It’s also a theoretical point in the future where we lose control of, and can no longer even understand the inner workings of, the devices we’ve created, because they’ve gained control of themselves and their own destiny and have begun to design and build their own machines. Thinking machines creating new machines on their own.
The Terminator movies are a good example of this, though one where the process goes horribly wrong. In my story, it goes in the other direction: we lose control of the artificial intelligences, but instead of hating humanity and wanting to destroy us, they take over and become our over-protective guardians.
So the questions I’m relentlessly exploring in this book are these:
- If we lose control over an omnipotent technology, and it assumes control of everything and strives to keep us from killing ourselves and our environment … is that necessarily a bad thing? Even if it’s extremely annoying?
- Is an exact duplicate of you, actually you?