Friday, November 20, 2009

Happiness and Artificial Intelligence: Sci-Fi Part 2


In my first post on happiness and artificial intelligence, I tried to be fact-based and talked about artificial intelligence programming. In this post, I want to write about artificial intelligence in "literature" (which sounds so much more substantive than in "Science Fiction").

I became interested in this topic because I am reading the 1992 book Mostly Harmless, by Douglas Adams, in which the hero Ford Prefect hotwires a security robot so that it is always happy. The happy robot is so content that it stops chasing him and cheerfully cooperates. This got me thinking: even if people could build a self-aware robot, why would it want to do anything?

The epitome of bored science fiction AI is Marvin the Paranoid Android from The Hitchhiker's Guide to the Galaxy. He is so smart that everything that happens bores him, and he has lived so long that he has seen it all before. [At left is Marvin from the 2005 movie.]

Ray Kurzweil suggests that societies advanced enough to venture to Earth might not be motivated to do so:

"this thinking might explain why we haven't found extraterrestrial life yet: intelligences on the cusp of achieving interstellar travel might be prone to thinking that with the galaxies boiling away in just 10^19 years, it might be better just to stay home and watch TV".

He describes a future dystopia he calls a "societal fixed point," which "might be defined as a state that self-reinforces, remaining in the status quo--which could in principle be peaceful and self-sustaining, but could also be extremely boring--say, involving lots of people plugged into the Internet watching videos forever."


Kurzweil's society has reached such a high state of material prosperity that citizens don't really need to do anything productive.

You are probably familiar with the "Problem of Evil": why does a good God allow suffering in the world? One of the answers is that pain serves a purpose; for example, painful swelling helps a wound heal, and a painful burn helps children avoid touching the stove again. Similarly, sadness, anxiety, paranoia, need, and desire, like hunger and lust, get people up off the couch and doing things.


This shows up in The Matrix, where Agent Smith says:

Agent Smith: Did you know that the first Matrix was designed to be a perfect human world? Where none suffered, where everyone would be happy. It was a disaster. No one would accept the program. Entire crops were lost. Some believed we lacked the programming language to describe your perfect world. But I believe that, as a species, human beings define their reality through suffering and misery. Which is why the Matrix was redesigned to this: the peak of your civilization.

Many people have observed that happiness is temporary, telling stories like "I thought the new red sports car/new boat/new house/new wife would make me happy, but after a few weeks/months, I got bored and wanted something more."