Monday, February 23, 2009

Once Science Eliminates Pain ...

The bored or overly curious may check out my latest neurological dysfunction (aka work of fiction) "The Last Aphrodisiac", at

http://goertzel.org/fiction.htm#pain


What happened was, I was driving late at night listening to a Morphine CD in the car, then got home, lay in bed and fell asleep with the song "Cure for Pain" in my head.

I had a number of dreams on the theme (what if pain were really eliminated, in some interesting sense? what would life be like? what if it were rediscovered?) and woke up plagued by this story. On a cross-country flight to a weekend workshop on "Evaluation and Metrics for Human-level AI", I decided to write it down...

At first I thought it would take a single page to write down, but it wound up being 15 pages, and the punchline doesn't start to unfold till page 7 or 8.

This is the first story I've written in a long time that doesn't involve AI in any serious way. Rather, it uses future tech like uploading-to-superhuman-form and cranial jacks to enlarge upon certain aspects of human relationships, especially romantic ones. It's probably the closest thing to a maudlin love story I'll ever write (well, I hope so).

Ahh, the things that can transpire between a man, a woman, and an illicit cranial jack modification device... ;-)

12 comments:

Anonymous said...

And what would Ben dream (and write) about after listening to this "Tesla geeky song"?
http://tinyurl.com/c2dqvw

Anonymous said...

Interesting story, thanks for sharing. The bit about transcension is very believable.

However, I'm not so sure it's possible to eliminate pain. In your story, the functional benefits of pain are replaced by the brain jack and related software, in that it sends 'signals' that a risk of damage is present. There are two possibilities then for how this manifests in your fictional world:

1) The body acts reflexively in response to those signals (you have no choice about using lubricant during sex once the friction gets too great) or

2) It doesn't. In this case, one would have the choice to incur damage or not. To extend the current analogy in a graphic way, you can either use lubricant or blood, but you're not stopping either way, because why would you stop something so pleasurable? Obviously the story concurs here, since Niko never stopped sex until he could feel pain thanks to the hacked brain jack.

There are problems no matter which you choose. If the body must act reflexively to those warning signals, then functionally it is exactly the same as pain, minus the subjective experience. I don't know how you feel about zombies, but that's what that amounts to. Personally, I don't believe in zombies (in the philosophy I endorse, subjective experience supervenes on brain activity). From what I understand of your thoughts on consciousness (panpsychism), you don't believe in zombies either.

If on the other hand one can choose to ignore the signals from the brain jack, then it follows that people would go around damaging themselves without feeling the pain involved, unless they chose to heed that annoying "what you're doing might cause damage" signal... and what's the fun in that? :-] More importantly, that's quite a different scenario from the quasi-utopian vision you create in this story.

Anonymous said...

One other thought... interesting the parallels between transcension and Buddhist ideas of enlightenment. In both cases you have an expansion of the ego beyond the individual body. The structure of the ego (such as it exists in the realm of information) is still there in some form but now the boundaries between it and the world are difficult to discern. Transcension is a kind of technologically-mediated enlightenment.

Interesting version of Bodhisattvas too. :-]

Drconfused said...

Haven't read your story yet, but looking at the synopsis I was reminded of a childhood friend who did not feel pain. One day he somehow climbed his house with his tricycle and, from the peak of the house, rode straight off. He was all busted up and didn't even feel the pain.

Ben Goertzel said...

Terren: I understand your point, but the pain-free folks in my story were not supposed to be zombies... the hypothesis of the story is rather that one could engineer things so that a different species of qualia corresponds to "the signals that now lead to pain." These different qualia would still serve the same purpose but wouldn't hurt!

I am not quite sure this makes sense, but anyway it's a different hypothesis than a "pain zombie" that gets the "signals that now lead to pain" but doesn't feel any qualia associated with them...

Anonymous said...

There was an episode of Farscape (a sci-fi TV show from some years back) where the antagonist could voluntarily "rewire" himself to feel pleasure when he would normally feel pain.

Anonymous said...

Hi Ben,

I see. Well, my point about zombies, I think, still has relevance, in the following sense. If you think it's possible to have a different species of 'pain' qualia that doesn't hurt but nonetheless performs the same functional role, that removes you from identity, functionalist, and supervenience explanations of qualia/consciousness, which means you leave the door open to the existence of p-zombies. It commits you, in other words, to dualism (at least from the perspective of the story). The laws of qualia operate independently somehow from the laws of the physical substrate.

Ben Goertzel said...

Terren: Yes, you caught me! I suppose I am a bit of a dualist ... I suspect the domain of qualia does have dynamics that are distinct from, though coupled with, the dynamics of the physical world...

Anonymous said...

That surprises me, considering you're an AGI researcher! So you don't see qualia/consciousness as a requirement for general intelligence? Because I don't see how you could hope to engineer something that could have subjective experience if you believe the dynamics of qualia are, in any way, independent from the substrate.

I'm also struggling to imagine what 'dynamics' are if not embodied in the physical.

Ben Goertzel said...

Terren: I see qualia as necessary for general intelligence ... but I think every physical entity is attached to some qualia.

I don't think qualia are SUFFICIENT for general intelligence though...

I think a rock has qualia, but has very little general intelligence ... and, I suspect a rock's qualia are not very intense ...

I'm not sure why you find it vexing that nonphysical entities might change over time in a patterned way?

I tend to view the "physical world" as a model that minds have constructed in order to make sense of a certain subset of their experience...

Anonymous said...

Ben,

I'm more interested in whatever engineering concerns you might have regarding your statement that qualia are necessary for general intelligence than in your philosophy. Specifically, in what ways have your designs (Novamente & OCP) been influenced by the need for qualia?

My question about dynamics was clunky. What I mean is: how could we, even in principle, understand the dynamics of qualia if they are distinct from the dynamics of the physical world? The only way I can see is introspection...

Ben Goertzel said...

Terren:

Well, if your goal is to create an AGI system with a certain functionality, I don't think qualia are something you really need to worry about. Just build the system with functionality in mind, and it will have qualia, of a type corresponding to its particular structure and dynamics.

OTOH, if your goal is to create an AGI system with a certain type of experience, then of course you need to think about qualia explicitly ... to try to figure out what sorts of structures and dynamics will correspond to the types of experiences you want....

In terms of my own work, I'd like to create an AGI system that has a unified stream of awareness vaguely similar to what a human has; and that has a richer experience of self-awareness than humans do, via greater proprioception into its mental mechanism. This has provided extra impetus for me to take a sort of Baars-ish "Global Workspace" approach to consciousness in my AGI designs, and to make sure the system can easily capture information about its own structure and dynamics within its consciousness.
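
Just to make that concrete: here's a minimal, purely illustrative sketch of the kind of Global Workspace loop I have in mind, with a self-monitoring channel. The names and structure are hypothetical toy choices for this blog comment, not actual Novamente/OpenCog code.

import random

class Specialist:
    """A narrow process that proposes content for the workspace."""

    def __init__(self, name):
        self.name = name

    def propose(self, broadcast):
        # Salience is random here; a real system would compute it
        # from goals, novelty, context, etc.
        return {"source": self.name,
                "content": f"{self.name} reacting to {broadcast}",
                "salience": random.random()}

class GlobalWorkspace:
    """Toy Baars-style workspace: specialists compete, the winner is broadcast."""

    def __init__(self, specialists):
        self.specialists = specialists
        self.history = []   # the system's record of its own broadcasts,
                            # available for introspection by specialists

    def step(self, last_broadcast):
        proposals = [s.propose(last_broadcast) for s in self.specialists]
        winner = max(proposals, key=lambda p: p["salience"])
        self.history.append(winner)   # self-model data the system can inspect
        return winner["content"]      # broadcast back to all specialists

if __name__ == "__main__":
    ws = GlobalWorkspace([Specialist("vision"),
                          Specialist("planning"),
                          Specialist("self-monitor")])
    broadcast = "initial percept"
    for _ in range(3):
        broadcast = ws.step(broadcast)
        print(broadcast)

The point of the "history" list is just to illustrate the second design goal above: the workspace's own activity is itself content the system can attend to.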

But all in all, considerations about qualia have played a minor role in my AGI work -- at most they've been high-level design constraints, not telling me anything useful about the critical micro- and mid-level aspects of my AGI designs.

This is because we have little knowledge about qualia that is rigorous in the modern sense. With so little rigorous knowledge, the correspondence between qualia and physical constructs is not that useful for designing physical constructs.

Various Oriental and other wisdom traditions have accumulated a lot of knowledge about qualia; but due to the different cultures and conceptual frameworks in which this knowledge was expressed, it's not very easy to connect it to modern mathematical, scientific, or engineering frameworks....