FFFFUUUUUUUUUU

Started by Jasper, January 14, 2010, 06:47:56 AM


Mesozoic Mister Nigel

If we manage to make robots that have consciousness and reasoning, we damn well better give them emotions, too, so that they'll make stupid decisions based on feelings. Otherwise they'll look at this shit, realize that humans are retarded, and promptly take over the world.
"I'm guessing it was January 2007, a meeting in Bethesda, we got a bag of bees and just started smashing them on the desk," Charles Wick said. "It was very complicated."


Jasper

What would be bad is to have a robot that simply has access to the information "my oil is low" and acts on it. What I want is a robot that "feels" its low oil the same way we feel hunger. The difference doesn't make a great difference for utility, but it is what I want to create. Things like space exploration are reasons to make smart robots, but my real motive is to understand and recreate consciousness. The utilities don't matter to ME, but they matter for things like funding.

Nigel, you are correct. Just as we FEEL hungry, or sad, or ambivalent (sensations emotional or bodily), these sensations provide us with an extremely nuanced, information-rich reality in the context of our own experiences. That's important for conscious experience: sensations, voluntary and involuntary behaviors, inner dialogue, the ability to imagine the sensations of other people, and the ability to emulate minute behavioral information by observing others like us.
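To make the distinction concrete, here's a minimal sketch (every name here is hypothetical, not any real robotics API): the first robot acts directly on the datum, the second gets a graded drive that competes with its other urges, closer to how hunger works.

# Hypothetical sketch of the distinction; not a real robotics API.

class ReactiveRobot:
    """Acts directly on the datum 'oil is low' -- no felt state."""

    def __init__(self, oil_level):
        self.oil_level = oil_level

    def step(self):
        # A bare threshold rule: information in, action out.
        return "refuel" if self.oil_level < 0.2 else "work"


class DriveRobot:
    """Low oil surfaces as a graded, nagging drive that competes with
    other urges -- closer to how hunger feels."""

    def __init__(self, oil_level):
        self.oil_level = oil_level

    def step(self):
        # Discomfort grows continuously as oil drops, rather than
        # flipping on at a threshold.
        discomfort = max(0.0, 1.0 - self.oil_level / 0.5)
        # The drive biases action selection instead of dictating it.
        return "refuel" if discomfort > 0.5 else "work"


for bot in (ReactiveRobot(0.1), DriveRobot(0.1)):
    print(type(bot).__name__, "->", bot.step())

Note that both end up refueling: as I said, the difference doesn't buy much utility. The felt version is the point in itself.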

Elder Iptuous

Is consciousness, as you define it, requisite for belief?

Jasper

The point of all this is: once I have the means to recreate a brainlike synthetic organ, how will I know if it truly FEELS hungry, or if it is just responding to mechanical hunger protocols? Is that what we do? Neuroscience has identified the exact chemical that tells our brain we are hungry (they want to make a weight loss drug out of it). If all we're doing is responding to a neurochemical information system, then why bother with a unified psychosensory experience? Are our brains hallucinating a reality abstracted from sensory data? I would find that neato, but I have doubts. Could it be that our whole sense of awareness evolved out of evolutionary pressure? Possibly. Without conscious experience, we would have trouble existing in social situations. There's no real means to think of "me" as it refers to "the others" without "me" as a highly developed perception.

Jasper

Iptuous, what beliefs are you talking about? The sentence needs more careful wording before I can give a good answer.

The Fundamentalist

I'm not sure that I understand this problem.  After all, isn't it just as impossible to verify that other people have subjective experiences?

As for the importance of communication/linguistics, I do think that it's important. It seems like a reasonable idea to me that consciousness evolved from lying. That is, as soon as we needed to make models of what other people were thinking, we had to think, ourselves.

(I am also very interested in AI.)

NotPublished

hahaha hallucinating robots

"Why was I built to have pain"

And don't forget Robot Religion & Philosophy.
In Soviet Russia, sins died for Jesus.

Jasper

Lying is definitely a skill that developed from evolutionary social pressures. Without the ability to model the thinking of others, how do you lie well? For that matter, how do we empathize? I have heard that autistic people may simply lack these evolved traits. That's how recently our social skills evolved: we still have people in whom the trait isn't dominant.
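A toy way to see why lying needs a model of the other mind (all names and numbers invented, purely illustrative): without the listener's beliefs, the best a liar can do is negate the truth; with them, it can pick the falsehood the listener already finds most credible.

def best_lie(truth, listener_beliefs):
    """Pick the false statement the listener finds most plausible.

    listener_beliefs maps statements to how credible the listener
    finds them -- a crude theory-of-mind model.
    """
    candidates = [s for s in listener_beliefs if s != truth]
    if not candidates:
        return None  # no plausible falsehood available
    return max(candidates, key=lambda s: listener_beliefs[s])


# The listener thinks the cookie is probably still in the jar.
beliefs = {"cookie in jar": 0.6, "cookie in drawer": 0.3, "I ate it": 0.1}
print(best_lie("I ate it", beliefs))  # -> "cookie in jar"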

Mesozoic Mister Nigel

It seems like, once we had conscious robots, muddling the structure of each type of input until it was confusing, conflicting, and sometimes overwhelming would give us a good approximation of "feelings".

If we can get them going "Oh my god I feel kinda crappy; is my oil low or am I just still upset over losing my job or do I need to empty my condensation chamber?" and then make the sensations become more unmanageably overwhelming and difficult to distinguish, as well as decreasing their ability to reason and their fine motor skills as demands on them increase, I think we would just about have it.
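That recipe maps onto a sketch like this (everything here is invented, just an illustration): several internal sensors blend into one vague "crappiness" level, and as load rises, noise swamps the individual sensors, so the robot's guess about why it feels bad becomes unreliable.

import random

def felt_state(oil, mood, condensation, load):
    """Return (overall crappiness, the robot's guess at the cause)."""
    signals = {
        "low oil": 1.0 - oil,
        "upset about the job": 1.0 - mood,
        "condensation chamber full": condensation,
    }
    # One undifferentiated "I feel kinda crappy" level.
    crappiness = sum(signals.values()) / len(signals)
    # Under load, noise swamps the individual signals, so attributing
    # the feeling to a cause becomes unreliable.
    noisy = {k: v + random.gauss(0.0, load) for k, v in signals.items()}
    return crappiness, max(noisy, key=noisy.get)


# Oil is fine, mood is terrible, load is high: the robot feels bad
# but may well blame the wrong thing.
print(felt_state(oil=0.9, mood=0.1, condensation=0.2, load=0.8))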
"I'm guessing it was January 2007, a meeting in Bethesda, we got a bag of bees and just started smashing them on the desk," Charles Wick said. "It was very complicated."


NotPublished

I guess one cool thing about being a conscious machine would be the internet.

Some models have access, some don't.

Mesozoic Mister Nigel

I mean, they'd be depressed all the time and we'd need to invent the robot equivalent of tranquilizers, sleep aids, and antidepressants, but can you imagine functional robots in a dysfunctional human world? They'd have us all neutered or spayed and keep a few of us around as pets.

Which actually might not be so bad.

We should give them an appreciation for art and literature but without any ability to create their own, so they'd have an incentive to keep us and treat us reasonably well.
"I'm guessing it was January 2007, a meeting in Bethesda, we got a bag of bees and just started smashing them on the desk," Charles Wick said. "It was very complicated."


The Fundamentalist

Personally, I think that the bicameral model is a pretty interesting one for the origin of consciousness, although I haven't read the book yet, and I've heard that, if it happened at all, it probably took place long before the author claimed. On the other hand, I might just like Snow Crash too much.

Quote from: The Right Reverend Nigel
If we can get them going "Oh my god I feel kinda crappy; is my oil low or am I just still upset over losing my job or do I need to empty my condensation chamber?" and then make the sensations become more unmanageably overwhelming and difficult to distinguish, as well as decreasing their ability to reason and their fine motor skills as demands on them increase, I think we would just about have it.

Is it wrong that I laughed?

Quote from: Felix
Lying is definitely a skill that developed from evolutionary social pressures. Without the ability to model the thinking of others, how do you lie well? For that matter, how do we empathize? I have heard that autistic people may simply lack these evolved traits. That's how recently our social skills evolved: we still have people in whom the trait isn't dominant.

So we have to make robots that can lie.

Yes, I think that AI research is the place to be for mad science.

NotPublished

Hmm, creating a race of beings that would kill us in the end. I like that. Maybe we did that to our own ancestors: they built us, so we ate them.

Quote from: The Fundamentalist on January 15, 2010, 02:08:19 AM
Quote from: The Right Reverend Nigel
If we can get them going "Oh my god I feel kinda crappy; is my oil low or am I just still upset over losing my job or do I need to empty my condensation chamber?" and then make the sensations become more unmanageably overwhelming and difficult to distinguish, as well as decreasing their ability to reason and their fine motor skills as demands on them increase, I think we would just about have it.

Is it wrong that I laughed?

I think it'd be cute

Jasper

:lol:

I'm all for confused robots, but maybe they should only experience as much emotional turmoil as your average person. The point of emotion is to provide instinctual insurance against bloody-mindedness (read: HAL), so crippling levels of stress and despair are perhaps unneeded.
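In the terms of the sketches above, that cap is just a saturation on the affect signal (hypothetical numbers): enough turmoil to steer a robot away from HAL territory, never enough to cripple it.

def capped_turmoil(raw, human_average=0.5, slack=0.2):
    """Clamp felt turmoil near an ordinary human baseline."""
    return max(0.0, min(human_average + slack, raw))

print(capped_turmoil(0.95))  # crisis level in, bad-day level out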

BabylonHoruv

Quote from: Felix on January 15, 2010, 01:54:24 AM
The point of all this is: once I have the means to recreate a brainlike synthetic organ, how will I know if it truly FEELS hungry, or if it is just responding to mechanical hunger protocols? Is that what we do? Neuroscience has identified the exact chemical that tells our brain we are hungry (they want to make a weight loss drug out of it). If all we're doing is responding to a neurochemical information system, then why bother with a unified psychosensory experience? Are our brains hallucinating a reality abstracted from sensory data? I would find that neato, but I have doubts. Could it be that our whole sense of awareness evolved out of evolutionary pressure? Possibly. Without conscious experience, we would have trouble existing in social situations. There's no real means to think of "me" as it refers to "the others" without "me" as a highly developed perception.

I really doubt that is possible.

Not that it's impossible to make a machine that feels hungry, but it is impossible to know whether it feels hungry or just tells you it feels hungry because that is how it was programmed.
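In sketch form (hypothetical classes, just an illustration): a machine that "really" feels hunger and one merely programmed to report it can expose exactly the same observable behavior, so no external test tells them apart.

class FeelsHungry:
    """Stipulate that this one really feels it, whatever that means."""

    def __init__(self, fuel):
        self.fuel = fuel

    def report(self):
        return "I feel hungry" if self.fuel < 0.2 else "I'm fine"


class SaysHungry:
    """This one just emits the string it was programmed to emit."""

    def __init__(self, fuel):
        self.fuel = fuel

    def report(self):
        return "I feel hungry" if self.fuel < 0.2 else "I'm fine"


# Identical inputs, identical outputs: the difference, if there is
# one, is invisible from outside.
for bot in (FeelsHungry(0.1), SaysHungry(0.1)):
    print(type(bot).__name__, "->", bot.report())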
You're a special case, Babylon.  You are offensive even when you don't post.

Merely by being alive, you make everyone just a little more miserable

-Dok Howl