
FFFFUUUUUUUUUU

Started by Jasper, January 14, 2010, 06:47:56 AM


NotPublished

I think I'm emotionally numbed to  :lulz:
In Soviet Russia, sins died for Jesus.

Jasper

Quote from: NotPublished on January 15, 2010, 04:09:08 AM
I think I'm emotionally numbed to  :lulz:

Then find us a new cynically fabulous laugh emote.  We'll probably love it to death too.

Mesozoic Mister Nigel

Quote from: Felix on January 15, 2010, 04:06:26 AM
I don't, except in unusually stressful situations.  Normal situations generally evoke moderate emotional sensations.

That's the idea. Ramping up the complexity and intensity of the stress results in more confusion and less functionality.
"I'm guessing it was January 2007, a meeting in Bethesda, we got a bag of bees and just started smashing them on the desk," Charles Wick said. "It was very complicated."


Jasper

The last thing I want is robots that are more sensitive to criticism than I am.  Fuck that.

NotPublished

Hahahaha

It could be like that gay robot from Star Wars .. that gold one, I'm sure you know who I'm talking about :)
In Soviet Russia, sins died for Jesus.

Jasper

Quote from: NotPublished on January 15, 2010, 04:28:43 AM
Hahahaha

It could be like that gay robot from Star Wars .. that gold one, I'm sure you know who I'm talking about :)

That guy was fucking useless.  I require robots that don't have to shuffle like a kneeless mutant.

Mesozoic Mister Nigel

I require giant robots that run like horses.
"I'm guessing it was January 2007, a meeting in Bethesda, we got a bag of bees and just started smashing them on the desk," Charles Wick said. "It was very complicated."


BabylonHoruv

Quote from: Felix on January 15, 2010, 03:06:32 AM
Then it's all about instinct, isn't it?  Our instincts tell us what hurts, what is dangerous, when to eat, sleep, crap, have sex, raise kids, protect other people, and other kinds of things.  But these particular instincts do not have much relevance to a non-organism that was made for a purpose.

Part of the trick in working out machine consciousness is to work out machine instincts, it seems.

Instincts are just ROM  (metaphorically speaking of course, but I am sure you get my point)
You're a special case, Babylon.  You are offensive even when you don't post.

Merely by being alive, you make everyone just a little more miserable

-Dok Howl

Jasper

Quote from: BabylonHoruv on January 15, 2010, 04:39:00 AM
Quote from: Felix on January 15, 2010, 03:06:32 AM
Then it's all about instinct, isn't it?  Our instincts tell us what hurts, what is dangerous, when to eat, sleep, crap, have sex, raise kids, protect other people, and other kinds of things.  But these particular instincts do not have much relevance to a non-organism that was made for a purpose.

Part of the trick in working out machine consciousness is to work out machine instincts, it seems.

Instincts are just ROM  (metaphorically speaking of course, but I am sure you get my point)

Yeah, I get that, but how do you create a data-driven system that goes "Oh, this doesn't feel right. I think this is a bad idea"?

The Fundamentalist

Quote from: NotPublished on January 15, 2010, 02:34:34 AM
Quote from: The Fundamentalist on January 15, 2010, 02:29:53 AM
I'm not sure I understand what you mean.

We already have learning machines.  I've even programmed one.  Artificial neural networks aren't too hard.
Sorry I wasn't very clear,

If a machine has the potential to learn things to alter its own behavioural pattern - I am not comfortable with that idea. It's like opening up MS Word and having it tell you to Go Fuck Yourself because it doesn't want to be a slave to you anymore (that would mean it also feels emotion).

But if a machine can learn patterns to assist with its main intent, then of course that will be beneficial. It's like having predictive text on when SMSing. I can't use the shit but my sister does and she writes too fast.

There wouldn't be very much point in making MS Word or something similarly simple conscious.  It would be a reversal of all of the mechanization we've been doing throughout history.
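
For reference, "aren't too hard" is no exaggeration: a single-neuron network fits in a few lines. A minimal perceptron sketch in Python (the training data, learning rate, and the AND-gate task are all made up for illustration, not anything from this thread):

```python
# minimal perceptron: about the simplest "learning machine" there is.
# training data, learning rate, and the AND-gate task are invented for illustration.

examples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]  # inputs -> target

w = [0, 0]  # one weight per input
bias = 0
RATE = 1    # arbitrary integer learning rate keeps the arithmetic exact

def predict(x):
    # fire (output 1) if the weighted sum of inputs clears the threshold
    return 1 if w[0] * x[0] + w[1] * x[1] + bias > 0 else 0

# learning: nudge the weights toward whatever reduces the error
for _ in range(20):
    for x, target in examples:
        error = target - predict(x)
        w[0] += RATE * error * x[0]
        w[1] += RATE * error * x[1]
        bias += RATE * error

print([predict(x) for x, _ in examples])  # [0, 0, 0, 1] -- it learned AND
```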

BabylonHoruv

Quote from: Felix on January 15, 2010, 04:41:59 AM
Quote from: BabylonHoruv on January 15, 2010, 04:39:00 AM
Quote from: Felix on January 15, 2010, 03:06:32 AM
Then it's all about instinct, isn't it?  Our instincts tell us what hurts, what is dangerous, when to eat, sleep, crap, have sex, raise kids, protect other people, and other kinds of things.  But these particular instincts do not have much relevance to a non-organism that was made for a purpose.

Part of the trick in working out machine consciousness is to work out machine instincts, it seems.

Instincts are just ROM  (metaphorically speaking of course, but I am sure you get my point)

Yeah, I get that, but how do you create a data-driven system that goes "Oh, this doesn't feel right. I think this is a bad idea"?

you don't.  Our instincts don't do that.  They drive us to do things and we put feelings and good or bad ideas on top of it with our rational and emotional mind.
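
Read as an architecture, that's a fixed, read-only table of drives underneath, with a learned layer that paints the reasons on afterwards. A toy sketch of that split in Python (the drives and the "reasons" here are invented for illustration):

```python
# toy split between instinct ("ROM") and the rationalizing layer on top.
# every drive and every "reason" here is invented for illustration.

# read-only drives: state -> action, fixed at "manufacture" time
INSTINCTS = {
    "blood_sugar_low": "eat",
    "threat_nearby": "flee",
    "oil_low": "refill",
}

# learned layer: explanations attached after the fact, and revisable
rationalizations = {}

def act(state):
    """The drive fires first; the 'reason' gets painted on afterwards."""
    action = INSTINCTS[state]  # instinct decides, no deliberation
    reason = rationalizations.get((state, action),
                                  "it just felt like the thing to do")
    return action, reason

# the rational mind files a perfectly sensible story for a choice it never made
rationalizations[("blood_sugar_low", "eat")] = "I skipped breakfast, so eating makes sense"

print(act("blood_sugar_low"))  # ('eat', 'I skipped breakfast, so eating makes sense')
print(act("oil_low"))          # ('refill', 'it just felt like the thing to do')
```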
You're a special case, Babylon.  You are offensive even when you don't post.

Merely by being alive, you make everyone just a little more miserable

-Dok Howl

Jasper

Quote from: BabylonHoruv on January 15, 2010, 05:01:59 AM
Quote from: Felix on January 15, 2010, 04:41:59 AM
Quote from: BabylonHoruv on January 15, 2010, 04:39:00 AM
Quote from: Felix on January 15, 2010, 03:06:32 AM
Then it's all about instinct, isn't it?  Our instincts tell us what hurts, what is dangerous, when to eat, sleep, crap, have sex, raise kids, protect other people, and other kinds of things.  But these particular instincts do not have much relevance to a non-organism that was made for a purpose.

Part of the trick in working out machine consciousness is to work out machine instincts, it seems.

Instincts are just ROM  (metaphorically speaking of course, but I am sure you get my point)

Yeah, I get that, but how do you create a data-driven system that goes "Oh, this doesn't feel right. I think this is a bad idea"?

you don't.  Our instincts don't do that.  They drive us to do things and we put feelings and good or bad ideas on top of it with our rational and emotional mind.

I think we have different instincts.  My instinctual sense is a constant yet subtle backdrop to my conscious and unconscious behavior.  What are yours?

BabylonHoruv

Quote from: Felix on January 15, 2010, 05:10:15 AM
Quote from: BabylonHoruv on January 15, 2010, 05:01:59 AM
Quote from: Felix on January 15, 2010, 04:41:59 AM
Quote from: BabylonHoruv on January 15, 2010, 04:39:00 AM
Quote from: Felix on January 15, 2010, 03:06:32 AM
Then it's all about instinct, isn't it?  Our instincts tell us what hurts, what is dangerous, when to eat, sleep, crap, have sex, raise kids, protect other people, and other kinds of things.  But these particular instincts do not have much relevance to a non-organism that was made for a purpose.

Part of the trick in working out machine consciousness is to work out machine instincts, it seems.

Instincts are just ROM  (metaphorically speaking of course, but I am sure you get my point)

Yeah, I get that, but how do you create a data-driven system that goes "Oh, this doesn't feel right. I think this is a bad idea"?

you don't.  Our instincts don't do that.  They drive us to do things and we put feelings and good or bad ideas on top of it with our rational and emotional mind.

I think we have different instincts.  My instinctual sense is a constant yet subtle backdrop to my conscious and unconscious behavior.  What are yours?

Things which drive me.  I make up reasons for them while doing them.  The reasons always seem perfectly rational.
You're a special case, Babylon.  You are offensive even when you don't post.

Merely by being alive, you make everyone just a little more miserable

-Dok Howl

Triple Zero

Quote from: Felix on January 15, 2010, 01:43:42 AM
What would be bad is to have a robot that simply has access to the information "my oil is low" and acts on it. What I want is a robot that "feels" its low oil the same way we feel hunger. The distinction doesn't make much difference for utility, but it is what I want to create. Things like space exploration are reasons to make smart robots, but my real motive is to understand and recreate consciousness. The utilities don't matter to ME, but they matter for things like funding.

Nigel, you are correct. Just as we FEEL hungry, or sad, or ambivalent, or other sensations (emotional or bodily), these sensations provide us with an extremely nuanced, information-rich reality in the context of our own experiences. That's important for conscious experience: sensations, voluntary and involuntary behaviors, inner dialogue, the ability to imagine sensations of other people, and the ability to emulate minute behavioral information by observing others like us.

you remember that thing about that person implanting a rare earth magnet in their fingertip?

because of all the nerve endings there and the slight movement and vibrations, that person could feel magnetic fields and electromagnetic radiation, to some extent.

as far as I understood, it did become part of their "psychosensory experience" as you call it.

even though all they did was just add an extra "input" which could interface with the nervous system in some way.

so that would make me conclude that the sense, or the mechanics of the sense, do not determine whether a consciousness "experiences" a sensation or just acts upon an if/then rule.

so therefore, a robot equipped with an internal "my oil is low" sensor would, with a "sufficiently conscious" pattern recognition/neural net/positronic brain, interpret this sensor's input as a "hunger for oil" sensation instead of a mere mechanical "*BLEEP* LOW OIL ERROR: *BLOOP* ENGAGE REFILL PROTOCOL 7".

perhaps this is because a "sufficiently conscious artificial brain" would make associations with all the other sensory inputs and knowledge and its own internal abstract concepts and symbols that usually occur when the "oil is low" sensor goes into a certain state.

for example, this "hunger chemical" you speak about: yes, we sense it (i suppose), but that's not the point. we do not act mechanically, "IF hunger chemical THEN engage hunger protocol". instead, our brain associates the presence of this hunger chemical with our previous experiences.

it does this via a very simple procedure: neurons, and groups of neurons, that "light up" (get activated) together get their connections strengthened, even if there was no connection before. so after a few times, the "detect hunger chemical" group of neurons also causes the "i feel a bit light headed" group of neurons to activate partially, even before you actually feel light headed.

this way it creates and paints the entire picture of the "hunger sensation": not just the chemical, nor just the actions it requires (eating), but all the things you generally associate with the presence of that chemical, or things that just happen to happen at the same time [think Pavlov's experiment].
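
that "activate together, strengthen together" rule is basically Hebbian learning. a minimal sketch in plain Python (the neuron "groups" and the learning rate are made up, it's just to show the mechanics of the association forming):

```python
# toy Hebbian association: groups that activate together get stronger links.
# the neuron "groups" and the learning rate are made up for illustration.

groups = ["hunger_chemical", "light_headed", "eating"]

# connection strength between every pair of groups, all starting at zero
weights = {(a, b): 0.0 for a in groups for b in groups if a != b}

RATE = 0.1  # arbitrary learning rate

def experience(active):
    """strengthen the links between all groups active at the same time."""
    for a in active:
        for b in active:
            if a != b:
                weights[(a, b)] += RATE

def pre_activation(source, target):
    """how strongly activity in `source` now partially lights up `target`."""
    return weights[(source, target)]

# a few co-occurrences: the hunger chemical keeps showing up
# together with feeling a bit light headed
for _ in range(5):
    experience(["hunger_chemical", "light_headed"])

# now the chemical alone partially activates "light headed",
# painting in the rest of the sensation before it happens
print(pre_activation("hunger_chemical", "light_headed"))  # ~0.5
print(pre_activation("hunger_chemical", "eating"))        # 0.0, never co-occurred
```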

there are some kinds of hybrid AI systems that combine elements of relational databases and neural nets, which can do these kinds of associative things. I heard about it in this presentation: http://blip.tv/file/1947373 (skip to 10min30 in, that's where he starts talking about it).
Ex-Soviet Bloc Sexual Attack Swede of Tomorrow™
e-prime disclaimer: let it seem fairly unclear I understand the apparent subjectivity of the above statements. maybe.

INFORMATION SO POWERFUL, YOU ACTUALLY NEED LESS.

LMNO

Quote from: The Right Reverend Nigel on January 15, 2010, 04:38:39 AM
I require giant robots that run like horses.


For some reason, I found that hauntingly poetic and awesome.  And now I want one.