
I'm making a religion based on Emergence.

Started by Kai, July 04, 2009, 04:57:41 PM

Bebek Sincap Ratatosk

Quote from: James Semaj on October 26, 2010, 05:23:53 AM
Quote
I know a guy who knows a guy.

You'll be fine.

Oh, that sounds promising. What about the Rat Thing?

Leave me out of your perversions.

Rat
- I don't see race. I just see cars going around in a circle.

"Back in my day, crazy meant something. Now everyone is crazy" - Charlie Manson

Kai

Quote from: LuciferX on October 26, 2010, 05:31:57 AM
:fap:
I think we should take "making" out of the title...

I think we should take "LuciferX" out of PD.com.

And by "take" I mean "knee in the balls".

If there is magic on this planet, it is contained in water. --Loren Eiseley, The Immense Journey

Her Royal Majesty's Chief of Insect Genitalia Dissection
Grand Visser of the Six Legged Class
Chanticleer of the Holometabola Clade Church, Diptera Parish

minuspace

Quote from: Kai on October 28, 2010, 11:45:23 PM
Quote from: LuciferX on October 26, 2010, 05:31:57 AM
:fap:
I think we should take "making" out of the title...

I think we should take "LuciferX" out of PD.com.

And by "take" I mean "knee in the balls".

Based on how you kneed balls?

Requia ☣

Quote from: Liam on October 29, 2010, 09:39:08 PM
Quote
Transhumanists tend to be too optimistic about the whole thing.

I'm all up for it, but the sheer amount of redundancy that would need to be built into the whole system, if it's to stand any chance against entropy, is mindfuckingly amazing and highly improbable at this juncture in time without some amazing strides in science.

I wish life was like an Alastair Reynolds novel.

I don't mean so much the technology holding up, or worries about AIs killing us all for the hell of it.  I'm more worried about humans being human, and dooming us all.  We'll refuse to adapt to a post-scarcity economy and unemployment will hit 90%, or we'll decide weaponized grey goo is a good idea, or some other piece of insanity.
Inflatable dolls are not recognized flotation devices.

Jasper

I'm mostly concerned about our ability to create intelligent machines that are too obedient. 

All the most terrible things ever are accomplished with armies of obedient agents. 

Cain

I'm also worried about too human-like AI, Sig.

Reginald Ret

Quote from: Sigmatic on October 31, 2010, 08:42:50 AM
I'm mostly concerned about our ability to create intelligent machines that are too obedient. 

All the most terrible things ever are accomplished with armies of obedient agents. 
This.
Motherfucking this.
Welcome, brother, to the brotherhood of Andoulism!
I used to call myself an anarchist, back when I thought leaders were to blame for having followers.
But now I know better: it's not the leaders that are bad, it's the followers.
Instead of without leaders === an-archos, I have decided that utopia can only be obtained when we are
without servants/slaves === an-doulos.
All hail Andoulitry!
Lord Byron: "Those who will not reason, are bigots; those who cannot, are fools; and those who dare not, are slaves."

Nigel saying the wisest words ever uttered: "It's just a suffix."

"The worst forum ever" "The most mediocre forum on the internet" "The dumbest forum on the internet" "The most retarded forum on the internet" "The lamest forum on the internet" "The coolest forum on the internet"

minuspace

Quote from: Regret on November 04, 2010, 05:23:59 PM
Quote from: Sigmatic on October 31, 2010, 08:42:50 AM
I'm mostly concerned about our ability to create intelligent machines that are too obedient. 

All the most terrible things ever are accomplished with armies of obedient agents. 
This.
Motherfucking this.
Welcome, brother, to the brotherhood of Andoulism!
I used to call myself an anarchist, back when I thought leaders were to blame for having followers.
But now I know better: it's not the leaders that are bad, it's the followers.
Instead of without leaders === an-archos, I have decided that utopia can only be obtained when we are
without servants/slaves === an-doulos.
All hail Andoulitry!

It likes the idea, then dismisses acceptance thereof because logical inconsistency follows.
If the idea serves reason or my taste, it cannot be andoulos.
Conversely, if it is not andoulos, I then can follow.
Andoulos is /contra/ my diction. :oops:

Jasper

Quote from: Cain on November 04, 2010, 07:19:46 AM
I'm also worried about too human-like AI, Sig.

The evil you know, et cetera.  I strongly anticipate that some kind of machine intelligence will be created eventually; the best thing I can think of is to create an anthropic one that is squeamish about murder, a bit lazy, and not in total control of its own mind.  Like a human.  At least that's the kind of problem you can sort of model and predict.

Quote from: Regret on November 04, 2010, 05:23:59 PM
Quote from: Sigmatic on October 31, 2010, 08:42:50 AM
I'm mostly concerned about our ability to create intelligent machines that are too obedient. 

All the most terrible things ever are accomplished with armies of obedient agents. 
This.
Motherfucking this.
Welcome, brother, to the brotherhood of Andoulism!
I used to call myself an anarchist, back when I thought leaders were to blame for having followers.
But now I know better: it's not the leaders that are bad, it's the followers.
Instead of without leaders === an-archos, I have decided that utopia can only be obtained when we are
without servants/slaves === an-doulos.
All hail Andoulitry!

All that said, the reason I'm actually interested in all this machine consciousness crap is that deep down I really just want a robot butler who knows a little kung fu.

That would pretty much complete me, spiritually and emotionally.

minuspace

Quote from: Sigmatic on November 06, 2010, 04:06:14 AM
Quote from: Cain on November 04, 2010, 07:19:46 AM
I'm also worried about too human-like AI, Sig.

The evil you know, et cetera.  I strongly anticipate that some kind of machine intelligence will be created eventually; the best thing I can think of is to create an anthropic one that is squeamish about murder, a bit lazy, and not in total control of its own mind.  Like a human.  At least that's the kind of problem you can sort of model and predict.

Quote from: Regret on November 04, 2010, 05:23:59 PM
Quote from: Sigmatic on October 31, 2010, 08:42:50 AM
I'm mostly concerned about our ability to create intelligent machines that are too obedient. 

All the most terrible things ever are accomplished with armies of obedient agents. 
This.
Motherfucking this.
Welcome, brother, to the brotherhood of Andoulism!
I used to call myself an anarchist, back when I thought leaders were to blame for having followers.
But now I know better: it's not the leaders that are bad, it's the followers.
Instead of without leaders === an-archos, I have decided that utopia can only be obtained when we are
without servants/slaves === an-doulos.
All hail Andoulitry!

All that said, the reason I'm actually interested in all this machine consciousness crap is that deep down I really just want a robot butler who knows a little kung fu.

That would pretty much complete me, spiritually and emotionally.
I like how it likes what is prudent.

minuspace

Quote from: Liam on November 04, 2010, 08:59:44 AM
Quote
I'm also worried about too human-like AI, Sig.


Transitional post. This translates into:

this right! :lulz: