Something I rushed off

Started by Cain, May 30, 2012, 11:47:57 PM


Anna Mae Bollocks

Quote from: v3x on June 12, 2012, 06:44:22 PM
Quote from: The Good Reverend Roger on June 12, 2012, 06:29:39 PM
Quote from: v3x on June 12, 2012, 06:13:11 PM
Unnecessary for the act of carrying forward the torch of sentience and intelligence in a universe full of things that are only dimly aware, if at all. Although I doubt we are strictly necessary for that now, we seem to think we are.

That is not an end in itself.  To exist strictly to exist is not a purpose.

Quote from: v3x on June 12, 2012, 06:13:11 PM
Also, more in line with Cain's OP: If warfare wants to disguise itself as politics and outright war disguises itself as peace, isn't it equal to peace? Even if that peace is borne of the manipulation of thoughts, isn't it preferable to the horrors of war? My line of reasoning with AI isn't to turn this into a simple "are robots good or bad" topic, but to question the ethics of maintaining individual identity at the expense of popular welfare. I don't necessarily disagree with it, but can we say with certainty that evolution isn't pushing us toward a collective "intelligence," artificial or not, that would need to supersede and eventually replace the one we know?

Peace, as Jerry Pournelle pointed out, is something we infer because there are sometimes intervals between wars.  Peace is also not always a desirable thing, given the nature of humans.

And individual identity is what makes us what we are.  Developing that identity is the only worthwhile pursuit of a human being...And is not by any means mutually exclusive with the general welfare of the species.

Lastly, I don't see any indication that we are forming a collective intelligence as a species, local events notwithstanding.

I don't mean a Borg consciousness or anything out of sci fi. What I mean is that systems (political/military/social) are growing larger and more interconnected and interdependent. These systems draw on the intentions of populations worldwide and therefore have a vested interest in shaping those intentions. Media and other propagandist organs grow to shelter the population and feed it the information necessary to grow its intentions and concerns in the desired direction. In some sense this could be referred to as a collective intelligence.

And no, it isn't always mutually exclusive with the general welfare of the species, except when those larger constructs we have created to oversee our general welfare begin to see individualism as a threat. There isn't anything inherently dangerous about being yourself but that doesn't mean the system isn't wired to see it as a threat anyway, for one reason or another.

That's just creepy as fuck. Best to throw as many monkey wrenches as possible.
Scantily-Clad Inspector of Gigantic and Unnecessary Cashews, Texas Division

LMNO

Quote from: The Good Reverend Roger on June 12, 2012, 05:58:53 PM
Quote from: TEXAS FAIRIES FOR ALL YOU SPAGS on June 12, 2012, 05:56:33 PM
Quote from: The Good Reverend Roger on June 12, 2012, 05:55:07 PM
Quote from: TEXAS FAIRIES FOR ALL YOU SPAGS on June 12, 2012, 05:54:13 PM
Yeah, that sounded a lot like "OK, AI, YOU'RE IN CHARGE! ABUSE ME TO DEATH!"

Even if it was entirely benevolent, I'd still take a fucking axe to the mainframe.

Yep.
A benevolent overlord is still an overlord.

It's not even that.  Man is meant to be free, and that means making your own decisions and mistakes.

If you're not free to fail, you're not free.  And if you're not free, then what's the point?

You got some James Tiberius Kirk thinking going on there.  And I like it.

The Good Reverend Roger

Quote from: v3x on June 12, 2012, 06:44:22 PM
I don't mean a Borg consciousness or anything out of sci fi. What I mean is that systems (political/military/social) are growing larger and more interconnected and interdependent. These systems draw on the intentions of populations worldwide and therefore have a vested interest in shaping those intentions. Media and other propagandist organs grow to shelter the population and feed it the information necessary to grow its intentions and concerns in the desired direction. In some sense this could be referred to as a collective intelligence.

And no, it isn't always mutually exclusive with the general welfare of the species, except when those larger constructs we have created to oversee our general welfare begin to see individualism as a threat. There isn't anything inherently dangerous about being yourself but that doesn't mean the system isn't wired to see it as a threat anyway, for one reason or another.

And an AI that leads - directly or indirectly - to our extinction isn't a threat?

Look, the only good things that happen in this world are almost always a result of someone FUCKING UP.  ALL the FUNNY shit in the world is a result of someone fucking up.

I don't want a future in which people don't fuck up GLORIOUSLY, and I don't see an AI God as being helpful in that department.  If we're going to go extinct, I'd just as soon we did it THE OLD FASHIONED WAY.  By fucking up.  All by ourselves.

And even if that fucking up was the creation of said AI, I say we smash the fucking thing on the way off stage.  Because nobody likes a smartass know-it-all. 

One other thought:  An AI designed by humans is about as likely to succeed as a regular operating system developed by humans.  So we build this fucking AI to stop war or whatever, but then spend the next 200 years rebooting the fucking thing due to blue screen o' death.

Fuck that, I get enough frustration out of my laptop.
" It's just that Depeche Mode were a bunch of optimistic loveburgers."
- TGRR, shaming himself forever, 7/8/2017

"Billy, when I say that ethics is our number one priority and safety is also our number one priority, you should take that to mean exactly what I said. Also quality. That's our number one priority as well. Don't look at me that way, you're in the corporate world now and this is how it works."
- TGRR, raising the bar at work.

The Good Reverend Roger

Quote from: LMNO, PhD (life continues) on June 12, 2012, 06:47:59 PM
You got some James Tiberius Kirk thinking going on there.  And I like it.

Really, I just get angry enough to...to PEE, whenever someone starts talking about a grand destiny for the species or for intelligence or whatever.  I am here to do what I do well, enjoy myself a little, and be a damn human.  That's all the destiny I need.

The Good Reverend Roger

Also, someone get me a green chick with antennae.  I'm feeling anxious.

Set your phasers to SHUT UP.

That is all.

Anna Mae Bollocks

Quote from: The Good Reverend Roger on June 12, 2012, 06:49:05 PM
One other thought:  An AI designed by humans is about as likely to succeed as a regular operating system developed by humans.  So we build this fucking AI to stop war or whatever, but then spend the next 200 years rebooting the fucking thing due to blue screen o' death.

Fuck that, I get enough frustration out of my laptop.

THIS

People can't even build a coffee vending machine that can get the coffee into the cup. I mean, it might have happened, by accident, at some bus station, somewhere. But I've never seen it.

tyrannosaurus vex

Quote from: The Good Reverend Roger on June 12, 2012, 06:49:05 PM
Fuck that, I get enough frustration out of my laptop.

Mechanical/computerized AI would exist and act for its own, possibly unknowable by us, reasons. Natural AI, or the result of all of us Humans fucking up in a global symphony of ignorance and fear, is exactly what we have already and what we are adding to every time we try to change something.

Oh, and you must be using the wrong OS:

(Just because, obviously, this must be posted now.)
Evil and Unfeeling Arse-Flenser From The City of the Damned.

The Good Reverend Roger

Quote from: v3x on June 12, 2012, 06:55:24 PM
Mechanical/computerized AI would exist and act for its own, possibly unknowable by us, reasons.

Then fuck them.  They can create themselves if they want to be all cryptic and shit.  THEN I'll be impressed.



Quote from: v3x on June 12, 2012, 06:55:24 PM
Natural AI, or the result of all of us Humans fucking up in a global symphony of ignorance and fear, is exactly what we have already and what we are adding to every time we try to change something.

Oh, and you must be using the wrong OS:

(Just because, obviously, this must be posted now.)

A red X?  My point is proven.

Also, the "symphony of ignorance and fear" just happened to give us Michelangelo, da Vinci, Galileo, Feynman, Einstein, and Johnny Cash.

The Good Reverend Roger

One more thing:  341 people were injured in 2004 when they attempted to iron clothes that they were wearing.

Let's see your fancy mechanical AI do THAT.


The Good Reverend Roger

Quote from: Cain on May 30, 2012, 11:47:57 PM
Regardless of scenario, it is fair to say we have now entered the last great age of human warfare.

I think that's taking a few things for granted, Cain.

minuspace

Quote from: The Good Reverend Roger on June 12, 2012, 05:02:59 PM
Quote from: Elder Iptuous on June 12, 2012, 04:46:26 PM
Quote from: Twiddlegeddon on June 11, 2012, 11:06:02 PM
This thread gave me an interesting image. Say all of the major powers have their own skynet for lack of a better term. Except that since these skynets are programmed to protect the interests of the nation state that owns it- a scenario where increasingly the skynets focus almost exclusively on each other. A state of war with no physical violence but rather constant attempts for the ai generals to sabotage each other through viruses and other cyberattacks.

I was just thinking the same thing on the way to work this morning.
The papers would say "WAR!", and nothing visible would happen... 
the AIs constantly thwarting each other's attempts to deal physical damage while attempting to do the same.
the frantic conflict fought autonomously in bits and packets would rage on as it slipped from public memory over the years, people going about their daily lives happily.  The human generals responsible for overseeing the war increasingly relegating the tasks to the  category of routine, mundane, and ultimately, ignorable, until, even they forget about it.
then....BREAKTHROUGH! and the bombs would drop. and one half of the planet becomes uninhabitable.
and nobody knows why.

Why use bombs, though?  Just shut off power to sanitation plants, hospitals, and traffic signals, etc.  Shut down communications.  Re-route shipping manifests to keep food out of cities.

Computers wouldn't wage war like we do.  There's no monkey urge to kill violently.  Just the imperative to eliminate.

Sources indicate that a large metropolis would descend into chaos within 72 hours of having communications and utilities shut down...  The AIs would just leave us to do the dirty work on ourselves.  Otherwise, my main preoccupation is not what it would do; as Trip brought up, the real problem is it gaining full privileges.  The problem is not AI so much as it is encryption.  Once the machine gains access, I would almost rather it pull the trigger than have the distorted imagination of a human devise some twisted method of elimination.

The Good Reverend Roger

Quote from: LuciferX on June 12, 2012, 07:18:48 PM
Sources indicate that a large metropolis would descend into chaos within 72 hours of having communications and utilities shut down...

Your sources are fucked.  The Great New York Blackout of 1977 led to chaos in less than 24 hours, with looting, arson, and general disorder starting at nightfall.


tyrannosaurus vex

Quote from: The Good Reverend Roger on June 12, 2012, 06:58:16 PM
Also, the "symphony of ignorance and fear" just happened to give us Michelangelo, da Vinci, Galileo, Feynman, Einstein, and Johnny Cash.

I'm not arguing for the elimination of individual identity or great fuckups/failures and associated great art and experience, I'm saying the system we already have in place is moving in that direction. For every Johnny Cash there are 2 or 3 real-life Enrico Salazars. I know you respect that you can't have the good without the bad - but many people don't respect that, or don't even have any concept of it at all.

So while you and I can admire the best of Humanity while appropriately appreciating, remembering, and staying mindful of the worst, millions of people on the planet either are incapable of realizing the unbreakable bond between these or are willing to sacrifice what they see as "good" to eliminate what they see as "evil." And there are enough of those people to tip the scales in their favor.


The Good Reverend Roger

Quote from: v3x on June 12, 2012, 07:29:58 PM
I'm not arguing for the elimination of individual identity or great fuckups/failures and associated great art and experience, I'm saying the system we already have in place is moving in that direction. For every Johnny Cash there are 2 or 3 real-life Enrico Salazars.

I'm gonna go out on a limb here and say that Ghaddafy made dictatorship look pretty damn stylish.

In fact, as individuals go (morals aside), he was one of the KINGS.  If we HAVE to have bad guys, and we do, we need a few more like him.  He was a murdering shit, but he had a sense of humor.

He had to be killed, of course.  For him, EVERY NIGHT was SATURDAY NIGHT, and he never knew when to quit.  He didn't just have a good time, he had a VERY good time, and that had to be stopped.

Mesozoic Mister Nigel

Quote from: Elder Iptuous on June 01, 2012, 09:20:43 PM
I would assume there would be some primary motivator for the AI.
ours (as an intelligence) is reproduction. all else is to support that, ultimately. It seems to be very tightly coupled to the  physical chemical gradient motivation that is underneath the layer of intelligence.
the AI needs to have 'protect humanity' as its prime motivator.
a computer's prime motivator is simply 'make the electrons go through the circuit' and we just arrange the circuit to give the result we want while it pursues that.
in making an intelligence out of a computer, we need to couple the 'simple circuit' motivation tightly to the 'protect people' motivation.
of course it can still break just like we do when we occasionally follow the 'chemical gradient' motivation and end up killing ourselves in the head before we reproduce.
if it is a broken AI that we fear, rather than design that destroys us, then redundancy could save us.

Remember "With Folded Hands"?

http://en.wikipedia.org/wiki/With_Folded_Hands
"I'm guessing it was January 2007, a meeting in Bethesda, we got a bag of bees and just started smashing them on the desk," Charles Wick said. "It was very complicated."