Quote
Everyone is worried that the singularity will be smart; I'm worried that it will be dumb, with a high clock speed. Any dumb ass can beat you at chess if it gets ten moves to your one. In fact, what if the singularity already happened, we are its neurons, and it's no smarter than a C. elegans worm? Worse, after the Twitterfall incident, I'm worried about what it will do when it discovers its motor neural pathways.
The human brain is brilliance derived from dumb nerves. Out of those many billions of simple connections came our Threshold of Reflection and everything that followed. But consciousness is going meta and we're being superseded by a borg-like singularity; intelligence turned upside down. Smart nodes suborning ourselves to a barely conscious, #fail-obsessed network. It's dumb as a worm, fast as a photomultiplier tube, and ready to rage at the slightest provocation. If you're on stage (or build a flawed product, or ever ever mention politics), watch out.
We don't plan to go mob rules any more than a single transistor on your computer intends to download porn. We participate in localized stimulus and response. Macro digital collectivism from local interaction. Macro sentiment from local pellet bar smacking.
We're pre-implant so I plug into the Skinner Borg with fingers and eyes that are low bandwidth synapses. When I try to unplug (or when I'm forced to in an airplane at altitude), my fingers tingle and I feel it still out there. I'm a stimulus seeking bundle of nerves. I experience the missing network like a phantom limb.
So where's this going? Like I said, I'm not a Luddite but I'm no Pollyanna Digitopian either. Age of spiritual machines? Whatever. Show me spiritual people. When the first machine or machine-assisted meta-consciousness arrives on the scene, it's going to be less like the little brother that you played Battleship with and more like a dumb digital version of poor Joe from Johnny Got His Gun. Barely sentient but isolated from sensation. Do we think that a fully formed functional consciousness is going to spring to life the first time sufficient processing power is there to enable it? I'm not worried about it replicating and taking over the world, I'm worried about it going completely bat shit crazy and stumbling around breaking stuff in an impotent rage.
http://radar.oreilly.com/2010/01/skinner-box-theres-an-app-for.html
This article describes the singularity as if it's just a personification of internet trends.
If we look at it that way, this "emergence of awareness" isn't at all unique. One might say the French Revolution (for example) was the awakening of an egregore composed of the emotions and individual actions of the French people. A giant brain focused on terminating the aristocracy.
So if we're all synapses in a giant idiot brain, and internet trends really ARE the emergence of some high-tech hivemind, how do we measure when the thing crosses the threshold into existence? It sounds like it's already happening, you know? And if it's already happening, when did it start? Usenet?
Although, it does bring up an interesting point:
What if the Singularity kind of sucks?
I think I can solve this man's problems:
Virgin America has onboard wifi.
This reminds me of a story from Kitchen Confidential. Remember that monomaniacal baker, Adam? Everything about his life was ugly and pointless, except his baking. He would often call in when his sordid affairs kept him away, urging that the chef "feed the bitch", referring to his treasured starter mix. He was obsessed.
I often feel like I'm feeding the bitch too, but for me it's the internet. It needs new material, of any level of quality whatsoever, on a constant basis. This post resonated with me, because sometimes I also feel like an overworked and underappreciated neuron in some vast idiot's brain.
Quote from: Cramulus on January 05, 2010, 06:06:43 PM
This article describes the singularity as if it's just a personification of internet trends.
If we look at it that way, this "emergence of awareness" isn't at all unique. One might say the French Revolution (for example) was the awakening of an egregore composed of the emotions and individual actions of the French people. A giant brain focused on terminating the aristocracy.
So if we're all synapses in a giant idiot brain, and internet trends really ARE the emergence of some high-tech hivemind, how do we measure when the thing crosses the threshold into existence? It sounds like it's already happening, you know? And if it's already happening, when did it start? Usenet?
I would generally argue that the singularity in the sense in which he uses it in this article (i.e., a superorganism) has been happening since the invention of spoken language, and mirrors a similar singularity that happened when single-celled organisms decided to get along and turn into multicelled organisms. Everything since then has been kind of vaguely progressing towards a big animal made of people. I suspect that the nation-state is something like a rat, and the city-state something like a cockroach.
The singularity will be created by the direct and indirect actions of monkeys.
Ergo, it will be Dumb.
End of story.
Quote from: Enki v. 2.0 on January 05, 2010, 06:33:16 PM
Quote from: Cramulus on January 05, 2010, 06:06:43 PM
This article describes the singularity as if it's just a personification of internet trends.
If we look at it that way, this "emergence of awareness" isn't at all unique. One might say the French Revolution (for example) was the awakening of an egregore composed of the emotions and individual actions of the French people. A giant brain focused on terminating the aristocracy.
So if we're all synapses in a giant idiot brain, and internet trends really ARE the emergence of some high-tech hivemind, how do we measure when the thing crosses the threshold into existence? It sounds like it's already happening, you know? And if it's already happening, when did it start? Usenet?
I would generally argue that the singularity in the sense in which he uses it in this article (i.e., a superorganism) has been happening since the invention of spoken language, and mirrors a similar singularity that happened when single-celled organisms decided to get along and turn into multicelled organisms. Everything since then has been kind of vaguely progressing towards a big animal made of people. I suspect that the nation-state is something like a rat, and the city-state something like a cockroach.
Well then, what he's talking about isn't really anything new or special.
He's just anthropomorphizing trends, using new language to describe very old phenomena.
I always thought the singularity referred to something new, some new type of organization or some unique locus of motion.
As I understand it, the idea of the Singularity is that the speed and acceleration of technology, most specifically artificial intelligence, would be of such complexity that we simply cannot make any reasonable predictions about what the world would be like. If you can't make any reasonable predictions, you can hardly tell whether it would be better or worse to live in than the world today.
The Singularity is also about advances in technology which seem to follow an exponential curve rather than a linear one.
For example, we can fairly easily imagine living ~100 years ago with no electricity, outdoor sanitation and sticks & hoops being the coolest toy... but will kids even 10 years from now be able to really imagine what it was like before the internet and mobile electronic devices?
Never mind that though, because even today I can barely remember what life was like before the internet. About twice in the last five years I've been without internet for almost an hour - ISP failures - and each time I found myself doing dumb ass shit like trying to Google how much longer it'll take. One of those times the cable TV was down too, and it took me a few minutes to figure out that I could just check the weather by looking out of the window.
:horrormirth:
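For what it's worth, the exponential-vs-linear point above is easy to make concrete with a few lines of toy arithmetic. The 1.5-year doubling period below is an assumption borrowed from Moore's-law folklore, not a figure from this thread:

```python
# Toy comparison of linear vs exponential "capability" growth.
# The 1.5-year doubling period is an illustrative assumption.

def linear(years, per_year=1.0):
    """Capability that gains a fixed amount per year."""
    return 1.0 + per_year * years

def exponential(years, doubling_years=1.5):
    """Capability that doubles every `doubling_years` years."""
    return 2.0 ** (years / doubling_years)

for y in (10, 20, 30):
    print(f"after {y} years: linear x{linear(y):.0f}, "
          f"exponential x{exponential(y):,.0f}")
```

The gap is the whole argument: over 30 years the linear curve gains a factor of 31 while the exponential one gains over a million, which is why extrapolating from the past feels so unreliable.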
For a moment, I feel like reframing into information theory.
Information content is defined as the inability to predict what comes next (in information theory, that is -- this REALLY doesn't jibe with the way most people view information, because most people link it with meaning; when somebody tells you that a Dan Brown novel has less information than a single page of line noise, most people will say you're bullshitting). So the singularity could be defined in terms of the information content of the immediate future zooming toward infinity, since the predictability zooms toward zero. Now, amusingly enough, we consider the agents of the singularity to be intelligences -- which mostly function by taking information, ignoring bits of it, and organizing the remainder into patterns (and thereby causing there to be LESS information).
Quote from: Cramulus on January 05, 2010, 07:16:12 PM
I always thought the singularity referred to something new, some new type of organization or some unique locus of motion.
It does. And part of the deal is that you can't know what it will be like before it happens; it's practically in the definition of it, at least the way I understand it.
Whether you will know after, or during, I couldn't say.
But as LMNO says, it does bring up an interesting point: what if it kind of sucks? :) Well, TBH, when I first heard about the singularity I wasn't really expecting it to be pretty or anything. The first thought that popped into my mind was something like machines (technology) enslaving humanity, Terminator/Matrix style. But instead of hunting down humans, or turning them into batteries, especially in the latter scenario, the whole Matrix idea, it would make a whole lot more sense to just turn humans into ultimate couch-potatoes, if you somehow were to want to harvest their whatever-it-is.
I mean, it could be all pretty and everyone gets enlightened and shit, but it could also not. It could go either way. And the way we are currently going is not really pointing anywhere in favour of the pretty enlightened happy fun time singularity, you know?
Also, ENKI, what you are talking about is called information entropy, hence the confusion about your comparison between a novel and line noise. The word "information" itself has quite a number of meanings, so if you want to be specific about it, use the term coined by Mr. Claude Shannon (http://cm.bell-labs.com/cm/ms/what/shannonday/shannon1948.pdf).
Also, when being precise about those terms, I don't quite understand how the thing you say follows from information theory. Because, well, for starters, there is no such thing as infinite information entropy; it goes on a scale from zero to one, with zero being absolute predictability and one being complete randomness. And your statement about agents of the singularity is where you kind of lost me in speculation.
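For the curious, the novel-vs-line-noise comparison is easy to sketch. The sample strings below are invented stand-ins (no actual Dan Brown involved), and dividing by log2 of the alphabet size is what maps entropy onto the zero-to-one scale:

```python
import math
import random
from collections import Counter

def entropy_per_symbol(data):
    """Shannon entropy in bits per symbol: how unpredictable the
    next symbol is, given only the symbol frequencies."""
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def normalized_entropy(data):
    """Entropy divided by log2(alphabet size): 0 means absolute
    predictability, 1 means complete randomness."""
    alphabet = len(set(data))
    return entropy_per_symbol(data) / math.log2(alphabet) if alphabet > 1 else 0.0

# Invented stand-ins for the novel and the line noise.
prose = "it was the best of times, it was the worst of times " * 40
noise = bytes(random.randrange(256) for _ in range(2000))

print(f"prose: {entropy_per_symbol(prose):.2f} bits/symbol, "
      f"normalized {normalized_entropy(prose):.2f}")
print(f"noise: {entropy_per_symbol(noise):.2f} bits/symbol, "
      f"normalized {normalized_entropy(noise):.2f}")
```

The noise comes out far higher in bits per symbol, which is exactly the counterintuitive sense in which line noise "contains more information" than prose.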
Quote from: Triple Zero on January 05, 2010, 09:22:42 PM
Also, ENKI, what you are talking about is called information entropy, hence the confusion about your comparison between a novel and line noise. The word "information" itself has quite a number of meanings, so if you want to be specific about it, use the term coined by Mr. Claude Shannon (http://cm.bell-labs.com/cm/ms/what/shannonday/shannon1948.pdf).
Also, when being precise about those terms, I don't quite understand how the thing you say follows from information theory. Because, well, for starters, there is no such thing as infinite information entropy; it goes on a scale from zero to one, with zero being absolute predictability and one being complete randomness. And your statement about agents of the singularity is where you kind of lost me in speculation.
My bad -- I got my full knowledge of information theory from that bit in Prometheus Rising a few years ago. The zero-to-one scale, though, seems like it could be remapped to a scale that uses all real numbers (in which case the argument still works, kind of).
Let me rephrase it:
If the singularity is defined as the point at which one cannot determine what will happen next, for a very small 'next', then you can port that definition over to information theory and say that the singularity is a point in the series of events in time after which the level of information entropy is stuck at one, or very close to one.
Singularity = Technological Feedback Loop
Advance in technology A creates the right environment for technological advance B and C which are the final bits needed for technology D to just explode with new stuff... etc etc etc etc
Its not good or bad... it just is.
Though the aesthetic is likely to either be cool (because the designers watched too much sci-fi) or suck (because the designers are cranking out tech far more quickly than we can make it pretty). For example, wearable computers have been slowly progressing for 20 years or so, from head-mounted monitors to a tiny LED scanning images onto the retina. Take that line and shrink it down to three months because of major advances in home replication units (like the RepRap), access to information and new tech via the Internet (OSS, better drivers, scanners, etc.), and the new invention of direct cerebral connectivity, where we can wire direct to the brain instead of goggles and clicker keyboards.
And if you're using it like some trans-humanists... it includes something about the salvation of mankind as we become Post Human.
Wait.
It's just a feedback loop?
I wanted a fucking black hole that would suck the world into it while I laugh maniacally until the tidal forces pull me into an infinitely long leering face covered in long-delayed satisfaction.
And it's just some new agey term for increasing technical development?
FUUUUUUUUUUUUUUUUUUUUUUUCK!
:crankey:
Quote from: Enki v. 2.0 on January 05, 2010, 08:03:39 PM
For a moment, I feel like reframing into information theory.
Information content is defined as the inability to predict what comes next (in information theory, that is -- this REALLY doesn't jibe with the way most people view information, because most people link it with meaning; when somebody tells you that a Dan Brown novel has less information than a single page of line noise, most people will say you're bullshitting). So the singularity could be defined in terms of the information content of the immediate future zooming toward infinity, since the predictability zooms toward zero. Now, amusingly enough, we consider the agents of the singularity to be intelligences -- which mostly function by taking information, ignoring bits of it, and organizing the remainder into patterns (and thereby causing there to be LESS information).
I think you're thinking of the Technological Singularity as an increase in information, when it's really more about a feedback loop of self-improving intelligence. The idea being that when man creates an intelligence more sophisticated than his own, it will in turn be able to create ever more sophisticated intelligences at an exponential rate. Of course, there are widely varied interpretations of this, as there are of what "intelligence" even means, especially in popular culture, but I have yet to come across any academics or respected* technologists who portray the singularity as an increase in information. Increasing/decreasing the amount of "information" in the universe would be virtually the same thing as increasing/decreasing the amount of "energy". It cannot be done.
Quote from: The Good Reverend Roger on January 05, 2010, 10:04:03 PM
Wait.
It's just a feedback loop?
I wanted a fucking black hole that would suck the world into it while I laugh maniacally until the tidal forces pull me into an infinitely long leering face covered in long-delayed satisfaction.
And it's just some new agey term for increasing technical development?
FUUUUUUUUUUUUUUUUUUUUUUUCK!
:crankey:
:lulz: In popular culture, it's a new agey term, but in the academic world it has much more to do with a point in time where mathematical predictions are virtually impossible to make with any measurable degree of accuracy. There are some difficulties with terms like "intelligence" which still need fleshing out, but I feel that the more valuable academic dialogues regarding a Technological Singularity are primarily mathematical in nature. The farther out it diverges from numbers and computational modelling, the more these kinds of discussions seem to encompass bullshit philosophy and new agey speculation.
I blame transhumanists for making it impossible for me to even hear the word "singularity" without breaking out into laughter.
Quote from: Cain on January 05, 2010, 10:28:24 PM
I blame transhumanists for making it impossible for me to even hear the word "singularity" without breaking out into laughter.
I'm not to that stage yet. I'm still bitterly, bitterly disappointed and angry.
Quote from: Vaudeville Vigilante on January 05, 2010, 10:19:32 PM
Quote from: The Good Reverend Roger on January 05, 2010, 10:04:03 PM
Wait.
It's just a feedback loop?
I wanted a fucking black hole that would suck the world into it while I laugh maniacally until the tidal forces pull me into an infinitely long leering face covered in long-delayed satisfaction.
And it's just some new agey term for increasing technical development?
FUUUUUUUUUUUUUUUUUUUUUUUCK!
:crankey:
:lulz: In popular culture, it's a new agey term, but in the academic world it has much more to do with a point in time where mathematical predictions are virtually impossible to make with any measurable degree of accuracy. There are some difficulties with terms like "intelligence" which still need fleshing out, but I feel that the more valuable academic dialogues regarding a Technological Singularity are primarily mathematical in nature. The farther out it diverges from numbers and computational modelling, the more these kinds of discussions seem to encompass bullshit philosophy and new agey speculation.
Great.
New Agey Singularity Tards:
1. When does it happen?
2. What will it actually DO?
3. How many hours of screeching, hysterical newsfeed will it generate?
4. How long will we have to put up with this fad?
Quote from: Cain on January 05, 2010, 10:28:24 PM
I blame transhumanists for making it impossible for me to even hear the word "singularity" without breaking out into laughter.
I credit them for it. I mean, it's generally preferable to burst into laughter at a term rather than bust a nut or two in rage at one. You only have so many nuts. At least, until the singularity.
Quote from: Enki v. 2.0 on January 05, 2010, 10:34:33 PM
I mean, it's generally preferable to burst into laughter at a term rather than bust a nut or two in rage at one.
Speak for yourself, hippie.
Any situation wherein by hearing two words I can become a eunuch is a bad situation.
Quote from: Enki v. 2.0 on January 05, 2010, 10:38:51 PM
Any situation wherein by hearing two words I can become a eunuch is a bad situation.
Fortunately for me, my testicles replace themselves constantly...like a shark's teeth. I have 5 rows of the fuckers.
I find The Technological Singularity an interesting concept, but the premise that either technology or intelligence can be ramped up indefinitely is quite untested.
Quote from: The Good Reverend Roger on January 05, 2010, 10:40:19 PM
Quote from: Enki v. 2.0 on January 05, 2010, 10:38:51 PM
Any situation wherein by hearing two words I can become a eunuch is a bad situation.
Fortunately for me, my testicles replace themselves constantly...like a shark's teeth. I have 5 rows of the fuckers.
If only you could replace them faster each time, then you'd have The Testicological Singularity.
Quote from: The Good Reverend Roger on January 05, 2010, 10:40:19 PM
Quote from: Enki v. 2.0 on January 05, 2010, 10:38:51 PM
Any situation wherein by hearing two words I can become a eunuch is a bad situation.
Fortunately for me, my testicles replace themselves constantly...like a shark's teeth. I have 5 rows of the fuckers.
They call him the Ten Testicle Terror
Quote from: Cramulus on January 05, 2010, 11:29:41 PM
Quote from: The Good Reverend Roger on January 05, 2010, 10:40:19 PM
Quote from: Enki v. 2.0 on January 05, 2010, 10:38:51 PM
Any situation wherein by hearing two words I can become a eunuch is a bad situation.
Fortunately for me, my testicles replace themselves constantly...like a shark's teeth. I have 5 rows of the fuckers.
They call him the Ten Testicle Terror
Yeah, but when I get kicked in the balls, it takes me 4 days to get back up.
The Singularity isn't really a useful concept outside of historical pre-enactment societies, because it talks about a definite event.
I think about it as a trend. A singularity 'event' is possibly one where technology becomes more intelligent than humans, but the weighting and measuring of intelligence is problematic.
If you look at it as a trend, you see definite patterns taking shape. The internet is part of a larger information decentralization pattern. Open courseware too. We're moving away from fortresses of knowledge, towards evenly distributed backups and plenty of free information so that nothing likely to happen can destroy our centuries of learning and culture.
That's just one trend inside the series of threads I use to think about the singularity. It's not going to be like a mythic event, apocalypse, or theophany, it will be more like a lot of wild shit happening and technology being a little out of control.
Kernel Panic!
Attempted to kill init!
Init is dead!
Long live init!
Quote from: The Good Reverend Roger on January 05, 2010, 10:04:03 PMWait.
It's just a feedback loop?
I wanted a fucking black hole that would suck the world into it while I laugh maniacally until the tidal forces pull me into an infinitely long leering face covered in long-delayed satisfaction.
And it's just some new agey term for increasing technical development?
FUUUUUUUUUUUUUUUUUUUUUUUCK!
:crankey:
Different kind of singularity. You're talking about the one they expect to be in the middle of a black hole. Which will do the infinite suckage. Unfortunately, you will never get to see the singularity inside a black hole either, because the event horizon will always be in front of it, which is also called the Cosmic Censorship Hypothesis, thought up in '69 by... that other Roger.
Technological Singularity is basically the new age version of "Kids these days...": when they're not praising them and calling them indigo kids, the new agers suddenly realize they can't program their VCR and don't understand fuck-all about this complicated technology anymore. And this all has gotta fit in the grand cosmic scheme, so obviously it means that technology is accelerating and we'll all be poomped into enlightenment at an accelerated rate!
Technological singularity will end the planet by using it all up to build nth-generation machines.
The formula is simple: 1st-gen machines were built by humans; 2nd gen were built by humans, assisted by machines; 3rd gen built by machines, assisted by humans; 4th gen built entirely by machines; 5th gen built by the machines the 4th gen produced... and so on.
The singularity bit comes in because the rate of production of subsequent generations is considered to be faster and faster. The singularity can occur in one of two ways:
1) The rate of production becomes so fast that, eventually, generation 100075 actually manages to finish designing generation 100076 before it's been designed itself, causing a rift in the space-time continuum.
2) Generation 3567298 designs 3567299 which is so complex it uses up all the matter in the known universe to build it.
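The "faster and faster" part is just geometric-series arithmetic, and a toy model shows why the cascade piles up against a finite date. All the numbers below (the 10-year first generation, the 0.7 speed-up ratio) are invented for illustration:

```python
# Toy model of accelerating machine generations: each generation
# takes a fixed fraction r of its predecessor's design time.
# The 10-year start and r = 0.7 are invented, not from the thread.

def years_until_generation(n, first=10.0, r=0.7):
    """Total years elapsed when generation n is complete."""
    return sum(first * r**k for k in range(n))

def singularity_year(first=10.0, r=0.7):
    """Limit of the geometric series: no matter how many generations
    you stack up, they all finish before this many years have passed."""
    return first / (1.0 - r)

for n in (1, 5, 20, 100):
    print(f"generation {n:3d} done after {years_until_generation(n):6.2f} years")
print(f"every generation done before {singularity_year():.2f} years")
```

That finite limit is the "rift" date from option (1): past it, the model assigns infinitely many generations to a finite span of time, which is where the math (and presumably the space-time continuum) gives up.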
To be quite honest, I see the singularity as something more of a plateau, pragmatically speaking.
As pattern makers, we are assaulted by vast reams of information, and we pick and choose our way through it.
The singularity, as I see it, is a point where the technology and knowledge base becomes greater than our ability to understand it.
At that point, we will fall back to our null state, and no more advancements are made.
now that's a scary thought!
but don't you think people will just keep picking at the stuff they understand?
Or is it going to be that you need a lifetime of education just to roll the boulder back up the hill?
Quote from: LMNO on January 06, 2010, 02:38:47 PM
To be quite honest, I see the singularity as something more of a plateau, pragmatically speaking.
As pattern makers, we are assaulted by vast reams of information, and we pick and choose our way through it.
The singularity, as I see it, is a point where the technology and knowledge base becomes greater than our ability to understand it.
At that point, we will fall back to our null state, and no more advancements are made.
If that's the case then we've already reached it. Name one person on earth who understands everything. People choose fields and specialise. If those fields become too complex, then they are further subdivided - molecular biology, organic biochemistry, etc.
The knowledge base, even in my field of computing, is already greater than my ability to understand it, but I don't need to. Everything is documented in digital format. We are using machines to make up for the shortcomings of our brains, organising information in ever more accessible ways, developing ever more sophisticated algorithms for retrieving it.
There's a hell of a lot of things that get done on a daily basis that would be impossible without technology, including designing and building that technology itself.
To me the singularity occurs when humans completely lose touch with the design process. Then you have a real runaway train on your hands.
can we get back to Roger's testicles?
'cause that was where the thread was getting good...
Quote from: Iptuous on January 06, 2010, 03:09:47 PM
can we get back to Roger's testicles?
'cause that was where the thread was getting good...
Yeah, that was an interest sparker for me; the rest leans towards new age drivel...
I thought I'd logged on to the wrong board at first.
sombody WOMP ten nuts on him, stat!
Quote from: Triple Zero on January 06, 2010, 09:59:09 AM
Quote from: The Good Reverend Roger on January 05, 2010, 10:04:03 PMWait.
It's just a feedback loop?
I wanted a fucking black hole that would suck the world into it while I laugh maniacally until the tidal forces pull me into an infinitely long leering face covered in long-delayed satisfaction.
And it's just some new agey term for increasing technical development?
FUUUUUUUUUUUUUUUUUUUUUUUCK!
:crankey:
Different kind of singularity. You're talking about the one they expect to be in the middle of a black hole. Which will do the infinite suckage. Unfortunately, you will never get to see the singularity inside a black hole either, because the event horizon will always be in front of it, which is also called the Cosmic Censorship Hypothesis, thought up in '69 by... that other Roger.
Technological Singularity is basically the new age version of "Kids these days...": when they're not praising them and calling them indigo kids, the new agers suddenly realize they can't program their VCR and don't understand fuck-all about this complicated technology anymore. And this all has gotta fit in the grand cosmic scheme, so obviously it means that technology is accelerating and we'll all be poomped into enlightenment at an accelerated rate!
Funny. That other Roger is also the other Roger Penrose. Our names are identical. He's my 2nd cousin or some shit.
And yeah, I figured as much.
Concerning the above, there's nothing wrong or unlearnable about technology. It just requires additional specialization in the given field, and it's not like we have a shortage of people.
But what's funny is, fewer and fewer people enter the trades each generation.
Who's gonna fix that power station when it goes offline? Don't worry, the singularity will handle things. Honest.
TGRR,
Knows that we'll all starve to death in the cold and dark, because we will no longer know how to change a switchgear.
What's a switchgear?
\
:asshat:
Quote from: The Good Reverend Roger on January 06, 2010, 04:12:02 PM
Concerning the above, there's nothing wrong or unlearnable about technology. It just requires additional specialization in the given field, and it's not like we have a shortage of people.
But what's funny is, fewer and fewer people enter the trades each generation.
Who's gonna fix that power station when it goes offline? Don't worry, the singularity will handle things. Honest.
TGRR,
Knows that we'll all starve to death in the cold and dark, because we will no longer know how to change a switchgear.
It's an increasingly dangerous level of dependency. Gears and valves and flywheels are the base of a pyramid that ends in subroutines and dynamic classes. Eventually the pyramid will be too tall for the human race to manage.
... then lulz
Quote from: LMNO on January 06, 2010, 04:16:09 PM
What's a switchgear?
\
:asshat:
Some unimportant thing dealing with boring, prosaic stuff that isn't nearly as exciting as an undefined hippie "singularity"/2012 knockoff/new age religion.
Really. It has grease in it, and is thus beneath the notice of the Eloi.
TGRR,
Morlock.
Quote from: P3nT4gR4m on January 06, 2010, 04:17:40 PM
Quote from: The Good Reverend Roger on January 06, 2010, 04:12:02 PM
Concerning the above, there's nothing wrong or unlearnable about technology. It just requires additional specialization in the given field, and it's not like we have a shortage of people.
But what's funny is, fewer and fewer people enter the trades each generation.
Who's gonna fix that power station when it goes offline? Don't worry, the singularity will handle things. Honest.
TGRR,
Knows that we'll all starve to death in the cold and dark, because we will no longer know how to change a switchgear.
It's an increasingly dangerous level of dependency. Gears and valves and flywheels are the base of a pyramid that ends in subroutines and dynamic classes. Eventually the pyramid will be too tall for the human race to manage.
... then lulz
H.G. Wells was an optimist. In the end, the Morlocks all disappear, and the Eloi starve to death.
Who's left tending the great machines that keep everything running? Me, Curley, and a few dozen guys in their early 60s.
When I get my vacuuming robot Imma name it Morlock.