Something I rushed off

Started by Cain, May 30, 2012, 11:47:57 PM


Cain

Busy with other stuff.  This isn't a fully realised and carefully thought out essay; it's something I knocked off in the last half hour, in the hope of stimulating discussion more than anything else.



"He was intellect... He was war!  That is what they are!  Do you not see?  With every heartbeat they war against circumstance, with every breath they conquer!  They walk among us as we walk among dogs, and we yowl when they throw out scraps, we whine and whimper when they raise their hands..."
- Cnaiur urs Skiotha, The Thousandfold Thought, R. Scott Bakker

The difference between war and civil peace is only a relative lack of open violence in the latter, with the contention between forces possibly boiling over into new war or revolution, which may in turn result in a new settlement. Indeed, from the inversion of Clausewitz, Foucault concludes that "the final decision can only come from war, or in other words a trial by strength in which weapons are the final judges".
- Mark G.E. Kelly, The Political Philosophy of Michel Foucault

The whole secret lies in confusing the enemy, so that he cannot fathom our real intent.
- Sun Tzu, The Art of War

In the trailer to Tinker Tailor Soldier Spy, the viewer is asked "how do you find an enemy who is hidden right before your eyes?"  Not a bad question, for those whose work involves questions of trust and espionage.

A more relevant question, however, would have been "how do you fight an enemy you don't even know you have?"

As the 21st century was born, a deadly mix of global connectivity, reliance on highly complex systems and the emergence of non-traditional modes of warfare rendered the old, traditional methods of fighting obsolete in many circumstances, while opening the door to a new and entirely more insidious method of conflict.

The nature of the conflict is such that the first shots in this new kind of war may already have been fired, and yet we remain unaware of them.  This is the coming age of 5GW, or perhaps it could also be called "Invisible Warfare".  It will also be the last great transition in how humanity fights its wars.

Conflict can be seen, broadly speaking, as falling somewhere on a line, with total warfare (genocide) at the one end, and pure politics (non-violent, vocal conflict) at the other.  While it is hard to generalize about such a thing, as a rule the kinds of war we have experienced throughout history keep moving towards the political end of the conflict spectrum.  As such, warfare has become more discriminate and more undefined in terms of combatants and theatres of conflict. 

A full realization of this would be a war that hews as close to the political end of the spectrum as possible, manipulating political, religious and social contexts and themes to bring about an outcome preferable to the strategist in question.  Violence may be needed for elements of that, but ideally it should be kept to the minimum necessary to order the context on your terms.

The problem is, of course, if such manipulation were blatant, many would find it offensive and feel prepared to work against it.  Furthermore, it allows your enemy to formulate counter-strategies to defeat your attempts, or just target and destroy you outright.  Therefore, we come to the most important qualification of this new kind of warfare – that it must practice deception at the Grand Strategic level.  Your enemy can never know that they are your enemy in the first place.  Thus, invisible warfare. 

The importance of perception lies not only in how the 5GW campaign must be planned and executed; it is also the focus of how to defeat the enemy.  If you can convince an enemy they are not your enemy, and to act in a way that does not harm (or in fact benefits) you, then they are not an enemy, are they?

It also makes this new kind of war intensely intellectual.  Not only will the strategist(s) in charge of such a campaign be required to have a near encyclopaedic knowledge of the society they wish to subvert, they will have to take an approach somewhere between intelligent guerrilla warfare and intellectual Aikido: using the cognitive processes of their enemy against their own designs, attacking from unknown mental territory, using new cognitive scripts to obtain the element of surprise that is so vital to success.  Those who think fastest, have the most accurate mental maps and are the most creative... victory will likely be theirs.

And this is where the problem comes in.

Every mode of conflict contains within itself the contradictions that allow warfare to mature into a new generation – much like the process of Hegelian synthesis, contradictions arise in how that mode of warfare is handled, which are then resolved via a newer model of conflict.  For example, nuclear weapons made large-scale conventional warfare obsolete.  Therefore, proxy wars waged by guerrilla forces, special forces and the paramilitary arms of intelligence agencies came to replace large-scale warfare.  One can see this in Vietnam, or Afghanistan, or the current war on terror.

If 5GW is reliant on intelligence and secrecy as its key attribute, then there is at least one obvious problem here.  Or two, depending on how you want to look at it.  That is, what happens when you add self-modifying intelligences that can optimize themselves into such a mix?

I am, of course, talking about artificial intelligence.

At once, you can see the temptation, I am sure.  If a nation builds up a 5GW infrastructure, eventually it will want an AI, or something functionally similar, to help with computations, planning the strategy of the campaign, and analysis of which targets need to be eliminated and which can be spared – in short, how best to achieve the goals of the strategists.

However, an AI which is more intelligent than a human is fundamentally untrustworthy.  Its goals cannot be really known, or understood.  Given the speed at which it could work, and the modifications it could make to itself, it may be impossible to figure out where it is misleading you. 

The obvious solution is to not use an AI at all.  But that will then put nations who refuse to use them at a comparative disadvantage to those who do.  Once Pandora's box is opened, it cannot be closed again.  Once machines take over the vital war-making functions from human strategists, we are entering entirely unknowable territory.  Will AIs compete, or cooperate?  How will they view humans?  What would their goals actually be?

This Sixth Generation of Warfare will mark the end of the human era of conflict.  To be sure, if humans are still around, they will fight.  But will they do it because they have chosen to, or because they have been carefully manipulated into believing it is in their best interest by an intelligence far beyond them?

Of course, this may never come to pass.  AI technology may not even be theoretically possible, in the sense that most of us understand it.  But there is another path this can take, with similar outcomes.  And that is transhumanism.  I can understand the desire to transcend our biology, to optimize our physical selves through a variety of methods, to be the best possible sentient being that we can.

But the same problems apply.  If transhumans are created which are more intelligent than the average human, that self-secrete "smart drugs", that have far denser and larger brains than the average person, then they can also manipulate human strategists to their own ends.  And as with AIs, the temptation to create such transhumans for short term comparative advantage will overcome all objections to such a plan, regardless of how many bioethics treaties are signed.

Or transhumanism may also be fundamentally unfeasible: either on a theoretical level, or because our planet is becoming more and more resource-strapped, and the vast amount of effort this would require may not be worth the payoff, given the choices available at the time.

Regardless of scenario, it is fair to say we have now entered the last great age of human warfare.  It will be messy, it will be confusing, and you will not know it when you see it.

Salty

What terrifies me with this kind of warfare is that the view from my window will look the same no matter what. It seems to me that no matter who wins, the bulk of people lose, because they just roll along with the program because it's the right thing to do. The meaty, plastic cogs that keep this kind of war machine alive sitting there slack-jawed, staring at romantic comedies while others get chewed up... it'll all just get dialed up.
The world is a car and you're the crash test dummy.

AnarChloe

Quote from: Alty on May 31, 2012, 12:41:59 AM
What terrifies me with this kind of warfare is that the view from my window will look the same no matter what. It seems to me that no matter who wins, the bulk of people lose, because they just roll along with the program because it's the right thing to do. The meaty, plastic cogs that keep this kind of war machine alive sitting there slack-jawed, staring at romantic comedies while others get chewed up... it'll all just get dialed up.

^ This ^

With this kind of warfare, huge and major changes to policy could be made and no one would notice.

And that is terrifying.
Smooth Groove Panty Insert Design Specialist™

[redacted]

Hopefully an AI would be free of our instinct to be top dog / pack leader.

If we're their programmers then this would be some achievement though.

Mesozoic Mister Nigel

I really wish my brain was in adequate working order for me to formulate a quality reply right now.

But it's not, so I'm just posting to say, interesting topic, and I hope to have more to say about it when I'm not so exhausted.
"I'm guessing it was January 2007, a meeting in Bethesda, we got a bag of bees and just started smashing them on the desk," Charles Wick said. "It was very complicated."


NewSpag

I disagree.  I would postulate that we are well beyond 5GW, but the entity that won the war now controls the flow of information in your world (quite possibly an AI, if that means a non-human intelligence), leading you to suggest that we are entering into said war.  In this situation, by believing you are going to war, you have already lost the war.  Your "enemy" has you so thoroughly confused that you cannot see the "secret": that there is no enemy at all.
Quote: One day I realized life was pointless.  I've been celebrating ever since.
Quote: There's beauty in everything so let's destroy it all together.
Sometimes Always is Never.  For everything else there's Mastercard.

P3nT4gR4m

With regards transhumanism, I'm of the opinion that this has already happened but we don't really class it as that yet. Imagine a soldier with a gun, walking about, looking for something to shoot.  Now imagine his enemy - another soldier with a gun and a gps smartphone, hooked up to a video camera on a mini drone - which one is your money riding on?

Transhuman is generally accepted as humans with bits altered or bits added - smartphone doesn't quite count cos it's not growing out his spleen kinda thing but the tactical advantage is already in play, we're just waiting on the spleen-meld.

I see my ability to harness technology and use it to my advantage as something that makes me a vastly more powerful free agent in this emerging invisible theatre. The thing about 5GW is that the traditional players aint the only ones fucking up shit on the battlefield anymore. The deeper down the rabbit hole we go the more opportunity arises for the one man nation state to carve out a nice little slice of pie for himself.

I'm up to my arse in Brexit Numpties, but I want more.  Target-rich environments are the new sexy.
Not actually a meat product.
Ass-Kicking & Foot-Stomping Ancient Master of SHIT FUCK FUCK FUCK
Awful and Bent Behemothic Results of Last Night's Painful Squat.
High Altitude Haggis-Filled Sex Bucket From Beyond Time and Space.
Internet Monkey Person of Filthy and Immoral Pygmy-Porn Wart Contagion
Octomom Auxillary Heat Exchanger Repairman
walking the fine line line between genius and batshit fucking crazy

"computation is a pattern in the spacetime arrangement of particles, and it's not the particles but the pattern that really matters! Matter doesn't matter." -- Max Tegmark

Cain

Quote from: ExitApparatus on May 31, 2012, 02:02:44 AM
Hopefully an AI would be free of our instinct to be top dog / pack leader.

If we're their programmers then this would be some achievement though.

Hopefully, but not necessarily.  An AI would have to be able to rewrite itself, in order to stay flexible and update its models of reality.  A constrained AI would almost certainly lose a war against an unconstrained AI, so the question of being able to program it to act in that way is ultimately a moot point.

Because we're dealing with something orders of magnitude more intelligent than humans, trying to figure out what it may be doing, and what its goals are, would be like a dog trying to make sense of human actions.  Hence the very first quote.  Even with significant genetic modifications, a human or transhuman simply wouldn't be able to keep up with an AI, given Moore's Law (even taking into account the current slowing of that trend).
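The Moore's Law point can be made concrete with a toy calculation. The 18-month doubling period below is my own assumption (one common statement of the trend), not a figure from the thread:

```python
# Toy illustration of why a one-off biological upgrade can't keep pace
# with compounding hardware growth. Assumption: compute doubles every
# 18 months (one common reading of Moore's Law; pick your own period).

def compute_multiplier(years, doubling_period_years=1.5):
    """How many times more compute is available after `years`."""
    return 2 ** (years / doubling_period_years)

# A genetic enhancement is a fixed, one-time gain; hardware compounds.
# Over a decade the machines pull ahead by roughly two orders of magnitude.
print(round(compute_multiplier(10)))  # roughly 100x in ten years
```

Even a generous slowdown (say, a three-year doubling period) only changes the timescale, not the conclusion that a fixed biological gain eventually gets swamped.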

Quote from: NewSpag on May 31, 2012, 07:48:10 AM
I disagree.  I would postulate that we are well beyond 5GW, but the entity that won the war now controls the flow of information in your world (quite possibly an AI, if that means a non-human intelligence), leading you to suggest that we are entering into said war.  In this situation, by believing you are going to war, you have already lost the war.  Your "enemy" has you so thoroughly confused that you cannot see the "secret": that there is no enemy at all.

Uh, yeah, or maybe we're really brains in a vat, and everything we sense is an illusion.  Do you have an actual contribution, rooted in facts, to make to this debate, or do you want to just play clever word games so you can pride yourself on how "smart" you are?

Cain

Quote from: P3nT4gR4m on May 31, 2012, 10:48:17 AM
With regards transhumanism, I'm of the opinion that this has already happened but we don't really class it as that yet. Imagine a soldier with a gun, walking about, looking for something to shoot.  Now imagine his enemy - another soldier with a gun and a gps smartphone, hooked up to a video camera on a mini drone - which one is your money riding on?

Transhuman is generally accepted as humans with bits altered or bits added - smartphone doesn't quite count cos it's not growing out his spleen kinda thing but the tactical advantage is already in play, we're just waiting on the spleen-meld.

I see my ability to harness technology and use it to my advantage as something that makes me a vastly more powerful free agent in this emerging invisible theatre. The thing about 5GW is that the traditional players aint the only ones fucking up shit on the battlefield anymore. The deeper down the rabbit hole we go the more opportunity arises for the one man nation state to carve out a nice little slice of pie for himself.

Well, I consider transhumanism to specifically be the controlled genetic alteration of human DNA for optimized performance.

Everything else is just technology.  The same comparison could be made between a 14th century archer and a 19th century soldier armed with a rifle and relying on trains and semaphore, but I wouldn't exactly consider the latter to be transhuman.

The rest I agree with, however, though it will be significantly easier for factions within a government to practice deception on the grand strategic level, because they have access to superior resources.  Everything imposes a cost, in terms of time, efficiency etc.  Even tactical deception requires reliable intel, being able to move troops into position without them being spotted, having a point on the route where an ambush is feasible, coordinating the attack so all the individual elements work together, etc.  An actual government most likely would not be able to do that, but the permanent elements within the bureaucracy, military and intelligence services possibly could.

P3nT4gR4m

Quote from: Cain on May 31, 2012, 10:53:49 AM
Everything else is just technology.  The same comparison could be made between a 14th century archer and a 19th century soldier armed with a rifle and relying on trains and semaphore, but I wouldn't exactly consider the latter to be transhuman.

Totally, no argument there. Personally I include cybernetics under the transhuman umbrella. That's why I'm looking at current microprocessor tech and thinking that, even before I can get it as an implant, I'm getting most of that advantage already.

Communications is going to be one of the core weapons in the new warfare. The ability to eavesdrop on theirs whilst hiding or obfuscating your own is going to be more crucial to the outcome than it was back in the enigma days, given the speed that things can shift and react now. Reading between the disinformation lines will be a big part of the key to success.

One thing that interests me is the emerging potential for a loose network of mercenaries to set up shop. Something along the lines of Anon or the chinese hackers but with a much less idealistic motivation. An organisation like that could quickly become a pretty formidable player in the new game.



LMNO

I don't have much to say about AI or TransHumanism that hasn't either been said before or is retarded.  But before the essay went there, I was struck by this:

Quote from: Cain
The importance of perception not only exists in how the 5GW must be planned and executed, but it is also the focus on how to defeat the enemy.  If you can convince an enemy they are not your enemy, and to act in a way that does not harm (or in fact benefits) you, then they are not an enemy, are they?

The idea of silent, total manipulation of a perceived threat or enemy who has no idea a manipulation is taking place sounds both like a PKD novel and like what has been slowly happening to this country for a long time now; only the enemy is us.  Say what you want about "memetics", but the marketing concept that ideas can be introduced that slip through mental filters has been becoming more refined with every passing year, and with every new psychological/sociological experiment that gets published.

I know that Cain was most likely talking about the manipulation of a state or government by another, but an even less obvious ploy is to unobtrusively act against the citizenry, especially in a democracy/republic, so that the people would be the movers, electing the kind of government that the enemy country wants to see. 

Another question that quote brings to mind is, if two states are acting like friends, and behaving like friends, isn't that functionally equivalent to "peace"?

NewSpag

Quote from: Cain on May 31, 2012, 10:49:13 AM
Uh, yeah, or maybe we're really brains in a vat, and everything we sense is an illusion.  Do you have an actual contribution, rooted in facts, to make to this debate, or do you want to just play clever word games so you can pride yourself on how "smart" you are?
Yeesh, I'm still pretty new here, so I don't know what constitutes a "fact" in this debate.  Hell, considering how many layers of abstraction I have to go through to even access this forum, I'm surprised I can actually convince myself that you are a human being.  Point being, until you tell me where "facts" come from, the brains-in-a-vat/illusion theory is pretty much my default fallback.

Try #2:
Quote from: LMNO, PhD (life continues) on May 31, 2012, 02:13:38 PM

The idea of silent, total manipulation of a perceived threat or enemy who has no idea a manipulation is taking place sounds both like a PKD novel and like what has been slowly happening to this country for a long time now; only the enemy is us.  Say what you want about "memetics", but the marketing concept that ideas can be introduced that slip through mental filters has been becoming more refined with every passing year, and with every new psychological/sociological experiment that gets published.


While surfing my way through the interweb the other day I stumbled across this site claiming to be a research collective on infrasonic warfare.  According to Wikipedia, infrasonic waves are outside humans' normal hearing range and can cause people to "feel vaguely that supernatural events are taking place".  Sounds almost crazy enough to be something that Cain would accept as a fact, or a project that started here.

The Johnny

Quote from: Cain on May 30, 2012, 11:47:57 PM

Conflict can be seen, broadly speaking, as falling somewhere on a line, with total warfare (genocide) at the one end, and pure politics (non-violent, vocal conflict) at the other.  While it is hard to generalize about such a thing, as a rule the kinds of war we have experienced throughout history keep moving towards the political end of the conflict spectrum.  As such, warfare has become more discriminate and more undefined in terms of combatants and theatres of conflict. 

A full realization of this would be a war that hews as close to the political end of the spectrum as possible, manipulating political, religious and social contexts and themes to bring about an outcome preferable to the strategist in question.  Violence may be needed for elements of that, but ideally it should be kept to the minimum necessary to order the context on your terms.

I don't want to come across as a snob; this is merely a detail that could broaden the discussion (and it's the only real thing I can add to it):

There's symbolic violence, or non-physical violence, which resides within the political interactions between people or groups.

A certain society might be structured in a way in which the distribution of wealth is very unequal; that is, in an indirect manner, a kind of violence. Discrimination is a kind of symbolic violence too, or sexism, etc.

Violence can be direct physical harm, but it can also be an injustice.

My point is that violence isn't reduced from one trend of managing conflict to another, it's merely transformed or expressed in a different manner... why kill someone when you can make them do what you want? The bottom line is the annulment of the other person's projects/agenda/desires and the promotion of one's own.
<<My image in some places, is of a monster of some kind who wants to pull a string and manipulate people. Nothing could be further from the truth. People are manipulated; I just want them to be manipulated more effectively.>>

-B.F. Skinner

Triple Zero

(written yesterday to follow LMNO's reply, but then the forum started crashing--I'm blaming Wintermute or Helios)

I was sort of wondering a similar thing to LMNO's last thought.  It may be a bit blasphemous in a "freedom or kill me" sense, but as these generations of warfare advance, they also seem to get less violent, with fewer overall horrors (like Goya's horrors of war; we still have plenty of new and improved horrors), taken on the whole?  Or do those horrors just move elsewhere, as they are currently occurring in Africa and parts of the Middle East.

I'm not making a very clear point I suppose but what I was thinking about, isn't this progress, in some sense?

BTW Cain I really enjoyed your essay, and I hadn't at all expected the sudden turn to AI and transhumanism :)

I have read several articles on the musings about hyperintelligent AIs on LessWrong.com, which I assume is where you also read about those theories. I'm really not sure what to make of those. But I guess that's exactly what the Technological Singularity has always meant, by definition: It's impossible to reason about what comes after, until you actually get there, and by then it's too late to do anything about it.

But I wouldn't even want to guess when we get something like that.

The closest thing, that I've heard about and is used for actual strategic decisions, is the gargantuan data-centre that is (being?) built in Utah. Based on the Total Information Awareness program, they're receiving more data than any human could conceivably process. Something on the order of the amount of data that is sent through the Internet daily (or four times that? it was in the article I linked in Privacy Thread).

Now I have some background in Machine Learning, and recent developments seem to point out that, if you got insane amounts of data, you merely need the processing power to deal with all of that, and ML algorithms get really really good. There was an article, or maybe a TED talk called "the unreasonable effectiveness of big data"--or something like that, it's been a while since I came across it.

So, basically you just need to have really fast and powerful computers, and then you need to mass-produce an insane fuckton of them and bury them underground in Utah.

Now Machine Learning isn't the same thing as AI. I'm writing it capitalized because I'm referring to the specific definition of the term as used in Computational Science: a modelling, decision, classification or prediction algorithm that gets better at doing its task on unseen data, the more seen (training) data you provide it with. It's basically extra fancy types of statistics and regression formulas with a bunch of Bayes thrown in. But they work very well.

So can we build really smart computers to analyze really complex things?  Yeah.  But one thing I'm not at all sure we're close to cracking, or even know how to begin building, is the "self-modifying/self-improving" part of the deal.  Which is a rather key element in Yudkowsky's reasoning on these matters.  But he just takes it for granted.  He justifies this decision in one of his essays, btw, and that's okay; I really have to commend him for trying to take on that Singularity problem in a serious manner, and actually doing a pretty damn good job at trying to predict what comes after an event that changes everything.

See it's one thing for a ML algorithm to optimize and modify its own parameters to optimally fit, model and predict incoming data, according to the algorithm. But it's a whole other ballgame for a computer to actually modify the algorithm to do something different, free-form. Or to improve the algorithm into something that works and optimizes significantly better than before. In fact that last one would probably be enough to set off the chain reaction into AI so intelligent it's impossible to know what its goals are or what it will do, aka Singularity.

But it might be quite a while before we can build such a thing, IF it's even possible. Example, right now we got pretty "smart" cars, monitoring tire pressure, detecting obstacles, capable of self-diagnosing problems, etc. They can internally dial all sorts of knobs to make the car engine run as smoothly as possible, and that's amazing. But they can't increase the size of their gas tank, modify the physical engine, or grow more wheels. They can only tweak parameters, but not change the actual system. And that's where current day AI and ML is at. Sure, they got a LOT of parameters, making the system tweakable to great extent, but only within those parameters. They can't think outside that box. Even though, if you design the system very cleverly, that box might be a lot bigger than a human would assume at first glance of the parameters given, and the machine will find it, it's still a box.
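The "box" point above can be sketched in a few lines of Python (a toy illustration of my own, not anything from the thread): a learner that tunes the parameters of a fixed model form by gradient descent can find any line, but it can never rewrite itself into a different kind of model.

```python
# Toy sketch of parameter-tweaking vs. self-modification: this learner
# adjusts w and b of a FIXED model form (y = w*x + b) by gradient
# descent on mean squared error. It can reach any point in (w, b)
# space, but the linear form itself is the box it cannot leave.

def fit_line(xs, ys, lr=0.01, steps=5000):
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(steps):
        # Gradients of mean squared error with respect to w and b.
        dw = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
        db = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
        w -= lr * dw
        b -= lr * db
    return w, b

# Data generated by y = 3x + 1; the learner recovers w ≈ 3, b ≈ 1,
# but only because the truth happens to lie inside its box.
w, b = fit_line([0, 1, 2, 3, 4], [1, 4, 7, 10, 13])
print(round(w, 2), round(b, 2))
```

A self-improving system in Yudkowsky's sense would have to change `fit_line` itself, not just `w` and `b`, and that is the step nobody currently knows how to build.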

I was going to comment on the other, transhuman, part of your essay, but this post is getting long enough as it is.
Ex-Soviet Bloc Sexual Attack Swede of Tomorrow™
e-prime disclaimer: let it seem fairly unclear I understand the apparent subjectivity of the above statements. maybe.

INFORMATION SO POWERFUL, YOU ACTUALLY NEED LESS.

Prince Glittersnatch III

Quote from: Cain on May 30, 2012, 11:47:57 PM
The obvious solution is to not use an AI at all.  But that will then put nations who refuse to use them at a comparative disadvantage to those who do.  Once Pandora's box is opened, it cannot be closed again.  Once machines take over the vital war-making functions from human strategists, we are entering what is entirely unknowable territory.  Will AIs compete, or cooperate?  How will they view humans?  What would their goals actually be?

At their core, all AIs will probably have some sort of command telling them to obey orders, or to act in their owner's best interest.  I can't really see any advantage to allowing them to modify that.  It would probably be pretty vague as well, because giving such an intelligence specific orders would be pointless micro-management.  What would be interesting is if they were allowed to lie to their owners, or if their obligation to act in their owner's best interest overrode their programming against lying.  People are stupid; that should be double obvious to a higher intelligence.  What if the AIs lie to their owners for their own good?


http://www.facebook.com/profile.php?=743264506 <---worst human being to ever live.

http://www.jesus-is-savior.com/False%20Religions/Other%20Pagan%20Mumbo-Jumbo/discordianism.htm <----Learn the truth behind Discordianism

Quote from: Aleister Growly on September 04, 2010, 04:08:37 AM
Glittersnatch would be a rather unfortunate condition, if a halfway decent troll name.

Quote from: GIGGLES on June 16, 2011, 10:24:05 PM
AORTAL SEX MADES MY DICK HARD AS FUCK!