Feel like he'd read, like, the BIP and take away all the wrong messages. If he makes a single tweet about this, all of the cryptocurrency/Roko's basilisk guys would overwhelm us in seconds.
Just so you know, Elon bought principiadiscordia.com last year for 10 dollars.
To date we are his most successful investment, as he has only lost 30bn since acquiring us for those 10 dollars.
Quote from: Faust on January 07, 2024, 03:20:34 PM
Just so you know, Elon bought principiadiscordia.com last year for 10 dollars.
It was for sale?!!!?!?? I would have paid twice as much.
Quote from: mx krabs the bepronouned on January 07, 2024, 02:00:04 PM
Feel like he'd read, like, the BIP and take away all the wrong messages. If he makes a single tweet about this, all of the cryptocurrency/Roko's basilisk guys would overwhelm us in seconds.
He won't. He's too pig-brained to actually recognize any value in Discordian messaging. The one to worry about is Yudkowsky.
Quote from: altered on January 07, 2024, 11:08:58 PM
Quote from: mx krabs the bepronouned on January 07, 2024, 02:00:04 PM
Feel like he'd read, like, the BIP and take away all the wrong messages. If he makes a single tweet about this, all of the cryptocurrency/Roko's basilisk guys would overwhelm us in seconds.
He won't. He's too pig-brained to actually recognize any value in Discordian messaging. The one to worry about is Yudkowsky.
I vaguely recall that Yudkowsky mentioned Discordianism once? So it may already be too late. I was reading through some of the writings on lesswrong a while back. Google isn't helping me find the specific reference, though.
I find a lot of Yudkowsky's rationalist ideas to be interesting/useful, but his views on AGI are bonkers. His reaction to the Roko's Basilisk nonsense, at least, indicates to me that even if he can discuss the mental tools of rationality, he isn't very good at applying them consistently, and is highly susceptible to fear-motivated rationalization.
His rationalism stuff is polluted by believing that ANY humans are EVER capable of being rational actors. If anyone was, we would simply all live by their example, and oops, you made a cult; better hope your rational actor doesn't have underhanded motives that lead them to rationally form a cult to pursue them.
Addiction to rationality is caused by irrational motivations. If we were truly rational, we'd give up on rationality. As Roger was fond of saying, it's no way to run a human being.
What's useful about his rationalism stuff is specifically this: having a toolkit for when you want something else to blame if things go wrong, and the person in charge of yelling at you won't accept "God did it".
Quote from: altered on January 10, 2024, 11:51:36 PM
Addiction to rationality is caused by irrational motivations. If we were truly rational, we'd give up on rationality. As Roger was fond of saying, it's no way to run a human being.
Indeed. Logic is often a useful tool for achieving a specific goal, but it's entirely useless at figuring out what root goal to select. Axioms cannot be arrived at rationally.
It seems obvious now, but I was well into my twenties before I figured that out.
Yudkowsky still hasn't learned. Also, he's hilarious to watch when viewed through the lens of a "rational human being". He's a kook, and when you take him at his word, he even looks like one.
The real question is whether it would be more or less cringe than what Rev. Uncle BadTouch already does.