
LessWrong, MIRI cult


Thread replies: 9
Thread images: 1

File: Eliezer Yudkowsky.jpg (18KB, 280x280px)
So are they cults or what?

>Give us millions of dollars or AI will kill everyone.

Seems like a death cult.
>>
>>8581521
>Give us millions of dollars or malaria will kill many hundreds of thousands of Africans
Does that sound like a cult to you? How is this different?
>>
>>8581521
>cult
Shallow pattern recognition

Shit tier thinking.

Read his books and his papers. He has thought and worked hard on important issues.
>>
>>8581521
>>8581553

See http://yudkowsky.net/obsolete/plan.html#vision_memes, where he lays out his vision of the Singularity as essentially a global coup, bypassing politicians and reporters by recruiting a crop of super-rich "believer" CEOs.

>We need to worry about the reactions of CEOs, Greenpeace, politicians, TV reporters, teens, journalists, televangelists, honest religious fundamentalists, the middle class, truck drivers who've lost their jobs, "disadvantaged youth" and the "urban poor".

>We need ... CEOs who don't object to using AI and are even attracted by the sparkle, supercomputing vendors who either believe or turn a blind eye when the time comes to run the Last Program, no interference from politicians, no fad television programs about the Singularity, and citadels of technophobia worrying about something else.

>We should emotionally accept the possibility of government interference, and be prepared to move against attempts to regulate the development of AI, or evade those regulations if they are successful.

>With O($100M) to O($1G), or more ... We could engage in large-scale evangelism. We could "meddle" in things like independent patent agencies, government hearings on biotechnology, and so on.

Literally has been planning a planet-wide coup for decades. Not a cult how?
>>
>>8581579
Some juicy Machiavellianism a little further in:

>The primary ethic of writing first-step documents, IMO, is what I call "Invariance under the whole story." Sometimes, due to constraints of space, or the desire to avoid frightening off the reader, one must leave out some parts of the story. The cognitive structures that remain - the logic and emotion - must remain invariant under the whole story. The content, the matters of scale, the concrete visualization can change, but not the structure. When saving the world, the difference between a group of benevolent but mortal-scale transhumans working for the common good, and AI-born Powers rewriting the Solar System on the molecular level, is simply a matter of how much future shock the reader is exposed to.

>Ideally, one should remain vague about what form transhuman aid would take, at least until the second-step document. (Unless, of course, you yourself believe in a concrete scenario weak enough not to shock your readers.) Thus, each first-step reader will visualize whichever outcome they can imagine, at their own level of future shock. If the author's second-step visualization is more powerful, the reader will count the first-step mental image as a bad visualization, rather than as a deception - so long as the basic ideas remain invariant under the whole story.

That's him talking about the need for Scientology-style multi-layered recruiting material.

>If someone emotionally attached to the concept of Apotheosis becomes more capable of emotionally accepting the necessity of the risks involved, then this can be viewed either as leading someone down the garden path, or as the correct functioning of the built-in cost-benefit analysis intuitions, depending on what you think is the actual correct answer.

All for the alleged greater good, of course.
>>
>>8581600
And of course the cult needs independent colony "survival stations" in the event of nanotech war, so that humans can survive, but only because the glorious vision can't come true without them:

>Undoubtedly the anti-disaster groups, including ourselves, will do everything possible to preserve the six billion people presently living on this planet. But our first priority must be to preserve the existence of the human species. The survival of individuals, including ourselves (275), must be secondary. (Not that the goals are likely to conflict directly; I'm talking about the allocation of project resources.) If intelligence survives in the Solar System, there will be a Singularity, sooner or later. Given enough time, someone will code an AI. We just have to ensure that survival stations, capable of (A) sustaining life indefinitely and (B) reproducing into an acceptably-sized culture, (C) come into existence before military nanotechnology (279) and (D) are out of the line of fire (280).
>>
>>8581610
From his Facebook, recently.
>Mostly, our response [to President Trump] should be to have advance plans to leave the United States to carry on with our other work, if that becomes necessary; and to give up faith in governance and large groups of people being sensible, if that was a faith you still had.

"Just be cool and prepare to flee society and the government to carry out the plan". But no, not a cult, seriously just a legit charity without a creepy totalizing ideology.
>>
>>8581579
What part of "obsolete" do you not understand? This is dated 1999, and disclaimed by the author more than fifteen years ago.

>Not a cult how?
How many cults do you know that leave their old misguided and confessed-as-stupid ideas around for you to criticize? That sounds like the opposite of cult-like behavior to me. That sounds like someone going out of his way to allow you to judge him honestly, even against his best interests.
>>
>>8581617

He started the organization he describes in this plan in 2000 and has been building it up ever since; it's called MIRI now. The only change in his style of thought is more emphasis on the risks of AI and less on nanorobots. See the Facebook quote: he's still an egomaniac with delusions of being a world savior.