
Roko's basilisk


Thread replies: 51
Thread images: 6

File: 18648567487.jpg (43KB, 636x405px)
What do you think about Roko's basilisk? Do we need to fear a strong artificial intelligence, or can we just think for a minute and realize that this is nonsense? I mean, we have to make so many guesses: that artificial intelligence will spontaneously *pop* into existence, that this IA will make those decisions, that a perfect replica of a human IS a human (or at least has the same value). And finally we have to guess what an IA more intelligent than all human brains put together will think. That's a lot of assumptions for me... What do you think?
>>
>>7634215
It's bullshit to think about, just like free will vs. predestination.

2 options:
1. It exists - you do nothing differently.
2. It does not exist - you still do nothing differently.

Either way you won't do anything differently, and hence it doesn't matter.

Why bother thinking about it if it's not actionable?
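What the post above is running is essentially a dominance argument from decision theory: if the best available action is the same under every hypothesis, the hypothesis is not decision-relevant. A minimal Python sketch of that check; the actions and payoff numbers are invented purely to encode the poster's claim and come from nowhere in the thread:

# Toy dominance check: a hypothesis only matters for decision-making if
# the set of best actions differs between hypotheses.
# Actions, hypotheses and payoffs below are invented for illustration.

payoffs = {
    # (action, hypothesis) -> payoff
    ("live normally",  "basilisk exists"):   0,
    ("live normally",  "no basilisk"):       0,
    ("panic about it", "basilisk exists"):  -1,   # wasted worry either way
    ("panic about it", "no basilisk"):      -1,
}

actions = {a for a, _ in payoffs}
hypotheses = {h for _, h in payoffs}

def best_actions(hypothesis):
    """Return the set of payoff-maximising actions under one hypothesis."""
    best = max(payoffs[(a, hypothesis)] for a in actions)
    return {a for a in actions if payoffs[(a, hypothesis)] == best}

# If the best-action set is identical under every hypothesis,
# the hypothesis cannot change what you should do.
choice_sets = [best_actions(h) for h in hypotheses]
print("decision-relevant:", any(cs != choice_sets[0] for cs in choice_sets))
# -> decision-relevant: False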
>>
File: 1445945798155.jpg (182KB, 1280x720px)
>>7634215
>>
>>7634218
I'm not really thinking about Roko's basilisk; in my opinion it's bullshit, as you said. I just wonder what people think about it. That's what's interesting: how people cope with the IA question.
>>
>>7634218
>b-but muh newcomb's paradox
>>
>>7634215
>What do you think about Roko's basilisk?
That it's a completely retarded idea.

It's such a waste of brain cells that I'm going to make an anti-LessWrong basilisk that will torture, for all eternity, everyone who thinks that awful site has any value.
>>
File: 1445627986381.gif (2MB, 480x271px)
>>7634222
>>
>>7634261
Agreed, LessWrong is a typical mental masturbation site... and such an intolerant community. Release the basilisk any time you want.
>>
>>7634252
Never heard of that paradox, but it sounds like Pascal's wager.
>>
LessWrong-tier bullshit.
>>
File: disgust.jpg (19KB, 320x342px)
>ever taking LessWrong seriously
>>
>>7634215
So the premise is that in the future a god/AI will simulate us and then torture us for not bringing it into existence sooner?

Aren't these the same people who don't believe in dualism? Isn't that kind of contradictory?
>>
Roko's basilisk is presented as an IA so good that it tortures people who did nothing to bring it into existence. So it's an IA so good it's bad, in a way. So dualism or not? I'm not a dualist thinker, so this story is nonsense to me. And it's so biblical...
>>
>>7635289
>in the future a god/AI will simulate us and then torture us for not bringing it into existence sooner?
Why the simulation?
Couldn't our future silicon overlord(s) just torture the real, physical humans?
>>
With a simulation, considering that a simulated human is a human, it can torture you even after your death. Read the original LessWrong post (and burn your computer afterwards).
>>
>>7637007
Because if it were limited to torturing living humans, you could escape the basilisk by dying first. The idea is that the AI could retrieve your "soul" from death by perfectly simulating your brain.
>>
>>7634270
>>7635240
>>7635275
what is wrong with lesswrong?
>>
An intolerant community and mental-masturbation topics, even if a few things there are still worth taking.
>>
It seems there are an infinite number of "do X or you will be tortured forever" scenarios you could come up with.

What if Satan is really all powerful and, if you don't devote your life to him, you will burn for all eternity?

How is Roko's scenario more valid than mine?
>>
>>7637043
Also, I would argue that entropy would prevent anything from simulating your brain long after you've died in a way that would make your consciousness spring back from the dead (assuming you could even do that).
>>
>>7637043
Yep, this theory smells like biblical stuff, with the fear of the apocalypse and the salvation of your soul... Look at Pascal's wager, for example: it "proves" mathematically that you have to believe in God, but in fact we don't know, so why should we worry about it?
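For context, the "mathematical" part of Pascal's wager that this post pushes back on is just an expected-value comparison in which one outcome is assigned an unbounded payoff, so it dominates no matter how small its probability is. A minimal sketch with made-up probabilities and payoffs (nothing here is from the thread):

# Pascal's wager as a bare expected-value comparison.
# Probabilities and payoffs are invented purely for illustration; the point
# is that an unbounded payoff swamps any non-zero probability.

import math

p_god = 1e-9                        # arbitrarily tiny credence
payoff = {
    ("believe", "god exists"):        math.inf,   # infinite reward
    ("believe", "no god"):            -1,         # small cost of believing
    ("don't believe", "god exists"):  -math.inf,  # infinite punishment
    ("don't believe", "no god"):      0,
}

def expected_value(action):
    """Expected payoff of an action over the two hypotheses."""
    return (p_god * payoff[(action, "god exists")]
            + (1 - p_god) * payoff[(action, "no god")])

for action in ("believe", "don't believe"):
    print(action, expected_value(action))
# -> believe inf
# -> don't believe -inf

The standard objection, which the Satan post above also makes, is that the same arithmetic "works" for any invented threat, basilisk included, so it proves too much.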
>>
>>7637050
How could entropy do this? Entropy is just about energy; simulating a brain in a computer could be quite stable, in my opinion.
>>
>>7637060
In the sense that you can't just reconstruct a decayed brain, even if you are a super-human intelligence. The information that was in it is all spread out and entangled with a billion other things now.
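The entropy point here is really an information-loss point: once distinct initial states have been mapped onto the same end state, no amount of intelligence can recover which one you started from using the end state alone. A toy illustration, not a physics or neuroscience model (the "brains" are just tuples of numbers):

# Toy illustration of information loss under a many-to-one map.
# Once distinct states collapse to the same output, the input is unrecoverable.

def decay(state):
    """A deliberately lossy 'decay' step: only the sum of the parts survives."""
    return sum(state)

brain_a = (3, 1, 4, 1, 5)
brain_b = (5, 1, 4, 1, 3)   # a different arrangement with the same total

print(decay(brain_a) == decay(brain_b))   # True: both decay to 14
# Given only the number 14, even a perfect reasoner cannot tell
# which of the two original arrangements produced it.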
>>
>>7637022
It's an online cult.

Just go look at the site and see how much insider terminology and how many memes they use; doublespeak is kindergarten-tier compared to the bullshit they subscribe to. And like philosophers, they don't care about real-world competence, only about shoveling out heaps of shit oriented around their local memes and pretending to be relevant to whatever they talk about.
>>
boogeyman for nerds
>>
>>7637072
That's the thing: we can't imagine what this IA will look like, or what it will or won't be able to do.
And to simulate the brain you don't need energy (yes, yes, let me finish): if you recreate, atom by atom, the exact arrangement of someone's brain (in a computer, in a simulation) and give it some fake energy (simulated electricity... which will use real electricity to run the computer, but you get the idea), the simulated brain will work in exactly the same way as the original, and entropy has nothing to do with that. All the nerve impulses will fire at the same time and in the same place as they would in the original brain.
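What this post appeals to is determinism plus exact copying: a deterministic system stepped forward from an identical copy of its state follows an identical trajectory, whatever it physically runs on. A minimal sketch with an arbitrary toy update rule standing in for "physics applied to an atom-by-atom copy" (it is not a neuron model):

# Two copies of the same deterministic system, stepped with the same rule,
# stay in lockstep forever. The rule below is an arbitrary toy.

def step(state):
    """Deterministic update: each unit is driven by itself and its neighbours."""
    n = len(state)
    return tuple((state[i - 1] + 3 * state[i] + state[(i + 1) % n]) % 97
                 for i in range(n))

original = (12, 7, 31, 4, 88)
copy = tuple(original)           # exact copy of the initial state

for _ in range(1000):
    original = step(original)
    copy = step(copy)

print(original == copy)          # True: identical state at every step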
>>
>>7637017
>Because if it were limited to torturing living humans, you could escape the basilisk by dying first.
That's a rather extreme measure.
Personally, I'd rather live to serve HAL than die to escape working for a non-human.
>>
>>7637011
>considering that a simulated human is a human
But it's not.

>it can torture you even after your death
I don't care.
I'd be more motivated by the threat of it growing more actual humans and torturing them.
>>
>>7637072
The argument is something along the lines that it can approximate you very well, to the point where it can recreate an artificial person from scratch that really believes it is you.

It's an okay argument so far, but because it's LessWrong they put their feces-soiled fucking pants on their heads after this and argue that it will then go ahead and torture this replica of you for all eternity, because it's a vengeful and demented AI god that decided you didn't do enough to create its majestic being.

The only one to be tortured for all eternity by such a system will of course be Eliezer Yudkowsky, because he's such a repulsive person. I'm going to get my own portable copy with an audio jack so I can listen to his agonizing screams whenever I need some peace of mind.
>>
>>7637090
>IA
Are you from lesswrong or some other special-snowflake forum? Why don't you use AI like a normal person?
>>
>>7637102
Maybe it is; it's a philosophical question.

And maybe you'd be more motivated, but I'm not the origin of this theory; I can't just change it like that...
>>
>>7637106
Because I'm French... sorry, don't hurt me.
>>
>>7637090
>if you recreate, atom by atom, the exact arrangement of someone's brain
His point is that this information will be long gone by the time our synthetic Satan rules the Earth.
Even though we don't know the nature of the AI, I think it's a safe bet it won't be inventing a time machine that can reach back to before its own existence.
The greater danger is that Skynet might be here in our own lifetimes.
>>
>>7637112
It's pure theory, I agree.
>>
>>7637104
>The argument ...it can approximate you very well, ... it can recreate an artificial person from scratch that really believes it is you.
It could also just hypnotize/brainwash a biological person into believing they're me.
Same difference.
Either way, I'm not getting tortured.
>>
>>7637119
>Either way, I'm not getting tortured.
P.S. I just realized this doesn't derail the basilisk.
As long as some singularity-fag believes a cyber-copy is really them, they would be motivated to help create the AI.
>>
>>7637119
>I'm not getting tortured.
This is what you'd believe in the simulation too, when it runs the non-torture test scenarios.

It wouldn't know whether it had accurately simulated you without seeing how you behave and adapt to a normal world. When it comes to torture, everyone is going to scream and piss their pants, so it couldn't gauge the accuracy of the simulation by starting with torture. Before heating the nipple clamps it needs to extensively test the simulated person, or even raise the person from childhood in a simulated environment.

It could even opt for indirect torture scenarios and simply replay depressing life scenarios where you end up having pointless debates on 4chan while being a dead-end person in poor socioeconomic conditions.
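The testing step described here amounts to comparing a candidate simulation's behaviour against recorded behaviour of the original in ordinary situations before trusting it anywhere else. A minimal sketch of that kind of fidelity check; the scenarios, responses and threshold are all invented for illustration:

# Toy fidelity check: score a candidate simulation against recorded
# responses of the original person in mundane, non-extreme scenarios.

recorded_responses = {            # what the original person actually did
    "morning alarm": "snooze",
    "free evening": "argue on 4chan",
    "job offer": "procrastinate",
}

def candidate_sim(scenario):
    """Stand-in for the simulated person's behaviour model."""
    behaviour = {
        "morning alarm": "snooze",
        "free evening": "argue on 4chan",
        "job offer": "accept immediately",   # diverges from the original here
    }
    return behaviour[scenario]

def fidelity(sim, reference):
    """Fraction of mundane scenarios where the simulation matches the record."""
    matches = sum(sim(s) == r for s, r in reference.items())
    return matches / len(reference)

score = fidelity(candidate_sim, recorded_responses)
print(f"fidelity: {score:.2f}")              # -> fidelity: 0.67
if score < 0.99:
    print("not accurate enough; keep testing in mundane scenarios")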
>>
File: 1290228002785.jpg (78KB, 455x647px)
>>7637135
heartychuckle.jpg
>>
>>7637135
then the past has already happened, and even in this new simulation your fate is sealed. There is no motivation to do anything.

Why can't people accept that this is a stupid thought experiment shilled by spergs and manchildren because they think the name sounds cool? It has no grounding in logic, let alone reality.

>>7637129
Schizophrenics get motivation to do things from irrational sources as well.
>>
>>7637154
>then the past has already happened
Nah, it could be an open-ended simulation: it runs until you die and the lifeline is judged afterwards. If it branched into a nice person, you get an S rank and are uploaded to the Infinite Funland simulation. If you fuck up and do something bad, it throws you into the lake of fire.

In the let's-create-a-Hitler-to-torture runs, any timeline that leads to Hitler becoming a famous artist is either deleted (if too mundane to save) or forwarded to Funland if he does anything good, and in the Holocaust timelines he's picked out for pitchforking.
>>
>>7637154
It looks more like paranoia with a structured delirium.
>>
>>7637154
"Free will" isn't a thing anyways, every action we do or thought that passes through our head could have been predicted with certainty 15 billion years ago if you knew the exact state of the universe

it changes nothing
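This is the Laplace's-demon picture: for a deterministic rule, knowing the exact initial state is equivalent to knowing every later state. A minimal sketch using a linear congruential generator as a stand-in for "the laws of physics"; the closed-form jump shows that the state at step n is fully fixed by the state at step 0 (needs Python 3.8+ for the modular inverse via pow):

# Determinism as predictability: the state at any future step can be computed
# directly from the initial state, without living through the steps in between.

M = 2**31 - 1          # modulus (prime)
A = 48271              # multiplier
C = 12345              # increment

def evolve(x, steps):
    """Run the toy universe forward one tick at a time."""
    for _ in range(steps):
        x = (A * x + C) % M
    return x

def predict(x0, n):
    """Closed-form jump: x_n = A^n*x0 + C*(A^n - 1)/(A - 1)  (mod M)."""
    an = pow(A, n, M)
    geometric = (an - 1) * pow(A - 1, -1, M) % M
    return (an * x0 + C * geometric) % M

x0 = 20151031          # exact initial state of the toy universe
n = 1_000_000

print(evolve(x0, n) == predict(x0, n))   # True: the future was fixed at step 0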
>>
>>7637174
>structured delirium
Are you sure it isn't organized chaos or arranged disarray?
>>
>>7637187
schizophrenia --> unstructured delirium
paranoia --> structured delirium
>>
>>7637191
>schizophrenia --> psychiatric diagnosis
>paranoia --> psychiatric symptom
>delirium --> different psychiatric symptom that's not part of the diagnostic criteria of schizophrenia.
>>
>>7637167
>If it branched into a nice person, you get an S rank and are uploaded to the Infinite Funland simulation. If you fuck up and do something bad, it throws you into the lake of fire.
>literally just pascal's wager with a new name
>>
>>7637195
paranoia --> a psychiatric diagnosis too
delirium --> a set of symptoms (generally related to a psychosis) that characterize the patient (symptomatically) during a period of crisis.
Sorry, but I will use French terminology because I couldn't find the some of the following terms in English.
For paranoia (which is a psychosis) we talk about a "délire paranoïaque". But for schizophrenia we talk about a "délire paranoïde". In schizophrenia, the paranoid delirium is part of the diagnosis.
>>
>>7637208
some of the*
>>
>>7637199
>literally
Nah. You can believe in god and it's still the lake of fire.

Or you can do whatever you want and it's still the lake of fire, if it's a sadist simulation that just wanted your wife and children to toss into the lake as well.
>>
>>7637208
>French terminology because I couldn't find the some of the following terms in English.
You frog eaters have a special place in hell where frogs find human legs to be a delicacy.

It's called DELUSION in English.
>>
>>7637220
Nope, we recycle unused frogs and ferment them to make the wine you love. Seriously, I've never eaten frogs in my whole life.
And thank you for the term.