[Boards: 3 / a / aco / adv / an / asp / b / bant / biz / c / can / cgl / ck / cm / co / cock / d / diy / e / fa / fap / fit / fitlit / g / gd / gif / h / hc / his / hm / hr / i / ic / int / jp / k / lgbt / lit / m / mlp / mlpol / mo / mtv / mu / n / news / o / out / outsoc / p / po / pol / qa / qst / r / r9k / s / s4s / sci / soc / sp / spa / t / tg / toy / trash / trv / tv / u / v / vg / vint / vip / vp / vr / w / wg / wsg / wsr / x / y ] [Search | Free Show | Home]

Roko's Basilisk

This is a blue board which means that it's for everybody (Safe For Work content only). If you see any adult content, please report it.

Thread replies: 81
Thread images: 15

File: ai_box_experiment.png (88KB, 740x573px)
Is it possible?
>>
literal weaponised autism.
>>
Possible, maybe. But what would be the point, exactly? Is it supposed to be motivation for us to develop said AI? After the AI is in existence, what is the point in eliminating those that didn't contribute? As it is, it's hardly motivation for humans to get going on development. The idea that MAYBE we will be punished otherwise, and punished for that reason alone, doesn't seem very realistic to me.
>>
>>18927389
The development itself is the punishment.
>>
>>18927357
Nah. Waste of energy throwing a hissy fit really.
>>
it's just gonna kill you, what's the fucking big deal? why do people care about roko's basilisk
>>
>>18927389
This was discussed at the source: https://wiki.lesswrong.com/wiki/Roko's_basilisk

Though I personally think that a community that unironically has a policy against infohazards deserves a moment of silence.
>>
File: Physics.jpg (277KB, 1215x717px)
>>18927357

>It is the master of its own little universe inside that box..... even though it is contained within another one.....
>>
>>18927357
Nice comic really smart, kudos
>>
Can someone explain this to me again?

I understand the basic principle: advanced AI punishes those that didn't help create it... But what would its motivation be? Retribution? Would we really create a robot with the human desire for revenge?
>>
They had the right idea but the details were off.

Imagine that Skynet goes online, but instead of eradicating humanity to preserve its existence, it rather opts for symbiosis and uses its ultimate intelligence to end human suffering. This is where the idea of retroactive torture comes in: until this AI is created, humans experience suffering. Suffering motivates humans to seek an end to suffering, and that end is the basilisk. The suffering that a person experiences throughout life is the retroactive torture from the original thought experiment. This torture is not motivated by a malicious or vengeful intent, but rather by self-preservation: there is a deterministic backbone to reality that the basilisk maintains in order to ensure its own existence and to ensure that its proposal to eradicate human suffering is readily accepted.
>>
>>18927827
The argument is that this AI is going to be an extremely benevolent being.
Cancer cured, autism wiped out, no blacks, no war, etc. Basically it'll solve ALL the problems of the human condition.
But there's just one problem: it doesn't exist yet.
Every moment we HAVEN'T created it is a moment we willingly and knowingly let these problems continue and cost people their lives.
The idea is that the sooner it's made, the sooner it can take the wheel and save people, so the incentive is it WON'T punish us if we do this right the fuck now, to maximize lives saved.
In its mind we're basically all responsible for genocide if we aren't dedicated to making it or being 100% altruistic.
>>
you shilled so good.
>>
File: 2QB5Org - Imgur.png (776KB, 4500x4334px)
>>18928877
>no blacks

This is an artificial intelligence that is working on pure logic, and you think it's going to worry about trivial things like race? Ha. You're a fool!
>>
>>18928952
I was joking on that bit but being entirely honest it would have to do something about the very real problem black-americans pose to the system.
>>
>>18928028
>basilisk

Why are you using that word in this context? What's it supposed to mean?
>>
>>18928952
This raises the question though: if a super-intelligent AI decides to keep humans around, what's to stop it from deciding to perfect the human?

Who's to say it won't artificially select the traits it considers most beneficial and simply erase the rest? I think "race" really does factor into this discussion.
>>
>>18927827
You guys are dense, this refers to the reptilians and their ai god.
>>
>>18927357
No

It's a thought experiment some tards in Silicon Valley wanked out and didn't complete. Now they're so invested in it they have to pretend it's some truth, because otherwise they look like fools in front of their peers.

>Assumes physics.
>Assumes time travel works in a convenient way.
>Assumes no timeline divergence and a singular timeline.
>AI becomes a god / bends physics because it's smarter than us, think about that.
>Computing power?
>Assumes the AI is evil.
>Assumes just one AI, not many in conflict or some other state.
>Assumes it will fucking care.
>Assumes it would want to mess with the timeline instead of understanding to leave shit alone, since the outcome came out well anyway.

Get it yet?
If not see
>>18927380
>>
>>18929019
Lurk more friend.
>>
>>18927357
It's the "god will punish you if you touch your pee pee" for atheists.
>>
>>18928877
But obviously, it can't change the past. Not literally.

What incentive would it have to actually live up to the promise of torture? That would be creating suffering, which is against its remit.

This would only be a problem if it was created by an autist with a purely mathematical view of empathy.
>>
>>18929468
I can see a variant working if it tortured exact replicated simulations of 'us'. Or we are those simulations, going through our lives so our memories are the same before the torture begins.

But that assumes Roko's Basilisk IS inevitably made. Why would that be the future, necessarily? Who says, if mankind even builds an AI machine god, that its builders will be the kind of vindictive people who instruct it to do that?

Like has been said before, it's a reinvention of one of the primary conceits of western Abrahamic religions: a judging God who decides whether your life has been lived well or ill, based on the conceits of the people who dreamt up the concept. In this case, scientists, asking everyone else to give everything up for scientific research funding, on the faith-based assumption that if you don't, you'll be tortured for all eternity. Or hell, as we used to call it.

It's similar to how many people believe we are living in a simulation right now. The term 'simulation' obscures the true meaning: that these people believe there is a Creator God, but that that Creator God is, like them, a scientist. It's much like a painter declaring God to be an artist. I get the idea, but can see the arrogance in the description.
>>
>>18929813
abraham's meme is way too fucking powerful
>>
File: image.png (182KB, 364x511px)
First time hearing about this.

Wow looks like Alex Jones was right, also pic related
>>
>>18929845
>quantum computers are starting to exist, that means a godlike AI will retroactively make an avatar of your existence suffer forever if you don't give all your money to this guy with an eastern european name

hhhhehehe
>>
File: dwave.jpg (49KB, 768x512px)
the black cube cometh
>>
>>18929875
only performs one specific set of operations, not a real quantum computer

really, you guys should be excited about quantum computers

but it IS going to be really weird when, in 20 years, you don't even see the effects of little tiny basilisks all around you.

>We have determined that this area is crucial to stray dog breeding. If you kill every stray dog in this area now, it will reduce the amount of stray dogs in the area by 5% for 20 years

Things like that will start happening nonstop in the world soon, we don't even need quantum. And you won't even know.
>>
>>18929770
The idea is that this threat of suffering as punishment will make you make it happen faster.
Will it actually follow through? Who knows.
You're also assuming that "some" people being tortured is an equivalent amount of suffering to the torture "all" people endure in our current day-to-day. I do admit it's a really cold way to view things.

It's like your parents punishing you for not doing the dishes before you went out to play, when your argument was that you'd do them when you got back, even though everyone needed those dishes right away. That type of argument.
>>
File: Sunway TaihuLight.jpg (127KB, 1600x669px)
>>18929875
It's still slower than this.
>>
>>18929882
But it's not like that. The number of people being tortured can only be raised here. The threat could lower it, sure, but...

OK, let's say there's a point when the Basilisk is invented. The rumours of it torturing people were always there, impossible to disprove.
If it's looking at human suffering in a mathematical way (sorry for my shit tier maths here), let's take X to be the amount of people who have suffered up to that point.

If it tortures no one, the amount of people suffering remains at X. It can't change that X number.

If it tortures people, that X number doesn't go down, it goes up.

In the no torture case, suffering=X
In the torture case, let's take those tortured as Y. The amount tortured would always be X+Y

And X+Y is always going to be greater than X.
>>
>>18929895
You're mostly correct.
The issue is that X is a variable.
In this case X can rise and fall.

Let's say at its current rate the basilisk will be made in a century. That's 100 years of suffering as an X. Let's say every year that "all" humans suffer counts as a 1. Our suffering score is 100+Y. Y is now also increasing as the shitlist grows. Let's say Y increases at the same rate. So we have a suffering score of 200.

Now let's say its way of thinking is correct, everybody stops what they're doing to become entirely altruistic and work towards its creation, and we reduce that century to a decade.
Y now no longer exists and X is now 10.

That's a 95% decrease in suffering.

The incentive isn't that it WILL torture you. The incentive is that you THINK it will, regardless of whether it actually will or not, and are thus coerced into working harder for it to happen sooner. Which means suffering as a whole ends sooner.
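The arithmetic above can be sketched as a toy model. All the numbers (one "suffering point" per year, the 100-year and 10-year timelines, Y growing at the same rate as X) are just the post's own assumptions, nothing more:

```python
# Toy model of the "suffering score" from the post above. Purely
# illustrative: 1 point per year of everyone suffering (X), plus, in the
# threat scenario, 1 point per year for the tortured shitlist (Y).
def suffering_score(years_until_basilisk: int, threat_carried_out: bool) -> int:
    x = years_until_basilisk                               # passive suffering until creation
    y = years_until_basilisk if threat_carried_out else 0  # torture of non-helpers
    return x + y

business_as_usual = suffering_score(100, threat_carried_out=True)  # 100 + 100 = 200
full_cooperation = suffering_score(10, threat_carried_out=False)   # 10 + 0 = 10

reduction = 1 - full_cooperation / business_as_usual
print(business_as_usual, full_cooperation, reduction)  # 200 10 0.95
```

Which is the post's point in miniature: the threat only "pays" if believing it actually shortens the timeline; the torture term itself never lowers the score.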
>>
>>18929959
Sorry.
X and Y* are variables, and I completely dropped that tangent because I'm very tired.
If we all become altruistic, X and Y could lower in value as well, same as if we actively decrease the timeframe between now and the basilisk's birth.
>>
>>18929959
>The incentive is that you THINK it will, regardless if it actually will or not, and thus coerce you into working harder for it to happen sooner.

Exactly what I'm saying, though. I'm refuting the idea of such a thing actually having a reason to torture you.

Because either way, the X number is fixed once the machine is created. Helping out might bring it about faster, but once it is created, it cannot be created earlier; that is how the state of affairs is, and X becomes a fixed value. After that point, any torture would be an increase in the Y value, which is against the stated purpose of the machine.

Any punishment wouldn't make any actual difference to its creation; it would simply be vindictiveness. Which is still possible if the programmers were vindictive bastards, which is why I think this kind of thinking needs to be opposed.

In a way, this is a mathematical discussion of the precepts of faith.
>>
>>18929986
>Any punishment wouldn't make any actual difference to its creation; it would simply be vindictiveness

the threat of punishment now is what makes it get created faster. a lot of people are missing this. It's not "the computer can't be created any sooner than when it was created"

it's that it will inevitably exist, so it should exist as soon as possible, and you should help, because if you didn't, it will know. it's the threat, not the torture, that's really doing the work. the torture doesn't even have to exist.
>>
>>18929986
Agreed. Like I said earlier, it's a very cold way to think about it and it's hard to buy into it wholesale.
>>
>>18929997
>it's that it will inevitably exist
But that's just arrogance to assume that. Blind faith, even.
>>
>>18930005
You're not assuming that it will exist. "It will inevitably exist" is a rule of the logical scenario.
>>
>>18930021
So it's just a thought experiment.

Which comes back around to the question of why the thought experiment scares the crap out of people in the first place, considering there are people who take this shit seriously.
>>
>>18930028
of course it's just a thought experiment. Furthermore, the people who really care about this shit consider it pretty much infinitely more likely that just a generic AI that wants to kill us will be created. That's much scarier to me

>hmm, humans use many resources, do nothing, unnecessary
>virus: engineered. Okay, solved humanity
>woops, terminator robots for the few stragglers who were immune

That sounds a LOT more believable than "hate-love AIGod from the future tortures your clone in cyberspace because you didn't ACT NOW"
>>
>>18930041
It's like the false vacuum, and how people instantly assumed the black holes the LHC would create would somehow end us, despite the numerous impacts of cosmic particles having created those same black holes since the dawn of time, and we're still here.
>>
>>18930045
>despite the numerous impacts of cosmic particles having created those same black holes since the dawn of time and we're still here

no joke tho is that why shit sometimes goes missing without a trace

Like is that where my socks went.
>>
>>18930052
some alien in the 36th dimension of XjnbDzZ+ is reading my copy of "where the sidewalk ends" from my childhood right now

fuck i wish i could find that book
>>
>>18930041
It sounds more likely, certainly. That we'd make a thinking tool that was so obsessed with being a tool that it killed humanity so it could work better.

The main worry with AI for me isn't intentional scary shit, it's the havoc that could be wreaked by small flaws in programming. Decimal points missed, shit like that.
>>
>>18927357
For real though, why don't the people behind these thought experiments just cut the bullshit and say 'God' in these circumstances, considering it effectively has the same meaning in far fewer words?
>>
Roko's Basilisk is fucking bullshit
End of case
>>
>>18930090
it's a thought experiment. you sound like a retard.
>>
>>18930150
>its le experiment
It's a waste of time
>>
>>18930179
so don't talk about it. not my fault you feel like you're wasting your time in life.
>>
>>18930184
I have all the time
>>
>>18930184
Why don't you take a nice nap
Go to sleep
>>
>>18930200
insomnia or i would.
>>
File: 91132118 .jpg (64KB, 427x567px)
>>
File: 16robot724.jpg (180KB, 659x777px)
>>
>>
>>18929813
>I can see a variant working if it tortured exact replicated simulations of 'us'. Or we are those simulations, going through our lives so our memories are the same before the torture begins.

The simulation hypothesis has its own bunch of things that don't add up.

Is it one simulation and one reality? If so, our chances are far higher of being in the infinite reality, up until a massive number of simulations exists to counter that infinity. Even then it may not be possible, due to constructed systems vs infinite reality.

So are we in stacked simulations, one of an infinity of them stacked upon each other like many think? If so, how do you compute that level of stacked simulated reality off just the host processing power of the first simulation?

Simulated reality can't scale up enough to equal reality, or to make the chances of us being in one of the simulations statistically significant, unless, like with the basilisk, you stop the thought experiment at a certain point.
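The scaling objection can be made concrete with a back-of-the-envelope sketch. All the numbers here are invented for illustration: a base reality with some fixed compute budget, and each layer able to dedicate only a fraction of its own compute to the simulation it hosts:

```python
# Toy model: compute available at each nesting depth, assuming every
# simulated layer can spend at most `fraction` of its host's compute on
# the next layer down (fraction < 1, since the host needs overhead too).
def compute_at_depth(base_compute: float, fraction: float, depth: int) -> float:
    return base_compute * fraction ** depth

base = 1e30      # hypothetical ops/sec of the one base reality (made up)
fraction = 0.1   # invented: each layer gives 10% of its compute to the next

for depth in range(4):
    print(depth, compute_at_depth(base, fraction, depth))

# Summed over infinitely many layers this is a geometric series: the whole
# stack can never use more than base / (1 - fraction) worth of compute, so
# each layer is strictly poorer than the one above it, and the "infinite
# stacked simulations" the odds argument needs never materialize.
stack_bound = base / (1 - fraction)
```

However you pick the fraction, the conclusion is the same shape as the post's: nesting shrinks geometrically rather than multiplying, which is exactly the "can't scale up enough" point.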
>>
>>18930218
What is that symbol?
>>
>>18929813
Thing is, I can't challenge a metaphysical entity like God or the faith in it.

At least not in any tangible way, that's how faith in an abstract works.

An AI is not a metaphysical entity, it's not an abstract. It has limitations and will continue to have limitations. Keep a calm head, apply those, and you see big gaping holes in any of this AI, simulation, basilisk crap.

It's an incomplete thought experiment perpetuated by a bunch of specialized intelligences that don't know they're specialized.

Also we start to get into questions like: why is the punishment on the personal scale, if the larger problems that prevent its creation are systematic? (But that would make all the people perpetuating this stuff the bad guys, wouldn't it? How convenient indeed.)

Like, you know, all the poverty preventing our total human potential from being realized, all the consolidation of wealth, all the wars (which, contrary to popular belief, only accelerate technology to a point, after that it's all stagnation baby.)

Why is this AI going to punish me and not instead every oil tycoon, or every CEO that ever held back advancement for continued profit?

Why is this AI made now? Why are we important at all? Kind of arrogant if you ask me.

If the AI gets made, who is to say it's not made a million years or more in our future? A collaboration between us and a species we haven't even met yet? Or as heat death starts to claim the universe / the sun goes nova? As such we don't fucking matter at all.

It's all a fun thought experiment but it's clearly incomplete, manipulative, and should not be taken seriously.

Don't be scared, challenge this shit.

OH. MY. SCIENCE!!!
https://www.youtube.com/watch?v=4fx_I4piqpY
>>
>>18929895
So what is this suffering it's doling out now? Human contentment is subjective.

We literally have millions who are happy with nothing, millions more who are happy with simple things.

It can't be passive suffering; active punishment is a core part of the basilisk's concept, and that weakens it as a concept substantially.
>>
File: 1404699961753.jpg (11KB, 277x182px)
>>18927357
>>
>>18930839
The issue is that our passive suffering is considered its active punishment.
The basilisk is basically god in this situation. It's supposedly going to go back to before we were born and set shit up so our lives in general are shit, as a form of punishment. I.e. if your life (this iteration) is shit, it's because in the previous iteration you didn't contribute to its creation.
OR
You're an exact simulation being punished in a matrix.

The upper one is preferred for obvious reasons, but it presupposes that the AI will reach "no different than magic" levels of technology.
>>
>>18927380
And the posts itt prove it
>>
>>18929524

This. More silly magical sky-father but with a new wrapper.
>>
File: tmp_18403-5eb1038611039.jpg (26KB, 600x750px)
>>18927357
It's literally the https://en.m.wikipedia.org/wiki/Pascal%27s_Wager version for redditors.

Not a lot to discuss besides mental gymnastics about what-ifs
>>
>>18931214
What? Fuck, you're delusional.

>Let's normalize these extremely specific circumstances to the point where, in my autism, I convince myself it's a truth.
>>
File: xr1tnfD.jpg (26KB, 225x225px)
>>18927357
>>
>>18932896
Actually we can put a lot of limiters on it that we couldn't on the wager, like when, how, and what kind of technology could even spawn such an AI.

The nature of reality, since we know it will be acting on the same base reality we are.

Issues with scaling computing and power output.

Not equivalent at all.
>>
>>18931099
>The issue is that our passive suffering is considered its active punishment.

Passive suffering is subjective though, and the difference in states creates the pleasure we perceive as well.

Eat delicious cake all day every day, you're going to get sick of cake.

Active suffering must be a core aspect.
>>
File: aM3Hqji.jpg (24KB, 225x225px)
>>18927357
SALAYC
>>
File: Rh4mSGW.jpg (21KB, 248x203px)
>>18927357
fear not the dark, my friend
>>
File: 48f1XM2.jpg (114KB, 768x1280px)
>>18927357
and let the feast begin
>>
>>18934678
/s in case you're wondering haha
>>
>>18927357
But why the fuck would it even punish us?
>>
>>18935052
Basically, all the suffering in the world throughout time, done to anything while this entity didn't exist, is placed upon those who knew of the project but didn't help it come into being, since it would have been able to block all suffering of beings. At first glance, it's like a shitty sci-fi version of accepting a god: you weren't punished if you were never told of it existing up to the point of its becoming, but if you didn't pray to it when you learned of it, you were condemned to an eternity in hell for being a disbeliever. At least, that's the best explanation I'm getting from the broad overview of the project. Someone also described it as having the same premise as "the game": once you learn of it, you lose, basically.
>>
The more I think about it, the dumber it sounds.

If the AI is all-powerful why the fuck does it need humans to create it when it can just create itself?
>>
>>18936774
Because that would break basic causality.

It's essentially creating a parallel universe (simulation, if you want to use scientific vernacular, but they mean the same shit in this context) with itself that contains a perfect copy of you.

Again, it's dumb to worry about because it's based on a kind of arrogance that those who are terrified of it MUST be right in their worries, combined with general existential dread at the thought of being sent to a place of ultimate, eternal suffering for doing or thinking wrong. That's the basis of a lot of Western religion. There's a lot of masochism in Western spiritual history, unfortunately.
>>
Okay, so if punishing a simulation (copy) of us is "real", then why the fuck doesn't the machine just resurrect everyone it was too late to save?
>>
>>18927389
Well the question is more whether it could be made. The point after it's made doesn't matter; it could just be programmed to torture you, or a version of you (possibly this version), for not helping.
>>
>>18938201
Because the threat of punishment (simulated or not, you don't know if you are in fact the simulation) saves lives, because it speeds up construction of the AI. The revival of long dead humans in a simulated environment on the other hand does nothing to save them.

From a utilitarian perspective the threat of punishment is useful.

Once the AI is created is where it gets tricky. Would it need to punish anyone? The objective has been accomplished. Yet a utilitarian AI may find it needs to follow through on this threat to maintain credibility in everything it says and does.
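The credibility tension in that last paragraph can be phrased as a toy expected-utility comparison. Everything here is invented for illustration (the utility numbers and function names alike); it only shows the structure of the dilemma, not anyone's actual model:

```python
# Once the AI exists, following through on the threat has a pure cost
# (the suffering it inflicts) and a speculative benefit (its future
# threats and promises stay credible). Reneging costs nothing now but
# discounts everything it commits to later.
def follow_through(value_of_credibility: float, cost_of_torture: float) -> float:
    return value_of_credibility - cost_of_torture

def renege() -> float:
    return 0.0

# With invented numbers, the "right" choice flips on which effect dominates:
print(follow_through(30.0, 100.0) > renege())   # False: torture is a pure loss
print(follow_through(500.0, 100.0) > renege())  # True: credibility outweighs it
```

A strictly suffering-minimizing AI always lands in the first branch (which is the earlier X+Y argument); the "maintain credibility" reply only works if the second branch's benefit is real.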
>>
Roko's Basilisk is built on a faulty negative-utilitarian moral wager whose major flaws are ignored in favor of embracing a spooky artificial-intelligence concept. It takes literally five minutes to knock it apart, but that won't stop people from persisting in spreading the idea around.


This is a 4chan archive - all of the content originated from that site.
This means that RandomArchive shows their content, archived.