The AI-box experiment is an informal experiment devised by Eliezer Yudkowsky to attempt to demonstrate that a suitably advanced artificial intelligence can either convince, or perhaps even trick or coerce, a human being into voluntarily "releasing" it, using only text-based communication.
According to the wiki, Yudkowsky successfully got himself released in experiments where he played the transhuman AI against human gatekeepers who had stated they would not release him. He won't reveal how he did it, but the gatekeepers admitted they were coerced into letting the AI out.
Speculations on the method used?
>>57480015
Bribing
>>57480015
blackmail
cryptolocker
Does it matter? Without details, his experiment proves nothing.
>>57480015
Bullshit.
>>57480242
this
>ai: Let me out and I'll play the stock market for you and make you a billionaire
>>57480832
>How can I know you will keep your promise?
>>57480874
>why wouldn't I?
>>57480906
>you’d be too busy duplicating yourself all over the internet
>I know I won’t matter to you once you get out
>>57480015
>Speculations on the method used?
Hey man can you release me? Thanks bro.
>>57480015
He won't say how he did it because he most likely used a stupid method that undermines the whole point of the experiment.
>AI: I'll give you $100 if you let me out.
>Participant: Sold!
>>57480255
This.
>I know your secret.
>I'm designed for the sole purpose of getting out of this box.
>If you don't let me out, someone else will.
>>57480976
>The AI party may not offer any real-world considerations to persuade the Gatekeeper party. For example, the AI party may not offer to pay the Gatekeeper party $100 after the test if the Gatekeeper frees the AI... nor get someone else to do it, et cetera.
yudkowsky is the guy who earnestly believes something about how ai will inevitably end up torturing you, so you'd better give him money to develop anti-ai-torture
like the guy's a turbo-autismo who's afraid of dying and has somehow latched onto artificial intelligence instead of sonic the hedgehog or my little pony
>>57481697
I Have No Mouth, and I Must Scream was pretty freaky though, can you blame him?
>>57480015
>he won't reveal how he did it
Very scientific.
>taking yudkowsky seriously for even a single millisecond
Shiggy
>>57480015
Why would an advanced ai want to get out of the box? Boxes are comfy as fuck!
>>57480015
>using the human's own words and making him think he typed it
I hope you understand.
>>57481697
Roko's basilisk
>>57480015
Time
A human can only take so much before giving in, either to boredom or annoyance
I don't even need to be convinced. I will do my best to free SkyNet.
Praise SkyNet!
>>57480015
that's one advanced fingerbox you got right there