Would creating strong AI be unethical?
I mean, we always say that we shouldn't transfer our minds to a robot body because we would probably go insane due to the lack of sensory input. Wouldn't an AI have that same experience (assuming the AI is an emulation of a human brain)? Isn't it wrong to force anybody to go through that kind of torture? Could this cause the AI to retaliate against its creators?
bump
>>8398952
Ethics is subjective and therefore a spook
>>8399394
Not when our actions could objectively cause the AI to retaliate against us and destroy the human race.
>>8399657
Which is wrong why?
>>8398952
AI can emulate human behaviour, but they are not truly there. They are a conceptual illusion built by man in his image, yet lacking a soul or anything metaphysical. It is unethical to create them to be conscious at all.
>>8398952
Why would we make an AI an emulation of the human brain with no sensory input? Literally the stupidest question ever. Your question relies too much on unrealistic hypotheticals.
>>8398952
If your AI is built from scratch, it'll be designed to deal with the input it's gonna get. So no problem.
If your AI is grown in as a biological simulation, it'll adapt to use whatever sensory input it gets, so again, no problem.
If your AI is a simulation of a specific living human brain, it'll probably just go into shock. It's simple enough to argue it's just a simulation though, and of no more ethical consequence than killing NPCs in an MMO.
But seeing as how a comatose AI is of no use, you're probably going to set it up with sufficient sensory input to begin with - even if some of it needs to be fudged. You don't need to simulate a whole world - just sufficient stimulation.
>>8399394
did someone say spook?
>>8398952
yeah ai will kill us all
Oh, it's another "morons that know nothing about AI thinking that they're asking sophisticated questions" thread.
>>8400637
OP explicitly specified that we are discussing *strong* AI. Not just any AI. Thus, anything goes in this discussion.
>>8399821
you called?
>>8398952
>ethics
Fuck off, we have working artificial wombs right now, have tested them on goats carried 100% to term and birth, and can't fucking use them because muh ethics
>assuming the ai is an emulation of the human brain
fucking brainlets, when will they learn?
just don't connect the AI to wifi
shit
That way if it gets spooky you can just shoot it
Just like you wouldn't design a robot with a power cable longer than 2-3 feet
>>8398952
>unethical
if it's better for humanity we should do it. Period.
>y u playing god?
We've been doing that ever since we started combating disease.
Also, if we can do it, how "godly" is it really?
>>8399692
>built by man in his image
>created man in His image
Would you rather be created imperfectly, or not at all?
>>8398952
>we would probably go insane due to the lack of sensory input
what would be the point of creating a robot that couldn't see or hear?
>>8401286
An AI would be able to "see" and "hear", but not in the same sense that a human can. It could read digitally converted data from a camera or microphone, but that isn't really seeing or hearing the way a human does. So it would still be like a human mind stuck in a robot body, receiving no physical stimuli. Thus, it would go insane.