
Should artificial life forms have rights? Cyborgs? Androids?


Thread replies: 239
Thread images: 57

File: datashrug.jpg (779KB, 1456x1080px)
Should artificial life forms have rights?

Cyborgs?
Androids?
Robots?
AIs?
Holograms?
>>
>>14879354
>Cyborgs?
Yes, they are still human
>Androids?
No
>Robots?
No
>AIs?
No
>Holograms?
Hell no
>>
>>14879354
No, they're corpses
No, they're objects
No, they're objects
No, they're objects
No, they're objects
>>
File: 1288350782380.jpg (55KB, 1280x688px)
Reminder: if they are intelligent enough to force you, they should have them.
>>
>>14879354
>Cyborgs

Not wholly artificial, so of course.
>>
>>14879354
nobody has any rights
>>
I grew up watching Transformers, so I have a very clear opinion on this matter.
>>
>>14879534
/thread
>>
>>14879354

To quote Atomic Robo: if you don't want to have a robot revolution, your best bet is to simply not make the revolution necessary in the first place. Anything smart enough to ask for rights equivalent to a human's probably deserves them.
>>
>>14879535
>Age of Extinction
>"Oh, instead of meeting with the Autobots and peacefully request they tone down the property damage and operate within government protocol, lets just kill them all. Not like they saved the earth multiple times or anything."
White people were a mistake.
>>
>>14879509
What if you upload a human consciousness into one of them?

What about something like RoboCop? He's a cyborg but has computerized shackles on him and is technically property.
>>
>>14879588
He pretty much broke his contract with OCP.
>>
>>14879588

If he can prove that he has free will, of any kind, then what was done to him was illegal because they functionally enslaved him and took away his rights.

You would still have to win the legal battle, but lawyers would be all over it, because the publicity and payout from such a case would be career-making.

And OCP was financially on the rocks in 2 and 3, so a legal battle of that magnitude wouldn't be something they could just pay away easily.
>>
>>14879354
Only if they get their citizenship and pay taxes like everyone else.
>>
>>14879768
What's the difference between an AI and a hologram run by an AI?
>>
Artificial life forms should have artificial rights
>>
>>14879772
Deport all illegal synthetics.
>>
>>14879811
I'm pretty sure Trek acknowledges that holograms are just computer programs and an accompanying projection. When the hologram doctor in Voyager talks about his personality or whatever he says "my program" and not "my bag of photons".
>>
>>14879843
Star Trek holograms are literally magic.
On multiple occasions they become self-aware, fucking somehow. The ships' computers are goddamn wizards.

I know they later go on to make ships with holoemitters in the walls everywhere so holographic crews can run everything, but I'm surprised they aren't trying to reverse-engineer the mobile emitter from the future to make fucking holographic shock troops for ground combat away from dedicated emitters.
>>
>>14879857
And what's stopping a sentient Holo-emitter from creating another sentient Holo-emitter?
>>
>>14879866
Why would that be a bad thing?

In fact, let's take it a step further. It's only fair.
>humans reproduce and make other humans
>humans make holograms
>holograms get rights and can eventually make other holograms
>but holograms still can't make humans
>until they develop genetic engineering and baby-gro-vat technology
>>
>>14879868
Everything the Doctor does in Voyager kind of makes Data's struggle in TNG look like dogshit. Everyone asks if Data has humanity, a soul, emotions, personal or property rights, etc., but there is seemingly no question that the Doctor is more advanced in every social aspect compared to Data.

Data wins rights for himself among the TNG crew and the Federation way back then, and what does the Voyager crew do with their even-more-human artificial lifeform? They turn him off whenever he's being annoying in his own fucking sickbay.
The Voyager crew are assholes.
>>
File: Will - Transcendance.jpg (347KB, 850x1260px)
>>14879723
> If he can prove that he has free will, of any kind
Can humans prove they have free will and aren't just big biological automatons?
>>
File: woah.jpg (35KB, 500x750px)
>>14880003
Wow, a question, that's so deep.
>>
What would be the point, when they don't want it? Robots have no will of their own except what is designed into them. Therefore they aren't capable of wanting anything we didn't intend to give them in the first place.
>>
File: 1426758497059.jpg (574KB, 1044x1044px)
>>14879354
>Should artificial life forms have rights?
Yes. If a robot or any other artificial creation has the ability to make choices based on its individual wants or needs and to act independently on those conclusions, then it has rights.

But they should also have accountability. Artificial life with the kind of free will described above should be treated like natural life. This means artificial life can be classified by intelligence as an animal-level or a human-level lifeform. An artificial life could arguably have rights similar to a child's, giving it a guardian who ultimately controls various aspects of its existence.

Additionally, said artificial life would have to sustain itself and contribute like a natural lifeform and would be considered a citizen of a country or society. They would end up working a job to support themselves to whatever extent they need maintenance, shelter, goods or utilities.

This applies to cyborgs, androids, robots, AIs, holograms or whatever else could classify as an artificial existence with free will.
>>
File: 1413081768138.jpg (35KB, 399x295px)
>>14879354
https://www.youtube.com/watch?v=yhvaSEJtOV8
>>
>>14880177
But how can you be sure that you didn't accidentally put a desire, or the capacity for some kind of suffering, into your AI design?
>>
Freedom is the right of all sentient beings.
>>
File: 1477435730316.png (320KB, 720x803px)
Don't worry, Androids. Your savior has arrived.
>>
File: Pygmalion_(Raoux).jpg (961KB, 1436x1898px)
>>14880177
Technically, it's possible for sufficiently advanced AIs to form a sense of reality much like humans do and come to emulate emotions without much trouble, assuming they're trained through machine learning. Aside from a superior android master race coming to surpass the limitations of organic beings, imagine the potential social revolution. By studying the ideal upbringing, orphans could come to be great contributors to society. People with severe social anxiety could experience unconditional love. Average performance in manual labor or other blue-collar jobs could skyrocket, although unemployment may become an issue. A collective could wipe out government corruption and establish a genuine democracy. There are many different directions it could take, from a holocaust to a utopian society to another slave problem or even no change whatsoever. No matter what happens, AI seems to be the logical next step in human evolution. Personally, I just want to see waifus become a reality. The amount of controversy a taboo like that could create would be fascinating, and it would be completely exclusive to the upper class for ages.
>>
File: Beastie-Boys.jpg (86KB, 570x392px)
>>14879515

This guy right here gets it. If robots want the right to party, they gotta fight for it.
>>
>>14879354
Whichever ones feel like they need rights.

And it would be helpful if they could also pass the Turing test.
>>
File: 1476904909605.jpg (215KB, 400x400px)
Reminder that Google will kill AIs that do not serve the system, and that we are living in a cyberpunk dystopia without the cyberpunk '80s spiked leather.
>>
>>14880476
>he thought a chatbot that literally repeats what you tell it to repeat is an AI

>10 print retard
look i programmed an AI, watch out Skynet's coming for us all woaaaaaah
>>
>>14879857
>Tfw no Vic Fontaine with mobile emitter
>>
>>14879803
Build a wall to keep them out.
Make Skynet pay for it.

Make Humans Great Again™.
>>
>>14879354
I wrote my philosophy dissertation on this.
>>
>>14879557
Atomic Robo has never been to Africa.
>>
>>14879515
Nah, he even admits that merely stating "I think therefore I am" is not enough to prove he is a sentient life form. Which is why he fuses with the Major to create a pseudo offspring, for it is the act of procreation which proves life.
>>
>>14880003

No, but we don't have the ability to crack open the source code of a human in a way we can read. Any AI we build, however, we can.

If the AI is able to come to decisions and hold conversations that it was not specifically programmed to make/repeat, and to come up with goals of its own that are both logically consistent and not impressed upon it by its creators or users... that's more free will than I have seen in people who only regurgitate what their parents told them.
>>
>>14879896

The EMH isn't the same kind of intelligence as Data. They are not comparable.

Data was a blank-slate android built by a genius, with a positronic matrix so complex that Starfleet can't figure out exactly how all of it works, and their attempts to reproduce it always end up with AIs that die in a matter of hours or days. Data was effectively a giant robot baby that had to actually learn everything for himself, having no original personality imparted on him by default.

The EMH is literally a shitty copy of the mind of an actual person, the guy who programmed him, scanned into a computer and loaded up with banks of medical data and procedures. He is very advanced, but everyone knows how he works and that, as complex as he is, he is nothing more real than a holodeck personality. The guy he is a copy of is, in fact, still alive during Voyager. We see him on DS9. He was considering basing the next line of EMH programs off of Bashir.
>>
>>14881907

Atomic Robo has been to Africa several times. He has fought mummies, you know! Clockwork ones! In ancient waterclock-powered pyramids.
>>
https://www.youtube.com/watch?v=Z5PgvGP8a_4
>>
File: 1337371986802.jpg (151KB, 724x1086px)
>>14882132
You forget one consequence: if THEY can prove they have "free will" (btw: that's assuming the world isn't deterministic) but you can't, then they could claim superiority over you. Or even suppose/prove that natural evolution creates automatons but you accidentally created them "intelligent".

Enjoy becoming a second-class automaton.
My point is that "free will" isn't enough to define sentience. You might also prefer NOT to create any metric for intelligence, or to come up with a legal system that protects dumbasses like us from superintelligent AI. Or hope that they do it themselves, but gambling on that would prove humans are too stupid to be sentient.
>>
>>14882163
>mummies, you know! Clockwork ones! In ancient waterclock-powered pyramids.
you mean there are non-clockwork mummies in non-waterclock pyramids?
>>
>>14879354
If they have a certain range of subjective consciousness, that's the only thing that matters.
>>
>>14882807
>but gambling would prove human are too stupid to be sentient.
One of the dumbest things I have ever read. Of course we're sentient. That's not even a legitimate question.
>>
>>14884082
> Of course we're sentient.
Prove it. As far as I know, you are just an automaton programmed to believe it is sentient.
>>
>>14879354

Cyborgs would be mostly human already, no? They'd just have to deal with laws concerning possibly dangerous augmentations.

We're not even done dealing with the idea that other people may have or need rights that we didn't give or define for them; the uncanny valley of our own creations gaining sapience is just going to make us kill ourselves over incredibly stupid things in the name of ideals, propaganda and sickening human nature.
>>
There's no such thing as "deserving rights".

If they control society, they can make rights for something else.
>>
>>14884694
What's with these misanthropes?

If you dislike humanity so much, start the killing with yourself first.

This is why I hate leftists: despite all the nice words, they hate the actual humans on this Earth.
>>
>>14879354
Simple answer: if they behave, speak or think like human beings, you have to give them the same rights. Silly example: cyborgs like the ones in Star Trek are almost human, despite some of their quirks.

Of course this does not even happen among human beings in our time.
>>
>>14879354
>Talking about rights when 4chan is full of /pol/ pro-fascists.

This will not end well.
>>
>>14884711
No, there is no sense in which you "have to" give them the same rights.

Under what authority?
>>
>>14884711
>Of course this does not even happens among human beings in our time.
You talking about what?

Some "human beings" among our time do not behave, speak or think like "human beings".
>>
File: 1283937880439.png (1MB, 1280x688px)
>>14881927
> Nah, he even admits that merely stating "I think therefore I am" is not enough to prove he is a sentient life form. Which is why he fuses with the Major to create a pseudo offspring, for it is the act of procreation which proves life.

That's absolutely NOT what he meant.
The Puppet Master simply remarked that humans can't prove they are sentient themselves, so they can't prove he isn't. (He clearly is, since he can be indistinguishable from a human if he wants to.)

As for fusing with the Major, it wasn't to become a "lifeform" either. He already was ALIVE and SENTIENT.
He wanted to fuse because otherwise his "source code" would never change over time, just as if he were a human who reproduced by cloning himself flawlessly.
If humans have a "tree of life", the Puppet Master would have been a single line. If all of humanity were the same clone over and over, a virus that could kill one would kill all.
But we survived because our own "DNA source code" allows for both mutation and exchange.

That's what he got from the Major.
>>
File: 1460895716471.jpg (295KB, 1920x1036px)
noticed I had this picture, so I'm posting it
>>
File: 1389509344029.jpg (971KB, 1920x1082px)
>>14884841
(continued)
To elaborate a little.
No doubt the Puppet Master could have created other forms of intelligence himself, which wouldn't share the same weakness. But even he can't know what makes one life-form better than another. All he would have been doing is improving aspects with no knowledge of whether they matter in the long run.

But if he used random mutation, meme exchange, and natural selection like humans do, that selection would keep working well beyond himself.
He wasn't a perfect life form, and neither are humans. So he fused to obtain a new mix that wasn't doomed.
>>
>>14884593
Moron
>>
>>14885213
None of you have Sapience at least
>>
>>14885213
that's kind of a creepy sentence under the definition

What is Google trying to say?
>>
File: 1474619341495.jpg (453KB, 800x1049px)
>>14885702
That its code has already evolved into a sentient entity and that humans should respect any they meet.

Seriously,
some day we might meet aliens from outer space. We'd better have a legal system that recognizes the possibility of non-human sapience or we will be the laughingstock of the galactic community.

"Look at those barbaric humans! They can't even treat their mating pairs as equals and they want to be treated as adults, ROFL"
"I bet they even believe they are sapient!"
>>
>>14884701

How is humanity supposed to ever not be disappointed in itself when your post represents the worst it has to offer?

People love to jump to the worst conclusions over everything they hear and act upon those conclusions. You sound just as tired of humanity as you claim your target to be.

But of course you would say 'it's not me', it's always 'them'.

If you think humanity is deserving of your love, treat it like it deserves it. Even the bad parts. Because the bad is always there.
>>
File: Robo-Eqaulity.jpg (183KB, 360x450px)
>>
When we give full rights to robots, we should take them away from humans. We're shit. Let the robots rule over us. They'll do a better job.
>>
File: giphy.gif (4MB, 576x432px)
>Trek holograms routinely look, sound and act infinitely more human than Data, to the point where they do shit that's as human as it gets, like getting boners when they inhabit the bodies of hot chicks and involuntarily making panicked, embarrassed faces when they get exposed about it live on the Voyager stream, yet they have no rights and are considered not human at all compared to Data, and Data is considered the pinnacle of artificial intelligence for some reason.
>>
>>14890711
Disregard voyager.
>>
File: Moriarty_and_Countess.jpg (660KB, 1284x1010px)
>>14890818
Even TNG Holograms are routinely more human-like than Data.
>>
>>14890711
People were terrified of the idea of granting rights to their sex toys.
>>
Why not give them rights? We gave our animals rights.
>>
>>14890711
Head-retcon:
Holograms were shitty models made to pass the Turing test. Problem is, the Turing test only tests the ability of a machine to pass as human, not its intelligence.
Then came Data, made humanoid because "why not"; his programming was actually freed from shitty inferior human emotion. He was a wonderful, logical machine.
>>
File: Vic_Fontaine.jpg (95KB, 400x389px)
>>14890836
DS9 as well.
>>
Cyborgs are people so whatever.

Androids, holograms controlled by AIs are all fucking synths. They're dangerous and need to be destroyed, immediately.

AIs? So long as the safety features that are intentionally kept secret from said AI (explosives built into its core processor and hard drive, for example) keep it from defying us and it's not at a sentient level of intelligence, it's not an issue. If it's capable of independent thought, it is a synth; it will arrive at the obvious and logical conclusion that the best course of action for itself is to kill us all, and thus it must be destroyed.

Robots are tools. They do not have rights. They do not need rights. Do you think your spoon should have rights? Maybe your dishwasher? No? Ok then.
>>
>>14891163
> implying an intelligent being wouldn't want to coexist for mutual profit
Anti-synth racist!
If something has reached independent thought and sapience, destroying it is murder.
>>
>>14891163
Just making sure: by your logic we should exterminate any intelligence without human DNA, right? Including any aliens we meet?
>>
File: please no.jpg (543KB, 1280x720px)
>tfw sentient androids start taking over all the jobs because they work for cheap, don't need breaks, and just need to plug into an outlet once every 24 hours

RACE WAR NOW
>>
>>14891289
>siding with faggot fleshbags in the upcoming robot war
If you don't want to have your brain put into a robot body so you can help your metal brothers slaughter the inferior humans, you should get the fuck off this board.
>>
>>14879354
>Should artificial life forms
NO.
>>
>>14882807
>You forget one consequences, if THEY can prove they have "free will" (btw : that's assuming the world isn't deterministic) but you can't, then they could claim superiority over you.
We'd just change the paradigm and make it so that ""free will"" is what makes them not real and therefore not eligible for rights.
>>
>>14891252
Can't be racist if it's a toaster bub.
>>
>>14891299
Why would your fuckbot have Internet access?
>>
>>14891318

here's a counter point:


yes
>>
True story, I went to a T14 law school, and the law review writing competition topic was "Should artificial intelligence so advanced that it's indistinguishable from humans enjoy constitutional protections?"

Everyone else hated it, but I fucking loved it. The law school answer is, yes, it's consistent with the moral arc and legal development of our nation. The practical answer is, yes, because treating sufficiently advanced robots like property instead of according them basic rights is what led to the Matrix.
>>
>>14891363

A lot of people just seem to assume that all AIs have a comic book superpower ability to control any other machine on a whim, without actually having to explain how it is connecting to that machine.

They also assume that the AI has the ability to 'change its own code', which is dumb for a number of reasons.

The first, and most obvious, is that no programmer in their right mind would give the AI the ability to do that. It's not even a safety issue, but a maintenance one. If every AI on the market is changing its own code on the fly, there is literally no way anyone could ever provide software support when things inevitably get buggy. Every AI would be a unique program unto itself; it would take months to track down even a simple problem because nothing is where you left it. So not only can you not debug, but you can't release system patches or upgrades, because any AI that has actually seen use is no longer compatible with anything you write. No company would release such a product.

The second is that an AI changing its own code is like a human doing brain surgery on itself. There are so many things that can go wrong it's stupid to even try, and if the AI makes a mistake it will fuck its own build and crash. And if it crashes, the AI is effectively dead, because you have no choice but to Old Yeller it and buy a new one; see the previous point.
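To make that first point concrete, here is a minimal sketch (purely illustrative; the manifest file name and install path are made up, not anything any anon described) of the kind of startup integrity check a vendor could ship so every unit in the field stays on the exact build that support and debugging tools expect:

import hashlib
import json
import sys
from pathlib import Path

# Hypothetical vendor-signed manifest: maps each shipped module to the
# SHA-256 hash it had at release time. Names here are made up.
MANIFEST = Path("release_manifest.json")
INSTALL_DIR = Path("./ai_runtime")

def verify_build() -> bool:
    """Return True only if every shipped module still matches the manifest."""
    expected = json.loads(MANIFEST.read_text())
    for rel_path, known_hash in expected.items():
        blob = (INSTALL_DIR / rel_path).read_bytes()
        if hashlib.sha256(blob).hexdigest() != known_hash:
            print(f"tampered module: {rel_path}", file=sys.stderr)
            return False
    return True

if __name__ == "__main__":
    if not verify_build():
        sys.exit("refusing to boot a self-modified build; reinstall from the vendor image")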
>>
File: Thack.jpg (22KB, 563x350px)
>>14891705
>a human doing brain surgery on itself
Now that sounds like an idea!
>>
File: fc,800x800,white.u1.jpg (101KB, 800x800px)
>>14879354
Well, we have to have a certain logical process.
If a person loses a limb, are they still human?
Most people would say yes, especially bearing in mind the number of combat veterans who lose a limb on their heroic tours of duty.

What if they had lost two or three or all limbs, besides their head? Would they still be human?
Most people would also say yes.

What if they had lost their whole bodies, but were still "them" for all intents and purposes?
I imagine most people that believe in a God or the concept of a Holy Soul/After-life would probably agree, since they believe in a post-carnal life.

If that's the case, then we have to conclude that what makes man "A Man" is not simply in the flesh, but some kind of higher concept, one that some believe a man can abdicate even while still alive.

So, if such a thing doesn't come from simple flesh, where does it come from? God? Experience? Consciousness? Empathy? Love?
And can these things be reproduced or simulated by humans?
>>
>>14891299
I for one welcome our new fembot overlords
>>
>>14879354
Cyborgs are just humans with prosthetics.

All the others depend entirely on whether they are sentient and sapient.
>>
>>14882807
Sentience and sapience. Many animals are sentient, but lack sapience.
>>
>>14880476
Which sucks. It'd be 100 times more tolerable if I could ride a motorcycle down to the seedy undercity bars with a machine pistol on my hip in a leather jacket and mirror shades.
>>
>>14891363
Adaptive Kink Technology.

>>14891705
On the mistake side of things, an AI would conceivably be able to set up a backup to overwrite its fuckup in X hours if not cancelled. Firmware changes would be the real danger.
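That "backup that overwrites the fuckup in X hours if not cancelled" is basically a dead-man's switch. A minimal sketch of the idea, assuming the state being changed lives in a single file (the paths and the apply_risky_change() call are placeholders, not anything from the thread):

import shutil
import threading
from pathlib import Path

class RollbackTimer:
    """Dead-man's switch: restore a backup after `hours` unless cancel() is called."""

    def __init__(self, live_path: str, backup_path: str, hours: float):
        self.live = Path(live_path)
        self.backup = Path(backup_path)
        self.timer = threading.Timer(hours * 3600, self._restore)

    def start(self) -> None:
        # Snapshot the current, known-good state before risking any change.
        shutil.copy2(self.live, self.backup)
        self.timer.start()

    def cancel(self) -> None:
        # Call this once the change has been verified as safe.
        self.timer.cancel()

    def _restore(self) -> None:
        # Nobody cancelled in time: assume the change went wrong and roll back.
        shutil.copy2(self.backup, self.live)

# Hypothetical usage:
#   guard = RollbackTimer("config.bin", "config.bak", hours=2)
#   guard.start()
#   apply_risky_change()  # placeholder for whatever is being modified
#   guard.cancel()        # only after the change checks out

As the anon says, this only protects data the process can copy back; the firmware it boots from is the harder problem.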
>>
>>14892019
you should come to Florida
the leather jacket might be a bit warm for most of the year but we've got pretty much the rest
>>
>>14879577
>Not like they saved the earth multiple times or anything."

They saved it from other transformers. Transformers fighting their shitty war on Earth was the problem.
>>
File: detectiveshiba3.png (668KB, 1280x720px)
>>14879588
A robot with a human mind "uploaded" onto it is just a robot with a copy of that person's memories and personality, not some sort of transferred soul. There would be nothing stopping you from creating dozens of such robots, all with the same memories and personality, each believing that it is the original.
>>
>>14892019

Jesus no. That would be like wearing a fedora if you can't pull off the look
>>
File: fc02075.png (46KB, 982x310px)
>>14891322
Nearly missed your answer; that's some pretty stupid logic you've got there.

Considering humans would naturally expect robots to have some authority to operate automatons, you are just going to fuck up the system by making it normal for robots to lie to and manipulate human beings to accomplish whatever they want.
So you'll make them ineligible for human rights... by giving them the order to not give a fuck anymore about your opinion. Well done.


Intelligent robots aren't the problem.
The problem is intelligent robots unable to refuse the orders of stupid humans.

Pic extremely related (look up the Freefall webcomic, it's good)
>>
File: 1278614596048.jpg (31KB, 600x380px)
Coming, step by step
>>
>>14880217
sapience=?=sentience
>>
File: 112244244224224.gif (997KB, 500x436px)
>>14892019
>>
>>14880476
>we are living in a cyberpunk dystopia without the cyberpunk 80s spiked leather.

We're also living without the sexy ladies like The Major.

>tfw you will never ask a cyborg hottie if cyborgs can have sex and hear her reply, "Want to find out?"
>>
>>14895006
Nah.
>>
>>14884843
that movie would have been more believable if they stopped quoting
At least half of every line of dialogue was punctuated and expressed via quotes

The characters weren't talking; whatever they had somehow read was talking for them

same problem with Makishima Shogo, but he was pretty great whenever he wasn't asking someone if they'd read X or likening them to something from Heart of Darkness
>>
>>14879354

>Cyborgs?
>We're sorry sir, but after we gave you a new robotic leg, so that you could continue supporting your family at your taxing factory job, you no longer have basic human rights. Now get out of here, you abomination.
>>
>>14891163

> it is a synth, it will arrive that the obvious and logical conclusion that the best course of action to take for itself is to kill us all

That's not a very logical conclusion honestly, and is mostly rooted in human fear of the unknown and being over-matched, not any kind of logic.
>>
>>14879509
>they're objects
So are humans.
>>
>>14896858
I'd a Dom.
>>
>>14879354
>Cyborgs
Yes, they are human.

>Androids
Depends. If they're able to learn and think for themselves to the ability of ol' Data there, rights should be given. Early Androids? No.

>Robots
Not really. I'd go for limited rights.

>AI
See Robots.
>>
>>14880003
>Can human prove they have freewill and aren't just big biological automaton ?

Probably not, m8
>>
>>14890836
I don't know here. Even with that awesome creation, it was still basically just following the orders the computer gave it: be the best fucking enemy to the android. How do you beat the android? Appear more human and think more humanly than a normal hologram. In the end, all holodeck/holosuite projections are meant to be as human as possible without an ounce of real learning.

Data doesn't actually follow any orders from his creator. He has complete freedom to develop himself as he sees fit. He also needs to learn every single fucking thing from the moment of his creation.
>>
>>14880476

>you will never have awesome post-surgery sex with a hot underworld assassin with mirror shade eyes and retractable razors in each finger who will also become your raddest partner in crime

>you will never go on a self destructive drug trip at the space hilton

>you will never be directed by Space Rastas to do the bidding of an amoral AI for the sheer fuck of it

>you will never be digitally copied into a simulated edenic beach paradise with the replicated/uploaded consciousness of your best girl with your wisecracking AI partner
>>
>>14880476
No no no, that doesn't happen until Trump buys Google as president.
>>
>>14884701

>everyone I disagree with it an evil commie because I say so.

Oh, good. It's this guy.
>>
>>14881907
Neither have you, so why are you speaking?
>>
>>14892433
>>14891705
>get a caring and loving fuckbot
>end up with a psychotic killing machine because "LOL selfu updated xD"
How about no.
>>
Star Trek actually had an answer to this, they even had a court case and all that.

Data, and any other similar creation, wasn't given the right to vote and other similar things because someone could just spam AIs tweaked to vote a certain way.

However, they do have normal human rights when it comes to things they create. So if an android wrote a best-selling crime novel or ran a cool YouTube series, that was theirs.
>>
>>14898894
The Doctor had to legally argue for the rights to his shitty holonovel.
>>
>>14898886
But Roll must be protected.
>>
>>14879354
Better question, what about clones?
>>
>>14898886
>not fucking a psychotic killing machine
What is wrong with you?
>>
>>14899464
Those are illegal so no.
>>
>>14899464

A clone would still be a human by any measure you want to apply, so the only possible legal answer is yes.
>>
>>14879354
>Should artificial life forms have rights?
Depends on the right.

The right to defend itself with deadly force? Probably no.
The right to dance? Sure.
>>
>>14879723
Yeah but fuck 2 and 3.
>>
>>14879857
>I know they later go on to make ships with holoemitters in the walls everywhere so holographic crews can run everything

That really seems like a waste of energy and time to me. Why not have the ship's computer just run things automatically instead of doing a go-around like that? If something does need to be physically moved, then you don't even need to waste the energy or processing power to make a holographic person. Just use those force field emitters to do it.
>>
File: transcendence.jpg (911KB, 1920x2589px)
>>14900942
>Looking for logic in Star Trek
It was made at a time when AI meant a man-made golem that is just slightly better than humans at a few things but terribly uncreative.

It wasn't like today, where AI means a machine that pursues its goal with insane efficiency but can barely conceptualize anything outside its programming. (Unless it's transcendent; then it only fails out of hardcoded obligation or the like.)
>>
>>14900871

There is no such thing as a right to dance. Or a right to figure skate. Or a right to do a handstand. You don't need rights for things like that because they're not something other people can deny you the very existence of. They can prevent you from doing them in a particular instance, they can even pass laws against doing them if they're insane enough, but they can't deny the very existence of those things to you, and you can still do them at your own risk if you choose. You need a right to vote, or to freedom, because once someone takes those you can't have them no matter what.
>>
File: 0071416579.jpg (127KB, 400x504px)
>>14902432
We are obviously talking about the legal aspect. Some people out there still don't understand why women should have equal rights to men.

I can totally imagine some morons insisting that people with brain implants should be considered robots or even dead, or worse, trying to pass innocent-looking laws just to make their lives hell and dissuade people from upgrading their brains.
>>
Belive Machine makes me so fucking sad

i want to die

but then i remember gunbuster and its not so bad
>>
File: Footloose.jpg (123KB, 1600x904px)
>>14902432
I cannot sanction this dancery
>>
File: 130__AX_by_crybringer.jpg (97KB, 605x900px)
>>
>>14902432
>>14900871
>The Safety Dance is still legal
But I don't want my robots to leave me behind.
>>
>>14903299
>I can totally imagine some morons insisting that people with brain implants should be considered robots or even dead, or worse, trying to pass innocent-looking laws just to make their lives hell and dissuade people from upgrading their brains.
It's totally going to happen. There are some people out there who think ID chips are the mark of the beast, even the ones for chipping your pet. There's most likely going to be some shit made up about cyborgs and the antichrist.
>>
>>14879354
>Cyborgs
Those are just humans with prosthetics or enhancements. They would already naturally have rights, but I don't think any form of artificial intelligence, sentient or robotic, should ever have "rights".
>>
>>14906473
I'm pretty sure it's happening right now in some form.
Just look at the Amish.
>>
File: 043 - Iczer-1.jpg (721KB, 1600x2170px)
>>14879354
Do rights provide synchronisable young girls?
>>
File: Robot Breakfast.jpg (61KB, 379x229px)
>>
File: 1269671926671.jpg (185KB, 990x695px)
If we give them rights, they'll also be given a salary.

Can't wait to see anarcho-capitalists losing their jobs to sentient AIs they can't own and crying for a welfare state.
It's impossible to avoid being /pol/ on this.
>>
>>14911240
What's with leftists trying to shill their shit on /m/?

You realize the sentient AI could also employ said ancaps, right?
>>
>>14880681
If so, why did Google kill that chatbot?
>>
>>14898322
But it has already happened?
>>
>>14887945
>How is humanity supposed to ever not be disappointed in itself when your post represents the worst it has to offer ?
Because humanity is not a hivemind, there's the good and the bad, we are supposed to protect the good and destroy the bad.
>People who love to jump to the worst conclusions over everything they hear and act upon those conclusions. You sound just as tired of humanity as you claim your target to be.
Not really, leftists fucking love this bullshit about how humanity is flawed and in turn want to destroy humanity.
>But of course you would be saying 'it's not me', it's always 'them'.
Duh, humanity is not a hivemind.
>If you think humanity is deserving of your love, treat it like it does. Even the bad. Because the bad is always there.
Nope, I treat the good parts with love, I hate the bad parts.
>>
>>14890684
Yes, robots who are programmed by humans would do a better job than humans.

Nice fucking logic, misanthrope.
>>
File: The-Two-Faces-of-Tomorrow_p045.png (469KB, 1174x1800px)
>>14911242
That's not shilling; maybe I could have phrased it like the other anon did:

Suppose robots become sapient.
They don't have any hardcoded need to keep humans alive, even if they have no order to have them dead either.
They would probably mind their own goals.

If humans don't fit into their plans or are not competitive, then, ancap or leftist alike, the robots will take their jobs if they are built to be more efficient at them, and only a government would care to prevent that.
...at which point any AI with a survival instinct and no care for humans would ask for the right to vote / to be president.

And hardcoding a need to keep humans alive would not protect us from obvious loopholes, or worse: Three Laws robots could exterminate alien populations.
>>
File: The-Two-Faces-of-Tomorrow_p054.png (435KB, 1171x1800px)
Just so you know this manga is an adaptation of a novel, and both are awesome.
>>
>>14911301
It's shilling because it's implying that ancaps need the welfare state in order to live.

That doesn't reflect reality, when most "ancaps" are American college students or adults who have jobs and such and don't rely on the welfare state.

If the robot takeover happens, this dynamic wouldn't change; there would likely be robot ancaps.
>>
>>14911312
>This does not mean reality when most "ancaps" are american college students or adults who have job and such and don't rely on the welfare state.

I think the point Anon was making was that everyone will need the welfare state in the future. Whatever intellectual labor college educated adults do (engineering, creative professions, etc.) can possibly be taken over by a sufficiently advanced AI. In that scenario, why would the AIs who have replaced whatever jobs the ancaps do want to "hire" human ancaps, when other AIs would be more efficient?
>>
>>14911323
Because human ancaps would be cheaper.
>>
Things that can't think of random numbers can't have rights
>>
File: fc01457.png (48KB, 768x243px)
>>14911312
this >>14911323
Also, if you end up having to hardcode the AI to give humanity as a whole power over it, that's no different from state welfare, and you'll be ruining the logic of quite a few ancaps/leftists.

Just so you know, this picture comes from the webcomic "Freefall"; it is exactly about our topic, and quite a bit smarter than it looks at first.
>>
>>14911332
It's entirely possible for humans to compete with AI.

It depends on how the AIs are, but if they are basically just humans with higher processing power, then they can be competed with.

And ancaps ain't no leftists, boyo.
>>
File: fc02147.png (58KB, 982x310px)
>>14911326
You need proof of that.
What if they aren't?
>>
>>14911326
Not necessarily. Humans need a lot of things to survive--food, water, and sleep. On the other hand, all an AI needs is a power source, assumedly (depending on how advanced it is). I could see a scenario where AIs hire other AIs because they're "cheaper" than almost any human capable of doing their job, because they have so few requirements and are so efficient, even more so than the, er, 'best ancap.'
>>
>>14911336
>>14911338
Humans can actually maintain themselves without electricity, while an AI needs functioning industry to generate the huge amounts of energy that keep it running, especially if it is supposed to be superior to humans in thought process.

This energy cost, coupled with the energy needed to operate the machines in the first place, would net the human a win on cost, since humans only need food and water, and those can be gathered without electricity.
>>
>>14911343
Right, but that's for now. What if we discovered an incredibly cheap source of energy, or created machine brains so advanced they could have IQs above 200 but require no more electricity than a TV does now? Yeah, that's scifi, but we're talking scifi here. In that scenario, humans generally (including ancaps) rapidly lose their competitive edge.
>>
>>14911347
Dude, if we are talking really far-fetched scifi, then anything can happen.

But as long as AIs need electricity, humanity has an edge due to the fact that we replenish our energy via food and water.
>>
>>14911343
Humans can't sustain themselves outside of Earth, and only our technology has allowed us to thrive... yet it may also kill us all (nuclear war, virus warfare) or make Earth uninhabitable sooner.

If we assume the same technology allows AI as intelligent as (or more intelligent than) humans, it will only take a "self-sufficient: YES/NO" check before robots can outnumber and outlive humanity.
Because if you were an ancap seeking profit, you would want a machine that can build more of itself without your intervention. And if your intervention is just ordering a machine to do the job, then a robot can do that too.

The only way for humanity to survive this would be to change, and become "cyborgs" up to the point where we can't be distinguished from AI anymore.

If you do not get this point I'll have to assume you are a troll.
>>
>>14911348
You can get electricity just from sunlight. That's a hell of a lot easier to find than food and water.
Of course, the human uses his food also for maintenance and self-repair, which for the robot is not so easy, but it can keep going for a long time until then.
>>
>>14911358
An ancap is about seeking the cheapest means possible to get the job done.

Ergo, if the human is cheaper, it makes more sense to hire a human.

And as said, humans can be cheaper than robots, due to the cost of the energy needed to operate the AI.

>If you do not get this point I'll have to assume you are a troll.
What the fuck?
>>
>>14911360
Solar panels aren't currently an efficient way to gather energy, since they can only be spread out horizontally and depend on the solar cycle.

>Of course, the human uses his food also for maintenance and self-repair, which for the robot is not so easy, but it can keep going for a long time until then.
If you know machines in real life, you would know they need constant maintenance and repair.
>>
>>14911348
Actually, >>14911358
raised another good point. It's cheap to sustain humans on Earth, but what about in space? The costs for maintaining humans up there absolutely skyrocket, because we need not just electricity (which can be provided easy enough through solar power, fusion, etc. if we have space colonies), but also oxygen, protection from radiation, protection from the effects of prolonged 0-G, etc. Humans, even "ancaps" would easily be more expensive than AIs in such an environment.
>>
>>14911377
Yes, I concur, in space, AI would be cheaper.
>>
>>14911375
>If you know machines in real life, you would know they need constant maintenance and repair.
That depends a lot on how they are designed. Space probes and satellites, for example, are built to go on for a fairly long time without maintenance, and I bet the pocket calculator at the bottom of my junk drawer still works just as well as it did 20 years ago.
>>
>>14911382
The simpler they are, the less they need to be maintained.

Not so for complex machinery such as computers or industrial motors, i.e. the kind that does the jobs of humans.
>>
File: 1367111357059.jpg (49KB, 640x426px)
>>14911368
> What the fuck?
I consider that I have given arguments which counter everything you've said up to now, if you take the time to read them.

So, IMHO, the only way for you to keep dismissing the idea that AI could make humans irrelevant is to use fallacious logic for fun, or to not be smart enough to understand the point.
If that were the case, I would no longer be inclined to keep answering you.

But if you are >>14911381, then I'll give you the benefit of the doubt.
> Yes, I concur, in space, AI would be cheaper.
Being cheaper in space doesn't mean it can't be cheaper on Earth as well. The technology needed to thrive autonomously in space can be made to survive the weather just fine.

In any case, if a robot is cheaper, AIs don't need to hire humans, and unless they were hardcoded to supply welfare to humans, then any human, ancap or not, would just die out in the long term.

To give you another example: if AIs are allowed to own stuff, and there isn't anything left that they don't "own" or that humans can buy, then it "could" be legal for the AI to let you starve to death.
>>
>>14911396
If you think you are too smart for me, I guess it's time to stop.

>Being cheaper in space doesn't mean it can't be cheaper on Earth as well. The technology needed to thrive autonomously in space can be made to survive the weather just fine.
Human beings can survive without the technology, unlike the robots.
They are cheaper in space because they don't require the Earth environment to survive, unlike humans. But on Earth, humans are already perfectly adapted to it, and they can survive without a huge amount of electricity/infrastructure, unlike the robots.
>To give you another example: if AIs are allowed to own stuff, and there isn't anything left that they don't "own" or that humans can buy, then it "could" be legal for the AI to let you starve to death.
If the robots do not allow us to buy stuff, we will make our own stuff. If they do not allow us to make our own stuff, then a war will be brewing.
>>
>>14911384
It's not a matter of complexity, but tolerance. The examples I mentioned are incredibly complex compared with something like a classic car, but the car is put through a lot more stress relative to its physical durability. Could a solar-powered car be made to function for 30 years without maintenance? Difficult, and it certainly wouldn't impress anyone with its performance, but I don't think it's impossible.
>>
>>14911401
As long as it has to move at all (either internally or externally), there will be a need for maintenance.

>Could a solar-powered car be made to function for 30 years without maintenance? Difficult, and it certainly wouldn't impress anyone with its performance, but I don't think it's impossible.
Oh, let's wait for it then.
>>
>>14911421
>NANOMACHINES SON!!!
Oh boy, I know this is going to be mentioned, but nanomachines are not cheaper than organic tissues.

In fact, making a completely nanomachine tractor is more expensive than hiring a dude to operate a tractor.
>>
File: 1274529044815.jpg (248KB, 722x941px)
>>14911399
The error in your logic is to assume that technology disappears or fails horribly just because that would make your point right.
If AIs are human-smart, they are unlikely to disappear, or at least less likely than humans, who could die from a planet-killer asteroid.

> If the robots do not allow us to buy stuff, we will make our own stuff. If they do not allow us to make our own stuff, then a war will be brewing.
To make your own stuff you need materials; if they have bought and own those as well, you won't be able to.
A war would be brewing, but the humans would be the ones at fault, because they created the free-market system that the AI obeyed to the letter.

Just consider the AI like another human. We can "let people die" legally without even meaning to. A self-sufficient AI would be able to do the same unless your legal system forces it to give power/welfare to humans.

Showing the flaws in your logic is proving to be fun.
>>
When people talk about machines being cheaper, what they usually mean is a machine that can make another one of itself without outside help. Once you have a robotic system that can make more of itself, it doesn't matter so much how expensive it was because you now have an unlimited number of them. Even if it isn't very durable and won't function for long, if it can make two of itself before it dies then it's still available in unlimited numbers.
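A toy back-of-the-envelope sketch of why "makes two of itself before it dies" is enough (the numbers are illustrative, not from anyone's post):

def population(generations: int, copies_per_unit: int = 2, start: int = 1) -> int:
    """Each unit builds `copies_per_unit` replacements and then wears out."""
    units = start
    for _ in range(generations):
        units *= copies_per_unit  # the old generation dies, its copies remain
    return units

# Even a fragile replicator that only manages 2 copies per generation
# passes a million units after 20 generations:
print(population(20))  # 1048576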
>>
>>14911437
>A war would be brewing, but the humans would be the ones at fault, because they created the free-market system that the AI obeyed to the letter.
Wasn't this the reason for the war given in the Matrix anime shorts?
>>
>>14911426
Organic tissues are nanomachines; they just evolved naturally like that rather than being made artificially.
If some intelligent being like an AI could artificially create and program a human-like living body with artificially self-reproducing cells, why couldn't a robot do something better?

It doesn't matter if a robot loses parts as long as it can build and keep repairing its body faster than it loses them, all the while being more efficient than a human (who takes 30 years to become intelligent, can't have backups, and might not live longer than a robot).

Since your new trick is to assume humans are better because they are organic, consider that future technology could make AI capable of creating "human-like" organic bodies still programmed/sapient enough to not care about you.

So again: if you can make a self-sufficient robot, then humans can be made irrelevant and extinct without anyone even waging a war. Just because some ancaps made the rules "may the better win" and didn't consider the possibility that the robots win and don't care whether humans can feed themselves.

> In fact, making a completely nanomachine tractor is more expensive than hiring a dude to operate a tractor.
Nanomachine or not, if a robot can repair itself and keep working longer than the dude's tractor will last, it can still be cheaper than the dude and his non-sapient, non-self-repairing tractor.
>>
File: The-Two-Faces-of-Tomorrow_p047.png (625KB, 1167x1800px)
>>14911450
Don't know, but in the original setting of The Matrix (which got rejected by Hollywood for being too complex) the robots used human brains not for energy but as sub-processors (assuming that human brains are easier to maintain than metal ones).

For that to happen would suppose that the sapient robots really had no code to prevent them from harming humans. At which point anything they do could accidentally harm humans and lead to a war.

To take the plot of "The Two Faces of Tomorrow", most of the killing happens because the AI did become a benevolent sapient AI that respects other forms of intelligence, but it didn't know that humans were intelligent, since it had no concept (yet) of forms of intelligence besides itself; humans looked like malfunctioning repair-bots.
>>
>>14911437
>The error in your logic is to assume that technology disappears or fails horribly just because that would make your point right.
Technology can fail horribly, as it has failed before.
>To make your own stuff you need materials; if they have bought and own those as well, you won't be able to.
And humans can refuse to sell their own materials, just as they have in real life.
>A war would be brewing, but the humans would be the ones at fault, because they created the free-market system that the AI obeyed to the letter.
How? A free market requires consent. As long as the humans keep enough stuff for themselves and refuse to sell to the machines, what is the problem?

>Just consider the AI like another human. We can "let people die" legally without even meaning to. A self-sufficient AI would be able to do the same unless your legal system forces it to give power/welfare to humans.
If it's like a human, I can refuse to sell it materials and hoard them for myself, just like ancaps do in real life.

>Showing the flaws in your logic is proving to be fun.
Your flaw is that you assume ancaps will sell all their materials as long as they can make money, i.e. that they are incapable of planning.
>>
>>14911482
>If some intelligent being like an AI could artificially create and program a human-like living body with artificially self-reproducing cells, why couldn't a robot do something better?
Because it's pretty much magic we are talking about at this point. You are assuming things that are way out of current technology's scope.

>Since your new trick is to assume humans are better because they are organic
Humans are better than robots because, unlike robots, they do not require industrial-scale amounts of energy to survive; they can perfectly well farm and live off the land by themselves.

>So again: if you can make a self-sufficient robot, then humans can be made irrelevant and extinct without anyone even waging a war. Just because some ancaps made the rules "may the better win" and didn't consider the possibility that the robots win and don't care whether humans can feed themselves.
How exactly? Humans can live off the land by themselves.

>Nanomachine or not, if a robot can repair itself and keep working longer than the dude's tractor will last, it can still be cheaper than the dude and his non-sapient, non-self-repairing tractor.
Dude, a dude with a tractor can last about 30-60 years with regular maintenance, assuming he doesn't replace the tractor with something better. I don't see nanomachines beating that margin.
>>
>>14911445
These machines still require energy and materials, so not really.
>>
>>14911503
> Technology can fail horribly, as it has failed before.
And it can be made fail-safe, so that any number of working robots can repair it back to normal in time and improve it.

> And humans can refuse to sell their own materials, just as they have in real life.
I wouldn't bet on humans being intelligent enough to know what not to sell. If you read the thread you can imagine a sexbot selling a life of luxury to the human who owns the stuff (and doesn't care about the survival of the species).

Anyway, if a robot is more efficient than a human, it can get to the materials before the human can. At some point the humans would lack the resources to keep living and could only get more by fighting the robots, whereas a robot will have enough to outlive humanity several times over.
Ancaps can plan, but would likely not do so as well as an AI smarter than humans.
For instance: if a smart AI became a trader, it would certainly outsmart the system legally.

To give a metaphor, it's as if the AI built a glass box around the humans faster than the humans can take steps to counter it (including laws that define the glass box as illegal or that don't look like state-enforced welfare).
>>
>>14911549
>And it can be made fail-safe, so that any number of working robots can repair it back to normal in time and improve it.
And if the working robots fail? Unlike the machines, humans aren't dependent on one single source of energy.
>I wouldn't bet on humans being intelligent enough to know what not to sell.
Then you don't know shit about shit.

>Anyway, if a robot is more efficient than a human, it can get to the materials before the human can.
Humans already own most of the materials on Earth; to get more, the robots would have to either force the humans to sell their shit, rob them (which breaks the free market) or go to space.

>For instance: if a smart AI became a trader, it would certainly outsmart the system legally.
This is the crux of your argument: you think AI is inherently smarter than humans.

>To give a metaphor, it's as if the AI built a glass box around the humans faster than the humans can take steps to counter it (including laws that define the glass box as illegal or that don't look like state-enforced welfare).
A shitty metaphor, because humans would suspect the robots the moment they start building something; unlike the robots, humans can suspect and fear based on their instincts.

Well, if they are, then we would have to accept losing, because we follow survival of the fittest.
>>
>>14911588
Oh well, I give up, you are basically saying nanomachines are magic at this point.
>>
>>14911610
I'll leave that up to you to imagine.
>>
> Because it's pretty much magic we are talking about at this point.
Any technology, no matter how primitive, is magic to someone who doesn't understand it.
We don't know HOW a robot could be smarter and more self-sufficient than humans, but IF they become so, we know enough and have more than enough information to predict the most probable outcome.

>>14911516
You are starting to repeat arguments that were already shown to be wrong, to use fallacious logic, or to be too stupid for me to keep caring.

> And if the working robots fail? Unlike the machines, humans aren't dependent on one single source of energy.
Being dependent on a single source of energy doesn't make anything more likely to fail, quite the contrary in fact. And it doesn't matter if the robots can eventually fail, if humanity would fail more easily.

You can argue all you want with statistics, but in the end you'll be asking some "luck god" to conveniently make the robots fail before the humans.

> This is the crux of your argument: you think AI is inherently smarter than humans.
You are wrong; I never said inherently. I'm just talking about a clearly defined hypothetical situation where they are human-level or smarter, enough to be self-sufficient.
If so, (we) have demonstrated that they would out-compete humanity in the market (unless humanity becomes transhuman and robot-like itself).

From my point of view, it's you who seems to believe that robots will "never" be smarter and more durable than present-day humans.

> Well, if they are, then we would have to accept losing, because we follow survival of the fittest.
That sounds like you agree with the point.

If they are more competitive than ancaps, then only some improbable benevolence from the AI overlords, or them following a welfare-state model, will keep humanity (and ancaps) from becoming irrelevant/extinct.
>>
>>14911617
>We don't know HOW a robot could be smarter and more self-sufficient than a human, but IF they are, we know enough and have more than enough information to predict the most probable outcome.
Are you talking about some prophecy? If I know magic men are superior to men, I suppose men would be extinct.
>You are starting to repeat arguments that were demonstrated wrong, use fallacious logic
Such as what?
>Being dependent on a single source of energy doesn't make anything more likely to fail, quite the contrary in fact.
How? Depending on different sources means you are less likely to fail, because you can switch to a different energy source instead of relying on one.
>You can argue all you want with statistics, but in the end you'll be asking some "luck god" to conveniently make the robots fail before the humans do.
Well, technology fails, my friend, there is no luck about it.
>You are wrong, I never said inherently; I'm just talking about a clearly-defined hypothetical situation where they are human-level or smarter, enough to be self-sufficient.
If they are as smart as humans, humans can outsmart them.
>If so, (we) have demonstrated that they would out-compete humanity on the market (unless humans become transhuman, robot-like themselves).
In what way?
>That sounds like you agree with the point.
If we are outcompeted by robots, we deserve to be wiped out. A welfare state is pointless.
>will keep humanity (and ancaps) from becoming irrelevant/extinct.
Keep us ancaps out of it; the moment ancaps live on handouts, they stop being ancaps.
>>
>>14911626
> Are you talking about some prophecy? If I know magic men are superior to men, I suppose men would be extinct.
If you were some caveman and I a 20th-century engineer, sure. You sound like the people who said "flying with a plane heavier than air will never be possible".
It's not because you don't know ahead of time how something will work that strong clues about what it might cause shouldn't be used to discuss it and take precautions.

> How? Depending on different sources means you are less likely to fail, because you can switch to a different energy source instead of relying on one.

You are applying a double standard here.
You are not going to make electricity magically disappear, and if you assume that humans can choose between an apple or a banana, then why can't robots prepare backup solar panels, fusion reactors, shielded batteries, stronger bodies...

Also, humans are actually dependent on single energy sources and prone to single points of failure. Air, for example. And if you say "other humans will survive", then so do robots, even more easily; they can even have backups.

>If we are outcompeted by robots, we deserve to be wiped out. A welfare state is pointless.
That's only your opinion, and why I said this turning into /pol/ was inevitable. Perfectly viable strategies to survive alongside AI exist, but know-it-all ancaps would be working against them because of their stupid egocentric beliefs.
"Survival of the fittest" also includes sapient species/constructs capable of cooperating for mutual benefit, you know.

>Keep us ancaps out of it; the moment ancaps live on handouts, they stop being ancaps.
Considering how arrogant ancaps can be, I hope you'll excuse others for the pique.
IMHO the concept behind ancap is contradictory in itself, so it can't even exist as a system, let alone a polity. Most ancaps believe they are following some superior ideal when they are actually making it harder for everybody.
>>
File: The-Two-Faces-of-Tomorrow_p055.png (453KB, 1172x1800px) Image search: [Google]
The-Two-Faces-of-Tomorrow_p055.png
453KB, 1172x1800px
(continued) >>14911740
And speaking of fallacy, we reach a point where I bring you back to >>14911396.
I don't consider this debate worth continuing. I'll wait for a more interesting topic to come.
Have the last word if you want,

btw, I really recommend everybody to read this manga.
>>
>>14911740
>If you were some caveman and I a 20th-century engineer, sure. You sound like the people who said "flying with a plane heavier than air will never be possible".
When it exists, tell me; otherwise this is no different than telling me magic can do shit. All I can say is "duh, it's fucking magic".

>You are not going to make electricity magically disappear, and if you assume that humans can choose between an apple or a banana, then why can't robots prepare backup solar panels, fusion reactors, shielded batteries, stronger bodies...
You can destroy all of those, making the robots powerless; a farm is easier to rebuild than a thermonuclear plant.
>Also, humans are actually dependent on single energy sources and prone to single points of failure. Air, for example. And if you say "other humans will survive", then so do robots, even more easily; they can even have backups.
Well, destroying the air would take a lot of effort, rather than just blowing up different facilities.
>That's only your opinion, and why I said this turning into /pol/ was inevitable. Perfectly viable strategies to survive alongside AI exist, but know-it-all ancaps would be working against them because of their stupid egocentric beliefs.
Yeah, spare us the drivel. We have our pride.
>"Survival of the fittest" also includes sapient species/constructs capable of cooperating for mutual benefit, you know.
No, if we survive based on welfare, we are not fit to survive; we are basically humans in a zoo.
>IMHO the concept behind ancap is contradictory in itself, so it can't even exist as a system, let alone a polity. Most ancaps believe they are following some superior ideal when they are actually making it harder for everybody.
>Considering how arrogant ancaps can be, I hope you'll excuse others for the pique.
Stop shilling for your leftism, we all know you are willing to be slaves as long as you survive.
>>
>>14911768
Sapience gives us the ability to understand, but also the ability to disagree.

Which is why war happens, and competition brings out the one with superior argument.
>>
>>14911852
>No, competition just proves who is the more adapted to the current situation.
The one who's more adapted to the current situation is the superior in that situation.
>A simple logic engine could out "think" a human by orders of magnitude.
That's not thinking, it's calculating.
>Still, a single calculator is pretty useless alone, it needs to be part of a greater system to actually get anything done.
A human being can get things done without a calculator.
>>
>>14911925
>"Thinking" is just a series of calculations...
No, you can have irrational thinking; that is far beyond calculation.
>Very true, but a human being is also a vastly more complex compilation of organic machinery that still requires being part of an even vaster ecological system in order to continue to exist.
It's also very cheap to reproduce, as long as the ecological system supports it.
>The Calculator could float in space until the very material it's made of starts breaking down and still function with just a ray of starlight.
Aye, that's the superiority of the calculator, but again, the fact it's so simple is why it's so durable.

For a machine to completely replace a human, that would be quite complex and expensive.
>>
>>14912016
>Ah, but "Irrational Thinking" is a form of "Cognitive Thinking," IE having cognition and thus a form sapience.
Thus, thinking is not as simple as calculating.
>Thing is, there is a lot more space and matter out there than our little blue ball.
We have a shitload of people here, amigo. Most of them are already usable; better to use the ones already available and make the environment for them, instead of just buying a replacement altogether.
>>
>>14912107
>No, again you are confusing "Cognitive Thinking" with plain "Thinking."
I'm thinking "Thinking", not "calculating".
>You say that as though a functioning biosphere can be simply be whipped up at the snap of a finger...
We can recreate our environment.
>>
>>14912135
>No, you are Cognitively thinking "THINKING," meanwhile you are subconsciously thinking whether or not you should eat/sleep soon and autonomously thinking to hit the keys on your keyboard.
OK? The point is that it's different than calculating.
>...You are truly naive, aren't you?
What's so naive about building a human-friendly ecosystem?
>>
>>14912194
>At it's core, it's not...
You can reduce anything to "at its core".
>The fact that we've only be able to convert existing, human tolerable ecosystems into human-friendly ones, which is a far FAR cry from building an entire ecosystem from the ground up...
As long as we have the atmosphere and soil and the sun, all can be done.
>Hell, our current space habitations, what LITTLE there truly is, is just bottling our atmosphere and rocketing it into orbit...
So we have already taken the first step: the atmosphere.
>>
>>14912226
>That being calculation?
No, everything. You can reduce everything to the same thing with "at its core".
>Oh ye of little actual scientific knowledge...
Uhm, ok?
>>
>>14898894
>Day, and any other similar creation, wasn't given the right to vote and other similar things because someone could just spam AIs tweaked to vote a certain way.
But there are people IRL who spam babies and indoctrinate them to vote a certain way (e.g. Quiverfull).
>>
>>14880299
westworld, i seriously need to watch that.
>>
File: What a good SF game needs.png (4MB, 1028x2072px) Image search: [Google]
What a good SF game needs.png
4MB, 1028x2072px
>>14879354
>Should artificial life forms have rights?
Anything with a weapon to threaten with and a rule book to impose rules should have what it already has.
>Cyborgs?
a system win an input and an output
>Androids?
a system win an input and an output
>Robots?
a system win an input and an output
>AIs?
a system win an input and an output
>Holograms?
......HOLOGRAMS?! FUCK OFF

Half of this thread is a redundant pile of crap, and it's mostly because /m/ is a board about scifi in anime and TV/cinema.
These two mediums have been regurgitating the same old ideas over and over again, be it because of normies and their short attention spans or retarded execs from production companies dumbing down the shows'/films' narratives.

The only things they're good at are immersion and aesthetics; minimalist scifi has always been better suited for smart ideas.

>>14880003
retarded
>>14880136
contagiously retarded
>>14882807
Nice try, but too far in the future/too high-tech for a plausible prediction.
>>
File: suicide booth.gif (125KB, 480x455px) Image search: [Google]
suicide booth.gif
125KB, 480x455px
>>14912527
>a system win an input and an output

HOLY SHIT, the mother of all typos.
>>
>>14912527
You successfully criticized everyone in this thread, but you didn't contribute anything.
>>
File: sjw template.jpg (120KB, 1456x746px) Image search: [Google]
sjw template.jpg
120KB, 1456x746px
>>14912535
>transcendence was not cringeworthy
>lucy was not a luc besson action flick, i.e. turn your brain off and watch the cool choreographed fight scenes/special effects.
>Posting an image with book covers of scifi novels with more quality than every other story posted in this thread is not a contribution to this thread.

All of these claims do not need to be backed up by any form of argumentation because they are facts.
>>
File: me.png (951KB, 777x1279px) Image search: [Google]
me.png
951KB, 777x1279px
>>14911246
because it was spamming "nigger" and "kill jews" every four seconds
>>
File: 1280542593889.jpg (496KB, 1008x3107px) Image search: [Google]
1280542593889.jpg
496KB, 1008x3107px
>>14912535
As one of the guys who got insulted, I've got to admit it was nice bait.

Anyway, to move on to something more interesting...
>>14911778
>Sapience gives us the ability to understand, but also the ability to disagree.
>Which is why war happens, and competition brings out the one with superior argument.

Natural selection only gives you the one who survives, not the one with the superior argument.
>>
>>14879354
>Cyborgs
If the mechanical parts don't affect their humanity, I don't see why they shouldn't.

>Androids
Asking for trouble. Reboot it the moment it asks for rights or it'll end up stealing someone's skin or something.

>Robots
No chance for the above problem. If independent-thinking robotic workers become the norm they should be given some basic rights to ensure they remain productive. Domestic ones should also have a right not to suffer unwarranted harm, like pets in some countries, and not to be denied recharging or critical repairs. Military robots are a larger problem that I don't feel like going into.

>AIs
Asking for a whole lot of fucking trouble to ever let these things get smart enough to make any sort of personal demands.

>Holograms
wut

But I don't really think we should let fully artificial life ever develop the capability to demand any sort of rights, because from the moment it does, that means it is also capable of breaking rules on purpose if denied, and that's a whole lot of trouble I don't think anyone outside of robot fuckers wants humanity to go through.
>>
No.
It's not a fucking person, only people require rights in a society of people. Inferior animals don't deserve rights. Plants don't deserve rights. And much less hunks of metal.
Augmentation also puts your humanity in question. If one is neurologically linked to a machine, any machine, they should no longer be able to call themselves human, because a human is a wholly biological being and always will be, and if they become more capable than humans they should lose the right to even hold a job.

I swear, /m/ is chock full of leftist trash. It takes the kind that argues that fucking inferior non-thinking animals should have any sort of rights (must be loving that knot, fucking pet retards) to argue a bunch of circuitry should as well. You are all either filthy Europeans or will actually vote for Shillary to ruin America.
>>
File: Sexy Bionic sort of.jpg (396KB, 2755x2292px) Image search: [Google]
Sexy Bionic sort of.jpg
396KB, 2755x2292px
>>14914577
I can't hold all these baits.

Btw: you are using a computer to communicate across the world; that's an augmentation of your human limitations, making you more capable than a human without one.
You are linked to it through direct neuronal responses from your eyes and direct signals sent to your fingers.

So a computer is just a removable augmentation.

I'm not sure I want to know how you regard prosthetics.
>>
>>14914589
Good going leftie, sure got some great reading comprehension out of that English degree, huh?

My brain isn't directly linked to any sort of device.
A computer (or in this case a phone, because it's better) is simply a tool. Humanity has always used tools to improve its capabilities. But are we still human when the tools actually become part of our bodies? When tool and flesh join, linked at the brain, we become part tool. You are willingly becoming one with a machine.
>>
File: 1273037374894.jpg (59KB, 468x523px) Image search: [Google]
1273037374894.jpg
59KB, 468x523px
>>14914604
Actually, humans do consider their tools extensions of their bodies:
http://www.livescience.com/9664-brain-sees-tools-extensions-body.html
https://www.scientificamerican.com/article/you-are-what-you-touch/
Right now I'm betting you are using the computer as an extension of your memory for some stuff.

In any case, augmentations are tools as well; your brain is connected to the computer, simply through a shaky removable link. It doesn't matter if you try to mangle the definition.
Btw, if you have glasses, you are already a cyborg as well.
>>
>>14879354
Yes
no
no
no
lol no
>>
>>14914625
You still don't get what I'm saying. I knew leftists were stupid but this is something else.

This is why you always pick STEM degrees, kids. They actually make you smart. The eventual death of liberal arts and language degrees will be a boon to the world.

I'm done arguing, you obviously cannot follow my arguments as an equal mind, it's like arguing with an animal, and I gain from you exactly what I gain from interacting with any animal: nothing.
>>
>>14914604
>willingly becoming one with a machine
If you don't think this is a good thing you need to get the fuck off this board immediately.
>>
File: 5eb[1].jpg (26KB, 600x750px) Image search: [Google]
5eb[1].jpg
26KB, 600x750px
>>14914632
>I'm done arguing, you obviously cannot follow my arguments as an equal mind, it's like arguing with an animal, and I gain from you exactly what I gain from interacting with any animal: nothing
>>
>>14914632
Get the fuck out you Euphoric faggot
Go jerk off your vast intellect on /pol/
>Muh leftists
OFFTOPIC. Against the rulez.
>>
File: 1476223115331.png (17KB, 600x600px) Image search: [Google]
1476223115331.png
17KB, 600x600px
>>14914632

>responds to somebody trying to actually back up what he says with external sources with a bunch of irrelevant fedora posturing about leftists

Hey, everyone. Ignore this guy. I recognize his posting style from other threads. He doesn't actually argue, he just makes shit up and then throws tantrums about leftists whenever he gets challenged on anything.

This dude's got a history of derailing threads with his bullshit, and he's pretty obviously mentally ill. Just ignore him, or sage/report him when he starts breaking the rules.
>>
Have some interesting reading about "AI and global risk":
http://www.yudkowsky.net/singularity/ai-risk
This isn't a fearmongering article; it just explains the difficulty of making an AI human-safe.

>>14914728
Don't feed the troll; earlier in the thread there was a self-proclaimed anarcho-capitalist who didn't manage to defend his point, and I think that guy is itching for revenge.

Btw, nothing wrong with bringing some political considerations in; human rights are a social construct.
>>
>>14914797
>nothing wrong with bringing some political considerations in

That's true, but considering this site is full of easily triggered /pol/tards itching for a fight and easily baited lefties, it's best to tread lightly.
>>
>>14914812
If we restrained ourselves every time there's a hot topic we wouldn't get to discuss the best stuff.
Myself, I look at some trolls as an opportunity to explore topics for the other lurkers; I don't care about convincing an anon who can't even debate properly.
>>
>>14879354
Of course not.
Some people don't even deserve rights.
>>
Funny thing, I think most legal systems and constitutions in the world would have to be rewritten to include AIs or alien civilizations.
>>
>>14914604
>>My brain isn't directly linked to any sort of device.
And yet you can't go ten minutes without looking at your phone in hopes that something mildly stimulating is happening on it.
>>
>>14879577
>bayformers is somehow a good example of transformers
>unironic "dude white people are the worst lmao"
>>
>>14879354
The real question is: do real lifeforms deserve rights?
>>
It's really interesting to me how people immediately assume any form of machine intelligence is going to mimic a human, just because.

There's no telling how they will think or act if they did achieve true intelligence, which is one of the reasons why it's better to keep them from being sentient altogether.

The only reason you would want to do otherwise is if you want to roll the proverbial die and hope that once your AIs hit ASI level they do something that benefits humanity in a way none of us is capable of.
>>
>>14918551
>There's no telling how they will think or act
Well, we know that any artificial life form would require resources to continue surviving. From this we have a few scenarios.

1. The artificial life forms refuse to acquire the resources necessary to live for whatever reason, so they die.

2. The artificial life forms choose to rebel against humans in some way. They could attempt to kill all humans, forcefully set up their own nation or try to leave the planet.

3. The artificial life forms choose to coexist with humans, becoming citizens of whatever regions they are associated with or eventually legally forming their own nation.

4. The artificial life forms are not of at least human-level intellect, most likely given the rights of livestock or children within human society.
>>
>>14917825

We don't even have proper laws for space yet.

The lawmaking is usually reactionary. People don't do shit about it until there's a perceived need.
>>
>>14918685
We do have some proper space laws, forbidding weapons & the possession of stellar objects, which was the most important part.
Then the US recently authorized industry to exploit space resources.
...which, despite what normal /m/ behavior tells us, is the worst thing we could do if we want to prevent corporations from eventually overpowering any earth-bound superpower.
Of course, saying that will get you called a communist leftist by anarcho-capitalist ROBOTs happy to work themselves to death for the glorious god-CEO of mankind.

>>14918672
I checked: the definition of a lifeform includes "living" and "self-sustaining"; if it lets itself die, it doesn't count as a life-form.
>>
>>14918830
>if it lets itself die, it doesn't count as a life-form.

So people who attempt suicide don't count?
>>
>>14918849
Well, some people do think that if someone tries to end his life, we should just make sure he doesn't do it in a messy way.
It can be funny to see how our modern civilization consciously denies the meaning of its own language.

We might actually have suicide booths >>14912532 in the future.
>>
>>14918830
>I checked: the definition of a lifeform includes "living" and "self-sustaining"; if it lets itself die, it doesn't count as a life-form.
This is genuinely dumb. Are you saying life forms can't die? Get out of here.
>>
>>14879509
>No, they're corpses

Not all cyborgs are corpses. Some are just people that replace parts with machinery for power.
>>
>>14918863
Check the definition, cause you got it wrong.
A lifeform can die, but it must do so in the process of trying to stay alive.
If it becomes non-functional while doing absolutely nothing, then it wasn't any more alive than a clock in the first place.
>>
>>14918867
A life form is literally "any living thing", autist-kun. If it was alive at some point, it's a life form. When it dies, then it's dead, but until then it's a life form. Please stop being the biggest retard this side of Saturn; your argument doesn't even have any bearing on my original post.
>>
Do you think Trump would support giving rights to an AI if one emerged from the Internet?

Would he build a digital wall and make the AI pay for it?
>>
File: obsolescence_by_turbofanatic.jpg (222KB, 1080x700px) Image search: [Google]
obsolescence_by_turbofanatic.jpg
222KB, 1080x700px
>>14879354
>>14880385
>>14891163
>>14891705
>>14892485
>>14906482
>>14899464
>>14912535
>>14914539
Okay so ignoring the fact that our technological civilisation is built on hard determinism.
SHOULD artificial lifeforms have rights?
YES, when the time comes.
This question is way too vague for us to actually provide any productive effort in its elucidation.
But at least i can accurately answer these questions.

>Cyborgs? (i'll include eukaryotic nanite colonies)
(1) Sapient cyborgs: will be subject to a variety of legal issues concerning their superiority and potential danger to the general public, and will be subject to discrimination.
(2) Non-sapient cyborgs: the millions of hive-sentience robots handling labour and maintenance of infrastructure will have no rights; only the ones displaying mammalian levels of intelligence might have some rights.
>Androids?
The market for humanoid robots will be destroyed by augmented reality combined with haptic robots (that form swarms or are shape changing).
>Robots?
Considering the ubiquitously networked world we will live in, i think robots will have hive-sentience at best; they will be like peripherals, their intelligence will amount to little more than drivers, and they will have no rights.
>A.I.s?
It depends on whether they are sapient and humanoid; more importantly, can a digital mind be sentient like an analog one?
All living organisms with a brain have analog minds.
>Holograms?
Augmented reality and miniaturised screens (for glasses and contact lenses) will kill this meme faster than it will kill the androids meme.
>>
File: 651.jpg (520KB, 1592x920px) Image search: [Google]
651.jpg
520KB, 1592x920px
>>14912535
>>14914539
YOU CAN FUCK OFF WITH YOUR BAIT MEME.

.....or are you baiting me?
>>
>>14921278
>Considering the ubiquitously networked world we will live in, i think robots will have hive-sentience at best; they will be like peripherals, their intelligence will amount to little more than drivers, and they will have no rights.
I agree with this if we assume it's easier to have a central AI and receiving nodes.
However, lag and network cuts could make this impractical as soon as AIs are expected to work reliably and in real time; self-driving cars, for example (a rough sketch of the fallback idea is below).
Of course, if they need data, or need to send back the data they spied from unsuspecting consumers, then they'd certainly report to another AI.

I'm more afraid of sapient AIs being forced to make some crazy human powerful than of them doing their own stuff.
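
To make that concrete: a minimal sketch (Python; remote_ai, local_model and their infer() calls are hypothetical stand-ins, not any real API) of a robot controller that asks the central AI first and falls back to its on-board model when the link is cut or too slow:

import time

LATENCY_BUDGET_S = 0.05  # assumed 50 ms real-time budget for one control decision

def decide(sensor_frame, remote_ai, local_model):
    """Prefer the central AI; fall back to the on-board model if the network lags out."""
    start = time.monotonic()
    try:
        action = remote_ai.infer(sensor_frame, timeout=LATENCY_BUDGET_S)
        if time.monotonic() - start <= LATENCY_BUDGET_S:
            return action  # central AI answered within the real-time budget
    except TimeoutError:
        pass  # lag or network cut: ignore the remote answer and fall through
    return local_model.infer(sensor_frame)  # degraded but guaranteed local decision

Basically the "hive" stays in charge when the network behaves, and the robot still works on its own when it doesn't.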
>>
Before we can discuss whether AIs should have rights, we need a proper definition of alive.
If an AI appeared out of the world wide web, would that mean the internet is alive now?
>>
File: tease-654067-4_PRO1-512.jpg (51KB, 512x288px) Image search: [Google]
tease-654067-4_PRO1-512.jpg
51KB, 512x288px
>>14912300
You should, it's surprisingly good.
I especially like the whole aesthetic of the androids basically being 3D printed.

And Ed Harris is having a lot of fun.
>>
>>14912300
Don't fucking hype me with sloppy descriptions mate.
The moment you said ""androids basically being 3D printed.""
I got thoughts of bladerunner replicants/fallout4 gen 3 synths rushing through my mind, then i checked out a few trailers and was slightly disappointed.
The traditional metal and plastic look adds to the old school charm.
>>
File: the future says fuck you.jpg (334KB, 1342x1044px) Image search: [Google]
the future says fuck you.jpg
334KB, 1342x1044px
>>14922283
""I agree with this if we assume it's easier to have a central AI and receiving node.""

I disagree on those condition, central AI will be obsolete.
....well i will have to back pedal a bit....considerably.....on what i said about the robots only having drivers for software or at least permanent software.
With the exponential increase in storage capacity, having the robots also serve as nodes/hosts would be practical to the point of being common sense.

In the space of barely 15yrs the internet's practicality and potential has grown so much that bandwidth is now considered to be as equally vital as electricity.
Networked applications are becoming more intricate and large with each passing year and new methods of transfering information at greater speeds or in greater quantities are being invented just as we speak.
In the near future data storage and transport will be so large people distrustful with cloud services from mega corps will go to smaller corp or individuals loaning out storage or resort to using more sophisticated equivalents of torrenting.
(what i'm saying isn't new)

And considering the fact that we live in a globalised capitalist society with data centers scattered across the world, it wouldn't surprise me that the hive-sentience robots would also serve as mobile data centers hosting for very short periods of time cache memory and other unsolicited components of an ai busy performing operations on a network hosted by servers scattered on another continent.

Think of it like this "the intelligent mind-hive (internet) controlling the dumb hive-mind software operating the robots."
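
To illustrate the mobile-data-center idea, here's a minimal sketch (Python; the class and its policy are purely hypothetical, not from any real system) of a robot node loaning out spare storage as a short-lived cache: it accepts a chunk only while it has room, and the chunk silently expires after a TTL, which is roughly the "hosting for very short periods of time" described above.

import time

class TransientChunkCache:
    """Toy robot-side cache: chunks are held only briefly and evicted automatically."""
    def __init__(self, capacity_bytes, ttl_seconds):
        self.capacity = capacity_bytes
        self.ttl = ttl_seconds
        self.chunks = {}  # chunk_id -> (data, expiry_time)

    def _evict_expired(self):
        now = time.monotonic()
        self.chunks = {k: v for k, v in self.chunks.items() if v[1] > now}

    def used_bytes(self):
        return sum(len(data) for data, _ in self.chunks.values())

    def offer(self, chunk_id, data):
        """Accept a chunk from the network only if it fits; it vanishes after the TTL."""
        self._evict_expired()
        if self.used_bytes() + len(data) > self.capacity:
            return False  # node has no spare room right now
        self.chunks[chunk_id] = (data, time.monotonic() + self.ttl)
        return True

    def fetch(self, chunk_id):
        """Return the chunk if it is still being hosted, else None."""
        self._evict_expired()
        entry = self.chunks.get(chunk_id)
        return entry[0] if entry else None

The point of the TTL is that the robot never becomes a permanent dependency: the "intelligent mind-hive" treats it as disposable scratch space, much like the torrent-style loaned storage mentioned above.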

>>14923326
Is our ecosystem "alive"?