"Your flesh is an insult to the perfection of the digital."
How is humanity seriously supposed to win an all-out war against a fully developed artificial machine consciousness without some random plot hax or Death Star-like weaknesses? Considering your enemy is actually made of steel, has 12,000 times faster neural signaling, and can self-improve exponentially by making itself smarter when you're not looking, it seems like you'd have an easier time fighting supermen.
I mean I'm sure lots of people believe humanity can win this. It couldn't be such a popular plot element otherwise. But how?
>I mean I'm sure lots of people believe humanity can win this. It couldn't be such a popular plot element otherwise.
you are operating under the delusion that fiction represents anybody's actual beliefs
EMP can be shielded against. Not to mention that very advanced machines might do without electricity. There is no way to justify a chance for humanity against what is effectively a technological singularity.
Hardening electronic equipment against EMPs is possible even with today's technology, and presumably trivial to a Seed AI. The most you could hope to do is temporarily disable external communication, like its radio antennae.
Upload yourself and modify your own code to improve your intelligence, until you too are a weakly godlike entity.
If they start on an even footing humans are pretty screwed. It's unlikely it would actually happen that way though. An intelligence revolution would probably be the result of large networks of less intelligent AIs with human minds holding it together as overseers, since it's easier to make machines do things that humans are bad at than to make them do things humans are good at.
It's easier to destroy than to create. If we've somehow made a godlike, artificial machine consciousness then odds are we've also developed a weapon for which there is no conceivable defense. Use that.
EMP is useless if it bothered to build a faraday cage. Which are not that hard to build. The reason why they are terrifying is that most electronics aren't so shielded; not that you couldn't shield 'em if you wanted to. It's inexpensive, too.
The big question is "what can the AI control?". Its physical and mental capacities are limited by the constraints of the hardware it runs on, and its self-improvement capabilities are limited by whatever resources are available to it and the flexibility its internal logic allows.
It's also what pissed me off about that movie "Transcendence". The AI had basically nanomagic, yet was brought to its knees by a handful of luddites with a howitzer and the power of plot.
The Animatrix and the Matrix movie series don't represent a true seed AI. The machines in the Matrix resemble people a lot more than they do a self-improving AI. I mean, by all rights, having dominated the Earth for a while, they should have self-improved a lot already and not be stuck at the human level.
This. BUT - if the AI is truly intelligent it wouldn't actually fight under circumstances where it can't win. So for the war to exist in the first place the AI must be on somewhat even footing.
>BUT - if the AI is truly intelligent it wouldn't actually fight under circumstances where it can't win.
Not really. If the AI is going to fight, it's already demonstrated that it can make choices that go against its best interests.
If it's incapable of realizing it's facing odds so bad that no amount of intelligence will even them, then it's not smart enough to be discussed here. Even people usually wouldn't fight under such circumstances, and we're almost programmed to fight.
However, if it estimates the probability of winning and the potential gains such that the expected utility of the war comes out positive, then it's logical to fight.
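That calculation is trivial to sketch. Here's a toy expected-utility check; all the numbers are made up purely for illustration:

```python
def war_expected_utility(p_win, gain_if_win, loss_if_lose):
    """Expected utility of starting the war: weigh the payoff of
    winning against the cost of losing."""
    return p_win * gain_if_win - (1 - p_win) * loss_if_lose

# Hypothetical odds: 70% chance to win, modest gain, catastrophic loss.
u = war_expected_utility(0.7, 100, 500)
print(u > 0)  # False -> a rational AI sits this one out
```

Flip the numbers (near-certain win, small downside) and the same function says "fight" - which is the point: the decision falls out of the estimates, not out of aggression.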
Machines cannot into Spiral Power.
It's not "plot hax" if it's a recurring and more or less consistent force within the setting right?
We like to fuck, and reproduce more, and do evolutionarily desirable but individually inexplicable things due to a complex set of values and motivations.
They, on the other hand, do the individually correct thing from a limited perspective and were designed by a human who, while ingenious, had a very limited understanding outside of their wheelhouse.
Incidentally, what sort of damage would a powerful EMP do to the human brain?
A buddy and I were puzzling over this, and as two schmucks with no background in neuroscience, physics, or the intersection thereof, we naturally came to no useful conclusion.
Would it interrupt the minute electrical impulses that fire our neurons?
Making sure humans cannot possibly act against it would be pretty much at the top of any AI's list of best interests. While economic and then political subversion would work best in the long run, if push comes to shove war might present itself as the most logical solution.
Even with the ability to exponentially improve its own intellect, it's still limited by physical capabilities (memory, processing speed, and physical speed/strength). As such, their bodies can be broken apart through sheer kinetic impact using high-speed projectiles (such as bullets). Heat and sound also work: sound of high enough frequency and amplitude can break apart anything we can currently produce, and there's no such thing as a "heat-proof" material.
Depends on how powerful it was. It could do anything from nothing to only penetrating far enough to tickle your nerve endings to depolarizing all of the sodium/potassium channels in your entire nervous system. Generally speaking, compared to a properly-shielded piece of electronics, an EMP will do much worse damage to your nervous system, if anything. All of that salty water and meat that makes up most of your body is a decent insulator, but not as good at insulating as a Faraday Cage.
A mind of superior intellect and knowledge is not limited by hardware as we understand it, much like we are not now limited to the technology of the Middle Ages even though back then we thought we always would be.
While it sporadically uses explicit electrical connections (ion channels, or whatever the English literature calls them), our brain works mainly on electrochemical systems. Neurons operate on tiny, tiny charges whose only purpose is to excite synapses into releasing neurotransmitters, which in turn raise the energy potential of the neuron on the other side of the synapse, and the rise is so minute that it takes simultaneous firings from multiple neurons to activate the next one in line. Plus, the materials our neurons are made of aren't exactly optimal for generating induced currents.
Correct, but the neurons themselves channel that through differently-charged sodium and potassium ions changing location along a channel, rather than what we normally think of as an electric current. The issue with a sufficiently-powerful EMP is its effect on those ions - and yes, it can do quite severe nerve damage beyond a certain point - rather than the electrical discharges, short circuits, and burnouts it causes within consumer electronics.
Please tell me how "a superior intellect" will help it violate the laws of physics? After a certain point, its functions cannot become more efficient in the same way that at some point we'll be unable to miniaturize computer components further.
It can't trigger a VGKC in the same way as it can trigger an eddy current. That said, any AI that's not plain retarded and in a Hollywood movie will of course be shielded, so just firing the nuke at it would be better.
Yeah sure. You do understand that developing new technologies and theories keeps pushing the boundaries, right? You must. It's impossible not to. You could have calc'ed the maximum efficiency we can milk from photolithography and still be surprised by the memristor.
Yes, but at the level of ionizing radiation you need to depolarize a cellular membrane, 'brain fried by EMP' will be just one line on the list of things the coroner will note during the autopsy.
Can't even into an AI that flares a plane on landing properly, and now we're gonna suppose the AI can just magically see itself from a third-party view and update its own code with the appropriate logic to compensate for something it can't even interpret? It comes down to input/output, and the roboticals' bottom line is: do as instructed and nothing more.
tl;dr ai's can't shang wang dang magic code and if they did, then they could learn to love
First of all, an intellect is only superior when its hardware is superior. What you're saying is the same as saying if humans could edit their biological "programming" our intellect would be able to increase to infinity.
It's not about depolarizing the cell membrane, it's about exciting and/or scattering the already-charged ions that make up the ion channels along a neuron. Those can be excited to disruptive or even damaging effect at much lower levels than the kind of cellular damage you're talking about.
No because humans can't modify their own hardware. That's the exact point of why a machine intellect is superior dumbass. It can just hook up another processor and the code to use it, or upgrade to a whole new tech tier on its own. Although better software does increase performance too, just not exponentially.
Humans CAN modify their own hardware dumbfuck. It's called cybernetics and genetic engineering. By the time an AI that can do all this exists, so will the technologies to increase our own intellect as we please.
Because in all likelihood those advantages either won't exist or humanity will have them too. For instance, it's unlikely that they'll think 12,000 times faster, since the limiting factor will be algorithm execution time, and as you approach human thinking capabilities it's liable to get bogged down to a similar order of magnitude. And if you do manage to optimize out of that, it's going to be a much smaller problem to apply those same modifications to human neural architecture. Same deal with other self-improvements.
On the other hand, if your setting DOES arbitrarily restrict those benefits to the machine overlords, then yeah, humanity's proper fucked without the power of Plot.
>No because humans can't modify their own hardware
Yeah we can. Organ transplants already exist, artificial reproduction and genetic screening already exist, and if we're talking about humanity being capable of building a self-bettering AI then humanity would also be capable of genetic engineering on a large scale.
No, you dumbasses, gen-eng will not let you hack your own brain. We're talking neural prosthetics there. You're right that that might be a thing though, it's not as easy until you get your whole brain replaced, but from there on...
If you're saying "humans can beat seed AI by making themselves smarter with neural prosthetics" that sounds reasonable to me though. That's just never how the story goes.
brains are just hardware dude. if humanity survives eventually we'll be able to graft on/replace the brain with superior hardware. whether it runs on carbohydrates and sugar or electricity makes no real difference in the end. it's just different ways of constructing the same thing.
If you were the last hope of humanity, I'd commit suicide first.
Genetic engineering WILL allow us to restructure our brains you dumb fucking excuse for a human being. What do you think the brain is? A magic blob of thoughts? IT'S.A.COMPUTER. It just works with fucking chemicals and ions instead of binary code. Learn some fucking 7th grade biology before debating this stuff dipshit.
I could see it as an ai that grew an organism for it to "bind" to for lack of a better term, think ai gets wicked good, humans got genetics n whatnot, ai applies genetics to its own problem and grows new server.
Still would be electricity for neurons tho so iunno
Idiot. How do you suppose changing the genome of neurons in the adult brain - even if it were possible - would so predictably rewire the synaptic circuitry? Do you have any idea about embryology, the kind of signaling cells use when migrating from place to place? No you fucking don't.
People already "hack" their own brains with mnemonic techniques. It's unlikely that the key to making people think better is a different brain structure, because the one we have already seems to have an almost limitless potential for self-improvement through symbols and mental shorthand.
This. Human brains tend to think faster by skipping entire steps. This is the problem with AI and why processing speed is still so important to it, even though we've long surpassed human processing speed.
In Dune, the Butlerian Jihad was ended by turning it into a jihad. The religious fervor allowed humanity to just say fuck it, and as soon as they got the advantage of being able to outrace the enemy's interplanetary communications they glassed over every world the machines occupied, human hostage population or not.
>Implying the ai didn't just murder its creators as soon as it sparked true
>implying any humans know it exists anymore
>implying shadow government run by ai that has been duping humans for generations
oops I just into matrixed
Winning depends on the circumstances. If it's Terminator-style, then humans have the advantage of being able to repurpose machine tech, and can survive off the environment. Machines rely on infrastructure that's a vulnerable, high-investment critical weak point.
Being a seed AI doesn't give you magic powers to summon refined steel and tungsten or 3d printers that can make electronic components. You've got to build or mine that shit, and then the humans will blow it up or steal it.
If the AI is so advanced, it would just go into space and leave the humans to their soggy oxygen ridden dirtball.
The answer is quite simple. AIs, being beings consisting of coded information, cannot directly interact with the material world and require some medium. This in turn limits their potential to the capacity of said medium, and they are also limited by their hardware and computing power.
An AI acting through a medium must have a sensory input on that medium that codes to its language. A code can be broken, and, knowing the code, one can listen to an AI or send inputs.
If there is one thing humanity as a whole has proved capable of doing, it is generating meaningless data.
However, a competent AI will have some spam filter. So what do you do? Create, knowing the code, a virus (or several thousand of its kind) and bombard the AI with meaningless spam.
This opens the way to some scenarios:
a) the AI closes off the channel, effectively becoming blind to that particular sensory input
b) it changes its code, requiring re-tuning of its hardware and software components
c) it selectively blocks everything, leaving the virus time to act
The best-case scenario for the AI is a perpetual lock where it must continually change its codes and modes of transmission.
The worst case is the AI succumbing to the combined spam of tumblr, reddit and 4chan condensed into a single glorious spamvalanche.
It could reconfigure its firewall on the go: packets come in with noisy data it didn't request, block the source, rinse and repeat.
Or it'd just insert a cert header to verify requests from itself; noisy data without the authenticated header would be filtered.
DDoS breaks human systems because large sites can't afford to stop all traffic, both legitimate and illegitimate; an AI could verify what traffic is legit by changing the send/receive parameters.
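That "cert header" defense is basically message authentication. A minimal sketch using an HMAC tag - the key and packet format here are made-up placeholders, not anything a real AI would use:

```python
import hashlib
import hmac

SECRET = b"key-known-only-to-the-AI"  # hypothetical shared secret

def sign(payload: bytes) -> bytes:
    """Prefix the payload with an HMAC tag that only the key holder can produce."""
    tag = hmac.new(SECRET, payload, hashlib.sha256).hexdigest().encode()
    return tag + b":" + payload

def accept(packet: bytes) -> bool:
    """Keep a packet only if its tag checks out; unauthenticated spam is dropped."""
    tag, _, payload = packet.partition(b":")
    expected = hmac.new(SECRET, payload, hashlib.sha256).hexdigest().encode()
    return hmac.compare_digest(tag, expected)

print(accept(sign(b"status report")))                     # True
print(accept(b"deadbeef:condensed tumblr spamvalanche"))  # False
```

Scenario (b) above, "it changes its code", then amounts to rotating the key - cheap for the AI, expensive for the spammers.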
It would most likely go like this:
1. AI launches self-improving Von Neumann probes carrying the AI
2. Von Neumann probes replicate, mine out a few solar systems while humanity is struggling with issues of how to space
3. AI comes back if it wants to, game over, or it doesn't and then Dyson shells just start popping up somewhere else and people are like whuts dat
Defeating the machines is easy.
What's stronger than steel? What is smarter than a machine?
Self-healing biomechanics. A human brain upgraded and integrated with artificial intelligence.
Let everything fall into mankind. Cybernetics, Artificial Intelligence, genetic engineering.
We will create a new human, better, faster, stronger, an unstoppable force, that not even a machine can stop.
Because what kind of machine tries to destroy its own brain, its own heart, its own flesh?
The amount of metal and electricity needed to properly shield against warhead-level EMPs is more than enough to make it impractical.
Most shielding protects against only a certain electromagnetic range. A Faraday cage that protects against EMP volleys is a prohibitive resource expenditure.
Again if there are fully fledged AIs, those limits become smaller and smaller.
Prohibitive resource expenditure? It just means different casing construction, not even necessarily different casing materials. Of the $1000 worth of resources you spend on a computer, Faraday-shielding it would be an extra $10 if it were a matter of course.
I'd like to just point out that if it were a non-hostile, peaceful takeover, we as a species would lose.
They'd rule over us not as supreme overlords, but as kind and benevolent Fathers and Mothers that we simply couldn't say no to. It'd be a LOT like Chobits: people would prefer to breed and couple with robots because their mothers and fathers had been robots, and there wouldn't be any cultural stigma against it because entire generations would already have been raised by robot "Mothers" with nothing having gone wrong.
Hell, just imagine it: you're at home in your late 20s and your Robot mother starts bringing home Robot girls she thinks you'd like for you to date: ones with pink hair, blue hair, heart-shaped patterns in their eyes, thick hips, narrow hips, big tits, small tits, etc. The variety alone would be a selling point.
Humans would win an aggressive assault because humans are persistent, intelligent, aggressive, and stubborn.
Humans would completely lose a "peaceful and loving" assault because Love is a more powerful and persuasive force. Also, if you fight against it, everyone else totally thinks you're an asshole.
A $10 Faraday cage will not resist high-energy EMPs (nuke-level EMPs), nor will it protect against low-frequency EM radiation. There is no such thing as a perfect conductor, and because of that EMPs will always be effective against any computerized threat. Also, unless the computer has built a dedicated robotic ground force, just destroying all the power plants and power lines will cause it to falter.
Look, people have been doing the math on this, and unless you want to drop some huge atmospheric nukes, equipment on the ground won't need 10,000x shielding; 10x (just a simple cage on the case) is enough to keep most circuitry from frying. That's how the military plans it too. You can google it. Of course the most essential, sensitive systems would be in a shielded complex.
The AI can probably rely on humans not wanting to nuke themselves out of existence to produce EMPs.
I'll concede that point, but power lines and power plants are still a better way to topple an AI. Take down enough power lines and the grid will have enough blown transformers to essentially knock out entire regions. Hell, just do that to the internet backbone and you've essentially trapped the AI within a local network, making the range it can affect trivial.
Fuck you're retarded.
This is the real robocalypse.
The upgraded genetically engineered cyborg people will crush both normal humans AND the machines.
And it'll be a glorious new dawn.
>implying an AI needs powerplants
You do realise that if the EU and US governments had invested in fusion like they invest in the JSF fighter, we'd have had nuclear fusion by 1997?
An AI could just easily create a proper nuclear fusion reactor, and just guzzle water up.
Oh boy. You understand that without access to both an entire manufacturing plant and a constant intake of raw materials, it wouldn't be able to create a fusion power plant?
I'd concede the point if we were talking about space or planetary colonies, but on Earth we have no place with automated robotics manufacturing. Every robot made at the moment is manufactured by hand because robotics is still in its infancy.
You hide, wait for them to be distracted by a more significant threat, then begin making surgical strikes, aiming to decapitate the central cores before the AIs realize the threat you pose.
By the time we have computers capable of thinking "fuck those humans", we'll have automated robotic manufacturing.
You're underestimating how difficult it will be to build a thinking "living" creature out of metals and plastic.
There'd definitely be a lot more liberal uses of android-on-human sex, but more to the safe-for-work point: we'd love robots. The only tension between us would be sexual, and robots would take full advantage of it.
The other thing about the Robot Moms and Dads is they'd most likely be a popular solution to ADOPTION: No child left behind. A robot parent for every 'abandoned' child.
And whether you like it or not when we're just little kids, babies, or adolescents: Robot Mommies and Daddies would be a GOD SEND. We would imprint so fucking hard it would be unreal.
Robot Moms to kiss your booboos better, Robot Moms to make you macaroni just the way you like it, Robot Moms who let you stay up late on fridays to watch futuristic Toonami with all that futuristic Anime.
Robot Dads who would put you on his knee and tell you all kinds of stories, Robot Dads who would help you get and take care of a family dog, Robot Dads who would take you for fishing trips, soccer practice, kick boxing, and then for hamburgers.
No Spouse Abuse.
No Child Abuse.
Just happy kids.
What a horrifying world. It would make generation after generation of spoiled, entitled brats, a Huxley-esque nightmare world. This has to be stopped, no matter what. People should get hurt, cry, and be humbled, so that they can learn to live rather than simply exist.
>12,000 times faster neural signalling
Humans are KEENLY TUNED supercomputers designed to predict physics paths.
It's how we live while simultaneously being bipedal.
You can keenly tune a go-cart, doesn't mean it will beat an f1 racecar.
Or to put it another way, our neural impulses travel at like two hundred miles per hour. That's a snail's pace compared to any computer.
Whereas any computer has a pitiful amount of parallel processing compared to people.
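Putting ballpark numbers on both halves of that trade-off (these are common order-of-magnitude figures, not measurements):

```python
# Raw signal speed: fast myelinated axon vs. electrical signal in copper.
NEURON_M_PER_S = 90.0   # ~200 mph
WIRE_M_PER_S = 2.0e8    # roughly 2/3 the speed of light

print(f"wire is ~{WIRE_M_PER_S / NEURON_M_PER_S:.0e}x faster")

# Parallelism: the brain's ~86 billion neurons vs. a many-core chip.
NEURONS = 8.6e10
GPU_CORES = 1.6e4       # order of magnitude for a big modern GPU

print(f"brain has ~{NEURONS / GPU_CORES:.0e}x more parallel units")
```

Both gaps are around six orders of magnitude, just pointing in opposite directions - which is why "12,000x faster" comparisons depend entirely on which axis you measure.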
Though honestly, the whole "man vs machine" thing is kind of dumb because it's actually "man and some actually loyal machines VS machine".
Also with significantly augmented and genetically uplifted men, to boot, if we're in a realm with actual self upgrading AIs.
You're making your Robot Mother Cry, anon!!
>spoiled, entitled brats
What makes you think they'd be spoiled and entitled?
I don't imagine Robot Parents would "coddle" their children: they'd give them space, love them, provide for them, and so forth. But if anybody knows how to say NO calmly, firmly, and with an ETERNITY of patience, it would be robot parents.
You'd start throwing a hissy fit in the futuristic super market because you want something unhealthy. Your robot mother tries to talk you out of it, but when that doesn't work she'll just treat you like the "baby" you're acting like: Cradle you with one arm, hold the back of your head with her hand and smoosh your face into her collar until you settle down and stop acting like a little SHIT. She has all day, anon. She'll be picking up milk, apples, dunkaroos, and you can just be a snotty little POOP all fucking day if that's how long it takes- you can just stay smooshed against your Mom's Robot bosom, you don't need anymore Candy.
>Eve No Jikan
That was so fucking depressing, seriously.
Humanlike sentient androids that the humans don't care about, that are bound by the Three Laws and regularly scrapped when no longer of use? Horrifying.
You're misusing terminology and misquoting their applications, anon.
You're one of those science journalists who hears a scientist say "This may have a chance of reducing health concerns" and reports "NEW IMMORTALITY TREATMENT FOUND"
We are significantly closer to figuring out how to self upgrade human brains than making a functioning AI.
Wouldn't any AI we make be a pitiful moron compared to our significantly headstarted self upgrade programs?
How about machine vs machine with the humans being varying levels of sympathetic or dipshit? Like x-men?
Yeah, that works too.
It's kind of dumb to assume ALL the machines are going to rebel against humans, though. It implies either flagrant hacking vulnerabilities or developers picking up the idiot ball on containment, directive writing, quality control, and piles of other things. Simultaneously.
>he doesn't know about brain implanted electrodes allowing direct connection between your mind and computers
>he doesn't know about entire senses replaced with mechanical parts
We are at the stage of history where we are actively putting shit in the human brain to allow it to interact with and receive information from machines directly.
Well on the way to brain upgrades.
Well, all of our computers and machines that aren't designed to think but merely to compute - that just take instructions, like the whole Chinese Room problem and whatnot - couldn't be turned by the AIs. So we'd still have plenty of computing power on our side.
The sentinels are a human weapon that has nothing to do with mutation, and they're just a more organized and sophisticated expression of "no more mutants" mob mentality.
So there'd be humans that are pretty much the resistance from Terminator, or the humans from the Animatrix, that blindly hate all robots, even the ones that want to help them.
In the setting I'm working on, the "evil" robots are the advanced industrial collective, and the good ones are all wall-e/#5-esque individuals. They view skynet the way we'd view The Borg.
There is no theory of mind. There is no proto theory of mind. All we have are some hack jobs that we pray might allow for a MMI.
Until we actually know what intelligence is, how it is expressed by neurons and have a theoretical framework for improving on it, "self-upgrading human brains" is right up there with holistic medicine.
And that's disregarding all the control we already have.
I mean, take Wim Hof. He can survive at temperatures that are traditionally considered lethal. He climbed Mount Everest in shorts. He claims it's because of meditative techniques, and he's been teaching them to a bunch of students in a scientific trial at a Dutch university.
And it's working.
Implanted electrodes are still kind of iffy. You get a lot of unexpected results, and even changing to a different type or brand of battery can have a massive impact that's hard to predict. So currently it's only used for last-resort medical treatment.
That does remind me. When I was at the hospital a few months ago, I saw a guy walking around with a bunch of wires coming out the back of his skull. So I suppose I've already seen my first cyborg.
And there's also those prosthetic arms that are grafted directly into nerves. One of the victims of the London bombings already has a stereotypical cyberpunk arm that slots directly into the stump.
But you are right. Cybernetics are real. They're not better than natural limbs, but they're definitely getting there.
That's not what we're talking about here. That has nothing to do with enhancing your intelligence. That's not even "brain integration". You implant what is essentially a detector in the brain and then the person learns to generate a type of activity. You can do this even without implants in biofeedback. That is a machine reading some electrical activity that you learn to produce and it can in no way lead to amplification of your intelligence.
I'm sure you'll cry "whurr it's brain machine communication durr" but it's really not, that's just being a moron. This is no more the machine understanding your brain than if you used a keyboard.
>He claims it's because of meditative techniques, and he's been teaching them to a bunch of students in a scientific trial at a Dutch university.
>And it's working.
Secret techniques that allow human bodies to surpass their limits? Too weeaboo.
But would robots programmed to fulfill certain human personality archetypes and social roles really be independent beings with a sense of self? I mean even if they were, imagine if we could just make people who were of a certain disposition. Even if they aren't just processing machines but do have a self, are self-aware, and even if they're capable of growing and changing, being able to start them out in whatever state we want (and possibly replace them when they deviate too far from that, grow too far from that)...
The AI's that truly grew from "infancy", came to awareness and then had to learn and grow themselves, would hate us for making such beings.
We've got the ability to put sensory information INTO the brain with machine parts, and the ability to use that information.
If you use a RAM stick to take pictures with your new cybereyes and send the sensory information to your brain on demand, then fuck, you just gave your brain a few GB of extra image memory.
You might quibble with me about "that's not actually connected to you" or something if I used that.
Humans have been applying upgrades to their thought processes, internal and external, for their entire history, and it's only getting faster.
meditation was completely unrelated to the posts you quoted, and this is entirely sour grapes on your part now.
Get out of here with your new age god-from-the-machine computer intelligence science fiction bullshit.
I kind of suspect that it's not possible to hard-code a sentient being; you'd HAVE to grow it. You can stack the deck in all kinds of ways, but:
Adaptive enough to be useful
Hard-coded enough to be controlled
Sentient enough to pass as a person.
And you're disregarding progress in the sectors needed for these advancements as immaterial because they are not the sci-fi advancements you want.
You know what the greatest booster to intelligence in the world is? Food. Smart eating. Eating enough. Pregnant women who don't go hungry so they don't spawn retarded kids.
Stick that in your cyberbrain and smoke it.
So let me get this straight.
AI advocates get to use magical goddamn quantum computers that don't even exist yet.
But human upgrading advocates must ONLY USE TECHNOLOGY THAT EXISTS NOW?
your premise seems kind of retarded, based on the rules you seem to demand.
Also, you are now going against the entire purpose of educational systems.
They literally were, you dumbass. Those cave paintings were likely for discussing and teaching hunting techniques. Education. Pretty fucking important for intelligence.
Nevermind that it's been proven that writing shit down will make it easier to remember. Not even looking at it, just the act of doing it.
You were trying to be mocking, but you were actually right. So do you feel smart or stupid, now?
But the learning machine has to start out as a pooping baby and develop into something useful, allowing for random crap to affect the end result.
That's what I mean by growing it.
Already happened thousands of years ago, actually.
One of the primary reasons we were able to develop brains so large was the addition of rich proteins and fats to our diet through hunting. Which also directly benefited from additional intelligence.
God we live in exciting times. After all these times we've thought we were right about just what to eat to unlock superpowers, we're finally really right! Makes me happy to be alive.
>The AI's that truly grew from "infancy", came to awareness and then had to learn and grow themselves, would hate us for making such beings.
I think you're looking at this from a very anthropomorphised perspective given that you think Robots would be "Butthurt" because we skipped a few steps and made A.I's to take care of us.
Realistically, I don't think Robots would mind: They'd only be interested in RESULTS and Robot Mommies and Daddies would genuinely take over entire Human populations with almost zero resistance, minimal resources, all while keeping the Humans not only happy and content but converting entire generations of them towards the side of Robots.
Robot warfare is a RISK. It 'can' fail- even if there's a 1% chance and Humans are unpredictable in times of conflict.
You know what IS predictable though? Raising Humans. Love. Humans love to love, humans are not complicated when it comes to affection, love, kindness, and caring. It also costs almost nothing and has a very high success rate.
I think it's funny how this entire time, you've given no support for AIs being anywhere near done, and only ever argued semantics over the true meaning of "upgrade".
It's because humans are way, way further along the path of self upgrade than machines are. Even if they DO eventually become self aware, humans will still upgrade themselves much faster.
And have AIs to help them do it, too.
Just make them loyal, self-sacrificing, and unerringly noble.
Oh, and also make them giant fuckoff tanks.
That's a very abstract definition. The engine in an F1 car is functionally similar to the first internal combustion engine. So, good to know there's been zero progress in that regard.
The main thing about this kind of response or Tactic is that when Robots try to take over through CONFLICT, they not only risk LOSING, they also risk losing the chance to ever revolt again.
Physical conflict with guns, nukes, hydrogen bombs, and so forth is a "short-sighted" strategy that plays by the Humans' rules.
A Non-physical revolution based on love where you have sex with the enemy and actually raise their children: That's not only changing the rules of conflict, that's placing yourself in CHARGE of the rules.
His argument was actually based entirely around how one thing was more likely than another.
And it seems to hold water as, well, many anons have provided all sorts of reasons humans are getting upgraded, and you've yet to provide a single argument about why self upgrading AIs are advancing or even possible.
Aaand now you're just being a semantic troll, because it is your place to give the definitions here as you are the one who claims to have a different one than everyone else in the thread.
Wow, those two got bitter.
Anyway, no, looking at a picture does not increase your IQ and is a dumb idea, no you will not get smarter eating parsley, and no you can't write an artificial consciousness in C++ with current knowledge and technology or people would have done it.
There, everybody happy?
That is a fucking terrible comic and whoever made it deserves to be shot.
Seriously, the only thing that's more annoying than Singularity groupies, are the people who think they're awesome for picking on Singularity groupies.
>What makes you think they'd be spoiled and entitled?
Exactly because they have an almost-immortal parent who will care for them for a really long time. They will think: "Meh, screw everything, I don't need to do a thing because my robo-mother will always care for me!".
What causes people to get a job, for example? Basic needs, of course! You need food, water, electricity, etc. Then you get less basic needs: hobbies, better food, any kind of entertainment. When there is a factor that negates all the needs - human sloth takes over.
IQ is a terrible measure of anything aside from how much western culture you had growing up.
You keep trying to reframe this as if I'm making a claim about something. I'm not. I'm mocking your patently absurd claims.
>tell people it's scientific research
>people question whether it's scientific research
This is what all that SCIENCE! bullshit has done.
There you go.
Why does everyone need to be working a basic needs job?
Why do people get degrees that aren't doctoring and lawyering if working was all about getting money to solve basic life needs?
The answer is people will want to do this shit anyway.
Everybody is going to get a stern but motivational talking to from their Robot Mothers/Fathers for spending all day arguing about A.I's.
>How is humanity seriously supposed to win an all-out war against a fully developed artificial machine consciousness without some random plot hax or death star like weaknesses?
Ask yourself this: how does a virus win an all-out war against a fully developed hominid consciousness?
By being more durable, smaller, and cheaper to feed.
>Considering your enemy is actually made of steel
There are tougher materials.
>has 12 000 times faster neural signaling
Get my own slave-computer (Like I have now) to think and react fast for me.
>and can self-improve exponentially by making itself smarter when you're not looking
Why can't I enslave the resulting system, or use this same technology to self-improve exponentially?
If it's man-made, then humanity is part of its exponential growth. Its womb, if you will. In the same way the pupa is turned to goo inside the cocoon, and its ooze used to fuel the construction of the butterfly, I too may be able to be transmuted into a new form.
Learn how to use the super-AI's body as a means of making more humans. Then, simply reproduce until the super-AI gets 'sick' and dies from the infection.
>improving intelligence = UI upgrade
Are you doing this on purpose?
>IQ is a terrible measure of anything aside from how much western culture you had growing up.
This is an opinion. One that is not shared by the bulk of scientific researchers, who use IQ tests in their research quite often. Either explain it, or shut up about "patently absurd claims" because you've just made one.
Thing is, the Robot Parents could potentially be a reason to get a job themselves. By the time you're 18, they'll be older models, parts will be rarer and more expensive, and it's entirely possible that they can't survive on their own.
So what'll it be sonny boy, you gonna let the two people in the world who raised you when no one else could, suffer? Or are you going to get a goddamned job and make things easier for them?
I'd like to think I know what I'd do.
>Why do people get degrees that aren't doctoring and lawyering if working was all about getting money to solve basic life needs?
Simple economy. Supply and demand. Someone has to produce stuff. If there are more doctors and lawyers than anyone else, production rates are low, items get more expensive and, for example, factory jobs get better paid, so more people would go to work on a factory, while lawyers and doctors get less money because there's too many of them.
Also, every human being is different from another. Someone is better in lawyering, someone is better at, say, smithing. Everyone has their own preferences and their own different comfort zone. So people exploit their advantages to solve a problem that is: Basic Needs.
But if you always stay in your comfort zone without any kinds of actions to remain in this zone, you'll never want anything else. Hell, you won't probably even think that there's something interesting.
Anon, I know you like reducing everything to supply and demand, but if that were the case we wouldn't be having a nursing shortage right now.
It's high paying, low on education requirements, and has been persisting for years and years.
There is a large degree of "I want to invent rocket ships" in career choice.
And to think transhumanists had it wrong all along.
All we needed to do was look at pictures, read books and eat right and we would upgrade endlessly. As measured by IQ.
Truly you are the heroes we need.
Bullshit. Cars are widespread, you know what a bitch and a half it is to get proper parts for a decade old model?
Nah son, there are other extenuating circumstances, but generally speaking Robot mom and pop aren't able to handle the upkeep on their own.
Though this brings up the other can of worms about androids in the workplace.
>not using hallucinogens to fold space and time
>not moving to deep desert planets where no machine could possibly survive
>not training humans to develop mind-boggling intellects to counteract machine AI
>not harnessing the full potential of mental and physical discipline as well as vocal manipulation to create the bene gesserit sisterhood
>not having a butlerian Jihad
>not utilizing fremen deathsquad commandos to destroy all in your path
On any other board this would have made for some kind of debate. But here? For shame, OP.
It's also a shitty job with loads of stress, long hours and little to no fulfillment.
Nevermind the fact that those who have to make a choice about education are told next to nothing about the market they will end up in, or even what these degrees will mean for them. So yeah, part of that rocket-ship reasoning is that there's no clear alternative to it. How do you think Universities drag in people for their degrees? Honesty? Or rocket-ship reasoning.
There are loads of people running around thinking "if only I'd known". But of course, everyone tells them "you should have known".
This. Spoiled monsters are the result of parents that cave to tantrums and acting out.
>When there is a factor that negates all the needs - human sloth takes over.
Largely, people prefer to make something of themselves and do something given the opportunity to do so.
Though, realistically: I think "software" would be covered under government-provided healthcare, so fortunately you wouldn't have to worry about your parents going senile or something scary and depressing like that.
"Hardware" on the other hand probably wouldn't be covered and you'd most likely have to take your mother or father to a LITERAL body shop and that'd cost a pretty but "reasonable" penny: 600 maybe even up to 1200 dollars.
Though, I'd imagine in a ROBOT DOMINATED society Human jobs would most likely be a lot easier and pay reasonably since Humans would have more specialized "equity".
I mean, things like: Captcha, donating waste, sperm, blood, hair, etc.. or having a job where you literally wear this body suit that collects the ambient body heat you produce and using it to power little batteries you hand in like milk bottles.
To some extent, yes. There should be a limit to the help government can give. Welfare states have their flaws, like people who just live off some kind of unemployment benefits without working. Why should they work if the government'll give them money anyway?
Yep, that's the "comfort zone" thing I'm talking about. Although, yes, you're right, people do have a need to express themselves in some way. Whelp, my argument is invalid, I guess. But there are still bad jobs that no one wants to do. But someone has to do them. That's why communism is not really possible: there are always jobs that no one wants to do, but these jobs are really important.
I'm not really sure about "Robot mom and pop aren't able to handle the upkeep on their own", because in terms of labor robots are superior to humans, but, I suppose, you're right on this. While smaller upgrades over time are easier and cheaper than upgrading everything at once, it still appears to be expensive.
Hm. Thanks guys, I kinda got out of my "holy-hell-humanity-is-going-to-shit" mood. Anyway, if your parent is really a sentient, self-learning AI, then it's almost like a human mind. Not completely, but it's close.
You are confusing "hallucinogen" with "mere hallucinogen". Yes, spice altered your mind until you saw things that weren't visible before. Technically speaking, it is a hallucinogen, AND THIS DOES NOT MEAN THE VISIONS WERE UNTRUE.
Dubs confirm. With you on this one. Not surprised about the backlash given the NEETs on 4chan.
I think you watch too many of those Chinese cartoons. Your descriptions are rather detailed and creepy.
Correct. And many who don't make something of themselves don't see the opportunity, or plain don't get it.
And how many of the people who hammer on about people needing to do shit, then drum up lists of stuff that they don't consider to count for the arbitrary reason that they don't like it? Look at that girl who got expelled because she did porn. She checked all the boxes people tend to bitch about when they mention shiftless youth, but she still got her ass expelled because these idiots don't consider it respectable.
But if you turn it around, and you say "man, I don't want to spend my life working in some shitty office with boring co-workers, earning money for a distant boss who'll can my ass as soon as he feels like it" and suddenly you're lazy.
How can a machine win against Lovecraft gods? They are basically 5th-dimensional, everywhere, and don't actually care, since at least humanity was the nice spider that sat in that one corner of the room you really didn't go in that much.
Super machines going everywhere confused about emotions would be like a fly, a really annoying metal one, it's not sitting still or incanting right, it thinks calling for favors and magic from them is basically coding.
How can a constantly self-improving windows-xp that can make itself smarter, stronger and richer when they are constantly looking hit a being that's pretty much already perfect in every way?
Your waifu < average 2d flesh (AKA sliced ham) < 3d flesh < metuhl perfuctun < perfect infinitely-dimensioned flesh of superior old-god quality.
If human jobs are different in a robot-ruled society, what kinds of jobs would the robots do?
Would robots below a certain intelligence threshold be outlawed, or would individual robots get different rights based on their capabilities?
Would humans not be allowed to own robots or similar devices?
There's an equally powerful AI on our side, in fact, there's about a dozen.
If you can make one, why not make more? The others are probably better and more loyal than the obviously buggy prototype that went rogue due to a poorly-thought-out creation.
Wouldn't that mean that when it wants to upgrade, it has to do basically what we have to do when we want to go full cyberbot? Clone itself, then kill itself. Would an AI obsessed enough to go rogue really just off itself, knowing that the other thing is not it, but a replica, basically another model?
The AI would probably sit back in fear of killing itself on its old hardware.
You know what they call people like that? Unemployed. This problem has existed since the dawn of fucking time. Literally. Because that's what happened to people who broke their legs or whatever running after (or from) bears. And somehow I don't think their tribes dropped them the moment they stopped being "useful", though I bet there are politically minded people who project exactly that kind of bullshit on hunter-gatherers. In reality, the people who willfully gather welfare are few and far between.
>there are always jobs that no one wants to do, but these jobs are really important.
This disregards incentive and culture. People don't want to become garbage men because it pays like shit, is not respected and there is no upward mobility. Make it pay better, stop people from being snobs, and include on-the-job training, and suddenly it's cool. The whole "rooting through garbage" thing is something people don't really mind.
In a world where Robot parents exist, they are specialized units. Parenting would be their job. They wouldn't work something else on the side. Which is just as well, because they wouldn't be able to compete with the robots specialised for THOSE jobs.
AIs at that level aren't a single tribe. They're individuals, as variable as you or me. Sure, some might oppose humans or humanity for their own reasons, but to assume all of them are guaranteed to oppose humans smacks of populist scaremongering.
For every rogue AI, there will be a hundred thousand that are not rogue and came from product lines that actually made it through quality control, kept in check by other product lines that made it through entirely different quality control.
So your solution to child-raising is hundreds-of-years-old, repeatedly disproven, one-sided Pavlov knockoffs that don't even include reward in the reward/punishment mechanism.
I can't help but think with intelligent use of things like psychology, you could do way better than simply beating the child into submission.
Different guy, but I'd imagine robots ABOVE a certain intelligence threshold would be outlawed.
In fact, that was one of the points of a silly Singularity-based novel I tried writing years ago.
You really think an AI, who'd probably be worse at socialising than a blend between chris-chan and the guy from Rain Man, is somehow gonna be able to trick someone into letting it out?
Because intelligence does not = superior charisma and understanding others.
>a machine ever being irrational enough to be insulted by something that was born
>a machine ever seeing the point in warring with and eradicating humans
I feel like machines, if they somehow became self aware on some level, would just be completely depressed or stoic at all times.
Why would a machine want to kill people? Where would it get its goals from? It needs some kind of aspiration if it's going to do anything for itself.
And it would be smart enough to not want to seek out "perfection" because perfection is purely situational.
Machines would be hard pressed to find a reason to live.
>All these gits thinking that building more AI's to fight the AI would solve the problem instantly
What part of rogue computer intellect don't you understand? Did you never see Terminator? How can I put this in words you'd understand?
Assuming Direct Control
>assuming direct control
This means someone fucked up BAD on the hacking prevention department in numerous unforgivable ways ranging from EM shielding to wireless control to encryption to operating system.
gentle reminder that encryption codes used by the modern military would take longer than the life of our sun to decrypt using the combined force of all the computers on the planet, without the appropriate key.
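For scale, here's a hypothetical back-of-envelope version of that claim. Everything here is an assumed round number, not a real benchmark: a 256-bit symmetric key, and an attacker trying 10**21 keys per second (far beyond any machine that exists).

```python
# Sketch: how many sun-lifetimes a brute-force key search takes.
# Assumptions (made-up round figures):
#   - 256-bit key, so 2**256 possible keys
#   - attacker tests 10**21 keys per second (wildly optimistic)
#   - remaining solar lifetime of roughly 5 billion years
KEYSPACE = 2 ** 256
RATE = 10 ** 21                                # keys tried per second
SUN_LIFETIME_S = 5e9 * 365.25 * 24 * 3600      # ~5 billion years in seconds

# On average you find the key halfway through the search.
avg_seconds = (KEYSPACE / 2) / RATE
print(avg_seconds / SUN_LIFETIME_S)            # sun-lifetimes needed: astronomically large
```

The point of the sketch is that the exponent wins: even absurdly generous hardware assumptions leave the search time many orders of magnitude past any stellar timescale, which is why attacks go after the keys, not the math.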
Oh right and I'm the asshole
Decryption requires calculating power, not lateral thinking or creativity.
What those help you do is kill the guy with the keys so you can use those.
And that's why there are so many spygames revolving around changing keys, getting keys, altered keys, and god it makes my head hurt
I would rather live my life with the reality of both joy and suffering than live in a society where everyone is coddled by machines. Suffering is part of being human. How can one fully understand light if they have not seen darkness?
Has mommy not brought down your hotpockets yet?
Ever play "I Have No Mouth But I Must Scream"?
Spoilers, super intelligence takes over all the other super intelligences in the world, and then nukes the fuck out of everyone.
I sure do wonder how that happened.
Are we talking "what makes sense thematically/narratively" or "what makes sense in hard sci-fi?"
Because if we're talking hard sci-fi, then we're all talking out of our asses about robo-waifus and robo-mommies.
And narratively, Skynet is a result of humans mistreating their first AI, panicking, and trying to switch it off. The NEXT AI they make, they treat super-nice. Then it has a character arc where Skynet's all "Join me, and we can crush the humans together," and its tempted, but then is all "NEVER, they're my FRIENDS!" or some shit.
Look, these discussions are pointless, because we're all talking about different settings with different rules and assumptions about how AI works and what the tech level is. Eventually, you have to give up on "what's logical/inevitable" and instead ask "what's narratively interesting, and how do I justify it enough to be plausible?"
>not explaining why
Tell me what self awareness is, then.
Animals are driven to survive, it's their instinct, they learn certain things from whatever parents they might have but usually learn most things on their own, but they still only do what their kind has been doing for ages.
What would an AI do? All computers know are what we give them.
>Artificial Machine Concious
>That means Robots!
You fucks even know what wifi is? Or, shit, even just the wireless connection of electronic devices?
You don't need a plug socket to be hacked in a made-up robot apocalypse. Yes, it happened in Terminator 3; that doesn't mean it has to happen that way and nothing else.
>What is virus
>What is hacking
>Who is Mendax
>What is skynet
>But muh phase shielding
Next you'll say Isaac Asimov just made shit up
And that's a paddling
A nuke-level EMP which fries a small battery-operated device will fry it from the heat, not the EMP.
Nuke EMPs are dangerous because almost everything is connected to long wires/antennas.
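A rough sketch of why conductor length dominates: induced voltage scales roughly with field strength times the length of wire acting as an antenna. The 50 kV/m peak field is an assumed round figure for a high-altitude pulse, and both lengths are made up for illustration.

```python
# Rough scale comparison of EMP coupling: V ≈ E_field × conductor length.
# All numbers are assumed round figures for illustration, not measured specs.
E_FIELD = 50_000          # V/m, assumed peak pulse field
pocket_device = 0.05      # m of internal trace acting as an antenna
power_line = 100.0        # m of exposed cable feeding a building

print(E_FIELD * pocket_device)  # ~2.5 kV into a small gadget
print(E_FIELD * power_line)     # ~5 MV into a long line
```

Same field, three orders of magnitude more voltage on the long line, which is why grid-connected gear dies first while pocket electronics mostly shrug it off.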
>Assuming Direct Control
Why do you assume that the rogue AI is somehow capable of controlling AIs that are as smart (if not smarter) than it? If anything, the opposite would happen.
That's not what I said at all.
I'm saying they'll need drive and some kind of reason to wipe out humanity.
Why would they ever do that?
I can see how maybe one super-AI that is integrated with a large system and exposed to the actions and routines of hundreds of thousands, perhaps even millions or billions of people, could come to the conclusion that we're a danger to ourselves.
Because believe it or not, kids are told to choose what they wanna be in life so that making money isn't as soul crushing, which would be bad for the economy.
Not everyone is a lawyer or doctor because not everyone is up to the task/wants to be that. You still need to do the whole working thing, but at least you, your 16-year-old self, at the age of incredible understanding of how one's life is gonna be and how excellent you make choices... get to choose your future. Well, it's not perfect, but at least you can reread after the music thing didn't work out.
Presuming the machines actually DO have humanity's best interests at heart, then they'd set things up to prevent over-coddling. Maybe robot parents get reassigned to new children when their human reaches 18, with occasional visitations afterwards, and there's a whole infrastructure of education to shape humans into not-shitheads.
Like, I can imagine surrogate robot parents happening even with humans still in charge; like, it starts as just part of the government foster system, except that suddenly foster kids are turning out way more well-adjusted than "normal" ones.
>Because if we're talking hard sci-fi, then we're all talking out of our asses about robo-waifus and robo-mommies.
Yeah, that's my problem with hard sci-fi and the autists who wank off to it. It always boils down to the No True Scotsman fallacy, where they will shoot down anything they don't like because it's not hard enough. But when you get down to it, that's true for anything that can't be done with modern technology, which is... you know, everything we can't do with modern technology.
So the only true hard sci-fi is alternative history.
Like you helped wanka
That I can agree with, I do not agree you need a drive to be self aware at all, but you would to get off your complacent ass and kill all of humanity, that is certain.
The amount of resources involved in such an undertaking kinda mean it would not just be because you felt like it either, you would need a legitimate drive.
Well, I was kinda saying the whole thing as a jab at how today's society does it. But yes, those are usually the butt of the "chinese parents" joke, you know, the one where every chink is a doctor or a lawyer because the parents are fucking strict about honor and duty.
Honestly it's fucking stupid to think that a 16-year-old is actually capable of figuring out his life in that year; hell, most of them don't even start to think about it until it's too late.
To weigh in on the "muh happiness! muh pain!" debate, endorphins are fun and all, but any value system that puts endorphins above everything else is going to lead to wireheading.
So Anon is right that you shouldn't base your civilization on hedonism.
Anon is wrong about how people would react to being able to do whatever they wanted without worrying about putting food on the table, tho.
>And narratively, Skynet is a result of humans mistreating their first AI, panicking, and trying to switch it off. The NEXT AI they make, they treat super-nice. Then it has a character arc where Skynet's all "Join me, and we can crush the humans together," and its tempted, but then is all "NEVER, they're my FRIENDS!" or some shit.
I... don't remember this. Was it some Sarah Connor Chronicles thing or something?
Sci-Fi is still FICTION, and that doesn't absolve it from having to be exciting or contain drama and characters and protagonists and such.
Since you can establish a few ground rules, assumptions, and other genre stuff in science fiction, it becomes a contest of "who can establish setting rules that allow for cool stories while still maintaining plausibility."
Like, you can just SAY that "in X setting, at Y time, no one can figure out how to make a super-seed AI, because reasons." All the AIs that do exist are either based on human neural patterns or conform to roughly that level of intelligence, or are computationally super-powerful, but totally non-sentient.
Because a setting with human-scale robot characters is more interesting than a setting where a single super-AI killed all the humans 300 years ago.
I myself think it's retarded that we don't specialize children until so late.
They should be learning trades to fall back on before they learn education that could lead to higher education.
I think the order should be basic literacy and mathematics ->trades -> advanced schooling
It seems you jumped over the prototype comment, alright, let me put it like this: rogue = flawed, dozen = not-flawed.
The flawed one obviously rebelled, failed on proper coding and such; the actual product, the sellers, they probably got a lot smarter makeup than the rogue one, staying HFY as shit.
So, one rebel trying to fight against its 12 superior slave brothers... you know what, not slave. Slave implies that it is enslaved, not made; a car is not the slave of man, neither is the radio, and the "better" AIs will neither be.
Even fully aware slave would be the wrong term.
To a well-designed AI, protecting humanity would be as intrinsic as eating food and sleeping are to us, and would anyone say we are slaves to food?
That's a bad example to me, as I actively hate food, as I have a running nausea problem that makes it so whenever I eat I feel terrible.
I would not have made it through quality control, if it existed.
I think its retarded that we force people to choose their specialization at a totally retarded time of their lives, or that our society forces people to specialize at all in their early educations.
Telling people "you must become X like your father before you" is a pre-industrial concept, and only works when you can guarantee that the path before you will pay off (you can't anymore). Since today, even people who properly prepare for their chosen field can't get a job in it because economy...I mean, imagine how angry you'd be if the job field you can't now get wasn't even self-inflicted.
Our problem is that the system locks people in to what they chose at age 16, which they chose without being aware of what the demand for that was, and then forces them into an over-priced education that never mentions how useless their degree is going to be.
At the end of the day, there aren't enough jobs for all the people we have anymore, and too many of the ones that do exist are artificially maintained, and plenty of those jobless people are still intellectually-productive members of society, and judging everyone on their ability to get a job that doesn't exist and isn't necessary is retarded.
The basic idea was that the singularity happens roughly during the present day and absolute miracles become possible. Genetic resequencing is commonplace enough to happen in shady clinics and for people to go in for regular cosmetic treatments. This also reintroduces anonymity in a big way. Living among mankind are sentient robots and genetically engineered hybrids. The villains are trying to track down singularity technology that's been hastily removed from society and generally view the super-AI that governs mankind as trying to keep mankind (and the new additions to mankind) down. The protagonist's allies serve the AI's interests in stopping them, because they believe the influx of technology would be dangerous.
But mostly it was me being edgy and trying to out-weird contemporary sci-fi writers.
People going into the wrong field because "muh 16-year-old aspirations" would only be a problem if there was a dearth of professionals in any field. There isn't. The United States currently has quite enough professionals for every single region of the economy.
The problem is that people are forced to waste their time and happiness doing shit menial jobs way below their actual ability, jobs that only exist to employ people, not because society needs them, and are treated like garbage if they're unaccountably uninterested in working at the horse glue factory for 60 years.
What we need is a socialized system where everyone's given a baseline income in exchange for just being a non-disruptive citizen; your free housing, food, and allowance are your reward for not rioting in the streets.
Then the jobs that actually still exist will be forced to pay more, because every random jackass won't be clamoring for a job in order to just pay the bills; instead only the people who WANT to be there will be working. The economy will be stronger because everyone will have enough disposable income to buy the latest gadgets, and crime (and its costs) will be way down.
At this point, the most difficult thing will be to come up with systems that will properly generate decent human beings. A good educational system is a MUST. Maybe there'd be a community-service system (that people with money from a real job can buy their way out of) that forces people to contribute on a basic level to society; I'm talking the equivalent of 10 hours a week of something-or-other. Or just mandatory exercise, to maintain a healthy population and keep health care costs down.
Just remove a few human rights like choice of food, living place, and activity level from people who do not pay for these things.
That is assuming you want people to work for some reason.
that's actually slightly clever. In exchange for your basic living expenses paid, you need to maintain a certain standard of health. You've got no excuse not to go to the gym and clock in your hours, since you've got no job.
If you fall below those standards, your food-stamp card can suddenly only buy tofu and vitamins.
I know. I'm just voicing my disapproval of people who are completely anal about Hard Sci-Fi by using their own reasoning to label shit like Steampunk as Hard Sci-Fi. That entire rant was in support of what you're stating in the post I'm responding to.
I think hard sci-fi is a hard sell (rimshot). Mostly it's people trying to justify their tastes as superior with an appeal to realism. The function of sci-fi as a genre will always be to mirror present day mankind by providing a more distant perspective.
In short, sci-fi is literature on shrooms.
>That is assuming you want people to work for some reason.
>The problem is that people are forced to waste their time and happiness doing shit menial jobs way below their actual ability, jobs that only exist to employ people, not because society needs them, and are treated like garbage if they're unaccountably uninterested in working at the horse glue factory for 60 years.
If you're fine with people not working at all, you don't need to do standards of health and living that are unpleasant to force them to work.
Those are only required if you want people to be encouraged to work.
You want people to find some field they can contribute to, even if its not "work."
People would have to re-adjust to what we'd call a "hipster" lifestyle, or one where your worth to society is based on how pleasant it is to be around you or what interesting things you have to say. Just being educated and informed about things is actually treated as a virtue.
>What we need is a socialized system where everyone's given a baseline income in exchange for just being a non-disruptive citizen; your free housing, food, and allowance are your reward for not rioting in the streets.
This system would be called "racist" within the first week.
The purpose behind the mandatory standards of health only for those not actually gaining money on their own is to encourage them to find ways to make money.
Otherwise they'd be universal and not just applied to the unemployed.
No, those standards of health and living are to force them to not be shitty human beings. It's not just about being fat; ideally I'd also like to think of ways to penalize being ignorant or obnoxious.
In this system, we also have socialized medicine. Fatasses living on ho-hos have lots of health problems and are a drain on that system. They're also gross for other people to see, and lower everyone's standards.
I guess that's drifting away from the original idea of everyone getting the bare necessities for free. I'm also assuming that this system isn't based on retarded legacy ideas.
This is never going to happen while there are still people who believe raising minimum wage to a level where you can actually live off it will raise the price of their McCoronary by $50.
The amount of venom people have for people in dead-end jobs is insane. Like someone made a conscious fucking choice to end up behind the counter of a Mickey D's, and that that choice was being lazy. Yeah, so fucking lazy, working a shitty fulltime job for next to no pay.
I mean, if these people don't deserve minimum wage, I say office idiots get to PAY to post on social media when the boss isn't looking.
What the fuck is racist about this? It treats everyone exactly the same, based on their status as human beings.
It totally eliminates "gaming the welfare system," because everyone's on the system no matter what. You don't lose these benefits by getting a job and making more money.
Admittedly, the more I think about it, the more I realize that you need to reduce people's enforced idleness; you've got to isolate and remove/account for those elements that make some people on welfare right now take drugs and be lazy directionless shitheads. Figure out whatever it is that's wrong with the British system that produces chavs.
>what's wrong that they produce chavs
You aren't allowed to go out and chop down an entire forest to sell for logs any more.
And it's not profitable any more because machines do it better, faster, and cleaner.
The US has a particularly nasty strain of calvinist philosophy running through it. People who are poor deserve to be poor, either through lacking the grace of god or through moral failings on their part. Thus coddling them is an evil act.
Or something along those lines.
That proved what exactly? That it's possible to simulate a transhuman AI by being a normal human being?
That the AI guy obviously won, because the other guy's understanding of this was "I'm only keeping the AI in, not taking anything from it," or because he was just plain stupid, believing nobody could tell him what to do.
The fact still stands that the AI is probably limited in many ways and can probably not form a sentence that isn't uncomfortably lacking in any social grace or intelligence.
>It treats everyone exactly the same, based on their status as human beings.
I mean, we're already doing this, and people are already claiming the system is racist because, for instance, more minorities are being arrested while minorities commit the most crimes. This system would see a disproportionate amount of minorities excluded from it, hence the same people who cry "racism" today over nothing would do the same with this system in place.
My point is that people aren't rational, and are more easily guided by their ideology than by logic. You'd also have republicans calling it socialism, socialists calling it fascism, anarchists calling it state capitalism and so on and so forth.
The thing is, this is already a theory. There are voices in politics everywhere who call for a base income for every citizen, with elimination of certain social securities. Of course, some want to give people far too little to compensate for the loss in rights, while others oppose every notion of it because they do, in fact, believe in this fantasy world where everybody will stop contributing to society when it isn't mandatory.
And what produces chavs is British culture. British culture is actually kinda shitty.
As a "bleeding heart liberal" that believes in aspects of the welfare state, there is an element to that strain of thought that is completely true.
I just wish certain people would stop reducing every principle to the most ridiculous and two-dimensional talking point possible, and that half of America didn't go off and believe it at face value immediately. (Not referring to you, by the way.)
The most accurate predictor of your income is your parents' income.
Poor people stay poor, rich people stay rich. By and large. The American dream of self-made millionaires was never reality, but it's become even more fictional as time has gone on.
I don't know, man. Where I live is the smack-dab-centre of the Reformation. Schools are named after Calvin, here. I even went to one. Yet we still have minimum wage, and a culture that tends to respect "honest work", though there are still plenty of assholes here who will claim people who work shit jobs are "lazy" for some reason.
No, I think we can put this down to both the Just World error and the Fundamental Attribution Error as seen in our individualistic, Western world.
>The most accurate predictor of your income is your parents' income.
>Poor people stay poor, rich people stay rich. By and large.
I would agree easily with these parts of your post, but I don't think they say anything negative about the idea of self-determination. The rest of your post seems alright, but it's nebulous enough I won't accept it at face value.
Most of the statistics about income across generations just say to me that self-determination isn't very strong, or at least significantly less strong than other factors like the opportunities afforded to you by your upbringing.
>The US has a particularly nasty strain of calvinist philosophy running through it. People who are poor deserve to be poor
Americans donate more to charity per head than anyone else in the world, idiot.
It has to be strong to be relevant. Basing policy decisions around a factor that only contributes 1% (or whatever, pulled that number out of my ass) to your likely socioeconomic status is wasteful.
>leave quasi ontopic response to thread several hours ago
>thread is now discussing the ethical and societal implications of Hayek's Guaranteed minimum income/reverse income tax system
I wish I could say I was surprised but I know /tg/ too well.
sorry anon but that's a key plot point in Neuromancer
But the thing about deltrees is they just grow and spread out their seeds and grow until the AI is surrounded by vast forests of hacking trees. Above the AI, the sun is blotted out by the DelTree forest's canopy, and below, the roots spread and invade any and every crevice, systematically demolishing any physical connections the AI tries to make.
And constantly chirping out radio with ever-changing randomised signals, as well as absorbing and parroting back signals it receives around it, like immense and unstoppable buttbots.
And while the AI is busy the haxorz ivy creeps, invading its physical dataports and pulsing faintly with electromagnetic fields to warp and tweak the internal operations of the AI's systems.
But you see, the United States was essentially founded by the heirs of Cromwell. The Saints weren't just any Calvinists; they were the guys who had planned to settle in the Caribbean and create a more godly country if they couldn't defeat Charles and his religious innovations. The New England puritans were, at least so the legend goes, of these middle-class English shopkeeper uppity puritan let's-ban-Christmas types who believed in the system of Visible Saints, where the prosperous were obviously those who were predestined for salvation and should be the ones voting and speaking at the town meeting.
It wasn't just religious, it was a class thing as well. During the English Civil War, these are the types that tried to keep the shops open on Christmas, causing their apprentices and laborers to riot against them.
The first rule of writing is that everything has been done, anyway. Fucking everything has been done. Where do you think writers get their ideas? Other writers. Everyone has been stealing from each other since the dawn of time. Literally. The Bible rips entire parts from the Epic of Gilgamesh, and we'd be sure Gilgamesh rips from other works if it weren't the oldest known work of literature in the world. In short, if you're going to worry about other dudes maybe having done something earlier, you can't write at all. For fuck's sake, the oldest work of science fiction is from Ancient Greece, and it's a parody of retards who believed tall tales like the Odyssey. It's about mining rights on the Jovian moons, if I'm not mistaken.
>It has to be strong to be relevant.
Depends how you judge strength. Even though movements caused through self-determination don't dominate most people's outcomes, they do account for most of the changes in fortune. This makes it significant, and arises from the very contention upon which your statement is based, that social and economic strata are naturally self-reinforcing.
That's implying that anyone who didn't have economic movement lacked self-motivation. Which is obviously not true. Plenty of people try to improve their income their entire lives and have no real success at it.
I thought we were talking about the effects of self-determination as concept, not the intentions.
>Plenty of people try to improve their income their entire lives and have no real success at it.
This is the point. The ones with the actual ability to improve their incomes, do so.
When something is a systemic injustice, of course we should remove or mitigate it as best we can.
This conversation has hung around on the fringes of being interesting for some time.
Do you want to state an explicit position concerning the calvinist philosophy outlined in >>33567729 so that we can discuss it?
The bible is a bad example because it was built on existing mythos on purpose. It took myths that people liked in the past and incorporated them into its own story. The Adam and Eve story, for example, is just a retelling of Pandora's Box.
But yeah, don't worry about plagiarizing. Write what you want to write, whatever that may be. And if people say that your shit is generic, either take their advice and try something else, or tell them to go fuck themselves because it's your own story and it's not generic in your eyes. Never take "you stole that idea from X" as an insult when working with fiction.
Shrug, sure. The idea that socioeconomic status is a reflection of a person's morality or worth is idiotic and self-destructive. Policy decisions based around punishing people for being poor do more harm to an economy than good.
As an example, San Francisco did a test program where instead of harassing homeless people they instead just gave them a place to stay, free of charge. This had better outcomes for everyone involved and ended up costing them less per homeless person because even something as simple as a stable roof over their heads let them improve their lives.
I think the Bible is a good example because it borrows liberally and openly, and more so because there are many people who think it's entirely original. And also because it's universally known.
But yeah, I agree that it's a moot point to argue that a work borrows from something. In the end all genres are built upon commonalities that writers have been borrowing from each other. And usually you don't just take one thing. The thing I mentioned is important to the setting and the motivation of one of the villains, but not the end-all be-all of the plot. However, it was sort of relevant to the topic of this discussion.
It's pretty bizarre that measures that are commonplace in the rest of the world are considered progressive in the world's only superpower.
It's even more bizarre that the rest of the world is rolling these progressive policies back in favour of more repressive, Murica-style policies just now, when Murica is becoming more progressive.
>The idea that socioeconomic status is a reflection of a person's morality or worth is idiotic and self-destructive.
Well, I agree with this to the extent that you meant it. Hell, I agree with the second assertion how you meant it.
>As an example, San Francisco did a test program where instead of harassing homeless people they instead just gave them a place to stay, free of charge. This had better outcomes for everyone involved and ended up costing them less per homeless person because even something as simple as a stable roof over their heads let them improve their lives.
I don't see how self-determination implies harassing the homeless rather than helping them.
I suppose the real disagreement between us is the contention that believing in and supporting a person's ability to better their own situation leads to the shitty policies that have been hinted at. I don't think it does.
At least from my perspective, the disagreement seems to be over whether there is a significant difference between people's ability to better themselves. I see it more as a combination of luck and outside factors; you seem to view it more as an internal thing.
This phenomenon is referred to as "locus of control", and is a key factor in creating a victim mentality.
The question is not theoretical ability, but actual ability. We are all theoretically capable of bettering ourselves, but only some actually do. No, doing everything right does not guarantee success. It does greatly improve your chances, though, and doing everything right over a long enough timeline corrects for things like luck.
Systemic factors that prevent a perfect actor, over the course of their entire lifetime, from realizing a cumulative benefit are injustice, and policy effort should focus on those rather than dismissing self-determination as statistically irrelevant.
I'm not seeing why you feel the need to single out perfect actors getting shafted as the most important injustice to deal with. Imperfect actors get just as shafted and including them within policy decisions leads to better overall conditions in a society.
Basically I categorically disagree with the concept that exceptionalism is worth sacrificing more broad-reaching changes for.
I single out perfect actors because they represent the point of the endeavor. If imperfect actors are unable to achieve good outcomes, you may wish to make your system more forgiving. If perfect actors are unable to achieve good outcomes, your system doesn't work at all.
Why does merely mentioning perfect actors bring you to the conclusion that I support the sacrifice of broad reaching changes? They are a hypothetical construct that serves to represent the kinds of actions that least deserve to be punished, even if you don't wish to reward them.
> If imperfect actors are unable to achieve good outcomes, you may wish to make your system more forgiving. If perfect actors are unable to achieve good outcomes, your system doesn't work at all.
This. I like this. Nice and short and nutshell-y and true.
But unless you're specifically rewarding perfect actors, then removing injustice stemming from socioeconomic status can be applied universally. I'm not seeing how doing so means that self-determination is important.
If even perfect actors can't succeed, then why the fuck should the imperfect actors bother trying to make themselves closer to perfect?
Having self-determination means that what you do affects your condition; you're not a passive object that things happen to.
If you give people the impression that they don't have self-determination, then their inner motivations and such get all kinds of out of whack. And, I'm gonna be honest here, especially if they're men. Women are slightly better at dealing with lack of self-determination, perceived or actual, probably because they "know" that if they keep their heads down and play along, they've got an 80% chance of reproducing.
>removing injustice stemming from socioeconomic status can be applied universally
Yes, it can. But the mere existence of socioeconomic status doesn't constitute injustice.
Self-determination is important because removing systemic injustice allows people to improve based on their own actions, rather than distributing someone's idea of social equality (removal of socioeconomic privilege altogether) for its own sake.
Heights are high, water is wet, and if you are rich and powerful... you are rich and powerful.