Let's imagine an alternate world. Don't mind the logistics of it, just humour me for a second.
In this world all humans:
- Are completely rational
- Will always prioritize the well-being of the human race over their own.
- All work under the same framework, so all humans will reach the same conclusion if given the same information.
With all these conditions in place, could you construct a scenario in which a war between two groups of these humans is inevitable or possible?
You can make up or change anything you want as long as those rules aren't broken.
>>1782050
>All work under the same framework, so all humans will reach the same conclusion if given the same information.
what
And yes, you can feed them false or flawed information.
>>1782050
Blacks are exterminated because they offer no logical benefits whatsoever.
>>1782100
One group is fed information that another group is planning to start a global conflict. The information would have to be something to the effect of "Well, those guys believe this global conflict would be a form of population control, but I believe that it would escalate into global annihilation."
Tho maybe this would lead the first group of people to just not fight back...
>>1782120
>Tho maybe this would lead the first group of people to just not fight back...
If they believe their own extermination would be detrimental to the human race, they would have to fight back.
>>1782120
One group is fed information that another group is the "Other". The information would have to be something to the effect of "Well, those guys are all the bad things and none of the good things and probably want you destroyed..."
oohhhhh.....
>>1782050
Two groups have different info and reach different conclusions.
I'm not a linguistics expert, and it's been a while since my intro-level logic classes, but I believe something like
majority A believes 0.9999-repeating != 1
minority B believes 0.9999-repeating = 1
could lead to conflict.
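For what it's worth, 0.999-repeating = 1 is a theorem, not a matter of belief, so group A above would simply be wrong. A quick illustrative sketch using exact rational arithmetic (the standard geometric-series argument), offered only to show the two positions aren't symmetric:

```python
from fractions import Fraction

# 0.999... is the limit of the partial sums 9/10 + 9/100 + 9/1000 + ...
# A geometric series with first term a and ratio r (|r| < 1) sums to a / (1 - r).
a = Fraction(9, 10)   # first term
r = Fraction(1, 10)   # common ratio
total = a / (1 - r)
print(total)          # 1 exactly

# The partial sums also get arbitrarily close to 1:
partial = sum(Fraction(9, 10**k) for k in range(1, 50))
print(1 - partial < Fraction(1, 10**40))  # True
```

Of course, the thread's point survives either way: agents reasoning from flawed premises can still diverge.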
>>1782285
>logic
>believes
barf.targa.7zip
>>1782050
they will all kill themselves, as this is the most rational choice in the long run.
The sun is going to burn out one day, and all people will die with it. Or, if they're somehow lucky and manage to become super-technological, the universe itself will end one day and everything will become futile.
When the well-being of the human race is prioritized, they have no qualms about killing themselves to spare their descendants a futile existence near the end of the universe.
So yes, there will be a war, and they will gladly kill each other.
>>1782050
Have them choose:
Ass or tits?
Humans would probably act like cells, each doing a specific task and altering their genes to be better at their particular niche. Would we even be conscious if we acted only out of logic?
>>1782050
>prioritizing the human species over self-interest is "rational"
Why?
Also, yes, war is possible. This world is not post-scarcity.
>We, state A, want this commodity
>We, state B, also want that commodity
Shenanigans ensue.
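The scramble for a scarce commodity can be sketched as a tiny payoff matrix. A hawk-dove style game (with made-up numbers, not anything from the thread) is one standard way to show that two perfectly rational states reasoning identically can still both choose to fight:

```python
# Minimal hawk-dove payoff sketch (illustrative, hypothetical numbers).
# V = value of the commodity, C = cost of fighting.
V, C = 4, 2  # when V > C, "hawk" strictly dominates and both states fight

# payoffs[(my_move, their_move)] = my payoff
payoffs = {
    ("hawk", "hawk"): (V - C) / 2,  # fight: split the value minus the cost
    ("hawk", "dove"): V,            # take everything
    ("dove", "hawk"): 0,            # yield
    ("dove", "dove"): V / 2,        # share peacefully
}

def best_reply(their_move):
    # A perfectly rational state picks whichever move pays more.
    return max(("hawk", "dove"), key=lambda m: payoffs[(m, their_move)])

# Both states run the same reasoning on the same information...
print(best_reply("hawk"), best_reply("dove"))  # hawk hawk -> war
```

With V > C, "hawk" is the best reply to either move, so identical rational reasoning still lands both sides in conflict; the shared framework guarantees the same conclusion, not a peaceful one.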
>>1783999
You are imposing your own judgement upon them under a guise of rationality.
What you have suggested is only rational if they think that:
1) the only thing that can be meaningful is human progress
2) things must be eternal to be meaningful
3) human progress won't be eternal, and is therefore not meaningful
4) death is preferable to a life that lacks striving for meaningful things
5) children will suffer near the end of the universe
6) the suffering of others is worse than one's own death
Basically, you made a bunch of baseless assumptions about those hypothetical people without stating them.