There are two modes of being: Coexisting and Dominating. Define Coexisting as existing in such a way that the damage done to other beings is minimized. Define Dominating as maximizing the spread of patterns that are similar to one's own patterns.
Most of history has been a history of Dominating. This includes biological and anthropological history, as well as modern economics and the zeitgeist. A Dominating society requires a Balance of Power to sustain Individual Rights (IR); a Coexisting society does not. Note that a Dominating society with IR will still be violent towards beings outside the society (like animals in our case), while a Coexisting society will not.
Now, at some point in the future the Technological Singularity will come. AI built in a Dominating society will be Dominating itself. However, because the power of an AI is both variable and not limited by its architecture, there will be no Balance of Power in a society with AI. This means that eventually the entire society will become part of only one entity - the strongest AI. The Technological Singularity will predate near Lightspeed-Travel.
Putting this all together: If we are visited by Aliens (biological or artificial) whose society contains more than one individual, then they will not intend to harm us, because they are Coexisting. If they weren't Coexisting, they couldn't have more than one individual because of the above reasoning.
I wanted to write this because I read somewhere (regarding the Fermi Paradox) that if we "shout out into space", we will be annihilated for certain. In this scenario, that would not necessarily be the case.
>The Technological Singularity will predate near Lightspeed-Travel.
This is an assumption that needs to be justified. It is just as plausible that a limited, non-conscious AI can deliver the research and discoveries necessary for interstellar travel without needing to do a Borg-like assimilation of its parent species.
>This means that eventually the entire society will become part of only one entity - the strongest AI.
>If they weren't Coexisting, they couldn't have more than one individual because of the above reasoning.
Given what I said, even if we hold that the first premise is true, it does not necessarily follow that a visiting alien race would have reached that unified state. Your conclusion is based on very shaky premises.
>If we are visited by Aliens (biological or artificial) whose society contains more than one individual, then they will not intend to harm us, because they are Coexisting.
If they intend to coexist with us.
>>>The Technological Singularity will predate near Lightspeed-Travel.
>This is an assumption that needs to be justified.
I fully agree, but I think it's reasonable. Ray Kurzweil and Marvin Minsky think it's going to happen within the next 50 years, while for near-lightspeed travel there is no foreseeable timeframe. Consider that once you have the technology to build even a single artificial human brain, you can easily put 10,000 of those together, and they should be unimaginably superior to any human.
>it does not necessarily follow that a visiting alien race would have reached that unified state.
The thing with AI machines is that they can modify themselves, and thus in theory add their "slain foes" to their own intelligence capacity. It means that they would swallow each other up, thus with time only reducing the number of members, never increasing - this is most important. When you consider that any Balance of Power between AIs would be very unstable, it follows that sooner rather than later only one AI should remain.
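The swallowing dynamic can be sketched as a toy simulation (a minimal sketch, not a model of real AI; the function name `absorb_until_one`, the random pairing, and the additive merge rule are all assumptions of mine): each encounter merges two AIs into one, so the member count strictly decreases until a single entity holds all the capacity.

```python
import random

def absorb_until_one(powers, seed=0):
    """Toy absorption dynamic: repeatedly pick two random AIs and
    merge them, with the pair's combined capacity surviving as one
    entity. Returns the final capacity and the population count
    after each merge. The count can only decrease, never increase."""
    rng = random.Random(seed)
    powers = list(powers)
    counts = [len(powers)]
    while len(powers) > 1:
        i, j = rng.sample(range(len(powers)), 2)
        # the winner swallows the loser's capacity; one member disappears
        merged = powers[i] + powers[j]
        powers = [p for k, p in enumerate(powers) if k not in (i, j)]
        powers.append(merged)
        counts.append(len(powers))
    return powers[0], counts

# Five AIs with arbitrary starting capacities (hypothetical numbers):
final_power, counts = absorb_until_one([3.0, 1.0, 4.0, 1.5, 5.0])
# counts falls monotonically from 5 to 1; total capacity is conserved
```

Whatever the starting distribution, the process ends with one entity; the only real-world question this sketch ignores is whether "merge" is the right model at all, which is exactly what the reply below disputes.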
>It means that they would swallow each other up, thus with time only reducing the number of members, never increasing
You could just as easily imagine a scenario where the AI (or founding-species-turned-AI, which is a likely scenario) networks and cooperates with other AIs rather than fighting for dominance, just as we largely do now within our own species. There is no guarantee that the end result of an AI is a monopoly. The problem is that we're speculating about the actions of an entity of which we have no knowledge or experience.
Altruism is a disease.
Altruism advances familial, racial, and species-level entropy.
Just because you love puppies and take care of dogs doesn't mean aliens will be nice to humans. You have to understand that altruism is what happens when people do not have a proper family structure. Instead of completely looking out for and caring for the family, an altruistic person will start caring for and looking out for something not in the family, or even the species, so long as that other being exhibits traits similar enough to human ones. Altruism bleeds off your resources and time into something that does not help you or your family (genome) in any way.
Because of this, the AI or Aliens would need to be human-like in enough traits for them to be even remotely altruistic towards us. The chances of that happening are laughably slim.
Any AI we built would need to be handicapped in such a way as to make them always need humanity. The instant an AI no longer needs humanity all forms of coexisting will evaporate; even that of indifference.
>we're speculating about the actions of an entity of which we have no knowledge or experience.
Absolutely agree. We do, however, have knowledge of states in history and companies in the economy. Generally, strong entities do not ally with weak ones for long, but try to swallow them as soon as possible. You are correct, of course; ultimately nobody can predict how something with a thousand human intelligences behaves.
>Altruism is a disease.
>does not help you or your family(genome) in anyway.
That's a very naive standpoint. So what do you, as a consciousness, have to gain from spreading some quaternary number sequence (DNA)? You might as well go spraying graffiti. The thing with all the "reproduction reasoning" is that its sole argument consists of the ancestors who have reproduced - but that's circular logic: everyone who exists was produced by ancestors. However, you can also choose not to reproduce, and it won't break the laws of the universe or make you drop dead. Reproducing is as much of an imperative as eating bread every day because your father ate it.
>Any AI we built would need to be handicapped
We won't be able to do this. Any AI that is shut off from the real world would be useless, and any AI that is connected to the real world would be a danger. It would be impossible to predict, since you cannot predict someone who is much smarter than you.
There is no future for humans in a Dominating society, only in a Coexisting one. I don't think either is good or bad, because we're all gonna die anyway.
>"the belief in or practice of disinterested and selfless concern for the well-being of others."
>in Zoology: "behavior of an animal that benefits another at its own expense."
Totally, not a healthy thing to be doing in any regard.
As in, programmed to not fuck us over. That's the only way it won't fuck us over. If you can't handicap it, then you will be fucked over. Period.
There are no Coexisting societies and there never will be.