If we program a superintelligent AI that can program by itself, tell it to program an even smarter AI, tell that even smarter AI to do the same, and so on, will we end up with the most intelligent being possible?
It needs to be running on a quantum supercomputer though.
No, because its intelligence will be limited by its initial programming and the sum of knowledge it has access to.
>>58824279
what am i reading
>>58826225
It has access to the entire world wide web. I'm sure that's enough information.
>>58824279
Create something that can learn to learn and you just made a human. Create a program that can learn to make machines that can learn to learn and you just made a "God".
>>It needs to be running on a quantum supercomputer
Why? It could be running on a Turing machine; your question was about whether this process converges to some ideal. Which, by the way...
>> most intelligent being possible
type "vague" in all fields
>>58826225
No, that's knowledge.
Intelligence is how well you're able to understand and learn.
>>58824279
AI will be limited by processing power, access to information and storage.
It will keep learning but will stagnate when one of those three is absent.
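The "stagnates when a resource is capped" point can be illustrated with a toy fixed-point iteration (purely hypothetical numbers; the cap and the halving rule are made up for illustration):

```python
# Toy sketch, not a claim about real AI: each generation's gains
# shrink as "intelligence" approaches a fixed resource cap.
def improve(x, cap):
    """Each generation closes half the remaining gap to the cap."""
    return x + 0.5 * (cap - x)

x = 1.0
cap = 100.0  # stand-in for the processing/information/storage limit
for generation in range(50):
    x = improve(x, cap)

# The sequence climbs toward the cap but never passes it:
# raising the cap is the only way to keep improving.
```

Under these assumptions the recursion converges to the limit instead of running off to infinity, which is the stagnation described above.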
The programming, though, could be too complex for today's technology.
You don't "program" an AI, though. You grow one.
>>58824279
No, since entropy must always increase: any AI that programs "an even smarter AI" will just program a more bloated AI.
Eventually your AI becomes so bloated it puts even Xorg to shame.
>>58830701
Hence it being limited by its initial programming