Will singularity actually happen or is it just a sci-fi fantasy?
>>8975717
Did I post before the bingo image and >>>/x/ recommendation?
If so, nice. That poster's usually damn fast on these threads.
Singularity is an umbrella term, so there's no way to say whether it will or won't happen. It's just supposed to be the point where our ability to research and apply new developments starts yielding returns at an accelerating rate, beyond which past speculation becomes pointlessly inaccurate.
Kurzweil marks it as the point at which a single computer will have more computational power than the brains of everyone currently alive on the planet combined. That might happen, but we're nowhere near it currently.
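For scale, here's a rough back-of-envelope version of Kurzweil's benchmark. The per-brain figure is an order-of-magnitude assumption (~10^16 ops/s is a commonly cited guess, not a measured fact), and the supercomputer figure is roughly exascale:

```python
# Back-of-envelope only; both constants are rough assumptions.
OPS_PER_BRAIN = 1e16   # assumed ops/sec per human brain (order of magnitude)
POPULATION = 8e9       # people alive today, roughly

all_brains = OPS_PER_BRAIN * POPULATION  # ~8e25 ops/sec

# A top supercomputer today is around exascale, ~1e18 FLOPS.
TOP_MACHINE = 1e18

shortfall = all_brains / TOP_MACHINE
print(f"all brains: {all_brains:.0e} ops/s, gap: ~{shortfall:.0e}x")
```

Under these assumptions a single machine is still roughly eight orders of magnitude short, which is the sense in which we're "nowhere near it."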
>>8975717
When quantum computing has developed far enough and machine learning has reached a high enough level. Currently we can only do simple arithmetic on quantum computers, and machine learning is used for really, really narrow applications right now. Even then... how will we know the singularity has happened?
>>8975717
I believe that the Singularity is not possible. For the argument I'm about to make, I define intelligence as "inference and deduction" (this is an oversimplification, but follow me).
The AI singularity theory proposes that at some point we create a machine slightly more intelligent than any human could be; this machine could then create something more intelligent than itself, and so on, producing this so-called infinitely intelligent machine.
For that to happen, intelligence would have to be unbounded, and I think intelligence is actually bounded; that is, there is a limit A on the rate of inference and deduction any machine (biological or otherwise) could achieve.
Suppose we have a machine M with spatial computational power X and temporal computational power Y (that is, how many computations it can hold and how fast it can perform them). I think any optimal inference and deduction strategy is independent of X and Y.
This is observed in biology: the rate of evolution is largely independent of the size of the population and the speed of reproduction.
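One way to make the "more X and Y doesn't help past a point" intuition concrete is Amdahl's law from parallel computing, used here purely as an analogy (the poster doesn't invoke it): if some fraction of a reasoning process is inherently sequential, then no amount of extra spatial compute X removes that serial bottleneck.

```python
def amdahl_speedup(parallel_fraction: float, n_workers: int) -> float:
    """Amdahl's law: overall speedup from n_workers when only
    parallel_fraction of the work can be parallelized."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_workers)

# Even with a billion workers, a 10% serial fraction caps speedup near 10x.
print(amdahl_speedup(0.9, 10**9))
```

So if inference has any irreducibly sequential component, the achievable rate is bounded regardless of hardware, which is at least structurally similar to the claimed limit A.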