How the fuck is blind belief in The Singularity and a techno-rapture even a thing?
How can anyone be so fucking blind to the obvious and glaring problem with the idea of "infinite progress"?
Moreover, how can all these pop-science pothead-logic reddit soothsayers (e.g. WaitButWhy, CGPGrey) keep talking about the Singularity being "right around the corner" with such optimistic conviction when every sign is pointing to technological progress slowing down and growing exponentially more expensive? The difference between 1950 and 2020 is minuscule compared to the difference between 1880 and 1950. How can they not see that?
>>8908970
Because it's easier to muse about hard work being done than to actually do it.
>>8908970
so you're telling me sam harris is a moron?
yeah fuck off
>>8908970
>The difference between 1950 and 2020 is minuscule compared to the difference between 1880 and 1950.
I'm curious why you say that, and what metric you are using for technological progression.
The world's population in 1880 was 900 million. In 1950, it was 2.55 billion. That's a difference of 1.65 billion people (a 183% increase).
Now, the population in 2020 will be around 7.56 billion. That's a difference of 5.01 billion (a 196% increase). We can only support 7+ billion people because of advances in technology.
Arguing that technology is "slowing down" is just nuts. All measures of reality show we are rapidly accelerating.
>The difference between 1950 and 2020 is minuscule compared to the difference between 1880 and 1950.
What metric are you using to compare these two time spans?
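The percentage arithmetic above checks out, by the way. Quick sanity check (figures in billions, straight from this post):

```python
# Population figures cited above, in billions.
pop_1880, pop_1950, pop_2020 = 0.9, 2.55, 7.56

def pct_increase(old, new):
    """Percent increase from old to new."""
    return (new - old) / old * 100

# 1880 -> 1950: +1.65 billion, ~183% increase
print(round(pop_1950 - pop_1880, 2), round(pct_increase(pop_1880, pop_1950)))
# 1950 -> 2020: ~+5.01 billion, ~196% increase
print(round(pop_2020 - pop_1950, 2), round(pct_increase(pop_1950, pop_2020)))
```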
>>8908970
How is blind belief in anything a thing? People like to pretend they're done examining their ideas and growing.
>>8908970
>The difference between 1950 and 2020 is minuscule compared to the difference between 1880 and 1950
>>8908970
All that's necessary for singularity is a machine that's able to improve its own hardware and software by learning, solving, and adapting.
>>8909386
No, a singularity also requires that the increase in computational (or perhaps more fittingly, cognitive) capacity outruns the increase in the difficulty of increasing that capacity.
>>8909417
Which would be solved by an ai that can improve its own hardware and software.
>>8909422
No, you don't understand. If, for example, an AI's ability to increase its own ability were proportional to its current ability:
y'=ay, a>0
Then there would be no singularity, only an exponential increase in computational power, just like we have now.
Of course the ability to increase the computational power of an AI system ultimately rests on the laws governing nature, so it's impossible to predict if a singularity would occur.
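If you want to see the difference, here's a toy sketch (my own illustration, arbitrary constants): the proportional case y' = ay only grows exponentially and is finite at every finite t, while a superlinear case like y' = ay^2 has a closed-form solution that diverges at the finite time t* = 1/(a*y0), i.e. an actual mathematical singularity:

```python
import math

a, y0 = 1.0, 1.0  # illustrative constants, chosen arbitrarily

def y_linear(t):
    """Solution of y' = a*y: exponential growth, finite for every finite t."""
    return y0 * math.exp(a * t)

def y_quadratic(t):
    """Solution of y' = a*y**2: y(t) = y0 / (1 - a*y0*t), blows up at t* = 1/(a*y0)."""
    return y0 / (1 - a * y0 * t)

t_star = 1 / (a * y0)  # finite-time singularity of the quadratic case
for t in (0.5, 0.9, 0.99):
    print(t, y_linear(t), y_quadratic(t))
# y_linear stays modest while y_quadratic explodes as t approaches t_star = 1.0
```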
>>8909517
Of course the singularity wouldn't be instantaneous; it would just be exponentially faster than what we have right now.
You seem to define the singularity by a certain improvement over a certain time frame, and maybe that's right for some, but I define it as an exponential progression of intelligence driven by AI. In other words, once AI can discover and solve unsolved problems faster than humans can, I'd say we're in the singularity.