Moore's law for supercomputers appears to be slowing down, with progress more limited than had been anticipated. Have we hit the wall?
Silicon transistors really are slowing down.
If you look at technology over the longer term, it tends to plateau a little until a breakthrough is made, then it shoots up again. Graphene transistors or quantum computing will come along in the next 100 years to revitalise Moore's law. Which isn't really a law so much as an observation.
Silicon transistors are reaching their smallest size rapidly. New architectures will have to be employed to revitalize Moore's law. Otherwise, new materials will have to be used. Graphene looks to be the next small step before quantum computing really takes off.
Regardless of what is happening, Kurzweil is hardly BTFO. In his language, a tapering of the curve merely signals the end of a technological paradigm and the beginning of a search for a new one.
What an insanely myopic view.
You're not interested in having the computational power of your current desktop PC fit on a chip the size of your thumb nail?
You're not interested in single nodes with the power to simulate the whole human brain, when today's best supercomputers must be content with covering only a small fraction of it?
You're not interested in nanobots that could mimic the function of red blood cells or neurons?
>it wouldn't have flops
Sure it would. It would be like every other computer, except its algorithms can be in multiple states at once.
Example: today a video card has thousands of ALUs to compute the same thing with different input values. Those thousands of ALUs could in theory be replaced by a single quantum ALU.
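The video-card analogy can be sketched classically. This is a minimal illustration (with made-up sizes and a made-up operation) of the data-parallel pattern being described: one instruction applied across thousands of "ALU" inputs at once, which a vectorized array op models:

```python
import numpy as np

# Each array element stands in for one GPU ALU's input; the same
# operation (here, x*x + 1) is applied to all of them in one step.
# 4096 is an arbitrary stand-in for "thousands of ALUs".
inputs = np.arange(4096)          # different input values per "ALU"
outputs = inputs * inputs + 1     # one instruction, many data (SIMD)

assert outputs[3] == 10           # 3*3 + 1
```

The quantum-ALU claim in the post is the speculative part; the code only shows the classical pattern it would supposedly replace.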
Luckily the concept of god is a silly caveman superstition, so we can play god all we want.
We should worry about not accidentally (or purposefully) destroying ourselves instead of playing at being fictional entities.
Best not open an umbrella indoors
>What an insanely myopic view.
Honestly I am more terrified of the implications of such technologies; we already see what humanity uses the internet for: ordering dragon dildos, watching others play video games for them, or just sending naked pictures to each other.
Also I strongly doubt that even the smartest minds will have the capacity to implement these technologies if they actually become a reality. I mean, software written today is still single-threaded, and the most brilliant people in Silicon Valley are busy trying to reinvent another hype-driven chat client.
I think the biggest breakthroughs in science are going to be in genetics and automation rather than IT.
Moore's Law was about price, not power.
You might want to argue that with Gordie.
11:01 - "I wanted to get across: here's an idea where the technology is going to evolve rapidly and it's going to have a major impact on the cost of electronics. That was the main point I was trying to get across, that this was going to be the path to low-cost electronics."
- Gordon Moore.
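The cost framing in that quote is easy to make concrete. Under the assumption of a fixed-cost die whose transistor count doubles every two years, cost per transistor halves on the same schedule. The dollar figure and starting count below are hypothetical, chosen only to show the arithmetic:

```python
# Hypothetical numbers to illustrate Moore's cost argument:
# a die of fixed cost holding twice as many transistors every
# two years means cost per transistor halves every two years.
die_cost = 100.0          # dollars, assumed constant per die
transistors = 2_300       # assumed starting count

for year in range(1971, 1981, 2):
    cost_per_transistor = die_cost / transistors
    print(year, transistors, round(cost_per_transistor, 5))
    transistors *= 2      # doubling every two years
```

After five doublings the per-transistor cost has fallen 32x while the die cost has not moved, which is the "path to low cost electronics" point.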
>conflating transistor count with performance
>Silicon transistors really are slowing down.
Performance is derived from architecture first and foremost; the issue is not the substrate material. This is popsci garbage regurgitated by the clueless.
No one is going to massively increase serial integer performance without some radical paradigm shift in IC design. We can't possibly feed instructions to the ALUs fast enough. FPU performance can still make healthy gains by way of larger vector instructions, and multicore scaling can still benefit greatly without any major R&D being required, especially when it comes to socket-to-socket communication. The structure of the core itself is the issue, and this is true regardless of ISA. Every Von Neumann architecture will reach a plateau in single-core performance.
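The limit on multicore scaling can be made concrete with Amdahl's law: speedup = 1 / ((1 - p) + p/n) for a workload whose fraction p parallelizes across n cores. A quick sketch, with a hypothetical 95%-parallel workload:

```python
def amdahl_speedup(p, n):
    """Amdahl's law: overall speedup for parallel fraction p on n cores."""
    return 1.0 / ((1.0 - p) + p / n)

# Even a 95%-parallel workload is capped at 20x by its serial 5%,
# no matter how many cores you throw at it:
print(round(amdahl_speedup(0.95, 8), 2))      # 5.93
print(round(amdahl_speedup(0.95, 1024), 2))   # 19.64
print(round(amdahl_speedup(0.95, 10**9), 2))  # 20.0
```

That serial remainder is exactly the single-core plateau the post is describing: past some core count, only making the serial part faster helps.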