Why are we seeing this decrease in the rate of Moore's law?
Primarily it's because we have hit the limit of easy reduction, and new gains are increasingly difficult to mass-produce. Underlying this is the problem that lithography can't get much tighter than it already is.
It was never all that much of a law in the first place. I think a lot of people are going to be in for a rude awakening when in 20 years we have nothing new other than slightly faster computers and higher resolutions. Innovation is dead. "The future" as it's described in science fiction and cyberpunk will never be.
The "normies" are right. When was the last time you were really excited for a new piece of technology? Everything is just slightly faster phones etc. now. The only thing that comes to mind is VR, and that is already failing. Driverless cars are failing because they don't work reliably outside ideal weather conditions. Technology is pretty much at the best it's gonna get.
>when things that are bottom floor entries into a subset of technologies that didn't previously exist don't work perfectly, they are "failing"
Remember when the internet failed because 56k was so bad? Thank god we all learned our lesson and moved on.
Moore's law describes the rate at which the number of transistors on a chip doubles.
HOWEVER, it was never designed to account for quantum mechanics. The transistors are so small that quantum tunneling is becoming an obstacle that takes extra design time. The equipment to mass-produce the chips also takes longer to make.
I mean, the chip in your phone has about one to two billion transistors. In six years instead of four, chips will have about six to eight billion transistors.
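The arithmetic of a slowing doubling cadence is easy to sketch. A minimal illustration, assuming a rough starting count of 1.5 billion transistors (an illustrative figure for a phone chip, not measured data) and comparing a two-year doubling period against a slowed three-year one:

```python
# Rough projection of transistor counts under a given doubling period.
# The starting count and periods are illustrative assumptions.

def projected_transistors(start, years, doubling_period):
    """Transistor count after `years`, doubling every `doubling_period` years."""
    return start * 2 ** (years / doubling_period)

start = 1.5e9  # ~1.5 billion transistors today (assumed)

# Classic Moore's law cadence: doubling every 2 years
print(projected_transistors(start, 6, 2) / 1e9)  # -> 12.0 (billion)

# Slowed cadence: doubling every 3 years
print(projected_transistors(start, 6, 3) / 1e9)  # -> 6.0 (billion)
```

Same six years, but stretching the doubling period from two years to three cuts the projected count in half twice over; that gap is the whole argument in one number.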
Not the same. The price isn't even one of the major failings of VR. But it does put the nail in the coffin. There's still:
-the requirement of an extremely high-end machine to run it, making the price effectively higher
-nogamez; this will especially hit PS4 VR, which is already going to be filled with shovelware because the PS4 really can't handle VR
-it has no potential; it really can't go anywhere other than some cheap games and movies
Still doesn't change the fact that tech as a whole is lacking in this day and age.
>Still doesn't change the fact that tech as a whole is lacking in this day and age.
No, moron, we have smartphones. They have been a huge game changer and an incredible leap in technology.
Nine years ago, no one had a computer with wireless mobile internet in their back pocket at all times.
>making the entirety of VR development about the jewbook machine when everyone and their mother is scrambling to get something out the door
>implying being expensive and limited in scope is a unique problem for an emerging technology
How do smartphones exist at all? Cellphones should have failed before ever evolving, because they started out as expensive, clunky, and of limited use. Goddamn idiots perpetuated the cellphone meme, they should have been able to tell it was a gimmick.
The difference is that the newest phones are really just small computers, not an improvement on the original tech in the phone. Something like VR won't see any kind of improvement from haphazardly slapping computers inside it. It won't make it any better.
Of course, you will never be able to point to any time in the last century where a 20 year period didn't see massive upheavals in technology, but this time will be different. Cause we kept slappin the computilateridoos into the widgerydots. Right, BTFO.
I'm itching so hard to facepalm myself unconscious due to this thread.
The rate at which technology and science is being developed will keep accelerating.
In 20 years time:
Mars will have hundreds of people trying to establish a colony due to the advancements in technology which have taken place over the last 50 years.
3D printing ... do I even need to go there?
Quantum computers will not see mass adoption, due to their niche applications, but there will be a lot of development, especially as classical computers will need to start dealing with quantum properties.
I want to say more but I feel ashamed because I just realized that I can only be responding to what must have been bait or unsubstantiated blind conjecture.
Correct. The "easy reduction" with associated increased clock speeds has reached an end.
Symmetric multi-processing / multi-core CPUs are the current strategy for improving performance, but you often need to rewrite software to realise the full performance gain.
Because we are reaching the limit of what we can currently do with electronic transistors. We will hit the limit of how small we can possibly make them (and thus how many we can pack onto a chip) within 5-10 years, although the price may continue to drop for a while after that. Further performance can be gained with parallel processing, but that is often very complex and requires a lot of rewritten code for shit that's already out there, or a lot more skill and discipline on the part of software developers than they currently have.
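The "rewritten code" point is concrete: a serial loop doesn't get any faster just because more cores exist. A minimal sketch (the workload function is a made-up stand-in for any CPU-bound task):

```python
# Why multi-core gains require rewriting: the serial version uses one core
# no matter how many exist; the process-pool version can use them all.
from concurrent.futures import ProcessPoolExecutor

def work(n):
    return sum(i * i for i in range(n))  # stand-in for a CPU-bound task

def serial(jobs):
    return [work(n) for n in jobs]       # one core, always

def parallel(jobs):
    with ProcessPoolExecutor() as pool:  # defaults to one worker per core
        return list(pool.map(work, jobs))

if __name__ == "__main__":
    jobs = [50_000] * 8
    assert serial(jobs) == parallel(jobs)  # same result, different scaling
```

Even this toy rewrite drags in process pools, pickling of arguments, and a main-module guard; retrofitting that onto a large existing codebase is exactly the "complex" part.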
It doesn't necessarily mean Moore's law is dead, since there are alternatives to transistors and the computers of today under which it could continue. However, unlike in the past, it's going to require a major replacement of existing infrastructure, and probably for a lot of industries it simply won't be worth the investment until the benefits become obvious.
What I think we'll see is a 10-year plateau of "Moore's law" for everyday electronics before some company gets the balls to take the initiative on the next computer hardware paradigm.
kek. This is the mark of an uneducated and unimaginative mind.
Sure, with the way things are going right now, we're pretty well fucked on the exciting new tech front. That's because, as the world stands right now, a lot of the major powers are working on humanitarian technologies to get the rest of the world caught up. Look at Facebook's free internet thing in india, or Google's wind-power generating blimps with Wi-fi, or NYC's plans to supply public wi-fi. Right now we're in an equalising stage. The goal isn't to advance technology so much as it is to spread it. That's not to say that there aren't new technologies being developed, but you can only discover electricity once.
And you're completely ignoring advances in software development. Let's focus on just the medical field, yeah? There are numerous examples of advances that have the potential to save lives, reduce the cost of medical care, and by extension spread it to parts of the world that need it. Look at IBM's Watson. While it may seem like a gimmick now, with a skyrocketing world population, especially in parts of the world with limited medical care, it makes complete sense to train humanitarian workers as nurses and have Watson do the actual doctoring. Now you say, "What about surgery?" Enter the da Vinci Surgical System. That shit can peel the skin off a grape, and sure it requires an operator, but hooking that motherfucker up to the internet and using surgeons in the developed world would be trivial.
Hey, now let's look at the rest of modern robotics. Look at companies like Boston Dynamics and iRobot, and universities like Carnegie Mellon and MIT, and all the others I can't name off the top of my head. They're all doing brand new stuff that's never been seen before. Exoskeletons for soldiers, autonomous robotic pack-horses (pic related), humanoid robots for natural disaster recovery efforts; there are new developments in the field of robotics all the time.
Now let's talk about space travel. NASA and the Russians aren't the only ones going to space, and they're not even the best. Look at SpaceX's Falcon 9 launch vehicle. That shit lands on its tail, requiring only refueling and loading of payload. That's the main stage, by far the most expensive, being made nearly 100% reusable. Then you've got companies like Virgin Galactic that want to make space tourism a reality, and throw in a few harebrained ideas like Buzz Aldrin's StarBooster, and you've got the makings of a new golden age of space exploration and commercialization, which will in turn drive more innovation. You think if VG's SpaceShip stuff becomes a success, which it easily could, that they won't pursue more avenues of bringing tourists to space? And Elon Musk has already said he eventually wants to be mining asteroids and colonizing Mars in the next few decades, and who says that won't inspire new technologies? Just the other day I recall reading an article about experiments in tractor beams using a special projection pattern of light, and there are of course NASA's plans to lasso a fucking asteroid and bring it into Earth orbit for study.
In short, if you don't think that new developments in technology are even possible, you're an idiot speaking out of your ass with literally no idea what the fuck you're talking about. This is probably just fucking bait anyway, but this really pissed me the fuck off.
Do you realize how retarded you sound?
You're literally saying that smart phones prove technology isn't advancing, because rather than stick with the old retarded design, they advanced technology.
You're an idiot.
But to make such a blanket statement is the same as saying we would have flying cars in 2015. One thing I think is interesting is that while, yes, we probably won't have sci-fi supercomputer smartwatches, the possible diversity of computer technology now has amazing potential.
But it's painful to see all the potential the technology has and slowly realize that social and political factors most likely won't allow all this to reach anywhere near its full potential.
>I think a lot of people are going to be in for a rude awakening when in 20 years we have nothing new other than slightly faster computers and higher resolutions. Innovation is dead.
I think people underestimate the impact artificial wombs will have. That's a technology that will change our society imho.
Moore's law is genuinely stuttering (for real this time; single-core performance fell off Moore's law a long time ago, and it's proving extraordinarily difficult to economically manufacture further scalings on schedule) and will likely break down entirely in the near future.
The idea that this means the end of innovation, or even the end of massive improvements in computer technology, is ludicrous. All it means is that those improvements to computer technology won't come on an insane doubling-every-two-years timeframe, but in slow evolution and stuttering punctuated-equilibrium breakthroughs like, for instance, every other field of technology ever.
Hell, it doesn't even mean that innovation will suddenly slow elsewhere. Kurzweil's claims aside, improved computing technology wasn't even close to the main driver of innovation in other fields, and they never grew exponentially in step with it.
Also, the way chip companies were chained to Moore's law genuinely stifled a lot of innovation.
When you have to deliver a doubling every two years or be outcompeted, long-term basic research gets crushed under the mad scramble. You have to go with the safe option or you're gambling everything on being able to develop and scale unproven technology in a time horizon of less than a decade.
The end of the mad trillion-dollar scramble of engineering and research on an epic scale that Moore's law required will mean an expansion of the time horizon and a greater ability to experiment and pursue radically new designs.
Oh, and with developers no longer able to count on improved hardware fixing their software bloat, we might finally start getting actually good software. Things have changed so rapidly there hasn't really been any time to discover how to code well. As soon as the field has had enough experience to start developing that, things have changed so radically that's no longer applicable.
And again, seriously, doubling computing power never doubled other fields of science and engineering. It's been extremely helpful, but simulations aren't even close to everything. For instance, simulations of biochemistry have pretty much always been useless, and yet medicine advances.
It's called physics,
or more accurately, "known physics".
There are many limitations we are running into.
Personally I don't trust transistors smaller than 15 nm, because of all the instability I have seen working with nano materials around that range.
Just because we can build single-atom transistors does not mean they will be significantly functional, because of all the things that can go wrong. Seriously, misplace an electron and you've got a bad day.
Despite reaching these limits, which do not appear to have any solutions, we can still get a lot more out of our technology with better software and application setups. That said, we've got to learn to better use what we've got before we get more than we can handle, lest we waste it like we do.
>He actually thinks that quantum computers will be useful for general purpose computing
ITT: A bunch of neckbeards in denial of the fact that they won't see a drastic technological advancement in their lifetimes like their parents did.
Get a load of everyone literally posting meme technology:
Literally trying to market humanitarian "technology" as an advancement
>believing today's "quantum computers" will amount to anything other than oversized refrigerators that are just there to be there
>Technology is invented
>May not even get patented
>Will be used for small scale or specialized stuff to small degree
>20-40 years later
>Technology is used for practical application towards the public
>Caring about Moore's benchmark law at all
And "full performance gain" is often not possible to get. Amdahl's law is kind of a bitch.
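Amdahl's law puts a hard number on that: if a fraction p of a program can be parallelized, the speedup on n cores is 1 / ((1 - p) + p/n), so the serial fraction caps the gain no matter how many cores you throw at it. A quick sketch (the 95% figure is just an illustrative choice):

```python
# Amdahl's law: overall speedup when a fraction p of the work runs on n
# cores. The serial fraction (1 - p) bounds the speedup at 1 / (1 - p).

def amdahl_speedup(p, n):
    return 1 / ((1 - p) + p / n)

# Even with 95% of the work parallelized:
print(amdahl_speedup(0.95, 16))         # ~9.14x on 16 cores
print(amdahl_speedup(0.95, 1_000_000))  # ~20x, the asymptotic ceiling 1/(1-p)
```

So a program that is "only" 5% serial never gets more than 20x faster, which is why piling on cores is no substitute for shrinking the serial part.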
>thinks we can achieve the singularity without substantial advances in hardware
THROW A FUCKLOAD OF AWS GPU INSTANCES AT THE PROBLEM, BEOWULF ALL THE WORKSTATIONS AT THE OFFICE TOGETHER.
And then you hit memory and networking bottlenecks and cry.
No wait, shit, we were talking about parallel workloads, the picture is all wrong.
>Mars colony by 2035
How do you think we're going to get there? Our current engines aren't good enough and it's doubtful we'll see any major improvements in a few decades.
>implying carrying more computing power than the entire Apollo program in your pocket is somehow not a big deal
So, we need to start making CPUs out of uranium?
Practical ion engines are fairly new, but really only useful for probes.
>Our current engines aren't good enough and it's doubtful we'll see any major improvements in a few decades.
Well, there is that guy in Omaha trying to get a warp drive running, but he's probably going to fail.
The era of great technological, scientific and medical progress is over because we have stopped dreaming.
In 1960, when you asked people how they see the future, they spoke of flying cars, of the end of major diseases, of colonization of space.
In 1970, innovation meant space rockets, automation of daily tasks, etc.
In 2010, innovation is a Facebook application.