Ray Kurzweil and "techno-futurists" utterly BTFO. Turns out it's harder to make computers faster than they had predicted.
Well, they're at the point where microprocessors are fucking tiny. There's a point where there can be no further improvement without making a transistor the size of an atom. It has to stop somewhere.
quantum mechanics will not allow miniaturization of transistor channels beyond approx <8nm. we are pretty much there now. most advances lately have come through parallelization and clever uses of architecture and compilers.
quantum computing is well on the way to bringing us into a new era though.
It's not the microprocessor, it's the width of the transistor. Transistors can act a few different ways, but in CPUs they basically act like switches. The switches are getting so small that it's extremely hard to manufacture them, and on top of that electrons have weird properties: quantum effects become a problem when the transistor channel is only a few atoms wide.
I think it HAS been posted to /g/, but this kind of stuff has become political, too. I see a lot of "well, it won't matter in a few years because we'll all be in virtual reality."
Except, maybe, we won't. Maybe that's not going to turn out quite so well for us.
While it will be hard to reduce microprocessor feature sizes under 14nm without artifacts and problems with electrons simply fucking off elsewhere, I would like to remind you that 3D lithographic structures are a possibility, and that going 2.5D or later full 3D in the near future will greatly increase the efficiency of our hardware. As examples I'd cite HBM graphics memory and Samsung's recent SSD breakthroughs. If Intel ever got around to funding its R&D for something other than hiring "more diverse" people and actually did some real science, figuring out how to make a 3D microprocessor while fixing the obvious issues (photolithographic processes, heat dissipation inside the structure, etc.), I'm pretty sure we'd see the same kind of increase in computing power as when we switched from single-core to multicore processors.
I'm not an expert tho
>implying we'll ever commercialize graphene at a reasonable enough price
>implying every hardware corp won't just sign a deal not to use it because it's more expensive and reduce their gain
Unless graphene becomes really fucking cheap and we find a way to use it in lithography, you won't see it in any microprocessor.
>muh quantum computer
We actually only have a 5-qubit working prototype, I believe, and all it did was factorize 21 into 3*7.
All the stuff you see from Google and the like are imperfect prototypes that don't take into account a lot of data regarding the electrons' positions.
Their so-called 512-qubit computer is actually not that powerful and has a limited range of tasks it can do.
Don't fall for the quantum jew. We won't have quantum computers in the next 30 years, and even then you still won't have one, because it requires a near -273°C environment to operate, and I doubt you have a fridge that powerful.
Pretty sure Kurzweil addressed this, I specifically remember these plateaus being discussed in one of his books. Progress tapers off as you approach the limits of one technology, but then a new paradigm/completely new technology comes along and exponential growth continues.
It's no secret that we're approaching the limit of silicon-based semiconductors. But all it will take is for graphene, optical processors, quantum computing, something like that to become economical and push the limits of computing far further than they are currently.
That said, the "singularity" or whatever isn't going to happen in 2030 or whenever it was that he predicted.
I think you're greatly underestimating our capacity to adapt in some ways and overestimating it in others.
Graphene is a much more costly option than simply switching to 3D structure in microprocessors.
Optical processors would require a lot of materials research, like graphene does, and in the end we still haven't found workable approaches for some of the structures.
Quantum computing is currently absolutely out of reach.
Silicon based semiconductors will stay and will be disposed on multiple layers to form 3D structures.
Not because it's the best solution, but because it's the one that requires the least R&D and will be the easiest to implement.
Techno-utopists are such a moronic bunch anyway. Their belief in forever increasing technological advances is based on the fact that we have had it in the past. Guess what, just because something has been happening in the past doesn't guarantee that it will continue forever.
Science is about figuring out laws and interrelations governing how reality works. Technology is only the application of that knowledge. Both have limits, and we are subject to diminishing returns as we figure more and more of it out.
Most fundamental science has already been discovered a long time ago (almost exclusively by genetically European people btw). Our rate of technological progress has been decreasing for some time already, people just don't notice it yet because of flashy techno junk like smartphones.
As the world as a whole hits the wall of limits to growth we might even regress to lower technological levels because high-tech is very energy-intensive and thus expensive to afford.
The reward for innovation is not there yet, our current silicon based semiconductors get the job done at a competitive cost. As soon as we hit the brick wall there will be a massive incentive to be the first company to produce the next gen processors. Don't we still have 2 silicon generations to go, roughly 4 years? 10 nm and 7 nm
Yeah, and they started putting the waste byproduct into our water supply! Electronic manufacture is water and rare earth metal intensive, with planned obsolescence, I think we are there, the ends of the means. Mass surveillance, the end of creativity and the enlightenment in general. Assholes are multiplying exponentially and resource wars brewing.
Singularity simpletons are as bad as fusion fucks.
>Science is about figuring out laws and interrelations about how reality works
what is Cern?
>Our rate of technological progress has been decreasing
said on the internet, in an age where hardware is being updated every year with a minimum 20% efficiency upgrade
>people just don't notice it yet because of flashy techno junk like smartphones
I think you don't notice we're building tons of stuff, like self-driving cars, high-capacity batteries, and nuclear fusion reactors.
>As the world as a whole hits the wall of limits to growth we might even regress to lower technological levels because high-tech is very energy-intensive and thus expensive to afford.
Again with the "we don't have enough energy/resources" line, from a fucking Belgian. I'm not surprised. If you came out of sub-France into real France you'd realize we're able to produce ~80% of our energy from nuclear reactors, which would let us maintain our society's electricity for at least a good 200 years even if we only had a portion of the world's uranium.
Infrastructure is cheaper than you think and research is cheaper than not doing it otherwise nobody would do it.
Also why would it be appealing to return to lower technological levels? Are you a Mormon?
Moore's law predicts a linear trend for the increase in processor power over time.
Basically the graph is starting to deviate downward from the very desirable Moore's prediction.
Which means our computers aren't getting better as fast as they were predicted to.
Of course, this is a slight deviation, but if the trend holds out much longer it will have serious effects on long-term computing development. Raw data is being generated by the entire human species at mind-blistering rates, and the devices we use to measure and gather the information that data comes from are getting better too.
Consider a high definition camera.
That can be considered an information gathering device. It generates data from measuring visual information.
As these cameras get better, and the better cameras get used by more people, the computational power required to move, store, and read the data (say, buying a movie and watching it on your computer, or downloading one and watching it) also increases.
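To put rough numbers on that (a sketch with assumed figures, not from the post): even a single uncompressed 1080p stream dwarfs what consumer links and disks can handle, which is why better cameras drag storage and compute along with them.

```python
# Back-of-the-envelope video data rates. All figures below are assumptions
# for illustration: 1920x1080 pixels, 24 bits/pixel, 30 frames/sec, and a
# typical ~8 Mbit/s compressed stream.

def raw_video_rate_mbps(width, height, bits_per_pixel, fps):
    """Uncompressed video bitrate in megabits per second."""
    return width * height * bits_per_pixel * fps / 1e6

raw = raw_video_rate_mbps(1920, 1080, 24, 30)  # ~1493 Mbit/s uncompressed
compressed = 8.0                               # assumed compressed stream rate

print(f"raw: {raw:.0f} Mbit/s, ~{raw / compressed:.0f}x compression needed")
```

That two-orders-of-magnitude gap between raw sensor output and what we can move around is exactly the compute burden the post is describing.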
Basically if we want to keep being able to execute increasingly complex computational tasks, we need processing power to increase. We need our Intel chips to double their "horsepower" every year.
Or we'll hit a plateau on the usefulness of computing and we will cease to discover new technology of any kind at the rate we've been enjoying since you and I were born.
(reminder that the sheer volume of capital brought into the global economy due just to the technological explosion of the past 100 years has utterly transformed our species, but we're absolutely dependent on a sustained rate of growth to keep our line of credit open or else we all starve to death.)
This has more to do with the vast majority of software being terrible in every way.
Computer engineers can make more transistors than anyone can understand how to use and IBM has recently revealed their 7nm prototype.
supercomputers aren't going to run on RISA
The global economy depends on it.
The future of space exploration and possible economic exploitation depends on it.
The growth of every science depends on it.
Hell, if this gets bad enough it could crash the tech markets. (It's not affecting them yet, I'm assuming.)
If and when Apple's diminishing return starts kicking in and the next iPhone isn't slightly better than this one, the most valuable corporation in the world collapses like so many matchsticks. Because Apple ain't actually shit. It's working voodoo on the masses by revealing next-gen shit with slick marketing like some kind of magician. And it sells like no other in history.
When that market taps out, or markets like it tap out, the economy will feel it.
I'm not saying that there is NO progress at all anymore. I love science and technology, it's utterly fascinating, but the rate at which technology emerges that truly revolutionizes the world is decreasing significantly. The most groundbreaking and revolutionary tech all appeared a long time ago. Things like electricity, the Haber-Bosch process to produce synthetic fertilizers, or combustion engines have uplifted the standard of living for humans on this planet like nothing ever had or will. These are all relatively old already. The last decades have been pretty meagre compared to the technological revolutions of the past. That's because all the low-hanging fruit has already been picked, so to speak.
Pointing this out angers many people these days because many cling to science and technology to remain optimistic about the future.
Alright, I'm just going to cite a few things which will revolutionize the world:
>Fusion reactor (building one in my country, fuck yeah)
>3D microprocessors and memory
All of these things will become a reality before 2030 and will most likely impact at least the upper-middle class.
Also we might need a bioethics thread about genetic engineering on /pol/. Would like to see the general consensus.
>Pointing this out angers many people these days because many cling to science and technology to remain optimistic about the future.
The cult of technology was always meant to distract people from their worsening lives and the decay of society.
While this is true, science and technology are a path to self-improvement and the improvement of society, but like every powerful sword, it has two edges.
It doesn't liberate us from any need for societal order or cultural control; instead it gives us tools to help us in those tasks, or to hinder our progress.
In the end, technology will either be the tool of our suicide or the tool with which we build ourselves and the world around us.
>but we're absolutely dependent on a sustained rate of growth to keep our line of credit open or else we all starve to death.)
Tech and Trade are both shutting down, and our entire economies have been built around these two sectors. Meanwhile, all other forms of economic progress (wage growth, government investment, financial reforms) have been sacrificed on the altar of Reaganomics.
The 21st century is ending not even two decades in. This is the 'big' picture that's going to blindside the world.
Good. The only institutions that need supercomputers are the tyrannical regime governments in Washington DC
The last thing these treasonous cocksuckers need is more computer power to further erode and violate our god given Constitutional Rights.
>Turns out it's harder to make computers faster than they had predicted.
The reason they're having trouble right now is that the manufacturing limits of silicon are being reached.
You're getting inconsistent yields on chip designs, and you're going to face quantum tunneling within a few years of shrinking the dies (basically, the transistors get so small they experience quantum effects). Add to it that there's little real competition happening right now (Intel rules the roost and is only threatened by mobile chip designers), and we're standing around waiting for the next breakthrough in design.
You have quantum computers (that won't be going mainstream anytime soon because of the need for a completely new kernel design).
You have new materials (various forms of carbon nanotubes have been proposed to deal with the die shrink issues).
Or it might just be that computing will have to halt on shrinking and start expanding with multiple cores even further. The problem there being that most programs aren't written to take full use of more than one core, even now after years of multicore computing.
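The multicore wall described above has a standard formalization, Amdahl's law: if some fraction of a program is stubbornly serial, adding cores stops paying off, no matter how many you throw at it. A minimal sketch with assumed workload numbers:

```python
def amdahl_speedup(parallel_fraction, cores):
    """Upper bound on speedup when only part of a program parallelizes (Amdahl's law)."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# Even with 95% of the work parallelizable, extra cores flatten out fast:
for cores in (2, 4, 18, 1000):
    print(cores, "cores ->", round(amdahl_speedup(0.95, cores), 2), "x")
# the serial 5% caps the speedup at 1/0.05 = 20x, however many cores you buy
```

This is why "just add cores" can't substitute for single-core improvement once the serial portion of typical software dominates.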
the time horizon is too short to confirm or deny moore's law
your may as well say the S&P tanked over the last couple of months so the economy is BTFO
>Saudi Arabia run out of oil or raise prices
>while financing terrorists groups
>Suddenly America decides they need Democracy, because they have too much army, their GDP is too high while China and India stopped being third-world shitholes, and America needs the bucks to make sure they stay #1.
>Suddenly America attacks Saudi Arabia over "oil prices" when Saudi Arabia's oil reserves are low
>proceeds to rape their army and steal all their crazy luxuries and resources
>suddenly America is #1 again
And that's just one likely scenario. Resources from space won't come in that fast in large quantities; good luck getting an asteroid back to Earth in one piece before 2050 at least.
>Moore's law predicts a linear trend for the increase in processor power over time.
Nitpick: Moore's law was not a linear trend prediction, nor was it about "processor power"; it was about the number of transistors.
In any case it's true that lately processor advancements have not been nearly as exciting as they were in times past. Today's server processors don't execute *single-core* tasks significantly faster than the high-end server processors of 6 years ago. But, you get 18 cores now instead of say 4.
Which suggests we may be approaching the limits of what is practical / economical with current CMOS designs.
For years specialized computations have been offloaded to GPUs and FPGAs because processors are too general-purpose.
Good day to you.
>Also we might need a bioethics thread about genetic engineering on /pol/. Would like to see the general consensus.
Pretty sure the consensus will be "this is a good thing".
/pol/ has an anti-GMO, anti-vaccine, anti-science branch, but they're not that big. Genetic engineering is the promise of eugenics in the space of one generation. It's the big leap in medicine for the future; preventing a majority of problems from existing in the first place.
With gene therapy we may even be able to help people already alive (this was always the big funding issue, people already alive don't wanna spend big bucks helping the unborn).
Of course the effects will be immense. Will genetically enhanced kids be allowed to compete in sports? How could you even tell they'd been manipulated?
Further, rest assured it's already happening. The technology exists, all you need is the (lack of) medical ethics to do it on humans.
What you're not recognizing is that people not needing better computers doesn't mean we can't make better computers.
When people had 256MB HDDs, they also saw the quick growth that came with it because 256MB wasn't enough. No one needs toasters that can do calculus. No one needs desktops that can process data centers of information.
This isn't true, though. IBM has prototyped 7nm transistors. There are rumors of 5nm transistors.
And none of that matters anyway. Software engineers can't take advantage of most of the area on a chip because it's just too dense.
I actually have to concede on this point, though, senpai.
With finance, the writing is on the wall, because all of the trends are clear-cut. But in tech, it's a lot harder to predict what could happen.
I don't know what this is, but maybe it could be corrected with an invention like this. >>61218388 Or maybe a new 'innovation' could emerge. >>61218400
But it does affect the price of oil, so it indirectly controls the calculus.
Jew Intel says Moore's Law won't end until they hit 10nm or maybe even 7nm chips.
But they said they will be switching to some kind of gas shit instead of the silicon they use now to get past the Moore's Law limit and hit 7nm and 5nm chips by 2025.
how the fuck is he bernie?
Giving all of america's capital to the chinese, the saudis and the UN/world bank hasn't got us anywhere.
I'm libertarian as fuck but pro-domestic labor.
I say wage growth, government investment, financial reforms, subsidized college education, and limited immigration.
Abolish all personal income tax at any level, put in place a moderate cap-gains tax for UHNW individuals, and tax the living shit out of corporations that utilize foreign labor or skirt environmental regulations.
All taxes should be on private property by the county only, state level on corporate revenue then federal level on corporate profits, dividends, and retained earnings.
Capitalists like myself pay their taxes indirectly through their ownership in corporations.
Yeah, but we need to go 3D with CPUs too.
HBM is merely 2.5D also.
HBM2 was announced to be full-blown 3D though, and also comes with an enormous upgrade in bandwidth.
NAND seriously needs to be adapted for L3 and L2 storage too.
>google officially has a Quantum computer
NO NO NO.
they have an incomplete quantum computer; it doesn't take into account all the positions of the electrons.
The biggest achievement of a real quantum computer so far is factorizing 21.
Their pseudo-512-qubit machine is an insult to quantum computing.
Around 300 qubits would be enough to compute everything since the Big Bang.
512 qubits, given the exponential growth, would be way overkill if it were real.
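Whatever one makes of the "300 qubits" claim, the arithmetic behind the exponential-growth point is easy to check: a register of n qubits spans 2**n basis states, and 2**300 already exceeds the commonly cited ~10**80 atoms in the observable universe.

```python
# n qubits span 2**n basis states. Python's arbitrary-precision integers
# make the comparison exact. 10**80 is the usual order-of-magnitude estimate
# for atoms in the observable universe (an assumption, not a measured value).
n = 300
states = 2 ** n
atoms_in_universe = 10 ** 80

print(states > atoms_in_universe)  # True
print(len(str(states)))            # 2**300 has 91 decimal digits
```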
To add onto my point, computers are not following Moore's law at the moment because silicon semiconductors can't get any smaller without leakage. This is why carbon nanotubes are needed
See. That's the thing. I could work with someone if he had an inherent loyalty to the nation. We could disagree about policy, sure. But we are still fighting on the same team. Therefore, both of us could come up with a plan better than either of us.
Who the fuck knows what's behind these Wall Street Slimes. It's all lies and fraud. And if shit goes to hell, they'll just jump ship at the first sign of trouble.
Silicon is reaching its limits. Future advancements need new materials. The problem is that everyone is so invested in silicon, and moving away from it will cost lots of money, so no company wants to be the first one to take the big hit.
>And none of that matters anyways. Software engineers can't take advantage of most of area on a chip because it's just too dense.
Nonsense. High performance computing developers definitely are able to take advantage of what's on today's processors. They care about pipelines, cache-aware algorithms, memory latency, and all the fine details necessary when wringing out all possible performance from all cores of a processor.
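A toy illustration of one of those "fine details" (my sketch, pure Python, where the effect is muted; in C or with SIMD the gap is dramatic): a row-major 2-D array is fastest when traversed in the order it actually sits in memory.

```python
# Two traversals of the same row-major matrix. Same answer, very different
# memory access patterns: row order walks data in layout order, column order
# strides across it, which is what cache-aware code avoids.
ROWS, COLS = 100, 100
matrix = [[i * COLS + j for j in range(COLS)] for i in range(ROWS)]

def sum_row_major(m):
    total = 0
    for row in m:                # layout order: cache-friendly
        for x in row:
            total += x
    return total

def sum_col_major(m):
    total = 0
    for j in range(len(m[0])):   # jumps between rows: cache-hostile
        for i in range(len(m)):
            total += m[i][j]
    return total

assert sum_row_major(matrix) == sum_col_major(matrix)
```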
desu I would say that if you extrapolate Moore's law to technology as a whole, the trend is clear-cut: we went from being monkeys, to figuring out farming, to shitposting on smartphones, all at exponential intervals.
And that's basically what Kurzweil does in his book. If you go back 100+ years even before Moore's law, even back in the days of solid state relays, then vacuum tubes, etc. computing power has been increasing at an exponential rate. There may be slight dips and slowdowns as one technology nears its limits before a new one is introduced, but the long-term trend is pretty clear.
Yes. But technology affects the world in different ways. If computers hit their limit, then new kinds of technology will emerge.
More importantly, we "switch" between kinds of machines, as each machine suffers a decline in Marginal Utility.
Once upon a time, railroads and Menlo park was where technology was done. But now, railroads aren't particularly special anymore.
The problem with Kurzweil is that his creativity is very narrow. Technological progress may come, but it may not necessarily be from computing powers. It may be something entirely new altogether.
It's like assuming that all technology must come from Steam, because Steam has been the primary growth of tech to date.
We are so ignorant about economics, that most 'economists' aren't even aware of their own economic laws.
Wirth's Law is just Baumol's law applied to Tech.
Holy shit though. If these fuckers didn't have their billions of dollars and patents, they'd be nobodies. The more I know what Tech knows, the less I find what Tech truly knows.
Do you think the exponential appreciation behind Moore's Law is partially responsible for the lack of software optimization?
Go back to the 80s, with much more limited hardware, and people would devise ways to push that limited hardware to and even past its theoretical limits.
If high-end PC games were programmed in the old sense, they could very well run faster and on much weaker hardware.
This may be of interest to you.
See the "Theoretical Limits" section.
This sort of suggests there's a limit to how efficient processors can get.
It may be a long while yet before we run into these limits, but the laws of physics suggest there is a point where it becomes impractical to develop further.
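One such hard limit (I'm assuming it's among those the "Theoretical Limits" section covers) is Landauer's principle: erasing one bit of information costs at least k_B·T·ln(2) of energy, so computation at a given temperature can never be free.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact by the 2019 SI definition)

def landauer_limit_joules(temp_kelvin):
    """Minimum energy to erase one bit of information (Landauer's principle)."""
    return K_B * temp_kelvin * math.log(2)

# At room temperature (~300 K), erasing a bit costs at least ~2.87e-21 J.
# Real logic dissipates orders of magnitude more than this per operation,
# which is why this limit is distant but nonetheless physical.
print(f"{landauer_limit_joules(300):.3e} J per bit")
```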
FUCKITY FUCK FUCK
MOORES LAW IS ABOUT COST TO QUANITY OF TRANSISTORS OVER TIME YOU FUCKS
NOT COMPITER PERFORMANCE OVER TIME
NOT DIE SIZE OVER TIME
NOT MOST POWERFUL COMPUTER FLOPS OVER TIME
F U C K
ITS ABOUT COST YOU FUCKWITS
WE COULD HAVE PERFORMANCE STAGNATION FOR THE NEXT 10000,00000,0000 YEARS AND MOORES LAW WOULD KEEP HOLDING TRUE IF THE TRANSITORS JUST KEPT GOT CHEAPER FOR FO
Yes, but that's dealing with abstract thought that's too far beyond us at the moment. I'm impressed you brought this up. A good culture can mitigate a lot of the damage (no more man-child coders anymore.)
Just understand that software still suffers from Baumol's cost disease. Mathematically, this nixes the limits of computing.
Trains, OTOH, don't suffer from cost disease, because shipping can always improve with Capital. (It may not pay well, though. So we don't do it.)
One day, a chip is $10
The next day the same chip is $1
The chip doesn't get better
It's just cheaper
Sure, you can now have more computer operations per second with your increased number of chips
But the performance-to-transistor ratio didn't get better
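Put as arithmetic (using the post's made-up prices): performance per chip stays flat, yet compute per dollar still grows tenfold, which is the cost framing the all-caps post above insists Moore's law is actually about.

```python
# Same chip, same performance, lower price: Moore's-law-as-cost in miniature.
# ops_per_chip is an arbitrary assumed figure; the prices are the post's example.
ops_per_chip = 1_000_000
price_day1 = 10.0   # dollars
price_day2 = 1.0

ops_per_dollar_day1 = ops_per_chip / price_day1
ops_per_dollar_day2 = ops_per_chip / price_day2
print(ops_per_dollar_day2 / ops_per_dollar_day1)  # 10x compute per dollar
```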
the point is we become exponentially more productive while utilizing less resources
>all you need is the (lack of) medical ethics to do it on humans
>Not saying: all you need is the (lack of) Luddite-esque, stubborn, and stifling Judeo-Christian ethics of no concern to the Chinese
FTFY. But, in all honesty, if we don't take this as seriously as landing on the moon, the Chinese will and we will pay for stuffing the issue under the rug, likely in more ways than one.
>Moore's law predicts a linear trend
no. it predicts a doubling of the number of transistors every ~18 months. this isn't linear, it's exponential.
it's only linear if you plot the logarithm base 2.
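That's easy to check numerically (toy numbers, assuming a clean doubling every 18 months starting from roughly the Intel 4004's ~2300 transistors):

```python
import math

# Exponential in time, linear in log2 space: equal log2 steps per period.
months = [0, 18, 36, 54, 72]
transistors = [2300 * 2 ** (m / 18) for m in months]

log2_counts = [math.log2(t) for t in transistors]
steps = [round(b - a, 6) for a, b in zip(log2_counts, log2_counts[1:])]
print(steps)  # every step is 1.0: a straight line on a log2 plot
```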
That just means the current paradigm is ending for how our chips process information.
You will be BTFO when Quantum Computing comes on line.
The singularity will not be stopped.
Yes, but it isn't inevitable though.
(Also, if we think Computers are the only way we can do this, then we stagnate.)
Shitposting at ever-increasing speeds as we approach the singularity. If the computing power of the hivemind isn't strong enough, its consciousness will be unable to overcome the shitposting. Skynet will be a damaged mind from the start.
>Do you think the exponential appreciation behind Moore's Law is partially responsible for the lack of software optimization?
IMO yes and it's not hard to see why.
Search this thread for "Wirth."
Cheap processing power allows developers to be lazy. As a developer laziness can be tempting if you have more important problems to think about.
Faster processors allow unskilled developers to get away with writing inefficient algorithms, slow code, etc. because it performs quickly enough that it isn't a major nuisance.
However, our software is way more featureful now than it was before. Consider browsers.
For a laugh you could try running an old version of Netscape Navigator to see how quickly it starts up and loads webpages. However it doesn't support any of the flashy modern goodies we are used to, nor even CSS.
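A minimal example of the laziness in question (my illustration, not anything from the thread): both functions below "work", and on small inputs the quadratic one feels fine, which is exactly how slow code survives on fast hardware.

```python
def concat_naive(n):
    """O(n^2) in the worst case: each += may copy the whole string so far."""
    s = ""
    for _ in range(n):
        s += "x"
    return s

def concat_join(n):
    """O(n): build the string once with join."""
    return "".join("x" for _ in range(n))

# Identical output; wildly different scaling as n grows.
assert concat_naive(1000) == concat_join(1000)
```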
>Honestly people don't realize how huge a fusion reactor would be.
Jokes on you they are building one in my country right now.
>tritium is a limited resource
>the closest way to get some is to mine the moon.
Please best ally can you bring more space rocks next time?
so provided we don't all kill each other we will reach utopia?
sounds about right.
I have an inherent loyalty to my nation from a purely utilitarian standpoint. Which is the most stable form of loyalty anyways.
What is the best guarantee of survival for a person and their lineage? A system that keeps them and their progeny the primary benefactor of their labor and ideas.
How is this accomplished? By cultural security, economic freedom, and moral and mental development of the person and his community.
Nationalism as a near-religious ideology is stupid. Nationalism as the cultural tool humans have developed for survival is an incredible precedent.
It's like tribe or family.
You might not have the most renowned gens (family name) or most powerful tribe but it's all you have and you must honor it and do your best to bring honor and repute to it for it is where you and your progeny belong. You cannot belong to another family or tribe or nation without very grave rituals and processes that establish you as worthy and dependable. And these types of initiations are only granted for the most important of reasons.
Systems like these are why it is important to ensure that people within them have access to moderate social safety nets (I benefit from having a safety net as much as anyone else which is the reason I support it.), education (a person who has their education provided by the community will produce more for that community than was given to him. fuck giving people money to study worthless degrees. fuck hiring professors that are degenerate.) Providing workers with fair wages and systems of negotiations keeps them productive and willing participants in the economy and in culture.
Once you veer into the nihilistic morality of providing material wealth for people because they deserve it due merely to their existence you've lost any basis in reality and you will be abused by predators who mimic your moral system then gut you.
>I have an inherent loyalty to my nation from a purely utilitarian standpoint
you live on a tiny spaceship made of rock hurtling through a vast expanse of nothingness.
>What is the best guarantee of survival for a person and their lineage?
the time horizon on which evolution is significant is so great that you probably have better odds of winning the lottery (or casting the deciding vote in an election)
Perhaps it will only surface again when it's absolutely needed. Maybe in a situation where, after 15 years of continuous development on the same framework, the structure turns into a snowballed mess.
The lack of it has certainly caused some more trouble than others (Adobe and Microsoft come to mind)
I'll take your Netscape offer and raise you one Internet Explorer 3.0
I'm a Hoppe libertarian who lives in a trailer, but this is just fucking Cletus tier.
>Generating digits of pi
>Particle behavior modeling
If you can't see the industrial applications of massive processing power, you need to reevaluate the world.
>Perhaps it will only surface again when it's absolutely needed.
That's what I think will happen.
People will be lazy until throwing more hardware at the problem becomes impractical and developers are forced to use their brains.
Once in a while there are software developers who are displeased with crummy, slow software, and do something about it.
Examples that come to mind are the early versions of utorrent, and early versions of the opera browser which was touted as being lithe and snappy compared to the slow browsers of the time.
No shit and it's a spaceship filled with all sorts of terrors that the natural world has cooked up. And a lot of meaning for humans to discover for themselves and share with their fellows.
Yeah but the time horizon of human history is a lot smaller and your actions have a lot more fucking impact.
Do you want to see a world of humans that died after peaking in the 1900s or a world that makes it past the present zeitgeist'o'shit that is the postmodern world?
>Oh no! Whatever will happen to all the high IQ slackers who go to "learn to code" bootcamp after a failed run at professional gaming, learn Python, and expect to make their 250K starting?
Phrasing it like that, many potential incentives are coming to mind. So many things that exist with latency and problems would be easier to build up from scratch using modern standards and classical discipline.
Remember when Firefox became a sensation for being lightweight, fast, and stable after MS was resting on its laurels? Then when Google Chrome became a sensation for the same basic reasons Firefox did, after Firefox got bloated?
It will be interesting to see how Bare Metal OS works out. Hand-coded x64 low-level ASM in the modern era.
Moore was never wrong though.
>Moore's law (/mɔərz.ˈlɔː/) is the observation that the number of transistors in a dense integrated circuit doubles approximately every two years.
He never mentioned speedups, just doubling the transistor count. He obviously knew it would run out once you got small enough.
Well memory companies have only started to stack things vertically which is what I think will happen more so in the future but with processors. I'm thinking along the lines of a parallel processing computer that mimics the human brain so to speak
>Programmer here: a lot of programs simply cannot be parallelized. If you have to perform a calculation on the answer from a previous calculation, then you ABSOLUTELY MUST wait for the previous calculation to complete before you can move on to the next step. You cannot make such a program parallel; it doesn't even make sense.
Like let's say you wanted to compute a frame of a video game. Mario moves forward (Mario's X coordinate increases by 3) and then you have to do a collision check to see if he bumped into anything (to see if any shapes overlap or occupy the same coordinates). There is literally no way to compute the collision check until his position has been updated. More cores won't help. They can never help in a problem like this.
In other problems that do benefit from parallelization, the work is generally already being performed in parallel. It's not as bad as you make it sound. Bitcoins are mined massively parallel; so are graphics.
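The Mario example translates directly into code (coordinates and hitbox width invented for illustration): the collision check reads the result of the position update, so the two steps can never run concurrently.

```python
def update_position(x, dx):
    return x + dx                        # step 1: move Mario

def collides(x, obstacle_x, width=16):
    return abs(x - obstacle_x) < width   # step 2: needs the NEW x

mario_x = 100
mario_x = update_position(mario_x, 3)    # must complete first...
hit = collides(mario_x, 112)             # ...before this can even start
print(mario_x, hit)  # 103 True
```

No scheduler can overlap these two calls: the data dependency between them is the whole point.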