Moore's Law is dead. Chips haven't gotten faster for shit lately. It will stay dead. Prove me wrong.
Silicon can only get so hot, and current transistor gate manufacturing processes limit us greatly. We need a new material that is more heat resistant, a new way to compute data, and to rewrite all the code and shit. Bro. We are all gonna die
Aren't IBM and HP working on a new type of computer that basically imitates the brain? Just stacking a whole bunch of processing chips into a cube or whatever the fuck, and apparently it's really power efficient
>But the old way — the old promise — of a perpetually improving technology stretching into infinity? That’s gone. And no one seriously thinks graphene, III-V semiconductors, or carbon nanotubes are going to bring it back, even if those technologies eventually become common.
>He thinks moore's law is about processors getting faster.
It's just about the count of transistors, which does not equal the "speed" of the processor.
p.s. I bet you're one of those kids who say they have a 10.8GHz CPU just because it says 4x2.7GHz.
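The distinction can be sketched in a few lines. The 4004 transistor count and the two-year doubling period below are the usual textbook round numbers; the projection is illustrative, not an exact die count:

```python
# Moore's law is about transistor count, not clock speed.
# Starting figure (~2,300 transistors on the 1971 Intel 4004) is a
# round number for illustration.

def moore_projection(start_count, years, doubling_period=2):
    """Project transistor count doubling every `doubling_period` years."""
    return start_count * 2 ** (years / doubling_period)

# Project 40 years forward from the 4004:
projected = moore_projection(2_300, 2011 - 1971)
print(f"{projected:,.0f}")  # on the order of a billion, in the same
                            # ballpark as real 2011-era desktop CPUs

# The "4 x 2.7 GHz = 10.8 GHz" fallacy: four cores at 2.7 GHz still run
# any single thread at 2.7 GHz; clocks don't add across cores.
cores, clock_ghz = 4, 2.7
single_thread_ghz = clock_ghz  # NOT cores * clock_ghz
print(single_thread_ghz)
```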
no. it's the marketing demand.
evolution is stalling because the need for CPU resources is stagnating. Tablets and smartphones are what's selling, and they require small chips, not real chips. it's technology that is the equivalent of a ten-year-old laptop/office desktop, and even...
there's no real need for insane CPUs, since most of what's done with PCs relies on the internet connection. the only recent thing Intel brought was just rebranding the core2quad into i5, i7... AMD doesn't care much, they're just continuing to sell.
Most of high performance computing's demand is basically servers. AMD makes cheap Opterons, which Cray fills entire boxes with to make supercomputers. Of course we still have Fujitsu, NEC, SGI and IBM and a few brands that make monstrous processors for vector machines (although even SGI are filled with Xeons now). add a few custom coprocessors like a rack of nVidia Teslas and you've got something perfect for scientific simulation.
There's nothing requiring real processing power in terms of software, even if we wouldn't mind a few more gigaflop/s on our machines to improve a few things. GPUs do a lot of the work for all 3D tasks.
the cancer lies in the cheap, so-called "green computers": macs, all-in-one desktops, touch screens running windows8. the goal of course is to destroy personal computing. People being equipped with tablets or other terminals is the goal of this market-controlled technological degeneracy.
since it's better for those who aim to control the internet to not have you owning your own personal storage device. many faggots claim the cloud is the fucking future; these people are part of the problem. using ipads/chromebooks/macbooks, things you buy and throw away after the battery no longer holds a charge (2 years max)
the goal is to destroy anonymity, privacy, and the possibility for everyone to create, hack, and keep all kinds of downloaded movies and media at home, including all kinds of illegal books and docs.
I understood fine.
What was the point of the anon you're referring to being correct? It could be assumed that anon was being sarcastic.
You could also interpret what that person said as incorrect, since they might have been trying to say that silicone and silicon are the same thing.
Silicone != Silicon
Jesus fuck, trying to follow this train of thought is harder than undoing my bro's box of cables.
So, let me get this straight: the one anon said that silicon providers can now focus on providing silicon to make implants, correct? Silicone contains silicon. Silicon doesn't come from thin air; it has to be provided by someone.
It seems like they're actually making headway with tangible gains, though. To me it looks like it shouldn't be too far off.
I know rite? I dunno why the IDs that /b/tards get are omitted everywhere else. Sure, less shitposting elsewhere, so I suppose they wanna roll with the full anonymity thing, but it does get confusing if you don't know who the heck you're even supposed to be arguing with.
It is just anglotards who decided to not call it silisium
The focus is on making them more power efficient.
Which does have the side effect of making them cheaper as processes mature.
But the overall goal is battery life, not cheapness of silicon.
It is physically impossible to make a functional transistor out of silicon that is smaller than about 5 nanometres because of leakage and the uncertainty principle.
The smaller it gets, the harder it is. Cutting-edge chips are currently at about 14 nanometres. We are very close to the end, and each nanometre is more expensive than the last.
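Rough arithmetic on why each shrink gets harder, assuming the classic ~0.7x linear scaling per full node; the node labels below are illustrative, not real foundry figures:

```python
# Back-of-the-envelope node scaling: a "full shrink" multiplies linear
# feature size by ~0.7, so area shrinks by ~0.49 and transistor density
# roughly doubles per step. Starting node is a hypothetical 14-class
# process; these are not actual foundry numbers.

node = 14.0     # feature size (nm), illustrative
density = 1.0   # relative transistor density
for step in range(3):
    node *= 0.7              # linear shrink
    density /= 0.7 ** 2      # area factor 0.49 -> ~2x density
    print(f"~{node:.1f} nm: ~{density:.1f}x density")
    # density roughly doubles each step while per-wafer cost climbs
```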
That's because people are dipshits and they just want them so they can think they're cool like everyone else, when really they just spent $250 to look like a complete retarded sellout with no concept of the value of money. It's a stupid trend just like the rubber band animals; it will pass when people realize how stupid they are.
But the products themselves are cheaper. Don't compare processors with Beats; it makes you look like a retard to compare products that can be wholly quantified against those that are 90% qualitative.
Apple's entire success is based on making their products depend on qualitative properties rather than quantitative ones.
Before apple, computers were sold with gigs and bytes. After, they're sold with 'style, thinness, it just werks', etc. See what I mean?
Anyway processors can be objectively quantified in terms of performance so competition can actually send the prices down.
Because how are you supposed to make money if no-one buys your products because they're more expensive?
Well you give retail discounts to people who work in retail, so they are indoctrinated into selling intel.
And by marketing intel as a brand. And by paying OEMs to not use AMD chips etc.
Business practices aside, if Intel had an actual competitor things would be different (they are actually competing with ARM in the mobile space at the moment, so there they have real competition instead of the sincere but lackluster AMD)
Mobile space processing is competitive.
And this is why low power processors are the future. don't expect great performance leaps until a change in substrate.
There will eventually be an end, of course. You can't pack an entire computer into the size of a pearl; it's just not practical. However, each nanometre they shave off will increase speed more than the last, so it's slightly exponential. I can't see silicon-based transistors taking processors past 10GHz tho.
Look at the GPU space. Hardware accelerated decoding. Problem is you're doing 10bit. If you were doing 8bit, the hardware acceleration on the iGPU would kick in and make it smooth. The actual processing for that sort of thing is cheap so long as the processor is dedicated to that task. CPUs are general processors.
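Some rough arithmetic on why software 10-bit decode hurts: the decoded output is only ~25% more data than 8-bit, but without the fixed-function path the whole codec runs on the CPU. The figures below are raw 4:2:0 output rates only, an illustration; real decoder load depends on codec complexity, not just bits:

```python
# Raw decoded-frame throughput for video playback; illustrative only.
# 4:2:0 chroma subsampling -> 1.5 samples per pixel on average.

def raw_bandwidth_mb_s(width, height, fps, bits_per_sample,
                       samples_per_px=1.5):
    """Decoded-frame output rate in MB/s for 4:2:0 video."""
    bits = width * height * fps * samples_per_px * bits_per_sample
    return bits / 8 / 1e6

# 1080p24, 8-bit vs 10-bit:
print(round(raw_bandwidth_mb_s(1920, 1080, 24, 8), 1))   # ~74.6 MB/s
print(round(raw_bandwidth_mb_s(1920, 1080, 24, 10), 1))  # ~93.3 MB/s
```

The gap looks small, but fixed-function decoders of the era only handled 8-bit profiles, so 10-bit fell back to a general-purpose CPU doing every stage in software.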
>basically the same elemental composition
No they're not. Graphene is more like Graphite than like diamond.
Does anyone else remember the run from the 386 to the Pentium 4 being like HOLY SHIIIIIIIIT
Every 12 months your $3,000 desktop was an obsolete piece of shit and the new hotness was like 10x as fast.
I kind of miss it now even if prices are better.
they are designed not to be repaired.
You can fix a good old PC; it will never break, and it will still access the internet as long as wifi or TCP/IP technology exists. but you cannot fix a sandwich of LCD display, chips on a plate, and battery. Soon these will be sealed in plastic and not even have screws, which is already the case for a lot of apple products, cheap chinese screens and printers. Things ARE designed to break.
Things are designed to move people to the cloud and destroy personal computing and all the things people like us used to be able to do. The coming generation will probably not even be able to use a proper keyboard, or bypass a simple DRM. This is idiocratic as fuck!
That's their problem. You act like a more 'repair friendly' PC would be of any use to them.
>they are designed not to be repaired.
No, they're just not designed to be EASILY repaired by end users. I've had no problem replacing batteries in unibody MacBooks, but that's because I'm not an idiot. Do I wish it was as simple as in the older Apple laptops? Absolutely. But Apple went with the internal battery for one reason or another, probably due to 'MUH THIN'.
Stagnating under an average of 4GHz since 2004.
Silicon still has a bright future before quantum tunneling and other electronic weirdness starts to have any effect; we're still far from the absolute limit, for this material at least. and there's plenty of room for larger chips, just on the surface of a few inches of LGA. . .
The limitation of technology is a made-up lie; it's just a mass consumption system. If we had computers built to last, like the thinkpads or HP elitebooks, or even a Dell Vostro, we would keep our computers for like 5, 7, 10 years, and they would not be that obsolete as long as we keep them for watching our movies, writing, working.
We have the technology though.
>We can make the same old shit barely any faster and people will still buy it at almost the same rates!
Meanwhile my 2600k is still more than adequate for just about everything. :^)
>Exactly. It's easy to replace shit on Mac books. iPhones aren't that hard either.
Okay, so what's your point?
>But if you were smart you'd buy your own parts and build your own pc though.
Yea, I'll get right on building my own laptops.
Games make you buy like one GPU per year now, despite the fact that, if the developers wanted to make an effort, even the most modern games would only need 2GB of RAM, a dual-core 2GHz and a 9600GT. I think PC gamers and professionals will prefer to have their software on a decent battlestation rather than a cloud-based, disposable device.
Of course someone will still be able to fix it, to hack newer stuff, but everything will be made to discourage such activity. In the 1980s, when you bought a hifi, they offered you the diagrams for maintenance in case something failed. My 80's Marantz still works like a charm. I can't say the same about my phone. We really have to fix this system.
>In the 1980s, when you bought a hifi, they offered you the diagrams for maintenance, in case something failed.
What's your point? We have SMDs, now. Can't repair those at home. Audio equipment isn't computers, either.
Cause there's no practical use for more CPU processing power in general usage.
I suspect even older CPUs from the DDR1, or even SDRAM, era would work fine if they were provided with a sufficient memory pool and bandwidth.
With GPUs it's a bit of a different story, since incompetence and terribly inefficient 3D engines are common and need brute-forcing if you intend to play games at maximum (which is silly and very expensive imo).
This is good news, my haswell i3 should be good for a long time to come. All I need to do is buy an nvidia card and 4 more GB of ram then I'll be set for 10 years.
The drawback is I doubt that computers will be as intelligent as humans by 2020 like was expected.
bro, computers from the eighties still work fine today; the manufacturing is the same since chips are sealed. Everything manufactured decades ago will certainly still work today, while shit you buy today is designed to break.
if you bought a phone recently that's made in China, don't expect it to work in a couple of years, while a Nokia, Motorola or Blackberry will still be operational.
anyone got actual stats for the last 4 years of CPUs and their power? It definitely has not doubled twice. My ye olde i5 2500k has barely been trumped by the most extreme-end Intel ones.
for its price range it got what, a 50% increase in speed over these 4 years? Barely 25% per 2 years?
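The "barely 25% per 2 years" guess is easy to check as compound growth; the 1.5x total is the poster's own rough estimate, not a benchmark:

```python
# Convert a total speedup over N years into compound per-period rates.
# The 1.5x / 4-year figure is an assumption taken from the post above.

total_gain = 1.5   # 50% faster after 4 years (assumed)
years = 4
per_year = total_gain ** (1 / years)   # per-year growth factor
per_2yr = total_gain ** (2 / years)    # per-2-year growth factor
print(f"{(per_year - 1) * 100:.1f}% per year")    # ~10.7%
print(f"{(per_2yr - 1) * 100:.1f}% per 2 years")  # ~22.5%
# Moore's-law doubling would be +100% per 2 years, so this is way off pace.
```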
Look, I've even heard retards say that the CPU loses MHz over time. It's like audiophile crap. my core2quad + GTX480 still plays modern games. Win7 ultimate and everything; I can hardly find an equivalent today, so I'm not interested in upgrading.
I still prefer my 1920x1200 ASUS display over an IPS 1920x1080. And I'm completely right about it.
what? I am talking about the literal interpretation of moore's law: the doubling of transistor count every two years.
All I am asking for is accurate stats, and all I am giving is a rough estimation of what it could be.
>bro, computers from the eighties still works fine today
Do what they were designed to do? Sure. Compete with modern products? No.
>while shit you buy today is designed to break.
Go on believing that. I've got plenty of modern technology that has worked for many years and is showing no signs of stopping.
>Release cycle on the most popular products shortens to a borderline braindead degree
>New phones every year, shit always improving
>Development of applications and all that jazz focus on the new products
>Older devices become obsolete fast and completely worthless even faster.
Simple as that. "designed to break" is just shit retards spout because new stuff won't work on old stuff. That, and shit is much more delicate now and people are clumsy cunts who can't take care of their shit.