Is it just me or is the demand for CPU power by games currently in relative stagnation? I'm due for a new build soon, but my 2500K still handles things fine, and it's not even overclocked yet. Feels like the thing could go for another three and a half years.
There are a few games (Arma 3 and Kerbal Space Program come to mind) that are CPU-intensive, so I'd suggest picking up a 4690K on sale during Cyber Monday, just so you can have that headroom.
No reason to get an i7 though.
K8 is about the maximum one would need for gaming and other casual uses, but whether you want it or not, the bloat and marketing don't rest, so you gotta buy the latest shit or be made fun of.
I predict that by 2024 there will be no programmer capable of running Space Invaders on a Pentium 4.
Because graphically intense stuff that is optimized well has not much need of a CPU?
Why do you think you REQUIRE the game to use 100% gpu and 100% cpu power?
Why does the fact you can keep a CPU for years without any real downside seem like a bad thing to you?
No it's not just you.
But think about it, why would games use more CPU power? What is there to do?
The GPU side you can imagine for sure, "more graphics", but on the CPU side, once you have enough to run HL2-level physics, AI, pathing, and other crap algorithms, there isn't much more to do.
It's simulation type games rather than racing, shooting, whatever games that use up CPU power.
Dwarf Fortress eats CPU like a motherfucker. Flight sims, 4X games like star ruler.
Well, we'd see more of these CPU-intensive games, but you can again thank consoles for that. Weak-ass CPUs as usual.
I bought an R9 270x a couple of days ago and I can play almost anything on at least high graphics.
My CPU is older than the hills and I think I bought my motherboard like 10 years ago.
>have money to upgrade to an overkill PC
>have no need to update my 4 year old 6950/i5 2500k system
>realize I only ever played touhou and 2D games on it for a few hours
>don't even have internet friends any more because lack of games means nothing in common
that STEM feel.
>That fucking feel
Go to a friend's house; I haven't played vidya in forever, so watching him play Dark Souls is drab. End up talking to his sister, posting on the chans all night, and drinking too much.
I tried to play Dark Souls twice: when it was flavor of the month at the PC release, then again recently. I more or less bored-quit; the same thing happened with Skyrim when it was released.
I tried to play the witcher 2 but when I got about 10 hours in I felt too guilty about the time I was wasting on it and how slow it was moving. I loved the first one (which was inferior) and played it twice years ago.
That's less your specs, more Mojang being shit.
Apparently nearly every reference to a block's coordinates spawns a new Object for the position. Apparently the number of objects created triggers a full GC every 8 seconds; it's disgusting.
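Nobody here has Mojang's actual source, but the pattern being described is easy to sketch (all names below are made up for illustration): a fresh position object per block lookup produces mountains of short-lived garbage, while packing x/y/z into a single primitive long allocates nothing.

```java
// Hypothetical sketch (NOT Mojang's actual code) of the allocation pattern
// described above: if every block lookup does `new BlockPos(x, y, z)`,
// a busy tick allocates thousands of short-lived objects and the GC has
// to clean up after all of them. Packing the three coordinates into one
// primitive long avoids the allocation entirely.
class BlockPos {
    final int x, y, z;
    BlockPos(int x, int y, int z) { this.x = x; this.y = y; this.z = z; }
}

class PosPacking {
    // Pack three signed 21-bit coordinates into a single long: no object,
    // no garbage.
    static long pack(int x, int y, int z) {
        return ((long) (x & 0x1FFFFF) << 42)
             | ((long) (y & 0x1FFFFF) << 21)
             |  (long) (z & 0x1FFFFF);
    }

    // Shift the field's top bit up to bit 63, then arithmetic-shift back
    // down so the 21-bit value is sign-extended correctly.
    static int unpackX(long p) { return (int) (p <<  1 >> 43); }
    static int unpackY(long p) { return (int) (p << 22 >> 43); }
    static int unpackZ(long p) { return (int) (p << 43 >> 43); }

    public static void main(String[] args) {
        // The wasteful pattern: one throwaway object per lookup.
        for (int i = 0; i < 1000; i++) {
            BlockPos pos = new BlockPos(i, 64, -i); // garbage after this line
        }
        // The allocation-free pattern; negative coordinates round-trip too.
        long p = pack(100, -64, 3);
        System.out.println(unpackX(p) + "," + unpackY(p) + "," + unpackZ(p));
    }
}
```

The packed long also works directly as a map key, so you skip the per-lookup allocation entirely instead of leaning on the GC to keep up.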
I think most AAA games don't really use much CPU; the real demand is for graphics processing. Simulation-type games like Kerbal Space Program and Dwarf Fortress tend to be more CPU-intensive. Shadow of Mordor apparently uses a fair bit of CPU to process the activities of off-screen NPCs as part of its "Nemesis System"; so much so that the Nemesis System won't be available on the Xbox 360 port of the game.
Yeah well, your mom oinks like one.
>suggesting the 4690k over the 2500k
If you OC both, the difference is maybe 1%, not to mention the 2500K is much better suited for overclocking because it has a properly soldered heat spreader, as opposed to the newer generations of Intel CPUs.
Gaming has stagnated due to consoles.
I have an old i7 laptop with a Quadro, which is basically a 9800 GTX with extra memory, and I can actually still play A LOT of games.
It's usually on mid settings, some high on 1920x1200
I never tried Battlefield, but I can run Metro: Last Light on low settings, and Skyrim on high (AA etc. off) just fine.
That's pretty fucking pathetic if you ask me.
No, it's not just you, OP.
Most games are cross-platform and even the new consoles are like entry-level gaming PCs at best.
Muh i7 920 also still plays games fine. A bunch of upcoming high-end PC only release titles might finally make me upgrade, though. We'll see.
If you'd suggest the 4690K, you'd also have to suggest the 2550K, since both are just the lower model with an extra 100 MHz, and we all know how much of a difference that makes, especially once you overclock them. None.
Don't let ARMA be your benchmark either.
When I had an i5 760 and dual Radeon HD 6850s I got better FPS than I do on my i5 3470 and R9 270x.
And my laptop runs it not as high, but more consistently than either of them.
I've also never seen more than a few FPS of difference between highest and lowest settings, except when I was on an Intel HD 4000.
That fuckin game man.... What the fuck
I wanted to, but I used the stock cooler for a while, overclocking seemed complicated at the time, and I was happy with the performance, so I just kinda forgot about it and about ordering an aftermarket cooler.
Plus, what's a few dollars over the non-K version?
Don't be so stubborn! We're not talking about a few thousand dollars, we're talking about a few dollars!
I didn't say the 2500K would need replacing in a year's time. What I said was simply that buying the slightly pricier model of a CPU so you can overclock it and get another year out of it is already worth it.
But my bookmarks are a mess too and sometimes I come across stuff that isn't really worth bookmarking but maybe I'd like to look at it again later.
We're straying from the point though: will the current stock 4.0 GHz i5/i7 CPUs outperform my 2500K by a large margin in this regard? My 2500K is 3-year-old tech after all. Thanks.
If you want to look at it again later, that's when it qualifies for being a bookmark. There is no quality standard for them, all that matters is that you want to look at it again later.
Someone was a fucking retard at that shop then
You didn't say it, but you heavily implied it by saying "I'm gonna OC it and see if I can get another year out of it", like you're squeezing the last bit of performance out of it.
The 2500K is still an amazing chip, plenty strong for the coming years, especially when OC'd. The only difference with more modern CPUs is slightly lower power usage (though the 2500K seems to be less power-demanding than the 4670 when OC'd).
I have a 3570k and a HD7950.... I have only used it to play gta for a few hours.... Now using it to lurk on /g/ and writing some simple code... My Thinkpad would have been fine for that. I wasted about 800 YuroDollars on this PC.
Mainstream Gameplay hasn't really changed in ten years or more. Seriously. Barring some indie games, they're all doing the same shit over and over again. WoW is so much like EverQuest, it was a joke.
I think Dota is one of the only new gametypes to come out recently... and it probably has older analogues, but I can't think of them right now. Then it got copied by LoL and people eat that shit up even though it's a total knockoff. LoL is fun, though.
RTS games haven't changed much since the 90's, really.
It makes this old gamer sad.
Please stop. You obviously don't know what you are talking about.
That's how bad it is. That's exactly what I'm saying. Mainstream games for the last decade have all been rehashed shit that makes old gamers sad.
No, you don't know what you're talking about. Dota is just a rehash of AoS. Claiming that LoL is a copy of Dota is stupid, since Guinsoo is the one who revived Dota in TFT. Dota is just a dumbed-down version of the game for shit players who couldn't manage ladder games, not a new gametype.
Go back to >>>/v/.
I kinda know that feel. Got a lot out of my 680 and refuse to upgrade because I mostly lost interest in gaymes. The 680 is still good for editing and transcoding and holds up fine for gpu enhanced playback. Still pulls its weight on modern gaymes as long as AA is not maxed out. Fortunately, I got a lot of playing time out of it so it wasn't a waste.
I've been talking about this for a couple years already, and noticed this trend starting as far back as 2008. Remember how long the Q6600 and E8400 were dominating everything well beyond their expected lifespans?
I suspect the primary reason is because of the big development shift away from PC - more and more games being developed for consoles first and then ported to PC. Working with the stagnant resources of consoles means developers will essentially always be working for the lowest spec.
My 1090T is already four years old, and is still performing fine; I expect I will upgrade somewhere around 2016/2017. This longevity is something that was unheard of before.
Sure, there are a FEW PC-first titles that can stress a modern CPU, but nothing that makes me feel the need to upgrade. The only ports that bog down CPUs are unoptimized garbage and don't really have anything to do with CPU power anyway - they always perform like shit and wouldn't scale properly with a faster chip anyway.
I pretty much play nothing but Stepmania and decade-old racing games on it, plus Quake on occasion.
>tfw you bought a 2500K, the greatest processor made in the last 3 years :^)
no end in sight bros
I've pretty much given up on games at this point. I figure that if I still want to be interested in it as a hobby I'll need to make my own games, and I don't have the time to generate 100+ assets and make a functional program out of it.
sorta sucks that gaming has come down to all of this. I still enjoy programming really simple games (a la Pong, Missile Command) in C and making Arduino arcade cabinets that I put up in my garage or resell on eBay