>he fell for the "you need an unlocked OC'd i5 with aftermarket cooler to run AAA games" meme
Guess what, anon: barring some very rare exceptions, games are still GPU-limited at the resolutions and sync targets that 99% of people actually play at, and they're going to stay that way.
Now stop suggesting a 4690k in every build thread.
Maybe you missed my point, anon. If you uncap the framerate, turn down all the settings, and use a godly GPU to force CPU-bound performance, you will see a big difference between CPU performance classes. That is what benchmarks show you.
But 99% of people have a 60 Hz monitor and play at 1080p or 1440p. They are completely GPU-bound.
With a 60 Hz monitor, there's no difference between rendering at 60 fps and 420 fps. You can't tell the difference (unless you allow tearing or other godawful shit, or the game is godawfully coded to read input synchronously with render, and usually not even then). You vsync to your ~60 Hz monitor and that's it. Most of the power of your i9-9999KX is completely wasted.
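To put numbers on it, here's a toy C++ loop. It's a sketch, not any engine's real code: the 4 ms of "work" is a made-up number, and sleep_until stands in for a blocking vsync'd present.

[code]
#include <chrono>
#include <cstdio>
#include <thread>

int main() {
    using clock = std::chrono::steady_clock;
    const auto frame = std::chrono::microseconds(16667); // one 60 Hz refresh interval

    auto next_vblank = clock::now() + frame;
    for (int i = 0; i < 5; ++i) {
        auto t0 = clock::now();
        // "Game logic + draw submission": pretend a fast CPU finishes in 4 ms.
        std::this_thread::sleep_for(std::chrono::milliseconds(4));
        double work_ms = std::chrono::duration<double, std::milli>(clock::now() - t0).count();

        // Stand-in for a blocking vsync'd present: nothing to do but wait.
        std::this_thread::sleep_until(next_vblank);
        next_vblank += frame;

        std::printf("frame %d: %.1f ms working, %.1f ms idle\n", i, work_ms, 16.7 - work_ms);
    }
}
[/code]

Swap in a CPU that does the same work in 2 ms and the output barely changes; the vblank wait absorbs the difference.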
Compare single-core gaming perf on the 8370/8350 to a 4690k.
The Intel is maybe 25% faster.
I have a 4670k and a six-core AMD 1090t. In games like Battlefront I don't notice any difference.
But in Banished or other games that demand massive single-core performance, I notice a huge difference.
fuck off back to /v/ fucking degenerate
>[unless] the game is godawfully coded to read input synchronously with render, and usually not even then
This is not how games are supposed to be programmed.
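For the record, the usual fix is a fixed-timestep loop: input and simulation tick at their own fixed rate and rendering just interpolates between sim states, so input sampling never depends on how long a frame takes to draw. A rough C++ sketch follows; poll_input/update/render are hypothetical placeholders, not any real engine's API.

[code]
#include <chrono>

// Hypothetical placeholders, not a real engine API:
void poll_input() {}                       // read devices once per simulation tick
void update(double dt) { (void)dt; }       // advance the simulation by dt seconds
void render(double alpha) { (void)alpha; } // draw, interpolating by alpha in [0,1)

int main() {
    using clock = std::chrono::steady_clock;
    const double dt = 1.0 / 120.0; // fixed simulation tick, independent of the display
    double accumulator = 0.0;
    auto prev = clock::now();

    for (int frame = 0; frame < 600; ++frame) { // stand-in for "while the game runs"
        auto now = clock::now();
        accumulator += std::chrono::duration<double>(now - prev).count();
        prev = now;

        // However long the last render took, input and simulation
        // catch up in fixed steps before the next frame is drawn.
        while (accumulator >= dt) {
            poll_input();
            update(dt);
            accumulator -= dt;
        }
        render(accumulator / dt);
    }
}
[/code]

A slow GPU frame just means more catch-up ticks on the next pass; the mouse still gets read 120 times a second.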
Sure, it's definitely sensible to get an Intel chip with strong IPC (an i3-4330 or something in that area); I'd never suggest an AMD chip. I'm talking about all these "i5 vs i7 for gaming" discussions and the constant 4690k recommendations. It's like people are out of touch with what games actually need.
The amount of fun you can have directly correlates to your CPU's strength.
Not the other guy, but I must agree with what you're saying. Using myself as an example (though I wouldn't say I'm an average gamer): I use an FX-6100 with a 650 Ti at 1366x768 @ 60 Hz (a TV) and have yet to come across a game I can't play at a decent framerate (>45 fps). Far Cry 3 with ultra settings and full AA stays around 50 fps or higher. I have disabilities, so not much money, and AMD was the perfect way to go for a CPU ($95 on sale or with a rebate in 2012). I guess my point is that AMD isn't bad just because they're not the fastest. I don't understand the hate, unless most people on this board are underage.
All of the AMD shills ITT, please mail all of your FX chips to me. I have a Pentium III 700, a PIII 900, an Athlon XP 3000+, an X2 5200+, some bottom-of-the-barrel A4 APU, a Sempron (I think), and a Phenom II X4 that I'll trade back.
Since they'll give exactly the same performance ("games aren't CPU bound," after all), it's an even trade. I'll make a rendering cluster out of all that power that's going to waste. I don't really know why y'all went out and paid so much for something games don't use, but I can use it, so it's my gain.
>meanwhile, at 4k............
1. Grid 2 doesn't support SSAA, you sarcastic fool.
2. What a stupid benchmark.
What you do is get a good processor for TOMORROW's games.
That, and not all games are GPU-limited anyway.
You need a 4690 to play Dwarf Fortress.
Dorf Fort wants the Anniversary Pentium clocked as high as you can get it; the game is so hilariously single-threaded that the only way to get performance, whatever CPU you have, is brute force clock speed. ARMA is the same way.
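The arithmetic behind that is Amdahl's law: if essentially none of the hot loop parallelizes, extra cores multiply by nothing and only clock speed and IPC help. A toy C++ check; the parallel fractions below are illustrative guesses, not measurements of any real game.

[code]
#include <cstdio>

// Amdahl's law: speedup on n cores when fraction p of the work parallelizes.
double amdahl(double p, int cores) {
    return 1.0 / ((1.0 - p) + p / cores);
}

int main() {
    // Illustrative guesses, not measured numbers for Dwarf Fortress or ARMA.
    for (double p : {0.0, 0.5, 0.95}) {
        std::printf("p=%.2f: 4 cores -> %.2fx, 8 cores -> %.2fx\n",
                    p, amdahl(p, 4), amdahl(p, 8));
    }
}
[/code]

At p = 0.00 you get 1.00x from any number of cores, which is why a dual-core Pentium clocked to the moon keeps up with an eight-core at stock in those games.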
>upgrade from a super old GT 640 to a GTX 960 for Christmas
>worried it's still not gonna be as beautiful as possible, but whatever, I don't have the money for a $500 card
>run Battlefront fine at a constant 60 fps
>wonder why we let shills speak