First of all, you need to understand the basic design differences between AMD and NVIDIA GPUs in each category:
NVIDIA
>fewer processors on chip
>faster core clock speed on processors
>less VRAM capacity (most of the time)
>narrower memory bus (less bandwidth)
>faster clock speed on VRAM
AMD/ATI
>more processors on chip
>slower core clock speed on processors
>greater VRAM capacity (most of the time)
>wider memory bus (more bandwidth)
>slower clock speed on VRAM
These fundamental differences in design cause the GPUs to handle rendering loads very differently, which affects game performance in ways that synthetic benchmarks will rarely show you. There are two types of rendering load in games and benchmarks: "light" load (when not all of the processors are being used) and "heavy" load (when all of the processors are busy and there aren't enough of them to handle everything currently queued, so some of the work has to wait its turn to be rendered).
As a result, when NVIDIA GPUs are under "light" load they can render faster than AMD GPUs thanks to their higher clock speeds, but under "heavy" load their performance drops off a cliff because they have too much work and too few processors to do it all. When AMD GPUs are under "light" load they render slower than NVIDIA GPUs because of their lower clock speeds, but under "heavy" load they render faster because they have more processors to share the work.
So an NVIDIA GPU's "maximum" framerate will usually be higher than an equivalent AMD GPU's, but its "minimum" framerate will usually be much lower. This is where the deception comes into play: "maximum FPS", "average FPS" and synthetic benchmark "scores" are all meaningless to game performance. The only number that truly gauges the power of a GPU for gaming is the "minimum FPS", which is how fast the card renders frames when it is under the heaviest possible load.
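To see how an average disguises heavy-load dips, here's a quick sketch with made-up frame times (the numbers are hypothetical, not taken from any real benchmark):

```python
# Toy illustration: min/avg/max FPS computed from per-frame render times.
# The frame times are invented to show how a healthy-looking average
# can hide the heavy-load dips you actually feel as stutter.

frame_times_ms = [8, 8, 9, 8, 9, 8, 33, 34, 8, 9]  # two "heavy load" spikes

fps_per_frame = [1000.0 / t for t in frame_times_ms]
avg_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)  # true average
max_fps = max(fps_per_frame)
min_fps = min(fps_per_frame)

print(f"max: {max_fps:.1f} FPS")   # light-load peak, ~125
print(f"avg: {avg_fps:.1f} FPS")   # looks comfortable, ~75
print(f"min: {min_fps:.1f} FPS")   # the stutter you feel, ~29
```

Eight of the ten frames render fast, so the average sits near 75 FPS even though the two heavy frames dip under 30.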
>>46332907 For this example I will compare an NVIDIA GTX 970 ($340 USD) and an AMD R9 290 ($250 USD) using the "Valley" benchmark, which is well known for being biased towards NVIDIA GPUs. The benchmark settings are 1080p resolution with all graphics settings and filters set to maximum.
First, the specs of the 970: it's a Zotac model with an extreme overclock of 1455MHz/1955MHz (core/memory), paired with an i5-4690K OC'ed to 4.2GHz (so no CPU bottleneck can occur).
The result: "maximum" 121.5 FPS, "average" 63.5 FPS, "minimum" 30.1 FPS, and a total benchmark score of 2658.
Now for the R9 290: a Sapphire Tri-X model running at non-OC "stock" speeds of 1000MHz/1300MHz (core/memory), with the same i5-4690K running at 4.2GHz.
The result: "maximum" 116.2 FPS, "average" 60.9 FPS, "minimum" 29.2 FPS, and a total benchmark score of 2546.
These results should be astonishing to most of you. Keep in mind I chose Valley because it has a performance bias IN FAVOR of NVIDIA GPUs. Yes, the "maximum" is higher, but even at that extreme overclock the GTX 970 only just barely beats the minimum framerate of the AMD card running at STOCK SPEED (no overclock) when the cards are under "heavy" load. Did I mention the R9 290 is almost $100 cheaper than the GTX 970?
The "score" is based on averages just like the "average" FPS. Both have a severe bias based on Nividias higher "maxmimum" framerates when under "light" load and as a result they help disguise the much lower "heavy" load performance.
I'm saving dosh for a 380/380X and a 2560x1600 setup; for now I'm on a 750 Ti.
I don't see any point in upgrading my current setup without going dual monitor/higher resolution, unless devs finally start dropping support for last-gen consoles and really crank up the grafficks, or TW3 really turns out to be that brutally demanding.
>>93031215 As someone who had a 970: yes. It does what I need it to do. The only reason I'm concerned about their little scandal is so I can try to finagle my way into a 980. However, the 970 plays all current games on ultra at 1080p.
>>46332926 Why is "heavy" load performance important? Have you ever experienced screen tearing? Have you ever used V-sync to stop it, only to get massive lag and/or stuttering with V-sync on? Screen tearing occurs when the timing of frames being rendered is out of sync with the refresh rate of your monitor/TV, so when the screen refreshes it ends up drawing part of a new frame over an old one, making the image tear between the two. What "vertical sync" or "V-sync" does is hold the screen's refresh until the next frame is fully rendered. This solution can work flawlessly or horribly; it depends entirely on how erratic your GPU's rendering speed is (min to max). How severe the tearing is without V-sync enabled depends on that fluctuation as well.
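A toy way to see the link between erratic frame pacing and tearing (all numbers here are invented for illustration): without V-sync, a frame swap that lands partway through a 60 Hz scanout splits the screen between two frames.

```python
# Toy model: without V-sync, a frame swap that lands mid-scanout tears
# the image. Steady frame times land swaps near refresh boundaries;
# erratic min-max swings scatter them everywhere.

REFRESH_MS = 1000.0 / 60.0  # one 60 Hz scanout takes ~16.67 ms

def count_tears(frame_times_ms):
    """Count frames whose swap lands mid-scanout instead of on a boundary."""
    tears = 0
    t = 0.0
    for ft in frame_times_ms:
        t += ft  # moment this frame finishes rendering and is swapped in
        offset = t % REFRESH_MS
        # distance from the nearest refresh boundary; >0.5 ms counts as a tear
        if min(offset, REFRESH_MS - offset) > 0.5:
            tears += 1
    return tears

steady = [REFRESH_MS] * 12            # consistent pacing, swaps on boundaries
erratic = [8, 30, 9, 28, 10, 25] * 2  # big light/heavy swings, same frame count

print("steady pacing: ", count_tears(steady), "tears")
print("erratic pacing:", count_tears(erratic), "tears")
```

The steady sequence produces no tears, while the erratic one tears on nearly every frame, which is the consistency argument in miniature.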
So by having higher "heavy" load performance and lower "light" load performance, the rendering rate of AMD GPUs is much more consistent overall in games. This means that without V-sync the screen tearing will be less severe (though it will still exist), and with V-sync enabled you will be able to run games at higher graphics settings than NVIDIA with less lag/stuttering.
What it all boils down to is that if NVIDIA made cards with the same number of processors, the same amount of VRAM and the same memory bus width, they would be the faster and objectively "better" cards overall. Instead, NVIDIA cuts corners in their design, pockets the money saved in manufacturing costs, increases clock speeds to make up the difference, and then markets the product with irrelevant selling points like "LESS POWER CONSUMPTION!" (because it has fewer processors and less memory/bandwidth) and "LESS HEAT OUTPUT!" (same reason).
>>46332954 I can imagine that with current graphics capability, but what about in 3 or 4 years? I know full-on futureproofing is bullshit, but if I'm gonna drop a lot on a new card I'd like to hold off on needing to upgrade as long as possible, and even that 0.5GB of VRAM might have a chance to fuck me up in the long run.
>>46332953 >Why is "heavy" load performance important? Heavy load is by far the most important time to have a good card. If it's light load you'll be getting good frames anyways, and it will almost always be times that you don't need the extra frames. When you have a lot of action going on is when you need frames.
I've got the cheaper Gigabyte 970, overclocked to 1.4GHz. No game has ever pushed the temps above 67C with V-sync on. Even running benchmarks for an hour didn't get it hotter than 70C. You must have shitty airflow or something; 80C seems way high for a 970, even with a heavy OC.
>>46333366 The main difference between the two Gigabyte 970s is the PCB design. There are 4 heatpipes on the G1 and only 2 on the other. I want to say the 2 are bigger than the 4 you get with the G1, but I can't remember, so I could be wrong. Temps are a little higher than you'd have with the G1 but nothing extreme, maybe 3 or 4 degrees. Oh, and as should be obvious, the G1 has a backplate too. I've had mine since October and it isn't sagging at all despite its ridiculous length, so I don't mind not having a backplate. Anyway, that's why his temps seem odd to me: I have the comparatively hotter 970 yet my temps are significantly lower than his.
>>46333450 I forgot to mention, performance between the two cards is the same. The G1 comes with a higher factory overclock, but you can and should manually overclock them both anyway, so that's a non-issue. The only real difference is the cooling design, and as I said it only comes out to a few degrees since 970s run pretty cool no matter what.
>>46333459 >>46333349 Also this is at 1.45GHz core clock and 4GHz mem clock (constant, seems like Dying Light induces Boost Mode 24/7 whereas Valley benchmark doesn't). Thankfully the temp doesn't go above 80C and the fans stay at 55%, not loud at all.
If you're just going to do 1080p gaming, look into a 960, 270(x) or 280(x). You will save a lot of money and accomplish the same thing. 970 is aimed at mid-high 1440p gaming, so you're basically wasting your money if you're going to do it just for 1080p. That money you save can be put towards the next build you do because 1080p is going to be shit on by 4k in the coming years for which the 970 is grossly underequipped with its 3.5GB. Also, I would recommend waiting until Spring if you can because prices are going to drop once the R9 3xx cards come out, even if you don't buy one of them.
>>46332852 Y-you can, but some games might need you to dial back shadows, textures, and AA. The R9 290(X) is a worthy card and a step in the right direction, while the 980 seems to be the better answer if you're willing to pay the premium. But if you were to get an R9 295X2, you'd be set for a long while.
In short: the 970 is good, but the 980 is better. If you have sufficient headroom, the 290(X) would be a good alternative and the 295X2 even better. The problem is that these cards are overkill for 1080p. The 280X and 960 seem to be the best picks for that resolution; anything higher is personal taste. I'm not sure how games will be maxing out VRAM in a couple of years, but the 970 would work in this case.
If it's not the K version of the 4790, don't bother with Z97 boards; it's a waste. Z97 = overclocking, H97 is fine for non-K. An aftermarket cooler is also pointless in that case. If it is a -K, then Z97 with that heatsink is fine. You don't need an i7 for gaming; there's a 1-2% increase in games at absolute most. An i7 doesn't benefit you at all over an i5 if you're not saturating all the cores, which most games won't, especially not DayZ. Consider the i5-4690K to save $100 USD. You don't need a 750W PSU for the 970; even if you're going to SLI, 650W would be fine, and you can go 500W or below for a single card with no plans to SLI. I would recommend a 980, though, if you can spring for it. You don't need 16GB of RAM for gaming; up to you if you found a good deal. 120GB is great for an OS boot drive; consider 250GB if you want to put some games on it as well to further reduce load times, but the biggest thing is having the OS on there, so 120GB is fine. Have you picked out a case?
Agreed on z97 vs h97 - if not a k it's a waste of money.
As far as the cooler goes, the Hyper 212 is overkill if not overclocking, UNLESS you want the rig to be pretty quiet, in which case it helps a lot over the stock cooler.
Agreed that an i7 doesn't help with gaming, but if he's planning on using his PC for more than gaming it may be worth it, depending on how long he plans to go without upgrading.
Agreed on PSU, 750W is overkill for 970 but again depends how long he wants to keep the rig. If it's not a lot more it's not necessarily a bad idea to spring for a higher PSU (within reason, don't go buying 1000W or something stupid) so you won't have to replace it later. Nothing sucks more than realizing your current PSU can't handle that fancy new card you just bought (650W is probably enough to be perfectly safe for a long time though unless running multiple cards)
I'd just recommend against the 970 in general now (and I own one) - 290s are now like $120 cheaper offering very similar performance with almost certainly longer longevity
Agreed on RAM, 8GB is fine atm and you can always add more later
I'd recommend the 250GB drive: they're not that expensive anymore and Windows tends to bloat after a bit. 120GB can get cramped fast.
>>46332954 >>46333147 You know, the funny thing is, you really can. I know it's a shitty, unoptimised game, but when I got my 4GB R9 270X I thought I'd try running Watch Dogs on highest settings to see how it'd fare. Obviously the game was laggy as fuck; I didn't bother checking but I'm guessing I had 30fps tops.
Opened up GPU-Z, and I can't remember exactly but it was definitely using more than 3500MB VRAM @ 1080p. My monitor is only 60Hz as well.
Normally I'm not one to berate people for not having enough VRAM, as I was still running on 1GB up until last month, but seriously, 4GB usage is easily done at 1080p.
Watch Dogs is a shite example, since even on highest settings it still looked like a fucking PS2 game, but I'm guessing decent looking stuff could easily use similar amounts of memory.
So now that Amazon saw through my master plan of submitting a return request, getting it approved, and holding onto the card until the 3xx cards came out (they still approved the return, I just have a week to mail it), should I replace it with a 290 or slum it with a 6950 for a few months?
>>46332907 >>46332926 >>46332953 Stop this fucking pasta, especially since the guy who wrote it has little idea what he's talking about. Another shitty reviewer or computer technician wrote that shitty pasta.
>>46335077 Like that guy said, literally no reason not to get a 290. At this point I'd argue the 970 needs to be at a significant discount to the 290 to warrant it. The design is really flawed and it's going to be a problem in the future.
>>46335117 I've got the Strix for one more weekend (shipping it back next week). Nice card, and it's as quiet as advertised (no coil whine on mine), but it's not worth the $350 I paid, which is why it goes back.
Do not cheap out on a PSU. Look into something that is AT LEAST 80+ Bronze and rated for continuous wattage. 620W is plenty for a single 980. Some brands I'd recommend: SeaSonic, Corsair (anything but the CX/CS series), EVGA, Cooler Master (V series); there are tons of valid brands out there. Trust me, if you want this baby to last you a long time, go with a good PSU. Lower quality units can damage your components over time due to voltage ripple, especially when OCing, and may not deliver the advertised power if the efficiency is shitty or they're not designed for continuous use. Run like hell if you see the term "peak" anywhere in regard to wattage.
>>46334004 This is a 2 year old Win7 install along with a crapton of stuff I don't use/need anymore, probably some games I haven't touched in a long time too.
I could shave down >40GB and still have junk I don't look at. Yes it does bloat over time but it isn't nearly the same as the XP days when a 1.2GB install could double (or more) in less than 6 months. Just don't do automatic updates, hotfix/security patch minimally as needed or wanted, and you can keep the OS from consuming like a ravenous beast.
If you're able to hold out, wait on the 380x/390x and go 4k. It's on the cusp of taking over 1080p and is becoming more widely accepted on a daily basis. In a couple years, it will be everywhere like 1080p is now. 290/290x are fine cards but they won't hold up in 4k so you'll be buying into technology that's being phased out already.
You can overclock some GTX 970s to match stock GTX 980 performance, so why would you pay $550 instead of $350? The extra $200 on top isn't worth it just to be able to overclock beyond stock GTX 980 performance.
At the very least, wait until AMD releases their R9 3xx line; even if you won't buy AMD, NVIDIA might lower GTX 980 prices to compete.
Yeah that's a good PSU. It's actually on the high side of bronze (around 88% efficiency with 310W drawn) and is based on SeaSonic, which most of /g/ busts a nut over. Should last you just fine. If you go with a 620W, bear in mind that you will need to upgrade to >800W if you intend to SLi down the road.
>>46335315 >you can overclock some GTX 970s to match stock GTX 980 performance You can't overclock an extra 0.5GB of RAM or addition of L2 cache/ROPs. At this point, knowing what we now do, claiming you can turn a 970 into a 980 in terms of overall performance is absurd.
Agreed on waiting for the 300s though, if nvidia lowers their price that's when they'll do it
Is 4k already the standard nowadays? I barely know anyone where I live who has a 4k monitor so it doesn't really bother me at all at this point. I only recently got an upgrade to my monitor after years on a 1366x768 lel
I'm just sick of all the shit I've been through with AMD. Too many RMAs in my experience (note: MY experience) ever since the 4xxx series. Maybe if the 3xx series wows me that much I won't mind shelling out for it. I just need a card ASAP to game on, since I have a month's break.
>>46335349 You won't match framerate performance due to the VRAM lie. MAYBE you'll be able to get average FPS pretty close, but your frame times will be fucked and performance will be all over the place on anything recent.
>>46335343 >Too many RMAs in my experience (note: MY) ever since the 4xxx series. Wouldn't that be an issue with the manufacturer (e.g., XFX) you buy cards from? If it were a flaw inherent to the chip design then it would make sense to blame AMD/ATI, the chip designer, but overall reliability and PCB design are in the hands of the graphics card manufacturers, who take the NVIDIA/AMD design and produce cards to sell to end consumers.
Here's where that logic is flawed: you can OC a 970 to perform at the level of a stock reference 980, but you can OC the shit out of a 980 and blow a 970 out of the water. You also get a full 4GB instead of 3.5GB+0.5GB, which NVIDIA basically admitted was one of the reasons they priced the 970 where they did. I agree on waiting for the 3xx series. Even if you don't buy one, it will drive prices down, because HBM is going to beat the shit out of GDDR5.
>>46335363 >but your frame times will be fucked and performance will be all over the place on anything recent. Except they won't, because more or less every game that requires 3.5+ GB of VRAM already runs like shit regardless of the 970's VRAM issue. All the frame time issues I've seen posted have been of games running at 10-30 fps.
I must've just been terribly unlucky, cause I tried a lot over the years - stock, Sapphire, XFX, Asus, etc. The only one that's stuck thus far has been my crossfire 6950s but they died recently after 2 years of OCing, out of warranty as well, so that's why I'm looking.
>>46335363 no shit, what else would I be referring to?
>performance will be all over the place on anything recent that's debatable, seeing as no one had any performance issues until months after launch, when some guy found one of the few cases that pushed his 970's VRAM usage over 3.5GB
still, no sane person would recommend a GTX 970 for that kind of graphics workload, and I don't recommend a GTX 970 either, which is why I told him to wait for R9 3xx to weigh his options
4k isn't standard yet, but it's quickly gaining momentum. The idea of 4k gaming was laughable or foreign to most people this time last year. Now it's quickly becoming a selling point. As I was saying, it will quickly become the new standard.
>>46335387 read the post, I was in no way recommending the GTX 970, I'm only pointing out that the GTX 980's price-to-performance ratio isn't that good so he should wait for GTX 980 competition that may result in a price cut
>>46335350 >>46335339 Okay just limiting it to 1 4k monitor, I'm still screwed with the Gigabyte Geforce GTX 970 Windforce? Would getting a 980 make a difference or do I basically need 2 in SLI regardless?
I did read your post. There's a reason multiple anons (myself included) responded as if you were recommending against the 980: your phrasing implied the 980 is not worth it over the 970, not that the 980's price:performance is poor. I would emphasize that no high-end card's price:performance is "worth it"; there's a premium associated with maximizing performance. Until the recent AMD price crash it was the same way for them. The 980 is currently the top performing card on the market, save for the 295X2, which is almost twice the price.
>>46335424 No one is arguing with your second point. But your first point is no longer true.
The 970 outperforms the 980 in price to performance IF you use <3.5GB of RAM. If you need to use all 4GB the 970 is out the window. So it's no longer a simple statement of "the 970 is a better buy than the 980" - you said it yourself, you wouldn't recommend the 970. If he HAD to buy a card today and for whatever reason refused to get AMD, then the 980 is really the only card to recommend (a paranoid person would wonder if nvidia planned it that way....)
I'm assuming if someone is spending >$300 on a card they want it to be completely viable for more than a year
1. Did you know what I meant? Then fuck you. It's a valid industry term and has a direct connotation. 2160p is just as stupid a term because I'm sure your monitors don't have progressive scan, yet you still have that "p" on there. The meaning is understood.
2. "Standard" in our conversation wasn't alluding to an industry standard, rather whether it is "the standard" AKA if most consumers use it. Currently, 1080p is "the standard" as most people own 1080p TVs, but as I was saying, 4k is gaining momentum.
>>46335509 It's also possible they just genuinely don't understand the issue. I've seen people who truly believe this can be fixed with a driver update. Once I understood the full ramifications of the problem my 970 was returned ASAP
Why not? The delta from Bronze to Gold is about 4%. Ever wonder why you don't see "Silver" anywhere? The price difference alone invalidates the potential power savings. Even 4% at 1000W is only 40W. You'd have to run a 1000W PSU at full load for 25 hours to save 1 kWh, which is $0.10-0.11 in most places in the US, and for 2500 hours to make a $10 difference on your power bill. Again, this assumes 1000W from the wall. Gold is completely unneeded.
>>46335509 Man I remember when I got my 560 Ti, I was kicking myself for getting 1GB because trivial shit like GTA4 or Skyrim vanilla was pushing 1GB. Skyrim runs nicely but everywhere I turn it's fucking stuttering from swapping new textures.