Looking to get pic related or any other card in the same price range. Preferably Nvidia because of functionality. Mostly used for gaming, of course.
Wanted to get some opinions on whether this is the right choice of card.
Also general graphics card discussion thread.
DON'T GET THE STRIX!
it's a terrible overclocker and a poor booster. i had a strix 970 myself, and for the same price i swapped it for a gigabyte g1 970, which boosts MUCH higher (1.4GHz vs 1.2GHz) and overclocks much better as well.
the strix is limited to one 6-pin power connector due to its hard-coded low TDP. it's also voltage locked at 1.17V, vs 1.2V like most other cards in its class.
is a meme. i have two in sli and LOVE IT (combined, both cards draw about as much power as a single 980 ti). for 1080p it's a monster and the best value in its class.
the 390 is a massive house fire for its performance. you would think a card with a 350W TDP would deliver better performance, but it doesn't, and its 8GB of VRAM is useless, since if you ever run into a scenario where you need it, your gpu will bottleneck first.
crossfire might be different, but crossfire with two 350W TDP cards would be insane in both heat and power draw.
OP here, I'm not necessarily looking to overclock, maybe a little. I just want it to stay fairly quiet and be able to run video games without ever having to worry about settings for the next 2 to 3 years.
if bait then pls go
If serious what do you mean by "functionality"?
cuda acceleration for whatever renderer or video editor you use might be nice. But AMD also has advantages in certain programs.
If it's literally just for gaming, a 390 may do you better, or it may not. a 390 WILL be better in the long run because 1. 8GB of vram for crossfire, should you be so inclined, and 2. AMD cards tend to get better with driver updates, while nvidia is known to ignore (or intentionally lower) performance on older cards with new drivers.
Here's a list of GPUs a person should buy, from lowest to highest cost.
750ti, R9 380, R9 290(x)/390, 980ti
Generally, everything else should be avoided...
OP, for your price range, Sapphire R9 390. Don't ever buy Asus, ever...
even if you don't overclock, you have to take nvidia's "boost" into account. nvidia drivers automatically "boost" (overclock) your card by default as long as temps and TDP are in check.
the strix, even with its great cooler, is terrible because of asus' hard-coded low boost speed and low TDP. it's one of the worst overclockers and boosters out there.
it's great if all you care about is a cool-running card that draws little power, but for the price, in terms of performance and feature set, it's overpriced for what it is.
I have had AMD since my first ever pc and will keep it in my work pc, but I like Nvidia's extra software like ShadowPlay and such. There's also PhysX, but who cares about that.
Another advantage is HairWorks and other such developer technologies, plus G-Sync. I already have a monitor equipped with a G-Sync chip, so I want to make use of it.
started in june, 75 pages and growing until amd locked it in october.
enjoy all your display driver crashes and whatnot
>a card with a 350W tdp
TDP isn't the same as power consumption, you fucking retard. Also, Nvidia and AMD calculate TDP differently (Nvidia uses a typical usage scenario, AMD uses a peak usage scenario), so you can't even compare the numbers each of them provides.
This shows total system consumption, but it serves to show the difference between the 970 and 390 is just 30-40W typically, and 65W in the worst-case scenario (DA: Inquisition). Nowhere remotely near the difference between the bullshit "145W vs. 290W" TDP numbers Nvidia and AMD provide.
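for perspective, here's a quick back-of-the-envelope on what a 30-65W gap actually costs. the rate and hours are my own assumptions (not from any benchmark), so plug in your own numbers:

```python
# Rough yearly electricity cost of the 970-vs-390 power gap.
# Assumed (hypothetical) inputs: $0.12/kWh and 3 hours of gaming per day.
PRICE_PER_KWH = 0.12
HOURS_PER_DAY = 3

def yearly_cost(delta_watts: float) -> float:
    """Extra cost per year for drawing delta_watts more while gaming."""
    kwh_per_year = delta_watts / 1000 * HOURS_PER_DAY * 365
    return kwh_per_year * PRICE_PER_KWH

print(f"40W gap: ${yearly_cost(40):.2f}/year")  # 40W gap: $5.26/year
print(f"65W gap: ${yearly_cost(65):.2f}/year")  # 65W gap: $8.54/year
```

so at typical rates the measured gap is a few dollars a year, not the gulf the TDP sheets imply.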
>and its 8gb of ram is useless since if you run into scenario where you need it, your gpu will bottleneck first
Seriously, are you retarded? VRAM consumption and GPU performance are not necessarily correlated. Especially when it comes to textures, since using higher-resolution textures increases VRAM consumption significantly with a small impact on FPS.
>Above, we have a chart of relative power consumption. Again the Wattage shown is the card with the GPU(s) stressed 100%, showing only the peak GPU power draw, not the power consumption of the entire PC and not the average gaming power consumption either.
its a house fire
>Seriously, are you retarded? VRAM consumption and GPU performance are not necessarily correlated. Especially when it comes to textures, since using higher-resolution textures increases VRAM consumption significantly with a small impact on FPS.
if you're running a game with that many textures, whether because of a resolution higher than 1080p or because it's a next-gen game, vram usage will be the least of your bottlenecks.
that's right, just like how 4GB on a 960 is so worthwhile!!!!
>that extra vram really raped that 970 alright!
oh wait, they scored nearly identical. the 390 at best was a few fps higher, but nothing worthwhile. both were pretty much neck and neck most of the time and the dips were similar.
There you go, two sources (including the previous one) that show those graphs you just posted are complete bullshit. Anandtech does show a higher difference than TechSpot, but it's never higher than 80W.
>either because of a resolution higher than 1080p, or a next gen game, vram usage will be the least of your bottlenecks
Resolution of textures is not tied to resolution of the screen, you idiot. Even at 1080p, you still benefit from having higher-resolution textures whenever texels are displayed larger than a pixel on the screen (especially the case for objects close to the viewport).
Also, there are games today that have high-resolution texture options that won't fit into 4 GB, like Shadow of Mordor and Black Ops 3. That will only increase in the future.
Finally, have you never heard of mods?
>just like how 4gb on a 960 is so worthwhile
It is. What the fuck are you talking about? Are you one of those retards who think the 960 somehow "can't utilize 4 GB of VRAM"?
And most games today may not show it, but that doesn't mean it will remain like this forever. There was a time when 1 GB of VRAM was plenty for 1080p gaming and 2 GB was overkill. Then 1 GB was no longer enough and 2 GB became the standard. Today 2 GB is already getting pretty tight and you want at the very least 3 GB.
VRAM consumption is not fixed to a resolution; it increases over time at the same resolution. I don't know what's making you retards think 3.5 GB will be "enough for 1080p" forever, especially with the stuttering issues past that point.
>80w difference doesn't matter
lol shill detected
and what's funny is that 80 watt difference is very similar to the very guru link and photo i posted.
>msi gaming 390x
>258 - 154
most likely a stock 970 or something like the strix
take into account most 970s on the market today, like the evga ftw+ or gigabyte g1
>170 - 190 watt
>258 - 170
>258 - 190
now go back to mine: it showed a pcs+, which is a factory overclocked 390, just like most 970s are factory overclocked these days. factor in the updated 970 tdp and you still get the same results.
so no, my links are not nonsense. your first one from techspot was, and granted, techspot still showed the 390 using more than the 970.
the 390 is a house fire. it draws more power than a 970. it's a fact. you can shill all you want, it's a fucken house fire.
pic related, its my old 390 crossfire setup before i switched to my 970 sli, so i'm no shill. they run hot, they draw a shitload of power.
It's a Titan X in the first post, which is a reference-only card. No shit it runs hot.
Then I posted a Fury, with a giant triple-fan air cooler on it. It performs worse than the Titan X and still runs hotter.
Bottom line, AMD still can't into housefire prevention. Do not buy.
Even the LIQUID-COOLED FURY X is a fire hazard.
Another one. Fucking housefires, AMD can't into thermals. Not even liquid cooling can keep this shit under control.
You don't recommend the 4 GB 980? I'm using a card that's literally 5 years old now and was considering getting that because the ti is too expensive for me and I keep hearing mixed things about the 970.
I've posted this numerous times before but I'll say it again. Just find an upgrade guide on the net and follow it to get accurate and reliable details compared to the autism you'd get over here.
>b but 3.5
>b but amd house fire
none of the cards in the $400 - $590 range are worth it for the performance increases over the 970 / house fire 390.
it's either a 970 for $350 or a 980 ti for $650.
you only go amd if you enjoy bdsm
>AMD is now competitive in this space, but the trade-off comes down to heat and power consumption - both 390 and 390X are large, hot cards and you'll need good ventilation in your case.
>AMD has its own adaptive sync alternative, FreeSync. It's not quite as flexible as G-Sync, but with a little more care in settings management, you can achieve very close results.
>It was always on a knife-edge, but we've decided to flip our verdict here. Previously, we opted for the R9 390 owing to its future-proof VRAM and excellent performance, but recent issues with driver support for key titles such as Fallout 4 and Just Cause 3 have highlighted that weak software support really impacts the Radeon ownership experience. Meanwhile, Nvidia's GTX 970 has continued to flourish with day one driver updates for every major release. If there are issues with the controversial 3.5GB/0.5GB split in VRAM, we have yet to see them manifest outside of multi-card SLI set-ups, and for 1080p gameplay in particular, the strong performance plus superb overclocking performance puts the GTX 970 on top.
>The best graphics card under £250 / $330: GTX 970
MSI is probably better, but Asus has that sweet backplate.
So does Gigajew's, but three smaller fans can't be as quiet as two larger ones.
Which one would you get if you had to? Assuming your first choice isn't killing yourself.
the gigabyte g1 is actually pretty damn quiet. it only gets loud when the fans kick up, and that's only when the card gets hot (65C+).
once it's shoved inside your case, your case fans should be louder than the g1's fans even when they kick on, unless you're only running case fans at 1,000rpm and below.