"who would've thought it? You do love the 970!"
>being an nVidia "customer"
>having to lick up your bull's jizz, too
How beta do you have to be? I don't think the Greek alphabet has enough letters to express the submission of the average Nvidia fanboi.
Nvidia customers are generally just ignorant when it comes to technical minutiae. They want to play their children's gaymes and don't care if it stutters a bit. Appealing to the lowest common denominator is almost always a winning strategy.
>there are people who buy team red products even though they are inferior.
What matters is performance if you have money, and price/performance if you don't.
Anyone who votes for any team as mindlessly as a big part of /g/ does is almost equally fucking retarded.
Also, >>52653661 is what you get when AMD is weak. So Nvidia fanboys are slightly worse than AMD fags imo.
But the 390 objectively offers better price/performance right now, as well as much better potential longevity than the 970. The 970 sells more because of marketing, nothing else.
Just like >>52653796, you totally misunderstood.
I'm against "choosing a side":
>Anyone who votes for any team as mindlessly as a big part of /g/ does is almost equally fucking retarded.
And I generally DISLIKE Nvidia fanboys MORE because they don't understand that a weak AMD leads to shitty Nvidia products as well.
>Also, >>52653661 (OP) is what you get when AMD is weak. So Nvidia fanboys are slightly worse than AMD fags imo
When did /g/ get flooded with underage fucking faggots that ARE UNABLE TO READ? This is plain horrible.
You people are the fucking worst. Kill yourselves.
Welp, apparently you can't read, since you completely dismissed >>52653835's very valid point.
It's hardly people making the rational choice, as you claimed in >>52653723. The 390 completely outperforms the 970 at a cheaper price, so it has to be some sort of Stockholm syndrome if you go for nVidia.
Well done. You are worse than the guys spamming Spiderman.
I can't read Nazi, no idea what you're trying to get across with the pic, but I got a 970 on sale for $250 like a year ago and I've had no issues whatsoever with my purchase. I'm actually thinking about getting another to SLI them.
They do. It's only one Google search away, you lazy fuck.
So not LITERALLY twice, you fucking aspie, it was an exaggeration. But the point is, why are AMD cards so much hotter and louder?
Would like to see proof, honestly interested.
244 x 2 = 488, not 371, you chucklefuck.
Now go find a chart that shows averages instead of full load, and you'll see the difference go down to 30 or 40W, tops.
Of course you won't, since you only care about the charts that make whatever you got look good, to justify your shitty purchase. Deal with it, kid.
Nvidia uses 10W more on average than they claim; AMD uses 14W less than they claim.
Also this picture, before you even think about complaining that the previous one compared the 900 series to the 200 series, you fucking piece of shit.
Also, these cards are meant to be fucking played on, which means they WILL go up to FULL FUCKING LOAD, you moron. Who gives a shit about idle power consumption when that's not the thing making your PC run hot and your wallet empty, in BOTH cases?
>Nvidia and AMD both consume similar power when they are not under full load
>AMD consumes more power under full load
>naaah, let's forget that last part and go around /g/ saying they need the same power
Your logic is fucking stupid.
AMD goes just a bit over its TDP.
Nvidia goes 46W over at max load.
Nvidia is more efficient, but they also lie to make it look even more efficient.
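If you want to see what those deltas mean in absolute watts, here's a trivial Python sketch. The rated TDPs (145W for the 970, 275W for the 390) are from memory, and the 390's max-load delta is a guess standing in for "just a bit over"; treat every input as an assumption, not a measurement.
[code]
# Rated TDP vs. claimed real-world draw, using the deltas thrown around in
# this thread. TDPs (970: 145W, 390: 275W) are from memory; the 390's +5W
# max-load delta is a guess for "just a bit over". All inputs are assumptions.
cards = {
    "GTX 970": {"tdp": 145, "avg_delta": +10, "max_delta": +46},
    "R9 390":  {"tdp": 275, "avg_delta": -14, "max_delta": +5},
}

for name, c in cards.items():
    avg = c["tdp"] + c["avg_delta"]    # claimed average draw
    peak = c["tdp"] + c["max_delta"]   # claimed draw at max load
    print(f"{name}: rated {c['tdp']}W, ~{avg}W average, ~{peak}W at max load")
[/code]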
You sure about that? How about you go by actual data, and not the shiny tag Nvidia puts on the box? You know, it's not the first time that their label tells you something that isn't quite true. They still put "4GB" on the boxes of the 970.
>but the point is, why are AMD cards so much hotter and louder?
Because Maxwell changed the game. It's not that AMD got so much hotter, it's that Nvidia got way more efficient and cooler.
It was generally the trend for GPUs to get louder/hotter as they got more powerful, as the outcome of using an inefficient architecture.
Just take a look at the GTX 770: it's at 375W!
40/50/60W might not seem like that much when looking at total system power, but it does mean fans will be running faster/louder.
so there is some trade off here.
I didn't claim there wasn't a difference. I claimed it wasn't two times higher, which was what was stated. Go double check.
And I stand by it. The difference is not double. In fact, it's barely noticeable.
The difference in power usage over the course of a year is less than $5. You're buying cards in the $300 price bracket, and you care about a $5 difference in power usage instead of getting the card that performs better. How does that sound for retarded?
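Since people keep arguing the number instead of checking it, here's the back-of-the-envelope math in Python. The 40W gap, 3 hours of gaming a day, and $0.12/kWh are all assumptions; plug in your own numbers.
[code]
# Rough annual cost of a GPU power-draw gap. All inputs are assumptions:
# ~40W extra under load, ~3 hours of gaming a day, $0.12 per kWh.
watt_gap = 40          # extra watts drawn under load (W)
hours_per_day = 3      # time actually spent gaming
price_per_kwh = 0.12   # assumed electricity price (USD/kWh)

kwh_per_year = watt_gap * hours_per_day * 365 / 1000
cost_per_year = kwh_per_year * price_per_kwh
print(f"{kwh_per_year:.1f} kWh/year -> ${cost_per_year:.2f}/year")
# 43.8 kWh/year -> $5.26/year, i.e. roughly the $5 ballpark
[/code]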
Funnily enough, Nvidia only started marketing towards less noise/temperature with the coming of Maxwell. Before that, no one cared. They even burned down a couple houses, and everyone forgot about that already.
Now, getting a card that saves you $5 in power a year, and that is 5°C cooler, is more important than getting the better performer for cheaper. Just because it's more "efficient", even if negligible.
That doesn't mean it's spiking up to 1000W, though.
The fact that the card MIGHT need some extra power at some points doesn't mean it will average out at that point. The average is always much lower than the absolute maximum power spike.
Put it this way: AMD cards average under their announced maximum TDP, while Nvidia cards average much higher than the announced TDP. You think that's fine? I don't.
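To make the average-versus-spike point concrete, here's a toy sketch. The power trace is completely made up, not a measurement of any real card; it just shows how momentary spikes sit far above the average.
[code]
# Toy illustration of average draw vs. peak spikes. The trace below is
# invented for illustration, not measured from any real card.
trace_watts = [150, 160, 230, 155, 148, 275, 152, 158, 240, 150]

average = sum(trace_watts) / len(trace_watts)
peak = max(trace_watts)
print(f"average: {average:.0f}W, peak spike: {peak}W")
# average: 182W, peak spike: 275W -- a card can spike well past any
# headline number for a moment while averaging comfortably below it.
[/code]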
It is done. The typical nvidia-fag draws his last-resort weapon: power consumption.
Yes, nVidia is better because on an average gaming load you save $10 a year on energy costs.
Yes, you are not a poorfag because you care about $10 of energy costs.
Yes, your mom will love you for saving 30 watts for 2 hours a day.
Yes, the one thing you care about even more than your vidya is Paul the polar bear, which you just saved by buying an Nvidia graphics card.
I love the power consumption phase of the nvidiot. Next stop: buyers remorse phase.
The trend toward efficiency had to come eventually. You can't just keep making shit hotter; eventually you won't be able to cool it, or you'll need a closed-loop cooling system like AMD uses to cope with it. And what happens when that isn't enough? You go with a 240mm radiator next?
Also, let's be mindful of the prices when the 970 came out. At release it was priced well, well under the 290X and under the 290 as well, and then it outperformed them both (most of the time) while drawing less power and running quieter too.
The only way for AMD to compete with that was to slash prices; slashing prices seems to be the only way AMD can compete at all.
Intel and Nvidia both enjoy double-digit margins; AMD has negative margins.
With headphones, you won't notice the noise if you're playing. Unless you're using the card for something other than games that puts it under load, noise isn't relevant; you won't be noticing it.
As for heat, good coolers on AMD cards play a huge role and can minimize that massively. If you have a good case, it minimizes it even more. A small room won't be a big problem.
I have a 390 and it idles at 30°C in the winter and 38°C in the summer. Under load it never goes over 75°C, which is not even 10°C higher than a 970, if I'm not mistaken. And that's the absolute maximum; it's usually under 70°C even with load on it.
The absolute only reason to buy a 970 is on one of those mini-ITX builds, if for some reason you want one. Other than that, if you get a good 390, none of that is an issue.
I mean... That's what AMD is doing now too, and they're not doing too bad.
Zen will also have a new upgraded cooler that performs much better than a good part of the aftermarket coolers. They're actually investing hard into it now.
Who cares about stock coolers? Any OEM cooler is just as silent as Nvidia's, and the heat argument is just lame. If you don't run FurMark 24/7 and live in a cupboard, the average room temp difference will maybe be 2-3°C.
EVGA doesn't make coolers for AMD, as far as I know.
Basically, avoid ASUS at all costs. They are part of the reason AMD gets such bad rep. They just reuse the same coolers they make for their Nvidia cards, and the result is obvious, their coolers suck donkey dick on AMD. It's absolute trash.
MSI, PCS+ and the Sapphire coolers are the best. That's for the 300 series; for the 200 series you should stick to Sapphire.
These 3 are all really good and will keep the card under 75°C at all times under normal conditions, and under 40°C at idle. Some of them keep the fans off by default until they reach 60°C though (it's some no-noise marketing shit), but you can just turn the fans up to 10% at idle instead of off. Even with the fans off while idling, it won't even get to 60°C. I know MSI does this, not sure about the others.
XFX and Gigabyte are the middle ground there. Not too bad, but not as good as the other 3. If you get them considerably cheaper, they're a decent option.
I like AMD cards over Nvidia, but I believe AMD also keeps the memory clock maxed when running 144Hz; that's not exclusive to Nvidia. Don't quote me on that though, I'm not too sure and I don't own one of those monitors myself.
Have you read the thread?
It's just meme after meme.
With one guy writing novels full of nonsense and acting full Reddit by quoting every post and stating which ones are his.
Stfu and have a bit of self-awareness please.
No, this is an old, known issue that was outlined by PC Perspective while testing the newest Swift again.
Here is the full story: http://www.pcper.com/news/Graphics-Cards/Testing-GPU-Power-Draw-Increased-Refresh-Rates-using-ASUS-PG279Q
>muh aftermarket coolers
They slap the same ones on both brands.
It's easier to keep the temps down on a colder card, which means less noise on the colder one.
I fucking hate the shit Njewdia consistently pulls, but cum on.
Following that logic, we might as well let them make cards that reach 200°C, because I can attach a leaf blower to my case and play on headphones.
Willingly supporting anti-consumer shit is actually pretty stupid m8.
You know that the "optimized for Nvidia" sticker on it just means that come next generation, it won't be "optimized" for your card anymore, right? You gotta keep supporting them, you know, so they want you to buy a new card every generation. Are you ready for that?
No they don't. But since you're the one claiming they do, go ahead and source that, or at least prove it.
MSI is pretty good on both ends, no way they would be able to get their temperatures close to Nvidia cards if they just slapped the same cooler on the AMD card. They have to make slight changes to coolers for them to even fit properly on the card.
Source your shit.
So according to you, I should join a moral fight against an evil company so that in some "possible" utopian future vidya may run equally on all cards?
Nah m8, I just want the best performance for what I'm playing right now.
It's fucking video games, not politics or some other ground-breaking cause.
It's got nothing to do with a moral fight. Your actions will end up biting you in the ass though. Because while Nvidia has a history of gimping their older cards, AMD doesn't. AMD cards do only marginally worse than Nvidia in GameWorks games, to the point where it's irrelevant or you just tone it down a notch and barely notice the difference.
On the other hand, older Nvidia cards on "Nvidia optimized games" completely shit the bed and are considerably worse than their older AMD counterparts.
Case in point, Project CARS, which is one of the most obvious examples.
And use what instead of Windows? Please no.
Apple: no thank you, I don't have crippling autism and/or brain damage
BSD/wtf OS nobody has Hurd of: I want to actually use my OS for things.