Does anyone else feel like the current generation of AMD cards is secretly superior, but everyone's so used to the prior eras of AMD fuck-ups in both the CPU and GPU markets that they're too far down the Nvidia rabbit hole to admit it?
Re-branded 260 with higher clock speeds and 2GB of VRAM, priced similarly at release, with street prices closer to what the superior 260X used to cost.
Re-branded 265/7850 with higher clocks. Again, similar MSRP, but with the superior 270 diminishing and no replacement released, its prices haven't come down as much as they should have.
285 re-brand at a cheaper MSRP with 4GB available. Good budget card, especially against NV's 960, albeit with a cut-down memory bus from the 280 and barely better performance. It also has fairly high power consumption compared to Nvidia's offering.
290 with higher clocks, lower MSRP and more (actually as advertised :^)) VRAM. Nothing bad here either, but again, power consumption is an issue next to the 970, especially when overclocked.
290X with higher clocks, similar story to the 290. Can match or get close to a 980 at a much lower price point, but in contrast has significantly higher power consumption, which with heavy use can not only add to your power bill but might also require a new PSU purchase, which may void its price advantage in the first place. Also OCs worse on average, even considering Maxwell's shitty scaling with clockspeed.
Cut-down Fiji, can beat a 980 but trails the 980 Ti by a good leap, whilst the difference in price is rather marginal. Likely outclassed by the 390X and Nano.
Good card, especially after the price drop. Fully-fledged Fury X at lower clocks, which unfortunately only hovers around 980 level and barely even beats a 390X, but at least it's pretty efficient, until you overclock it. Good for small form-factor builds, not so good as a budget GPU.
Can get close to or match a 980 Ti at similar price points, but has less VRAM, uses more power and doesn't overclock as well. Quite a disappointment, at least for its price.
Overall, AMD's offerings are decent, but definitely not outshining Nvidia's. We'll see about Polaris.
Wouldn't have fit into 2000 characters anyway, but here we go :^)
Tonga in full. Performance lies at Tahiti level, just like the 380 - here against the 280X/7970, which is only about 10-20% faster. Doesn't seem worth the additional 50+ dollars over a 380 right now; you'd be better off dropping another 50 into the 390. But it's relatively new, so things might change (driver optimization/price drops).
AMD cards are competitive, especially in the lower price segment (let's say up to 300 €).
I upgraded to an R9 390 from a HD 7950 and sold my HD 7950 for 110 € on ebay last summer.
Best price performance at the moment is undoubtedly the Radeon R9 290 with custom cooler. You can get those for around 200€ if you're patient and wait for a deal.
As someone who uses AMD and lives alone, it amuses me how much people who live with their parents worry about this.
I live completely solo working a low-tier job (retail butcher) and pay just a smidgen over half of my monthly income in rent alone.
However, I don't give a fuck that the Fury X at stock sucks down slightly more than the 980ti (at stock - though the 980ti EATS power when OC'd).
I DO turn off lights I don't need and stuff like that, and I always make sure to buy energy-efficient bulbs and do things the efficient way (use a kettle instead of a water boiler for washing up, for example).
Proof is in the pudding; using an AMD card doesn't mean jack shit to your power bill.
That's true, fuck I hate morons who do that kind of shit. When I bought my 750w PSU, it was for the fact that the prices were fuck all different and IF I decided to one day go crossfire, I would be able to.
This whole LEL NVIDIA SOOOO MUCH MORE EFFICIENT XDDDD meme needs to die. The differences are tiny.
>980ti EATS power when OC'd
That's just wrong. Maxwell doesn't require any voltage increase to reach its max OC, so the power consumption increases only negligibly.
Also, it's not about the power consumption, it's about the heat output.
Higher heat output increases the room temperature.
Increased room temperature turns on the central AC.
Central AC consumes 3000W and increases ambient noise.
A 50~100W increase will heat up a room over the hours and cause a domino effect.
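For a sense of scale, here is a quick sketch of how much heat energy that kind of power delta dumps into a room over a session. The 75W delta and 5-hour session length are illustrative assumptions, not figures from the thread:

```python
def heat_kj(extra_watts, hours):
    """Heat energy (in kJ) added to the room by an extra GPU power draw.

    Essentially all electrical power a GPU consumes ends up as heat.
    """
    return extra_watts * hours * 3600 / 1000  # watts * seconds -> kJ

# Hypothetical example: an extra 75W sustained over a 5-hour session.
print(heat_kj(75, 5))  # -> 1350.0 kJ released into the room
```

Whether 1350 kJ actually trips a thermostat depends entirely on the room size and insulation, which is why both sides of this argument can be right.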
I own one as well. And it's watercooled.
Between stock clocks (1200MHz) and overclocked (1500MHz), the difference is about 30W at the absolute most. It probably averages about 10W more overclocked.
EVGA said the same thing also, I can link the article if you don't believe me.
Fuck me, you think running an AMD card increases the room temperature that dramatically? Try dual GTX 580s before I get my dual Fury Xs.
I'm going to need some proof of that 30W. Got an AX PSU?
From this chart alone there's a 26% efficiency difference between a STOCK Fury X and a non-reference OVERCLOCKED (1400MHz) 980 Ti.
Also take a look at pic
Most cards decrease perf/watt when you OC them.
Maxwell cards are the only ones that increase perf/watt when overclocked.
Did you use the correct settings this time?
lol, why did it end up consuming 800W like I predicted? Or are you scared of blowing up your VRMs? (I would tbqh)
The last 290X I tested consumed 600W+, wayyyy more than my 980 Ti overclocked. The card was squealing like a pig.
He's probably talking about the entire system, because people still can't be bothered to separate the two. A 290X, without a heavy OC, will draw roughly 300W from the wall.
You're wrong, see: >>52488629
When will people learn the difference between a rebrand and a refresh?
Yes, those cards are tired as fuck already, and we should have gotten Polaris instead of the 300 series, but those are not rebrands, but refreshes.
The fact is that the only thing they really achieved with the 300 series is lower power consumption, though it is still higher than Nvidia's.
The 200 series are still heat-emitting, power-eating monsters, but they perform the same fucking way, and are currently the better buy because most people have a 750W PSU anyway and can OC that 290 to eternity with it, matching a $320 970 while being priced at like $200.
Also, AMD should fire their whole PR team who outright lied about the Fiji performance.
That's a good point, something I hadn't expected. But still, even if we assume he was using a basic Intel CPU... that's more power than should've been going into it unless you're literally using a custom VRM board.
An i7 4790K at stock is 88W; even an AMD CPU at stock is something around 120W. That would still mean that if the CPU wasn't overclocked, there's 200W on the GPU. Again, no way should people be adding THAT much power unless they're going for world records.
The numbers stared me right in the face; I have no reason to fabricate lies.
Maybe the archive stores webms, try visiting the rbt.asia link above.
Must be nice to be ignorant
If you heavily OC the CPU and GPU then it's not totally unreasonable to see 600W from a system like that, especially if you measure it from the wall. At 90% efficiency that's just 540W for the actual system, which sounds reasonable.
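The wall-vs-system conversion in that post is just a multiplication; here is a minimal sketch, assuming the 90% PSU efficiency figure from the post (real PSUs vary with load):

```python
def system_power(wall_watts, psu_efficiency=0.90):
    """Power actually delivered to the components, given draw measured
    at the wall socket. The 90% efficiency default is the assumption
    used in the post above, not a measured value."""
    return wall_watts * psu_efficiency

# 600W measured at the wall -> actual system consumption.
print(system_power(600))  # -> 540.0
```

This is also why wall-socket numbers always overstate what the components themselves use.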
Pic related, that's for the GPU alone. How exactly did you measure the power consumption?
From the wall is another matter, but in terms of what the system itself is using? A reliable PSU will be able to supply at minimum what is stated on the unit itself. Sometimes you're lucky and it can exceed that slightly, but it's never encouraged to try.
What the PSU takes from the wall is different altogether, and THIS is where the bullshittery comes in, because such arguments obscure the truth behind hidden context.
You saw my return receipt and you know I don't have the card on hand to retest it.
It's up to you whether you believe me or not. Either way I don't really care because I don't have control over any of these.
It's how most people measure it, and honestly it's not a terrible measurement, since most PSUs are pretty damn efficient nowadays. As long as you're not using a really shitty PSU on purpose, it's alright for getting the general ballpark of what the entire system is using, and then you can measure it at different states to work out the numbers for the GPU itself. Like the pic I posted: done that way, at the very worst it's 290W for a stock 290X, which is a number you can work with as long as you know what it stands for.
Or maybe, just maybe, you're a fucking moron who tried overclocking the card at 2x the power requirements and are quoting whole-system wall draw with an OC'd CPU as well, to obscure how badly you messed up.
Either that or you got a one-in-a-million card that was built so shittily and slipped by QA that it was too faulty to regulate power consumption at all.
It's not so bad when a tech firm does it, AS LONG AS no variables outside of the GPU change.
Literally the way to do it would be:
>Build rig, no CPU. Power it on, test power usage
>Change nothing about rig, install GPU, test power usage
>A - B = C
However, I highly doubt anon does this.
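The A - B = C steps above can be sketched in a few lines. All the wattage numbers here are made-up placeholders, and as the next reply notes, a real baseline can't run a GPU load at all, so in practice you compare idle vs load states instead:

```python
def gpu_draw(watts_with_gpu, watts_without_gpu):
    """Attribute the difference between two wall measurements to the GPU
    (the A - B = C subtraction method described above)."""
    return watts_with_gpu - watts_without_gpu

baseline = 150   # hypothetical: wall draw of the rig without the GPU working
loaded   = 440   # hypothetical: same rig with the GPU under load
print(gpu_draw(loaded, baseline))  # -> 290
```

The subtraction only holds if nothing else (CPU clocks, fans, background load) changes between the two measurements.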
If you understand the nature of occt you wouldn't be saying such things.
Either way, the power was calculated from total amperage through the PCIe connectors. Take a look at >>52488367 for example.
That's the general figure you'll find everywhere; 600W is more than double that. You're either bullshitting or your methodology was stupid.
A 290X consuming 600W would blow the power delivery on the card and overheat very fast; it would be very clear the card is faulty. He has to be bullshitting, or he has no clue how to measure power consumption.
>Cut-down Fiji, can beat a 980 but trails the 980 Ti by a good leap, whilst the difference in price is rather marginal. Likely outclassed by the 390X and Nano.
>whilst the difference in price is rather marginal
Strix Fury is £430
Strix 980ti is £600
You can't put any kind of real load on the GPU without a CPU, so that wouldn't work. The testing methodology that guru3d uses is mostly fine, it's not perfect by any means but it works and is fairly consistent. With proper lab equipment you can obviously do a better job, but barely anyone is doing that.
If you try to kill any chip, it'll consume massive amounts of power. Using the cards for what they're intended for will net you less than 500W. If you want to push the card to chew as much wattage as possible, go for it. Just don't pass that off to normal people who use GPUs normally.
I honestly don't know if I'll buy another AMD card after they decided to stop supporting my crossfire 6950s while one of them still performs as well as their current mid/low offerings.
I understand not supporting older hardware, but I wouldn't be as mad about it if the "stable" driver they have for Windows 7 hadn't been released in a broken state (HDMI audio doesn't work).
I've tried the beta Crimson driver and it fixes the audio issue but breaks crossfire. It might have been updated since I tried it, so I might try it again soon.
To give a little comparison, look at when the last driver updates came out for some Nvidia cards released in 2004.
You can literally buy a 7850 for like $80 that is loads better than your 6950, or shit, you can even get a 7950 for ~$100, even some of the better models. You can sell your 6950 and pretty much just make a straight swap.
That's 'cause they are superior. The only reason Nvidia performs better in gaymes is their proprietary cancer, GameWorks, already injected everywhere. The Fury X for example has so much throughput it should be able to crush any Nvidia card, but nooooooo.
Anyway go back to /v/ gaymen.
That's literally what guru3d is doing, except they take a general GPU idle power consumption into consideration, something around 10W, which is a good number for 90+% of GPUs out there.
How much longer is the 7950 going to be supported? Going by what's already happened to me, probably not much longer.
Keep telling yourself that
R7 370 is slower than the 270
380 slower than the 280
380X slower than the 280X, but muh
> +1GB VRAM
enjoy your upgrade
It's only been this way for two Nvidia gens, the 700s and 900s.
The 400 and 500 weren't that great; coincidentally, it began when AMD had some restructuring to do.
I want a 60-40 market split so badly, ideally 50-50.
It seems to me that AMD is underrated, mostly in online marketing. Personally I was a little let down by the recent GPUs AMD released, but it's still undeniable that a lot of what AMD offered before the 300 cards was outstanding (i.e. HD 7970 and R9 290X). I'm not saying the 300 cards are bad; actually they're pretty kick-ass for non-gaming applications like OpenCL acceleration in Adobe programs. I feel AMD gets an underrated reputation partially from the YouTube community (i.e. Linus Tech Tips), but that could be attributed to Nvidia's superior marketing skills, however damaging they may be to the overall market. With all this said, come at me, Nvidia fanboys, I can't wait to see you all make fools of yourselves.
Everyone says "power consumption" and "heat", but those arguments are nonsense.
An extra 50-70W isn't going to change anything in your bill. "Heat" can refer either to the heat generated by the card's power draw or to the temperature of the card itself. The latter is addressed by aftermarket coolers; it was only an issue for the reference 290/290X cards with the blower design. The former doesn't affect you in the slightest: the amount of extra heat it generates vs a similar card is so microscopic, it's just hilarious.
The other thing I've heard is driver issues. A couple of years ago that might have been the case; I very much doubt it is now. Over the last couple of years, AMD's drivers have been pretty stellar. I've had my share of both Nvidia cards and AMD cards (AMD being my current setup). I've had some minor issues with both companies, but overall I'd say they were pretty good. I've heard the current driver issues lie with Nvidia cards and Windows 10; scanning over the official Nvidia reddit page, that seems to be the case.
Current-gen Nvidia cards I think are overvalued and AMD cards undervalued. If you value your money, you'd choose wisely. Hearsay and opinions will have to be discarded if you want what's best for yourself, both for current games and for future games.
Radeons are better designed in an ideal world where DX11 driver efficiency was never an issue, GameWorks never existed, and games were limited just by available shader power.
In reality, however, you have one vendor with slightly beefier tessellation/ROP throughput paying developers to flood scenes with tons of 1-3 pixel triangles that don't improve visuals but choke the other vendor's GPUs.
If Polaris doesn't reverse this trend, AMD is probably finished, no matter how much they might consider themselves to be taking the high road or whatever.
You're actually going to see a bit of a reversal in power consumption/heat and performance with Pascal and Polaris.
AMD is building brand-new core designs radically different from current GCN, and they're putting them on a superior process to the 16nm node Pascal will be on.
Speculation is that Nvidia is refreshing Maxwell for the smaller node, which means they'll likely say fuck it to power efficiency and heat and scale Maxwell's CUDA cores to drive clockspeeds even higher than they currently are, in order to beat Polaris in synthetic benchmarks.
So Pascal will be very inefficient and possibly even a housefire meme waiting to happen, in exchange for possibly being the first 2GHz GPUs in the world,
while Polaris will be farther behind on clockspeed but have an order of magnitude more efficient shaders, and more of those shaders packed into the die.
If my assessment is correct, Pascal will cost an arm and a leg but be technically faster on desktops simply by pushing insanely dangerous clockspeeds to make up for the architecture's other shortcomings, and AMD's Polaris will totally outclass it in the mobile variants that go into gaming laptops.
>radically different from current GCN
>Polaris will totally outclass it in the mobile variants that go into gaming laptops
Pascal will be everywhere, but it's destined for FP64 compute, which will nevertheless be blocked on every non-professional card. SURPRISE!
This post added $0.021 to your Nvidia savings account (tm)
I wonder why no Nvidia shill cared about power or heat when the GTX 480 was released :^)
It is literally 30-50W, or around 10-15% more power usage for the whole PC, 970 vs 390.
Same applies to Fury vs 980 Ti.
It would cost maybe $10 a year more in electricity, and considering that for half the year the extra heat is actually a positive, anyway..
So nice try.
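The ~$10/year figure can be sanity-checked with some back-of-the-envelope arithmetic. The hours per day and electricity rate below are assumptions for illustration, not figures from the thread:

```python
def yearly_cost(extra_watts, hours_per_day=4, eur_per_kwh=0.15):
    """Rough extra yearly electricity cost of a higher-draw GPU.

    Assumptions (hypothetical): 4 hours of gaming per day, 0.15 EUR/kWh.
    """
    kwh_per_year = extra_watts / 1000 * hours_per_day * 365
    return kwh_per_year * eur_per_kwh

# An extra 50W at these assumptions works out to about 11 EUR a year.
print(round(yearly_cost(50), 2))  # -> 10.95
```

Even doubling the rate or the daily hours keeps the difference in the tens of euros, which is the point being made.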
The 69xx series can't really be overclocked more than 10%.
A 78xx goes to 1.2GHz core and 1.4GHz memory and still uses less power.
A 7850 can get a nice 35-40% bump in real performance relatively easily, and is then on par with or even better than a stock 7950.
t. some guy who ran 1260MHz core and 1450MHz memory on an Asus DC2 HD 7850