Does anyone else feel like the current generation of AMD cards is secretly superior, but everyone's so used to the prior eras of AMD fuck-ups in both the CPU and GPU market that they're too far down the Nvidia rabbit hole to admit it?
>>52485889 >360 Re-branded 260 with higher clock speeds and 2GB of VRAM, priced similarly at release, with street prices being closer to what the superior 260X used to be
>370 Re-branded 265/7850 with higher clocks - again, similar MSRP, but with the superior 270 drying up and no replacement released, its prices haven't gone down as much as they should have.
>380 285 Re-brand at a cheaper MSRP and with 4GB available. Good budget card, especially against NV's 960, albeit with a cut-down memory bus from the 280 and barely performing better. It also has fairly high power consumption compared to Nvidia's offering.
>390 290 with higher clocks, lower MSRP and more (actually as advertised :^)) VRAM. Nothing bad here either, but again, power consumption is an issue compared to the 970, especially when overclocked.
>390X 290X with higher clocks, and a similar story to the 390. Can match or get close to a 980 at a much lower price point, but in contrast has significantly higher power consumption, which with heavy use can not only add to your power bill but might also require a new PSU purchase, which may wipe out its price advantage in the first place. Also OCs worse on average, even considering Maxwell's shitty scaling with clockspeed.
>Fury Cut-down Fiji, can beat a 980 but is a good way behind the 980Ti, whilst the difference in price is rather marginal. Likely outclassed by the 390X and Nano.
>Nano Good card, especially after the price drop. A fully-fledged Fury X at lower clocks, which unfortunately leaves it hovering around 980 level and barely even beating a 390X, but at least it's pretty efficient, until you overclock it anyway. Good for small form-factor builds, not so good as a budget GPU.
>Fury X Can get close to or match a 980Ti at similar price points, but has less VRAM, uses more power and doesn't overclock as well. Quite a disappointment, at least at its price.
Overall, AMD's offerings are decent, but definitely not outshining Nvidia's. We'll see about Polaris.
>>52486640 Wouldn't have fit into 2000 characters anyway, but here we go :^)
>380X Tonga in full. Performance lies at Tahiti level, just like with the 380 - here against the 280X/7970, and it's only about 10-20% faster than the 380. Doesn't seem worth the additional 50+ dollars over a 380 right now; you'd be better off dropping another 50 into a 390. But it's relatively new, so things might change. (Driver optimization/price drops)
>>52487751 As someone who uses AMD and lives alone, it amuses me how much faggots who live with their parents worry about this. I live completely solo, working a low-tier job (retail butcher), and pay just a smidgen over half of my income per month on rent alone. Even so, I don't give a fuck that the Fury X at stock sucks down slightly more than the 980Ti (at stock - though the 980Ti EATS power when OC'd). I DO turn off lights I don't need and stuff like that, and I always make sure to buy energy-efficient bulbs and do things the efficient way (use a kettle instead of the water boiler for washing up, for example).
Proof is in the pudding; using an AMD card doesn't mean jack shit to your power bill.
>>52488157 That's true, fuck I hate morons who do that kind of shit. When I bought my 750W PSU, it was because the prices were fuck-all different and IF I decided to go crossfire one day, I'd be able to. This whole LEL NVIDIA SOOOO MUCH MORE EFFICIENT XDDDD meme needs to die. The differences are tiny.
>>52488605 He's probably talking about the entire system, because people are still too retarded to separate the two. A 290x, without a heavy OC, will draw roughly 300W from the wall. >>52488624 You're retarded, see: >>52488629
When will people learn the difference between a rebrand and a refresh?
Yes, those cards are tired as fuck already, and we should have gotten Polaris instead of the 300 series, but those are not rebrands, but refreshes.
The fact is that the only thing they really improved with the 300 series is the power consumption, though it is still higher than Nvidia's.
The 200 series are still heat-emitting, power-eating monsters, but they perform the same fucking way, and are currently the better buy because most people have a 750W PSU anyway and can OC that 290 to eternity with it, reaching a $320 970 while being priced at like $200.
Also, AMD should fire their whole PR team who outright lied about the Fiji performance.
>>52488642 That's a good point, something I hadn't expected. But still, even if we assume he was using a basic Intel CPU... that's more power than should've been going into it unless you're literally using a custom VRM board.
An i7 4790K @ stock is 88W; even if it was an AMD CPU, stock is something around 120W. That would still mean that, if the CPU wasn't overclocked, there's an extra 200W landing on the GPU. Again, no way should people be trying to add THAT much power unless they're going for world records.
>>52488669 If you heavily OC the CPU and GPU then it's not totally unreasonable to see 600W from a system like that, especially if you measure it from the wall. At 90% efficiency that's just 540W for the actual system, which sounds reasonable. >>52488675 Pic related, that's for the GPU alone. How exactly did you measure the power consumption?
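For what it's worth, here's a rough sketch of that power-budget arithmetic in Python. Every figure is an assumption pulled from this thread (stock 4790K, ballpark stock-290X draw, a guess for the rest of the system, a decent PSU), not a measurement:

# Rough power-budget arithmetic; all figures are assumptions from the thread.
cpu_w = 88              # i7 4790K at stock (Intel's TDP rating)
gpu_w = 290             # ballpark for a stock 290X under load
rest_w = 50             # guess: motherboard, RAM, drives, fans
psu_efficiency = 0.90   # a decent PSU at this kind of load

system_w = cpu_w + gpu_w + rest_w     # what the components actually draw
wall_w = system_w / psu_efficiency    # what a wall meter would show

print(f"components: {system_w} W, at the wall: {wall_w:.0f} W")
# components: 428 W, at the wall: 476 W - nowhere near 600 W at stock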
>>52488694 From the wall is another matter, but in terms of what the system itself is using? A reliable PSU will be able to supply at least what's stated on its label. Sometimes you're lucky and they can exceed it slightly, but it's never encouraged to try. What the PSU takes from the wall is different altogether, and THIS is where the bullshittery comes in, because such arguments are trying to obscure the truth with hidden context.
>>52488721 It's how most people measure it, and honestly it's not a terrible measurement, since most PSUs are pretty damn efficient nowadays. So as long as you're not using a really shitty PSU on purpose, wall readings are alright for getting a general ballpark of what the entire system is using, and then you can measure at different states to work out the numbers for the GPU itself. Like the pic I posted: it's done like that, so at the very worst it's 290W for a stock 290x, and that's a number you can work with as long as you know what it stands for.
>>52488745 Or maybe, just maybe, you're a fucking moron who tried overclocking the card at 2x its power limit and is quoting whole-system wall draw with an OC'd CPU as well, to obscure how retarded you are.
Either that, or you got a one-in-a-million card that was built so shittily and slipped by QA that it can't regulate its power consumption at all.
>>52488749 It's not so bad when a tech firm does it, AS LONG AS they've got variables that don't change outside of the GPU. Literally the way to do it would be: >Build rig, no CPU. Power it on, test power usage >Change nothing about the rig, install the GPU, test power usage >A - B = C However, I highly doubt anon does this.
>>52488749 See: >>52488694 That's the general figure you'll find everywhere, 600w is more than double that, you're either bullshitting or your methodology was stupid. >>52488754 A 290X consuming 600W would blow the power delivery on the card and overheat very fast, it would be very clear that the card is faulty, he has to be bullshitting or he has no clue how to measure power consumption.
>>52488770 You can't put any kind of real load on the GPU without a CPU, so that wouldn't work. The testing methodology that guru3d uses is mostly fine, it's not perfect by any means but it works and is fairly consistent. With proper lab equipment you can obviously do a better job, but barely anyone is doing that.
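If you want to see the shape of that delta method, here's a minimal Python sketch of it, assuming you only have a wall meter and a roughly 90% efficient PSU; the readings below are made-up placeholders, not real measurements:

# Estimate GPU-only power from two wall-meter readings (idle vs GPU load).
# Assumes everything except the GPU draws about the same in both states,
# which is only approximately true.
def estimate_gpu_power(wall_idle_w, wall_load_w, psu_efficiency=0.90):
    delta_wall = wall_load_w - wall_idle_w
    return delta_wall * psu_efficiency   # strip out the PSU conversion loss

# Hypothetical readings for a 290x system:
print(estimate_gpu_power(wall_idle_w=110, wall_load_w=430))
# 288.0 - close to the ~290 W figure quoted above for a stock 290x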
>>52488503 >>52488624 If you try to kill any chip, it'll consume massive amounts of power. Using the cards for what they're intended for will net you less than 500W. If you want to push the card to chew as much wattage as possible, then go for it. Just don't present that shit to normal people who use GPUs normally.
I honestly don't know if I'll buy another AMD card after they decided to stop supporting my crossfire 6950s while one of them still performs as well as their current mid/low-end offerings.
I understand not supporting older hardware, but I wouldn't be as mad about it if the "stable" driver they have for Windows 7 hadn't been released in a broken state (HDMI audio doesn't work). I've tried the beta Crimson driver and it fixes the audio issue but breaks crossfire. It might have been updated since I tried it, so I might try it again soon.
To give a little comparison, look at when the last driver update was for some Nvidia cards released in 2004.
>>52488920 You can literally buy a 7850 for like $80 that is loads better than your 6950, or shit, you can even get a 7950 for fucking ~$100, even some of the better models. You can sell your 6950 and pretty much just make a straight swap.
>>52485889 That's because they are superior. The only reason Nvidia cards perform better in gaymes is because Nvidia has their proprietary cancer that is GameWorks already injected everywhere. The Fury X, for example, has so much throughput it should be able to crush any Nvidia card, but nooooooo.
It seems to me that AMD is underrated, mostly in online marketing. Personally I was a little let down by the recent GPUs that AMD released, but it's still undeniable that a lot of what AMD offered before the 300 cards was outstanding (i.e. the HD 7970 and R9 290X). I'm not saying that the 300 cards are bad; actually they are pretty kick-ass for non-gaming applications, like using OpenCL acceleration in Adobe programs. I feel that AMD's underrated reputation comes partially from the YouTube community (i.e. Linus Tech Tips), but that could be attributed to Nvidia's superior marketing skills, however damaging they may be to the overall market. With all that said, come at me Nvidia fanboys, I can't wait to see you all make fools of yourselves.
Everyone says "power consumption" and "heat", but those arguments are retarded.
An extra 50-70W isn't going to change anything on your bill. "Heat" can refer either to the heat generated by the card's power draw or to the temperature of the card itself. The latter is addressed by aftermarket coolers; it was only an issue for the reference 290/X cards with the blower design. The former doesn't affect you in the slightest. The difference in heat output vs a similar card is so microscopic, it's just hilarious.
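To actually put a number on that "extra 50-70W" claim, here's the worst case worked out in Python; the electricity price and daily gaming hours are assumptions, so swap in your own:

# Yearly cost of an extra 70 W of GPU draw; usage and price are assumed.
extra_w = 70              # worst-case extra draw vs the competing card
hours_per_day = 3         # assumed gaming time per day
price_per_kwh = 0.12      # assumed electricity price in USD

extra_kwh_per_year = extra_w / 1000 * hours_per_day * 365
extra_cost = extra_kwh_per_year * price_per_kwh
print(f"{extra_kwh_per_year:.0f} kWh/year, about ${extra_cost:.2f}/year")
# 77 kWh/year, about $9.20/year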
Another one I've heard is driver issues. A couple of years ago that might have been the case, but I very much doubt it is now; over the last couple of years, AMD's drivers have been pretty stellar. I've had my share of both Nvidia and AMD cards (AMD being my current setup). I've had some minor issues with both companies, but overall I'd say they were both pretty good. I've heard the current driver issues lie with Nvidia cards and Windows 10; scanning over the official Nvidia reddit page, that seems to be the case.
Current-gen Nvidia cards, I think, are overvalued and AMD cards undervalued. If you value your money, you'd choose wisely. Hearsay and opinions will have to be discarded if you want what's best for yourself, both for current games and for future ones.
>>52485889 Radeons are better designed in an ideal world where DX11 driver efficiency was never an issue, GameWorks never existed, and games were limited just by available shader power.
In reality, however, you have one vendor with slightly beefier tessellation/ROP throughput paying developers to flood the scene with tons of 1-3 pixel triangles that don't improve visuals but choke the other vendor's GPUs.
If Polaris doesn't reverse this trend, AMD is probably finished, no matter how much they might consider themselves to be taking the high road or whatever.
You're actually going to see a bit of a reversal in power consumption/heat and performance with Pascal and Polaris.
AMD is building brand-new core designs radically different from current GCN, and they are putting them on a superior process to the 16nm node Pascal will be on.
Speculation is that Nvidia is refreshing Maxwell for the smaller node, which means they will likely say fuck it to power efficiency and heat and just scale up Maxwell's CUDA cores and drive core clockspeeds even higher than they currently are, in order to beat Polaris in synthetic benchmarks.
So Pascal will be very inefficient, possibly even a housefire meme waiting to happen, in exchange for possibly being the first 2GHz GPUs in the world,
while Polaris will be further behind on clockspeed but have shaders that are an order of magnitude more efficient, with more of them packed into the die.
If my assessment is correct, Pascal will cost an arm and a leg but be technically faster on desktops simply by pushing insanely dangerous clockspeeds to make up for the architecture's other shortcomings, while AMD's Polaris will totally outclass it in the mobile variants that go into gaming laptops.
>>52491770
>radically different from current GCN
No.
>AMD's Polaris will totally outclass it in the mobile variants that go into gaming laptops
Pascal will be everywhere, but it's destined for FP64 compute, which will nevertheless be blocked on every non-professional card. SURPRISE!