So let me get this straight:
AMD has been selling this exact same card (HD 7970) since 2013, but just renames it every year. Even worse, the new cards are 7970s with disabled chips.
So the brand new high end AMD 300 series cards are just gimped 7970s. The 380X is based on the Tonga GPU (found in the 285) but with more CUs enabled; Tonga has only a 256-bit bus, compared to the 384-bit Tahiti chip found in the 280X/7970.
Why do AMD fans eat this up and buy new AMD cards? Are they just confused by the rebranding?
This is b8
Go look up how GPU/CPU binning works idiot.
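Binning, roughly sketched: every die off a wafer gets tested, and dies with defective compute units have those units fused off and get sold as a lower-tier SKU instead of being thrown away. A minimal Python illustration; the CU counts and 2% defect rate are made-up numbers, not any real product's:

```python
import random
from collections import Counter

FULL_CUS = 32       # hypothetical full die (flagship SKU)
SALVAGE_CUS = 28    # hypothetical cut-down SKU: 4 CUs fused off

def bin_die(working_cus):
    """Assign a tested die to a SKU based on how many CUs passed."""
    if working_cus >= FULL_CUS:
        return "flagship"
    if working_cus >= SALVAGE_CUS:
        return "salvage"   # same silicon, defective CUs disabled
    return "scrap"

# Simulate a wafer: each CU independently has a 2% defect chance.
random.seed(0)
wafer = [sum(random.random() > 0.02 for _ in range(FULL_CUS))
         for _ in range(1000)]
print(Counter(bin_die(die) for die in wafer))
```

Most dies land in one of the two sellable bins, which is exactly why "the new card is a 7970 with disabled chips" is normal industry practice rather than a scam.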
Manufacturers don't want to spend money researching lower-end/mid-tier cards or building an architecture that's more efficient, so they do it less often; plus, AMD doesn't make as much off of low- and mid-tier cards. Hence why they do some tweaks and just rebrand.
Bus width does not usually matter. Look at nvidia.
Also, AMD cards are much faster due to their compute power.
>Meanwhile the rebranded 390x is trading blows with 980 in some benchmarks.
Both NVIDIA and AMD have always been doing this. It's nothing new. Did you really think they throw away all their tech and start over for every new card they bring out?
No, only AMD does this dishonest shit, and their current CPU lineup is just overclocked versions of 3-year-old chips.
Nvidia rarely reuses architecture, and when they do, they downsize it to the next lithography process for lower power consumption and sell it as a low end budget card with an xx5 part number, e.g. the GTX 745.
They also come with hardware improvements too.
e.g. the geometry hardware in the 950 is about 3x better than the one in the 780 Ti, which is why you sometimes see the 950 on par with the 780 Ti.
>Nvidia rarely reuses architecture and when they do, they downsize it to the next level lithography process for lower power consumption and sell it as a low end budget card with an xx5 part number, ex GTX745.
Geforce 2 MX -> Geforce 4 MX
Geforce 8800 line -> Geforce 9800 line (straight up rebrands)
Geforce 8800GT -> 9800GT -> GTS 250
GT 230, GTS 240, and GTS 250 were all G92 rebrands.
AMD is like Apple back in the 90's. Only die hard fans actually gave a shit about them and their inferior products.
The truth is we NEED AMD, otherwise we'll have an Intel/Nvidia monopoly which is bad. So thank the AMD fans for keeping competition going.
AMD has the most power-efficient GPU right now. In addition, Tom's shillware just tested lowering the voltage on the Fury and got it as efficient as the 980 while still outperforming it.
>no gimping drivers for older cards
>cards get better with time
>work better with dx12 and asynchronous compute
>higher performance for a lower price
>not trying to destroy competition and progress with underhanded proprietary bullshit and bribes
>le wood screws
>"lol quit being so poor"
Fanboys, ladies and gentlemen.
>They are also more expensive
Are you retarded or something? Of course they'll be more expensive, you dumb fucker; it's a new architecture. How the fuck else do you think it will pay off?
>don't support DX12 properly
As if AMD cards can.
>rely on marketing
So now having video cards that work properly is marketing?
As if AMD's APIs, which by the way stick 100% to the standard Microsoft sets for each DX feature, work any better than Nvidia's plugins. Let me let you in on a little secret: they don't, and Microsoft is really strict about making sure implementations are exactly as the papers say.
Also, complaining about "driver cheats" is like complaining about AMD writing Mantle. Which, by the way, let's not forget that while they offered it to Nvidia, they wouldn't allow them to modify the code to adapt it to Nvidia cards, making it completely useless, as it was written for a completely different architecture.
Let's not forget, too, that Nvidia has offered every piece of tech they develop or buy to AMD, and AMD has refused every time, causing a lot of inconvenience to people who buy AMD GPUs because they refuse to keep up with the times; then years later AMD plays catch-up. Let's not forget how HBAO destroyed any GPU: Nvidia fixed it with dedicated hardware, AMD whined about it, then years later they did the same. Now it's literally the same with tessellation: Nvidia does better than AMD by using a geometry engine, AMD whined about it, and now they're adding a geometry crusher in Polaris to do it, and you people will praise AMD for playing catch-up again.
You mean selling similar performance since 2013, with better power consumption. By this logic we should be saying the same thing about the 780 Ti and the 970 and anything similar with Nvidia; they both play the same game.
>AMD keeps rebranding their cards but still keeps up with Nvidia's performance
>all their R&D gets poured into the top end of chips instead of being watered down into smaller, shittier architectures like nvidia
These days the now ancient 7970/280x fights a 970 lol.
Don't forget how they reuse their GPUs.
Gk104 was used for the gtx 680, 670, 660 ti. The 670 was then rebranded as a 760, and the 680 was rebranded as a 770.
They then made the gk106 for the gtx 650, gt640, and gt 730.
Gt 630, 620, and 610 are all fermi rebrands.
At least AMD only does it once.
7970 -> 280x
285 -> 380
290x -> 390x
Also doesn't make sense to bump up a 260x to a 370, if pitcairn is still in the game it'll be a 360.
I'm using a 260x and it is a pretty based card for the price I paid.
>they don't realize technology hasn't advanced in years because we've hit a gigantic wall and processor and videocard makers are desperate to sell new product
EVERYTHING is a barely incremental rebrand now. EVERYTHING.
It's over, we've hit the end of computing until some radical redesign gets cheap enough to allow us to continue.
I hope you realize what you're saying is physically impossible, and the article you're using as a source didn't even try it, because you can't undervolt using MSI Afterburner; you have to manually edit the BIOS and change the voltage for each of the 35 or so clock states.
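For the curious, the point about clock states: a GCN BIOS stores a table of clock/voltage states, and a real undervolt means lowering the voltage entry for every state rather than dragging one global slider. A hypothetical sketch of the idea (the clocks and voltages below are invented, not from any real BIOS):

```python
# Hypothetical DPM-style state table: each entry pairs a clock with a voltage.
dpm_states = [
    {"clock_mhz": 300,  "mv": 900},
    {"clock_mhz": 700,  "mv": 1050},
    {"clock_mhz": 1050, "mv": 1200},
]

def undervolt(states, offset_mv):
    """Return a new table with every state's voltage lowered by offset_mv."""
    return [{**s, "mv": s["mv"] - offset_mv} for s in states]

# A -50 mV undervolt touches every state, not just the top one.
lowered = undervolt(dpm_states, 50)
```

A global software slider only shifts the top state, which is why a proper undervolt on these cards meant a BIOS edit.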
Mantle wasn't open source when it was offered to Nvidia though.
FreeSync is something AMD came out with 18 months after G-Sync.
TressFX isn't even comparable to Hairworks, TressFX looks like fucking carpet when used on stuff other than hair unlike Hairworks.
if the 280x was made any better it would have just cost amd money.
280x has been and still is the best bang for the buck gpu in the past couple years.
where amd fucks up hard is in x and non-x cards. we didn't want a 290, and we didn't want a 390.
should have had a 380x (oc'd as fuck 280x), a 390x (same), a fury, and a fury X.
the 390 and fury nano don't serve much purpose.
>290x master race, bought it at the right time and right price
No, where they fucked up is not having the 380 be the 380X right from the start, with the 280X rebranded as the 370X; the 390X available with 4GB and slightly lower clock speeds, or 8GB at the full 390X clock speeds; the Fury selling for a lot less with 3/3.5GB of VRAM, letting them use GPUs with a fucked bit of memory; the Fury X as is; and then releasing the Fury Nano and Fury X2 using the same silicon, back in 2015.
>Are they just confused by the rebranding?
Can you really blame them though?
This is the most asinine naming scheme ever. The generation is actually the 2nd number whereas the "segment" is the 1st number.
You may hate nvidia, but fuck, their naming system is so elegant compared to this confusing pile of shit.
that's some nvidia level branding right there, anon.
>make X/Ti card not an X/Ti card next time around
>downgrading numbers by one digit
anyway wccftech (lel) already confirmed (double lel) a $1700 flagship dual-GPU card from amd that's about to wreck everything, and it will cost 1/3 of that a year from launch like they always do.
>not running a 495x2
the 980ti is a great card if you only want to play games.
if you want to do some 3d modelling/assembly software, and video composition/editing/rendering... you will get significantly better performance in texels, GPU render, wireframe, Iray, and viewport from the titan x. enough that you can get the titan x instead of a 3000 dollar quadro. which means if you're doing those non-game tasks, but also like games, the titan X is a perfect card for you.
also, the 384-bit bus and 12gb of ram mean that if you have the cores, the ram, and the QPI GT/s to accompany the titan X, it will outshine the 980ti in some games with massive textures and detail.
some people are willing to pay the premium just for that extra bit of performance over the 980ti, if i was just playing games i'd prefer SLi 980ti over a single titan X.
but like i said for some people that do visual and 3d media as well as game, the titan X is the pinnacle right now.
my 280x actually got better over time through driver updates. I remember I got it around the time Titanfall released and was so disappointed in the framerate and artifacts, as well as BSODs all the time. Turns out Titanfall was just a shit port. Was going to get Nvidia for my next build, but after learning how protective they are about their technologies, I'm rethinking that decision.
>that's pretty much the same thing as the i3, i5, i7, but nobody gets confused by that
Intel earned its naming system by being at the top of its game. Second of all, it's strongly correlated with core count, so it's incredibly intuitive.
R5, R7 and R9 are just relative distinctions, i.e. R9 is supposed to always be better than a R7 WITHIN A FAMILY. Just "better", no actual intuitive structural note that can be elegantly memorized and debated among enthusiasts. This is why it's easy to rebrand and confuse AMD consumers. Most of them have no idea what they're buying in the first place.
I bought a 380x on sale, first amd card in years, I'm pleased.
For 190 I couldn't find a better card with at least 4gb, so whatever.
Runs cool, does what I want. It's mostly a placeholder till next cards come out, thought about a 970/390, but couldn't justify it with what I do on my computer.
the 280x will be obsolete for 1080p gaming long before it's technically not powerful enough.
it's got the specs to stay relevant, but as usual, shitty optimizations will be its downfall, not actual hardware performance.
i push the limits of my 290x all the time and honestly the 280x would be hard to go back to, the 290x is pretty much a solid 150% performance gain at anything i throw at it, mainly a modded to fuck all skyrim.
if you have the extra money to spend, get a 290x. sadly they seem to be going for 225-250 USD used and you can get a 280x for 125-150 used.
even these fucking reference cooler 290x's are holding their value on ebay, tried to lowball one yesterday and got shot down. they're overpriced, they should be in the $180 range for blowers and 200 for fans.
this if you can find one for sale, though most people selling them know the value of them vs the 280x and that they're the same card, equalizing values.
my Sapphire 280x showed up as a 7970 while my PowerColor 280x showed up as a 280x; both CF'd just fine, in case anybody is wondering.
390 is better price/performance (nvidia fanboys pls go, i have sli 970s), but if you're fine with what you have already then there's no reason to upgrade. If you want to upgrade anyway you might as well wait until polaris/pascal come out to see which is better.
Price and updated support, is it that hard to understand?
The only reason I got an R9 280X was because my former GPU was an HD 6870 that was lagging behind.
I will stick with the R9 280X for at least 3-4 years. Back then the R9 380X wasn't available.
There's also a lot of people who won't keep their GPUs for more than a year because they lose too much value for reselling. If you are a good dealer, it's better to just keep rebuying semi-new GPUs.
I've seen 7950s go for as low as 95 dollars on Amazon. Good shit too like sapphire, not the reference cards.
No. AMD uses crossfire.
Set FRTC to something like 45 frames per second. 35-45fps is much more steady than 35-60 fps.
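What a frame-rate cap like FRTC does, in sketch form: never present a frame sooner than 1/cap seconds after the last one started, trading peak FPS for consistent frame times. This is an illustration of the idea only, not AMD's driver code:

```python
import time

def frame_limited_loop(render_frame, cap_fps=45, frames=3):
    """Render `frames` frames, sleeping so no frame finishes early."""
    target = 1.0 / cap_fps
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()
        elapsed = time.perf_counter() - start
        if elapsed < target:
            # idle instead of racing ahead: frame times stay near `target`
            time.sleep(target - elapsed)
```

A steady 45fps feels smoother than bouncing between 35 and 60 precisely because every frame time lands near the same target.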
>Tonga GPU (found in the 285) but with more CUs enabled, which is only 256-bit compared to the Tahiti chip found in the 280x/7970, which is 384-bit
Tonga has a smaller memory bus, but it also has delta color compression to make up for the lower memory bandwidth, so that's not really an issue. This is the exact same thing Nvidia did on Maxwell cards, which is why the 960 has a 128-bit bus and the 980 has a 256-bit bus. Smaller buses + compression is more power efficient than larger buses.
Tonga also has much better tessellation performance than Tahiti, as well as many smaller improvements from the GCN 1.2 architecture compared to GCN 1.0 (Tahiti). Among them: better DX12 support, FreeSync support, and even that audio thing nobody actually made use of.
And even if it were literally just a rebrand, so what? The 380 still wipes the floor with the 960 for the same price. It would still be the better card anyway.
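The bus-width point is easy to sanity-check with back-of-envelope math: raw bandwidth is bus width times per-pin data rate. The per-pin clocks below are approximate retail specs, and the 30% compression saving is an assumed round figure for illustration, not a measured number:

```python
def bandwidth_gbs(bus_bits, gbps_per_pin):
    """Raw memory bandwidth in GB/s: (bus width in bits / 8) x per-pin rate."""
    return bus_bits / 8 * gbps_per_pin

tahiti = bandwidth_gbs(384, 6.0)   # 280X-class: 288.0 GB/s raw
tonga  = bandwidth_gbs(256, 5.5)   # 285/380-class: 176.0 GB/s raw

# Assume delta color compression saves ~30% of color traffic (hypothetical
# round figure); effective bandwidth scales up accordingly.
tonga_effective = tonga / (1 - 0.30)
```

Even with that assumed figure, the narrower bus closes most of the gap to Tahiti's raw number, which is the whole argument for bus + compression.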
>Second of all, it's strongly correlated to core # so it's incredibly intuitive.
Are you fucking retarded?
>desktop i3 = 2 cores
>laptop i3 = 2 cores
>desktop i5 = 4 cores
>laptop i5 = 2 cores
>desktop i7 = 4, 6 or 8 cores
>laptop i7 = 2 or 4 cores
How is that related to number of cores in any fucking way, you clueless retard?
It's not called a rebrand if you change it; it's called a refresh. Please be more correct next time.
Also, only the GPU core and GDDR5 chips are the same; there are some minor tweaks with better transistors, some capacitor changes, and various other changes.
It's not a full rebrand per se, but ya know, buzzwords and so forth.
As an owner of a Super OC R9 380, it in no way can compete with a 380X.
But yes, it (the R9 380) kicks arse at 1080p. Then again,
the R9 390 is the better card IMO.