Why are people so negative towards AMD graphics cards these days? Did the no drivers meme really hurt their image forever?
Because Nvidia bribed software developers. Games like Witcher 3 and Fallout 4 purposely run like shit on AMD hardware due to stupid settings. Users can compensate by changing the max tessellation level in global settings, but benchmarks are done without this. Basically, Nvidia is doing to AMD in the GPU market what Intel did to AMD in the CPU market back when AMD actually had better hardware than Intel.
ATi had the no drivers meme as well, and yet people loved them a lot more than Nvidia.
It's just that ever since AMD took the reins of the company, they've only had two series of cards that could truly make the ATi brand proud - HD 4000 and HD 5000. The rest were either bad (HD 2000, HD 3000, HD 6000) or "ok" (HD 7000, 2xx, 3xx).
Doesn't support gameworks
High power consumption
Cancerous user base
Pretty much awful for gaming
Also awful for productivity because no CUDA
It's basically the Coby of GPUs
>amd optimized game
>runs beautifully on all systems, even on toasters
>nvidia gameworks game
>runs like shit on anything but high-end hardware
>expect your new gpu to get deprecated within a year
Gameworks and Nvidia abusing their market share killed AMD.
Also false advertising concerning HBM and the Fury X, where they outright lied about the performance, and priced it as high as the 980 Ti, which was superior in every regard. Yeah, the benefit is water cooling, but it still overclocks like shit.
I'm waiting for Polaris. If that shit flops for whatever reason, then the company is done for. Sad, but true.
Fucking this. Can't even use addons to reduce graphics to playable levels on AMD laptops in Fallout 4 like you can in Skyrim. 1.5GB of my RAM is dedicated to my A8/R5 APU and it runs pretty much any modern game I throw at it on medium to high settings @720p with a steady 40-60 FPS depending on the game, yet with all of the settings at low, 800x600, and a graphics-disabler add-on, I still get like 15 FPS max in Fallout 4.
Meanwhile, my other laptop with an i3 and Nvidia graphics gets 60 FPS on low at 800x600, same specs otherwise.
>have 4 year old nvidia GPU
>install new 2015 drivers
>play 4 year old game
>uninstall 2015 drivers, install 2012 drivers
>gain 15 FPS
so it was true
>It's OK when nvidia does it.
AMD releases patches for their games themselves, while nvidia has to whine and tell their users to "not buy the game". Bullshit.
AMD's code is open-source. Nvidia's GW is closed source.
They got hit with their own shit and didn't know how to deal with it.
I can't see how the 6000 series was bad. They basically dropped the 5870 and 5850 down $100-$200 and then added two significantly more powerful cards on top of them. And even the "weak" 6850/6870 were good for their compute performance, whereas Nvidia's highest offerings were absolute shit for anything other than gayman.
Nvidia just uses a lot of tessellation in their GameWorks features, and AMD cards have 2012-tier tessellation performance. Not really the fault of Nvidia. It's architecture differences.
Because Nvidia has the best new features goy-er-guy
And didn't you enjoy our enhancements to that latest Batman game?
Oh wait, we completely broke it and forced steam to institute a refund program? lol whoops our bad.
We'll just keep making shitty closed source hair FX systems to artificially bog down performance along with gimping cards via drivers to fool you into parting with more of your money
Performance not up to snuff?
You just need a third card in that SLI setup, heck, make it four just to be safe! Future proof anon!
AMD is fucking garbage. Everything they make is absolutely, objectively crappy as shit and only a fat neckbeard would buy this shit-tier crappy shit because he's afraid of running out of autismbux (because God forbid he spend less money on Taco Bell and Mountain Dew.) I tried building an AMD gaming rig once and it was the worst experience of my life. First off, that CPU clip is fucking impossible to use. It looks just like an old Pentium CPU clip, but it's autistically retarded and I couldn't deal with it. And the graphics card was so fucking huge, I had to cut a hole all the way around my case to fit it in. Fucking AMD.
But wait, that's not even the best part! When I tried to start up my shitty autismbox (which I sold to the Association of Retarded Citizens BTW) the absolutely massive power draw dimmed every light in my house and throughout the entire neighborhood like I was firing up fucking ENIAC in the 1920s or some bullshit. I had plenty of time to worry about my power bill too, because this shitty thing couldn't figure out which of its 32 CPU cores to use and it took half a day to boot. As soon as I tried to play a game (fucking Starcraft from 1998), it lagged so hard it went backwards. I thought I should try downloading a driver, but there aren't any. There are no fucking drivers. What the shit, AMD? No wonder Apple doesn't use your shit. It wasn't long after that that I realized this thing was putting out more heat than a goddamn jet engine and it burned my house down.
Two weeks later, when I sold that piece of shit, I built an Intel/Nvidia rig! Holy shit, it's like jacking into the Matrix! I built it 5 years ago and it still runs every game in 4K at 240 FPS while using no power and it actually cools my house during the summer. What a beautiful machine, I seriously hope none of you are poor and stupid enough to buy AMD.
Nvidia's Apple-like brand image leading to blind preference towards GeForces mostly.
Circlejerking with naive kiddos wanting to be on the "perceived cooler" side (which is Nvidia because its brand has more mindshare) to be cool themselves.
And the emotional investment that fuels stupid fan-war antagonism. There's the people-being-dumb problem in the background, as always.
First off, I've been a hardcore AMD supporter for quite some time.
Complete video card history:
GeForce2 GTS
GeForce4 Ti 4200
380 (short-lived)
As you can see, I've been straight AMD/ATI since 2006.
Unfortunately, at least for now those days are over.
The hardware has always been top-notch, but sadly the drivers have been wanting for some time now.
I'd say the issues really became prevalent around the 5850's launch.
Shortly thereafter, it became a constant game of "finding the best driver for ______ game". Once I found the one that worked best for me, I stuck with it.
The 10.4a driver for Battlefield: BC2 on the 5850 and the 12.8/13.12/14.12 drivers for the 6950 immediately come to mind etc.
I decided to give AMD one more chance and picked up a 380 4GB about two weeks ago.
There were no HUGE issues really, just a lot of little hassles.
So yesterday after much contemplation, I packed up the card, took it back to Microcenter, and exchanged it for a 960 4GB.
Make no mistake, I am fully aware of the 960's 128-bit bus and the fact that it is approximately 15% slower than the 380.
But, between Nvidia's much more frequent driver updates, market dominance, Fallout 4's apparent system requirements, and the overwhelming ease of use/lack of issues, I decided to switch over to the green team.
I have to say, I am very pleased with my decision thus far.
Please, don't flame me. I just felt the need to express my opinions/impressions etc.
Make no mistake, if AMD gets its act together, I'll gladly switch back.
But for now, it looks like my love affair with AMD hardware has finally come to an end.
The funny thing about this is that those games run better on my 290 than my friend's 780, but both of those are far inferior to the 970.
And my friend specifically got the 780 for GameWorks. He traded it to a friend for a 290 he had. Technically he could still sell the 780 for more than the used price of a 290 despite all of this.
What little hassles did you have? I've had no problems with that card, and went the other way around from a 960 to a 380.
And why even buy the 4GB 960, if it's just snake oil? Should have saved money and got the 2GB 960
>Did the no drivers meme really hurt their image forever?
Their Crossfire support is the shittiest it's been in years. You have to wait months for some profiles and even then they may not appear at all.
Not the guy you asked, but crimson fixed BSOD issues on my crossfire 290's while playing at 4k.
It did however make Battlefront unplayable, but no one plays that so it's not really a big issue.
It introduced super easy monitor overclocking. Got my 60hz IPS panel to 75hz in 15 seconds with the built in tool. Free performance feels great, and I had no idea this was supported by Crimson or my random LG monitor.
Other than that, not much. I wouldn't usually notice a 2-3% improvement anyway.
>literally every graph shows the newer drivers improving the performance by at least 10% and often more
I don't get it.
Also, that just shows Nvidia's drivers, which are often released with the card, whereas AMD can't seem to sync their driver releases with their cards and it usually takes a month or two for decent drivers to come out
Crossfire DOES work very well when it is properly supported. That much is true. But AMD support is crap. They release profiles weeks or months late, if at all and even then the first version may be buggy or glitchy. The bugs and glitches sometimes never get resolved at all.
Witcher 3 for instance STILL has some UI flickering issues and enabling CF breaks the awesome water simulation.
I can't fucking wait for new high-end cards to come out so I can dump my 290Xs and get a decent upgrade in the process.
How did you manage that? My Iiyama monitor is 75 Hz according to its specifications but I can only select 60 Hz in Windows and haven't found a setting in Crimson for this.
I tried creating a custom resolution in Crimson with 75 Hz refresh rate, but that doesn't work either.
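Not him, but a custom mode stands or falls on the pixel clock your link can carry. A quick sketch of the arithmetic (assuming the standard CEA 1080p blanking totals of 2200x1125; your panel's actual totals may differ, and the function name is mine):

def pixel_clock_mhz(h_total, v_total, refresh_hz):
    # Pixel clock = total pixels per frame (active + blanking) x refresh rate
    return h_total * v_total * refresh_hz / 1e6

for hz in (60, 75):
    print(f"1080p@{hz}Hz: {pixel_clock_mhz(2200, 1125, hz):.1f} MHz")

# 1080p@60Hz: 148.5 MHz
# 1080p@75Hz: 185.6 MHz  (over single-link DVI's 165 MHz ceiling)

So if you're on single-link DVI, or the monitor's scaler rejects the higher clock, 75 Hz at full blanking simply won't work. Try a custom resolution with reduced blanking to pull the clock down, or a different connection.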
>finally get the money to upgrade my shitty ZOTAC 820
>new architecture coming out this year
>dunno if i should just buy a 960 and be done with it or wait and see
Ahhh, well they fixed it anyway. I installed the driver and had an issue of temps rising to 94C after fucking around with some OCs, which I was able to fix with a custom fan profile. I think it only seriously affected people who OC, but who also don't check temps while playing with OC/fan profiles.
Because they are total bitches. Nvidia® keeps their inventions closed source so they can have the upper hand, but what does AMD do? They make theirs open source and let themselves get walked all over. They are total doormats and Nvidia is winning because they let them.
Windows has some basic drivers compatible with pretty much every single GPU ever made. They don't support much and you certainly won't be able to game or watch videos but your machine will work. Barely.
Crimson drivers didn't kill shit. MSI afterburner causes a conflict. In this case I'd be more inclined to blame the third party software but then the EULA does explicitly state that it voids your warranty and assumes no liability.
Oh, you mean the good old windows 10 bullshit where it redownloads the drivers that it wants as opposed to the ones you want? Yea, the fix for that appeared within 2 days, but MS's bullshit made the problem continue for a lot longer.
While we're on the topic: http://www.theregister.co.uk/2015/07/28/windows_10_update_nvidia_driver_conflict/
The real problem with Nvidia is the Windows 10 drivers. I've checked Reddit's Nvidia forum and it's mostly about Windows 10 driver issues. There's even a sticky thread addressing them.
However, people still continue to say "amd drivers". I've been using AMD since forever and the only time I've had issues was due to overclocking instability (which is fixed via voltage/clock speed control).
Nvidia drivers are horrendous right now.
Nobody had their GPU burn up, it was just dumb redditors trying to get free stuff by making shit up. All modern GPUs have thermal shutoff well before they could do any damage to themselves.
Every AMD optimized game I played runs beautifully (BF4, Tomb Raider, Deus Ex HR, DMC fuck you edition, etc)
I can't remember the last time an Nvidia GameWorks game ran great for everyone, including Nvidia users who don't have the latest $500 GPU. Ass Creed Unity was a trainwreck, Lords of the Fallen literally ran like shit for everyone even on GTX 970s, Fallout 4 runs and looks like garbage unless you mod it, and ARK still has issues out the ass related to GameWorks.
>All modern GPUs have thermal shutoff well before they could do any damage to themselves.
Not him but this is what I thought as well. Every chip since Fermi has had this so why the heck would these cards suddenly up and die like that?
I can't remember the last time an Nvidia GameWorks game ran great for everyone, including Nvidia users who don't have the latest $500 GPU. The exception is probably MGSV, but that game was optimized for the new consoles first (which use AMD hardware).
For the last 5 years at least, TWIMTBP and HairWorks/GameWorks have targeted a rather particular design difference in Nvidia: higher tessellation bandwidth and more granular ROP/TMU blocks.
AMD ROPs are 4x4 pixels, Nvidia uses 4x2 and has higher tessellator throughput.
By flooding scenes with tiny 1 or 2 pixel tessellated triangles, GameWorks titles can force AMD cards to lose 88-94% of their pixel throughput while wasting only 75-88% of their own capacity.
That's not to say that Nvidia hasn't also pursued legitimate optimizations in high-profile games, but the core of the program has always been about trying to shoot their opponent in the head even if they have to shoot themselves in the leg in the process.
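To put numbers on that: a minimal sketch of the coverage math, assuming (per the post above) that a raster block shades its full 4x4 or 4x2 pixel footprint even when a tessellated triangle only covers 1 or 2 pixels. The function name is mine:

def wasted_fraction(block_w, block_h, covered_pixels):
    # Pixels in the block that get processed but thrown away
    block = block_w * block_h
    return (block - covered_pixels) / block

for covered in (1, 2):
    amd = wasted_fraction(4, 4, covered)   # AMD: 4x4 = 16-pixel ROP blocks
    nv = wasted_fraction(4, 2, covered)    # Nvidia: 4x2 = 8-pixel blocks
    print(f"{covered}px triangle: AMD wastes {amd:.1%}, Nvidia wastes {nv:.1%}")

# 1px triangle: AMD wastes 93.8%, Nvidia wastes 87.5%
# 2px triangle: AMD wastes 87.5%, Nvidia wastes 75.0%

Which is where the 88-94% vs 75-88% figures come from: both vendors burn most of their throughput on subpixel triangles, but the coarser 4x4 blocks burn more.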
Beats me, my pair of 7950s has been great. Thanks to some great drivers I was getting 980 performance before a 980 even existed, which gave me the boost I needed to run 3D with TriDef, and then they gave me the option of cheap mixed-res Eyefinity. The only bad things I have to say in what, 4 maybe 5 years, would be that legacy support for my old 6870 was a bit of a bummer, and the Crimson v1 fan issues.
Mix of things really:
>History of bad drivers going all the way back to the 90's
>Currently not as competitive as Nvidia hardware and has had long periods of this in the past
>Hasn't been as successful as Nvidia at getting game developers to use their extensions that run well on their hardware, but like shit on competitors' hardware
In essence it's a combination of doing a bad job technically and going up against a company doing a very good job both technically and being anti-competitive assholes.
JewWorks is not a meme...
I will always buy Nvidia™ because I only play games The Way It's Meant to be Played™. Nvidia also pioneers innovative new technologies like PhysX™, Gameworks™ and the highest quality driver to ever grace Windows.
When I boot up with a brand new Nvidia™ Geforce™, I can experience the game just like it's meant to be played. Nvidia™ also delivers a far more silky-smooth experience.
Nvidia Geforce™ is also very power efficient. A graphics card is the most power hungry device in your house. Refrigerators, air conditioners, water heaters, dishwashers, lights, etc all use significantly less power than a graphics card. Which is why Nvidia™ puts gamers first by ensuring that their gaming experience is of the highest quality while looking out for gamers by giving them the most value in their electrical bill.
At this point in time, there's really no reason to consider an AMD graphics card at all. I tried one once, it generated so much heat that it exploded. It also consumed so much power that it gave off an EMP and destroyed the rest of my computer.
Nvidia™ also pioneered how useless GPGPU is with CUDA™. Years ago, everyone thought GPGPU, CUDA™, and OpenCL were the future. Now, Nvidia™ has removed those useless features from their GPUs and increased efficiency. Now you can save thousands a year in electricity thanks to Nvidia™ ensuring that useless features like GPGPU are "optimized" for gamers.
It's quite clear that OP's an AMD shill trying to convince you to settle for something less than The Way It's Meant to be Played™. Nvidia™ is the only real way to play games. We have seen recently that they offer incredible libraries for software developers like Nvidia Gameworks. He is probably too poor to afford the Nvidia Geforce Experience and cannot afford to play any games The Way It's Meant To be Played™.
Don't be a poor gamer with bad drivers and a huge power bill. Play games with the Geforce™ Experience™: The Way It's Meant To Be Played™
>priced it as high as the 980 Ti, which was superior in every regard
The Fury X handily beats the 980 Ti at 4K.
At 1440p it's a tie. The Fury X was a little worse at launch because the 980 Ti was an older card with more mature drivers; there's not much in it now.
Originally you could say the Fury X couldn't overclock. But now that the Fury X has a voltage unlock, it'll overclock better than some (though not most) 980 Tis, so I don't think luck-of-the-draw silicon lottery things are useful comparisons.
The 'waiting evolved' meme is real, at least for the Fury-X.
The Fury, on the other hand, is only a tiny bit worse than a Fury X (the same when OC'd) and has pretty nifty price-to-performance because of it. As such, AMD is redirecting Fury X chips to Fury cards to meet demand.
Fiji wasn't a flop technically, just a bit of a flop due to shoddy marketing that results in people like you thinking Nvidia's card is better. The only thing about Fiji that truly sucks is the ROP count.
Nvidia has been actively brigading this board for years now.
The facts are that nvidia supports closed source standards and is an evil anti-consumer, corrupt company.
Meanwhile, AMD is for open source standards and is for the consumer in every facet.
There is no real choice between them.
The fact that everyone is finally starting to question this means there should be a comeback soon.
With AMD staying competitive with far less money than its competitors, I can't imagine the type of innovation they'd be capable of with the capital to back themselves 100%.
>The only thing about Fiji that truly sucks is the ROP count.
it's not just ROP count, it's also ROP design.
GameWorks tessellation has been a big enough drain on AMD for long enough that they can only blame themselves, even if in principle their design is more "ideal":
>Putting in shit that makes no difference, saturating meshes with polygons that aren't even renderable.
Feels great that those fuckers are getting a taste of their own medicine with async compute.
And I say this as someone who's been using Nvidia since I first got a PC in 2001.
I just turned it on with my 720p TV and it didn't do shit. Do you have to have 1080 -> 1440 for it to be noticeable?
I'm just trying to get more desktop real estate
>Curiously, their driver reported this feature was functional but attempting to use it was an unmitigated disaster in terms of performance and conformance so we shut it down on their hardware. As far as I know, Maxwell doesn’t really have Async Compute so I don’t know why their driver was trying to expose that.
LITERALLY untrue for anything 285(380), 380x, 390(x), Fury/X/Nano.
Unsure about the 370 but I'm pretty sure 360 and downward are only 1st gen rebrands as 260 downward were new architectures, weren't they?
TL;DR: I think ONLY the 370(x) is a rebrand of a rebrand.
Unstable drivers for the R9 series currently.
Crimson drivers maxing fan speeds 24/7 and wearing out card coolers in a matter of months.
Mandatory bloatware bundled with the drivers, which requires a 24/7 internet connection to function and will crash otherwise, and which contains buggy JS that poses security issues and includes adware.
The fact they use 120-150% of the power of comparable nvidia cards for 85-90% of the performance, all for a $10-20 price savings.
The fact that their current flagship card is the first not-100%-rebrand they've put out in 6 years.
>Get the cheapest used shit you can find
this, craigslist, ebay, whatever
it's not just the HBM2 cards, which may only be the higher end.
Polaris is definitely 14nm from top to bottom, Pascal is expected to be as well (though it appears to be slipping).
All the new cards will have twice the power efficiency, DP 1.3/HDMI 2.0, h.265 acceleration, amongst other improvements.
>Shit. I can still cancel but since my gpu died I have no gpu. What should I get?
My personal recommendation would be the cheapest second-hand card that can do the job well enough. If I hadn't bought a second-hand GTX 970 last September for GPGPU work but was still interested in buying a new card, I'd stick to my GTX 670 (which I should sell off at this point but can't be arsed to).
When you use VSR, is the new higher resolution going to be used for everything? I thought you just set VSR up and then when you go into a game you can choose a higher resolution than normal, without having to change the entire OS and everything?
It broke FO4, Insurgency, and TES4 for me. Fixed some insane ragdolling in Ark. Stuck my gpu fans at 100%, which can be overridden with Sapphire TriXX. Took my stable 1200/1665 overclock and made it unstable, can't get over 1135/1300 with Crimson. Gained about 5fps in Arma 3 and almost 20fps in Arma 2 DayZ.
Overall not worth it.
>run mantle in BF4
>fps goes from ~135 to ~137
>game crashes every 30 min or so
>change back to directx
>no crashes in 4 months
is 1440p downsampled to 1080p very noticeable in the aliasing department?
I wonder if the performance cost of rendering the game at 1440p is less than rendering at 1080p but with AA. Guess it depends on each game
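Rough first-order numbers, assuming shading cost scales with pixels rendered (this ignores MSAA's bandwidth-heavy but shade-once behavior):

px_1080 = 1920 * 1080   # 2,073,600 pixels
px_1440 = 2560 * 1440   # 3,686,400 pixels
print(f"1440p renders {px_1440 / px_1080:.2f}x the pixels of 1080p")
# 1440p renders 1.78x the pixels of 1080p

MSAA only multiplies coverage samples, not full shading, so 4x MSAA at 1080p is usually cheaper than shading 1.78x the pixels; in shader-light games the gap narrows, which is why it really does depend on the game.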
if you have money to burn, I guess buying whatever you want whenever you want is fine, but due to 28nm lasting fucking forever and 20nm being skipped, this is literally going to be the biggest generation jump in GPU history.
It's pretty noticeable and requires at least 2x AA in some games, and is a nonissue in others.
For example, running 1440p VSR in Insurgency absolutely requires 4x, but 1440p VSR in DayZ does not. Fortunately, my 390x is pushing 160fps in Insurgency and I can definitely afford it. In fact, I run 8x just to reduce the number of framedrops on my cheapshit 60hz monitor, since at 1440p VSR with 8x MSAA I'm down to ~90fps.
>If no drivers killed AMD's image, why didn't housefires and woodscrews kill Nvidia's image?
Because Nvidia can afford dedicated forum shillers, and they have had them in their employ for over a decade.
That, and woodscrews only really happened for one generation, while AMD drivers get their bad reputation because they keep one or two completely silly bugs stuck in there for ages - like digital panel image scaling being broken for years (you could only set the option on a non-native resolution, which completely defeated the point).
Also AMD GPGPU support is kind of a joke to this day.
It's not like Nvidia never did that, but then again you are probably too young to remember the Geforce 4 MX, the 9800 series (complete rebrands of the 8800 series), the GTS 250 (rebrand of the 9800, which was a rebrand of the 8800), etc.
>post an old drivers one
"Now we can stop with the 'AMD will be vastly superior to Nvidia in DX12' nonsense."
When I bought it I had wished I could keep a stable 60fps on things. For instance, something like Shadow of Mordor. Turns out I can't.
Hell, not even Tomb Raider 2013. Have to turn the shadows down a little, aside from the obvious no MSAA and no TressFX.
I used an AMD GPU for 3 years without a single driver issue
Then again it was a laptop GPU with drivers specifically tailored by HP so I don't think it's a good example
Anyway I ordered a 380X, we'll see how that goes. I'll probably get Nvidia next time I buy a card, just to be able to say I've used both so I'm not biased.
This. One reason why I sold my gtx 670 and bought an r7 260x instead.
My 670 was just too unstable on windows 10, constantly crashing or returning errors, especially if I ever tried using shadowplay.
Raptr might not look pretty, but it works a lot better than Shadowplay did for me.
>cringe marketing all year
>the 3xx are mostly rebrands of the 2xx with more memory
>higher power draw
>absolutely no reason to upgrade if you have a 2xx
>no good mobile gpu because of the poor power efficiency
>meanwhile even a desktop gtx 980 can go in a high end laptop
>meanwhile people still buy the 970 like hot cake even after the 3.5 scandal/lie
>release the fury x after the 980ti
>slower than expected even with HBM and 4096 cores
>not good at overclocking even with the AIO cooling
>the drivers weren't ready to match the stock 980 ti
>absolutely no reason to upgrade if you have a 980 ti or a 290x2
>release the 380 and 380x
>very good gpus but too late
>/g/ is surprised
I use an AMD card in the summer and an nvidia card in the winter. It really helps save on the light bill not having the heater running all the fucking time.
Let's put it this way.
Why would anyone who owns an AMD card proclaim that their card caused the 2015 California wildfires?
They wouldn't. Why would anyone who owns an Nvidia card say theirs does the same?
They fucking wouldn't.
It's shit made up by either side to discredit the other.
ATI cards have drivers
Nvidia cards don't start housefires
They're both a little different each round in different ways. Buy whatever.
They decided it's better to constantly waste 20-30 watts at idle rather than sometimes having a short flicker when switching clock rates.
Also they use a cheap single-link DVI port instead of 2 dual-link ones to save a few cents (which cost me $100+ extra in adapters).
As long as they do shit like this I will always buy Nvidia
>35~45 fps on high
>switch to mantle
>60 fps on ultra
On my hd 7950 mantle pushed my framerate from high 50s to low 60s.
I use frtc to lock it at 60 and it rarely drops below that in bf4 ultra.
My friend's gtx 960 on ultra gets low 50s and he's stuck on dx11.
I gave AMD a shot and bought a 390 5 months ago. Elite Dangerous, a game I played with friends quite often, was unplayable on AMD 3xx cards up until a week or so ago. AMD had an issue in their drivers since August 2015, only fixed a week ago, that made this game unplayable for 3xx owners. 6 months of shit drivers, with no workarounds. They publicly accepted responsibility for it 5+ months ago. I was never treated so poorly when I used Nvidia. Their no drivers meme is the truth, and when I swap my card in a few years I will not be going AMD again, even if I have to pay more for Nvidia.
Back when I used AMD, I remember having a spreadsheet of driver versions and what worked with each of them.
When Rage and Battlefield 3 were released the same day, they both crashed with the current driver. AMD went ahead and released two different drivers, one for each game - the other one would crash.
I grabbed the OpenGL DLL from the Rage drivers and overwrote the one from the Battlefield ones, and (much to my surprise) both games actually worked.
This was useful later on, as I'd grab the best available OpenGL DLL and put it with the best D3D driver (both of them usually several months old).
No thank you. On the bright side, this may greatly improve with D3D12, but we'll see.
because idiots would love to buy products as long as they're slapped with NVIDIA stickers/logos
and shits on any competing products while they're at it
literally apple-tier cancer fanboys
that kinda makes his point clear: 14-year-old gaymers/manchildren would buy anything by Nvidia even though they've been cucked hard
literally who cares, i can't tell the difference in my power bill after going to a 290 from both a 670 and a 750Ti
>have a gtx 770 (evga)
>buy a 390 (msi)
>the fps in cities skylines is now low
>the drivers are probably not optimized
>game crashes every ~10 min in gta v!!?!
>look in forums
>it's only a few 390 owners
>buy a gtx 970 gigabyte oc
They fixed the crash issue (3 weeks later, I think) with the new beta drivers, but that was fucking too late. I want to support AMD but sometimes they are incompetent for some reason.
Shit like this is why I hate my 390. It is the only thing in my computer I have ever regretted buying. I came from an EVGA GTX 770 and now GTA V runs like shit, the drivers constantly underclock my card while running games that aren't optimized quite so well, and Rainbow Six Siege likes to crash every few hours. When browsing the internet, flash videos run like shit, and more than 2 Twitch streams causes all of them to drop down to 2 FPS... the list goes on and on as to why I should have bought a 970 and not a 390, other than Black Ops 3 using 5 GiB of VRAM.
Honestly, the reason I buy Nvidia hardware is pretty funny. Back in the early 2000s sometime, I used ATI cards. But back then I wasn't as savvy with computers as I am now; I had never built my own tower and relied on those companies that prebuild them for you. So I had a cheap midrange tower built by cyberpower or whatever the fuck they were called back then. And after a couple of years, the thing ended up shitting the bed on me. I take it to the local PC repair place and he opens it up and finds that the video card had become so packed with dust that the cooling fan had seized up and LITERALLY melted onto the fucking card. Melted fan plastic all over the video card. No wonder it wouldn't display video. Since then I've been using Nvidia cards and never really found occasion to switch. They work well and the driver issues have been minor; there was a major issue back in my 8800GTX days where I had to use specific drivers to avoid a bluescreen problem, but other than that, it's been smooth sailing.
Getting a Sapphire 390 for 1080@60Hz mostly because it will serve my purposes and hopefully Vulcan/Mantle/GLNext/Vaporware might make lunix gaymen less shitty(steam+lunix+vulcan=my OTP)
Why do you faggots obsess over this so much? If you end up getting a card you don't like, just sell it while you still have warranty and get one from the competitor. All this fanboy shit is beyond retarded.
i always hear these horror stories.
i have a 390x and a 390 in crossfire, and except for anno running like shit on my amd rig, i have never had a single issue with the cards
I feel like the companies are trying to make cards for different markets.
Nvidia is castrating its compute performance in favor of improving vidya fps.
While AMD is trying to make OpenCL/DirectCompute workhorses.
that's pretty much impossible to deny since it's true.
Fermi and recent GCN were/are hot since they prioritized shader GFLOPS over everything else.
Maxwell is comparatively weak in shading but throws transistors at fixed-function graphics blocks like ROPs, TMUs, tessellators, and texture memory compression that do GPGPU zero good but make some games smoother.
Are you me? Just blew my 8800gtx and looking for a new card. Every time I've installed an amd graphics card in the past decade, something has gone wrong/required more work than necessary to function. Either the colors look too light or washed out (multiple amd cards) or catalyst control center crashed the computer every 20 minutes (multiple cards and machines)
Never had a problem with Nvidia until my card just went out after years of use.
I had relatively frequent driver issues with AMD, and my 7970 is my only GPU that ever decided to fry itself randomly just after the warranty was over. And no, I didn't even OC it or use it to mine those retarded bitcoins.
Now I'm back on the Nvidia team and the drivers are butter smooth, the "graffic center" isn't a giant mess, and the card is as silent as you can imagine even when gaming. The 7970 sounded like a freaking jet engine, and even then it ran kinda hot.
>amd is for poorfags
>nvidia is for richfags
>richfags don't like poorfags and viceversa
How hard is that to understand poorfag?
it's old, yes, but comparing its performance to today's cards, it's a $180-200 card that uses way more power but still works, and unless he's going up a tier it wouldn't really matter unless he really needs to for his games
This shit only happened to retards who activated Crimson's overclock while they were using some third party software for overclocking (like Afterburner)
Okay, guys. Currently I've got an AMD Radeon HD 6570, and I've been looking at the R9 290 and 290X. Are they acceptable, or should I look into something else? I'm a heavy gamer and I'm trying to get into animation and such, if that changes anything.
I have an r9 290.
If you're debating between the 290 and the 290x, just get a 290. The x only has like 3fps gains and in most games it's the exact same.
As far as performance, its about the best you're going to get without buying an enthusiast card like the Fury or the 980ti.
It will be a good card for whatever you need it for.
The only caveats being that Crimson is complete ass if you're worried about desktop management, and honestly, anything that depends on Nvidia's proprietary stuff you'll obviously miss out on.
How fucked am I? What's the next best card I should upgrade to?
it's not a meme you retard, their drivers are garbage, I can't even use overdrive in their current crimson release because it might fuck with my card and not handle the fans correctly
i literally can't use vsr with my CRT monitor because AMD won't allow it but if I fuck with the windows registry it works just fine, their drivers are just full of artificial restrictions
not to mention how nvidia always optimizes the games themselves but amd won't do it because of some stupid microsoft thing or some other bullshit excuse like that
>mfw crimson improved performance with all games
>mfw it did this while also reducing temps by up to 15c on some games (80c down to 65c now for example)
>mfw after Pascal releases
>nvidia will release drivers that decrease FPS and set houses on fire so people will buy the new pascal cards
f...feels good reassuring my purchase
everything i'm saying is 100% true fact, but honestly it's all just reassurance
Pretty hyped, i want Vulcan to succeed so i can switch to Linux.
I just hope AMD's CPUs get a bit more oomph to them; the games i play are pretty CPU intensive, and i want more incentive to switch from Intel's kikery.
>mfw the 390x got released and amd withheld the performance updates on the 2xx cards despite it being a rebrand
>they only released it a few months later after people called them out on it
shills destroying amd's image and the kids jumping on the bandwagon, recommending nvidia and defending their decision like it's life or death.
I wish consumers looked more at the price/performance ratio of a product and how a company behaves itself towards its customers. And that includes locking down tech like PhysX, which is nothing but a gimmick thanks to Nvidia.
i hate PhysX; unfortunately it has come to the point where a dedicated PhysX card will improve your fps in select games. Luckily i have a few mid/low tier nvidia gfx cards lying around, but i'd prefer not to have a mandatory nvidia card in my AMD system. It wouldn't be so bad, but it can negatively affect your minimum FPS quite a bit, which makes the difference between above/below 60fps...
I really hope Gaben saves PC gaymen with SteamOS+Vulcan(as i have recently taken to calling it)
The process to install the latest proprietary AMD Radeon drivers 15.9 on the latest stable Linux kernel 4.4 is as follows:
1. Patch the kernel. AMD's driver uses a kernel symbol that is GPL only and has been for quite a while.
2. Compile the kernel with GCC 4.9. The drivers break when the kernel is compiled with GCC 5.
3. Extract and patch the Radeon driver.
4. Temporarily switch /usr/bin/gcc to version 4.9 because there is no way to tell the installer to use a specific version.
5. Install the Radeon driver
6. Don't forget to switch GCC back to version 5
Thankfully the open source drivers have been gaining a lot of momentum and generally just work. Linux 4.5 will make them quite fast.
>Unstable drivers for the R9 series currently.
Which ones? They're all pretty stable to me, and I see little complaints elsewhere.
>Crimson drivers maxing fan speeds 24/7 and wearing out card coolers in a matter of months.
The issue was locking them at low values, not maxing. If you don't notice your fans are maxed out for months, you're retarded anyway.
>Mandatory bloatware bundled with the drivers, which requires a 24/7 internet connection to function and will crash otherwise, and which contains buggy JS that poses security issues and includes adware.
Raptr is not mandatory; it can be deselected during the installation or uninstalled afterwards.
>The fact they use 120-150% of the power of comparable nvidia cards for 85-90% of the performance, all for a $10-20 price savings.
The power usage difference is minimal. At idle, they both use the same power. At full load, the average difference is about 10-20W. The difference at the end of a year of intense gaming every day is less than $5 (rough numbers at the end of this post). And what the fuck are these percentages? What are they percentages of? How is one of them 150% and the other 90%? Which one of them is the 100% baseline? Are you retarded, or do you just have no idea what you're talking about?
>The fact that their current flagship card is the first not-100%-rebrand they've put out in 6 years.
And yet, the rebrands are still perfectly able to compete and win in the majority of tiers, with the exception of the flagship, where the 980ti edges out a bit, for now anyway.
Nice disinfo there, buddy; too bad you were wrong on all of them.
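For what it's worth, the arithmetic behind that sub-$5 figure, assuming a 20 W average gap, 4 hours of gaming a day, and about $0.12/kWh (adjust for your local rate; the numbers are my assumptions, not measurements):

watts_gap = 20              # load power difference between the cards
hours_per_day = 4
price_per_kwh = 0.12
kwh_per_year = watts_gap * hours_per_day * 365 / 1000   # 29.2 kWh
print(f"~${kwh_per_year * price_per_kwh:.2f}/year")     # ~$3.50/year

Even doubling the gap or the hours keeps it in lunch-money territory.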
>Not that I don't believe you, but can I have a source for that? I'm pretty hyped.
Tried it myself with the powerplay patches in "drm-next".
I could play XCOM in 1080p with high details, perfectly fluid
OpenCL works fast too, but so far only OpenCL 1.1
The new proprietary AMD drivers, due in 2016, will share the kernel code with the free drivers. This should yield MUCH better compatibility and stability
>Why are people so negative towards AMD graphics cards these days?
it sucks when you buy a graphics card and it's incompatible with your system because amd didn't think to make their pcie 2.0 cards work with pcie 1.1. it also sucks when you get a brand new amd card and it has a broken bios that causes it to constantly switch between 2D and 3D frequencies, which cripples performance. and then there was that time i bought amd's $270 flagship single gpu on the first day it came out. its day 1 bugs were never fixed, it had a half-assed implementation of gddr5 where the memory had to run at full speed even at idle, and it was a hot, loud, power hungry POS born during the time amd had issues with jittery frame rates. then, when they finally addressed that issue, they dropped support for my card, so thanks for that amd.
so basically personal experience is why a lot of people stay away. no company has ever fucked me like amd.
So, from the data we have right now, will Pascal and Polaris be a significant jump or just the same shit as always? How do GPU release plans usually go - top tier first, then a few months for midrange? If so, it might be a long wait, and then will they even be affordable with the new-stuff tax?
I'm wondering because I don't know if I should be getting a 390 now. I got my paycheck and I'm really fired up to buy it even though I don't need it THAT much.