Do you guys think AMD can get a sizable chunk of the market share this generation if Nvidia shits the bed with the 1000 series, or are consumers so used to ONLY looking at Nvidia's options that it'll stay that way?
AMD seems to have Polaris ready sooner than Nvidia has Maxwell. Even if it's only by one quarter, the large performance gap between the new and old generation means they'll still get a good chunk of market share back.
I have high hopes for Polaris. AMD have been planning for a die shrink for a longass time, it's why their 200 and 300 series cards were disappointing, they were the 'oh shit the next die shrink fell through' cards.
Nvidia, on the other hand, has not been planning for a die shrink; Pascal is the 'oh shit, there's a die shrink now' card.
Hopes are high for Polaris.
As for market share, I think nvidia is going to destroy AMD.
Nvidia cards are going to sell like hotcakes and AMD cards are going to be niche poorfag products.
Woodscrews 2: Firehouse Boogaloo
AMD does potentially hold the upper hand with a smaller process for Polaris (now that PX2 is a fraud), but it's fingers crossed as to whether they fuck it up.
Who knows, Zen + Polaris could swing 2016 back in AMD's favor if they manage to undercut Intelvidia while offering similar performance. Now that Broadwell-E has been *leaked to cost as much for 6/8-core parts as Haswell, it's once again AMD's chance for a $500 8-core.
*but then again it was "leaked" that the Titan X was to cost $1350 and everyone was sure that it was true right until the announcement, so grains of salt.
>Do you guys think AMD can get a sizable chunk of the market share this generation
Even if they made better products, too many people are brainwashed into always choosing Nvidia
>AMD seems to have Polaris ready sooner than Nvidia has Maxwell,
what makes you say that? pascal taped out 6 months before polaris did and was already previewed behind closed doors in september of 2015
I'd like to know why AMD isn't in the neural networks business like nvidia is.
In school, the entire machine learning and neural networks department runs nvidia because of CUDA.
Where's AMD's answer to that? I want to like and use AMD but I can't because my research uses so much CUDA.
>Where's AMD's answer to that? I want to like and use AMD but I can't because my research uses so much CUDA.
It seems you haven't even bothered looking.
all use OpenCL and d4j is in the process of supporting OpenCL as well.
amd doesn't bother marketing to those businesses and the few customers they have don't get any decent level of support (firepro drivers are trash, especially on linux, worse than standard consumer drivers).
Yeah, sorry but no: Caffe, Torch and Theano run better on CUDA.
It's objective. I've measured it. OpenCL is also a clusterfuck. CUDA's interface is neat and the researchers get used to it fast.
I've been at it for 4 years, I know this shit.
>nVidia's entire R&D budget is worth more than all of AMD.
And yet AMD still manages to release competitive GPUs and develop new technologies like HBM. Imagine what they could do on Nvidia's budget.
>their 200 and 300 series cards were disappointing
wtf am i reading.
im rocking 290x CF and the cards still beat the living shit out of most things.
i bought the cards new for $350 each when they came out.
and that was WITH BF4 included.
and i can at this moment sell them for $300 each.
the only downside is the heat output and the power draw, but honestly i play like 5-6 hours a week, so it's nothing.
the 290x clocked and unlocked to 390x is the best thing ever.
You can't even get AMD hardware in the cloud. AWS has very affordable (especially spot) Nvidia instances. No AMD in sight. I've been toying with AWS gaming rigs, and Torch training of recurrent neural nets.
No AMD there, and you can't beat like 10 to 20 cents an hour to rent these cards.
>And yet AMD still manages to release competitive GPUs and develop new technologies like HBM.
they really aren't though, AMD is only achieving performance parity to reference nvidia cards while drawing way more power. they have some serious deficiencies that they're trying to make up for by rebranding the old stock and marketing hype bullshit.
You know what he meant. They're great cards but nothing new. I agree with a lot of things >>52415574 says. The fact that they reused gcn 1.2 with new media capabilities for Fiji means they've been investing their time in something else (besides the interposer for Fiji). Right after Hawaii or maybe even earlier amd must have begun working on next process node cards.
I'm basically 95% certain already that I'll buy Polaris. (I've never had an Nvidia card so far.) FreeSync + GPUOpen are two things I'm very much looking forward to and would like to support.
at $350 and 6hrs a week, you're much better off renting GRID GPUs from AWS. It's like half the cost, even before you factor in the power use and the cost of the rest of your gaming rig.
No way AMD is a good deal there :-\
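For what it's worth, the rent-vs-own math roughly checks out. A sketch with assumed numbers (~$0.15/hr for a spot GRID instance, a $700 CF pair reselling for $600 after three years, ~$50/year in electricity; all of these are rough assumptions, not quoted prices):

```python
# Rough rent-vs-own break-even sketch over a 3-year horizon.
# Every number here is an assumption, not a real quote.
HOURS_PER_WEEK = 6
YEARS = 3

def rent_cost(rate_per_hr=0.15):
    # total rental cost for 6 hrs/week of spot-instance gaming
    return rate_per_hr * HOURS_PER_WEEK * 52 * YEARS

def own_cost(buy=700, resale=600, power_per_year=50):
    # depreciation on the card pair plus extra electricity
    return (buy - resale) + power_per_year * YEARS

print(round(rent_cost(), 2))  # 140.4
print(own_cost())             # 250
```

Under those assumptions renting really does come out at roughly half the cost of owning, which is the "like half" claim above.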
>achieving performance parity to reference nvidia cards while drawing way more power
Except for the 980 Ti, most AMD cards in the same tier are faster than their Nvidia equivalents. And if you overclock your Nvidia cards to get the same performance, they'll consume just as much power as AMD cards.
>Where's AMD's answer to that? I want to like and use AMD but I can't because my research uses so much CUDA.
AMD are total cucks, so they actually PAID Nvidia to license CUDA on their cards. But since their cards don't support it natively, they have to use some slow-ass translator.
When you're buying AMD you're supporting Nvidia as well.
I don't think so, Nvidia has slightly higher perf/watt at the high end (GM200 vs Fiji) but that's it.
Now Hawaii and GM204 have a larger gap between them but that's because GM204 is a generation newer.
And if you look at Nano, it has higher perf/watt than any Nvidia card, if Nvidia can show off a high bin like that I'd like to see them.
It took forever for companies to implement OpenCL acceleration in video editing and color grading software, and Premiere / DaVinci still run better on CUDA. It did a real number on editors who were using newer Macs, since most of 'em used AMD chips.
what: why lol?
when i game i like to game with everything set to max,
and i have a 1440p 120hz IPS.
the power use cost is trivial at best. maybe 50$ a year lol.
and when you can sell a card 3 years later for almost the same price you bought it for, how the fuck isn't the investment worth it, considering most hardware loses half its value after just a year?
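The "$50 a year" figure is plausible if you're on European electricity prices. A quick sanity check, with the wattage, hours, and kWh price all being assumptions:

```python
# Back-of-envelope check on the yearly electricity cost of a CF rig.
# Assumptions: ~600 W under load, ~6 hours of gaming a week,
# ~$0.25/kWh (European-ish pricing). None of these are measured.
def yearly_power_cost(watts=600, hours_per_week=6, price_per_kwh=0.25):
    kwh_per_year = watts / 1000 * hours_per_week * 52
    return kwh_per_year * price_per_kwh

print(round(yearly_power_cost(), 2))  # 46.8
```

At cheaper US rates (~$0.12/kWh) it's closer to $25/year, so either way it's pocket change at 6 hours a week.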
>Except for the the 980TI, most AMD cards in the same tier are faster than their Nvidia equivalents.
they really aren't, please do your research
>And if you overclock your Nvidia cards to get the same performance, they'll consume just as much power as AMD cards.
again, you need to do your research. maxwell does not need a voltage change to overclock, which means even a maximum OCed maxwell card will still not draw more power than the card a tier above it.
>And if you look at Nano, it has higher perf/watt than any Nvidia card,
that card throttles from the gimped power delivery even in a gaming load, the underlying architecture is still flawed.
also, the performance scaling on maxwell is absolutely insane. in the case of the 970 it actually outperforms a stock 980 once it's OCed
I was doing fine too, at 1080p, but since I got a 1440p monitor I had to upgrade. Even a 980ti struggles to push higher resolutions. As >1080p resolutions become more standard everyone will need to upgrade their GPUs quite a lot.
Of course it throttles by 50MHz, as it needs to keep to the strict TDP guidelines; it's practically a mobile chip.
Fiji has flaws, particularly in the frontend, but Maxwell isn't much better as it achieves its power figures by ripping out all FP64 capabilities.
this may be hard for some of you shills to believe but nvidia do fuck up once in a while, and fermi was only a minor example
try geforce fx NV30 vs radeon 9xxx R300 back in 2003 where nvidia lost on the trifecta of failure: performance, power/heat and price
>Of course it throttles by 50MHz as it needs to keep to the strict TDP guidelines,
it throttles far lower than that lmao, you'd be lucky to maintain 800mhz (from the base of 1ghz) in a gaming load.
>but Maxwell isn't much better as it achieves its power figures by ripping out all FP64 capabilities.
fiji lacks decent fp64 capability too you dunce
>Fiji has flaws, particularly in the frontend,
that's an understatement, fiji is an absolute turd. you can't even get any OC out of it and it's limited to 4GB of vram; for a chip that's heavily marketed towards 4k gamers/enthusiasts it doesn't offer much
nigga, im on my work phone, its an iphone so writing shit on it is fucking retarded and it does the double space each time i hit Enter once.
and with the lag i cba to fix it, so please die in a fire and lick steve jobs dead asshole.
>Most people won't buy anything other than nvidia regardless of price and performance.
i doubt that, there was a pretty healthy 50-50 or 60-40 split for some time between the two. we only saw a big shift in nvidia's direction when they had a definitively superior product; we'll see a return to normalcy once the playing field levels again hardware-wise.
the time to worry starts if AMD rebrands their GPUs again while only replacing the flagship
No matter how much better AMD is, people either don't know of it or see it as some off-brand option that's cheap and tacky/liable to breaking (despite being the absolute opposite of the truth) because they've not heard (enough) about AMD.
Kids only seem to know of Nvidia these days. Look at /v/ for a prime example.
Kids will scream that if it isn't Nvidia, they don't want it - its not what they know and Nvidia is 'the only way to go'.
Sadly, I know this to be true first hand from the number of times I've seen friends do it. They know nothing of technology and so want my help; but when I offer them ANYTHING AMD, they say it isn't Intel or Nvidia so they don't know it, and they don't want to try something that's 'probably gonna break in a few weeks or something like all Chinese knock-offs do.'
Seriously. I'm not even an AMD shill so much as I say "if you're budget conscious (which 90% who have come to me are), you'll get an absolutely fine experience from (AMD CPU here) which to you won't even be any worse - or hell, it might be better since you do (eg. photoshop)"
"No, never heard of AMD, I want an intel i7 because Facebook."
Basically the same thing with the graphics cards, like I say. Because they're accustomed to hearing 970, 980 and TITAN X, they think 290/390 is bad because, I quote, "the number is smaller."
>they think 290/390 is bad because, I quote, "the number is smaller."
When I was retarded I bought a HP laptop because the graphics card was a Radeon HD 7670M, and I thought "bro that's a pretty high number, must be good"
Good thing I'm not retarded anymore
>the 'crippling' meme is the stupidest shit i've ever seen.
The 'Nvidia crippling AMD' meme goes way back, to when the first Assassin's Creed got patched after release so it wouldn't support DX10.1 anymore, because Nvidia GPUs didn't support it yet like AMD's did.
>Perhaps the DirectX 10.1 code path in Assassin's Creed needed some work, as Ubisoft claimed, but why remove DX10.1 support rather than fix it? The rumor mill creaked to life, with folks insinuating Ubisoft decided to nix DX10.1 support in response to pressure from Nvidia after the GPU maker sponsored Assassin's Creed via its The Way It's Meant To Be Played program.
A privacy expert has warned that the ‘Incognito’ mode in Google’s Chrome browser may not be quite as ‘incognito’ as some users may like.
Specifically, it can reveal videos which a user was watching, when another application is launched on the same PC.
Toronto engineering student Evan Anderson encountered the problem on his own PC - but warns it could afflict anyone on a shared PC with an NVidia graphics card.
On his blog, Anderson explains that he opened a game (Diablo III) on his PC - and was greeted by a jazz film that he’d been watching earlier in the day.
Anderson warns that the bug could affect other PC users - including those on shared computers - and says he has submitted a report to both Nvidia and Google.
Anderson says, ‘When I launched Diablo III, I didn’t expect the pornography I had been looking at hours previously to be splashed on the screen. But that’s exactly what replaced the black loading screen.
‘Like a scene from Hollywood, the game temporarily froze as it launched, preventing any attempt to clear the screen. The game unfroze just before clearing the screen, and I was able to grab a screenshot.’
>Samshit only starts 14LPP mass production today
>GloFo is even shittier and months behind Samshit because they licensed the process
IT'S OVER, AMD IS FINISHED & BANKRUPT
The only Nvidia card I owned was an 8800 GTX. When it ran it was decent, but it was very unreliable. I've bought upwards of 20 AMD/ATI cards in my life and honestly have never been disappointed. I'll stick with 'em til the end because they've always delivered a good product for me.
>the Heterogeneous-compute Interface for Portability (HIP) tool for porting CUDA-based applications to a common C++ programming model
Bad troll is bad. They're not licensing anything and they're certainly not interpreting it. It's a set of tools to recompile CUDA code to something that will run on OpenCL hardware. This will benefit everyone that ISN'T nvidia, Intel will also gain from this for example as it'll help people get their code working on the Xeon Phi.
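To illustrate the idea (not the real tool): much of the CUDA runtime API maps nearly 1:1 onto HIP names, so a big chunk of a port is mechanical renaming. A toy sketch, hedged heavily since the actual hipify tool handles far more (kernel launch syntax, headers, library calls):

```python
import re

# Toy illustration of HIP-style source-to-source porting: rename the
# CUDA runtime header and cudaXxx identifiers to their hip equivalents.
# This is only a concept sketch, nowhere near what the real tool does.
def toy_hipify(src: str) -> str:
    src = src.replace('#include <cuda_runtime.h>',
                      '#include <hip/hip_runtime.h>')
    # cudaMalloc -> hipMalloc, cudaMemcpyHostToDevice -> hipMemcpy..., etc.
    return re.sub(r'\bcuda([A-Z]\w*)', r'hip\1', src)

cuda_src = 'cudaMalloc(&buf, n); cudaMemcpy(buf, host, n, cudaMemcpyHostToDevice);'
print(toy_hipify(cuda_src))
# hipMalloc(&buf, n); hipMemcpy(buf, host, n, hipMemcpyHostToDevice);
```

The point is that the translation happens at compile time, at the source level; there's no runtime interpretation layer involved.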
>AMD are total cucks so they actually PAID Nvidia to licence CUDA on their cards.
got any proof of this?
>But since their cards don't support it naively they have to use some slow ass translator.
it's purely a translator to port CUDA to OpenCL.
But really, they need to exist so Intel and Nvidia will be forced to make better products.
Funny. The only AMD cards I owned was back in 2002 - 2004, and everybody recommended them.
They ran like fucking shit, and I had nothing but problems with them.
Switched to Nvidia until 2009, and everything "just werk'd". Been using MSI since then, though.
Where were you when you realized that anecdotal evidence doesn't mean SHIT? For every person who's had a bad experience with a brand, there's also someone who's had great experiences with it.
>AMD are total cucks so they actually PAID Nvidia to licence CUDA
Wanting what's best for the customers makes AMD cucks? Thanks for giving us more options, AMD. As for Nvidia, they would never do something like that.
>fiji lacks decent fp64 capability too you dunce
That's funny since Nvidia's been locking FP64 to 1/24th or lower on all their shit cards since Kepler.
>that's an understatement, fiji is an absolute turd. you can't even get any OC out of it and it's limited to 4GB of vram, for a chip that's heavily marketed torwards 4k gamers/enthusiasts it doesn't offer much
4GB limit is only because of HBM 1st gen. 4GB is not actually a bottleneck at 4K. You can have a large paging cache (390/X) or you can have a really goddamn fast one (Fury/X).
The architecture itself has higher raw throughput than Maxwell. If you don't understand why, I pity you.
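Back-of-envelope on "large vs fast": using the published specs (Fury X: 4 GB of HBM at ~512 GB/s; 390X: 8 GB of GDDR5 at ~384 GB/s), here's the time to stream through the entire VRAM pool once:

```python
# How long a full sweep of VRAM takes at the card's rated bandwidth.
# Specs are the published figures; real-world paging also depends on
# PCIe transfer speed, so treat this as an upper-bound illustration.
def full_sweep_ms(vram_gb, bandwidth_gbps):
    return vram_gb / bandwidth_gbps * 1000

print(round(full_sweep_ms(4, 512), 2))  # 7.81  (Fury X)
print(round(full_sweep_ms(8, 384), 2))  # 20.83 (390X)
```

So the smaller pool can be rewritten in well under half the time, which is the "really goddamn fast cache" argument in a nutshell.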
I understand your point.
But that doesn't change the fact that the three AMD cards I owned in those two years were faulty as fuck. I may just have been unlucky and got the rotten apples in the batches, but the experience only showed me that AMD = shit.
>That's funny since Nvidia's been locking FP64 to 1/24th or lower on all their shit cards since Kepler.
With Kepler it was to differentiate the consumer and Titan/Quadro lines, but both companies needed to find ways of making future cards faster on the same process node, and the only "reasonable" way to do that was to dump a feature 99.9% of people didn't use - FP64.
Maxwell can't do it even on Quadro/Tesla cards now because it's not in the architecture.
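For a sense of scale, with made-up round numbers (a hypothetical card with ~5 TFLOPS of FP32 peak): a 1/24 ratio leaves barely 0.2 TFLOPS of FP64, versus 2.5 TFLOPS at the old 1/2 big-Kepler Tesla ratio:

```python
# Effective FP64 throughput from FP32 peak and the hardware ratio.
# The 5 TFLOPS figure is a hypothetical round number for illustration.
def fp64_tflops(fp32_tflops, ratio):
    return fp32_tflops * ratio

print(round(fp64_tflops(5.0, 1 / 24), 3))  # 0.208
print(fp64_tflops(5.0, 1 / 2))             # 2.5
```

That order-of-magnitude gap is why dropping FP64 was such an easy win for gaming-focused dies on a stuck process node.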
AMD shouldn't have encroached on the ~600mm² GPUs; they should have kept to sub-420mm² dies. While slightly slower than Nvidia, they'd do better on power, price and overclocking.
>power and price
I VERY MUCH CARE
Take your fucking 60 watts and shove it up your ass, you shitty fucking treehugging liberal cunt, my bedroom lights use at least 3 times that and it's on for 12 hours every day.