As you know, AMD has seen a huge decline in sales and is something like two years behind nVidia in graphics architecture right now.
If Intel or Nvidia got greedy, they could wipe AMD off the market just by lowering their per-product profit margins, but that would put them in the spotlight for monopoly concerns, and a monopoly is never good.
AMD still manages to sell by running lower profit margins and by having given up its fabs a few years back.
Now they are releasing the Polaris architecture, which with VSync on is "as powerful" as a GTX 950 and consumes half the power.
What do you think is the future of AMD? Will this Polaris project fall flat? Will it have any impact on the current hardware market, which has seen declining sales over the last few years and is currently full of very brand-loyal, one-sided customers?
And how did they manage to achieve lower power consumption?
Being known for high consumption, especially on their top graphics cards, that's the thing that amazes me the most.
To understand Polaris you have to understand what Nvidia's and AMD's plans were.
AMD was relying on the 20nm die shrink being ready in 2015 and designed its cards around that shrink. Then it fell through, and they had to scramble and push 28nm harder.
Nvidia's plan was instead to improve efficiency on 28nm.
So in short, AMD has been ready for a die shrink for a while now, more so than Nvidia.
Polaris is likely to be very good, better than Pascal. It's also why Pascal is coming out after Polaris.
I think the future looks very bright for AMD. Knowing they get far better performance improvements than the competition from DirectX 12 and Vulkan, this could be a very good year for them.
I don't think AMD is really behind at all...
The real reason they're behind in performance is mainly that they made the incorrect bet that APIs would develop faster than they actually did, and designed their chips to give good performance when fully utilized. APIs like DX11, however, do not allow their hardware to be fully utilized, so they had to dial their hardware up to 11 to compete with Nvidia, who made the exact opposite bet and built their hardware with the limitations of APIs like DX11 in mind.
Over the next couple of years it will be interesting to see which happens first: developers properly utilizing APIs like Vulkan and DX12, or Nvidia catching up with AMD in these low-level APIs before developers learn to use them properly.
I thought AMD had said specifically that they were going to have a new high end GPU this year. They at least need cards that can replace the 390 and the 370. I was expecting something like:
New Fury X - Full Polaris 11 + 16GB HBM2
New Fury - Full Polaris 11 + 8GB HBM2
490X - Cut down Polaris 11 + 8GB GDDR5
490 - Cut down Polaris 11 + 4GB GDDR5
They are releasing Polaris 10 for laptops and Polaris 11 for desktops. Maybe they'll adapt Polaris 10 for desktops in new APUs?
It's all up in smoke at the moment.
Also, why GDDR5? Is it that expensive to implement HBM?
You need to remember that the Maxwell architecture is primarily built for single precision. It was designed purely for gaming cards. There's a reason Kepler is still used in Quadros and Teslas.
Maxwell has been stripped back for one job: single-pipeline gaming. Which also means power efficiency.
Turn to AMD and the architecture is the complete opposite. The ACEs are parallel compute pipelines. It was designed with DX12, compute, and multiple heavy tasks in mind. The GPU is doing all it can to function outside its design but still does OK.
Not sure what's coming with Polaris, but a restructured GCN sounds great.
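The serial-vs-parallel pipeline contrast described above can be sketched with a toy scheduling model. This is purely illustrative (the task occupancies and the greedy packing are invented for the example, not how a real GPU scheduler works): a serial pipeline runs one task at a time at whatever fraction of the GPU that task fills, while independent queues (like GCN's ACEs) can co-schedule work until the chip is full.

```python
# Toy model of GPU occupancy: serial submission vs. parallel queues.
# All numbers are made up for illustration; real GPUs are far more complex.

def serial_utilization(tasks, capacity=1.0):
    """One task at a time: the GPU runs at each task's own occupancy."""
    total_time = len(tasks)
    busy = sum(min(t, capacity) for t in tasks)
    return busy / (total_time * capacity)

def parallel_utilization(tasks, capacity=1.0):
    """Independent queues: greedily co-schedule tasks until the GPU is full."""
    remaining = sorted(tasks, reverse=True)
    time_slices = 0
    busy = 0.0
    while remaining:
        filled = 0.0
        still_waiting = []
        for t in remaining:
            if filled + t <= capacity:
                filled += t          # co-issue this task in the same slice
            else:
                still_waiting.append(t)
        busy += filled
        time_slices += 1
        remaining = still_waiting
    return busy / (time_slices * capacity)

tasks = [0.5, 0.25, 0.25, 0.75, 0.25]   # each task fills only part of the GPU
print(f"serial:   {serial_utilization(tasks):.0%}")    # serial:   40%
print(f"parallel: {parallel_utilization(tasks):.0%}")  # parallel: 100%
```

The point of the toy model is just the thread's claim in miniature: hardware built for concurrent pipelines looks underutilized when an API only feeds it one stream at a time.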
Definitely. It's like with any new technology; see the Galaxy S6 SoC. New, fast, fkn expensive. Also, from a hardware standpoint, it's not needed for a GPU without the power to utilise it.
The new APUs are coming next year; any low-power Polaris that winds up in a laptop this year would be a discrete chip.
>Also, why GDDR5? Is it that expensive to implement HBM?
No, but using the die shrink to reduce power and heat while maintaining the same level of performance makes more sense with how AMD have segmented their market. A cut-down Polaris with 4GB of HBM might compete against an older Fury card, and for the moment that wouldn't be good for business.
We're more likely to see 1-2GB of HBM used in low-end cards and APUs before we see 4GB in mainstream performance cards.
>AMD was relying on the 20nm die shrink being ready in 2015 and designed its cards around that shrink.
Not a single word of this is true. AMD never had any plans to design 20nm GPUs. They had two small core 20nm APUs, and both were shitcanned because their little chips weren't selling enough to be worth purchasing the wafers.
The tremendous leakage with 20nm planar parts was well known years before the node ever went online. Stop inventing your own fanfic history.
There is no reason to put HBM on a design unless it's bandwidth-starved, or providing the required bandwidth is drawing excessive power. You don't put extremely expensive memory on entry and mid-level parts.
Bristol Ridge is just repackaged Carrizo.
>A cut-down Polaris with 4GB of HBM might compete against an older Fury
No 14nm part is anywhere near Fiji in performance, and bandwidth has literally nothing to do with it.
Pity everyone is overwhelmed by nVidia's marketing and its deals with OEMs to use their chips.
Last count, I think AMD made up just 10% of the GPUs in the Steam hardware survey, compared to 70% nVidia.
Just because they're mostly used for gaming doesn't mean R&D would stop if gaming weren't a thing. GPGPU is a thing whether or not gaming is, and GPUs will continue to progress regardless of gaming being a factor.
Even Intel HD is getting bigger: 18% of computers surveyed are on Intel HD graphics, mostly laptops I guess. That's scarily close to the 26% of AMD graphics card users.
We've got Mr Smartypants here saying Steam gamers are a minority, but I don't think that's the case, at least going by my personal friends and most gamers I know.
>Last count, I think AMD made up just 10% of the GPUs in the Steam hardware survey, compared to 70% nVidia
Yeah, no. AMD GPU usage has barely changed on the Steam survey in a long time. Maxwell seems to have eaten into Intel's slice of the pie more than anything. Most likely mainly people with shitty prebuilts finally being able to get a half-decent card via the 750 Ti (which is the second-most popular card on Steam after the 970).
HBM is in the shitcan once GDDR5X comes out. Exotic memory is nothing more than a meme, and Pascal will probably run on that instead of HBM. If and when Polaris comes out, they will get huge benefits from the die shrink thanks to the process. The question is whether they will outpace Nvidia, which clearly had the upper hand in 28nm architecture.
From the TDP slides shown in the OP, it doesn't look like it. Half the TDP from a die half the size looks like literally the same design being moved to a new process rather than a completely new architecture.
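That reading matches the usual first-order dynamic power model, P ≈ C·V²·f: a straight port that halves switched capacitance (half the die area) at the same voltage and clock halves power too, with no architectural change needed. A minimal sketch of that scaling (illustrative assumptions only, not real Polaris figures):

```python
# First-order dynamic power scaling for a process port: P_dyn ≈ C * V^2 * f.
# Assumption for illustration: the shrink halves switched capacitance
# (roughly half the area) while voltage and clock stay the same.

def dynamic_power_ratio(cap_ratio, volt_ratio=1.0, freq_ratio=1.0):
    """Ratio of new dynamic power to old, given scaling ratios."""
    return cap_ratio * volt_ratio**2 * freq_ratio

# Straight port: half the capacitance, same V and f -> half the power.
print(dynamic_power_ratio(0.5))       # 0.5

# If the new node also allowed a 10% voltage drop, power falls further,
# since the voltage term enters squared.
print(dynamic_power_ratio(0.5, 0.9))  # ~0.405
```

Which is the poster's point: "half the TDP at half the die size" is roughly what you'd expect from the process alone, so it isn't evidence of a redesigned architecture.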
HBM and GDDR5X are not competing against one another. GDDR5X is a direct upgrade from GDDR5 that is 90%+ identical; it was designed this way to drive down costs while increasing performance for lower-end GPUs.
No one was ever going to put HBM on entry-level GPUs.
They already do. If you go to Apple's website, you won't find a single nVidia GPU anywhere. They don't use AMD CPUs, but they ALL use AMD GPUs. In fact, if I remember correctly, the Mac Pro uses some sort of AMD FirePro D-series GPUs that are found ONLY in the Mac Pro, at least I'm pretty sure.
I looked at the Mac Pro and saw the weird way they did cooling... Is that effective in any way? I feel like thermal throttling would be a problem, but I've never bothered to look it up.
That weird triangle heatsink is just barely good enough. In Anandtech's review they actually got their system to hit a power-virus workload, so power consumption momentarily shot up way higher than a normal full system load and everything throttled pretty hard. Under normal rendering workloads it's a bit warm but otherwise fine. Definitely not good enough for my standards, but I guess it works.
Are you retarded? The last figure I saw, more than a year ago, was that Steam had over 125 million active users (logging in at least once a month). That's almost as much as the entire userbase of the PS3 + Xbox 360 combined, and certainly way more than the userbase of the PS4 + Xbox One.
If we assume 125 million is a "small percentage", as in less than 10%, the total would have to be at least 10 times higher, meaning 1.25 billion people are PC gamers (more than one sixth of the planet's population), which is completely unreasonable.
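The back-of-the-envelope math above checks out (using the thread's 125 million figure and a rough 2016 world population):

```python
# Sanity-check the estimate: if 125M active Steam users were under 10%
# of all PC gamers, the implied total quickly becomes absurd.
steam_users = 125_000_000
implied_total = steam_users / 0.10        # if Steam were only 10% of PC gamers
world_population = 7_400_000_000          # rough 2016 figure

print(implied_total)                      # 1250000000.0 -> 1.25 billion
print(implied_total / world_population)   # ~0.17, about one sixth of the planet
```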
>I have no clue about the existence of the Quadro and FirePro lines and their prevalence in workstations. Also, I have no idea what Tesla cards are.
>AMD made up just 10% of the GPUs in the Steam hardware survey, compared to 70% nVidia
Are you retarded? The hardware survey shows AMD's share is about half of Nvidia's (which isn't so bad), not that it's seven times smaller. Just go to the hardware survey and check it before spouting nonsense.
So? You still have percentages for individual GPU models (except for some AMD cards where similar products are grouped, like "HD 7900 series" or "R9 290 series"). If you get an estimate of how many Steam accounts there are, you can estimate how many people have each GPU model. So exclude the Intel and shit-tier AMD APU entries and look at the rest.
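That estimation is one multiplication per model. A minimal sketch, where the account total is the thread's rough figure and the survey percentages are placeholder numbers (not real survey data):

```python
# Estimate absolute user counts per GPU model from survey percentages.
# total_accounts and survey_share are placeholders for illustration;
# plug in the real survey figures to get actual estimates.
total_accounts = 125_000_000   # rough active-user figure from the thread

survey_share = {               # hypothetical survey percentages per model
    "GTX 970": 4.5,
    "GTX 750 Ti": 3.8,
    "R9 290 series": 0.9,
}

estimated_users = {
    model: round(total_accounts * pct / 100)
    for model, pct in survey_share.items()
}

for model, users in estimated_users.items():
    print(f"{model}: ~{users:,} users")
```

The grouped AMD entries ("R9 290 series", etc.) give you a series-level estimate rather than a per-card one, but the arithmetic is the same.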
290 series sold like hotcakes for buttcoin miners. Those will never show up in Steam charts.
Also Steam Hardware Survey is not mandatory, so it only tracks users who are dumb enough to click "yes" when Steam asks for it.