File: 14nm vs 16nm vs 28nm.png (344KB, 1988x930px)
Should we blame the 14nm GlobalFoundries process or the Polaris architecture?
>>
Lmao

Why are AMD cards so bad with power?
>>
Difference in architecture.

nvidia chose to optimize for serial queues and uses a software scheduler to make optimizations.
amd chose to optimize for parallel queues and uses hardware scheduling to optimize games.

the hardware option costs extra power. that's what you're seeing.
>>
amd and nvidia took different approaches to architecture.

those differences have pros and cons; one of the cons for amd is power consumption.
>>
File: perfrel_1920_1080.png (42KB, 500x1170px)
>>55672494
Then why is the 1060 also faster?

Seems like the 1060 is just better in every way.
>>
>>55672444
Because nvidia don't want fermi 2.0. AMD are so incompetent
>>
>>55672584
Benchmarks are influenced by game settings and game choice. Look at multiple benchmark reviews and create a meta-analysis. Find out what the outliers are and try to eliminate any bias.


>https://www.youtube.com/watch?v=V54W4p1mCu4

Meta-analysis of the GTX 1060 reviews here, if you don't want to spend 20-30 minutes crunching data.
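If you'd rather script it than eyeball the video, here's a minimal Python sketch of that kind of meta-analysis; the review names and ratios below are made-up placeholders, not real data:

# Rough meta-analysis of relative 1060-vs-480 results pulled from several
# reviews (placeholder numbers only, not real benchmark data).
import statistics

reviews = {
    "review_a": 1.12,   # 1060 12% faster in this suite
    "review_b": 1.05,
    "review_c": 0.98,   # 480 slightly ahead here
    "review_d": 1.25,   # possible outlier (game selection bias?)
}

# Geometric mean is the usual way to average performance ratios.
consensus = statistics.geometric_mean(reviews.values())

# Flag reviews that deviate a lot from the consensus as potential outliers.
outliers = {name: r for name, r in reviews.items()
            if abs(r / consensus - 1) > 0.10}

print(f"consensus ratio: {consensus:.3f}")
print("possible outliers:", outliers)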
>>
>>55672412

I'd blame the process more than the architecture. Just look at TSMC 16nm vs Samsung 14nm (very similar to GloFo 14nm) with the iPhone 6s' A9: http://arstechnica.com/apple/2015/10/samsung-vs-tsmc-comparing-the-battery-life-of-two-apple-a9s/

The Samsung A9 consumed ~40% more power at load than the TSMC A9.
>>
>>55672782
I said that 2 months ago and all the AMDfags bullied me :c
>>
File: 480 design.png (55KB, 1171x402px)
>>55672412
Neither. It's the result of a design choice to ensure the Polaris 10 die was as cheap as possible.
>>
>>55672412
AMD always had higher power draw.
>>
>>55672782
>>55673183
Stop regurgitating debunked nonsense

http://www.tomshardware.com/news/iphone-6s-a9-samsung-vs-tsmc,30306.html
>>
Will ARM ever replace x86?

In the future will my desktop PC just be a dock for my phone? That would be so cool.
>>
File: GCN vcore-clock.png (37KB, 1171x703px)
>>55673339
Supplemental
>>
>>55672494
AMD might have chosen to make their cards more future proof by enabling new tech, while nvidia chose to optimize for the benchmarks and games that are out right now, so most tests will make the competitor look shit.
Now who has made the better choice financially? What use is the "oh my god, the 480 is great even after 5 years" when AMD is bankrupt at that point, not selling any new cards anyway, because the old one is just as good as the new one?
They made some great products, but they suck at getting money, and they will die for that.
>>
File: xBgSksM.jpg (18KB, 340x260px)
>>55672659
>>
File: 1be77eea.jpg (49KB, 575x480px)
>>55673354
nah
>>
>>55672412
The reason for the higher power draw is because AMD cards still use hardware schedulers. Nvidia removed their hardware schedulers after the 400 series proved to be a disaster for heat and power draw.
>>
File: Polaris-10-die.jpg (885KB, 2048x1152px)
>>55673563
Why would anyone do that, just go on the internet and be stupid?

The Polaris 10 die draws 110W nominally. Of its total target power of 150W, the memory takes 40W. That 8GB of 8Gbps GDDR5 sucks down that much power.
The die itself only pulls ~110W because of how it's clocked, and it's only clocked that high to compensate for having 32 ROPs.
It only has 32 ROPs to ensure the layout was as stupid simple as possible.
The whole layout was designed to maximize yields, to be as economical as possible.

The hardware schedulers in the ASIC have nearly nothing to do with the total power draw figure. The SIMD lanes and memory PHY are the largest power-drawing blocks by a long shot.
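Putting those figures together, plus the bandwidth that 40W of memory buys, as a back-of-the-envelope Python sketch using the numbers above (not an official AMD breakdown):

# Rough split of the RX 480's 150W board power, per the figures in this post.
board_power_w = 150
memory_power_w = 40                              # 8GB of 8Gbps GDDR5
die_power_w = board_power_w - memory_power_w     # ~110W for the Polaris 10 die

# Bandwidth that the GDDR5 delivers on the 480's 256-bit bus:
bus_width_bits = 256
data_rate_gbps = 8
bandwidth_gb_s = bus_width_bits / 8 * data_rate_gbps   # 256 GB/s

print(die_power_w, "W die /", memory_power_w, "W memory")
print(bandwidth_gb_s, "GB/s peak memory bandwidth")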
>>
>>55673369
Not in our lifetimes.
>>
>>55673369
>>55673804
Yeah the only way X86 is getting usurped is if the arch replacing it can still execute X86 code with near 1:1 parity.
No one is going to give up the massive software library we've built up.
>>
File: funny-pictures-auto-478650.jpg (18KB, 500x205px)
>>55673487
is that the /v/ equivalent to this /g/em?
>>
>>55672412
>reduced power over the 380
>bad
>>
>>55672412
Didn't they fix this?
>MFW my 780Ti consumes less power in idle even on multimonitor

Any ACX 1070 fag here, would you recommend it to me? I was also thinking about the 1060 but I need something that can run 4K decently (High settings).
>>
>>55673487
>DX12 bans multithreading
You can't make this shit up.
>>
>>55673487
Is there any proof at all to this? Or is this that the market is heading in a certain direction which AMD predicted and nVidia did not?
>>
>>55674267
It's bait. Either by a retarded nvidiot with no idea what he's typing or a false-flag kek'r.
>>
>>55674181
A 7850 beats your card in Vulkan doom.
I had a 770 and I liked it before I sold it
>>
>>55674316
Because of the gimped drivers. Unfortunately there is no AMD alternative. As I said I'm running 4K.
>>
>>55674332
Why would nvidia gimp their own drivers?
>>
>>55672412
I am fucking done with AMD.

Everywhere I turn I see evidence of Nvidia's superiority.

AMD is always the underdog. WELL FUCK THEM.

I will gladly pay more for Nvidia since AMD can't seem to fucking be competitive!

You all know in your heart of hearts, Nvidia was always better.....
>>
>>55674347
>Why would nvidia Gimp their own drivers
To get people to upgrade? In case you didn't know they gimp older cards in their newer drivers all the time.
>>
>>55672412
The biggest issue with AMD is the RAM speed; it typically only has 2 settings, super low or maxed out. If you are on a monitor that is 1440p+ above a refresh rate of 60, or dual displays, the RAM gets jacked up to max speed. On my 390, I use a profile to keep my VRAM clocked at 150MHz so it doesn't spike to max speed, which lets the card sit at a draw of less than 20 watts. And the reason for a bit higher draw on gaymen: the VRAM uses a bit more power than the nvidia counterparts, since it's ramped up like hell and has double the bandwidth.
>>
>>55674375
I support this and apples planned obsolescence.
>>
>>55674316
>>55674347
It's true, look it up. They gimp their own games; I can't play Witcher, for example.
>>
>>55674367
>3 yuans have been deposited to your green account, the Way its Meant to be Banked(tm)
>>
>>55672412

Why does anyone care about power consumption anyway... I was rocking twin 295x2s for a while. Roughly 12000 watts at full load. Zero fucks given.
>>
>>55674449

>12000 watts

Whoops.... 1200.
>>
>>55674449
I never felt it was very important, most people are not running their system under load 24/7.
>>
>>55674449
laptops?
>>
>>55674529
Not running under load is actually worse for AMD. With two non-identical monitors attached or when playing video, the power consumption is 6 times that of Nvidia. Look at OP's pic.
>>
>>55674449
your parents will care since they are complaining about the power bill :^)
>>
>>55674552
>mobile
>>
>>55674367
AMD is just retarded now. They used to be good but now every product is a failure.
>>
>>55674682
>using the smiley with a carat nose
>>
>>55674316
what stops nvidia users from using opengl?
>>
File: 1469067980384.png (110KB, 1591x639px)
>>55672584
>overclocked non-reference board vs an amd reference card
here, let's make the comparison more equal
>>
File: 1469067906447.png (105KB, 1591x637px)
>>55677352
more
>>
File: 1469067768949.png (108KB, 1596x635px)
>>55677394
more...
>>
blame the architecture. the difference between samsung 14nm and tsmc 16nm isn't large enough to account for the literally 2x perf/watt that amd is behind nvidia now; that's mostly the result of the fact that they have made no real improvements since gcn 1.0.

i suspect that AMD reinvested most of the money saved from firing engineers into hiring substandard H1B import labor and paying shill/astroturfing marketing firms and the like.
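If you want to sanity-check the "the node alone can't explain it" argument, here's the rough arithmetic in Python, taking the 2x figure above at face value and the ~40% Samsung-vs-TSMC A9 number quoted earlier in the thread as a worst-case node penalty (assumed values, not measurements):

# Decompose a claimed perf/watt gap into a node penalty vs everything else
# (architecture, clocks, voltage). Inputs are assumptions, not measured data.
claimed_total_gap = 2.0     # perf/watt deficit claimed in the post above
worst_case_node = 1.40      # Samsung-vs-TSMC A9 load-power figure cited earlier

leftover = claimed_total_gap / worst_case_node   # ~1.43x not explained by the node
print(f"gap left for architecture/clocks even with a 1.4x node penalty: {leftover:.2f}x")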
>>
>>55677411
>>55677394
>>55677352

nice faked images rajesh, there's plenty of evidence that the rx 480 does not overclock over 1350mhz core clock without LN2 and volt modding.
>>
>>55674449

because more power used = more heat and more fan noise
>>
>>55677442
>being this retarded
>>>/v/
>>
>>55677481

>don't disagree with me or point out my lies or else i'll call you retarded

>>>/r/amd
>>
>>55673418
>What use is the "oh my god, the 480 is great even after 5 years" when AMD is bankrupt at that point, not selling any new cards anyway, because the old one is just as good as the new one?

cards like the 480 are never even relevant after 5 years, you'd be lucky for even a flagship card to be good enough to game at low/med settings after that long.

cards like the 970, 480 and 1060 will only have about 1-2 more years of decent performance to extract before they're completely obsolete.
>>
>>55677499
>don't disagree with me or point out my facts or else i'll call you retarded

>>>/r/downsyndrome
>>
>>55677422
This, I don't think we should blame Samsung/Globalfoundries. The architecture is not optimized because no money.
>>
>>55677499
>constantly redirecting to leddit
yea thats a thing that /v/ spillage does all the time, you really need to go back
>>>/v/
>>
File: luvia.jpg (45KB, 334x334px)
>>55673339
>>55673370
>no amdrone has addressed these yet
>>
>>55677541

>facts

where did you get these factual benchmarks?????
>>
480 is so much fail
>>
>>55677637
you are so much shill
>>
>rated at 150W
>pulls 163W on average while gaming
When will AMD stop lying.
>>
>>55677855
When will you stop being a newfag?
>>
>>55672412
>>55672444
nvidia separately clocks every aspect of their cards; amd only has one clock rate throughout.

it's also why you see near-linear gains when you OC amd, while with nvidia... well... you OC it FAR more and get far less, if you can even call what you do with nvidia 'overclocking'.
>>
>>55674347
heaven benchmark, the only time it was caught. there is also the 500 series, which has to use older drivers than the card supports because of blatant gimping.

as of late, nvidia just has game devs put things like retard levels of tessellation in games and refuses to put limiters in the driver for older cards, along with other effects they abuse that run better on a 970/980 (and soon the 1070/1080/1060) and don't run well on 1-gen-old hardware.
>>
>>55674267
proof of what this retard says?
amd has crappy drivers with overhead; why, i can't tell you for sure, but it seems like it has to do with async. they were ready for async when the 7000 line came out.

then no new api comes along that really helps amd, and what was shown of dx12 at the time was only 'look, we have texture streaming now', so they made mantle.

mantle removed the overhead of dx11 and gave amd a fairly large boost on crappy cpus and even on good ones.
then it also had async, which is responsible for about 10% more performance.

you see nvidia lose out because they made a dx11 asic that has some ability to do more than just that, so they can emulate async, which is why they take a performance hit when it's introduced.

volta will likely have true parallel async, but this shit they have now is just a 'look, we can do it too' for marketing.
>>
>>55673831
i have argued this for a while, why not have a new instruction set, with a legacy cpu socket?
>>
>>55677525
280x here, still playing most games either maxed out or with mixed medium and high settings. even then, i could have stuck it out with the 5770 for a few more years.
>>
File: amd dx12.png (2MB, 1450x3400px)
>mfw dx12
>>
>>55678736
>>>/v/
>>
>>55678736
>implying AMDrones are smart enough to notice stuttering
>>
File: 1465402320579.png (167KB, 709x636px)
>>55679220
>>
Feels good when you can upgrade to a new GPU for free in around 6 years because you chose a GPU that doesn't consume a shitload of power.
>>
>>55679243
#triggered
>>
File: 1452244965674.gif (2MB, 237x240px)
>>55679473
>#
>>
>>55672659
>removes project cars because of "nvidia optimization"
>doesn't remove hitman because of its heavy optimization for GCN

Lol I always knew this guy was a fraud. This video is next levels of butthurt.
>>
>>55672412
>comparing 2 diff uarch
>thinking its the same
>using tpu as a reliable source
>>
>>55679497
hitman doesn't have anything that offloads onto the cpu like physx does
i fail to see how it's the same
>>
>>55679548
That is just how nvidiot reasoning works
>>
>>55679487
#reallytriggeredrightnow
>>
>>55679548
>what is architectural optimization
>nvidia does it: reeee nvidia so evil
>amd does it: i-its perfectly f-fine
>>
>>55679698
Logic does not work on a desperate shill anon.
>>
http://www.golem.de/news/geforce-gtx-1060-vs-radeon-rx-480-das-bringen-direct3d-12-und-vulkan-1607-122214.html

Some interesting results all told.
>>
>>55672412
polaris is more efficient than pascal
>>
>>55673642
>The hardware schedulers in the ASIC have nearly nothing to do with the total power draw figure
yeah bullshit. nvidia are literally not capable of using ACEs because fermi was a housefire when they tried so they just got rid of them and went back to 2005 level hardware with maxwell and paxwell
>>
>>55679497
I used to watch him but his newer vids are a massive turn-off when he went full AMD fanboy mode.
>>
>>55679975

AMDRONES ON SUICIDE WATCH
>>
>>55673370
Can someone explain what I'm looking at here?
>>
>>55680143

>10 yr old cpus

who cares
>>
>>55680167
>w-who cares

People that these cards are marketed towards.
>>
>>55680184
oh please. at least test 2500k. or whatever the amd equiv is.

I'm getting a 1060 myself but testing an i5-750 is just retarded
>>
>>55680184
The 1060 isn't that cheap; it's retarded to pair it with systems that are worth less than the card itself. A current i3 or i5 makes much more sense to compare, and even then you're going to see loads of normie builds pairing it with i5Ks or i7s.

Anyways, it's not news that nvidia cards run better than amd equivalents on lower-tier CPUs.
>>
>>55680205
And at stock speeds no less, when every one of those chips is capable of 3.5GHz minimum, usually more like 3.8-4.
>>
>>55680205
You do realise that 50% of steam users only have dual core cpus, right? Pairing it with an i5 750 is perfectly logical since a lot of people who want to play the latest games won't think of upgrading their cpu, only their gpu. This is even more evident when we see that the 2nd most used gpu in the world is a gtx 960. With 50% of people using dual core cpus and pairing them with an x60 series gpu, it makes perfect sense to say that these are the people this card is marketed at. People keep going on about muh 980-performance 1060, but they forget that it's still a fucking x60 series gpu, which would be the gpu of choice for these people who want to upgrade their toasters since it's a midrange card.
>>
>>55680441
50% of steam survey responders are more likely on a laptop than not, where even a lot of "i7"s are dual core.
Laptops, you know, that thing where a dGPU is practically of no worth whatsoever.
>>
>>55680441

>You do realise that 50% of steam users only have dual core cpus

That's because of laptops, and the vast majority of laptops are running cpus that are faster (more IPC) than a decade-old desktop chip.
>>
>>55672498
what are the pros of amd?
>>
>>55680517

A simple example is that GCN doesn't give a solitary fuck about context switching - you throw any sort of workload at GCN and it will crunch the lot with no fucks to give. Start feeding Nvidia chips graphics and compute workloads without proper scheduling and the card will choke, as it will make lots of latency-intensive context switches.
>>
>>55680457
>>55680464
>all the 50% of people using dual core cpu are running laptops and there aren't people in the world who are using old toasters with old dual core cpu
>>
>>55680570
Putting words people didn't say into their mouths is a cute trick for a twelve year old.
Or the mentally impaired.
>>
>>55680464
Most laptops with dedicated GPUs have a quad core CPU, or at least a dual core with hyperthreading.
>>
>>55680619

Intel still sell a lot of celery and pentium based laptops. That said your point must be ignored because other anon ( >>55680441 ) tells us that 50% of steam users are only using dual core chips and if someone on 4chan says so it must be true right?
>>
>>55680570
And do you think that those people with toasters are going to buy a $250 card? Some may, but more are likely to just buy a new system altogether if they actually plan to play games.
960 was a bit cheaper, but now I expect lower cards to take that place seeing how nvidia is pricing their cards
>>
>>55680719
I can't tell if you're legit retarded or trolling.
>>
File: 1440990477651.png (36KB, 943x303px)
>>55680570
>>55680719
pic related

So let's cut the numbers roughly and say effectively half of people use 2 cores and the other half use 4 cores. Most people are normies; /g/ AND high-fidelity-conscious gamers with the budget for it are an exception, and laptops are very popular for anything PC-related nowadays.
I stand by my statement that more dual-core machines are laptops than desktops; we can argue nonexistent numbers but I don't have the time for people unable to make logical inferences.

>>55680619
By dGPU I (the post above his) meant external, real dGPUs and not mobile-lite but you make a great point, thanks.
>>
>>55680788

>5 cpus
>0.01%

How the fuck? At least the 3 core chips makes sense.
>>
>>55680832
You can disable cores in the BIOS, maybe some people on older or balls to the wall i7-extremes are running more stable with only five active, or just for kicks, who knows.
>>
>>55677352
>>55677394
>>55677411
How is there such a big difference in fps with only 200mhz OC?

And what does the 1200MHz figure actually mean?
>>
>>55681113

>How is there such a big difference in fps with only 200mhz OC?

GCN is designed differently to how Nvidia build their cards. In this context it's because GCN uses a single clockspeed for everything within the gpu core (which is effectively everything except the vram), so when you overclock GCN EVERYTHING goes up by the amount you specify - hence the much, much better gains per MHz for GCN than kepler/maxwell/pascal. Equally it's why power draw goes up so much (rough arithmetic sketched at the end of this post).

>And what does in fact 1200mhz mean?

Reference card throttling due to the tiny heatsink.
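The power side of that is roughly the standard dynamic-power relation, P ~ f * V^2: more clock usually means more voltage too, so power climbs much faster than performance. A Python sketch with made-up clock/voltage points, not measured RX 480 values:

# Dynamic power scales roughly with frequency * voltage^2.
# All numbers below are illustrative assumptions, not real telemetry.
def core_power(base_power_w, f_base_mhz, v_base, f_new_mhz, v_new):
    return base_power_w * (f_new_mhz / f_base_mhz) * (v_new / v_base) ** 2

base_power = 110.0      # W for the GPU core at stock (assumed)
stock = (1266, 1.075)   # MHz, volts (assumed stock operating point)
oc = (1450, 1.15)       # MHz, volts (assumed OC operating point)

oc_power = core_power(base_power, *stock, *oc)
perf_gain = oc[0] / stock[0] - 1
power_gain = oc_power / base_power - 1
print(f"~{perf_gain:.0%} more clock costs ~{power_gain:.0%} more core power")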
>>
>>55677574
There's nothing to address; those posts are plainly factual. AMD even bragged on twitter about getting more performance out of a smaller, cheaper die.
Polaris 11 has higher perf/watt than 10 because it's clocked in its sweet spot, whereas 10 has its clocks pushed as high as possible to increase pixel throughput from only having 32 ROPs.

>>55680114
It's astounding how poor your reading comprehension is.
>>
AYYMD HOUSEFIRES
AYYMD HOUSEFIRES EVERYWHERE
>>
>>55673563
GF100 was a disaster because TSMC spectacularly fucked up the initial transition to 40nm.
The node was immature, and nvidia insisted on putting those fuckhuge GPU dies on it, which in turn resulted in horrid yields, high heat output, and high power consumption, which further wasn't helped by the huge number of FP64 units on the die.
GF104 hit the market after the node had had time to mature a bit (and they cut a bunch of the FP64 blocks), and it was everything GF100 was supposed to be.
>>
>>55679497
>hitman
>gives performance boost to all cards over dx11

>nvidia cards
>penalizes AMD cards with 50% performance hit


>"nvidia optimization"
kek
>>
>>55672412
What about the irrelevant double precision units? Wouldn't it be better if AMD got rid of them in their gaming cards, like Nvidia does?
>>
>>55684690
There are no "double precision units" in GCN.
DP performance is limited only in firmware.
>>
>>55684713
And therein lies the problem: silicon is going to DP, and DP is not really needed for consumer chips.
Nvidia's FP64 units, meanwhile, are not coupled to their FP32 units, so they can remove them.
>>
>>55672584
TPU is the most cherry picked review short of Tomshardware

funny how you fanboys always linked hardocp and guru3d until their less favorable 1060 reviews came out
>>
>>55677442
>plenty of evidence that the rx 480 does not overclock over 1350mhz

KEK
>>
>>55684727
You're not understanding.
The GCN SIMD lanes are it. There is no separate silicon dedicated to double precision number crunching. The 4X16 SIMD lanes can do both. DP is only limited in firmware because it spikes utilization and runs hot.

Nvidia's architecture is not comparable to AMD's.
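For scale, here's what that firmware rate limit works out to in Python, using the RX 480's public specs and the commonly cited 1/16 FP64 rate for consumer Polaris (treat the rate as an assumption):

# Theoretical throughput of the RX 480's SIMD lanes, FP32 vs FP64.
shaders = 2304              # stream processors (36 CUs x 64 lanes)
boost_clock_ghz = 1.266
fp32_tflops = shaders * 2 * boost_clock_ghz / 1000   # 2 ops per FMA, ~5.8 TFLOPS
fp64_rate = 1 / 16          # commonly cited consumer Polaris FP64 rate
fp64_tflops = fp32_tflops * fp64_rate                # ~0.36 TFLOPS

print(f"FP32: {fp32_tflops:.2f} TFLOPS, FP64: {fp64_tflops:.2f} TFLOPS")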
>>
>>55684768
Yes, and if AMD had actual money they'd have a consumer architecture that specializes in pixel throughput (what is TeraScale and VLIW, durrrr) and an enterprise/HPC architecture (what is GCN).
>>
>caring about power consumption

what, are you too poor now to have a good psu and pay the power bill? lmao guess nvidiots aren't that rich after all
>>
>>55684831
Programming for VLIW is a horrendous wreck. Your ideas are terrible.
>>
>>55684841
there actually are major downsides to it, but Polaris does it well enough. huge step up from earlier AMD gens.
>>
>>55684860
>huge step up from earlier AMD gens.
This, thats why I'm ok with it. AMD is improving you can't say they are not.
>>
>>55684852
But they had something to fit, they could have also designed a non-VLIW arch for the consumer market that's not a heavily compute and HPC one like GCN
>>
>>55684860
>huge step up from earlier AMD gens.
It has the same perf/watt as Fiji, which was only a few % less efficient than Maxwell but had plenty of advantages Maxwell didn't have, you stupid twat! Polaris was made to be inexpensive; that's the reason they didn't scale their Fiji design and got similar perf/watt to Maxwell.
Vega will do that, because at $500 people actually care about that, and HPC certainly does care about that.
>>
>>55684841
I'm not poor. AMD is for poorfags. I can afford electricity.

REEEEEEEEEEEEE
>>
>>55680788
you can get more info looking at the screen resolutions. x768 to x900 resolutions make up like 49% of all users on the surveys. so yea. lot of laptops. Probably 90% of the 2cpu users
>>
>>55677574
I don't really see a problem. The GTX 1060 is a more expensive card, so I would expect it to have better performance than the RX 480.
>>
>>55684841
The main problem with power consumption is not the cost, it's the heat

Power = heat

AMD cards are like mini space heaters
>>
>>55685823
>amd 150w are mini space heaters
>nvidia 150w are not
>>
I'm kinda disappointed with the 480 results. Even those graphs showing the Nitro+ overclocked to 1420 don't impress me. I was hoping the AIB 480's would wipe the floor with the 1060. If those graphs are any indicator, and if they are real, then the AIBs may match the 1060 in most games when overclocked but still use considerably more power than an overclocked 1060. It's just not living up to my expectations. At least with a 1060 you get better performance overall per dollar, and by the time the 480 gets much better performance (while still being power hungry) it will not be relevant anyhow and Nvidia will have an answer for DX12 and Vulkan.
>>
>>55686069
> it will not be relevant anyhow and Nvidia wil have an answer for DX12 and Vulkan.

they already do >>55680143

they know not everyone uses top of the range i7. look at those massive performance drops on the amd card across all 3 cpu.
>>
>>55685870
>amd 150W is actually 160+W
>nvidia 120W card matches AMD 160+W
>>
>>55686069
RX480 WILL OC TO 1.5GHZ YOU FALSEFLAGGING NVIDIA SHILL. THE GRAPHS ARE A LIE!
>>
>>55686069
>still use considerably more power
It's still not a lot of power. They are sub-200W cards that perform as well as previous-gen 250W+ cards.
>>
>>55686455
I am not. I am currently sitting here with an R9 290 on an i7 4770K. The 290 I have is one of the non-OC'd Sapphire Tri-Xs that were rushed out of the factory to make up for the lack of stock at OCUK when the coin miners were causing shortages. This meant the silicon never got tested to OC. I barely get past 1050/1300MHz with it and it is a noisy space heater.

I was looking for something to replace it with: less power, quieter, slightly better fps @ 1080 and a smaller form factor (the Tri-X is enormous), then sell the 290 on to help towards the cost. Sure I could wait for Vega, but fuck that. Also a 1070 is just a little too rich for me.

There is my dilemma.
>>
>>55686573
>Also a 1070 is just a little too rich for me.
My biggest issue with going nvidia is g-sync. You can literally add $200 to your costs if you go nvidia for whatever reason.
>>
>>55686753
Well yeah, that too. I was thinking of getting a FreeSync monitor later on, so that is a big factor to consider. I hate waiting. Waiting is suffering.

I have the Nitro+ OC on pre order. Once the official benches come in I will make my decision. Maybe I will just suffer the 290 for a while longer based on those benches or grab a 1060 as a stopgap.
>>
>>55686844

>290
>moving to 480

For what purpose does thou do this?
>>
>>55686900
Slightly better performance, power usage and noise. My 290 won't OC much past 1050 without shitting itself. I tried all kinds of +mV and fan profiles etc. I simply lost bigtime on the silicon lottery. I hate having to lower the settings on some games too.

My Firestrike Extreme graphics score is 4033 @ 1000/1300 (the default overclock for the proper Tri-X, but mine came at 957/1250 because it was a special order during the shortages).
>>
>>55686961

That's still a lot of money given the very small gains in performance all told. If you want an upgrade, either nab a Fury/Fury X or go look at Nvidia.

> I tried all kinds of +mV and fan profiles etc

Out of pure curiosity, just how much mV are we talking? Hawaii has some weird voltage scaling and sometimes you need to feed it lots (over +100mv) to really unleash the chip. Not to say you didn't lose, but most people only use afterburner which is limited to +100mV at stock for safety reasons.
>>
>>55687067
Yeah, I use Trixx. I tried going past 100mV and can get a stable Unigine Heaven run @ 1100/1500, but after a reboot to Win10 the screen goes berserk and corrupts to shit, and I had to reset it. I had a similar issue at 1050/1300. So now I am stuck @ 1000/1300 +50mV, which seems stable. I am wondering if the GPU is actually a tad fried, thinking about it. Maybe I should roll the BIOS back to an older version, as this one was supplied to me by Sapphire to fix a screen blanking issue back on Win7 and old drivers.
>>
File: gpuz.gif (23KB, 400x494px)
>>55687143

Forgot the pic
>>
The 480 is AMD's worst card since the 2900.
>>
>>55687143

>Sapphire to fix a screen blanking issue back on Win7

That was one weird issue - a lot of people had it fixed simply by bumping up the voltage as (iirc) vdroop was fucking the memory chips.

If you feel insane go look up some of The Stilt's bios mods on OCN - one thing he did do was tighten up memory timings and fuck with how powertune scales so some chips could actually undervolt and run at higher clocks. I keep wanting to try it on my 290x (luckily the bios switch is useless as my card has identical clocks on both) but I keep pussying out.
>>
>>55677442
Not a reference one, no. Those measurements were taken with a pre-production version of a 3rd-party rx 480 with 8-pin power connector and better cooler.
>>
File: 1468687046972.jpg (17KB, 285x279px)
>>55677442
>Being this stupid