
Why can't a gtx 950 play doom at 60fps on 1080p but a ps4


Thread replies: 221
Thread images: 27

File: header.jpg (46KB, 460x215px)
Why can't a GTX 950 play Doom at 60fps at 1080p, but a PS4 can, like, 90% of the time? Is it because of console optimization? I really don't see the point of budget builds then if a PS4 will outperform them.

Not bashing PC gaming, just budget builds. My rig runs Doom smooth as fuck at 1080p with a 970.
>>
>>55927756
Same settings?
>>
>>55927756
Have you tried comparing IQ?
>>
>>55927756
>is it because of console optimization
Yes you fucking tard. That and PC versions usually have much higher texture resolutions and more demanding visual effects requiring better hardware in general.
>actually expecting a $150 GPU to run brand new games at 60 FPS at max settings

>>55927771
Of course not.
>>
>>55927771
Probably this.
>>
>>55927756
>ps4 can for like 90% of the time
Because the PS4 version is a lower texture blurry mess compared to the PC version.
https://www.youtube.com/watch?v=munsIFm3XaE
>>
>>55927771
OP, answer this fucking question because otherwise you're full of shit.

Consoles don't have any optimization over the PC. That was some shit made up last generation, when they were getting away with running games at 15fps in many instances where it was hard, but not impossible, to tell.
>>
Optimization is one reason. Obviously it's easier to make a game run well on a single specific piece of hardware than unlimited combinations. Another reason is the console version of Doom changes graphical settings on the fly to avoid frame drops.
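The "changes graphical settings on the fly" part boils down to a small feedback loop on frame time. Here's a toy sketch in Python; the thresholds, step size, and function name are all made up for illustration and say nothing about id's actual heuristics:

```python
# Toy model of a dynamic resolution scaler: if the last frame blew the
# 60fps budget, drop the render scale; if there's comfortable headroom,
# claw quality back. All thresholds/names are illustrative, not id's logic.

TARGET_MS = 1000.0 / 60.0  # ~16.7 ms frame budget for 60fps

def adjust_render_scale(scale, last_frame_ms,
                        min_scale=0.5, max_scale=1.0, step=0.05):
    """Return the new render scale, clamped to [min_scale, max_scale]."""
    if last_frame_ms > TARGET_MS:          # missed the budget: fewer pixels
        scale -= step
    elif last_frame_ms < 0.9 * TARGET_MS:  # headroom: raise quality again
        scale += step
    return max(min_scale, min(max_scale, scale))

scale = 1.0
for frame_ms in [15.0, 18.0, 19.0, 15.0, 14.0]:
    scale = adjust_render_scale(scale, frame_ms)
    print(f"{frame_ms:5.1f} ms -> render at {round(1920 * scale)}x{round(1080 * scale)}")
```

The point is that the console trades pixels for frame rate every frame, which a fixed-settings PC benchmark never does.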
>>
>>55927820
Looking at this video, I think it's hard to justify dropping around two thousand dollars to play games like Doom with maxed-out graphics when you can get a console for 200 bucks. Of course there will be a very real difference when you go from 25-30fps 1080p to 144fps 1440p, but is it really worth the investment? Great stories like The Witcher 3 or GTA V will still be great stories, no matter what platform they are played on.
>>
>>55927881
yeah, but you only really need $1,000 at 1080p. if you need a pc for more than just browsing the web then it's a no-brainer to invest more in the pc, but if all you do is shitpost then yeah, a console is better
>>
>>55927881
Agreed. When comparing the $65,000.00 PCs that are needed to play PC games to the $25 consoles, we can see that those PCs are just 3-5% better.
>>
>>55927881
You can build a machine which will smoke the consoles for $400-500.
>>
>>55927912
but it won't run doom as well as a ps4
>>
>>55927925
At the same settings, it will run Doom better.
>>
You have a bigger library of games.
You need a pc anyways - only the gpu is gaming specific, and a GPU is cheaper than a console.
You can use any controls.
You can customize and mod.
You save a lot of money on games - it actually pays for itself.
>>
>>55927881
>I think it's hard to justify dropping around a two thousand dollars
Why would you drop $2k when you can get 1080p@60fps for around $600? Or are you planning on getting better image quality in numerous games?

>144fps 1440p
Okay, you'd need to spend a bit more. Maybe $1k for a system to do that.

>Great stories like Witcher 3 or GTA V will still be great stories
Read a book if you want a great story. If you want a great experience, which includes great image quality along with a great story you need great hardware as well. Mediocre consoles will get you a good experience but if there is a PC version then it will be the one to give you the best experience (great vs. good).
>>
>>55927947
This. Also, we don't get cucked by remasters; we get them for free. Skyrim, the Bioshock series, mods, etc.
>>
You can pirate easily, saving loads of money.
You can get an old gpu cheap or free from a friend.
>>
File: doom_1920_v.jpg (144KB, 548x629px)
>>55927925
Nice meme. Even a lower mid-range card will run Doom at 1080/60 on ultra settings, which is far above what the consoles use. Drop it down to console settings and even a post-gimping Kepler will run it.
>>
>>55927828
>Consoles don't have any optimization over the PC.

/g/ once again in charge of not spreading misinformation and knowing what they're talking about

the console versions adjust the rendering resolution on the fly to hit the 60 fps mark, whereas the pc version does not

fucking do some research you retard
>>
File: 1464194302619.jpg (53KB, 426x426px)
>>55927909
>>
>>55927972
I can't pirate games for some reason. I don't feel like finishing them unless i get them on steam
>>
>>55927977
>the console versions adjust the rendering resolution on the fly to hit the 60 fps mark
You mean like Rage did back in 2011?
>>
>>55927967
I actually bought both Metro remasters 75% off on Steam sale, though.
Haven't even played the games yet because I pretty much only play Path of Exile.
Don't know why I keep buying games if I don't play them.
>>
>>55927900
Considering that most tech companies supply you with a decent laptop like 13-15" fagbooks, I think that having a desktop for facebook is overkill. I can even play Civ and KSP on mine because of the dedicated GPU.

>>55927909
Quality shitposting right there tripfag.

>>55927912
A machine that will run 30-60fps with med/high graphics on 1080p. Not really worth it imo. I'd rather build that 2k machine to actually see the difference.
>>
>>55928004
dude same here lol i bought both a year ago and just started playing like a month ago and stopped. now i just play overwatch and doom
>>
>>55928012
>>
Used sandy/ivy system: $250
970 on sale: $250
Rapes consoles
>>
>>55927756
PS4 runs Doom at medium settings. Even a Radeon 7950 can run Doom at a stable 60fps on Vulkan at medium settings.
>>
>>55928019
Same with the Souls series.

>played Demon's Souls on PS3
>got Dark Souls limited edition while still on Demon's
>never played it
>stopped playing Demon's about 50% in
>bought Dark Souls 1 and 2 on Steam
>start Dark Souls 1
>configure controller
>run around for 5 minutes
>quit
>play Path of Exile for the rest of the evening
>haven't played them since

I'm 100% sure I'll buy Dark Souls 3 as well soon.
>>
>>55928035
That's some high quality bait.
>>
>>55928035
Low hanging fruit. Temptation is there but a bit of restraint except for this.
>Mobility due to form factor.
I'm trying to think up what the console version of a gaming laptop would be and I can just see the pants-on-head retard duct taping the console to the back of his Walmart 32" flat screen.
>>
>>55928094
>>55928130
Gentlemen.
>>
>>55928066
yeah, stopped halfway through dark souls 1, and every time i see the other games from the series i'm tempted to buy them, but i refuse because i know i'll just get pissed and stop playing them
>>
>>55927756
>is it because of console optimization ?
Partly. Very little API overhead and there's a lot you can do when designing a game for a fixed hardware setup.

Another major reason is the settings the console version uses. Judging by some comparisons I've seen, they are close to the medium settings on PC, but the console version dynamically adjusts them on the fly depending on the load so the game can hit the 60fps target most of the time.

>I don't really don't see the point of budget builds then if a ps4 will outperform them.
Aside from preferred input/control method and the fact that the budget PC can do a heck of a lot more than match consoles in video game performance, me neither.

>>55927881
>Looking at this video, I think it's hard to justify dropping around a two thousand dollars to play games like Doom with maxed out graphics when you can get a console for 200 bucks.
Well, I think you know your answer already then. Not worth it for you.

>>55927828
>Consoles don't have any optimization over the PC.
You are a dumb fuck.
>>
>>55928143
>The new Xbox 720 and Playstation 4 will use APU technology, which will enable them to have a graphical output similar to a 7850, ONE OF PCs MOST HALLOWED AND EXPENSIVE CARDS.
Where the fuck did the author do his research for this document? Best Buy?
>>
>>55928163
PC gaming is DEAD! Really guys it has died and now no one plays games on the PC any more! I am serious it is dead, and to prove that all I need to ask is: Did you or any of your friends ever play a game on the PC recently? If you say no then that proves that PC gaming is dead. If you say yes then you are a damn liar because PC gaming is DEAD, and you are lying because you have no proof or evidence, and THAT PROVES PC GAMING IS DEAD!

Really, if you take a look at the recent games for the PC then you will see that many of them sell less than 1m copies (pathetic!) and that PROVES that PC gaming is DEAD! And some PC games are so casual that they get more than 1m sales (lol casuals), and that further proves that PC gaming is DEAD!

So stop using PC for games, only consoles are meant to play games that is why they are called PCs. If they were meant for games they would be called PGs: Personal Gamestations. That also proves that PC gaming is DEAD!

Only consoles have good games anyway, all the PC games are horrible, and this is why PC gaming is dead. Besides, who would play games on a PC? PCs are meant for browsing the web and editing documents and that is why they come with a mouse and keyboard.

PC gaming is dead, so stop playing games on a PC. It is not fun to play games on a PC, because it is a dead platform. If you or anyone else think that PC gaming is fun then you are deluded, because it isn't.

So stop playing games on the PC because that is stupid and PC gaming is DEAD!
>>
>>55927977
but it very rarely does that. 99.9% of the time PS4 runs Doom at 1920x1080
>>
File: Lol_I_Troll_U2.jpg (49KB, 704x441px)
>>55928169
>>
>>55928035
>>55928143
>>55928169
Yes very funny, epic comedy even.

No one has claimed PC gaming is dead or that consoles are in every way superior.
>>
>>55928201
I think it is hard to justify dropping around two thousand dollars to play games like Doom with maxed-out graphics when you can get a console for $200.
>>
>>55928208
Maxed graphics on consoles is typically low quality on PCs, just like Doom demonstrates.
>>
>>55927881
>games run smoother when settings are lowered

No shit sherlock, shill for your toys elsewhere
>>
>>55928180
it's, i, ... , the culmination of human history
>>
>>55927975
>780 ti is worse than a 370

Had a good laugh, thanks novidya
>>
>>55928236
In one version of one game which is optimized solely for AMD cards.
>>
>>55928208
That doesn't imply PC gaming is dead or that consoles are superior. You're trying really hard to get yourself agitated.
>>
>>55928255
>which is optimized solely for AMD cards.

I can't believe I actually read this after years of it actually being exclusive to Nvidia.

What a time to be alive.
>>
>>55928271
Of course there will be a very real difference when you go from 25-30fps 1080p to 144fps 1440p but is it really worth the investment? A machine that will run 30-60fps with med/high graphics on 1080p. Not really worth it imo. I'd rather build that 2k machine to actually see the difference.
>>
>>55928281
I don't recall any specific IQ optimizations since the days of 3dmark 2003 when renaming the executable resulted in lower results.
>>
>pc gayming
>$6348.15 for 4K
http://pcpartpicker.com/list/qDtvWX
>>
>>55928169
Nice bait
>>
>>55927975
the fuck am I reading, GTX titan 6gb worse than the 960 4gb?
>>
File: doomed.jpg (393KB, 1920x1080px)
>>55927756
>>55927771

i had 13-20fps at mid-high settings with an oced 7850, the game has awful optimization.

But anyway i can play it at 80-150fps on my new gtx 1060, so who cares.
>>
>>55928337
Ebin
>>
>>55927756

Denuvo is probably slightly slower than Sony's DRM system.

Also you should try running the game at the lowest settings and at 1600x900 to really compare your computer to a console.
>>
>>55928312
You're miserable.
>>
File: average nvidia owner.jpg (84KB, 293x398px)
>>55928255
>which is optimized solely for AMD cards
>a low level API is "optimized solely for AMD cards"
This is what Nvidiots actually believe.

>DX11 forever! Purge the DX12/Vulkan devil!
>S-s-stop making us l-look bad!!!
>>
HAHAHAH
OP actually thinks 950 can run D4

nigga 950 CANNOT RUN DOTA ON MEDIUM @ 60FPS

SELL THAT PIECE OF SHIT AND GET A REAL CARD, 7900 SERIES OR SOMETHING FOR $100
>>
>>55927975
That graph is not right, it's got fucked up shit like 1 card doing better than the same two in SLI (r9)
>>
>>55928576
>>a low level API
Are you ignorant of the intrinsic shaders in the last patch, or that async isn't even enabled for Nvidia cards yet?
>>
File: image_1.jpg (179KB, 640x1136px)
>>55928591
Yes it can, along with MGS5, from my own experience
>>
>>55928718
at 320x240 maybe
>>
File: specs.jpg (238KB, 838x494px)
>>55927756
>mfw playing Doom 3 and not even knowing what this thread is about
>>
>>55927756
It's a fucking manchild hobby, just buy both and get over it.
>>
>>55928763
woops wrong pic
>>
>>55928255
>optimized for AMD

Yeah man, DX12 and Vulkan are both created by two satanic organizations solely for the purpose of weakening of the divine Nvidia master race.
>>
>>55929008
>DX12 and Vulkan are both created
>both
I point out one version of one game and you somehow read that to mean something more than one version of one game. How is the OpenGL version of Doom benching again?
>>
>>55928752
No, I had a budget build I pulled out of a dumpster mostly. E8200, 8 gigs ddr2, Zotac 950. Settings set to medium with no AA
>>
>>55927756
The console versions make heavy use of async compute. It makes a big difference on any AMD GPU.

I'm pretty sure you could get a 950 close to 60fps at similar settings to consoles (they're not running anywhere near high/ultra)
>>
>>55929339
Also, the consoles use dynamic resolution scaling, so it's not always at full 1920x1080.
>>
>>55928638

> it's got fucked up shit like 1 card better the same two in SLI (r9)

No, you are just stupid. If a game doesn't support multi-gpu (doom does not), there is still additional driver overhead. Plus most of those dual die cards use different clocks than their single die counterparts (the 295x2 is even clocked higher than a 290x!).
>>
>>55929040
Sure man it's the API's fault that the cards were built with only DX11 in mind.
>>
File: xBgSksM.jpg (18KB, 340x260px)
>>55929939

Trufax
>>
>>55929981
Kek'd audibly, thanks anon
>>
>>55927756
PS4 runs it at 30 fps 900p at best, and at less than the pc's min settings.
>>
>>55930396
>hey guys lemme just drop in my gut feel about this subject even though i did no research at all on the game in question
>>
>>55927756
>Why can't a gtx 950 play doom at 60fps on 1080p but a ps4 can for like 90% of the time.

950gtx - 1.4 teraflops
ps4 - 1.8 teraflops

maybe since the 950 is a shitty card outperformed by hardware from 5 years ago?

my 1070gtx maxes out doom 4 on max settings at 200 fps most time with some drops to 170fps
even my ancient 660 has no problems running the game on high with low shadows and motion blur off at 50 fps

Why do retards always assume a shitty $150 new card from this year > a $500 top-end card from last year?
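For what it's worth, the teraflop figures above can be roughly sanity-checked: peak FP32 throughput is shader cores × clock × 2, since a fused multiply-add counts as two FLOPs per cycle. A quick back-of-envelope check in Python, using commonly cited core counts and clocks (exact values vary with board and boost behavior, so it lands in the same ballpark as the numbers quoted above rather than matching them exactly):

```python
# Back-of-envelope peak FP32 throughput: shader cores * clock * 2 FLOPs
# per cycle (a fused multiply-add counts as two floating-point operations).
# Core counts / clocks below are the commonly cited reference figures.

def peak_tflops(shader_cores, clock_ghz):
    return shader_cores * clock_ghz * 2 / 1000.0

print(f"GTX 950: {peak_tflops(768, 1.024):.2f} TFLOPS")  # reference base clock
print(f"PS4 GPU: {peak_tflops(1152, 0.800):.2f} TFLOPS")
```

Which is why, on raw shader throughput alone, the PS4's GPU sits a notch above a GTX 950 before any optimization argument even starts.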
>>
>>55930456
>hey guys, did you know that I'm paid to post this garbage?
>>
>>55930396
>talking out my ass:the post
>>
>>55930512
paid by whom? in case you actually were fucking serious...

doom runs at 1920x1080 while utilizing dynamic rendering resolution to keep it at the 60 fps target. it rarely actually drops from 1920x1080 unless there's a ton of enemies and stuff going on on the screen. settings are visually very close or identical to medium settings on pc, but this is anecdotal evidence at most. you'd know that had you actually looked at a single comparison.

why comment if you haven't done any research? enjoy typing misleading garbage akin to shitposting?
>>
>>55930544
>max enemies in a room: 12
>tons of enemies and stuff going on on the screen

>paid by who?
You tell us, shill.
>>
>>55930575
>shill
what am i shilling?
>>
File: doom-classic-box-art-re-size.jpg (549KB, 800x600px)
>gtx 950 not good enough for doom

You must be doing something wrong.
>>
>>55927972

>You can pirate easily saving loads of money.

ok denuvo
>>
>>55930812
None of the Denuvo games are even worth pirating, besides the new Doom.
>>
>>55930917

Yeah, so I'm told. Still wouldn't mind playing them for whatever value they have besides being empty sequels.
>>
>>55927756
>ps4
>1080p
>60fps
Hahahahahahahahahahahahahahahahaha.
>>
>>55927925
That's right, it will run better than the ps4's silky-smooth 30fps at 900p.
>>
console retards
>$400 console
>$800 TV
>$800 laptop
>$80 per game
>$10/month subscription
>total $2000+
"enjoy" "games" at "30" fps and "900p"

pc friends
>$700 pc (midrange, maybe a rx470/480/1060 if on sale)
>$200 monitor
>$500 laptop for fucking around/school
>total $1400
play almost every game at 1080p 60fps AND have a powerful enough PC to do other things
>>
>>55931521
>get a bundle deal
>i5 system for $300
>throw a 1070 in
>pc that will last nearly a decade for $800

The big consoles just don't make sense anymore, given the rise of mobile/handhelds and how cheap powerful components are.

Guess it's just a sign of how lazy and ignorant people are becoming.
>>
>>55931887
Your average gamer wants to feel like they belong to a tribe, and they want to be spoonfed everything.
>>
File: 009.jpg (89KB, 632x738px)
>>55927756
>>
this thread is weird.
>>
>>55927871
I always wondered why games didn't have a dynamic settings option. Like raising them when indoors vs. outdoors, or in other common easy-to-run areas.
>>
>>55928045
not anymore it doesn't

nvidia has started gimping the 970 and the PS4K / Scorpio will outperform it even more
>>
>>55927756
>buys an absolute dogshit gpu whose main use is in moba games
>wonders why it can't do 60fps on doom
Fucking lmao, my 380 runs it at a solid 60fps
>>
>>55927881
FPS are unplayable on that children's toy controller gamepad.
>>
>>55927756
>he bought a pc
>to play games
>at the quality of a console
You fell for the masterrace meme faggot. Unless you're willing to shell out at least $800 on one, it's not worth it.
>>
File: 1474524542524.png (1MB, 2445x1097px)
>>55928143
>>
>>55927756
DOOM on PS4 uses asynchronous shading, the GTX 950 doesn't support it.
>>
>>55933957
Neither does the PS4.
>>
File: Async_Games.png (135KB, 3999x2250px)
>>55934079
Yes it does, are you a dumbass?
>>
>>>/v/
>>
>>55934106
Lobotomized or inbred? Tell us your story!
>>
>>55933209
you can use a mouse/keyboard
>>
Set everything to low~mid and try again
>>
>>55934187
Both the PS4 and Xbox One support asynchronous shading and compute, I'm not going to argue with someone who is completely oblivious to the last 3 years of console and PC game development.
>>
>>55934079
GCN supports async compute shaders in all its implementations, including PS4, XBOne, and post-HD 78xx GPUs.

It's not super broadly used yet since if you're an idiot you can actually make things slower (more cache contention, etc.), but anything 8th gen supports it in principle.
>>
>>55931887
>prebuilt mom with expansion slots and or psu capacity
things that never happened/10
>>
>>55929939
Nowhere did I blame the API. id Software is allegedly working on specific Nvidia optimizations and enabling async for Nvidia cards as well.
Also, the Time Spy benchmark indicates otherwise.
>>
Consoles have much higher data bandwidth between the GPU, CPU and some memory. I don't know exactly how this comes into play, as I was only shopping the graphics class I learned this in
>>
>>55927909
>>55928035
>>55928143
>>55928208
>>55928312
Filtered. Also, kys
>>
File: 1457945040571.gif (590KB, 640x640px)
>>55927756
Nvidia cant into Vulkan.
>>
>>55936884
of course id or anyone else can add GPU-native intrinsic instructions to their engines for Maxwell, Pascal, or whatever, but none of Nvidia's architectures can do actual async compute the same way as GCN, since they lack independent shader management engines like AMD's ACEs.

What Nvidia did do with Pascal is make context switches much less expensive so that they can do deadline-based VR perspective warps properly, but this isn't the same thing as having the GPU automatically juggle concurrent compute- and graphics-heavy shaders to eke out the last 10% or whatever of performance.

Nvidia's (arguably smarter) strategy has been and continues to be working with game devs to entirely overhaul shaders within their drivers with targeted optimizations for their latest architecture.

AMD's strategy will Just Work(tm) without continuous ongoing software hacking, at the considerable expense of using more power to just brute force everything.
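The tradeoff being argued here can be illustrated with a toy throughput model: a statically partitioned shader array only matches a dynamically balanced one when the split happens to fit the workload. This is a deliberately crude sketch with made-up numbers, not a model of any real GPU:

```python
# Toy throughput model of the two scheduling strategies described above.
# "Static" = the shader array is split into fixed partitions for the frame
# (Maxwell-style); "dynamic" = any idle unit grabs whatever work remains
# (GCN-style). Deliberately crude; real GPUs are far more complicated.

def static_time(gfx_work, comp_work, units, gfx_units):
    """Frame time with gfx_units reserved for graphics, the rest for compute."""
    return max(gfx_work / gfx_units, comp_work / (units - gfx_units))

def dynamic_time(gfx_work, comp_work, units):
    """Frame time when every unit can take either kind of work."""
    return (gfx_work + comp_work) / units

units, gfx, comp = 16, 120.0, 40.0
print("static 12/4 split:", static_time(gfx, comp, units, 12))  # 10.0 (perfect split)
print("static  8/8 split:", static_time(gfx, comp, units, 8))   # 15.0 (graphics side starved)
print("dynamic balancing:", dynamic_time(gfx, comp, units))     # 10.0
```

A perfectly tuned static split ties the dynamic scheduler; any mistuned split loses. Which is exactly why one approach needs per-game shader tweaking and the other Just Works at a power cost.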
>>
>>55937215
>the same was as GCN
Just like AMD doesn't do actual async like Nvidia does because they lack independent CUDA shaders...

>context switches much less expensive so that they can do deadline-based VR perspective warps properly
Wow, what an interesting bunch of buzzwords.

>GPU automatically juggle concurrent compute
You are claiming Pascal cannot dynamically handle concurrent compute tasks? If so, [citation needed]
>For Pascal, NVIDIA has implemented a dynamic load balancing system to replace Maxwell 2’s static partitions. Now if the queues end up unbalanced and one of the queues runs out of work early, the driver and work schedulers can step in and fill up the remaining time with work from the other queues.
>>
>>55937365
Nvidia engines since Fermi have been able to handle multiple concurrent compute tasks.

The thing that AMD can do that Nvidia can't yet is execute arbitrary compute-only shaders at the same time as graphics shaders that use fixed function units like the TMUs, ROPs, tessellators, etc.

The purpose for this is to overlap computation of things like post-processing effects with normal rasterization that needs to do things like triangle setup and texture sampling.

It's not the biggest deal in the world for Nvidia, since they can just continue to help rewrite shaders on the software side to manually overlap the various rendering stages, but you can't just say they have some hardware capability they clearly don't.
>>
File: temp.jpg (131KB, 1280x720px)
>>55937532
>can't yet is execute arbitrary compute-only shaders
You mean mixed-mode compute?
[citation needed]
It seems to be a waste of time pointing out your errors, as you are just going to repeat them even when shown evidence that they're wrong. Might I suggest you read the AnandTech 1080/1070 review, which has a rather in-depth portion covering Pascal's async. It seems you confuse Maxwell's async with Pascal's.
>>
File: temp.jpg (58KB, 663x318px)
>>55937670
Whoops! Wrong pic.
>>
>>55937215
>>55937532
>>55937670
>>55937719
Maxwell 2 and Pascal can manually partition the ALU shader block into two parts, and Pascal can automatically un-partition them when the graphics shader work queue empties, but it's not the same as what GCN tries to do, which is to load balance the entire shader array every cycle based on transient conditions like memory stalls.

AMD's approach is probably marginally faster in theory, but manually tweaked shader assembly code will still gain faster results without the extra dynamic power load.
>>
>>55937857
Why do people keep posting the obvious, that Nvidia doesn't do async the same way AMD does? No one claimed that they do. The only claim that pops up all the time is the myth "Pascal cannot do hardware async."
>>
>>55927756
Consoles render everything on medium settings at 30fps, which is why games that look amazing on PC look like shit on consoles.
>>
>>55937928
Because AMD is billing async compute as a performance-boosting architecture feature along the lines of HyperThreading.

What Nvidia does allows a similar level of software-side simplicity in feeding the GPU different types of tasks, but it really doesn't improve shader IPC at all.

Blame it on industry-typical hyping of a marginal feature and marketing gymnastics trying to conflate distinct things.
>>
File: 0a1.jpg (72KB, 643x820px)
>>55928143
>>
>>55927756
It's because of optimization; PC users are a small part of the games market
>>
>>55937928
> AMD invents and names thing
> Nvidia makes slightly different but less useful thing, uses same name
> people bicker about semantics

what a shocker!
>>
>>55938122
>> AMD invents and names thing
AMD "invented" asynchronous compute? [citation needed]
>>
>>55927788
You must be a real hit at parties
>>
consoles also render at 1080i which is far less work than 1080p
>>
>>55938074
>software-side simplicity in feeding the GPU different types of tasks
Is that repeating the myth? It isn't hardware async but software async?
>>
>>55938148
introduced HyperThreading in GPUs, at least

>>55938223
Mantle, Vulkan, DX12, etc. are built on the idea of being able to push work requests to the GPU through lockless ring buffers.
That multiple work buffers can have tasks independently assigned is where the notion of "asynchronous" processing comes from.

GCN can assign any of its ALUs graphics or compute work at any time, usually prioritizing the former and slipping in the latter when a graphics shader is waiting on results from a fixed function unit.
Nvidia hardware can mark off a pool of ALUs for one task or another and process them simultaneously, but can't repartition dynamically based off of shader instruction flow, which is where the performance gain (and power consumption increase) comes from.
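The "lockless ring buffer" submission model mentioned above can be sketched with a minimal single-producer/single-consumer ring: each index has exactly one writer, so no lock is needed. Toy model only (the class name is made up, and real command rings live in GPU-visible memory with proper memory barriers):

```python
# Minimal single-producer/single-consumer command ring of the kind described
# above: the CPU bumps `head` as it writes commands, the "GPU" bumps `tail`
# as it consumes them. Each counter has a single writer, which is what makes
# the scheme lock-free in the real hardware version.

class CommandRing:
    def __init__(self, size):
        self.buf = [None] * size
        self.size = size
        self.head = 0   # written only by the producer (CPU)
        self.tail = 0   # written only by the consumer (GPU)

    def push(self, cmd):
        if self.head - self.tail == self.size:
            return False                      # ring full, caller must retry
        self.buf[self.head % self.size] = cmd
        self.head += 1                        # publish after the write
        return True

    def pop(self):
        if self.tail == self.head:
            return None                       # ring empty
        cmd = self.buf[self.tail % self.size]
        self.tail += 1
        return cmd

ring = CommandRing(4)
for c in ["draw", "dispatch", "draw"]:
    ring.push(c)
print([ring.pop() for _ in range(3)])   # ['draw', 'dispatch', 'draw']
```

Multiple independent rings like this, one per queue, is where the "asynchronous" in async compute comes from: work in one ring doesn't have to wait on another.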
>>
>>55938356
>at least
Thank you for implicitly admitting you just made shit up.

>Mantle, Vulkan, DX12, etc. are build on the idea of being able to push work requests to the GPU through lockless ring buffers.
Thanks for not answering the question. It was a simple yes or no question and it seems to have caused buzzword spaghetti to fall out of your pocket. If you cannot support your claims with authoritative citations try to dazzle them with bullshit and hope no one notices, huh?
>>
>>55938408
No, AMD invented the feature they claimed to, and called (wisely or not) async compute.

My other point is that "async" has a variety of different meanings, and Nvidia's clearly never was what AMD's was, which has always been willfully deceptive marketing.
>>
>>55938474
>invented the feature
Rewording the same claim that you failed to provide a citation for before?

Dazzle them with bullshit again, huh?

>My other point is that "async" has a variety of different meanings
[citation needed]
I wonder if asynchronous computing has a more expansive definition, like including parallel and concurrent tasking, than AMD tries to claim they "invented"? One that originated long before GPUs tried to implement it? Hm...
>>
File: Async_Shaders.png (200KB, 3999x2250px) Image search: [Google]
Async_Shaders.png
200KB, 3999x2250px
>>55938554
AMD invented shader control logic that can handle ALU-only and ALU-plus-fixed-function instruction dispatching arbitrarily.

Nvidia GPUs can handle both types of work queues simultaneously, but any individual CU/ALU block can only handle one type or another until reassigned by the driver software.

What AMD made are async shader ALUs, which are not the same thing as whole-GPU coarse-partitioned async processing.

If you disagree, perhaps you can provide a source.
>>
>>55938188
Interlaced rendering now anon? Do tell us more.
>>
>>55934796
>PS4 and Xbox One support asynchronous shading and compute


This is actually true.
>>
>>55938639
>invented shader control logic
Moving the goalposts? And not addressing what "async" means?

Am I to understand that you have abandoned the previous points?
>>
>>55932825
Yeah, it worked well on Forza 6 apex and you could match the Xbone with low end hardware at the same price.
>>
>>55938724
>missing 2/3 of the quote
Jesus, just fucking shut the hell up already you petulant little shit.
>>
>>55927828
>consoles don't have any optimization over PC.
>consoles used to run on completely different architectures.

Of course they do, you retard. If a dev was making the game on your computer, chances are it would run better on your computer than on any other computer.
>>
>>55927820
Isn't GameSpot the site that purposely turned down settings on PC when making their Overwatch comparison?
>>
>>55927756
Consoles have dynamic resolution scaling; on PC you can only set the rendering scale statically. So on PS4 the game might run at 1080p in some places and then switch to 540p when it needs more performance, while on PC you'd have to set it to 540p at all times to get a stable frame rate.
>>
>>55938763
>missing 2/3 of the quote
I didn't need to quote any of it to point out how you've not addressed what was in the post you responded to. I ask for a second time, and expect to not be answered again.
Are you abandoning the previous point?
>>
>>55928035
The part about split-screen hurts my soul because that feature seems to be slowly going away everywhere.
>>
>>55938554
Here is the async compute white paper. Notice that it's hosted on AMD's servers and written by AMD employees and has an AMD copyright notice.

http://amd-dev.wpengine.netdna-cdn.com/wordpress/media/2012/10/Asynchronous-Shaders-White-Paper-FINAL.pdf
>>
>>55938724
Are you perhaps the same idiot who starts "arguments" like this in /hpg/ and possibly in other threads too? You literally fit the bill perfectly, manage to tire everyone who replies in an instant because of your inability to partake in the conversation and argument. Nitpick on argumentation errors, jump to conclusions, conveniently miss valid points which you can't refute, [citation needed], goalposts, weasel out when asked for source by "muh burden of proof" etc.
>>
>>55938828
I don't know and I don't think it matters since Doom dynamically changes its IQ settings to hit its 60fps target.
>>
>>55938724
goddamn, I don't think Jen-Hsun is giving you enough shekels for all your hard work today.

GCN shaders can switch between graphics and compute tasks without software intervention, Maxwell/Pascal can't, end of story.
>>
>>55928480
>game has awful optimization.
Not really dude
>>
>>55938893
Let me get this straight. AMD writes an article about async compute, so you credit them with having "invented" async compute? Do you honestly not understand how retarded that is? If I post an Nvidia white paper about DX12, does that mean Nvidia "invented" DX12?
>>
>>55938899
>Maxwell/Pascal can't
[citation needed]
IIRC, one of the usual "citations" presented in support of this argument was the tweet that Nvidia was dropping driver-side async support. Now that I think about it, that's kind of hilarious.
>>
>>55938884
I'm not even the guy you've been pestering for the past half hour you ginormous faggot
>>
>>55938926
Is it somehow unreasonable to assume you adopt the same position if you respond to the same discussion without notifying that you are a different poster?

Did I claim that you were the same poster as "I have been pestering" or did I claim you are the same poster who posted >>55938639 based on the fact that you responded to >>55938724
>>
>>55938923
>>Maxwell/Pascal can't
>[citation needed]

Fine, Maxwell/Pascal can actually secretly do it, but they won't tell anyone it's possible who hasn't personally sucked off the entire executive board of Nvidia.

Are you satisfied now?
>>
>>55938973
>they won't tell anyone
Anandtech just made up the information in their review of the 1080/70?

Thank you for admitting you've got nothing and disregarded previously provided citations.
>>
>>55938995

Pre-emption is not the same as what GCN does.
>>
>>55927756
If you know exactly what hardware your software will be running on, you can justify investing in writing code that takes advantage of its intricacies and avoid certain inefficient abstractions. This is what is normally meant by "console optimization", yes.

The gap is closing, though. The PS4 and XB1 chips are basically just GCN APUs. A 950 is pushing it but you can probably get solid 1080p60 from a 960, if you spend some time tweaking the settings.
>>
>>55928480
You just had a shit card. There is nothing wrong with that games optimization.
>>
>>55928638
The SLI support in some games is such a mess that it does actually make them run worse.
>>
>>55928480
7850 is fairly old and this is a reasonably demanding game.
>>
>>55939007
>Pre-emption is not the same as what GCN does.
Straw man. As has been mentioned before, no one claimed that Nvidia handles async the same way AMD does.

If you are claiming that all Pascal does for async is pre-emption I still await a citation. Given how you have disregarded the previously provided citation I expect you to fail in providing such a citation.
>>
File: 1469743235983.png (1MB, 1200x10000px) Image search: [Google]
1469743235983.png
1MB, 1200x10000px
>>
File: 1469743310719.png (2MB, 2000x8571px) Image search: [Google]
1469743310719.png
2MB, 2000x8571px
>>55939076
>>
>>55939076
I noted the switch in context. It starts off talking about the lack of hardware DX12 async in Kepler and Maxwell, then finishes with a comparison that includes Pascal, but concludes that "AMD does better because it gained more than the 1080." Given the non-existent gains/losses for pre-Pascal cards with async, the fact that Pascal gains at all, even if AMD gains more, indicates that Pascal does hardware async.

Oh, and on a different note, I wonder if that difference in async gains is because AMD was less efficient at scheduling graphics/compute tasks than Nvidia, i.e. it had more cores being wasted with async off.
>>
>>55939082
I note how this claims:
>Pascal does not support Asynchronous compute + graphics . . .
I note the lack of any citation to this conclusory statement and how it is contradictory to what was stated in Anandtech's review of the 1080/70.

Do you have a link for this writing or is it just something you made up?
>>
>>55939144

> I wonder if that difference in Async may be because AMD was less efficient in scheduling graphics/compute tasks than Nvidia, ie. it has more cores being wasted without async on.

Now forgive me if I'm wrong, but looking at how GCN 2 onwards is designed, the ACEs exist solely to work on compute tasks while the shaders are busy, even if other parts of the chip are left idle.

So I suppose (with the caveat that I don't know shit) your theory of AMD having less efficient scheduling is likely.
>>
>>55939203

Anandtech has been wrong plenty of times - it (iirc) still incorrectly lists the number of compute queues for hawaii, tonga and fiji.
>>
>>55939241
>it (iirc) still incorrectly lists the number of compute queues for hawaii, tonga and fiji.
[citation needed]
As far as I am aware, they get their numbers from the manufacturer, which was a factor in the 970's misrepresentation of its memory bus, i.e. Nvidia screwed up in updating its technical writers.
>>
>>55939271

>[citation needed]

http://www.anandtech.com/show/9124/amd-dives-deep-on-asynchronous-shading

See the chart listing how many compute queues are listed for GCN. 8 is woefully wrong, as each ACE can handle 8 tasks, so the number should be 64.
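For what it's worth, the queue arithmetic under dispute is easy to sanity-check. A toy calculation follows; the figure of 8 ACEs per chip is the one claimed in this thread, not vendor-verified here:

```python
# Toy queue-count arithmetic for GCN parts. The per-chip ACE count of 8
# (Hawaii/Tonga/Fiji) is an assumption taken from the thread, not vendor data.
QUEUES_PER_ACE = 8      # each Async Compute Engine manages 8 compute queues
GRAPHICS_QUEUES = 1     # the single Graphics Command Processor

def total_queues(num_aces):
    """Total simultaneous queues: 1 graphics + 8 per ACE ("1 + 8*n")."""
    return GRAPHICS_QUEUES + QUEUES_PER_ACE * num_aces

# With 8 ACEs the total is "64 + 1", not the "8 + 1" a flat table implies:
print(total_queues(8))  # 65
```

The point being that a table entry like "1 + 8" hides the multiplier on the number of engines.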
>>
File: pascal_dyn_partitioning.png (170KB, 723x2033px) Image search: [Google]
pascal_dyn_partitioning.png
170KB, 723x2033px
>>55939241
the author of the article anon appears to be referencing,
> http://www.anandtech.com/show/10325/the-nvidia-geforce-gtx-1080-and-1070-founders-edition-review/9
is clearly conflating the notion of dynamic partitioning with dynamic instruction scheduling.

the entire point of async compute shaders is that any ALU block can be assigned any work type on a cycle-by-cycle basis so that partitioning isn't even needed.
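To illustrate that point, here is a toy utilization model (all workload numbers are invented for illustration) showing why letting idle ALU slots pick up compute work beats a graphics-only schedule:

```python
# Toy model of why fine-grained async scheduling raises ALU utilization.
# Workload figures below are made up; this is not a model of any real chip.
def utilization(graphics_busy, compute_backlog, alus, async_on):
    """graphics_busy: ALUs the graphics work occupies each cycle.
    compute_backlog: compute work items waiting, one ALU-cycle each."""
    used = 0
    total = 0
    for busy in graphics_busy:
        idle = alus - busy
        filled = 0
        if async_on and compute_backlog > 0:
            filled = min(idle, compute_backlog)   # idle ALUs pick up compute
            compute_backlog -= filled
        used += busy + filled
        total += alus
    return used / total

frames = [40, 64, 16, 48]                  # graphics ALU demand per cycle
print(utilization(frames, 60, 64, False))  # graphics only: idle slots wasted
print(utilization(frames, 60, 64, True))   # async on: idle slots do compute
```

Same hardware, same graphics load; the only difference is whether waiting compute work may occupy otherwise-idle slots each cycle.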
>>
>>55939342
>See the chart listing how many comnpute are listed for GCN
It seems they are spot on in stating that each engine can handle 1 graphics + 8 compute tasks. You even point out how they are accurate with:
>each ACE can handle 8 tasks each

I would suggest you stop posting if that is the level of your reading comprehension.
>>
>>55939390
Ryan Smith is a second-rate tech writer.

What he wrote up for
>>55939390
>>55939342
doesn't jibe with AMD whitepapers, which claim that the singular GCP can run 1 shader of any type and each ACE can handle 8 shaders that don't use fixed-function graphics units.
Something like "1 + 8*n" would have been the correct thing to list, not "1 + 8" (implicitly times some n).

>>55939390
>It seems they are spot on in stating that each engine can handle 1 graphics + 8 compute tasks.
ACEs simply can't do graphics shaders, period, so there's no way the table is accurate under any generous interpretations.
>>
>>55939503
>ACEs simply can't do graphics shaders, period
You were wrong in regards to compute due to a failure at basic reading comprehension and yet you expect me to just believe you are not wrong in this too?
>>
File: AMD-Hawaii-GCN-2.0.jpg (646KB, 2310x1163px) Image search: [Google]
AMD-Hawaii-GCN-2.0.jpg
646KB, 2310x1163px
>>55939552
I should hope you realize you're arguing with more than one person at this point you retard.
Also, you're missing the critical point that GCN has not one but two distinct "engine" types.

> ACE - async compute engine = no graphics shaders, up to 8 compute shaders each
> GCP - graphics command processor = 1 shader that can access TMUs, ROPs, tessellators, etc.

If you can't manage to grasp that level of detail, I can't fathom why you're even here.
>>
>>55938898
Well, it matters because it was a graphical comparison, not a framerate test.
>>
>>55939641
>Also, you're missing the critical point that GCN has not one but two distinct "engine" types.
You mean, just like Anandtech noted?
>From a feature perspective it’s important to note that the ACEs and graphics command processors are different from each other in a small but important way.

Every one of your posts only proves that the claim "Anandtech is wrong plenty of times" is inaccurate and based on an inadequate skill at reading technical papers.
>>
>>55939701
>it matters because it was a graphical comparisong,
Does it matter, when Gamespot cannot turn down graphical settings because the game will just turn them back up again so long as it hits its 60fps target?
>>
>>55939716
I don't know if Doom does that on PC, but I was talking about the Overwatch thing.

My main point being that Gamespot isn't necessarily a trustworthy site to get comparison videos from.
>>
>>55927756
>but a ps4 can for like 90% of the time
But it can't. It literally can't.
>>
>>55939778
>I was talking about the Overwatch thing.
Yes, I am aware. But if you are going to raise a point shouldn't it be relevant to the topic of discussion?
>>
File: apples_and_oranges.png (42KB, 688x506px) Image search: [Google]
apples_and_oranges.png
42KB, 688x506px
>>55939705
I'm not even that guy above, try again.

However, I can clearly see that the complaint being levied against AnandTech is inconsistency in pic related.

GCN can handle up to 64 + 1 simultaneous shaders using 8 + 1 engines, whereas Maxwell 2 can do 32 / 31 + 1 using just two engines that correspond roughly to a GCP and one massive ACE.
>>
>>55939829
>I'm not even that guy above, try again.
No one has ever or would ever samefag on 4chan. . . If you are not the same person you have the same egotistical standard that anyone will believe you simply because you say something.
>>
>>55927975
>>55928236
>>55928255
>>55928281
>>55928576
to play new doom amdfags must buy it first... is it even possible ?

>>55932825
Just like RAGE ?
>>
>>55939813
If that comparison video is meant to compare graphics, and Doom wasn't running at max settings (which it probably was, I guess), then it wouldn't be a fair comparison.
>>
>>55939923
>Doom wasn't running at max settings
Doom was running at the max graphics its own engine allows. It is not being affected by a settings change by the reviewer.
>>
>>55938892
At least Rocket League and even Black Ops 3 on PC had it. I might actually be interested in trying out Gears 4 on Windows if they include splitscreen, but the lack of any mention of it outside the Xbone version sounds like they're ditching it again.
>>
File: doom.gif (538KB, 320x240px) Image search: [Google]
doom.gif
538KB, 320x240px
Playing it at 60Hz without stuttering
>4690k No OC
>Sapphire 280x
>8GB RAM

T..thanks Falcon sempai
>>
>>55927756
Because Nvidia is trash. My GTX 770 can barely run doom at all
>>
>>55930917
Mgsv
>>
>>55940268
I wish companies were willing to put in the effort to give PC releases splitscreen. Borderlands is shit on PC for me because I can't play with friends.

I've been told it's something to do with Steam not allowing multiple users to log into a single game client or some shit like that but idk.
>>
>>55940372
my meme gtx 970 runs it over 60fps at ultra on 1080p
>>
>>55927992
not on pc
>>
>>55940969
honestly consoles are better for co-op. i don't have friends to play with so i just stick to my pc for now.
>>
>>55941304
Are you stating that the PC version of Rage did not adjust IQ settings on the fly?
>>
>>55941320
rage pc had a fixed rendering resolution
>>
>>55930917
>piratefags STILL believe this
in 2-3 years every single worthwhile publisher will have denuvo integrated into their game. Only worthless indie devs that can't afford denuvo will be left out for piracy.
>>
>>55941353
>id Tech 5 sports a performance balancing feature. The engine automatically balances image quality with performance, and it aims for a 60FPS target. It does so by monitoring framerates and balancing detail, primarily in textures, to reduce load on the GPU.
http://www.hardocp.com/article/2011/10/19/rage_gameplay_performance_image_quality/1
>>
>>55941410
maybe they will realize by then that aggressive drm doesn't guarantee sales
>>
>>55941463
it sure made me buy doom so i guess it's working. the $38.99 I paid was worth it, my GOTY
>>
>>55941422
I'm aware. It has a fixed rendering resolution only changeable in the settings menu, while the console versions of Rage, Wolfenstein, and Doom adjust resolution in real time to maintain their target frame rate, as the original post >>55927977 said.
>>
>>55941496
>resolution
Let me get this straight. You jumped onto the tail end of a discussion about image quality to make a remark about something irrelevant to image quality?
>>
>>55941463
but more importantly it cuckolds the fuck out of pirates and poorfags, two of the most despicable groups on the planet.
>>
>>55941310
Depends really. I use a TV as a second monitor, so when friends come over it's really easy to play splitscreen. It's even easier on PC because you can go as far as giving every player their own screen if you have that obscene amount of money, and it will run better because muh extra power. If the demand was there, devs could probably make use of multi-monitor setups to give every player their own screen, or accommodate every screen in a custom way, but well, the demand is NOT there.

Also everyone can bring their favorite controller and you can make them all work. The other day me and 2 friends played Serious Sam splitscreen; one of them wanted to use a 360 controller and he could, the other friend is a SONY pony so he wanted a DS3, and he got it, and I used M/KB because we were out of controllers.

It's pretty fun if you are willing to go through the setup, or if you know you are gonna do it often you can just leave it all set up full time. Basically you have the usual PC master race advantages applied to the splitscreen experience. Then again, configuring controllers for every individual game can be a pain in the ass, especially older titles. But nowadays? With almost every game having native XInput support? You can set that shit up in 5 minutes. Now that the newer Xbone controllers are gonna support Bluetooth, you don't even need a wireless adapter to use them unplugged.
>>
>>55941509
Look at the original post and its response you fucking illiterate

>the console versions adjust the rendering resolution on the fly to hit the 60 fps mark

>You mean like Rage did back in 2011?

>not on pc

>rendering resolution
>rendering resolution
what did you think this meant?
>>
>>55941521
fuck i really want a bluetooth xbox one controller
>>
>>55941541
>Look at the original post and its response you fucking illiterate
Look at the OP's post about one resolution setting.

And I would like a citation, as I find it hard to believe that Doom changes resolution from, say, 1080p to 720p when it is far easier and more efficient to change image quality settings.
>>
>>55941518
Fuck off shill, if I had pirated Destiny I would have known better than to spend [spoiler] My boyfriend's [/spoiler] $60 on it.

Only pay for worthwhile products. Pirate first, decide later.
>>
>>55941550
the new xbox one s controller has bluetooth.
>>
>>55941558
I know we shouldn't use common sense to judge Microsoft's actions, but I gotta imagine that they're discontinuing every non-Bluetooth controller, since the new ones support both Bluetooth and 2.4GHz wireless.
>>
>>55941558
yeah the white one looks so nice. also i guess it's gonna be super easy now to connect it to an android phone via bluetooth and use it to play emulators
>>
>>55941518
Fucking this. The pure salt secreted by jealous piratefags not being able to play the dozen or so current Denuvo games is delicious enough; I can't imagine the collective aneurysm they'll have when they find out that they actually have to start paying for their shit now.
>>
>>55941581
I haven't found a way to connect the Wii U pro-controller to my phone. So we'll see.
>>
Budget builds are for people who like to waste money and upgrade all the time. If you are smart, you spend a little more now and then only spend every 5 years or so, or even longer depending on your preference.
>>
>>55941677
This. People on a low budget should just get a console and call it a day, but if you really need a desktop then i guess a budget build is okay, just don't expect much from it.
>>
>>55941706
Depends on your expectations really, I was able to squeeze every drop of juice out of my GT210 until it died about 5 months ago.
>>
The main reason is the engine actually uses AMD hardware properly. Look at a 4GB 270's performance
>>
>>55941728
Or well, it wasn't the card itself that died, but rather the PC it was in.
>>
>>55941728
well yeah i guess if all you're gonna play is super meat boy, cs:go, some mobas, etc.
>>
>>55941740
You'd be surprised; the best I could get out of it was Wolfenstein: The New Order on shit-tier graphics with an average of around 20FPS, but hey, I did worse than that back in my childhood.
>>
>I don't really don't see the point of budget builds then if a ps4 will outperform them.

What makes you think the PS4 beats budget builds in every game? Doom is an outlier, not the norm.
https://www.youtube.com/playlist?list=PLQbCPWtOQp0FoY_-7GwWSErWP2j7--Hh5