Nvidia BTFO: faked Pascal board edition
>That silicon was manufactured in mid-January of 2015!!! Everyone including the author was assuming that Pascal didn’t even tape out until Q3 of 2015
>Those WW3/2015 date codes mean Nvidia had to tape their Pascal designs out in late 2014, even before the TSMC process was ready for such tapeouts, much less stable. The Nvidia designers were obviously so good and so far ahead of the game that they were able to put out a design that worked so well out of the box that even on an unstable, some go so far as to say non-existent, process it worked on the A1 stepping. Not only that, it worked so well that once TSMC stabilized the 16FF+ process months after Nvidia produced their Pascals, no update to the GPUs was even needed! Imagine that, so many critical PDK updates from TSMC and none managed to do any better than those plucky geniuses at Nvidia could get quarters before. That is a commanding lead if there ever was one, simply staggering process tech.
>It all has to be true because as an executive in a publicly traded company with press and analysts who cover and trade Nvidia in the room, he is obliged by the SEC to be truthful. The only other explanation is that he knowingly lied to the press and analysts and showed a fake card in his keynote, something that would clearly be illegal. Jen-Hsun would never do something like that, right?
NVIDIAFAGS ON SUICIDE WATCH
FLASHBACKS LIKE PORK SHOULDERS, WOODSCREWS, HOUSEFIRES, 1.7 YIELDS AND SCALING SENDING NVIDIAFAGS INTO SHOCK.
STATE OF EMERGENCY. REMEMBER THE DEAD PANELS AND THE DRIVER OF CERTAIN DEATH.
WARNING: THIS THREAD MAY INDUCE POST TRAUMATIC STRESS
did they say it's a working chip he is holding for the press? did they sell it? no? so shut up, cancerous faggot
also I bought maxwell on release and I will switch to pascal when it's out. stay irrelevant amemed. you don't even have anything to show
>you don't even have anything to show
The article is being very sarcastic when it claims that NVidia has had this available since early 2015, because that would be impossible.
It's also noting that everyone else claims a chip that can handle both GDDR5 and HBM is impossible to make, meaning the card shown couldn't be what it's claimed to be.
How feasible is it to make a special batch of chips, considering nvidia is a major customer and TSMC were working on tweaking 16nm for mass production anyway?
It's especially believable since the pascal shown is using the older gddr whereas the new consumer ones will use hbm.
No, this is hbm
special batch + gddr5 sounds reasonable. charlie is just fishing for clicks again.
TSMC did not get their 16nm process up to bare minimum production standards until nearly the middle of '15
The chip Nvidia showed is dated to the third week of January and is presented as a working board, which implies they had functioning test units (NOT THE SAME AS A WORKING CHIP) on 16nm as far back as September/October '14
The semiaccurate article is trying to tell you that what Nvidia showed is purely impossible.
No, it isn't reasonable.
It's as far from reasonable as it gets.
Charlie doesn't spew filth and is one of the most often correct people in the industry.
>The only other explanation is that he knowingly lied to the press and analysts and showed a fake card in his keynote, something that would clearly be illegal.
CHING CHONG DING DONG
>Charlie doesn't spew filth
that man exists to shit on nvidia
He writes tabloid-tier shit and is only a tier above currytech. I wouldn't count on him to know what's actually going on inside the industry.
>Burgerfats in charge of understanding sarcasm
>right about intel code name changes two years in advance
>right about cannonlake being incorrectly placed/pushed back
>right about the 10nm delay and kaby being a refresh 18 months ago
>has insider sources you can only dream of
Sorry bud, some of us can actually pay for subscriptions and do in fact have a working memory.
Shills. Shills everywhere
lol? I'm not even trying to debunk him or anything. Can't people have a discussion without "destroying" someone? You must be a blast to hang out with.
He claimed Maxwell got cancelled and it got released 4 months later.
The site is called semi accurate for a reason.
That poor sod is desperately trying to stay relevant.
We're not even shilling for AMD here. We feel sorry for you nvidiacucks. Your company blatantly lies and cucks you so much it's just embarrassing to point it out anymore.
Yeah AMD has fucked up before but wooden screws, 3.5 gb of gddr5, drivers killing gpus, 1.7 yields, etc. And now blatantly lying about having a pascal gpu, god damn.
It's almost like nvidia doing stupid shit is a normal thing now, a fucking real world meme.
Yeah buddy, that whole 20 watts really makes a difference!
WANNA BUY THIS 660TI? IT ONLY DRAWS 170 WATTS!
BUT HEY WAIT, THIS 7870 IS JUST AS GOOD, BUT CHEAPER! HEY WAIT, IT DRAWS 185 WATTS!
Hawaii wasn't released until q4 2013 and wasn't widely available until q2 2014 due to miners
I'm talking about 2012, where everybody got a 680 over a 7970 and where an oc'd 680 would outperform a 7970 by a good 30%
Wait so if there is no legit Pascal GPU does that mean it's going to be shit because nvidia is rushing it as much as possible?
I have a GTX 960 and I'm very scared Pascal will be a piece of shit. Might have to go with AMD for an upgrade this time.
It made the 780 look like a joke and beat the titan a little (making that look like a joke too because lolprice); the 780Ti that was released after it was a little faster on account of being a much bigger die with more transistors.
Nvidia have always had the fastest single chip because they make fuckhuge chips with no regard for yields. They usually put it in stupidly expensive cards no-one buys (e.g. Titan X), but use it to convince consumers that nvidia are better in general just because their biggest chip is bigger than amd's biggest chip.
There are basically no known facts about release, availability, or how many of the Pascal chips they will be able to make.
Don't worry about it as this doesn't really change anything, it just shows that Nvidia are up to their old shenanigans.
It's going to be late and expensive. Absolute performance will probably be fine but the fact that nvidia have fucked up their yields yet again means AMD will be available first and have much better price/performance.
So why is this even a thing?
It's clearly a fucking working prototype board using modules of older shit for testing, swap those little sub boards out for pascal and it's the exact same thing.
He's right about nvidia rushing the shit out of pascal. Something is bound to go wrong. AMD is set to release arctic islands around Q2 2016; nvidia is going to be raped violently by amd if pascal is shit.
I get this strange feeling that pascal is just maxwell with a few hotfixes burned onto 14nm lithography. Probably just me being paranoid, I hope.
To be fair that's probably the only thing they need to do to be competitive:
>implement HBM controllers in high end chips
>update maxwell design for better 14nm compatibility
>add in some hardware async scheduling for feature competition
A shrunk Maxwell can probably reliably hit 1.8GHz boost, if not 2GHz flat out, while drawing 2/3rds the power
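Quick back-of-envelope on that claim in Python, since someone will ask. The capacitance and voltage scaling factors here are generic planar-to-FinFET guesses, NOT actual TSMC 28nm-to-16nm numbers:

# Toy dynamic-power scaling: P ~ C * V^2 * f (leakage ignored).
# cap_scale and volt_scale are assumed shrink benefits, not measured values.
base_clock = 1.2   # GHz, roughly where Maxwell boosts today
new_clock = 1.8    # GHz, the claimed shrunk-Maxwell boost
cap_scale = 0.55   # assumed switched-capacitance reduction from the shrink
volt_scale = 0.85  # assumed supply-voltage reduction

power_ratio = cap_scale * volt_scale ** 2 * (new_clock / base_clock)
print(f"relative power at {new_clock} GHz: {power_ratio:.2f}x")  # ~0.60x, about 2/3rds

So the 2/3rds figure is at least self-consistent, if you buy those scaling assumptions.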
>So why is this even a thing?
>It's clearly a fucking working prototype board using modules of older shit for testing, swap those little sub boards out for pascal and it's the exact same thing.
This. I don't understand the issue
It's literally shrunk maxwell+hbm+FP16. Not necessarily a bad thing taken alone, maxwell isn't bad and die shrinks mean more cores in the same power envelope. It's just that it's gonna get nuked by Polaris, which has big architectural improvements alongside the shrink.
What the fucking hell are you spacks talking about?
The pictured shit is from nvidia talking about their car shit; that's the internals of a processing box that lives in the car to drive everything.
It's going to come with pascal, so why the fuck is this prototype board being blown up into such a big deal when any retard could see fucking VRAM modules and a tiny little chip, and therefore see it's not HBM and not pascal?
>It's just that it's gonna get nuked by Polaris, which has big architectural improvements alongside the shrink.
N-no. nvidia is going to work very hard on pascal. Besides, amd has less money so that means they spend less on r&d.
There's no way pascal is just maxwell on a smaller process, stop spreading FUD. nvidia can't risk another fiasco, I'm certain pascal will do just fine.
Faggot, i have a 280X in my desktop right now, i couldn't give less of a flying fuck what nvidia do. i just want to know why a bunch of retards are making a big deal over what is very clearly a prototype module-accepting board in no way designed for PCs and, as far as i can tell, with no claim to actually be related to pascal at all, apart from it being said to come with it upon release, which is what you'd expect from something designed to run MXM being TESTED with 980s
N-no. AMD is going to work very hard on polaris. Besides, Nvidia has more money so that means they spend more on r&d.
There's no way Polaris is just GCN on a smaller process, stop spreading FUD. AMD can't risk another fiasco, I'm certain Polaris will do just fine.
The article is written by retards for these retards on /g/ to spout bullshit about.
"hurr pascal hurr fake"
It's a fucking prototype for nvidia's car processor that's using 2x 980 MXM modules in place of the actual pascal ones it'll be using upon release.
Look, the wood screws thing was bad because it was so poorly done.
But is it really that big of a deal to show a model instead of a final product? Why do you care so much? Particularly in the drive px thing, where it was really about showing the form factor of the final product, and that it can be put into existing cars with small engine spaces.
Why is this such a huge deal?
There is no working pascal gpu, you turd. AMD is already diamonds for nvidia's vagina and is set to release polaris GPUs around June 2016.
This means Polaris GPUs will hit the market first. Pascal better fucking annihilate polaris GPUs or else everyone is going to flock to AMD.
To compound the problem even further, polaris is the 4th gen of GCN, not just a die shrink. Pascal is rumoured to just be maxwell with a die shrink. All of this is extremely bad news for nvidia.
... Lying about what, exactly? Do you think that Steve Jobs went on stage with working final iPhones months before their release?
And if you only care about graphics cards for gaming, then paying attention to the brand argument makes sense, but there's literally no better option for professional users than nvidia cards, and they still manage to work perfectly in games. So again, the question is: literally why investigate that they had a fucking display model on stage?
So nvidiacucks are actually this dense? damn, nvidia is so lucky to have zombies like you.
>can't defend own brand loyalties because there's no base to stand upon
>therefore we must attack everybody else
>this is a political issue for my favorite brand and not an issue of hardware functioning as advertised because my feelings are more important
>bbut they make more money th-they're better sh-shut up!
max damage control
Corporate vomit speak plz go
Not to mention share-holders
Lying about it being a real, functional card at the time of presentation, and then continuing to say it when they were proven wrong. Steve Jobs has nothing to do with this lol. also AMD is going to start interpreting CUDA on their cards, get fucked
heh you forgot botnet email drivers
Did you watch nvidia's conference at all? Tell me how putting new technology on the chip means it's JUST a die shrink.
And showing an on-stage model does not mean not having a working design. Why would they produce a whole working functional chip just to show people what it looks like?
Ah! I love Charlie. I mean, I quit reading S|A years ago; I occasionally read Charlie when he posts on RWT. But still, I love him.
What to say:
2009: Jen-Hsun Huang will never bring Nvidia share price back to $18.
2010: Fermi is broken and unfixable.
2012: Why purchasing SeaMicro was important for AMD.
And so on... LOL-Charlie is best-Charlie.
No, nvidia's just lucky that I need drivers to CAD with.
The quoted post said they're lying and said they're keeping people from going to AMD. What strawman did i make, oh master debater sir?
>Do you think that Steve Jobs went on stage with working final iPhones months before their release?
Sorry, do you think Steve Jobs showed products months before they were ready for release? You might want to go back and take a look. The average time between announcement and release of iPhones was less than two *weeks*.
This is ignoring the point that in this case, lying about the product is actually illegal.
>This is ignoring the point that in this case, lying about the product is actually illegal.
It's bad enough polaris is set to rape nvidia before pascal is out, but now nvidia is going to be in legal trouble as well. Bad times are ahead for nvidia. Those who were retarded enough to invest in nvidia must be so fucking pissed off right now.
Enjoying the attempts at character assassination from the resident Nvidiots in an attempt to deflect attention. :^)
Holding up something and saying "This is a new thing that we've made!" when it's not is lying and very, very illegal when a large company is doing it to a large crowd with journalists and what have you.
Let's lay off the fanboyism, eh? This certainly is problematic but it's not the end of the world, and it's a little bit early to say "lol NVidia bankrupt & BTFO top kek"
>Let's lay off the fanboyism, eh? This certainly is problematic but it's not the end of the world, and it's a little bit early to say "lol NVidia bankrupt & BTFO top kek"
I'm not even an AMD cuck and I've mostly used nvidia products in the past. But yeah, it's still too early to say how badly this fiasco is going to affect nvidia.
>amd announce end of performance optimisations because transparency, release crimson for 5000 series anyway
>nvidia stopped working on fermi the moment kepler came out and just didn't announce it
>nebulous claims about hacked-in pseudo-dx12 compatibility coming in some imaginary future
Got wddm 2.0 a couple of weeks ago for my GTX460, as promised. wddm 2.0 is a prerequisite for DX12 support (though they're not supposed to be fully compatible, they'll get virtual memory addressing and multithreading parallelism).
>Look, the wood screws thing was bad because it was so poorly done.
>But is it really that big of a deal to show a model instead of a final product?
It is when the silicon is stamped with impossible dates and said to be working.
>Why do you care so much?
Because Nvidia has lied directly to consumers several times in the past now. All the memes about them didn't show up just because someone thought they would be funny to say. Nvidia vomited out the source material and people ran with it.
The only true humor is the truth.
And yet, there was no physical way they had a working 16nm Pascal at the time they claimed.
There was no 16nm node even online at the time, much less ready for tapeout or sampling.
>also AMD is going to start interpreting CUDA on their cards, get fucked
this is literally wrong, AMD is writing a converter to port CUDA to OpenCL.
>2009: Jen-Hsun Huang will never bring Nvidia share price back to $18.
Obviously wrong and opinionated.
>2010: Fermi is broken and unfixable.
They never actually fixed Fermi, just the underlying fabric it was built on. Reduced
>2012: Why purchasing SeaMicro was important for AMD.
Aaaaand evidently you don't understand the server industry.
I will always buy Nvidia™ because I only play games The Way It's Meant to be Played™. Nvidia also pioneers innovative new technologies like PhysX™, Gameworks™ and the highest quality driver to ever grace Windows.
When I boot up with a brand new Nvidia™ Geforce™, I can experience the game just like it's meant to be played. Nvidia™ also delivers a far more silky-smooth experience.
Nvidia Geforce™ is also very power efficient. A graphics card is the most power hungry device in your house. Refrigerators, air conditioners, water heaters, dish washers, lights, etc all use significantly less power than a graphics card. Which is why Nvidia™ puts gamers first by ensuring that their gaming experience is of the highest quality while looking out for gamers by giving them the most value in their electrical bill.
At this point in time, there's really no reason to consider an AMD graphics card at all. I tried one once, it caused so much heat that it exploded. It also consumed so much power that it gave off an EMP and destroyed the rest of my computer.
Nvidia™ also pioneered how useless GPGPU is with CUDA™. Years ago, everyone thought GPGPU, CUDA™, and OpenCL were the future. Now, Nvidia™ has removed those useless features from their GPUs and increased efficiency. Now you can save thousands a year in electricity thanks to Nvidia™ ensuring that useless features like GPGPU are "optimized" for gamers.
It's quite clear that OP's an AMD shill trying to convince you to settle on something less than The Way It's Meant to be Played™. Nvidia™ is the only real way to play games. We have seen recently that they offer incredible libraries for software developers like Nvidia Gameworks. He is probably too poor to afford the Nvidia Geforce Experience and cannot afford to play any games The Way It's Meant To be Played™.
Don't be a poor gamer with bad drivers and a huge power bill. Play games with the Geforce™ Experience™: The Way It's Meant To Be Played™
I will always buy Intel™ because I only play games with Intel Inside™. Intel also pioneers innovative new technologies like Hyper Threading Technology™, Intel Rapid Start Technology™ and the highest quality chipsets to ever grace motherboards.
When I boot up with a brand new Intel™ i7™ with the latest Z chipset, I can enjoy the games the way they were meant to be with Intel Inside™. Intel™ also delivers a far more silky-smooth experience with its Hyper Threading Technology™.
Intel i5™ is also very power efficient. A processor is the most power hungry device in your house. Air conditioners, water heaters, lights, etc all use less power than a processor. Which is why Intel™ puts gamers first by ensuring that their gaming experience is of the highest quality while looking out for gamers by giving them the most value in their electrical bill.
At this point in time, there's really no reason to consider an AMD processor at all. I tried one once, it caused so much heat that it exploded and nearly burnt down my house. It also consumed so much power that it produced an EMP and destroyed not only the rest of my computer but my entire neighborhood.
Intel™ also pioneered how useless MORE CORES is with the i™ series processors. Years ago, everyone thought MORE CORES were the future. Now, Intel™ has debunked that myth entirely and increased efficiency. Now you can save thousands a year in electricity thanks to Intel™ with its powerful IPC. MORE CORES will never be part of Intel's™ line up.
It's quite clear that OP's an AMD shill trying to convince you to settle on something less than the optimal experience with Intel Inside™. Intel™ is the only real way to play games. We have seen recently that they offer incredible libraries for software developers like Intel C++ Compiler. He is probably too poor to afford the Intel Inside™ experience and cannot afford to play any games.
Don't be a poor gamer with bad chipsets and huge power bills. Play games with Intel Inside™
Pascal so badly wants to be fermi again it needs watercooling lol.
No, it's that AIOs are the future for graphics cards. I think air-cooled cards might go the way of the dinosaur for high-end cards and be the mainstay for the enthusiast and some mainstream cards. The lower budget cards will still be air-cooled, but the overall trend seems to point towards an AIO-dominated future for anything higher than $300.
>Wanting NVidia to fail
>Wanting the only good option to be cheap, power-hungry cards with crippled drivers and poor support.
>Not wanting NVidia to improve on Maxwell's excellent power-to-performance ratios and functional software, and for AMD to step their game up in a big way and push technology further for once.
We should be rooting for both teams.
Problem is, with Asetek's AIO patent shenanigans and the general cost associated with AIOs, you aren't truly any better off than straight-up air cooling, given GPU AIOs tend to be hybrid solutions (i.e. you are still air cooling the power delivery).
Having the core cold isn't that impressive if you need a screaming fan to cool the vrms as you pump more and more voltage through them.
Maybe, but it doesn't really matter if NVIDIA fixes Maxwell's abysmal compute problems. Even a lower end card (1060?) will perform great, and even better if you buy an adaptive sync monitor. Max them settings and get 36fps that still feels like 60.
Honestly I wish I would have bought a G-Sync monitor already so I can do shit like max hairworks in TW3 and not make the game unplayable.
Shit like Hairworks is the future so compute needs to go up, and fast, or G-Sync needs to come down in price from $300-400 premium to like $50-100 or even 0 because of how needed it is for demanding games.
Even a Titan-X will choke with hairworks on high, but maybe that's Maxwell's fault.
>screaming fan to cool the vrms
Both of my Hybrids are surprisingly quiet, despite putting slightly more voltage than needed through them, so the VRMs are not the problem.
The issue I have is that the VRAM isn't getting enough cooling, so I'm limited in how much I can overclock the memory. I regret not waiting for the Gigabyte Waterforce to come out, which does cool the VRMs and VRAMs via a massive water-cooling block.
>Both of my Hybrids are surprisingly quiet, despite putting slightly more voltage than needed through them,
Depends what hybrids we are talking about - maxwell gets away with it as it doesn't respond well to voltage in general (i.e. you normally overclock as far as you can on stock voltage). Hybrid cooling a hawaii or fat kepler, for example, would have your overclocks limited by the fairly so-so vrm cooling most hybrid solutions have.
My 290x is fucking monster clocked and a small fan would not be enough to contain the enormous voltage I am slamming through my vrms (My max overclock wants in the region of a +180mv offset).
One trick I learned from an old ATI fan was to buy thermal pads to put tiny copper heatsinks designed for motherboard MOSFETs onto the VRMs. They're almost the same size (if not the same) and the heatsinks really does bring down their temps by a handful of degrees.
I don't think you realise how much cooling a 290x running at near 1.4v needs when on air.
I can do it but prefer not to simply because cooling vrm1 becomes a real bitch.
Just put that card on its own watercooling block. An R9 290X is one of those cards that is begging to have its own dedicated cooling loop. You're not doing it any favors by air-cooling it with an aftermarket monstrosity.
> AMD has an established track record for dramatically increasing the energy efficiency of its mobile processors, targeting a 25x improvement by the year 2020
Around 20c as I live in bongland. Shame SP120's aren't the best fans for sound as I need to run them at 12v to cool like that and even my R4 can't dampen out 2300rpm fans easily.
You're a retard if you really think Pascal isn't going to have support for hardware async computing and other features not implemented on Maxwell because the standards were not finalized by Maxwell's release.
Same thing with AMD's Polaris. Both GPU families will be 100% DX12 compatible for once.
DX12 features were finalized in Q2 2015, if I remember correctly. And those new features are mostly firmware related rather than hardware, or ride upon existing DX12_0 standards.
Remember that Mantle pioneered the idea of async compute, and it was basically just designed to be an interface to GCN that only incidentally worked on other GPU architectures
It's funny when nvidiafags tried to turn the housefire meme towards amd when it originated from nvidia in the first place. And now AMD has a card that is nearly 50% more efficient than Maxwell GPUs
>AMD has a card that is nearly 50% more efficient than Maxwell GPUs
Yeah, every company prototypes but if you read the thread you'll essentially see that NVidia tried to pass off some old junk they had laying around with GDDR5 as a prototype of Pascal with HBM.
Look guys, if you're going to shill for Nvidia effectively, you need to keep up with the latest tech news. Ask either Raj or Sanjay upstairs if you need guidance on where to look.
>believing AMD's lies
How much do you want to bet that this is one of the high-end GPUs that they throttled to 25% of its peak power to get those low-ass numbers? This will never pan out in the real world with their mainstream GPUs. They're pulling the old bait-n-switch trick.
They showed this same demo running at CES
Their system pulled 88w, the Nvidia system pulled 146
146 - 88 = 58
Let's assume both cards were frame locked at 60, and running at an average of 85% of their TDP
(90 * .85) - 58 = 18.5, and 18.5 / .85 ≈ 22
AMD has a ~22w (14nm) GPU capable of the performance of a 90w (28nm) GPU
This is fucking huge, regardless of whether it was a large chip downclocked/power limited.
And considering their recent order of 30 of their lowest-end Polaris chips, I would wager that the showcased product was in fact their low end.
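Here's that back-of-envelope as Python for anyone checking it at home. The 90w TDP and the 85% load factor are assumptions carried over from the post above, not measurements:

# Implied Polaris power from the CES wall-meter demo numbers.
# nvidia_gpu_tdp and load_factor are assumptions, not measured values.
nvidia_system_w = 146     # measured at the wall
amd_system_w = 88         # measured at the wall
nvidia_gpu_tdp = 90.0     # assumed TDP of the 28nm Nvidia card
load_factor = 0.85        # assumed fraction of TDP drawn at the 60fps cap

wall_delta = nvidia_system_w - amd_system_w   # 58w difference at the wall
nvidia_gpu_w = nvidia_gpu_tdp * load_factor   # ~76.5w actually drawn
amd_gpu_w = nvidia_gpu_w - wall_delta         # ~18.5w actually drawn
amd_gpu_tdp = amd_gpu_w / load_factor         # ~22w implied TDP

print(f"implied Polaris draw: {amd_gpu_w:.1f}w, implied TDP: {amd_gpu_tdp:.1f}w")

Note the whole thing assumes the entire wall delta comes from the GPUs, which is itself a leap.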
390 was my first card ever. I'm straight out of peasantry at 23yo. I have no complaints but I do want to know what the hype behind nvidia's all about. I hope the next generation of either company rapes my 390
And that's my point
A GM200 will be able to do 1080p at 60FPS capped using much less power than a GM206 would be able to do at those same settings. You're getting played. It may be more energy efficient to play less demanding resolutions and settings with a more expensive card, but not everyone will be able to afford those cards, and those that do will not play at those settings.
It's an unrealistic and downright misleading display of power efficiency when you're running a GPU just a tad bit over its idle power.
You will NOT be able to replicate those power consumptions with a mainstream Polaris GPU at the same settings, since they WILL draw more power since their workload is that much higher.
It was leaked a few hours before it went official. It is an official slide.
>>card is not launched
Showing the performance of a next gen card is pretty common.
>>compare polaris with maxwell and not pascal
Considering that Pascal will be launching later than Polaris, it makes sense. It's also very common, NVidia does the same all the time.
Not as good as NVidia comparing the 9xx series to the 6xx series in their slides. That one was a good laugh.
>>28nm vs 14nm
Please point to NVidia 14nm performance numbers that AMD can use as a comparison.
>I have no complaints
AMD Radeon Software Crimson Edition 16.1 Hotfix driver
- Fallout 4 – The compass flickers during gameplay on AMD Radeon™ R9 290 and AMD Radeon™ R9 295X2
- Elite: Dangerous – Poor performance may be experienced in Supercruise mode under Windows® 10
- The driver installer appears to hang at various stages of the install process
- Call of Duty: Black Ops 3 – Random frame freezes may be observed during gameplay
- Frame Rate Target Control (FRTC) settings do not apply consistently to all games. In order for FRTC to function properly, Vertical Refresh/VSync must be turned off
- DiRT Rally – A crash may occur when starting a new race with AMD Crossfire™ and AMD FreeSync™ enabled
- The AMD Gaming Evolved overlay may cause a black screen, or introduce game stutter
- Assassin's Creed Syndicate – Using "Very High" graphics settings in 3x1 Eyefinity mode may cause displays to switch off
- Star Wars™: Battlefront – Some flickering may be experienced in shaded areas of the screen while game levels are loading
- Call of Duty: Black Ops 3 – Frame freezing during gameplay may be experienced
- Just Cause 3 – The system may hang when task switching on systems with AMD CPUs and GPUs
- Just Cause 3 – Application profile setting added for laptops with Switchable Graphics
- Fallout 4 – Gameplay may be choppy in AMD FreeSync™ mode in Ultra mode at 1440p resolution
- Fallout 4 – Brightness flickering observed with AMD FreeSync™ enabled on certain displays
- cnext.exe intermittently crashes during Windows® shutdown
I'm going to need some specific sources which show that two cards from the same family, one being the high/highest end and one being the low end, draw different amounts of power for doing the same work.
Like, say, frame capped benchmarks.
Is it being 120mm^2 not indication enough for you? Want some reference? That's a bit bigger than a fucking Intel CORE M.
There's no throttling, there's no capping, there were two systems run off a power meter in front of the press.
- A crash may be experienced if an HDMI™ display is a cloned display device on an HP Envy 15 notebook
- "Failed to create OpenGL context" error message may appear after installation
- "Cannot find RadeonSettings.exe" error message may appear during installation
- "AMD Install Manager has stopped working" error message may appear during installation
- "Cannot find cncmd.exe" error message may appear during installation
- Display may flicker on certain laptops after prolonged gameplay with AMD FreeSync™ enabled
- Unable to create 4x1 or 2x1 portrait mode SLS with 4K displays
- Video corruption may appear in the Movies & TV app when VSR is enabled and the scaling mode is set to "Full panel"
- Portrait Eyefinity mode may not be configured correctly using Radeon Additional Settings
- No display on certain laptops when toggling display mode or connecting an HDMI™ display
- Flickering may be experienced on some monitors when AMD FreeSync™ is enabled
- Notifications reverting back to English on non-English systems after reboot
- Misaligned UI may be observed on the Bezel Compensation screen
- Launching a game from the Game Manager may launch on a single display after enabling and disabling AMD CrossFire™ in a 3x1 AMD Eyefinity™ setup
- Marginally increased power consumption may be observed during video playback
- StarCraft II – Flickering may be observed in the 'Episode 3' campaign
- Call of Duty: Black Ops 3 – Flickering or poor performance may be experienced when running in AMD Crossfire™ mode
- Call of Duty Online – The game may crash if the Print Screen key is pressed on a 4K monitor
- A system restart may be experienced when waking the system from sleep mode on some systems with Intel processors
- Star Wars™: Battlefront – Texture corruption may be experienced if the game "Field of View" setting is > 100
- Star Wars™: Battlefront – Some users may experience minor flickering or corruption at different game locations or while viewing the in-game cinematics
- Assassin's Creed Syndicate – Building textures may be missing on some AMD Freesync™ displays with VSync enabled
- Assassin's Creed Syndicate – The game may crash if the Gaming Evolved "In Game Overlay" is enabled. A temporary workaround is to disable the AMD Gaming Evolved "In Game Overlay"
- Total War™: Rome II – Choppy gameplay may be experienced
- Gaming Evolved client does not initiate when launching Metro Last Light if AMD CrossFire™ is enabled
- Far Cry 4 – A crash may occur after performing (ALT + Enter) to switch between windowed/full screen modes with the AMD Gaming Evolved "Video Capture" feature turned on
>I'm going to need some specific sources which show that two cards from the same family, one being the high/highest end and one being the low end, draw different amounts of power for doing the same work.
Can't you google it for yourself? Jesus Christ, this is common knowledge. A card uses far less power at idle and if it works barely any harder than it does at idle, then the card will use drastically less power than a card that has to work near its total limits.
Proof? Other than what they claimed? I don't consider Greenland to be "low-end" when it's going to be a replacement for the 390X and Fury series of cards.
- Talos Principle – A crash may occur while changing Gaming Evolved Video settings or pressing ALT + Enter when "In Game Overlay" is enabled
- Mad Max – Low FPS performance may be experienced in game when AMD FreeSync™ and AMD CrossFire™ are enabled
- Battlefield Hardline – A crash may occur when changing graphics settings from "Ultra" to "High" during gameplay
- Some games may experience brightness flickering with AMD FreeSync™ enabled
- Radeon Settings – AMD OverDrive™ clock gauge needles for the secondary GPU may be in the wrong position when the system is idle and the secondary GPU is inactive
- Radeon Settings – AMD OverDrive™ Power setting changes on the secondary GPU are not immediately displayed. This is seen only on dual GPU graphics cards, such as the AMD Radeon™ HD 7990 and Radeon R9 295X2
- Game stuttering may be experienced when running two AMD Radeon™ R9 295X2 graphics cards in AMD CrossFire™ mode
- Display corruption may occur on multiple display systems when the system has been running idle for some time
- Star Wars™: Battlefront – Corrupted ground textures may be observed in the Survival of Hoth mission
- Call of Duty: Black Ops 3 – Flickering may be observed if task switching is used during gameplay
- Assassin's Creed Syndicate – Building textures are missing and game objects stutter if VSync is enabled in Quad AMD Crossfire™ configurations
I want my gameworks
>we still don't have physical proof or pictures that the GPU was that small
You first, AMDrone
They should permaban you for shilling so hard. I bet you're an employee or a paid poster.
That has to be one of the stupidest complaints I've seen in a while, almost as bad as the anon going through the knowledgebase. That can just as easily be done for NVidia.
>AMDrones going all out
Intel + Nvidia > Amdcuck30fpsoverheatingindianshit
Nice spamming, Nvidiafag
This was a public presentation, in which they told everyone they were running their low-end 120mm² chip.
In any case, "use google" is not an answer.
You made a claim, I asked for proof (of which there isn't any), and you failed to deliver.
Further, a larger GPU would in fact use more power than a smaller one for the same workload, simply because the larger chip has more circuitry to suck power even if those units are idle.
Your logic is pure fiction. It makes no sense and is probably the same reason we won't find any power draw comparisons for frame-capped benchmarks between the same family of GPUs, one high end and the other low.
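Funny thing is, both sides of this slapfight fall out of the same toy model depending on what you assume about the idle floor. Quick Python sketch; every number here is invented for illustration, none of it is measured data:

# Toy model: board power = idle floor + utilization * (max - idle).
def gpu_power(p_idle, p_max, util):
    return p_idle + util * (p_max - p_idle)

work = 100.0             # arbitrary work units needed per capped frame
big_util = work / 600    # big die loafs along at this workload
small_util = work / 130  # small die has to run nearly flat out

# Scenario A: big die with a modest idle floor -> big die wins
print(gpu_power(30, 250, big_util), gpu_power(12, 120, small_util))   # ~67w vs ~95w
# Scenario B: big die with a huge idle floor (the "more circuitry" argument) -> small die wins
print(gpu_power(110, 250, big_util), gpu_power(12, 120, small_util))  # ~133w vs ~95w

Which scenario is closer to reality is exactly the thing neither of you has posted frame-capped wattage numbers for.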
You nvidia shills should know better than to come to /g/, you guys get BTFO every time.
>And now AMD has a card that is nearly 50% more efficient than Maxwell GPUs
>in which they told everyone they were running their low-end 120mm² chip
They told, but THEY NEVER SHOWED
All they needed to do was pull the GPU out of the system, remove the heatsink, and present the physical die to the audience. Or at least have the same card on display with its heatsink removed so people can draw their own conclusion.
All we have to go by is their assurances that it is 120mm^2.
Hang yourself, Pajeet
In any case, the GPU RTG showed off was a small GPU. And while Raja’s hand is hardly a scientifically accurate basis for size comparisons, if I had to guess I would wager it’s a bit smaller than RTG’s 28nm Cape Verde GPU or NVIDIA’s GK107 GPU, which is to say that it’s likely smaller than 120mm2. This is clearly meant to be RTG’s low-end GPU, and given the evolving state of FinFET yields, I wouldn’t be surprised if this was the very first GPU design they got back from Global Foundries as its size makes it comparable to current high-end FinFET-based SoCs.
Where is your proof that a larger GPU would use less power than a smaller one for the same workload?
I don't want bullshit personal anecdotes, I want cited, sourced facts and figures.
i had 2 day-1 290x leafblowers.
mad cards, but the noise and heat output was such a joke.
they didn't overclock at all, but i sold them last week for the same fucking price i bought them at
think about that. just think about that.
3 year old cards, no value lost.
Despite them constantly getting GCN specs wrong?
>Steve Jobs shows a non-retail iPhone in January 2007, 6 months before release
>literally has a different motherboard to support video out
>Mercedes shows a non-retail concept car at Geneva Motor Show
>doesn't even have an engine, not even a real car
According to AMD fanboy logic, publicly traded companies are not allowed to show prototypes ahead of release.
I guess that's why AMD is on the verge of bankruptcy and worthless.
No, no, no
When talking about this specific argument we need to not only isolate the GPU type, but also the process of manufacture.
For any benchmark to prove that a larger GPU would use less power than a smaller one for the same work, you would need to take, say, a 7970 GHz Edition and a 7770 GHz Edition, or a 290X and a 7790, and measure how much they draw when frame capped to 60 in a game (and with settings) that ensure the smaller card can maintain 60fps.
I posit that it's impossible, and you posit that you don't know what you're talking about.
You're fucking kidding right? Those benchmarks are the opposite of frame locked, they are for maximum output.
You have absolutely no understanding of empirically correct testing, and have no way to show me what you say is true.
i just reseated my 290x, but what i did was keep the original cooling plate and add the accelero to it.
that way everything is cooled and i don't need small heatsinks for the vrms and the memory.
vrm1 runs at a cool 65 while the gpu hits 67.
this is 1200/1700MHz at 1.32v
I actually bought my dad an OC'd 380 4GB version and it runs whisper quiet, was kinda impressed. I can't wait for the Polaris GPUs to come out, it's the jump AMD has needed for quite some time now. HBM was only the beginning.
>How much do you want to bet that this is one of the high-end GPUs that they throttled to 25% of its peak power to get those low-ass numbers? This will never pan out in the real world with their mainstream GPUs. They're pulling the old bait-n-switch trick.
that would just mean that they have a card that can perform 4 times better lol.
that's even better!!!!
I might have to give that a look - while my original cooler was a tri-x, i've stripped the screw cylinders that hold the heatsink to the core (thus I can no longer mount it). I might see if I can bake the heatsink/baseplate combo to break the solder so I can refit the base plate to the gpu. Still, I don't see it providing better cooling than the significantly larger vrm heatsink the morpheus ships with.
If you frame-lock the cards, my point gets even stronger, cumlicker.
What do you think the power-saving features of Maxwell cards do? Feed magical bullshit lean diet energy to the cards? Are you fucking dumber than a sack of rocks?
It's called frame-capping, and it achieves lower power consumption BY LOCKING THE FRAMES TO 60FPS OR WHATEVER THE SETTINGS ENTAIL. If a 960 struggles to get 60FPS when overclocked, how does it use LESS wattage than a 980 Ti that barely goes over idle to produce the same results (and you can compare the idle power consumption for those cards, the differences are barely 5%)?
You must be a charity case, because I'm starting to feel really bad for arguing with a clearly mentally deficient individual here.
yes yes, we get it, it's old vs new.
but at the same time nvidia doesn't have any newer chips, right?
so fuck off, maggot.
besides, nvidia 16nm won't come close to amd 14nm anyway, if that helps your argument
>because I'm starting to feel really bad for arguing with a clearly mentally deficient individual here
That was exactly how I felt about 10 posts ago.
You're either a hardcore shill, a "pretending to be retarded", or a complete fucking moron.
I know it's difficult to wrap your mind around all the intricacies of how data calculations push different hardware.
But nothing you say makes any actual sense, and further you have literally zero actual evidence to prove your original claim.
Big money make smaller thing, smaller thing make less power use, same faster.
after spending years and tons of money, current GPU tech has been shrunk down.
shrinking it uses less power while being able to run faster.
the AMD demo showed two cards (one not shrunk); theirs was drawing a little less than half the power for the same work done.
And we all patiently wait for summer to come around to learn more.
>Nvidia only shipping parts for testing
>AMD already shipping complete boards
Nvidia literally bankrupt and finished
Nah, they've got fat deals with major publishing houses; those will make sure AMD cards stay crippled by overusing a certain aspect of the graphics pipeline their hardware doesn't handle all too well.
Since polaris has new geometry units, tessellation will be much more powerful, so they'll use something else that's really not appropriate or has no benefit to the game, so the hardware can't handle it in huge doses.
Yeah, because look at the huge power savings that Intel made with the die shrink of their CPUs after all.
Oh no, wait, the 6700K has a higher TDP than Haslel or Inferno Bridge, and identical to the 32nm 2600K.
Not him, but it's not quite that simple; there's more going on right now than just China. The market is very unstable right now, it may sink considerably before it starts to climb again. Buying now is a major gamble, you're essentially advocating that someone should bet their money.
They'll have "next gen" cards out sooner, though if you need one right now then you'd be better off looking at what fits your price bracket the best. 380 is the best bang for your buck, 970 is the most popular.
>iGPUs don't shrink too
Retard. The 6700K's iGPU is still complete shit. Ironically, Broadlel had a much lower TDP than Skymeme AND the best desktop iGPU ever made. And nobody bought it.
Bit different with CPUs m8. With node shrinks for a cpu you end up throwing the extra transistors at stuff like ILP and cache where it's difficult to gain anything. With GPUs you just slap on an extra thousand cores.
>Retard. The 6700K's iGPU is still complete shit. Ironically, Broadlel had a much lower TDP than Skymeme AND the best desktop iGPU ever made. And nobody bought it.
Broadwell has such a good iGPU because there's a 128MB cache on die for it, kinda like what AMD is going to do with HBM
That's why the chips cost like $500
Except the 390 is basically just a slightly upgraded 290, and the 290 is going to beat out the 970 as well
Yes, they put the same GPUs, with more voltage regulator modules and more Vram. "it's not a rebrand, it's a refresh™"
It's a metal layer respin; it's more of a refresh than a rebrand, since Grenada does show a slightly better thermal and power footprint.
Nothing to write home about, but 5-8% more than Hawaii is still pretty good since Hawaii was really performant in the first place.
They didn't show one dated 1 year ago though and according to some rumors AMD may release the first Polaris cards in ~2 months (but these won't be the high-end models everyone wants).
The MSI 290X was hitting close to 1200MHz, as was the Power Color 290X
you have to buy specific cards for better overclocking; ones that cool the VRM or have extra VRM phases are naturally going to be better at it