>despite initial reviews and information from NVIDIA, the GTX 970 actually has fewer ROPs and less L2 cache than the GTX 980. NVIDIA says this was an error in the reviewer’s guide and a misunderstanding between the engineering team and the technical PR team on how the architecture itself functioned.
Just got to that part, but I shouldn't be surprised as it's nvidia we're talking about here.
>GM204 allows NVIDIA to expand that to a 256-bit 3.5GB/0.5GB memory configuration and offers performance advantages, obviously.
It really is a feature then, huh.
>They as much admitted they lied and falsely advertised the product
>They even went out of their way to mislead reviewers with false info.
Nothing new going on, good old Nvidia at it again. Be sure to keep buying their defective and falsely advertised products since they're so great!
Can some of these GPU brands get sued for false advertising?
On the GTX 970 box, it says "4GB of GDDR5 memory". Yet the card only provides 3.5GB at full GDDR5 speed, with the remaining 0.5GB running at a fraction of that speed.
They clearly ripped off the consumer.
Until the day I actually see it affecting games by a significant amount, it will not bother me.
Find a benchmark and compare 2 cards that do and don't have this issue with the exact same configuration and find an issue for me. Please.
It may not bother you, but this will become a clear problem in the future and is already affecting high-end consumers right now.
Nvidia better have some strong damage control because this could be a class action lawsuit here.
the 970 is actually about 184 watts running at stock speeds, and some aftermarket 290s draw 210-215 watts at stock
but overall that's generally accurate. If the 980 were selling at $350, that would be the actual card Nvidia promised the 970 was.
It's perfectly fine if it doesn't bother you, but it's not about how it still "works for me" and more about how they advertised higher specs than the card actually has for 4 months (ROP count and L2 cache come to mind).
What you are referring to as the GTX 970's 4GB VRAM is in fact 3.5GB/0.5GB VRAM, or as I've begun calling it, 3.5GB + 0.5GB VRAM. The 970 is not a 4GB VRAM graphics card in itself, but rather another dysfunctional product from Nvidia, split into two separate memory partitions.
>Let's be blunt here: access to the 0.5GB of memory, on its own and in a vacuum, would occur at 1/7th of the speed of the 3.5GB pool of memory.
>There is 4GB of physical memory on the card and you can definitely access all 4GB of it when the game and operating system determine it is necessary. But 1/8th of that memory can only be accessed in a slower manner than the other 7/8ths
idiotic design, because if they just removed the 0.5GB partition the 3.5GB pool would refill itself with new textures at the proper speed
the card would literally run better if it were a 3.5GB card
>But 1/8th of that memory can only be accessed in a slower manner than the other 7/8th, even if that 1/8th is 4x faster than system memory over PCI Express. NVIDIA claims that the architecture is working exactly as intended and that with competent OS heuristics the performance difference should be negligible in real-world gaming scenarios
>v1.x: 4 GB/s (2.5 GT/s)
>v2.x: 8 GB/s (5 GT/s)
>v3.0: 15.75 GB/s (8 GT/s)
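For scale, a quick back-of-the-envelope comparison of the slow 0.5GB segment against those PCIe x16 link rates (figures taken from the quoted posts; treat them as approximate, not measured):

```python
# Slow 0.5GB segment vs PCIe x16 link bandwidth (GB/s), using the
# thread's quoted figures; approximate numbers only.
slow_segment = 28.0  # one 32-bit GDDR5 controller at 7 GT/s
pcie_x16 = {"v1.x": 4.0, "v2.x": 8.0, "v3.0": 15.75}

for gen, bw in pcie_x16.items():
    print(f"{gen}: slow segment is {slow_segment / bw:.2f}x faster")
```

So Nvidia's "4x faster than system memory" line only roughly holds against PCIe 2.0 (3.5x); against a PCIe 3.0 x16 link it's under 2x.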
How the fuck can they get away with this shit?
They already have. Thank the fanboys.
You can already see them shouting about power efficiency to drown out Nvidia's lies.
They did the same thing with GTX 660 too.
The lies are here to stay.
Bought both 660Ti and 970 at launch... Welp time to switch back to AMD
That would be a bad idea. A better "band-aid" solution would be to allocate non-gaming VRAM (Windows/Aero/browser/monitors/etc.) into the "slow" part and everything gaming-related into the 3.5GB of fast memory, instead of the current behavior of allocating everything to the fast pool until it hits 3.5GB.
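That proposed placement policy can be sketched as a toy two-pool allocator — purely illustrative, with made-up names and sizes; real VRAM placement happens inside the driver:

```python
# Toy model of the proposed policy: desktop/background surfaces prefer
# the slow 0.5GB pool, game allocations prefer the fast 3.5GB pool.
FAST_POOL_MB = 3584  # assumed fast segment
SLOW_POOL_MB = 512   # assumed slow segment

class ToyVramAllocator:
    def __init__(self):
        self.free = {"fast": FAST_POOL_MB, "slow": SLOW_POOL_MB}

    def alloc(self, size_mb, is_game):
        # Game buffers try the fast pool first; desktop buffers the slow one.
        order = ("fast", "slow") if is_game else ("slow", "fast")
        for pool in order:
            if size_mb <= self.free[pool]:
                self.free[pool] -= size_mb
                return pool
        return None  # out of VRAM; would spill to system RAM over PCIe

a = ToyVramAllocator()
print(a.alloc(200, is_game=False))   # desktop compositor -> 'slow'
print(a.alloc(3000, is_game=True))   # game textures -> 'fast'
```

Under this policy the fast pool stays reserved for the game until it genuinely overflows, instead of desktop surfaces eating into it first.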
512MB of VRAM is in a different, smaller pool. This pool is accessible at 1/7th the bandwidth of the 3.5GB pool. They can't "fix" shit because that's how the GPU is made. Games will very likely be much less affected but it is what it is.
I'm so happy for early-adopter idiots. I'll buy myself a nice used GTX 970, two months old and still under warranty, for 40% of the retail price.
I can live with a GPU with 3.6GB of VRAM for $200
No, that's what NVIDIA says 'real-world gaming performance' is. That memory benchmark is pretty much spot on, reading almost exactly 1/7th the bandwidth on the upper 512MB.
Even if NVIDIA's numbers aren't straight-out lies (they probably aren't) they still don't mention exactly how and what they measure. Is it 4-6% on average over multiple games? How does it affect frame time variance, not just FPS? What's the performance like in the worst case scenario?
I'm actually just a sad customer. But if something is too good to be true, it probably isn't.
sorry to hurt your feelings, it's a great card to use on 1080p single monitor for maybe a year, but after that the memory nerf probably starts to show, gotta crank down dem grafix
Did you idiots even read the article
>NVIDIA’s performance labs continue to work away at finding examples of this occurring and the consensus seems to be something in the 4-6% range. A GTX 970 without this memory pool division would run 4-6% faster than the GTX 970s selling today in high memory utilization scenarios. Obviously this is something we can’t accurately test though – we don’t have the ability to run a GTX 970 without a disabled L2/ROP cluster like NVIDIA can. All we can do is compare the difference in performance between a reference GTX 980 and a reference GTX 970 and measure the differences as best we can, and that is our goal for this week.
Basically, when it goes over 3.5GB the entire card takes a 5% performance hit, not accounting for the delayed frame times that make the game a stuttery mess.
At this point it's clear a capped 3.5gb 970 will outperform a normal 970 at high resolution gaming.
>At this point it's clear a capped 3.5gb 970 will outperform a normal 970 at high resolution gaming.
Possibly, but not necessarily. The shit 512MB is still faster than hitting system RAM; a capped card would only be faster if the shit 512MB is being used while the good 3.5GB isn't completely full.
Nvidia doesn't think less stuttering = improved performance.
Only average fps is performance.
"This in turn is why the 224GB/sec memory bandwidth number for the GTX 970 is technically correct and yet still not entirely useful as we move past the memory controllers, as it is not possible to actually get that much bandwidth at once on the read side.
>GTX 970 can read the 3.5GB segment at 196GB/sec (7GHz * 7 ports * 32-bits), or it can read the 512MB segment at 28GB/sec, but not both at once
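The quoted figures are straightforward to reproduce: each 32-bit GDDR5 controller at 7 GT/s moves 28 GB/s, and the fast segment stripes over seven of them. A quick check:

```python
# GTX 970 read-bandwidth arithmetic from the quoted figures.
gt_per_s = 7.0            # GDDR5 effective data rate, 7 GT/s ("7GHz")
bits = 32                 # width of one memory controller

per_controller = gt_per_s * bits / 8   # GB/s per controller
fast = 7 * per_controller              # 3.5GB segment: 7 striped controllers
slow = 1 * per_controller              # 0.5GB segment: 1 controller

print(per_controller, fast, slow)      # 28.0 196.0 28.0
print(fast + slow)                     # 224.0, the advertised total
```

Which is why 224 GB/s is "technically correct" only as a sum: the two segments can't be read at once.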
>it's only a benchmark
>i'm not a sucker
I can't even tell if these posts are ironic or not anymore.
So the long and short of this is that the first 3.5 GiB of logical memory space are striped over seven GDDR5 controllers and the last 0.5 is not striped at all, so that the L2 cache with the disabled twin doesn't (usually) get fucked?
Yeah, this seems like fucking horrible design. If the last 0.5 GiB of RAM can't be meaningfully used AND Nvidia has gone out of their way to prevent it from being used, they should have just not populated the GDDR on the PCB location mapping to the disabled L2's controller.
Realistically it probably is 'fine' for most people.
That being said I know I won't be buying 970s to replace my old 7970s, I only hope something else comes out before Witcher 3.
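The layout described a few posts up — first 3.5GiB striped over seven controllers, last 0.5GiB on the lone eighth — can be modeled as a toy address-to-controller map (the stripe size here is made up purely for illustration):

```python
GIB = 1024 ** 3
STRIDE = 256                    # bytes per stripe; illustrative value only
FAST_LIMIT = int(3.5 * GIB)     # boundary between the two segments

def controller_for(addr):
    """Toy mapping: fast region rotates across controllers 0-6,
    slow region sits entirely on controller 7."""
    if addr < FAST_LIMIT:
        return (addr // STRIDE) % 7
    return 7

# Consecutive stripes in the fast region cycle through 7 controllers,
# so sequential reads aggregate 7x one controller's bandwidth...
print([controller_for(i * STRIDE) for i in range(8)])
# ...while anything past 3.5GiB is stuck on a single controller.
print(controller_for(FAST_LIMIT + 4096))
```

That is exactly why the slow segment tops out at 1/7th of the fast segment's bandwidth: there is no striping parallelism left to exploit.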
>With that said, our 4K test did pick up a potential discrepancy in Shadows of Mordor. While the frame rates were equivalently positioned at both 4K and 1080p, the frame times weren’t. The graph below shows the 1% frame times for Shadows of Mordor, meaning the worst 1% times (in milliseconds).
>The 1% frame times in Shadows of Mordor are significantly worse on the GTX 970 than the GTX 980. This implies that yes, there are some scenarios in which stuttering can negatively impact frame rate and that the complaints of some users may not be without merit. However, the strength of this argument is partly attenuated by the frame rate itself — at an average of 33 FPS, the game doesn’t play particularly smoothly or well even on the GTX 980.
> marketing: 4GiB, 256b wide, 224 GB/s, (2048 kiB L2?)
> 99.?% of the time: 3.5GiB, 224b wide, 196 GB/s, 1792 kiB L2
> remaining 0.?%: +0.5GiB, 32b wide, 28 GB/s, 256 kiB L2
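A quick way to sanity-check that speculated split: every "fast" figure is just 7/8 of the marketed number, consistent with 7 of 8 identical memory sub-units being fully usable (a sketch over the thread's figures, not official specs):

```python
# Speculated 970 split: fast + slow = marketed, with fast = 7/8 of total.
marketed = {"VRAM (MiB)": 4096, "bus width (bits)": 256,
            "bandwidth (GB/s)": 224, "L2 (KiB)": 2048}

for name, total in marketed.items():
    fast, slow = total * 7 // 8, total // 8
    print(f"{name}: {fast} + {slow} = {total}")
```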
>guy posts how much he has loved Nvidia over the years for giving him proper VGA's
>guy asks if he should buy AMD and hang himself, or buy GTX 970 and still hang himself
>asking Nvidia to fix this issue, or else "https://www.youtube.com/watch?v=IVpOyKCNZYw"
>Nvidia's response is nuking his comment
The difference was that reviewers knew about this and that most of them focused on it (among other things) when reviewing. It was not misleading; it was something you could easily find out beforehand. The whole 970 situation is just plain bs. I really hope they get fucked in a few court cases in the EU, though I doubt it would fly in the U.S.
>hurr durr last 0.5gb runs at 1/7 of the speed
You'd be saying it has 6GB of VRAM if they duct-taped a 2GB DDR2 stick onto it.
Well, Nvidia will at least bring 4GB of memory to the GTX 960. Now the 960 will be a good buy, right guys?
>running a 960 for high resolutions
>still 128 bit memory bus
This is almost a worse scam.
>cheaper to lie and hope we don't find out
Well, my last two cards, including the 970, have been Nvidia. The next will likely be AMD.
If only they could figure out how to do tessellation without grinding to a halt.
> PCPer discussion
> we are here today to discuss the GeForce GTX 970 memory issue, which is an incredibly complex topic
>if you wanted 4gb all fast then you would obviously opt for the more expensive 980
Except when I bought a 970 this was not clearly the case. It was 1 month ago too so no "hurrr early adopt" bullshit.
ITT: Vidyagamefags complain about not being able to run BF4 maxed out twice at the same time on different 4k screens.
Fuck off fagets go back to /v/ and stop complaining already, it's already cheap as fuck.
>If only they could figure out how to do tessellation without grinding to a halt.
That hasn't been an issue since HD5000, and let's be honest, initial Fermi was a much shittier product overall than a card that merely does tessellation poorly.
Yeah, these lies are acceptable because it's only a $360 card.
>the problem affects people pushing their cards to the limit, like 4k res.
>4k is useless and overpriced for at least 2 years
>you will have a new card by that time, or going to
I acknowledge this is shit and nvidia are assholes for doing this, but do we have to discuss this the whole week?
>i own a 970 myself occasionally playing linux steam games and d3 on wine
Dude, the 660 Ti I have collecting dust beats the 290x at tessellation.
Yeah, a 128bit piece of shit beats AMDs 512bit monster.
"Good enough" for game levels doesn't deny the fact it's garbage once the levels get pumped up.
Like we need this shit.
Is it still true that the revised Gigabyte version doesn't get this shit, or was that a lie too?
They weren't lies with the 660. The difference is this:
"Hey guys, this is a gimped card for X amount of money, this is how we designed the memory you might want to look at this and see if it still fits your needs" -660
"Hey guys, this is a great GPU for X amount of money." -970
They're not lying if they tell you that it is a gimped card, that's totally fine. The problem is that nobody knew that the 970 was a gimped card, not even reviewers.
I thought it was 192-bit, or at least my MSI 660 Ti PE is 192-bit.
Regardless, it does well for me. I was hoping the 960 would be a major upgrade, but it isn't, so I'm gonna hold out for another GPU generation.
AFAIK yes. People just freak out because the nvidia "benchmark" shows both cards dropping 40-ish % in framerate, that's just because they needed to increase VRAM usage and cranked up the settings.
tl;dr: people are dumb, Nvidia are assholes, the 970 is a fucking bs GPU and the 980 is probably fine.
an overpriced "top tier" GPU. If you want to argue the case of "lol, should have bought a 980 if you wanted 4k" then fuck off. Nobody knew of this issue, people bought 970s because they wanted to add a second one for SLI later or they initially bought 2 970s, the whole situation is a mess.
>shills for both sides are gonna feast on this turd for a couple of months at least
May very well be the worst part. Every thread with a passing mention about graphics is gonna devolve into ENJOYING THAT 3.2? IMPLYING AMD NO DRIVERS LEL
>This is the reason for the bug but actually it's not a bug, it's a feature and perfectly fine, and to combat the claims of lag spikes here are some fps averages. Now shut up and buy our cards.
>He's really working for those shekels.
you're missing the point. Not that all consumer fanboys aren't idiots to varying degrees, but Nvidia somehow cultivates the most slavish of them all.
> he literally does it for free
>Probably half assed refurbs though.
More likely several generations old card with some random fan and a bios tweak to make it look like a 970.
4850 > 5850 > 660ti > 970 > whatever is useful in 1-2 years.
Being a fanboy is retarded, you buy what is best performance for the buck you have available at the time.
>expect 980 performance
>expect full 4gb vram
No, you expect adequate performance no matter if the VRAM sits at 3.2GB or 3.7GB. Nobody expects the card to perform as well as a 980; people just want to use more than 3.5GB (with two 970s in SLI, playing games at 4k, for example) without the GPU crapping itself. The last 0.5GB is basically useless, people got tricked into buying this PoS, and this is not okay.
link from article:
You'll be happier not knowing.
So did they manage to cheap out somehow, or is this actually an "oops our engineers are retarded" situation? I was on the verge of ordering a GTX970 for a build I'm planning. I'm glad this news came out ahead of time.
So what do I buy now? Keep in mind it's mITX and heat may be an issue.
No, that specific benchmark is broken; the 290x does 2x the tessellation performance of the 7970 in hardware, and that holds both in games and in the Microsoft DX tessellation benchmark.
They cheaped out and the marketing & legal departments decided it was still safe to sell as a full 4gb card.
LEL AMDRONES, EVEN CRIPPLED, IT'S STILL BETTER THAN YOU SUB BINNED TRASH
This architecture was premeditated scamware from the get-go.
Nvidia went out of their way to design L2/ROP/MC blocks that in theory can access the last 32-bit RAM module, but designed the entire driver stack so that it won't touch it unless it absolutely has to.
Maxwell is the first generation from Nvidia to allow this functionality, so it's not some magic oversight.
>implying easing gradually into the 3.5gb bug makes it magically work as a full 4gb card
if you have no need for your shekels I guess.
If you must go with the merchants, I suggest waiting until the R9 300 series comes out in a few months and encourages them to drop prices.
>lied about # ROPs
>lied about memory access
>highest gpu sales in gaming history
I also didn't mention the use of 3D software.
Those people are the ones who are really fucked, and I feel sorry for them.
I just get mad at all those shills who are treating it as an apocalypse. I don't want to defend Nvidia here, I just want to settle this down a bit, because most people will have no problems until next-gen cards.
Trying to make sense of this "benchmark" (numbers are from nvidia, so, whatever)
This shows the 980 and 970 dropping in performance almost identically when going beyond 3.5GB VRAM usage (the 970 some 1-3% worse than the 980, so... fucking nothing)
It can't be just that 1-3%, nobody would have given a fuck or even noticed, right?
So are the "> 3.5GB" settings actually trying to use like 4.1 GB, to make the 980 show the same degradation when it runs out of memory?
Or is the 980 gimped too (which doesn't make any sense if the 970's problem is supposed to be caused by the crappier memory bus/whatever)?
But will that make my room hot? I keep cards 3-5 years easily closer to 5. An extra hundred is worth it if I don't have to open a window in the winter.
Can you lock it at 3.5 somehow so I dont get stuttering or lag spikes if some shitty unoptimized game tries to go over?
Anyone heard if the 980ti will be a thing?
>crippled online only DRM games.
Not really, that shit is advertised.
It's bullshit with lipstick on to make it not look like an issue. It is an issue, but not an SKY IS FALLING issue.
Nvidia have just removed the 100+ page thread about this issue on their official forums
You can still access it via direct link
but it doesn't show up on the forum browser.
>people trust nvidia after housefires and woodscrews
>kids comparing the 290x anything to the monster that was the 480
The drivers sort of soft-lock it now; that's why you have to jump through hoops to force games to use more.
It's been removed and put back and removed again now it seems.
Those are not the benchmark numbers. Those are Nvidia PR department numbers, you can safely ignore them.
These are the benchmark numbers. They have been confirmed by Nvidia engineers to be legit, and that last 0.5GB vram is exactly 1/7 of the speed of the fast section.
They focus on average fps in the PR piece because it hides the stuttering.
>So are the "> 3.5GB" settings actually trying to use like 4.1 GB, to make the 980 show the same degradation when it runs out of memory?
>Or is the 980 gimped too (which doesn't make any sense if the 970's problem is supposed to be caused by the crappier memory bus/whatever)?
980 is in theory free of the 970's bullshit, but it's still debatable how much real games are bandwidth limited across all 3.5-4 GiB workloads or whether the benchmarks are bullshit.
You could in theory still have shader-limited tests that still touch ~4 GiB memory, or you could have the driver dancing around keep frequently used textures or whatever below the 3.5 GiB death line.
The bigger issue is that the ROP count, L2 size, and memory bus width have all effectively been lies for the 970.
I need a list guise:
>bricked GPUs due to driver updates TWICE
>Falsely advertised GTX 970 memory
>8400/8600 packaging defects
Is there more? Didn't include the Fermi housefires yet because I don't know if they downplayed it or lied about it.
Run something heavy like Unity or Watch Dogs. You really need to punish the card with settings, because it won't even give you over 3.5GB until you have punched it in the face several times. Nvidia did the best they could to hide this vram turd in there.
On a further note, what would be a better card in the same price range, the 290x? I have the MSI Gaming 970, and I got it for $410 before tax (Canadian).
Mesa is pretty damn good however.
Not as fast as Nvidia's proprietary driver by any means, except during regular desktop usage, but still incredibly stable.
Compared to fglrx, Mesa actually works
Such are the dangers of dealing with jews.
They've bribed most of the devs already.
>On a generic note, I've been using and comparing games with both a 970 and 980 today, and quite honestly I can not really reproduce stutters or weird issues other than the normal stuff once you run out of graphics memory. Once you run out of the ~3.5 GB memory, or of the ~4GB on a GTX 980, slowdowns or weird behavior can occur, but that goes for any graphics card that runs out of video memory. I've seen 4GB graphics usage with COD, 3.6 GB with Shadows of Mordor with widely varying settings, and simply can not reproduce significant enough anomalies.
so wtf? Is this one of those problems that exists on paper and only becomes a real-world problem in very specific cases that 99.9% of users will never face?
And then you forgot
>I have to state this though, the primary 3.5 GB partition on the GTX 970 with a 500MB slow secondary partition is a big miss from Nvidia, but mostly for not honestly communicating this. The problem I find to be more of a marketing miss with a lot of aftermath due to not mentioning it.
>Would Nvidia have disclosed the information alongside the launch, then you guys would/could have made a more informed decision. For most of you the primary 3.5 GB graphics memory will be more than plenty in 1920x1080 (Full HD) up-to 2560x1440 (WHQD).
Nope. It is a money-grabbing opportunity for any tech blogger with some credibility still remaining. Basically they won the January extra-paycheck lottery: report on the issue truthfully, but always end with "it's a non-issue" to get something extra for the month of January.
>nvidiots actually think shipping 1/8th less memory than advertised is ok
I understand that most people don't have any VRAM problems at 1080p, since games won't break 3GB at present. But are you not concerned about the future? A couple of years ago 2GB of VRAM was unheard of; now 2GB+ is practically the norm. What about 1-2 years from now?
The biggest issue is Nvidia lied through their teeth and is trying to fix it by claiming marketing did a boo boo.
I have one, and like I said earlier in the thread, it's likely the last money they get from me.
>What about 1-2 years from now?
1-2 years from now we'll have new cards.
mfw i get 500mb of slow buffer ram on my 970 that amdtards cant have
thank you based nvideo
yeah, I agree that it was a dick move from nvidia to advertise somewhat wrong specs.
PC Perspective (I think) made a video talking about it, and they made a very good point: Nvidia could've just come out and said the 970 is a 3.5GB card with an extra 0.5GB for when you really need it, marketed it as a very smart idea, and this whole bullshit about lying wouldn't even exist now.
Until somebody proves this causes stuttering etc. in realistic game configurations, I'm just going to assume this is more about moral outrage than a "real" product defect.
Nobody buys a GPU based on L2 size/ROP count/etc. numbers, just benchmarks. (many idiots probably do overvalue RAM size though)
Just kikes kiking with false but largely irrelevant marketing details,
By the time 3.5 vs 4 GiB matters for games, I'm going to be playing on a 8 or 16 GiB HBM-based card anyway on my elder god tier 120 Hz UHD or 5k display anyway.
>Nobody buys a GPU based on L2 size/ROP count/etc. numbers, just benchmarks.
However, people do buy their cards with an expectation of it working as advertised for years, meaning we look at the L2 size/ROP count/etc. to estimate the longevity of our purchase.
>That feel when I'm still happily using my Radeon HD7950 and will likely continue doing so until the 4xx series is out, depending on how my card fares
Feels good not getting shafted by false marketing.
>mfw reading about this having my shopping cart filled with a new build with 2 970's
fucking lucky day, should get two 290s instead or wat?
nvidia just changed their public specification of the 970 card
>56 instead of 64 ROPs
>1792 KB instead of 2048 KB L2 cache
>224+32 bit instead of 256 bit bus
>3584 + 512 MB instead of 4096 MB VRAM
>196 + 28 GB/s instead of 224 GB/s
>196 + 28 = 224 GB/s
Nvidia is also literally hiding the thread on their forum so that no one new can see it.
Full damage control?
>we look at the L2 size/ROP count/etc. to estimate the longevity of our purchase
this is about the closest 3.5-Gate comes to really mattering.
You're really getting a 12.5% haircut off of advertised bandwidth, usable memory size, and theoretically fill rates too I guess.
In reality, screen (or at least rendering target) pixel count dwarfs everything else, hence 900p24 fun times in eighth-gen-console land.
After more than a decade of display stagnation, we're finally seeing cheap as fuck UHD screens becoming a reality, which means at least a 4x jump in required rendering power, which will make cards in the 970 tier completely irrelevant in a year or two from now.
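The "at least 4x jump" above is plain pixel arithmetic:

```python
# Pixel counts: UHD really is a 4x step up from 1080p.
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440),
               "UHD": (3840, 2160)}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

for name, count in pixels.items():
    print(f"{name}: {count:,} px ({count / pixels['1080p']:.2f}x 1080p)")
```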
The best part is all that damage control will hurt them more than if they'd just go "Yeah we fucked up, here's some money back/refund options."
The whole blame marketing part just reeks of Fight Club recall scene.
>A times B times C equals X. If X is less than the cost of a recall, we don't do one.
>Lying about specs
Is there precedent for this or is this a new low?
We've learned that the GeForce GTX 970 is more than an SMM-reduced version of the GTX 980, a fact that will come into play once that 0.5GB is routinely accessed. We've learned that Nvidia chose not to disclose the GTX 970's differences until Internet rumours surfaced about a potential memory-bandwidth issue, and we have learned that Nvidia has known all along that the information passed along to reviewers - ROP counts, L2 cache, etc. - was wrong... and did nothing about it until speculation grew too rife.
>nvidia just changed their public specification of the 970 card
only in some places
>Nvidia also is literally hiding the thread in the nvidia forum that no-one new can see it.
Let's hope they found a different team to look into it than the ones tasked with looking into this issue a couple of weeks ago.
>Looks like some kind of forum bug
These guys are good
Pretty much. Without major governments keeping them in line they do whatever they want. Game developers are getting paid to optimize software for one vendor over the other, either in dollars or in man-hours by being given the software. Journalists have been mostly shit at anything involving video games forever.
AMD isn't corrupt, just incompetent and nearly bankrupt.
From the perspective of a publicly owned corporation, Nvidia is being more intelligent with their shekels though
> TWIMTBP funding keeps them in good graces with developers who don't bother optimizing AMD performance nearly as much
> gaming "journalists" even more trivially bought off
>Panic and delete thread by accident
>Blame it on forum bug
Yeah, covering up your mistake and admitting you made a mistake are two different things
Just like..... this vram ordeal, gasp.
it's been scientifically proven that more equal distributions of wealth end up with people buying stupid shit to impress their peers and potential sexual partners.
if you gave every nigger in the ghetto $5k, there would be zero new factories or other businesses started and a 10,000% increase in spending on grills, spinning rims, and skankier clothing.
wealth only tends to be productive when it's in the hands of people whose darwinian chances aren't more easily improved through purchases of bare essentials or conspicuous consumption.
>>Panic and delete thread by accident
Except that is not what happened. The thread was not deleted. In fact, you can still access it and I still have it open in a tab. Would you like the link if you do not believe me?
>wealth only tends to be productive when it's in the hands of people whose darwinian chances aren't more easily improved through purchases of bare essentials or conspicuous consumption.
The rich don't spend money right now the way the middle class would.
This is a big problem in the US currently; the recovery from the economic downturn was slow because of it.
You fail at a simple point.
If you are poor and get $5k you can FINALLY upgrade your quality of life, whereas wealthy people have already bought the things that make them feel better, so they can invest it to simply make more money.
Now gtfo our socialist /g/, capitalist pig.
>tfw it took a solid 10 seconds to work that out, meanwhile the nvidia shills can't understand why it's worse than they calculated
They hid it to see the initial reaction, it was all carefully calculated. Are you really naive to believe it was just a forum bug when it's the only thread that got that "bug" and many other posts were deleted as well.
>They hid it to see the initial reaction
That's a stupid concept. It is obvious how people are going to react once there is a lot of discussion about Nvidia hiding a difference in the card's specs. Conceiving that Nvidia could be malicious, or stupid, is one thing; claiming that Nvidia is malicious _and_ stupid is ludicrous.
it's about the ability to handle delayed gratification.
I mean, who'd rather have $500 every year until they die instead of $5k of new shit right now?
> but seriously, the claim that too much money is wasted on fruitlessly trying to attract pussy rings too true
>It only took a solid 10 seconds for some doofus who doesn't even know how memory works to come to a bullshit conclusion
The less you know about a topic, the more likely you are to overstate your understanding of it.
>I think he meant, they hid to see if anyone realized it was missing from the forum pagelist
I am aware of exactly what he meant. I questioned what purpose such an action would serve, to point out how idiotic that reading is. There is no rational reason for Nvidia to "see if anyone would realize" at this stage.
Your argument is so irrelevant, i'm not going to spent time discussing with you. You are too oriented on the marked (and how to expose it) to see the benefits for the entire humanity. One day it's the time it will be okay to kill your kind on the streets and i'll enjoy it with a passion.
probably never. all the current likely candidates for revolution (SJW faggots and ethnic minority groups) are retardedly doing their best to ban all private gun ownership rather than re-legalize machine guns and other useful toys.
Nah, it's always been the working class that does the job, not the fucking privileged assholes.
Plus fat people can't stop you from doing or owning anything; they would have to leave the house first.
How is the conclusion bullshit?
The percentage drop-offs they stated were perfectly correct, as averages.
All I was saying is that they were too fucking stupid to calculate the worst case drop-off, which is going to happen with many applications.
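To illustrate the average-vs-worst-case point, here is a hedged toy model (bandwidth figures from the thread's quoted numbers; the traffic split is an assumption, not a measurement). Effective read bandwidth is the time-weighted — harmonic — mean of the two segments' speeds:

```python
# Effective read bandwidth when some fraction of read traffic hits the
# slow 0.5GB segment. Toy model, not a measurement.
FAST_GBPS = 196.0
SLOW_GBPS = 28.0

def effective_bw(slow_fraction):
    # Time per byte adds linearly, so bandwidths combine harmonically.
    return 1.0 / (slow_fraction / SLOW_GBPS +
                  (1.0 - slow_fraction) / FAST_GBPS)

for frac in (0.0, 0.05, 0.125, 1.0):
    print(f"{frac:5.3f} of reads slow -> {effective_bw(frac):6.1f} GB/s")
```

With traffic spread in proportion to capacity (1/8 of reads to the slow pool), effective bandwidth halves to 112 GB/s — one plausible reason the worst case can be far uglier than a 4-6% average fps drop.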