

Thread replies: 169
Thread images: 14

File: TitanX.jpg (68KB, 2000x1125px)
http://www.gputechconf.com/

IT'S HAPPENING
>>
The titan line-up, the ONE thing that definitely needs to die, has a new offspring. And it's a new bastard of "hurrdurr workstation"-price for a glorified gaming card. I weep for humanity and for the people who "praise" this bullshit behaviour.
>>
>>46848672
amd shill pls go
...oh, and don't forget to pay your electricity bills.
>>
http://blogs.nvidia.com/blog/2015/03/04/smaug/
>>
>>46848693
>hurrdurr shill
As someone who owns and loves his 780ti, fuck you. The titan line-up is fucking disgusting and terrible for everyone who "just" wants a high-end gaming GPU. It'll be advertised as "the ultimate gaming graphics card!!!!" And as soon as people question the 1000 USD pricetag they will suddenly call it a developer card because muh double precision and muh framebuffer! It's a glorified gaming card that they sell at a HUGE markup because people are willing to pay for it. And the worst part is that it'll hinder ACTUAL gaming GPUs. I would have loved to buy a 6GB 780ti, but wait, they never released that because it'd be the exact same card as the 1000 dollar titan black just without the double precision! Can't have that!
>>
>>46848635
Is it better than the 295x2 yet? the 295x2 is down to 600 dollars last I checked.
>>
>>46848761
Impossible, the R9 290x trades blows with a 980 at 4k, 2 of them definitely crush it and GM200 will not magically have double the performance of a 980.
>>
>>46848783
They didn't release a price did they? If I remember right, back then the titan Z was released at 3k when it was still crushed by the 295x2
>>
>>46848635
Lel, then their next generation of midrange cards comes out a year later and stomps the titan, so you end up wasting 500 USD on a glorified card. It's like buying Beats, half of what you're paying for is the name
>>
>>46848635
wow, what will probably be a $1500+ gpu that will be beaten by AMD's inevitable $1000 gpu 2 or 3 months later that blows it out of the water.
>but muh extra $20 a year on electricity with AMD
>>
>>46848875
don't forget the part where the prices of both cards then plummet $500 each a month later
>>
Timing of this announcement is interesting. The 390x must not be all it's cracked up to be.
>>
will the double precision floating point performance over a gaming card be of any advantage to things like making 3D models and animations, and rendering video?
>>
>>46848783
980 overclocked is ~25% faster than 290x overclocked
>>
>>46848954
No, double precision is irrelevant in those tasks. For scientific computation it might work somewhat, but it will NOT work as a developer card because it lacks the proper workstation drivers and the certifications that come with them.
>>
>>46848875

This card won't retail for more than $1,100, and AMD's next $800 - $1,000 card won't hit retail until Q4.

>>46848900

>things that never happened
>>
>>46848988
>This card won't retail for more than $1,100
How will this only be $1100 if the Titan Z was $3000?
>>
>>46849011

The Titan Z was a dual GPU card. This Titan (along with the Titan Black and first generation Titan) is a single GPU card. Both the Titan Black and first gen Titan hit retail for $1,100.
>>
>>46848978
"Overclocked" is a useless figure, you might have a cherry picked sample of one GPU and a crappy sample of the other. Stock clocks or a slight OC (like the factory OC that many GPUs come with) are acceptable, but basing benchmarks on the silicon lottery is bullshit. And at stock (or with the factory OCs) an R9 290x and a 980 trade blows at 4k. Maybe the 980 wins by a couple percent on a larger number of games, but in the end it's irrelevant. GM200 will not offer double the performance of a 980, and an R9 295x2 will most definitely not be outperformed by a Titan X at 4k in games that scale properly with crossfire.
>>
File: 1423831508033.jpg (123KB, 400x300px)
>>46848760
You seem highly upset. Big chips are very expensive to produce as a general rule and ship few units. They are always expensive. These big chips happen to be of interest to people with more money than sense that want to play video games as well. You are in the exact same market with your GTX780TI, please don't delude yourself because you can't afford the next price bracket. Previous Titan cards have shown there is quite a market for compute cards as regular graphics cards.
This is not /v/.
>>
>8 billion transistors

Even if this is super efficient it's going to be a fire hazard.
>>
>>46848948
fuck you talking about? if anything it shows NVidia is trying to one-up AMD by releasing this monstrosity before the 3xx series is released. NV knows they fucked up with the 970 (and most likely the Shield box also), so the TX is their ace in the hole.
>>
>>46849031
Oh, that makes sense. I am an idiot. Thanks anon.
>>
File: titanxdiscuss.jpg (775KB, 1920x1080px)
Nvidia Future-proofing engineer approves.
>>
>>46849039
all 980s overclock ROUGHLY the same. something like 1500-1600 MHz boost clock with just air cooling.
>>
>>46849054

Yeah, I see where you're coming from. I just don't get why they would announce this now if it only ends up being marginally better in performance than a $600 card.
>>
>>46849054
A guaranteed to be obscenely overpriced, highly limited production GPU is their ace?
>>
File: titanetu.jpg (4MB, 4500x2835px)
>>
>>46849041
As if. The majority of Titans were used as simple gaming cards, just like their marketing suggests. People bought it because it was the best gaming GPU available, not because of double precision or compute performance. And yes, I'm pissed that they try to upsell me a "workstation" card (which ironically doesn't actually function as one) for gaming with that silly price-tag on top of it.
>this is not /v/
You're literally discussing a product that is designed to play games, deal with the fact that many people on /g/ play games.
>>
>>46849098
One would hope that it has massive gpgpu performance, not just hurr durr muh uber gaymen performance elitism.
>>
The purpose of titans is identifying people with more money than sense

If I had that much money I'd be spending on hookers & blow not a gpu
>>
>>46849081
Again you're playing the GPU die lottery; if a GPU's die doesn't fall in an acceptable range of tolerances it's binned for laptop use by shaving off cores and ROPs, or just thrown away.

Same applies to Intel's and AMD's processors. I had an i5-2500k that OC'd to 5.0GHz on a Hyper 212 fully stable, but other people may not be so lucky; others may be luckier and able to go further while staying stable.
>>
>>46849148

Yeah, but I doubt they're going to undercut the K6000 at a quarter of the price.
>>
>>46849137
The chip the Titan used was also used in other products. There are quite a few workloads you would normally want a Tesla/Quadro branded variant for that you can get away with running on a Titan branded variant.
>>
File: 1424653429528.jpg (40KB, 400x500px)
>i-it's for m-my workstation!
>>
>>46849081
980 at 1500 isn't faster than a 290x at 1200.
>>
>>46848986
>No, double precision is irrelevant in those tasks.
Wrong. Example: shading artifacts are often caused by floating point imprecision.
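For anyone curious what "floating point imprecision" looks like concretely: single precision (FP32) has a 24-bit significand, so values that are distinct in double precision can collapse to the same number. A minimal sketch using Python's stdlib to round-trip through IEEE 754 single precision (illustrative only, nothing GPU-specific):

```python
import struct

def to_f32(x: float) -> float:
    """Round-trip a Python float through IEEE 754 single precision."""
    return struct.unpack('f', struct.pack('f', x))[0]

# FP32 has a 24-bit significand: 2**24 and 2**24 + 1 collapse together.
assert to_f32(2**24) == to_f32(2**24 + 1) == 16777216.0

# Python's native float is double precision; it keeps them distinct.
assert float(2**24) != float(2**24 + 1)
```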
>>
>>46849183
>If I had that much money I'd be spending on hookers & blow not a gpu
There are people who can do both.
Deal with it.
>>
>>46849197
K6000 is end of life, it will be replaced by Quadro M6000 soon
>>
>>46849210
Did that Titan Z or whatever it's called have firmware options for those workstation GPUs anyway?
>>
>>46849310
Yes, you can toggle FP64 performance on or off
>>
>>46849266
interesting. i noticed that a screenshot of my game on my phone had artifacts compared to the same screenshot on my pc. so, price per performance aside, might there be some benefit in getting a Titan X vs a GM200 gaming card?
>>
>>46849310
Not firmware, it's a toggle in the control panel for FP64 performance

It doesn't have Quadro features enabled
>>
>>46849266
You are literally retarded.
>>
>>46849284
and drugs are bad mkay
>>
>>46849324
I remember when you could take a 680, flash the firmware and change one of the capacitors or resistors in the bottom left of the card near the PCI-e connector, and you would have the same thing as a (then current gen) K6000 or something

you could take a 560ti and do the same thing and get a K4000, except you didn't need to physically change anything, just flash the firmware
>>
>>46849325
No, games don't use FP64 so there's no benefits
>>
>>46849345
after the 500 series, Nvidia installed what is basically "physical DRM" by making one of the capacitors on the GPU a lower capacitance than the ones found on the K6000. they were otherwise exactly the same card, so if you were good with a soldering iron you could easily take a 400 dollar 680 and get a 4000 dollar K6000 or whatever they generally went for.
>>
>>46849353
but i mean in development like making 3D models with baked lighting and such.
>>
is that nvidia chink ceo head engineer spokesman on roids or something?

i mean with all the business and science, cucking and pr shit he does i doubt he has time lifting weights
>>
File: 1408122914658.gif (53KB, 129x111px)
What's the bus width?

If anyone is going to even do any sort of GPU-Accelerated Rendering with this thing (Thinking HEVC GPU encoding/h.264) then that bus better be fucking wide as a... bus.

I'm so tired of NVidia jewing so hard on their cards with shitty bus widths, but then again, my 290x is a dream come true.
>>
>>46849371
i also liek type this liek very smart me point me
>>
>>46849371
it's not impossible that he's done a few cycles in the past but i think he's natural now. he's not that muscular, it's very easy to maintain with a non-shit diet and lifting for 30-60 mins once every few days.
>>
>>46849378
[spoiler]192-bit :^)[/spoiler]
>>
>>46849378
384bit memory bus because it has 12GB of GDDR5 memory

256bit & 512bit would have 4/8/16GB

192bit & 384bit would have 3/6/12GB
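Those pairings fall out of how GDDR5 is wired up: each memory chip contributes a 32-bit channel, so bus width and total capacity both scale with the chip count. A rough sketch (the chip densities here are assumed typical values, not taken from any spec sheet):

```python
def bus_width_bits(num_chips: int) -> int:
    """Each GDDR5 chip adds a 32-bit channel to the memory bus."""
    return num_chips * 32

def capacity_gb(num_chips: int, chip_density_gb: float) -> float:
    """Total VRAM = number of chips x per-chip density (in GB)."""
    return num_chips * chip_density_gb

# 12 chips -> 384-bit bus; 0.25/0.5/1 GB chips give 3/6/12 GB.
assert bus_width_bits(12) == 384
assert [capacity_gb(12, d) for d in (0.25, 0.5, 1.0)] == [3.0, 6.0, 12.0]

# 8 chips -> 256-bit bus; 0.5/1/2 GB chips give 4/8/16 GB.
assert bus_width_bits(8) == 256
assert [capacity_gb(8, d) for d in (0.5, 1.0, 2.0)] == [4.0, 8.0, 16.0]
```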
>>
>>46849411
>560ti
>256-bit
>650ti
>literally a 560ti clone in kepler
>128-bit
>doing any sort of FXAA shit is the biggest bottleneck ever
>>
File: 1402985156852.jpg (20KB, 510x419px)
>>46849413
>mfw
srsly
>>
>>46849210
Except that you're stuck with gaymen drivers without the proper certifications. You can't just pop in a gaming GPU in a workstation or use it for scientific computation like that, many programs flat out won't work without those certificates while others might work poorly. The drivers and the software support is half of what you're paying for with a workstation card. You'd be ridiculed and thrown out if you showed up with a titan at a big-time animation studio.
>>
>>46849460
>scientific computation
>big-time animation studio
what scientific stuff requires workstation drivers? the scientists could just write their own code.
>>
>>46849409

m8 he's 60 years old and has bigger guns than young Schwarzenegger
>>
>>46849489
he's only 52 years old, he's not tall and he only has like 14 inch fatceps.
>>
>>46849081
I wish the silicon lottery wasn't a thing and that we could buy GPUs or CPUs based on a grading system. For example, you say 1500 to 1600, so 1500-1525 is grade D, 1525-1550 is C, 1550-1575 is B, 1575-1600 is A, and oddballs that OC higher could be something like A+.

Anyone else agree? I personally would always go A+.

BTW I still wouldn't go Nvidia. I was an Nvidia fanboy but I saw the light, and I plan to go 390x Crossfire 4k in my next build.
>>
>>46849460
Adobe CC begs to differ.

The only reason a big studio would pick FirePro/Quadro is because they can afford it, but trust me, on any fast render machines or workstations for content creation, they're using consumer GPU's.
>>
>>46849489
>>
File: nvidia-tattoo-09242010[1].jpg (57KB, 600x603px)
57KB, 600x603px
>>46849559
>>
>>46849241
>workstation
>fapping on facebooksluts 24/7
>muh energy consumption!!!!!
>>
>>46849545
http://siliconlottery.com/
>>
>>46849574
>>>/pol/
>>
>>46849552
We're not talking about Photoshop here, there are proprietary 3D modelling and animation programs that get used there that literally won't work without the certificates. It might be shitty DRM but that's what you're paying for in the end. A Titan is NOT a proper workstation card, it might be able to do similar tasks at adequate levels, but it lacks the support and the proper drivers to actually fill that role.
>>
>>46849552
You are retarded. The normal GeForce with no DP is the same speed in Adobe CC.
>>
>>46849487
> scientist
> engineer
> not using error-correcting ram and certified non-gaming-bullshit drivers

congratulations, your plane/car/ship/plant/rocket just exploded! =)
>>
File: 1392172746589.jpg (54KB, 412x371px) Image search: [Google]
1392172746589.jpg
54KB, 412x371px
>>46848635
>inb4 11.5 gb
>>
>>46849795
so getting a workstation card for game development will be better than getting a titan x or a GM200 gaming card because i won't have to rely on the gaming-bullshit drivers?
>>
And today we’re excited to announce an expansion of that partnership with NVIDIA providing all UE4 developers with not just binary but C++ source access to the CPU-based implementation of PhysX 3.3.3, including the clothing and destruction libraries, through Epic’s Unreal Engine repository on GitHub. This means that the entire UE4 community can now view and modify this PhysX code alongside the complete C++ source code for UE4. Modifications can be shared with NVIDIA who will review and incorporate accepted submissions into their main PhysX branch, which then flows into future versions of UE4
>>
TITAN cards are only for FP64 compute

TITAN cards don't have Quadro features enabled and don't have the ECC feature that Quadro/Tesla cards have for mission critical usage

Quadro K6000 was $4999 at launch, a TITAN/TITAN Black card was just $999
>>
>>46849487
scientific stuff typically needs 64-bit floating point precision, which is crippled in the titan.
>>
>>46849849
And because of ECC memory, which is a huge deal for many programs that actually use DP rendering.
>>
>>46849586
Is there a UK equivalent? What about having it shipped from America? Have any of you tried it before, and was it worth it?

BTW it also only seems to test CPUs, which is a shame.
>>
>>46849952
isn't FP64 performance the entire point of the titan?
>>
>>46849362
Jesus Christ they purposely cripple the cards just to fuck their customers over. How can people defend them? They're just as bad as Apple and so are the fans.
>>
>>46849992
i don't think there's a UK equivalent, except being a scumbag and buying a bunch of chips at the store and trying them all with careful resealing of the packages, or being slightly less of a scumbag and selling the less good chips as open-box on ebay. shipping should be pretty cheap and they ship to many places (even though they don't list them all on their site yet), but you might get hit with customs fees if you're unlucky. i haven't bought CPUs specifically but i've ordered a bunch of stuff, and as long as you stick with USPS you should be fine (UPS for example are stricter with customs fees).
>>
>>46850096
>these workstation cards that required billions of dollars of R&D and fabrication are easily worth thousands of dollars each for professional applications
>b-but i just want my gaymes
>ok we'll give you a gaymen card for a few hundred bucks
>this is somehow evil
>>
>>46850138
That's AMDPOORFAGS for you

Companies spend billions to develop things, but they should sell it at a loss to feed AMDPOORFAGs lack of income

Oh well, they can go and buy AMD products, after all AMD is willing to lose money to cut 290X to what, $299 or some crap like that
>>
>>46850138
>ignoring the blatant lies about "framerate" and how much VRAM you get, or the power consumption through very carefully engineered graphs and statements

S...sure.
>>
>>46850215
>ignoring the blatant lies about how "good" freesync is, or the "open source" mantle which got tossed in the trash
>>
>>46850298
>"open source" mantle
* "open" mantle
there's still no public API for it
>>
>>46850202
Do you have to be such a good goy? I buy the best product and the best GPU's are AMD. I plan to go 390x Crossfire in my next build 16GB of the best RAM possible 4790k or better with a IB-E cooler.

If there is the option I will go OLED Freesync 21:9 HDR 4k and I would be willing to spend £2k / $3k or under for that monitor.

So what build are you running poorfag?
>>
>>46850202
>>46850138
nope, nope.

nvidia was making money even before they thought of rebranding geforce to quadro and selling it at four times the price to corporations that were happy with what a great deal it was compared to SGI machines.
>>
>>46850202
>company which is hardly worth over a few hundred million
>capable of spending BILLIONS
all my wats?
you marketer shills are really bad at math. how much do you faggots get paid to defend your company against every post?
>>
>>46850416
>If there is the option I will go OLED Freesync 21:9 HDR 4k
>and I would be willing to spend £2k / $3k or under for that monitor.
waitingskeleton.jpg
>>
>>46850574
http://www.reuters.com/article/2015/03/03/tsmc-brief-idUSH9N0QX00P20150303
>>
>>46850627
ok that's T$ not USD but check out the numbers here http://en.wikipedia.org/wiki/TSMC
>>
>>46850641
also nvidia:
>Revenue US$ 4.13 billion (2014)
>Operating income US$ 496.227 million (2014)
so nvidia spends billions each year
>>
>>46850416
>16GB of the best RAM possible
don't be a retard, buy a value brand 1600MHz RAM and spend the money where it actually matters.

right now, you can get three 27'' IPS 4k monitors under $1800
>>
>12gb vram
>384bit bus

Is that even reasonable?
>>
>>46850617
Yeah, unfortunately it doesn't look like that is happening soon, so I will probably have to go quantum dot instead, although I'd heard that blacks are still shit on quantum dot.

I dunno might just wait it out I don't need to upgrade soon.

BTW, anyone know of any quantum dot monitors that have been announced? Hopefully with most of the specs I want.
>>
>>46850741
the gtx 980 with 4 GB vram and a 256-bit bus is as good as if not better than the 290x with 4 GB vram and a 512-bit bus so i'd say it is perfectly reasonable. nvidia also has 7 GHz memory while amd only has up to 5.5 GHz memory.
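Bus width and memory clock trade off against each other: peak bandwidth is just bytes per transfer times the effective data rate. A back-of-the-envelope sketch using the figures quoted above (treating the reference 290x as 5 GHz effective; both are nominal peaks):

```python
def mem_bandwidth_gbs(bus_width_bits: int, effective_clock_ghz: float) -> float:
    """Peak memory bandwidth in GB/s: bus bytes x effective transfers/sec."""
    return (bus_width_bits / 8) * effective_clock_ghz

# GTX 980: narrower bus, faster GDDR5.
assert mem_bandwidth_gbs(256, 7.0) == 224.0  # GB/s

# R9 290x: wider bus, slower GDDR5 (reference clocks).
assert mem_bandwidth_gbs(512, 5.0) == 320.0  # GB/s
```

So on raw paper bandwidth the wider-but-slower bus still comes out ahead; Maxwell's delta color compression is part of why the 980 keeps up anyway.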
>>
>>46850684
most of which is spent on marketer shills like yourself, also factories and the workers who assemble the cards.

not the actual work on making better gpus.
>>
>>46850791
not at 4k it isn't.
>>
>>46850799
so where is your russian GNU+communism freedom fighter graphics card if it's so easy and cheap?
>>
>>46850704
Money isn't really a problem I have $30k saved up and save an extra $1.3k a month.

If the DX12 update does allow the use of the CPU's GPU component like they say then the faster RAM would help. And it's not much more compared to everything else.
>>
>>46850799
Hey faggot, go to Sunnyvale and ask AMD for free handouts since they're so generous
>>
>>46850805
Yeah essentially the 980 might be on par or better than a 290x if you want to play minecraft at 1080p 9000 fps but if you actually want to use it at the kind of resolution it's designed for the 290x blows it out of the water.
>>
>>46850805
>>46850873
>In most of the games tested GeForce GTX 980 SLI matched the same gameplay experience as AMD Radeon R9 290X CrossFire. This was surprising considering single-GPU GeForce GTX 980 is able to outperform single-GPU AMD Radeon R9 290X.
>GeForce GTX 980 SLI was on par, equal with AMD Radeon R9 290X CrossFire in performance, most of the time.
http://www.hardocp.com/article/2014/10/27/nvidia_geforce_gtx_980_sli_4k_video_card_review/11
>>
>>46850948
Yeah, I do feel quite shit as well for how I treated AMD fans who were just trying to help. They would for example recommend me a 295x2 which cost £20 / $30 less than 970 Gigabyte SLI, and I gave them the whole LOL OVERHEET HOUSE FIRES POORFAG SPEECH.
>>
>>46848635
10.5GB
>>
Is 12 GB of VRAM even necessary? RAM speed that we really need to see increase for high resolutions? Isn't it the
>>
>>46852806
it probably won't be needed but this card is more for show than to be useful
>>
>>46852806
>>46852844
Whoops, I deleted a part of the second question. Isn't it RAM speed that will be more of a factor with higher resolutions?

Will 4 GB of HBM be better than 8 GB of DDR5 at 4K resolution?
>>
>>46852806
>>46852844
we're apparently already using 4 GB considering the gtx 970 fiasco, so we'll be using more than 6 GB soon. the gaming version of the titan x will of course have "only" 6 GB vram. the 12 GB is for "prosumer" usage rather than gaming.
>>
>>46852875
>Will 4 GB of HBM be better than 8 GB of DDR5 at 4K resolution?
depends on the game. as soon as you use more than 4 GB, the 4 GB card will slow down to a crawl.
>>
>>46848760
Nope, no DP. It has the same (i.e. 1:32) FP64 performance ratio as GM204 and 206, which makes the chip completely useless for the HPC market.
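To put a 1:32 ratio in numbers: peak FP32 throughput is roughly cores x 2 FLOPs per cycle (FMA) x clock, and FP64 is that divided by the ratio. A sketch with assumed figures (3072 CUDA cores at ~1 GHz are illustrative placeholders, not confirmed specs):

```python
def peak_fp32_gflops(cuda_cores: int, clock_ghz: float) -> float:
    """Peak FP32 GFLOPS: each core retires 2 FLOPs/cycle via FMA."""
    return cuda_cores * 2 * clock_ghz

def peak_fp64_gflops(fp32_gflops: float, ratio: int) -> float:
    """FP64 rate on consumer parts is FP32 divided by the crippling ratio."""
    return fp32_gflops / ratio

fp32 = peak_fp32_gflops(3072, 1.0)  # 6144.0 GFLOPS single precision
fp64 = peak_fp64_gflops(fp32, 32)   # 192.0 GFLOPS at 1:32
assert fp32 == 6144.0
assert fp64 == 192.0
```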
>>
>>46848635
So the VRAM is 11.5GB?
>>
>>46849266
Probably. You don't fix that with more FP 64 performance on your card though. Why? Because games don't fucking use it.

Also this discussion is useless here, GM 200 has no FP 64 performance worth mentioning.
>>
>>46849290
Yeah, the buyers will be happy that nvidias new awesome workstation card offers way less FP 64 performance than a 5 year old GF110 chip.
>>
>>46852884
I doubt we will use much more than 4GB of VRAM any time soon, simply due to the console limit. The textures aren't going to need more memory, only the buffers and partial renders. The ~5GB limit on consoles pretty much caps how much PC would need. This doesn't include modding, but 4k textures are pretty much standard now and there's not really much gain in going higher.
>>
>>46853224
just wait until 8K catches on. 4K mainstream adoption has been going pretty quick.
>>
>>46849046
The 295X2 has 12.4 billion transistors
>>
the real turning moment for me personally was the drivers not being updated for the 780ti, to the point where it's now losing in benchmarks to the 290X

I can understand a manufacturer not keeping driver updates a priority for something that was old and sold for cheap like a fucking 460, but not having drivers for a 780ti? literally the last generation card, and a card that people spent 500-700$ on?

that proves to me beyond any doubt that any nvidia card I purchase will have no longevity

I'm seeing a lot of threads with people who have had 760's and 770's burn out, and I'm also wondering why none of them stop to ask why those cards were destroyed after such a short lifespan. does it have to do with their bad coolers? their excessive clockspeeds?
>>
>>46852899
How likely is that? I plan to go 390x Crossfire for my 4k build.
>>
>>46849068
>>
>>46853941
You should be glad; if you did get an update it would likely brick your card.
>>
>>46849266
>Wrong. example: shading artifacts are often caused by floating point imprecision.

If your shading requires you to use double precision, you are shading wrong.
>>
>>46852899

Well the thing is nobody really knows how the HBM will respond to loads above 4GB.
>>
>>46854049
>>46854336
What about DX12. Do you think most games will be able to take advantage of the VRAM from both cards or not?
>>
>>46854336
it doesn't try to; the game engine realises there isn't enough vram and starts streaming through PCIe to system RAM. if that fails, the game will probably drop to 0 fps every time it fails to locate data in vram.
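The scale of that slowdown is easy to ballpark: PCIe 3.0 x16 peaks around 16 GB/s, an order of magnitude below GDDR5. A rough sketch (both bandwidth figures are nominal peaks, and real transfers are slower still):

```python
def transfer_ms(size_gb: float, bandwidth_gbs: float) -> float:
    """Milliseconds to move a buffer at a given bandwidth in GB/s."""
    return size_gb / bandwidth_gbs * 1000

vram_ms = transfer_ms(0.5, 224.0)  # ~2.2 ms to touch 512 MB in GDDR5
pcie_ms = transfer_ms(0.5, 16.0)   # ~31 ms for the same data over PCIe

# Spilling over the bus is 14x slower and blows a 60 fps (16.7 ms) budget.
assert pcie_ms / vram_ms == 14.0
assert pcie_ms > 16.7
```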
>>
>>46854869
If a person had the highest MHz lowest latency DDR4 RAM would it take such a huge hit?
>>
>>46854791
no. most of them won't even try.
>>
>>46853941
>Im seeing alot of threads with people who have had 760's and 770's burn out
dude those cards (well, the chips and design) are three years old at this point. games are just becoming more demanding as time moves on.

>drivers not being updated for the 780ti to the point where its now losing in benchmarks to the 290X
a 16 months old card is a bit weaker than amd's best single-gpu offering? boo- freaking hoo.
>>
>>46855032
it doesn't matter, the pci bus is what will be too slow
>>
>>46848810
The catch was heat and noise though
>>
>>46855325
and CUDA. topkek at using amd anything for GPGPU
>>
>>46855354
Isn't OpenGL bigger than cuda?
>>
>>46855304
>this much denial
Face it, 7970s and 290x have aged way better than 680s and 780tis and they were both cheaper at the time. Nvidia is shit for long term.
>>
>>46855304
You mean a 16 month old card is beaten by a 15 month old card. The 900 series, even with the latest drivers, is a 6 month old card being beaten by a 15 month old one, and the 390x when it is released will blow the 900 series away. Don't forget the 295x2 is the most powerful dual GPU card, likely still more powerful than this Titan, and there is a 395x2 planned for release.
>>
>>46855425
do you mean OpenCL? i don't think OpenCL is bigger than CUDA.
>>
File: power.jpg (735KB, 2203x2937px)
>>46848978
>>46849039
>>46849081
>>46849250
Using Data from HWBOT and official specs
BC = Baseclock, AT = Average Turbo, MT = Max Turbo, OC = Average OC

GTX 780:
BC = 0863 mhz
AT = 0900 mhz
MT = 1002 mhz
OC = 1173 mhz
BC2AT = 04%
BC2MT = 16%
AT2MT = 11%
BC2OC = 36%
AT2OC = 30%
MT2OC = 17%

GTX 780 Ti:
BC = 0875 mhz
AT = 0928 mhz
MT = 1020 mhz
OC = 1202 mhz
BC2AT = 06%
BC2MT = 17%
AT2MT = 10%
BC2OC = 37%
AT2OC = 30%
MT2OC = 18%

GTX 970:
BC = 1050 mhz
AT = 1178 mhz
MT = 1250 mhz
OC = 1434 mhz
BC2AT = 12%
BC2MT = 19%
AT2MT = 06%
BC2OC = 37%
AT2OC = 22%
MT2OC = 15%

GTX 980:
BC = 1126 mhz
AT = 1216 mhz
MT = 1266 mhz
OC = 1458 mhz
BC2AT = 08%
BC2MT = 12%
AT2MT = 04%
BC2OC = 29%
AT2OC = 20%
MT2OC = 15%

|||||||||||||||||||||||||||

R9 290:
BC = 0662 mhz
AT = 0852 mhz (not given, using 2/3 * MT-BC as Nvidia does for 9 series)
MT = 0947 mhz
OC = 1126 mhz
BC2AT = 29%
BC2MT = 43%
AT2MT = 11%
BC2OC = 70%
AT2OC = 32%
MT2OC = 19%

R9 290X:
BC = 0727 mhz
AT = 0909 mhz (not given, using 2/3 * MT-BC as Nvidia does for 9 series)
MT = 1000 mhz
OC = 1142 mhz
BC2AT = 25%
BC2MT = 38%
AT2MT = 10%
BC2OC = 57%
AT2OC = 26%
MT2OC = 14%

In summary, Hawaii overclocks just as well as Maxwell you stupid fucking shitters.
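If anyone wants to sanity-check those deltas, they're just relative clock gains rounded to whole percent. A quick sketch reproducing a few of the BC2OC figures from the quoted clocks:

```python
def pct_gain(base_mhz: int, target_mhz: int) -> int:
    """Relative clock gain, rounded to the nearest whole percent."""
    return round((target_mhz - base_mhz) / base_mhz * 100)

assert pct_gain(863, 1173) == 36    # GTX 780 BC2OC
assert pct_gain(875, 1202) == 37    # GTX 780 Ti BC2OC
assert pct_gain(1126, 1458) == 29   # GTX 980 BC2OC
assert pct_gain(662, 1126) == 70    # R9 290 BC2OC
assert pct_gain(727, 1142) == 57    # R9 290X BC2OC
```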
>>
File: titancompute.png (16KB, 450x477px)
>>46855354
>>46855425
>>46855559
CUDA had the lead

Then Nvidia ruined it by releasing a card that does worse than a card that costs 1/3rd the price, and barely beats its predecessor, the GTX 580.

This is an even ground benchmark, DX11 compute.
>>
>>46855830
K, what about actual performance when overclocked? I'm curious.
>>
>>46855830
that doesn't say the OC'd boost clock does it? and i was talking about performance you fuckwit, not about clock speeds.

>AMD Radeon R9 290X Creamed

>You don't have to look long at our results today to see that the AMD Radeon R9 290X is crying out for help. The AMD Radeon R9 290X is currently AMD's flagship single-GPU, just like the GeForce GTX 980 is NVIDIA's flagship single-GPU. Other than adding more GPUs, this is as fast as it gets in single-GPU form from both AMD and NVIDIA. Yet, it looks like the AMD Radeon R9 290X is lagging severely compared to what NVIDIA currently has on the table.

>The GeForce GTX 980 is making the AMD Radeon R9 290X look like last-generation technology, which you can argue it is since its launch nearly 1.5 years ago. We are shocked how far behind the Radeon R9 290X is falling from the newer GeForce GTX 980. It seems to keep falling further and further behind in every evaluation we write! Chock some of that up to the clocks we are seeing, but drivers are certainly part of that equation as well.

>The ASUS ROG Poseidon GTX 980 Platinum has surely laid the smack down on the AMD Radeon R9 290X. We were using a very high factory overclocked customized and expensive AMD Radeon R9 290X GPU based video card today. The Sapphire Vapor-X R9 290X Tri-X OC debuted at the same price as the ASUS ROG Poseidon GTX 980. However today the Sapphire can be had for about $400. The SAPPHIRE card let us overclock the AMD Radeon R9 290X to its highest frequency we've ever achieved consistently.

>This is the AMD Radeon R9 290X at its best, at its absolute highest performance potential on air. However the AMD Radeon R9 290X is slower in every game, not just by a little, but by a lot. A highly overclocked R9 290X cannot keep up with a factory overclocked GTX 980, or a manual overclock GTX 980. It would be even worse if this were a stock, default clocked AMD Radeon R9 290X. It is time for AMD's next generation, because Hawaii (R9 290/X) just got old.
>>
>>46855943
source: http://www.hardocp.com/article/2015/03/03/asus_rog_poseidon_gtx_980_platinum_video_card_review/12
>>
>>46850317
mantle was originally supposed to be open source, but later AMD decided to put their work into DX12, glNext and also Vulkan. All three basically utilize Mantle code. Vulkan is probably Mantle's true successor, but the others draw on it too.
>>
>>46848635
It's kind of neat I guess but I can't help but feel like this was bad timing on Nvidia's part. We are on the verge of a tech shift away from GDDR5 in favor of stacked DRAM technology like HBM which may offer 2-4x the memory bandwidth and possibly more with time and further development. This card will be rendered obsolete within a year of its inception. If the card was like $600, $750 tops it might be acceptable but leaks say about $1300 and I feel at the absolute minimum it'll be $1000 so given that it just isn't worth the money.

Whether you choose to get the 390X or wait for Pascal is up to you but regardless of what company you shill for I feel like the Titan X is a bad buy.
>>
>>46855943
>>46855955
He posted relative performance at given clock speeds.

>hardocp
One of the most biased tech "news" sites I've ever visited.
>>
>>46855870
The thing to realize is most sites just used a stock cooled 290X and called it a day
http://www.tomshardware.com/reviews/radeon-r9-290-and-290x,3728-5.html

The 290X Tri-X (marketed as OC, not OC'd by reviewers) runs at 1010mhz baseclock, it's the closest to what you'll get out of a baseclock 290X with proper cooling. You will note it's on par with a stock GTX 780 Ti.

These benchmarks are run at 1920x1080 resolution.
>>
>>46855996
meant 1010mhz max turbo
>>
>>46855955
>>46855995
>hardocp
You're not fooling me Mr.Hanson.
>>
>>46849250
But cooler. Show me your 290 under 50 degrees at load; mine is at 1.55 GHz.

Also the Titan X will be overpriced; it won't have more than 50% more performance (the 960 has half the 980's specs and performs exactly 50% worse). I'll keep my 980 and save my money for a proper VR headset until the 2016 Pascal era.
>>
>>46855995
>He posted relative performance at given clock speeds.
no he didn't. the hwbot stuff is pure autism, it's only about the clock speeds.

>One of the most biased tech "news" sites I've ever visited.
right...
>AMD Radeon R9 295X2 CrossFire, or QuadFire as it is commonly called, provides the best gaming performance we have ever experienced. Two of these AMD video cards, costing $3,000, offers up the absolute best gameplay experience. You will be able to take all your games to the highest possible settings on a single-display, or sub-4K Eyefinity configurations. Even at the very demanding 4K resolution you will be able to enjoy a high-end gaming experience with the highest graphics settings, and this is what is important.
http://www.hardocp.com/article/2014/04/29/amd_radeon_r9_295x2_crossfire_video_card_review
>>
>>46856029
>four GPUs from the absolute newest generation of GPUs provides the most performance
Remember, this was before the 9xx series. They couldn't have pulled that kind of bullshit at the time.

HardOCP is literally the only site that agrees with you. Stop shilling and get the fuck out.
>>
File: 2015-03-04 14.52.08.jpg (413KB, 530x945px)
AMD stronk, PhysX hybrid

PhysX is the only good thing Nvidia offers the gaming world, and they set us all back by buying out Ageia and whoring out GPU-dedicated physics processing to Nvidia only.

Luckily, with a little work, you can use PhysX alongside an AMD GPU like this 295X2.
>>
>>46856112
He'll probably bring out Anandtech actually

They use stock cooled 290X's
>>
>>46856112
>HardOCP is literally the only site that agrees with you
maybe because sites for casuals like you only compare stock-clocked cards. you have literally nothing besides your butthurt to back up your claim that they're biased. they run the tests the way they say they do, and they report the results as they get them.
>>
>>46856208
Nah, I've also read a lot of their other reviews; any time they compare AMD and Nvidia, Nvidia wins, and their PSU reviews are shit.
>>
>>46856119
>physx is the only good thing nVidia offers

How many games support physx again? like 20?
>>
>>46856208
Yes, while artificially hotboxing cards by using a fanless case.

And running tests on a liquid-cooled card vs. an air-cooled card.
>>
>>46856251
>any time they compare AMD and Nvidia Nvidia wins
no shit
>>
>>46856278
...which doesn't agree with other sites. The R9 series handily beat the 7xx series (before the 780 Ti, which is really just an OC'ed 780 and could be beaten by an OC'ed 290X) on nearly every site, yet HardOCP always showed Nvidia winning. The bias is obvious to anyone who isn't a fanboy.
>>
>>46855993
Can someone explain, to someone who's new to all this, what the GB number and the memory bandwidth each do, and what having more or less of one or the other means?
>>
Oh great, another $1000+ card, so all the fanboys can orgasm and all the reviewers can get bought out. And AMD will go out of business. I don't know if I even care anymore. Enjoy your 5% increase a year on a $1000 graphics card once there's a monopoly. Look at what Intel has done: they've increased their desktop chips by like 20% total over two generations.
>>
>>46856321
GPU memory is used to hold texture, model, and frame information. As geometry complexity, texture size, and resolution increase, more memory is required.

However, 2 GB of memory will do you no good if your card can't physically move information into it fast enough.

As an example of how crucial memory speed is, the GTX 970 debacle, with its 3.5+0.5 GB of memory, was caused by that last 0.5 GB being accessed at 1/7th the speed of the other 3.5 GB. When that last 0.5 GB needs to be accessed, your entire application starts lagging terribly. This can also sometimes result in visual glitches.

https://www.youtube.com/watch?v=ZQE6p5r1tYE
https://www.youtube.com/watch?v=xMA9xKn0DaE
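To put rough numbers on that 1/7th-speed segment, here's a back-of-the-envelope sketch. The ~196 GB/s figure for the fast segment is an assumption for illustration; only the 3.5+0.5 GB split and the 1/7th ratio come from the explanation above:

```python
# Rough model of the GTX 970's split memory (assumed numbers):
# ~196 GB/s for the fast 3.5 GB segment, 1/7th of that for the last
# 0.5 GB. Effective bandwidth = total bytes / total transfer time.
FAST_BW = 196.0          # GB/s, assumed
SLOW_BW = FAST_BW / 7.0  # the crippled 0.5 GB segment

def effective_bandwidth(used_gb):
    """Blended bandwidth if `used_gb` of VRAM is streamed through once."""
    fast = min(used_gb, 3.5)
    slow = max(0.0, used_gb - 3.5)
    return used_gb / (fast / FAST_BW + slow / SLOW_BW)

for gb in (3.0, 3.5, 4.0):
    print(f"{gb:.1f} GB touched -> ~{effective_bandwidth(gb):.0f} GB/s effective")
```

The point of the harmonic-mean math: touching just that last 0.5 GB drags the blended figure from ~196 GB/s down to ~112 GB/s, which is why the whole application stutters rather than only the last half-gigabyte being slow.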
>>
>>46856371
even in a monopoly they would have to compete with themselves to get people to upgrade their existing graphics cards, and ARM SoCs would catch up if they slacked off. intel has still made significant gains in performance, power efficiency, and iGPU performance.
>>
>>46856386
the visual glitches only affected ShadowPlay recordings in multi-GPU SLI setups. it looked fine during gameplay; only the recording was affected.
>>
>>46856371
You mean 12%
>>
http://wccftech.com/amd-reveals-radeon-r9-390x-gdc-wip/
>4 GB
lol enjoy your $700+ GPU getting obsolete real quick at 4K+ resolution
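For scale, the raw colour buffer at 4K is tiny; it's textures, render targets, MSAA, and caching that actually fill 4 GB. The numbers below are just arithmetic, not benchmarks:

```python
# One 32-bit (4 bytes/pixel) colour buffer at 3840x2160 -- the
# framebuffer itself is small; it's the many render targets and
# high-res textures on top of it that exhaust 4 GB at 4K.
w, h, bytes_per_pixel = 3840, 2160, 4
mib = w * h * bytes_per_pixel / 2**20
print(f"One 4K buffer: ~{mib:.0f} MiB")  # -> ~32 MiB
```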
>>
>>46856641
8gb version later
>>
>buy the new titan
>install it
>turn on a movie
>go to take a piss
>smell smoke
>jump out of the bathroom
>my house is on fire
Nvidia - not even once
>>
Nvidia needs to fire their entire upper management after the past few months of disasters.
>>
>>46856657
not on the 390x. not happening with 1st-gen HBM, which tops out at 4 GB (four 1 GB stacks).