


Thread replies: 318
Thread images: 42

File: Radeon-RX480-feature[1].jpg (326KB, 1920x1080px)
No thread on the RX480 yet?

How mad on a scale of 10/10 are nvidia fuccbois?

AMD looking like it took back the mainstream crown

$199
>>
>>54897760
>took back

Never had it in the first place. Not a loyalist to either company, before you start memeing at me.
>>
>>54897821

>Never had

The R9 270X had the crown not long ago, bro. Also a $199 card at release, even if it was just a rebrand.
>>
>>54897890
Are you really saying the R9 270X had more mainstream appeal than any comparable nvidia card? That's pretty deluded.
>>
I have a 6870 and it starts to feel old.
I play in 1080p.
Is this THE card I've been looking for ?
>>
>>54897930
I'm saying that, at the time, it was the best card available for $200, and I wasn't the only one saying so; it was picked by many reviewers on many sites.

Now we're getting the 480 for $199 in a couple of weeks while the 1080 is $600-$700 and not even available right now
>>
>>54897760
It's OK for the price but those TFLOP numbers are lower than the beefed up versions of current-gen consoles that are going to be released.

I'm a little bit disappointed DESU. Hopefully they reveal a higher-end card otherwise I'm staying with the green team.
>>
>>54897973
>not even available right now
It came out last week.
>>
>>54897981
They are. While the specs aren't released, there will be an RX490.

This is clearly a budget card, especially for those who want to get into VR but don't have the cash for top-end GPUs

>>54898010
Yeah it did. Now please show me a website that has it in stock for MSRP. Sure, I could buy one on eBay for a 20% markup, I guess that counts.
>>
>>54897760
I'm terribly disappointed by this. I have a 4K FreeSync monitor and was eagerly anticipating AMD releasing some new cards that are actually fast and I was willing to spend money for them too. Guess I'll have to pay NVIDIA instead and give up on FreeSync for the time being.
>>
>>54898034
>>54897973
>claims 1080 isn't out yet
>acts like 480 is already here
>the literal facts are the exact opposite, even if 1080s are currently out of stock because they sold so well to idiots willing to buy reference

Smells like shill in here.
>>
>>54897760
not mad at all. it's good competition. Still going for the 1070 though based on the current benchmarks and my budget
>>
>>54897981
PS4K is only 4.2 TFLOP and can't overclock so.....no
>>
>>54898082

>I didn't say it wasn't released, I said you can't buy it without paying a retard tax, but nvidiots are used to that
>Clearly said the 480 will be released in a few weeks

Smells like #Nvidiadefenseforce in here
>>
>>54898034
It still means it's available so it does count. I'm waiting a few months, probably going to get the 1080.

As for VR performance, all future cards should handle it well now that the architecture avoids a performance hit when rendering multiple viewpoints, so any cheaper cards in the future should be fine for it if you're willing to wait.
>>
>>54898121
I've heard 5.4 TFLOP and the XBone+ will be 6 TFLOP.

Of course I may be mistaken, I don't think anything official has been announced yet.
>>
File: NvidiaTheWayItsMentToBePaid.png (281KB, 1452x734px)
>>
>>54898052
So you bought a 4k monitor yet you had no idea that Vega wouldn't be out for a while and you're unwilling to crossfire two 480s for laughably good performance/$? Sounds like you're just an idiot or impatient.
>>
>>54898176
single card setup is always better
>>
>>54898169
>implying scalpers mean anything

Scalpers scalp everything, AMD, nvidia, PCs, consoles, clothing, tickets, jews gonna jew.
>>
>>54898052
You could xfire two 480s; it'll be cheaper and faster than a 1080, and it'll actually be available on release date.
>>
>>54898166
2304 shaders with dual-issue single-precision floating point, running at 911 MHz = 4.19 TFLOP/s
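That arithmetic can be sanity-checked directly: peak FP32 throughput is shaders × FLOPs per clock × clock speed, where "dual issue" means 2 FLOPs (one fused multiply-add) per shader per clock. A minimal sketch in Python:

```python
def tflops(shaders, clock_mhz, flops_per_clock=2):
    """Peak single-precision throughput in TFLOP/s.

    flops_per_clock=2 assumes one fused multiply-add (FMA),
    i.e. "dual issue" FP32, per shader per clock.
    """
    return shaders * flops_per_clock * clock_mhz * 1e6 / 1e12

# 2304 shaders at 911 MHz, the figures quoted above
print(tflops(2304, 911))  # → 4.197888
```

Swap in other shader counts and clocks to sanity-check the console TFLOP claims upthread.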
>>
>>54898218
This is just blatant shilling now, man. All you're missing is the ™s.
>>
>>54898216
It means you cannot purchase a 1080 for MSRP unless you know a guy who knows a guy who works at a store and will set one aside for ya.

Pretty bad when it's supposed to be the best flagship card in the world at the moment
>>
>>54898218
enjoy your microstutter
>>
>>54898224
So there are official numbers? Link?
>>
>>54898236
I know rite, I mean, How can he just sit there and type something true, doesn't he know where he is right now.

"The truth hurts" - you

>>54898253
YES! FINALLY A MEME

Not a single 3.5 reference though, hmmmmm
>>
File: 480.png (386KB, 1579x638px)
>>54897760
>No thread on the RX480 yet?

Fuck off already. I'm planning to buy the 490 once it's out, but just fuck off already. We don't need the hundredth thread on Nvidia or AMD's new cards that will just devolve into shitposting.
>>
>>54898240
I don't understand why you would feel bad for retards who want to buy reference cards. No one should want this. I didn't even think they'd sell out, honestly, I mean with the price point and all.
>>
>>54897821
Proving that you are 18.

ATi cards used to demolish Nvidia cards
>>
>>54898281
A catalog search for RX480 shows only OP and a thread made 20 mins ago
>>
>>54898253
Freesync fixes microstutter
>>
>>54898176
>>54898218
I've got 290X CF, getting 2 RX 480s would pretty much be no upgrade at all.
Also CF support is fucking crap. AMD gives no fucks. They pretend to, but they actually don't.

When I bought my monitor FreeSync was basically a bonus, it was the cheapest 4K 60Hz IPS screen anyway, it just happened to come with FreeSync. There wasn't any information available back then about what AMD was actually going to release, I was expecting them to release a high-end product as they have for the last few generations and as NVIDIA has done with the 1080.
>>
Fucc.
Looking into the R9 480 meme.
I'm a fucking poorfag and 8 niggabytes of VRAM for 200 shekels sounds a bit too good to be true if it's actually on par with, say, a 980.

Fuck my nigs. Help a fellow poorfag out.
What cheap CPU, Mobo and ram combo can I use to make a poor man's gaming rig? My budget is around 700-800 euroshekels and for the first time in years this actually seems like an achievable goal thanks to that cheap GPU

I WANT TO BELIEVE.
>>
>>54898240
>running out of stock
>put up prices to slow down sales
>not knowing how supply/demand works

Just wait a month or 2 FFS!
>>
>>54898301
29, no they didn't. At best they were competitive.
>>
>>54897760

I am proud that Raja designed this product, I know it will sell well in India
>>
>>54898308
don't you need a new monitor for that?
>>
>>54898301
they were ahead for about a year and have been on a pretty steady decline for the past 11 years. At its peak it only had 55% of the market. I wouldn't call that a crown unless you're talking about the paper one you get from BK.
>>
File: original[1].jpg (50KB, 951x433px)
>>
>>54898325
31 here. ATi stomped the shit out of Nvidia in performance per dollar almost every generation except one. (8000 series if my memory serves correctly)
>>
>>54898362
2 cards barely pushing past a single card. That's ATi's bread and butter.
>>
>>54898382
The GTX 1080 costs more than three times as much as the RX 480.
>>
>>54897760
>thread on the RX480
where the fuck have you been
>>
File: 1456961192076.jpg (181KB, 750x750px)
>>54898318
>>
>>54898382
2 advertised midrange cards $400
1 advertised best of the best high end card $699(Where Available)

Sounds like some delicious bread and fantastic butter
>>
>>54898362
Barely above a 1080 in the game that favors AMD most heavily. I'd love to see FCAT results on that CF setup though. I'm almost certain that single 1080 has lower frame time variance and a higher 99th percentile FPS.

Going for multi-GPU with slow cards is essentially always a mistake compared to buying a single card that's just as fast.
>>
Nvidia will have to lower their MSRP all around soon.

Win/Win for us
>>
>>54898308
source on that claim
>>
>>54898432
Are most nvidia fanbois blind when it comes to the dollar sign and the amount directly after?
>>
>>54898325
But they did.. from like 1999 to 2008
And the biggest difference in actual use has been less than 10 frames in most games and programs
are you lying?
>>
>>54898432
it changes with liquid vr though where each gpu's renders the scene for each eye
>>
>>54898446
Do you know what freesync does?
>>
File: 1464821741608.png (76KB, 895x790px)
>AMDrones still think they are relevant in the modern tech era
>>
>>54898456
I haven't owned NVIDIA cards since the 8800s, those were excellent though. I've learned (by buying a CF setup myself) that being a penny pinching retard doesn't pay off when it comes down to getting amazing multi-GPU deals on paper. An extra $200 for a better gaming experience for 2+ years until the next upgrade is of no consequence to me or to anyone with a decent source of income.

>>54898488
Too bad there aren't any worthwhile actual games to play in VR yet.
>>
>>54898497
Vsync at all frequencies, and it doesn't lock framerates.

Now link a source that says it fixes microstutter
>>
>>54898497
FreeSync doesn't include magical frame pacing, don't be retarded. If you've got 1 frame rendering at 33ms and the next at 16ms, you're still going to get a stuttery mess with FS too, but there won't be any tearing. You'll get rid of some judder which will make it look smoother, but it does fucking nothing to microstutter caused by frame time variance.
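The frame-pacing point above can be made concrete: two frame-time traces with identical average FPS can feel completely different. A small sketch (the traces are invented for illustration, not measured):

```python
import statistics

def fps_stats(frame_times_ms):
    """Return (average FPS, frame-time stdev, ~99th-percentile frame time)."""
    avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)
    worst = sorted(frame_times_ms)[int(0.99 * len(frame_times_ms)) - 1]
    return avg_fps, statistics.stdev(frame_times_ms), worst

# Same ~41 FPS average, very different pacing:
smooth = [24.5] * 100          # every frame takes 24.5 ms
stutter = [16.0, 33.0] * 50    # alternating 16 ms / 33 ms frames
print(fps_stats(smooth))
print(fps_stats(stutter))
```

Both traces average ~40.8 FPS, but the second has a frame-time stdev of ~8.5 ms and a 33 ms worst case, which is exactly the microstutter a variable-refresh display can't paper over.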
>>
>>54898558
Not him
but
Micro-stutters are caused by the monitor being unable to refresh the screen at the same rate as it's rendered, which causes it to tear as some parts are displayed before others, if you can increase the refresh rate of the monitor, it reduces the chance of this happening.
>>
>>54897890
I have one. Great card even today desu. Handles my U2515H fine.
>>
>>54898542
Don't really get your point....

It's not $200 extra, $400 is the cost of TWO 480s, which not only outperform one 1080 but also cost $200+ less than the 1080.

So not only is it cheaper, it outperforms the GTX, yet your argument is "I don't care, I'm rich, so I'll buy the 1080"?
>>
>>54898416
Honest advice bruh. Get a locked i5 (skylake if you can afford the ddr4) and basic non-z series mobo and use integrated graphics until the 480 drops. 8gb ram with good reviews, 212 evo or similar, decent atx case with good reviews (i like fractal stuff), a smallish ssd for your OS and a tb hdd for storage. Splurge on a nice platinum series seasonic psu that will last you like 10 years. Easy as that.

Don't bother spending extra on planning to oc your cpu. You won't be cpu bound in most anything you do if you're a real poorfag.
>>
>>54898599
Another anon made this for me.
Swap the AMD processor with the i5 6500 you say?

http://de.pcpartpicker.com/list/jBnfr7
>>
>>54898621
Absolutely. The FX series has always been niche and is very weak at single threaded tasks. If you can afford any i5 go for it. Something like a 4460 and an H97 series motherboard should only cost you about 50-60 euros more. Use the stock cooler if you need to and that will save you another 20-30 for now.
>>
>>54898596
It's $200 extra for a 1080, if we're going by MSRP.

The 1080 is "outperformed" (note: may actually be less smooth in practice due to microstutter) in the one title that favors AMD most heavily. I'd love to see those RX 480s in a game that favors NVIDIA, how would they do in TW3? How about GTA V where that Pro Duo showed absolutely horrendous frame time results?

Also, the scaling is fucking shit, going from ~40FPS to ~60FPS, around 50% higher FPS for 100% extra money. Amazing. If a game doesn't support CF/DX12 multi-GPU, you're absolutely fucked too.

My point is that $200 isn't much for what will very, VERY likely end up offering a better overall experience, fewer CF glitches, and no waiting on late multi-GPU support. I'm not rich either; an extra $200 over the 2 years or however long you wait between GPU upgrades really isn't all that much. Being able to spend an extra $200 doesn't make you rich.
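To put the scaling complaint in numbers (using the rough FPS figures and MSRPs quoted above, not a real benchmark):

```python
def cf_value(fps_single, fps_cf, price_each):
    """FPS-per-dollar for one card vs. two of the same card in CrossFire."""
    scaling = fps_cf / fps_single - 1      # extra FPS bought by the 2nd card
    fpd_one = fps_single / price_each      # FPS per dollar, single card
    fpd_two = fps_cf / (2 * price_each)    # FPS per dollar, CF pair
    return scaling, fpd_one, fpd_two

# ~40 FPS on one $200 RX 480, ~60 FPS with two, as argued above
scaling, fpd_one, fpd_two = cf_value(40, 60, 200)
print(f"+{scaling:.0%} FPS for +100% money")  # → +50% FPS for +100% money
print(fpd_one, fpd_two)                        # → 0.2 0.15
```

So with those numbers the second card actually lowers FPS per dollar, which is the argument being made.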
>>
>>54898621
>>54898663
Or go with the 6500 and a skylake motherboard, just make sure you then buy ddr4 ram instead of the ddr3 the fx or haswell cpus use. This option will be a little bit faster but a bit more expensive.
>>
>>54898586
FreeSync doesn't magically increase the refresh rate of your monitor.
>>
>>54898669
MSRP RX 480 = $200

MSRP GTX 1080 = $600

Open your calculator application.
>>
>>54898669
CF doesn't fuck up nearly as much as it did.
>>
>>54897760
No thread? Not import maybe. We know its blowing as every shit from AMD the cuck company, for cuck people, curry peoplo gtfo.
>>
>>54898690
Not him but what point are you trying to make? Are you ESL? He pretty clearly is talking about the $400 cost total of two 480 cards and that saving $200 up front isn't worth the headaches of dealing with CF.
>>
>>54898712
A literal Prabhjeeth in our midst.
>>
>>54898690
Am I being trolled or are you actually this stupid?

2x RX 480 = $400
GTX 1080 = $600
1080 costs $200 more for similar performance in terms of average FPS (which doesn't say much).

>>54898711
Yeah it does. Overwatch at 4K is a microstuttery mess on 290X CF, even if I drop all settings and get FPS in the 120-150 range.

TW3 Blood & Wine stutters horribly in CF if some HUD elements are displayed. Yeah, something is fucked about HUD elements of all things. Dynamic water simulation (waves/ripples) is also broken in CF and has been broken for over a year now. Will probably never be fixed.
>>
ITT fanboys who constantly say 3.5GB VRAM isn't enough shill for an xfire 4GB setup over a single card 8GB solution based on a benchmark that heavily favors AMD
>>
>>54897760
11/10
>>
>>54898759
>overwatch
>the witcher
WEW LAD
>>
>>54898621
http://de.pcpartpicker.com/list/Gws2Fd
>>
>>54898759

Saving 33% cost for similar performance doesn't say much?
Why even post in this thread if you aren't worried about price/performance? It's the entire OP.
>>
>>54898791
No, average FPS doesn't say much, not price. 64.2 FPS (2x RX480) could easily be less smooth than 60.3 FPS (single-GPU GTX 1080) if its frame time variance is worse.

>>54898784
I don't particularly care about your taste or your opinion on mine.
>>
>>54898684
This looks good.
But I kinda want this case instead
http://www.sicomputers.nl/case-midi-corsair-spec-alpha-black-silve.html
For the price I think this fucks that case in the ass aesthetics wise.
As for the mobo, looks functional but a little on the ugly side.
Can I swap it out for link related and have similar results?
https://www.afuture.nl/productview.php?productID=4505185
>>
>>54898732
This is exactly what I'm saying, yes.
>>
>>54898831
Keep changing your argument; odds are you'll be right one of these times.
>>
>>54898846
That's a Z board meant for overclocking. It'll work great, just realize you're overpaying a little bit for a feature you may never use unless you get the K version of that i5. I'm not into windowed cases but if that's your thing go for it.
>>
>>54898362

>AotS

OK, I don't support any vendor because supporting a company is fucking stupid since all companies just want to take your money.

But this AotS bullshit is getting pretty old. How the hell did a game absolutely nobody plays, and which is completely sponsored by one vendor, become a standard benchmark? At least back in 2007 people actually played Crysis, and it looked god damn good, unlike AotS.
>>
>>54897760
1.25/10
Or 0.5 out of 4.
>>
File: sharedimage.png (1MB, 1200x630px)
Is Ashes of Singularity the most important game of the year if not the decade?
>>
>>54897964
>I have a 6870 and it starts to feel old. I play in 1080p. Is this THE card I've been looking for?

I have a 6950 unlocked to 6970 and it is pretty much the card I was looking for, provided they'll have a shorter length version. My case can't take those retarded 30cm long cards, too full with HDDs.
>>
File: 09870788.jpg (261KB, 411x611px)
I need moar fucking leaks! The wait is killing me! I cannot wait 6 more days!
>>
>>54898902
>people actually played Crysis
I agree that it looked good but come on man nobody could even run it for like 4 years
>>
>>54897964
390 overclocked is
>>
>>54898925
buy a better case next time

>>54898935
crysis still looks good...
>>
>>54898362
Gotta need a source on that image.
>>
>>54898945
>buy a better case next time

I would if they still made good cases. Fucking nothing has decent HDD suspension anymore; even the "silent" cases have a super thin grommet that suppresses fuck all noise. When I tried moving to a Fractal R5, the entire case was vibrating due to the hard drives.

Fuck that, I'm keeping my P183.
>>
>>54899013
you can buy super grommets that fit any screwholes.
>>
>>54898945
>>54899013
oh and on that note, I can fit a 30cm card no problem, but only by either removing the HDD cage or keeping the card in the bottom PCIe slot (it just about fits below the bottom HDD).

>>54899029
fucking where?
>>
File: $_57.jpg (127KB, 1200x1200px)
>>54899029
>you can buy super grommets that fit any screwholes.

Yeah, shitty ones like this that don't absorb fuck all vibration. They are goddamn useless.

I suppose it's a moot point since in a year or two SSDs will reach price parity with HDDs and I'll be changing all my 2-3tb drives to those.
>>
File: GTX-1080-31.png (244KB, 514x384px)
>>54898902
It's a shitton better than what Nvidia have chosen as their go-to benchmark.
Ashes is actually a game that was built for PC from the ground up and makes full use of the technology that is available.

Rise of the Tomb Raider, on the other hand, is a shitty console port that runs poorly on AMD cards (a memory leak drags the minimum framerate down to single digits, even though the memory isn't used to the fullest) with a DX12 patch that for some reason caused a loss of performance on every card that doesn't have 8GB of memory.


I'll take AMD's Ashes result over Nvidia's RoTR bragging.
>>
>>54899151

>Its a shitton better than what Nvidia have chosen as their go to benchmark.

But people actually play RotR and Witcher 3. I don't know anyone who plays AotS but plenty of people who play RotR or Witcher 3.

AotS is ugly as fuck and there is no reason the game should run as bad as it does on any hardware. I really hate how they lie about no other game being the same scale because I played a ton of SupCom back in the day and AotS looks like a shitty SupCom clone.
>>
>>54899036
usually websites that sell PC case modding stuff. Check forums like Overclockers and HardOCP for which sites sponsor case modders, or whether they have a list of good webshops.
>>
File: hurrimretarded.png (10KB, 300x300px)
>>54899074
>i bought one product and it was shit therefore all other products in the same category are also shit
>>
>>54899013
you might wanna spend a bit more than $100 if you want a solid case.
>>
>>54899013
Buy some Scythe HDD silencers and remove the HDD bays.
Buy bigger drives.
>>
>>54899262
I guess you don't know how those things work then. They are simply not thick enough to absorb enough vibration efficiently. The best grommets on the market are the ones in old Antec cases, and even those are not foolproof, just very close to it (close enough that it's not worth doing elastic suspension anymore, since it is way faster to use the grommets).

>>54899245
OK, so you have no idea where to buy them or who sells them or what you are even looking for.

>>54899289
Got any suggestions?
>>
>>54899359
>OK, so you have no idea where to buy them or who sells them or what you are even looking for.
i just gave you the best fucking place to find out yourself, you ungrateful cunt.
>>
File: 01_spc02.jpg (309KB, 1024x768px)
>>54899013
Drive Hammock is king.
>>
>>54899230
Yes, it is a good idea to benchmark a popular game that everybody plays.
But that is up to the individual to decide, and that is not the point here.

We're talking about both AMD and Nvidia, and the benchmark each company chose to showcase its GPU prowess.
RotR is a shitty console port, a broken tool for measuring performance that ends up giving a skewed, unfair result heavily biased toward one side.

When it comes to GPU comparison, I'd take Ashes result over RotR any day.
>>
If my current card is a 2GB MSI HD 7850, would it be wise to upgrade to an RX 480? Am """poor""".
>>
>>54898966
AMD @ Computex
>>
>>54899359
>Got any suggestions?
i only have my personal experience with a corsair 600T and i never put many vibrating drives in it.

that said, it's old and wasn't really worth the price even when I bought it, but it's big and spacious inside and has good cable management. Fans are meh.
>>
>>54899420
I currently have a 7870xt and Im planning on a 480 for my new build
>>
>>54899393
No, you told me to go to hardocp or overclockers and check their sponsors list. Sorry for not being grateful while you keep shilling other sites. And telling me to look around on their forums isn't fucking helpful, do you think I haven't done that already?

>>54899400
yeah, I used to do that too in the 00s before I got the P183. Kicks ass but a bitch to set up.

>>54899348
>use a 3.5 -> 5.25 converter

have too many drives to do that. 5.25 slots are a great way of adding more drives though, I agree on that.
>>
>>54899013
Corsair 230t is neat
>>
>>54898501
>infinite 1080
kys
>>
>>54899400

that's ingenious
>>
>>54898966
>>54899449
The single-GPU result can be viewed in the RadeonDemo profile on the Ashes benchmark page.
>>
>>54898759
I am an AMD fanboy and even I know your math is wrong.

2x RX480 would probably have to be the 8GB version to beat the 1080 (at least in AotS). The price for 2x RX480 8GB is $230 each = $460. We have no idea if that combo will be any good in DX11 games or if CF microstutter will rear its ugly head.

I have an R9 290 here by Sapphire and I would enjoy a boost but let's be realistic here. Wait for actual benchmarks. You are going to have to wait a little under a month. Patience anon.
>>
>>54899486
no, i told you to go to their fucking case modding forum section and look at a fucking case modding thread and check that person's sponsor you fucking retard. learn to read.
>>
this probably classifies as shitposting but is logical increments a good guideline for basic builds that aren't super specialized for anything specific?
>>
File: 1459103850405.png (330KB, 330x319px)
>>54897760
>390x rebrand
>>
>>54899541
also, i gave you an option to check their forum if they have a fucking list of webshops that sell case modding equipment

that's fucking miles away from "go to these sites and check who their website's sponsors are in the right/left side banners"
>>
>>54899537
I don't know what you think I'm saying, but what I'm actually saying is that considering how shit CF is in general paying the extra cash for a 1080 instead of 2 AMD cards will likely be worth it.
>>
>>54899486
Well, you could either use four 5.25" bays for HDDs, or one 4-in-3 bay adapter and one 5.25" HDD silencer.

https://www.youtube.com/watch?v=sEEXk-LcO38
That way you not only add a spot for another HDD, you have space for large cards.
>>
>>54899577
yeah, I've seen those, but I have 6 HDDs and no case comes with that many 5.25 slots today.

And if I put that in my P183 I could have 10 hdds in it (currently have 6).
>>
You can't blame AMD. They just don't have the funds for R&D.
>>
>>54899600
How many TB do you have?
I have limited my Drive count from 13 down to 4 (2 drives in 1997 - 13 in 2004 - 4 in present) and I still have 14.2TB.

Maybe get one of these or similar?
>>
So if i am fucking poor and i want to get a good pc in the next holidays for about $500 is the RX480 the way to go? what is the best processor to match it up? [spoiler]i was thinking Athlon X4 860K??[/spoiler]
>>
>>54899600
http://pcpartpicker.com/products/case/#sort=d3&page=1

also full tower/server chassi.
>>
>>54899562
>rebrand
>overall newly improved arch
>50% perf/watt from the die shrink, an extra 70% from the new arch
>more performance per clock
>fewer shaders, but performs almost equally (2.3k vs 2.8k) at a mere 20% higher clock

>Pascal
>look! we added some VR shit
>worse performance per clock
>clocked the fuck out of it, advertised as 180W TDP, but sucks a higher amount of juice
>performance gain mainly from the high clock, zero improvement in the arch
>thermal throttle
>>
>>54899633
if you're only buying the CPU and GPU, yeah sure. if you actually want a complete PC that you can use, save up more money.
>>
>>54899645
>180watt TDP, but suck higher amount of juice
that's because TDP = heat generation, not fucking power consumption you retarded fuck

but i agree with the rest of what you said.
>>
>>54899633

i have an i3 2120.. works fine for me.. i guess video encoding could be a little faster, but I dont encode much

i'm just waiting for a cpu with an igpu that can playback 60fps 4k hevc videos ... maybe zen
>>
>>54899631
>How many TB do you have?

never enough, obviously.
>>
>>54899547
No, logical increments is focused on gaming
>>
>>54899701
14.2 as said.
I did not count my other PC which have a 512GB SSD only and another that have a 256GB only.
It's only the big PC that I have HDDs in though that also use a relic 120GB SSD.

2x4TB
2x3TB
External 200GB which is why it's not counted in the case.
>>
>>54899667
Since it sucks more juice = more heat = higher TDP.
It's a meme arrow, I'm trying to make it as simple as possible.
>>
>all these retards in these threads saying >b-but muh dual 480 can beat a 1080 in this one game
>when they're asked about how bad it will perform in other games like witcher 3 or gta 5
>n-no one plays these games anyway

Literally kill yourselves. Are you going to buy 2x480 to only play it on the ~3 games that will actually work properly? How stupid can people get.
>>
>>54899645
you fell for the bait.
>>
>>54899645
Even AdoredTV in his latest video said the 1080 only draws about 170W under a typical gaming load. Are you going to call your lord and saviour a liar now?
>>
File: Untitled-1.png (319KB, 980x515px)
>>
Poorfag here, would an RX 480 work well with my XPS 420's generic power supply? It says 375W, 30A on the 12V rails, but I don't know if I trust it, considering that the CPU is a Q6600
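A rough power-budget check of that question: 30 A on the 12 V rail is 360 W, and the big consumers are the CPU and GPU. The wattages below are ballpark assumptions (Q6600 TDP ~105 W, RX 480 rumored ~150 W board power, a generous ~75 W catch-all for the rest), not measurements:

```python
# All wattages are ballpark assumptions, not measurements.
rail_12v_watts = 30 * 12  # 30 A on the 12 V rail = 360 W

load_watts = {
    "Q6600 CPU": 105,          # Intel's rated TDP
    "RX 480 GPU": 150,         # rumored board power
    "board/drives/fans": 75,   # generous catch-all
}

total = sum(load_watts.values())
headroom = rail_12v_watts - total
print(f"load ~{total} W vs {rail_12v_watts} W rail, headroom {headroom} W")
# → load ~330 W vs 360 W rail, headroom 30 W
```

~30 W of headroom on an aging OEM supply is cutting it close, so the thread's advice to just grab a cheap modern PSU is sound.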
>>
>>54899778
Don't bother arguing with those retards. Some idiot earlier implied bad overwatch performance didn't matter, it's literally one of the most popular games out right now.
>>
File: image.gif (13KB, 102x168px)
>>54899810
Fucking hell no.
>>
is RX480 better than AMD 280X? is it more than 40% better?
>>
Is the RX480 better than the RS240?
Looking for something to keep the temp down on my housefire AMD hardware.
>>
I dunno if I can hold out another month anon. Should I buy pic related NOW?
>>
>>54897760
How long will my 390 last me?
This is exactly like last year when those came out but I couldn't sit on my 6950 anymore.
Should I try to sell the 390 for like 250 and put that into a 400 series card? Or should I wait a generation or two?
>>
>>54899801
>AdoredTV

Who?

The GTX1080 suffers from thermal throttling and can't maintain its boost clock.
If thermals weren't in the way, the GTX1080 could easily go into 200-watt territory.
>>
AdoredTV thinks the RX480 is basically R9 290 performance for $199, plus only 90 watts power usage on average in games. He also thinks there might be an RX490 in the works with more power usage that gets close to the 1070 @ $299, but that's just him guessing right now.

I would prefer to see a 2x480 on one card @ $299 myself, but I doubt we will see that.

AMD really are not competing on performance whatsoever. They are competing on price alone. The question is whether gamers are going to worry about price so much. If you are a poorfag I guess you will just get what you can afford. But really it still annoys me that AMD are not even bothering to compete on performance at all. I don't really care if I can buy another card that performs as well as my R9 290 for less power and initial cost. Where's my upgrade???
>>
>>54899879
Settle down anon, a 390 is going to be fine for quite a while.
>>
>>54899879
I have a 390 and a 1080p 144hz monitor

Think I'm going to hold out at least another cycle since I didn't get into vr instantly like I planned. Hopefully by next year or so it's progressed a lot.
>>
>>54899893
just get a fury X anon. you'll be able to play games at 16K.
>>
>>54899822
I'm guessing my range is more like 960 or 950 then.
>>
>>54899920
Just get a better psu. Lord the evga 600b is like 50 bucks
>>
>>54899915
Yeah right. Fuck off. I am an AMD fag and I have no reason to 'upgrade' (at least to an AMD solution) based on what I see.
>>
>>54899962
but are you VR ready tho.
>>
>>54899893
>I would prefer to see a 2x480 on one card @ $299 myself but I doubt we will see that.

An RX480 already set a new bar when it comes to price/perf.
You're asking way too much here.
>>
>>54899880
>it's this guy again

Go back to bed
>>
>>54899987
>it's this guy again

Who?
>>
>>54899863
Nano is btfo by this card
>>
File: 7304.png (83KB, 480x360px)
>>54897973
>tfw going to upgrade from 270x to RX 480
>>
>might finally upgrade from a 660 ti to the R9 480 because of that price/performance
Someone list every single problem they've experienced using an AMD card in the past couple years.

e.g. maybe some random program with an openGL bug that happened to only affect AMD cards made using AMD a pain compared to an Nvidia card.
>>
>>54900109
Bugs I have encountered? Plenty. Videos stutter on Youtube if I have 3 monitors enabled. HDMI 'forgets' I have an AVR connected after a shutdown/reboot and I have to remove the driver and redetect it in device manager. MadVR won't play ball in Win10 unless I switch to DX11 rendering and disable advance frames. Having to change things for Goyworks in The Witcher 3 (does not really count but still it sucks).

Just some I can recall right now.
>>
File: 1465009683276.png (96KB, 480x360px)
>>54900084
>tfw going to upgrade from gtx460 to rx 480
>>
>>54900182
Downclock the RX480, otherwise it'll be too fast for you to realize just how fast it is.
>>
>>54900179
I fucking knew it. That's the tradeoff for going with an AMD card but the r9 480's price/perf is too attractive. I'll wait and see what Nvidia does with the 1060.
>>
>>54897760
Why are there so many threads like these?
If you're just a gamer, go with AMD, but if you use your gfx card on a professional level, go with Nvidia.

Trying to use AMD for RT and any 3d processing type work is a fucking trainwreck.

There's a reason behind one being more expensive than the other.

Shit like this is kinda proof that 4chan is exactly the same as reddit, 9gag, <insert other>.
Bunch of clowns jumping on bandwagons without having any personal experience.

That being said, my gaming machine will prolly have a RX480 while my workstations will have 1080's.

> little mad
>>
>>54899151
>Ashes of Literally Who
>better than Witcher 3

EL

OH

EL
>>
File: damage.png (2MB, 1418x876px)
>>54898915
Yes, AotS is one of the most important games in GPU history for one reason: it uses nearly all of the upcoming DX12's features, and whoever supports those features will come out looking good. Everyone keeps saying AotS is an AMD cherry-picked game, but it's not. AotS is a DX12 game; the developers are using nearly all of DX12's features. All the nvidia users are angry, but what about when devs use Gameworks and fuck up AMD cards? Then it's OK.
>>
>>54900224
>better than Witcher 3

Who? Where?
>>
>>54898362

This is a DirectX 12 amd sponsored game.

Any benchmarks with things that are actually close to reality and its standards instead of this cherrypicked meme game nobody plays?
>>
>>54900208
>if you're just a gamer, go with AMD
lolwut?
>>
File: 1455194536777.jpg (7KB, 231x218px)
>>54900182
>>
Long time other board lurker looking for a replacement computer. On a scale of one to ten how awesome is this?
>>
>>54899863
I love the nano.
>>
File: 1465010152719.png (96KB, 480x360px)
>>54900182
>tfw going to upgrade from R9 390 to GTX 9800 GT
>>
>>54900264
199
>>
File: angrynvidiots.jpg (70KB, 1374x689px)
>>
>>54900254
you say AMD sponsored, but in reality it's just what happens when a dev doesn't give special treatment to Nvidia; that's why it's a good metric
>>
>>54900264
If you're looking for a budget gaming rig you found your new card. Plain and simple.
>>
>>54900193
kek
>>
>>54900280
It still represents something that won't be the standard within the lifetime of these cards, so nobody cares. The vast majority of AAA games will be nvidia sponsored anyway and directx 11 is the standard.
>>
File: image.jpg (118KB, 1200x675px)
>1070 alone is reportedly 20 percent faster than a Titan x
>people actually think one 480 is going to be comparable to that let alone a 1080
>>
>>54900276
lol, have a wallpaper for that answer.
>>
>>54900311
Where did you get "one" from desu
>>
>>54898482
>But they did.. from like 1999 to 2008
No they didn't; "competitive" is an overstatement.
>>
>>54900270
what do you think? does it 4k well?
>>
File: 1439834943389.jpg (544KB, 4500x4334px)
>>54900182
>tfw going to upgrade from a gt 440 and an E8400
to a rx480 and an i5-6600k
>>
>>54900311
just like the 1080 being 2x better than the titan x
>>
File: 1464747111116.jpg (409KB, 1280x720px)
>>54897760
hol up
>>
>>54899810
You got to ask yourself if you like housefires.

You'd probably be okay, but regardless I'd upgrade to something like at least a solid 600 W PSU
>>
File: AMD Radeon RX 480-1200-80.jpg (128KB, 1200x675px)
>>54900311
I do think some AMD fans are going a bit overboard on one benchmark. I also think the people expecting to see 2 480's take down a 1080 are going to be very disappointed. I think 2 480's will come close to the 1080 and even if it comes a few frames under, you got to say...not fucking bad for a $400 setup.
>>
>>54900363
DESIGNATED
>>
>>54900343
Hell yeah, the nano does 4k on all three of my 4k monitors at 45 FPM
>>
>>54900374
That's exactly what i'm expecting once benchmarks come out of two 480's in more games

Less than a 1070 or 1080 with some occasional bright spots in newer games that make use of Mantle or whatever, but a bitching setup for 400$
>>
>AMDfags acting like their midrange poo card is a godsend

truly the poorfag's company
>>
>>54900374
I'm not denying that it will be good for people who don't mind CF. I'm not going to touch crossfire.

I think it won't come close to the 1080 but will exceed the 1070 in games where CF actually works.
>>
>>54900303
no games entering development at this point are going to be DX11; the majority of games that entered development recently are DX12. DX11 is not the standard anymore, only for things that were started years ago. So in the next year(s) most, approaching all, are going to use DX12. And btw, people buying a $200 card upgrade every 2+ years, so it's not like it's going to be outside the card's lifetime either way.
>>
>>54900374
Fuck having two cards though.
>>
>>54900382
I'd rather have the 1070 for 380 from what I've seen.

also the 8gb 480s (required if you want high res gaming) are $230 each.
>>
>>54900343
Yes it does, see video.

https://www.youtube.com/watch?v=ojFMO0EsRF4
>>
Gee Raja, why does your mom let you have TWO RX 480s?

No drivers!
>>
>>54900392
>no games entering development at this point are going to be DX11
[citation needed]

>DX11 is not the standard anymore

Yes it is. Protip: a standard means the majority, and how many directx 12 games are out there right now? Do you even need all of your fingers on both hands to count them? I'm fairly sure you barely even need a single hand. Besides, it has already been disclosed that directx 12 development is hard, and in this day and age, where more often than not games come out unoptimized to hell and beyond or just plain broken, do you really think devs are going to care this much about appealing to the segment of the market with the absolute lowest market share? That wasn't true for gameworks, why would it be true now?

The thing is that this board has a massive AMD bias and it has a bad case of clinging to the one benchmark where AMD excels; then when Nvidia comes and moneyhats the vast majority of the AAA games out there, the same people start crying, but the shit performance will still be there regardless.

Ashes of the Singularity is not a true representative of the standard; it's simply not a representation of "what you're getting", it's a best-of-the-best possible case scenario in which AMD clearly has the lead. Nvidia, however, has a tight hold of the market and its share is almost at 90% when compared to AMD. That IS something to consider whether you like it or not, and thus we need benchmarks of more common scenarios instead of just looking at the one cherrypicked benchmark of a game literally nobody cares about.
>>
>>54899607
>implying Intel and Nvidia don't drop off a monopolybux suitcase in front of their doorstep every week

AMD just doesn't know how to spend their money, even years ago they cared more about partying and buying Rolex watches
>>
>>54898685
Uh, it does if it supports Freesync, though Freesync has an upper limit on the rate.
>>
>>54900384
>>54900393
I think the next generation of APIs(DX12, Vulkan) will make CF/SLi easier to deal with.
>>
I'd like to chime in here and laugh at all of you getting buttmad over the choices that others make.

I own pretty much every GPU released in the last 5+ years, AMD and nVidia. All of them are functional working products that have their place and purpose.

You people are the kind of guys that argue whether a Ram or Ford is the superior pickup, and try to convince everyone else to validate your opinion.

Buy whatever you like and fits your budget and be happy with it. In my own experience, AMD's GPUs are wonderful for budget builds; if I wanted a top tier machine I'd get the best single GPU nVidia has for sale.

Buying a "budget" nVidia card makes no sense in my opinion, the same way you wouldn't buy a luxury car with a barebones configuration. Nvidia's low end cards don't really have a place, because for the price of one of their budget cards you can almost afford a top of the line AMD card.

Every computer I ever built for those on a budget had an AMD card in it, because the value proposition is just too good. But if someone says they want the best and have an open ended budget, there's not really much of a choice.

I can't wait to get my hands on the new AMD and nvidia cards and work with them both. Arguing over it is bloody stupid though. Unless you have worked with both companies' products you don't really have reliable first hand experience. If you did that then you'd see that 95% of "reviews and benchmarks" posted by those tech blogs are utter bullshit and always slanted one way or the other.
>>
>>54900233
AotS also has a Vulkan build. Vulkan api can potentially be an equally important proving grounds.
>>
>>54900545
>All of them are functional working products that have their place and purpose.
...
>Nvidias low end cards don't really have a place

gg no re
>>
>>54900545
Reddit ran out of content for you?
>>
>>54900558
>Vulkan build
Thank you, I was unaware.
>>
>nvidia showed dx11 performance charts
>AMD didn't
gee I wonder why
>>
>>54900545
saved
>>
>>54900601
<insert favorite amd conspiracy theory excuse here>
>>
>no vega info until Q1 2017
is AMD screwed? They will have to compete with the 1080ti and new Titan as well
>>
>>54900632
if it was as good as their AotS benchmark, they would show it
>>
>>54900653
>reading comprehension
>>
>>54900598
Just mentioning it due to its parity with DX12 as a low level api.
>>
File: 1464657895115.png (344KB, 340x523px)
Here's to hopefully not too bad of a markup in Canada.

Need a 270 replacement.
>>
>>54900663
Fellow anon, you seem to like the new APIs as much as I do. Have you realized that AMD started this trend of APIs with low-level access all in an attempt to destroy Gameworks?
>>
>>54900716
>AMD started this trend of APIs with low-level access all in an attempt to destroy Gameworks?

Not even remotely true.

Not that losing gameworks would be a loss to anyone.
>>
>>54900736
i guess Its just my conspiracy theory.
>>
>>54900570
If you would continue reading the sentence then
>>
>>54900785
>All cards have a place
>Nvidia's low end cards don't have a place because of their price/performance

What about the above needs more context? Just straight up contradicted yourself, foo.
>>
>>54900766
>low level API
>a conspiracy

A conspiracy would sound like this;

>a low level API that is AMD specific, meant to be advantageous to AMD and only AMD GPUs
>the API is restricted, exclusive to AMD, where NVIDIA has almost no access to it in order to improve their product's performance in titles using the API
>>
File: 138.gif (2MB, 480x270px)
>>54900832
Subtle.
>>
>>54898301
Since at least the 8 Series I only remember them being competitive, not beating them, and that is a solid decade.
>>
>>54900832
Ok, this is what happened. AMD made Mantle because DX11 is shit and Gameworks was killing their cards. Not only that, but we (AMD) suck at developing drivers. Let's do it the way console devs do it: let's give them low level access so we don't have to write drivers for every game made. MS was like "wtf, Mantle is going to take our clients who pay for our DX licenses; because of that low level support we need to go the same route." DX12 started implementing Mantle-like features. Mantle then became Vulkan. AMD was like "HAHAHA, now we have 2 low level APIs on the market that span both our console industry (Xbox/Playstation) and our PC industry, and we have to write fewer drivers!!!"
>>
>>54898301
hol up
5xxx series was when they did good damage
4xxx was a tossup
after that they have been losing
>>
>>54898314
Pretty much in a similar boat with an i7 930 and 280x CF. CF sucks though, I ragret it.
>>
File: rnvUh0I-900x261.jpg (54KB, 900x261px)
>>54900988
THIS!!

I don't get why Nvidiots don't realize DX12 and Vulkan only happened because of Mantle, and the only reason AMD came up with Mantle was that they don't have the budget to dedicate devs to writing drivers for every single game (also DX11's fuck-all multithread support for AMD processors, so they came up with heterogeneous system architecture for their APUs but needed something for their older, more powerful desktop offerings).

Hell, AMD even gave Nvidia the Mantle Code but Nvidia told AMD to go fuck themselves and stroked CUDA in the corner.

>AMD Plays nice and ends up winning out
>Somehow a conspiracy
>Nvidia is a dick and kills competition with underhanded jewery
>everyone sucks their dick

Even in the AOTS benchmark, Nvidia had the drivers kill shader code so they get better results (which was why they rendered less physics interactive snow for better performance)
>>
File: Spongebob_is_missing_his_holes.png (1MB, 1063x796px)
>>54901291
I don't know if this post is supposed to sway me from buying nvidia or not. Either way I'm still buying a 1070 as an upgrade from a 7950 I got off a friend. Nvidia are just better in my eyes now, and all I see from amds side recently are damage control posts like yours which are all basically
>it took them *x* amount of years to fix this problem whilst nvidia never had the same issues
>but now it works with/on amd so this must mean amd>nvidia

It all looks like seriously clutching at straws just like that adored tv guy who honestly convinced me to buy nvidia because of his hate for them being better according to even himself.
>>
>>54901291
I was almost about to buy a 1070 after using only AMD for so long but shit like this reminds me why I've always refused to buy nvidia. Evil business practices.
>>
>>54901291
Huh, so the difference in the screencaps from the AMD announcement was probably due to nVidia driver fuckery? It did look to me like a lack of texture blending than a difference in settings.
>>
>>54901291
>Nvidiots
Just like your use of an ancient fanboy slur, AMD is trying to do what 3dfx tried with a custom devkit.
Look where that got 3dfx
>>
>>54901606
>Look where that got 3dfx
Bought by nVidia because they liked the idea so much.
>>
>>54901552
>I don't know if this post is supposed to sway me from buying nvidia or not.

The post isn't to sway anyone's opinions, it's just stating that DX12 and Vulkan are happenstance while Nvidia keeps fucking over everything.

Hell, GOW got fucked over by Gameworks for its DX12 release with their shitty HBAO+ and VXAO shaders and other things.

it wasn't an "x" amount of years needed to fix post.

It was: Nvidia keeps fucking shit up and not sharing code as a means to monopolize, doing shit like selling reference designs for $100 more than aftermarket coolers, lying and squeezing themselves to the top by cheating. Meanwhile AMD has played fair for the most part and even tried to collaborate with Nvidia before DX12 was a thing, but now Nvidiots are saying that DX12 was a conspiracy against Nvidia when they had the opportunity to go into the code and fix their shit accordingly.

>>54901583
Yup, pretty much Nvidia kiking the numbers again because of their poor DX12 performance. Honestly they can work on their hardware instead of trying to cheat themselves out of it.


>>54901606
>Fanboy

I've alternated between cards for the longest while; hell, when people made fun of Fermi I got one just because of PhysX (GTX 480). Now that I've seen what AMD was trying to do, and since I use my card for more than just gaming, I realized that coding for CUDA sucked ass when I can just code for OpenCL and slap in an Intel Xeon Phi when I need more double precision performance.
>>
File: 1430980813601.jpg (17KB, 480x360px)
>>54901613
Oh man I wish you were right. Imagine nvidia buying up amd. The amdrones will all commit suicide lmao.
>>
>>54901665
Have you only been born within the last 13 years or something?

http://www.cnet.com/news/nvidia-buys-out-3dfx-graphics-chip-business/

>Imagine nvidia buying up amd.

Say hello to highly overpriced shit by Intel because Intel will revoke AMD's x86 license if they're bought by any company. (they did this after the Samsung scare)
>>
>>54901665
a) that won't happen. monopoly laws and such
b) if that happens we'll be able to enjoy the 2000$ 1180 together.
>>
>>54901552
Buy what you want, kiddo; we were just having a conversation about the computer graphics ecosystem. If something on /g/ sways you into buying a product, you're an idiot. Buy an nvidia graphics card and I hope you have a good experience.
>>
>>54901649
>nvidia fucking over everything
>khronos group owns and contributes to development of vulkan
>biggest contributors to vulkan currently are nvidia with a dedicated vulkan team
>ceo of khronos group works for nvidia

>GoW
Maybe if you weren't such a deluded ass blasted fanboy you'd realise that the port was utter shite and not because of any nvidia related software. Funny how you fail to mention that those performance problems were patched out soon after and the game runs perfectly fine on all cards even with the nvidia software.

The fact I can tell from your post that you'd rather blame nvidia over incompetent developers show how ass blasted you are. I bet you watch adored tv to get a high. Nvidia are just better when it comes to business as are all companies at the top of their respective chains. Deal with it chump.
>>
File: 1439205951982.png (287KB, 600x764px)
>>54900468
on install base, AMD has 40% of the market, Nvidia has 60%

Going by sales for 2016Q1, AMD has 29% of the market, Nvidia has 71%
>>
>>54901687
>>54901714
Who the fuck cares. At least the cancer that is amd will die a painful, deserved death and an actual competent company can buy up ati and start producing good products again.

Trigger warning btw. Refrain from the asshurt replies :^)
>>
>>54901687
>>54901714
> ------- the joke ------>
> --- > your head < ----
>>
>>54901787
It also affected other patches as well for Tomb Raider etc.

Sure, it was shitty developers, but the fact remains that the shaders didn't work at all on AMD cards and are Nvidia-specific (like Cg shaders back in the day that didn't work on ATI cards, since those depended on HLSL and GLSL).

This isn't anything new, it's just plain old Nvidia again with their bullshit.

Like I said, I'm not a fanboy but Nvidia just has shitty business practices, same with Intel but I'm using Intel CPUs right now.

I pick my parts based on performance and I don't just game, if CUDA wasn't shit i'd be using Nvidia right now.

>>54901836
I'd rather not have a monopoly on x86 processors
>>
File: 1463263669246 (1).jpg (217KB, 1449x1229px)
>>54901914
>Sure it was shitty developers
>it's just plain old Nvidia again with their bullshit
>I-I'm not a fanboy

Make up your mind amdrone.
>>
>tfw you can make amdshills so buttblasted if you mention you're buying nvidia

>tfw when an amdshill tries to attempt to make people mad by saying he's buying amd people just laugh at him

it must suck to be a fan of amd lmao
>>
>>54898902
See >>54896555

It's only going to get worse, async gets amd 25-33%; whereas nvidia gets nothing.
>>
File: DX12FeatureLEvels[1].png (53KB, 798x544px)
>>54901974
>Nvidia sponsored game
>worked fine on Nvidia cards
>Hampered AMD Cards
>not more than just shitty developers

oh okay.

I mean it's okay when AMD sponsored games have Nvidia and AMD cards neck and neck, going blow for blow, with Nvidia either beating it out or losing by 2%, but Nvidia sponsored games always run like shit on both cards yet always favor Nvidia cards by a large margin.

I wonder why
>>
>>54901687
>Intel will revoke AMD's x86 license if they're bought by any company.
Intel is required under the revision to the agreement that they struck with AMD to negotiate in good faith with any company that buys out AMD for an x86 license.
>>
>>54902056
>Intel
>good faith

>ONE THOUSAND AND SEVEN HUNDRED US DOLLARS
>>
File: 1422150698675.jpg (12KB, 259x200px)
>>54902017
>I wonder why

Because amd are just plain shit. Don't blame anyone but your retarded self for purposely buying hardware which you know will be trash compared to the competition. Whether that's due to gimping or not is irrelevant.
>>
>>54902083
The FTC reserved the right to define the term however they wished. The gist of it is that intel *has to* grant the same terms AMD is currently operating under to anyone who would buy them. AMD is never going to be bought out so its a moot point.
>>
>>54900084
are you me?
>>
>>54902093
>Whether that's due to gimping or not is irrelevant.
>gimping is irrelevant
>intentionally being a shitbag to the competition by underhanded tactics does not matter

I hope Nvidia's pay is good

>Don't blame anyone but your retarded self for purposely buying hardware which you know will be trash compared to the competition.

My GPU does fine for what I want to do. I don't just play vidya with my card senpai; Nvidia sucks at non-CUDA CAE shit because of their dumbass abstraction layer.
>>
>>54898669
>Also, the scaling is fucking shit, going from ~40FPS to ~60FPS, around 50% higher FPS for 100% extra money. Amazing.
Well my math could be a bit rusty, but a 50% performance boost for 100% more money sounds better than a 45% boost for 200% more money.
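The arithmetic checks out, and it's easy to sanity-check as perf-per-dollar. A minimal sketch; the percentages are just the rough figures quoted in the post, not measured benchmarks, and the function name is made up for illustration:

```python
# Toy price/performance comparison for the two upgrade paths above.
# Baseline = one card at its launch price (perf 1.0, cost 1.0).

def perf_per_dollar(perf_gain, price_multiplier):
    """Relative performance divided by relative cost (baseline = 1.0)."""
    return (1.0 + perf_gain) / price_multiplier

crossfire = perf_per_dollar(0.50, 2.0)  # +50% perf for 100% more money
flagship = perf_per_dollar(0.45, 3.0)   # +45% perf for 200% more money

print(round(crossfire, 3))  # 0.75
print(round(flagship, 3))   # 0.483
```

So on the quoted numbers, the 100%-more-money option delivers roughly half again the performance per dollar of the 200%-more-money one.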
>>
>>54902149
>read about allegations of company A possibly messing over company B resulting in an overall worse product by company B
>still proceed to buy product from company B
>blames company A every time something goes wrong
>i-it's still not my fault though
>i-it was $10 cheaper

No wonder you have such levels of buyers remorse

>le cuda
No one cares about what you do. I use CUDA in premiere pro but I don't keep babbling about it. This thread is about gaming performance as seen in OP's picture.
>>
>>54900233

It might be one of the games that are a bit ahead of the curve on actually implementing features, but that doesn't mean the benchmark is unbiased and it doesn't mean they've done a good, fair job of implementing those features.

Don't forget, this is the same company that were the pilot devs / made the pilot engine for AMD's Mantle. If your memory is too short for what happened there, I'll tell you: There was gargantuan hype for AMD cards once Oxide's sponsored Star Swarm 'benchmarks' hit, because it showed ridiculous gains for Mantle over both AMD's and Nvidia's DX11 results. And then a week later, it became obvious that they were essentially 'cheating' by simply not optimising any of the DX11 codepaths. How do we know? Because Nvidia fixed all the issues they could using their driver, and their DX11 performance was better than AMD's Mantle performance.

Now we're seeing the same sponsored shenanigans all over again, except you can't optimise someone else's game using drivers with DX12 (it's a low level API). In short, the benchmark is as close to a directly AMD created & controlled benchmark as you can get and still put someone else's name on it.

It just sucks that there isn't a nice suite of unaffiliated games using DX12 - But that lack doesn't mean Ashes is any better of a benchmark, just like you wouldn't treat manufacturer benchmarks as gospel just because 3rd party reviewers haven't had a chance to run their own.
>>
Nvidia knows they can't compete with AMD on hardware muscle alone so they've tried to corner the market through the software side. Philosophically they've been angling for this strategy for a long time, starting with PhysX. Gameworks is the latest extension of this philosophy and due to its closed nature is a blight on video games. Low level APIs are the cure to this cancer and Nvidia has every right to be scared as their GPUs are designed solely to take advantage of the outdated garbage that is DX11.
>>
>>54902269
Is hitman 2016 the same story then?
>>
>>54901828

And then once you filter those results for modern (by which I mean pretty much anything DX11) gaming-class GPUs, you'll get much closer to that 10 vs 90% figure the other guy mentioned.

Intel's integrated GPUs are larger than AMD's and Nvidia's combined, yet most measures still count AMD's APU GPUs for some crazy reason. I also wouldn't be surprised if there was a fair amount of crappy low/mid end laptop GPUs that are also AMD's, but way below modern game minimum requirements.
>>
>>54902269
The difference is that low level APIs put the power in the devs' hands, where the power should be. The GPU developer should not have so much "power" to improve or fuck up a game by "optimizing code paths" or not. It's a conflict of interest. DX12 and Vulkan level the playing field by creating a situation in which the best card should win.
>>
>>54902255
>buyers remorse
I don't

>Gameworks not fucking up game across the board
>wanting 5 more FPS and supporting a company that wants to monopolize and charge your more money for their kikery when if you adjust your shaders ingame instead of just putting shit on MAX you get the same performance

Tested shit myself with with my old GTX 480 vs a 2nd hand used Radeon 5870.

You get the same shit when you adjust shaders on both cards, ESPECIALLY AA.

Got rid of the cards so I can't test them on modern titles (this was back in 2014) but Nvidia titles tended to suck ass on both cards while AMD titles had the cards neck and neck.
>>
>>54902284

To be honest, it seems like the devs have had a bit of an issue with DX12. I haven't read any recent benchmarks with everything patched, but to quote Guru3D:

"If you try DX12 mode on a GeForce, once you start the game or benchmark simply wait ... it can take 5 minutes before the bechmark sequence or the game itself starts. We have to say that we are incredible dissapointed about the quality control on DX12 from the developper. These things just shouldn't happen. Very sloppy."

So to me it sounds like a dev side issue, especially considering how DX12 is so reliant on fine detail optimisation from the devs. Quite possibly they also preferred to optimise AMD's cards over Nvidia's. Or perhaps there is some sort of Nvidia side bug. Who knows? - That's why I mentioned that a whole suite of reasonable benchmarks would be ideal.
>>
>>54902322
Developers loves asking for help though. And Nvidia loves giving the help for free, along with some free cards.

Nvidia strives to provide the developers and customers the best user experience, the path of least resistance.

AMD on the other hand doesn't even answer emails unless the developers are making their own sponsored titles.
>>
File: 1460190436396.png (107KB, 500x600px)
>>54902334
>Nvidia titles tended to suck ass on both cards = the same

>AMD titles had the cards neck and neck = the same

Ok so no difference then. Nice to know thanks for confirming.

I'm going to sleep. Stay mad amdrone and enjoy that buyers remorse :)
>>
>>54902353
It could be a hardware bug as easily as a software bug in that case, either through the game or through Nvidia's drivers + cards.

Dx12 doesn't magically remove all need for drivers, it just removes a lot of the overhead and cruft required to translate calls.
>>
>>54902353
Nvidia doesn't have the low level hardware support for two reasons.

Lack of Async which everyone already says but also

The abstraction layer setup the Nvidia cards have.

Because they locked down their cards to keep CUDA to themselves, you need to code in CUDA at the low level to get better performance. Devs won't go through the trouble and will just code for DX12, not bothering with Nvidia's extra bullshit.

So you'll end up with no change in results switching from DX11 to DX12 on the Nvidia side because it's so locked down.
>>
>>54902387
>Ok so no difference then. Nice to know thanks for confirming.
>shitty on one branded title, good on the other
>no difference
k bud
>>
>>54902322

I disagree, although I think you've got it somewhat backwards.

"The GPU developer should not have so much "power" to improve or fuck up a game by "optimizing code paths" or not."

That's pretty much the current state with DX11. The GPU companies can't fuck up games by optimising code paths. The only ones where they can do that are sponsored games, and even then they have limited power to do so because of the leeway that DX11 drivers have to try and overcome that. In essence, 'the best card+drivers will win'.

With DX12, pretty much whatever the developers do, goes. So if you have a biased or sponsored developer, the other GPU company is screwed. Or if the developer just does a rubbish job at one side. Or if they just go for the largest market share. Or whatever. None of those say 'the best card will win' to me.

To dumb it down a bit, the situation we have now is where performance is (to a certain degree) ensured for both sides because both companies are going to make sure the game is optimised for their cards via their own drivers - It's in their interests to do so. However, if you take away the ability to tweak performance via the drivers, then you're left with trying to tweak performance by influencing the devs. And I don't think anyone wants that.
>>
>>54902387
>Nvidia titles tended to suck ass on both cards = the same

It's like you can't read.

I said they sucked on both cards but the NVidia cards had a huge lead on NVidia titles rather than just a marginal lead between each card on an AMD Title.

Is English your 2nd language or something?
>>
>>54902387
>So butthurt he makes a meme image
>>
>>54902403
with dx12 theoretically the performance should be identical since the codepaths would be identical.

whoever has the best, most compatible hardware wins.

if the devs fuck up the codepath such that it makes the game run like shit well, that's on them.
>>
>>54902403
I disagree with that. Apparently software devs love the fact they can get down into the hardware and get to do exactly what they want and give the artists more of an artistic budget to do what they want.

It only sucks when there are either bad ports or Nvidia sponsored, even worse when it's both.
>>
>>54902390

Oh come on. You don't seriously think that Maxwell just has some sort of 'hardware bug' for all of DX12? And that Nvidia haven't figured out a way to get around it? Come on man, that's just a bit silly. Furthermore, if that were the case, we'd be seeing that issue promptly resolved for Pascal, and yet the optimisation for Pascal looks just as bad as it does for Maxwell, and that's a brand new Arch (where bad optimisation, especially on a low level API, is expected).

And of course it doesn't remove the need for drivers, but it doesn't just remove overhead. Whereas in DX11 the driver automatically picked how to translate certain commands and calls, sometimes doing something else entirely if the devs thought it was more performant, in DX12 the devs have a hell of a lot more manual labour to do, a hell of a lot more to specify. It's the difference between saying:

DX11: "Pick up that can."
DX12: "Bend your /lower/ back over 70°. Extend your /right/ arm 50cm forward. Open your /right/ hand. Grip the can, ID /#24512512/, with a /3/ second clench at /1/ Newton of force."

And so on. Being so specific can improve performance a whole bunch, but if one card has a shitty right hand and the other has a shitty left hand, you can see the above example would be much easier for only one of them.
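To make the division of labour concrete, here's a toy Python model of the same idea. None of this is real Direct3D; every class and method name below is invented purely to illustrate who does the bookkeeping (the driver in the DX11 style, the application in the DX12 style):

```python
# Same draw call, modelled two ways: an implicit "DX11-style" driver
# that fills in pipeline/resource details itself, versus an explicit
# "DX12-style" command list where the application spells out each step.

class Dx11StyleDriver:
    """Driver translates one high-level call into several GPU steps."""
    def draw(self, mesh):
        # The driver picks these details on the app's behalf.
        return ["bind_pipeline(auto)",
                "transition_resource(%s, auto)" % mesh,
                "draw(%s)" % mesh]

class Dx12StyleCommandList:
    """The application records every step explicitly."""
    def __init__(self):
        self.commands = []
    def set_pipeline(self, pso):
        self.commands.append("bind_pipeline(%s)" % pso)
    def barrier(self, resource, before, after):
        self.commands.append(
            "transition_resource(%s, %s->%s)" % (resource, before, after))
    def draw(self, mesh):
        self.commands.append("draw(%s)" % mesh)

# Implicit path: one call, driver-chosen details.
implicit = Dx11StyleDriver().draw("can")

# Explicit path: the app states pipeline state and resource transitions.
explicit = Dx12StyleCommandList()
explicit.set_pipeline("my_pso")
explicit.barrier("can", "copy_dest", "vertex_buffer")
explicit.draw("can")

print(implicit)
print(explicit.commands)
```

Both end in the same `draw(can)`, but in the explicit version every intermediate decision ("auto" in the implicit one) is the application's responsibility, which is exactly why a low level codepath tuned around one vendor's quirks can leave the other vendor's driver with nothing to fix.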
>>
I just want to see how it stacks up next to the 1070. I mean, I could get two of these for about the same price, but I'd need a new power supply and I've heard the horror stories about crossfire.
>>
>>54902403
I see your point but I'd rather have the devs be held accountable for making good software than have the hardware company scrambling to unbreak their game on release day.
>>
>>54902491
>Oh come on. You don't seriously think that Maxwell just has some sort of 'hardware bug' for all of DX12? And that Nvidia haven't figured out a way to get around it? Come on man, that's just a bit silly. Furthermore, if that were the case, we'd be seeing that issue promptly resolved for Pascal, and yet the optimisation for Pascal looks just as bad as it does for Maxwell, and that's a brand new Arch

All this points to is that either they DID fuck it up and tried to cover it with drivers (poorly), or Pascal is literally just Maxwell shrunk and that's why the problem stays.

And maybe companies that make shitty hands should fucking adapt or die.
>>
File: GTAV_3840x2160_PLOT.png (133KB, 900x500px)
>>54902497
It's $250 each for the 8GB versions, and that's only the price of reference cards. If you don't want housefires, get ready to pay at least $300 for Sapphire cards.

Why do all of that when you can just buy a single $379 150w 1070 and overclock it to 1080 levels?
>>
>>54902538
>Pascal is literally just Maxwell shrunk and that's why the problem stays.

It isn't, they did change the architecture (it's similar to GCN), it just misses async shaders and still has a locked-down abstraction layer for Nvidia-specific shit that may not be used soon.

I mean seriously, they have a CUDA version of fucking python
>>
>people unironically recommending xfire
Just ruck my shit up fampai
>>
>>54902564
please show me a $379 1070.
>>
File: cat_boxing.jpg (41KB, 628x676px)
Just sold my 380 for $170, did I do good /g/?
>>
>>54902396

I'm sorry, but I think you're a bit ignorant here.

Game devs don't need to use CUDA at all. Whatsoever. Your point might have some merit when arguing between CUDA and OpenCL, but if developers wanted to use either one of those languages, well they can just go right ahead and use OpenCL.

>>54902472

That's not how it works.

Developers write software to take advantage of the latest hardware features (and I mean features as in just plain old Arch style as well as big name features they put on the front of the box).

Hardware developers don't change their hardware to suit individual codepaths in games. That would be madness.

And since GCN & Maxwell/Pascal are so different at the hardware level, if a dev really wanted to get the best perf, they would write two separate codepaths, each optimised for its respective Arch. This probably won't happen much because it costs money, so they'll probably try and take a blind guess at making a single codepath that's equally friendly to both Arches, and end up with either good optimisation for just one Arch, or poor optimisation for both.
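If a dev did spring for separate codepaths, the dispatch could look something like this sketch. The vendor strings and renderer functions are hypothetical, just to show the shape of it:

```python
# Sketch of per-architecture codepaths with a generic fallback.

def render_gcn(scene):
    return f"GCN path: {scene} (tuned for GCN)"

def render_maxwell(scene):
    return f"Maxwell path: {scene} (tuned for Maxwell)"

def render_generic(scene):
    return f"generic path: {scene} (compromise settings)"

CODEPATHS = {
    "amd_gcn": render_gcn,
    "nv_maxwell": render_maxwell,
}

def render(detected_arch, scene):
    # Unknown hardware falls back to the one-size-fits-all path -
    # the "blind guess" codepath most games will actually ship.
    return CODEPATHS.get(detected_arch, render_generic)(scene)

print(render("amd_gcn", "frame_0"))
print(render("unknown_arch", "frame_0"))
```

The expensive part isn't the dispatch, it's writing and maintaining a tuned renderer per arch - which is why most devs will ship only the generic one.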
>>
>>54902595
I don't think people are actually recommending CF, it was just sexier for AMD to show a scenario where their new Polaris beats a 1080, however contrived the situation was.
>>
>>54902606
go ask sqt
>>
>>54902647
Kek
>>
>>54902625
>Game devs don't need to use CUDA at all.

I'm not saying they have to, I'm saying the way Nvidia cards work, if you need extra performance you need to use CUDA code. This is why you don't see any change from DX11 to DX12 on Nvidia cards.

Their abstraction layer is like a doorway that only lets CUDA code through to get lower-level access. AMD cards, being as open as they have been since TeraScale, have no such problem, especially with GCN (TeraScale isn't supported anyway).
>>
>>54902482

Of course devs like it - Where did I say that they don't? They like the fact that they can make their game run a hell of a lot smoother, because if they do the optimisation in-house rather than relying on a driver, obviously it'll be much better. That doesn't contradict what I was pointing out about the change in the balance of power (in terms of who ends up with good optimisations and who doesn't) towards the dev - in fact, it supports it.

Just consider that in DX12 there's a much stronger link between small choices in the programming (of which there are many in DX12) and specific properties of the GPU it runs on. To use a simplified hypothetical example: in DX11 you might expect cache handling to be managed by the driver. In DX12 you might have to specify it. But if GCN runs best with 32KB cache tiles and Maxwell really prefers 48KB, then which do you choose? The best answer is 'both, with different codepaths' - but I highly doubt that'll be the norm. Instead you'll have developers who are already struggling with one codepath, and when there's 10,000 little choices like that to deal with, I'm sure the answer will quickly end up being whichever is easiest to deal with, rather than whichever is fairest or fastest for everyone.

And I'm sorry, but it's pretty ridiculously obvious that you're either biased or ignorant if you're only pointing out games that are sponsored by Nvidia as an issue. AMD are perfectly capable of having their own sponsored games and devs, even if in recent years they've been outdone by Nvidia and their gameworks program.

>>54902538

Why would they be utterly stupid enough to make an Arch that has the exact same crippling flaw as their previous one? That's madness.

Also, you completely missed the point. The idea was that both companies have their pros and cons. Just pretend that I said "really fucking awesome" instead.
>>
>>54902646
Having tried CF and Sli, I wouldn't wish that shit upon my worst enemy
>>
>>54902692
>if you're only pointing out games that are sponsored by Nvidia as an issue.

I'm not. I said it's when they're bad ports - games made for DX11 initially, then quickly ported to DX12 so they can say they're the "first" or "one of the first".

When they do have Nvidia support you see a drop in performance on both cards in DX12, which is the opposite of what it's supposed to do (as in Rise of the Tomb Raider, whose quick Nvidia-sponsored DX12 port made performance worse on both cards).

It isn't a bias if it's observable
>>
>>54902625

Again, no. You can just use OpenCL and still get better performance than on AMD if you like.

Not to mention, CUDA remains exactly the same whether you're using DX11 or DX12. You'd use the exact same commands to do the exact same work whether you're running the game in DX12 or in DX11. After all, DX11/12 are for the graphics pipeline, whereas CUDA/OpenCL are for compute.

Furthermore, AMD isn't more open than Nvidia in pretty much any regard whatsoever, aside from of course that the only compute language they support is OpenCL, which is someone else's (open) language, but that of course is moot because both companies support it. Both companies grant low level access - it's in their own best interests to do so, in order to get the best optimisation.
>>
File: pikachu-crying.jpg (35KB, 628x340px)
>AMD might finally take back the crown
>Even if they do I'll have to stay with Nvidia because of the Adobe exclusivity
>>
>>54902692
>>54902729

Here is my example of what I'm talking about

980ti Performance DX11 vs DX12
https://www.youtube.com/watch?v=g2nL5QrytNY

Both side by side
https://www.youtube.com/watch?v=bAOLxchfxHE

Fury X performance DX11 vs DX12
https://www.youtube.com/watch?v=NEA5dwW9htc
>>
>>54902765
my bad, it's 980 vs Fury X side by side but you'll get what I'm talking about
>>
>>54902729

First of all, this is what you said:

"It only sucks when there are either bad ports or Nvidia sponsored, even worse when it's both."

The obvious implication there being that AMD doesn't sponsor games, or that when they do, it (or their ports) doesn't suck. My point is that's BS. It goes both ways.

You talk as if there's enough of a population size out there to talk in general terms, "When they do have Nvidia support"

Who's "they"? There's literally one DX12 game that I can think of with Nvidia support, and that's the only one you mentioned.

3 games, one of which is biased and another apparently not so well done, don't make a good observation sample.

I mean, the fact that perf decreases at all on any card is a massive red flag, that much should be obvious. There's no way in hell that's normal - especially given how close Nvidia was with the DX12 dev team throughout development. DX12 was originally demo'd on Kepler. When it launched, there were 3 (I think) dev blog posts about the two launching practically side by side. There wasn't a single mention of GCN throughout the whole thing. Not to mention the billions of dollars in R&D from the two of them together.

So in short, I think it's useless to try and make generalisations with such a limited sample size, especially when it's such early days for DX12 and all the (3? 4?) devs are basically the first to experiment with such a new technology.
>>
>>54902743
I'm not saying CUDA is the problem, I'm saying Nvidia's abstraction layer closes the card down and only lets CUDA-specific code through.

It sits either above the drivers or the firmware, but it's been known to be a headache for developers for years if they don't use Nvidia GameWorks to work around it.
>>
>>54902802
>There's literally one DX12 game that I can think of with Nvidia support

Quantum Break, Gears of War and the DX12 release of Rise of the Tomb Raider.

They were all broken on release for the PC on whatever card you used.

Also not sure about Quantum Break but GOW and Rise of the Tomb Raider were both bad ports and had Nvidia Gameworks in them for their DX12 releases.
>>
>>54902808

What you are saying is that somehow DX12 exacerbates the issue and that the two together explain Nvidia's DX12 woes.

Except that's bullshit, because switching to DX12 does absolutely nothing to change that abstraction layer or CUDA whatsoever. Nothing changes in that regard, so how could it be responsible for negative changes in performance?

Furthermore, you keep spouting abstraction layer like a buzzword, seemingly with no idea of what it actually does. Bottom line, if it were an issue, they'd use OpenCL. In reality however, CUDA is actually pretty great, which explains why people go to the bother of using it at all, let alone buying Nvidia hardware specifically for it, when OpenCL is an alternative option.

>>54902765
>>54902775

See:

>>54902802
>>
>>54902861

GOW is unfair. It's a DX9 port for christ's sake, it was just fully broken altogether. If the roles were reversed, no AMD fan would accept that as a relevant benchmark.

I wasn't even aware that QB had DX12 support at all - And I can't find any benchmarks, so perhaps you could oblige. Either way though, it seems like it too is completely broken, even aside from DX12 or anything to do with Nvidia: http://www.extremetech.com/gaming/226261-quantum-break-on-pc-is-classically-broken

In short, I'm still not seeing how a grand total of 5 games, all of which are completely broken or biased, makes a good benchmark suite.
>>
>>54902872
I'm not talking about CUDA, I'm talking about the CUDA Specific abstraction layer.

The hardware is locked down due to their shitty Abstraction Layer over either the hardware itself or the firmware or drivers.

>Because the GPU is often presented as a C-like abstraction (e.g., Nvidia's CUDA), little is known about the characteristics of the GPU's architecture beyond what the manufacturer has documented.

http://www.eecg.toronto.edu/~myrto/gpuarch-ispass2010.pdf

It's a pretty interesting read on Nvidia's ALU architecture. I suggest you read up on it, but basically Nvidia's hardware is locked down and the only extra functions that work are CUDA-specific. The functions in DX12 that try to get to the firmware itself cannot get there, so you're left with DX11 performance or worse.

Nvidia has had a lot of hands in the development of both DX12 and Vulkan but still keeps its hardware locked down. Why? I have no idea, maybe they want to keep CUDA to themselves so they can dominate the CUDA-specific code market or whatever.


AGAIN, I am not talking about CUDA itself, but rather how Nvidia has locked down their architecture against anything that isn't made by Nvidia for ONLY Nvidia.
>>
>>54902106
Moot doesn't mean what you think it means, faggot dipshit.
>>
>>54903071
Who is moot?
>>
>>54903090
a faggot
>>
>>54902983

>CUDA Specific abstraction layer.

You've said it right there. It's CUDA Specific. Just because they've abstracted how CUDA specifically is implemented on the GPU doesn't mean you're missing anything that you would get on an AMD GPU, which doesn't even support it in the first place.

Furthermore, you're taking the quote out of context. CUDA was just taken as an example of one abstracted presentation of the GPU. If you use OpenCL instead, then a whole different presentation will be given to you, and since both companies support pretty much only OpenCL as an alternative to CUDA, developers are welcome to use that in order to glean information about the card if they wish.

...... Or, /obviously/ they will just freakin' ask the GPU company what they need to know. Nvidia is famous for working hand in hand with devs in order to get exactly this kind of shit sorted out. All of that is under NDA as you would expect, as opposed to freely available for public publications like the one you linked. AMD have the exact same setup going.

I don't know why you're so fucking paranoid about CUDA or its abstraction layer locking the card down, when obviously all that's happening is that CUDA doesn't give you an all-encompassing, hyper detailed breakdown of the GPU, and you don't have to use it in the first place (at which point it HAS NO EFFECT ON ANYTHING). Yes, there's an abstraction layer involved. Protip: AMD isn't any better in this regard, and in fact they don't even have a 2nd choice like CUDA in the first place!
>>
>>54902497
The 8GB RX 480 will be $230 and performs about on par with a 390X/GTX 980.

1070 reviews are already out and it's basically a 980 Ti with worse overclocking. The 980 Ti is 20-25% faster than a 980/390X in actual games, depending on resolution and title.

The RX 480 is a ~100 watt card, so even a 500W PSU should handle CF.
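To put rough numbers on that PSU claim - using the ~100 W per-card figure from this post; every other wattage below is a ballpark guess, not a measured draw:

```python
# Rough power budget for a CrossFire RX 480 build on a 500 W PSU.
# Uses the ~100 W per-card figure claimed above; the rest are
# ballpark TDP-style guesses for a typical build.

parts_watts = {
    "rx480_pair": 2 * 100,     # two cards at the claimed ~100 W each
    "cpu": 95,                 # typical quad-core desktop TDP
    "board_ram_storage": 60,   # motherboard, RAM, drives
    "fans_misc": 20,
}

total = sum(parts_watts.values())
headroom = 500 - total
print(f"{total} W total, {headroom} W headroom")  # 375 W total, 125 W headroom
```

So with those numbers a 500 W unit has margin, though overclocking or a hungrier CPU eats into it fast.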
>>
Would it be retarded to buy this at launch?

I'm thinking I can still scam a few people in my city and sell my 280x.
>>
>>54903255
I think if I see you in the store buying AMD merch I will fucking ko you, giving you Parkinson's disease caused by brain trauma.

RIP Ali
>>
http://www.hardocp.com/article/2016/05/27/from_ati_to_amd_back_journey_in_futility/

>In the simplest terms AMD has created a product that runs hotter and slower than its competition's new architecture by a potentially significant margin.

>Buying AYYMD HOUSEFIRES
>any year
>>
>>54903149
>I don't know why you're so fucking paranoid about CUDA

I'm not paranoid, I'm saying that because Nvidia locks the card down the way they do, they see no improvement in DX12.

I don't get why you don't understand that and keep focusing on the fact that I said it was CUDA-specific.

>AMD isn't any better in this regard, and in fact they don't even have a 2nd choice like CUDA in the first place!

They cannot support CUDA because no one knows how CUDA works other than Nvidia. It isn't by choice that AMD doesn't support CUDA, they just cannot.


Let me explain what I've been trying to say.

AMD cards work by going through this flowchart.

Hardware > Firmware > Assembler > Kernel > OS > Drivers > Software

This is pretty open because there's nothing keeping you out of the hardware at any level, and if someone is so inclined they can write their own firmware.

NVidia however works like.

Hardware > CUDA Specific and Nvidia Proprietary Abstraction Layer > Firmware > Assembler > Kernel > OS > Drivers > Software

So you cannot get to the hardware itself, and the way its ALUs are set up, you will never know exactly how CUDA works.

I don't get how you don't understand this, it's simple shit.
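Here's that claim as a toy model. To be clear, this just encodes what this post asserts (a CUDA-only gatekeeper layer in the stack) - it's not verified Nvidia behaviour, and the layer names are taken straight from the flowcharts above:

```python
# Toy model of the two stacks described above. This models the
# post's CLAIM about a CUDA-only gatekeeper, nothing more.

AMD_STACK = ["software", "drivers", "os", "kernel", "assembler",
             "firmware", "hardware"]

NV_STACK = ["software", "drivers", "os", "kernel", "assembler",
            "firmware", "nv_abstraction_layer", "hardware"]

def can_reach_hardware(stack, request_kind):
    for layer in stack:
        # The claimed gatekeeper: only CUDA-tagged requests get past it.
        if layer == "nv_abstraction_layer" and request_kind != "cuda":
            return False
    return True

print(can_reach_hardware(AMD_STACK, "dx12"))  # True
print(can_reach_hardware(NV_STACK, "dx12"))   # False, under this claim
print(can_reach_hardware(NV_STACK, "cuda"))   # True
```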
>>
>>54903285
Don't worry, I usually recognize people with an extra chromosome from a mile away. I bring my repellent for those.
>>
NVIDIA are all about progress and innovation. AMD niggers all about poorfags and housefires.

There, I made it simple.
>>
>>54903307
Classic meme response.
>>
>>54897760
mainstream crown was 970 my friend
>>
File: 1464826170917.png (29KB, 500x275px) Image search: [Google]
1464826170917.png
29KB, 500x275px
>>54903446
(you)
>>
https://hardforum.com/threads/from-ati-to-amd-back-to-ati-a-journey-in-futility-h.1900681/page-31#post-1042333971

>My sources at AMD tell me that the Polaris 10 is coming up very short in the clocks department.

IT'S OVER, AYYMD IS FINISHED & BANKRUPT.
>>
>>54903537
reported
>>
>>54902621
Not bad, senpai.
>>
>>54901836

This is bait right?



This is a 4chan archive - all of the content originated from that site.