What graphics card do you own?
R9 390x. Btw, anyone with a 390/390x still getting fucking abysmal performance in Fallout 4 even with the new drivers? I've forced tessellation and god rays off, but the game still runs like shit. CPU is i5 4590 and I'm playing at 1080p.
>mfw my ancient 5850 is still going strong and the Crimson drivers just made it even better
literally why buy an expensive gpu
I put it down to medium. Still runs like ass. One thing I noticed is the GPU core clock just randomly downclocks to well below 1000MHz, even during what I'd consider GPU intensive areas.
Except this does not happen in any other game. For most recent games the GPU will know to stay at 1000+ MHz, but somehow in Fallout 4 it always stays at around 800MHz and drops down to 500MHz.
/g/ doesn't like that picture (which I posted) because it makes Nvidia look bad.
Which is a classic sign of a cpu bottleneck.
I tried Fallout 4 with my 970 meme card I keep as a backup and the problem suddenly disappears. Also there are people with worse CPUs than the i5-4590 who are playing without problems.
>What graphics card do you own?
Used to play on a 7670M
then my laptop died and now I use a netbook with an Nvidia ION LE
Building a PC soon and I'm pretty sure I'll get a Sapphire Vapor-X 280X cause the 380X is too expensive
Unless those AMD price cuts reach my country
A 5GHz sandy vagina chip is going to be close to a Skylake i5.
If you can cool it and the mobo is up to snuff, 5GHz ain't shit for most cpus. The issue is mostly that people do NOT cool it properly or realise the strain it puts on mobos. Sure intel chips use less voltage than the likes of the 9590 (which needs 1.5v for 5GHz), but pulling 1.3v+ is going to flat out murder lower end mobos that support overclocking.
That graph doesn't prove that the CPU is a bottleneck. Bottlenecking means that the CPU is limiting the GPU, preventing it from being 100% utilised (when this happens the CPU will typically be at or close to 100% load). However it's quite possible for a better CPU to give a higher frame rate in certain situations without any bottleneck, simply because each component has different jobs, and the work the CPU is responsible for gets done faster on a better CPU.
It's a difficult thing to explain and for some people to understand.
i5 4690k < 8350
>old intel mobo vs new amd mobo
A cpu in isolation is as fast at potato resolutions as it is at 4k. The real issue with cpu benchmarks - especially in games like fallout - is there is no uniform approach to testing, so there are no cohesive results.
As for this claim-
>when this happens the CPU will typically be at or close to 100% load)
This tends not to be true for vidya, as the enormous amount of abstraction in DX11 in particular means it's very, very difficult to squeeze scaling out of more than 2 threads - 4 is the best you are going to get. So a high core count cpu can be sitting at low total usage but still bottlenecking a gpu.
Equally that's why you see the 5960x and 3970x barely moving away from a 4770k despite having a lot more threads.
I see, I thought that graph was a counter argument to the other poster's sarcastic statement.
I'm not really sure how it all works when it comes to hyperthreading and high (>4) core counts. I would assume that a game with decent multithreading utilisation would load up all the threads to a high amount when bottlenecked, and when multithreading doesn't work for shit it'll only load up 1-4 threads to a high amount. If a game has good enough multithreading support to load 8 threads to 30% I don't see any reason why it couldn't take them to 100% if bottlenecked.
GTX 670. Looking to either get a GTX 970 or maybe even a 950 and pairing it with my 670 to drive 3 displays.
Does anyone know if a 950 can handle a 4k signal just for web browsing and image editing/viewing?
> I don't see any reason why it couldn't take them to 100% if bottlenecked.
In a nutshell the api overhead prohibits such scaling - DX11 is essentially DX9 turbo edition, and that was written when dual core chips were still the latest and greatest. Gamegpu (the site the chart comes from) monitors core loads in their performance reviews and you can see how core/thread 1+2 are slammed but the rest sit nearly idle.
Hyperthreading itself is just clever hardware scheduling, which is why it tends to only give 60% scaling or so compared to having equal amount of physical cores. It is also why for some software hyperthreading can give negative scaling and should be disabled.
Pic related - GTA V hits only 4 threads and no more.
>>Hyperthreading itself is just clever hardware scheduling, which is why it tends to only give 60% scaling or so compared to having equal amount of physical cores. It is also why for some software hyperthreading can give negative scaling and should be disabled.
How can I tell if hyperthreading is working like this, negative scaling
The 270 is (iirc) the 7850 - a 390 is going to be over twice as fast.
There is no hard and fast way to find out, but as a rule for vidya it's worth leaving on, if only because it means resources are available for background tasks - most vidya benchmarks are done on a barebones system. The second you do anything remotely cpu intensive, having the extra cpu resources available means your vidya won't be affected.
That is actually something AMD's 8 core fx chips are really good at - it takes a LOT of multi-tasking to overwhelm them.
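There's no flag you can read for this - the practical way is to time the same CPU-bound job at a few different worker counts on your own box and see where the scaling flattens out (or reverses once you spill from physical cores onto HT siblings). A rough sketch, with workload and counts invented for illustration; it uses processes rather than threads to sidestep Python's GIL:

```python
import time
from concurrent.futures import ProcessPoolExecutor

def burn(n):
    # Pure CPU work, no I/O: sum of squares
    return sum(i * i for i in range(n))

def timed_run(workers, jobs=8, n=200_000):
    # Wall time to finish `jobs` identical tasks with `workers` processes
    start = time.perf_counter()
    with ProcessPoolExecutor(max_workers=workers) as pool:
        list(pool.map(burn, [n] * jobs))
    return time.perf_counter() - start

if __name__ == "__main__":
    # On a 4c/8t chip, expect big gains up to 4 workers and much smaller
    # (sometimes negative) changes from 4 to 8
    for w in (1, 2, 4, 8):
        print(f"{w} workers: {timed_run(w):.2f}s")
```

If 8 workers comes out slower than 4 on your workload, that's the negative scaling case where disabling HT can help.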
> you can see how core/thread 1+2 are slammed but the rest sit nearly idle.
I understand that situation, and in the case of a CPU bottleneck those two cores/threads would be close to 100% load, right?
Now, if the game manages to load up more than 4 cores/threads I still don't see how a low load percentage could be considered bottlenecking. Sure, it won't get to exactly 100% but it's going to be close to it. I just don't get how a game would be able to support more than 4 cores/threads to an extent that it could load them to say 30% but then can't go any higher than that, resulting in a bottleneck.
>Build a rig.
>After getting it I had a better idea for a better rig with slightly more cost.
Did I just become an upgrade addict?
[spoiler]wanted atx 2xfury x instead of mITX 980ti[/spoiler]
card is the sapphire nitro 290 tri x
Hoping to get a 1440 monitor on cyber monday desu senpai
Okay, but is that showing a bottleneck? I don't think so.
We'll take the 5960x for example. Imagine we managed to load up a machine with such graphical power and that game that could make use of it, and the 5960x became a bottleneck, what would the load look like across all those threads? Going by the load level in that chart and assuming there's no background processes running that could be contributing significantly to that load then I would guess that we'd see close to 100% load across all the threads.
On the other hand if that chart showed all but 4 of the threads at like 10% or less then in the case of a bottleneck I would expect only those 4 threads to be at high load. Now, I don't know any of this for certain but it seems the most logical to me.
>got a 980 classified for 355 on jewbay 3 weeks ago
Even assuming the software in question can scale to 18 threads, in an ideal world you want your cpu to be doing as little work as possible to get a task done while you want your gpu to be going flat out.
number of phases is something meme overclockers talk about when they think they're relevant. more phases just means less ripple, so more 'stable' power juicing the cpu, and because the load is spread out over more stages, more efficient heat dissipation
you don't need more than 6 (and 6 phase is a fucking meme as it is) unless you're on LN2 pushing for absurd clocks solely for competing - anyone who tells you otherwise is a fucking gaymer retard who thinks their clocked cpu is a bragging right
A gpu can't compute shit until the cpu tells it to, so if the cpu is being slammed doing its own shit it is going to pull gpu performance down. A gpu going flat out just means your game (as an example) is going to be as flashy as it's going to be - the only way to make it run better is to add more gpu horsepower.
This is the whole point of DX12/Mantle/Vulkan - to remove software abstraction that is slowing down cpus so they can better feed gpus data.
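All of which boils down to a toy frame-time model (numbers invented purely for illustration): the cpu and gpu work as a pipeline, the gpu can't start a frame until the cpu submits it, so whichever stage is slower caps the frame rate.

```python
def fps(cpu_ms, gpu_ms):
    # CPU and GPU pipelined per frame: the slower stage bounds throughput
    return 1000.0 / max(cpu_ms, gpu_ms)

# Hypothetical: GPU renders a frame in 10ms but the CPU needs 14ms to feed it
print(round(fps(14, 10)))  # 71 fps - CPU bound
print(round(fps(14, 5)))   # still 71 fps - a faster GPU buys you nothing
print(round(fps(7, 10)))   # 100 fps - CPU out of the way, GPU is now the limit
```

A lower api overhead effectively shrinks `cpu_ms`, which is exactly how DX12/Mantle/Vulkan raise frame rates without touching the gpu.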
use them eyes bruh. it took me literally 30 seconds to find those
If I grab the hot jet Sapphire R9 290 to play 720p at full settings, is it gonna heat up like crazy?
And does AMD have decent support for older games or am I going to get compatibility errors?
$200 for this seems like there's a catch, especially since its comparable to the GTX970 or R9 390
R9 290, and from the graph you posted it means that it is on par with a GTX980 for half the price.
4gb is not the limiting factor at 4k, raw gpu grunt is. See the 295x2 on this chart.
Before anyone herps and derps vram doesn't stack in crossfire or sli so the 295x2 functions as two 290x cores using 4gb of vram.
The titan x has 12gb of vram simply because Nvidia could do it, nothing more.
AMD is supporting DX12 on any GCN card, I don't know the specifics of Nvidia's supported range.
I would do that but I just got the card for cheap and I really cannot be bothered to re sell it and get a 980 ti.
It's overclocked to 1380 base and 1480 boost. Will I be able to run a second card with an equal overclock on a 620w psu?
Pcpartpicker is saying the whole setup will use 537w at max but I'm still a bit sceptical. I don't want to end up frying the whole thing.
A 4.8Ghz 2500k at 1.43v would draw about 195 watts.
5Ghz at 1.45v would draw 210,
5Ghz at 1.52v would draw 230
the formula for power draw is
stock TDP * (target clock / stock clock) * (set voltage / stock voltage)^2
In this case,
95 * (4800 / 3300) * (1.43 / 1.2)^2 ≈ 195
Here's my chip (6-core Intel) clocked to 4.4GHz,
95 * (4400 / 2926) * (1.36250 / 1.005)^2 ≈ 262.5w
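That back-of-envelope formula drops straight into a few lines of Python (function and parameter names are mine; this is the same rough linear-in-clock, quadratic-in-voltage estimate as above, not a measurement):

```python
def oc_power(stock_tdp_w, stock_mhz, target_mhz, stock_v, target_v):
    # Power scales roughly linearly with clock and with the square of voltage
    return stock_tdp_w * (target_mhz / stock_mhz) * (target_v / stock_v) ** 2

# 2500k example from above: 95w TDP, 3.3GHz @ 1.2v stock, pushed to 4.8GHz @ 1.43v
print(round(oc_power(95, 3300, 4800, 1.2, 1.43)))  # ~196w, matching the ~195 above

# the 6-core chip at 4.4GHz
print(round(oc_power(95, 2926, 4400, 1.005, 1.3625), 1))  # ~262.6w
```

Handy for sanity-checking whether a given psu and vrm setup can take an overclock before you try it.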
When is 960ti coming?
>more than 6 phase is a meme
Not when your board is 6 years old and you've been pushing 200+ watts through the CPU socket for almost as long.
Such an old device, relatively, and it never gets above 42C in the hardest of situations. Power delivery is still (almost) as flawless as when it was new.
I highly doubt a board with fewer phases could claim the same.
I bought a 960 ~2 weeks ago the gigabyte 4gb one.
The performance was "ok", that's really all I can say about it. I'm not even sure which card I want to buy now. [spoiler]my 460 768 died[/spoiler]
I think I'm going to build a sff PC
MSI GTX 650 (1gb version), fan died so I got an aftermarket cooler, works for me.
A shitload of factors but in a nutshell GCN is far more powerful for a lot of tasks (not all) than maxwell/kepler are and this comes to the fore at higher resolutions when gpus start to cry for mercy under the loads.
Christ and people say the 9590 is a nuclear reactor.
960 is turbo shite unless you really need h.265 or whatever the fuck it is encoding.
>Doing so would be stupid considering Pascal is about half a year away.
So are they going to release a decently priced card comparable to a theoretical 960ti when pascal is released? A card between $199 and $350 like they don't have now?
I own a MSI 390x.
Been very happy with it. Idles at 28-32c depending on ambient temp, and when gaming in ultra/1080 it never goes above 55c. And people complain about the power consumption with dual monitors, but my entire system idles at under 150 watts, and under load hasn't gone above 370 watts.
"shit" tier 750ti here
I play Starcraft 2 with max graphics at 1080p with no stutter, so whatevs
how on earth did amd manage to release a whole new architecture for the fury series that isn't even 10% better than nvidia's year old offerings?
>980 - 148%
>r9 fury - 157%
knowing nvidia, they'll release cards that are 20-30% more powerful than the new amd fiji cards. this is going to keep happening gen after gen.
>nvidia releases cards 20/30% more powerful than the new amd offerings
>amd releases their new cards a year later which are barely more powerful than the older nvidia cards
and people actually shill for amd? jesus christ.
I don't operate that high 24/7, and the chip seems happy with 1.20625v (actually 1.192 at the CPU) at 4Ghz.
In fact I don't know if it even needs 1.35v to get 4.4Ghz stable, I don't ever run it that hard unless dick waving 6-core benchmarks.
That's still about 180w at 4Ghz (for six cores instead of four) under full load.
In a Xeon 56x0 owners thread people put theirs under water and push out 4.8-5Ghz at 1.4,
It would be drawing something like 305w
I'll be replacing the whole thing with a Zen platform next year though, or maybe the middle of '17, depends on how it performs and where the prices may drop.
Try 4k results.
The Fury absolutely slaughters a 980.
290x, 390x, 980, or Fury
Any less than the 290x (390x is just overclocked - basically) and you sacrifice quality
The 980 costs so little less than a Fury that you may as well get the Fury.
...and for a little bit more money you see a lot of gains in performance. Price vs performance caps out at around the 950/960 point. Beyond that diminishing returns kick in. Once you go beyond a 970/390 every shekel you spend gets you less and less performance.
One. Fucking. Percent.
In one game.
If this number is what you are basing your GPU purchase off of, you are literally retarded. There are a dozen other things to think about.
$100 is your limit?
I seriously urge you to do whatever you can to save an extra $80. The sweet spot of price/performance has always been between $150-$200
380s are now selling for about $170 and they are massively, massively better than a 750Ti
A 750Ti will serve low-end dGPU needs RIGHT NOW, but have almost no longevity for even 2 years from now.
I have a 7850/265/370 and I wouldn't want someone to pay $130 for one. That's only $20 less than I paid for the 265 more than 18 months ago.
It's much better to suggest they plan for and get a 380.
It's actually 1.6% :^)
More than 50% better than you claim!
>It's much better to suggest they plan for and get a 380.
I agree, but price vs performance is purely mathematical and as such low end cards will always sit high on the ratio.
>280 aka 7950 aka tahiti
Forever GOAT you mean.
7950/7970 were THE go to cards for 1440p in 2012.
Still amazing, high performance devices.
Mediocre my dick, you're better off than something like 93% of everyone else going by the steam hardware survey.
if they use the stock pcb then they can usually put out a custom cooler the same day that the card launches
if it has a custom pcb then less than a month and pretty much all major custom coolers are out
Well i took that screenshot of speccy after playing GTA V for two hours. I haven't encountered any heat issues. Sometimes it can get a little loud if I play a more graphically intensive game but nothing I could hear through headphones.
I have a few choices:
>750 Ti SC
>750 Ti 4GB (I want console textures at 30fps)
>Wait for GTX 950 4GB
I can't go AMD because my power limit is 100 watts. Also I play older games that like Nvidia. I'll be playing at 1680x1050 for now, but will get a 1080p screen eventually. I want to be as cheap as possible, is a 750 Ti an acceptable choice for PS4 settings in LITERALLY everything? I have an i5 and 8GB of RAM. I'm a hyperpleb who doesn't mind 30fps and 900p in some games.
It really isn't as massive as you think anon. Hint: 970 cards don't run at the 145w Nvidia claims they do.
how the fuck does the 4790k with ht on score worse than a 3570k?
when I upgraded from a [email protected] to a 4790k I saw like a 5-10 fps jump with my 7970, and I stopped having that issue where shit would despawn when i'd be driving too fast
Hyperthreading can give negative scaling yo - hence why they tested with it on and off.
Which one? I pushed 4.5GHz on my old 8320 (which I killed due to my own stupidity) on a m5a97 le rev2.0. I strapped the stock cpu fan over the un-heatsinked vrms and that shaved over 10c off board temps, allowing the overclock. Couldn't quite squeeze 4.6GHz out of it.
That isn't what the chart is indicating.
That's the one with heatsinks right? A spare fan on the back of the mobo (assuming your case has a cutout) will really help drop board temps, assuming your cpu cooler is up to the task.
XFX 280x OCed to 1120mhz.
I have a 1680x1050 monitor, should I bother upgrading to 1080p? Would my card handle 1440p?
Depends on the brand. My MSI 290 stock gets to 80c~84c in my Corsair 540 airflow case. I can't even OC this bitch. VRM 1 also gets to 8Xc at stock. I even changed the thermal paste on the die and it still gets hot as fuck. This is why my next card is going to be a 980 ti Hybrid.
I wouldn't say the card is "loud" but of course you can hear it at 100% fan speed which is what mine is at and still gets 84c stock.
well i'll take personal experience (multiple times) over some chart.
plus most of my issue was gta keeping the frame rate up, but stopping stuff from spawning.
no gta wasn't running on a hard drive, it was on my fast ssd
They're probably using the same compression tech nvidia uses since their cards have way less vram and games also use less vram on nvidia cards
Delta colour compression I believe it was called
So would it be safe to leave a 390x running for 8 or so hours doing OpenCL rendering? I hear the latest updates destroy Cuda.
One of the sites I read said they can drain up to 300w, and my PC is connected to a UPS - under GPU load around 234w I get close to 25 mins. I live in an area that gets frequent brown outs during the summer and is fairly far in the country.
I am planning to add a 4k 60Hz display to this setup, it's already on its way here.
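For what it's worth, that observed runtime lets you back out the usable battery capacity and estimate what the extra display will cost you. The ~35w monitor figure below is my guess, not from the thread (check the panel's spec sheet), and inverter losses plus battery ageing will shave more off:

```python
def usable_wh(load_w, runtime_min):
    # Infer usable battery capacity from one observed load/runtime point
    return load_w * runtime_min / 60.0

def runtime_at(capacity_wh, load_w):
    # Estimated runtime in minutes at a different load
    return capacity_wh * 60.0 / load_w

cap = usable_wh(234, 25)                 # ~97.5Wh actually deliverable
print(round(runtime_at(cap, 234 + 35)))  # ~22 min with a ~35w 4k monitor added
```

So the extra display costs roughly three minutes of runtime at that load, assuming the guessed monitor draw.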
So the 290 is discontinued right?
Is there any reason to go with a 390 over a 970 then? I mean yea 290 would be my choice if it were still available (had a bad experience once with a third-party so never again) but it's not and I've heard that the 390 is a tiny upgrade over a 290 with far worse power consumption.
As a rule both gpus and psus like a constant, even load. Modern gpus are generally designed to run 24/7 as long as they are properly cooled.
The 390 doesn't suck that much more juice than a 290.
Reference 290, rip the cooler off and replace with something like the raijintek morpheus or pic related. Unbeatable performance for the money.
Fyi this cooler is built to handle best part of 400w.
Custom water cooling is really easy, to me at least. It requires some upkeep and maintenance, but seeing my 290x 8GBs at 52C under a full 100% load while the CPU is stressed as well makes it worthwhile. Requires time, money, and some know how (google). All worth it desu
>I'm just scared my old games won't run on AMD, plus that heat.
Thats paranoia anon. The 380/x are great cards, but the 290 slaps their shit sideways. Remember: a 290 fights a 970.
If this chart is any indication, the 8350.
They don't, but spikes when you are balls-to-the-wall really don't do any favours for shit psus. Hell shit psus don't like spikes in general as they have shitty capacitors.
Go read jonnyguru's "death of a gutless wonder" articles for why a good psu is important.
So I'm that anon that was asking about the upgrading from the 580 to something else, I saw a lot of people posting about the 290 and I had a question...is it actually a significant upgrade if I stick to 1080p?
That is because its 280x tier lol
Yeah lets also ignore the myriad of other 290 models that aren't infernos.
Best bet is to find a reference 290 and buy a kraken and an AIO, or go full custom loop if you even want to overclock. Of course you could just say fuck it and overclock it anyway and just let it run at 100c.
Okay guys I want to switch to AMD and get a 390
My only question is, will my room get hot.
The TDP of the 390 is fuckloads bigger, even if it cools properly. But I live in fucking florida and in an old house at that.
I used to have a gtx 470, THE HOUSEFIRE, and I never want to have a card that hot ever again.
1) the temp difference between a 290 and 290x is basically nonexistent
2) hawaii really will run at 95c for extended periods and give zero fucks.
My card looks like this.
>Just get a 970 with the 150w tdp.
That's a lie
>The 390 tdp is like 300w
Not even the fury x has a TDP of 300w. It has a 275w TDP (and the 980ti TDP is 250w at stock).
It makes virtually no difference.
Stock clocks has it running closer to water cooling.
When I put the fans to 12v (admittedly they aren't the best fans) it will keep the core under 70c when running at 1200mhz and an eye-watering 1.38v.
I've got a 7850 2gb that still runs stuff pretty well. I am tempted to grab an r9 390 for $300 or so, but I would likely need a new PSU.
Feel like I might as well just wait for next year and new cards.
I was considering a R9 290x as well over gtx 970
As far as it goes, the gtx 970 has way more overclocking potential than any r9 290x out there and cooling is incredibly better, so I had to change my mind and I got a 970 which I really don't regret even though I always bought amd stuff :P
>gtx 970 is 260quid in the UK
>290x costs 280
>390 costs 280
>390x costs 320
Like seriously, why does AMD garbage cost more in this shithole of a country?
The only reason to even buy AMD is because you think saving a couple of tenners makes up for 2 years of pain and misery.
I don't see the point in having a good GPU right now.
Consoles are doing their typical "holding the pc back" thing for all new releases of games, and anything popular/worth playing enough to care about performance can be run on a toaster anyway.
Who here bought this card?
Why wont Nvidia produce something like this again?
I know the saying "buy the best card you can afford at the time", but the different tier cards have a large price gap between them in Australia!
A 380/960 4g are around $330-$350 Au
and then a 390/970 is $500+
Is that large price increase really worth it?
nope, according to price performance.
Recently bought a used r9 280 for $100. The cooler was broken so i just slapped an old noctua cpu cooler on there along with some vram heatsinks. It performs like a champ and is whisper quiet under full load.
I have a 6950 that's showing its age, but I'll probably replace the entire PC somewhere next year. Hopefully there will be some great options then. Until then I'll avoid more recent games.
Is this a good deal for the UK?
It's almost maxed out my GPU budget for the next 5 years. Should I wait for Pascal?
My 7970 just died and I have two alternatives
>buy poorfag card and suffer until new gen releases then buy some powerful card + psu
>buy high end card now together with a new psu (my old one is 650w)
>removed the #1 feature that made it good for workstation
>nvidia markets it as a gaymen card
>"it's only shit for gaming in fps/dollar because muh workstation"