Is the 750ti a decent "entry level" gaming pc card?
It's on sale on Newegg for $99 after rebate, and I want to try to build a gaming PC for under $500.
I'm thinking the classic i3/750ti combo. Is that still viable, or should I bite the bullet and pay extra for an i5 and a better card?
I mainly want to play Fallout 4, GTA V, and the new Tomb Raider.
An i3/750ti MIGHT play those games at low settings at ~30fps, assuming 1080p. I would recommend a 960 for true entry level. An i5 would be substantially better, but you'll see smaller gains assuming you keep shadows off and AA to a minimum.
Thank you for the detailed answer. I guess I'll spend the extra, since I'd like to get at least around 45fps on "high" settings at 1080p.
Waiting on dat tax refund before pulling the trigger
Not to be shilling or anything, but I think the R7 265 or 260X is the same price or cheaper and provides better performance. If you're getting a decent enough power supply, it should easily power those cards.
I can't say much for TR or GTA V, but my laptop with a Maxwell 860M (pretty much identical to the 750 Ti) can play FO4 at max settings with a 30fps cap.
It's potent if you turn off really taxing settings like MSAA or AO.
I would say grab it, then save your money up for Pascal or Polaris when you can afford it.
It is, but retards assume that because Nvidia's high end is superior, everything across the entire range of prices must be. That's why it's so important to have the most powerful single card on the market, even if only a tiny fraction of people can actually afford one.
750ti is getting old.
The new Tomb Raider is the only game so far where it performs worse than the console.
And I don't think it will be the last.
But you could still turn down the graphics a notch and play it at 720/900p or turn the textures/shadows down a little.
Better get an R9 380 or a 960.
It's not actually that cheap; the 960 is almost the same price as the 950 and a bit better.
>But you could still turn down the graphics a notch and play it at 720/900p or turn the textures/shadows down a little
He literally says right there in the video that even on the lowest settings the i3/750 Ti combo delivered 13-25fps due to a CPU bottleneck. Can't go below low, bro.
How is he shilling when he is right? Stated the 980ti was the better flagship, which is true. Stated every nvidia card below that costs too much for what you get, which is also true.
Used 270X or 280X for a bit more, and get the Athlon 860K. If you don't get a quad core don't be surprised if some games won't even start.
In the future, consider reading the sticky and utilizing the tools found there, such as "logicalincrements", instead of making a fucking thread for advice.
TL;DR: if it's second-hand, get a 950/960, but the 750 Ti is still a good card.
Owner of this.
>750 ti ftw/acx (basically sc version)
I've played Fallout 4 on this at medium at a stable 60fps, and had amazing success with PCSX2 emulation.
It'll achieve better fps than people claim.
Having said that, if I'd known better I could have bought a 950 or 960 for the same price (second-hand) and gotten better performance.
An i3 4360 is all you need in my opinion; I've never had the CPU at 100% except for folding.
Also don't fall for the "turbo ultra fuckoff-clocked" meme like I did
Even that can't help Nvidia's lack of asynchronous compute in DX12.
>Asks for consumer advice
Here you piece of shit: >>>/v/325661696
Literally fuck off back to where you came from, manchild.
That doesn't matter. Nvidia just pays off the developers to not include asynchronous compute in their games, like they did with Rise of the Tomb Raider.
>Has asynchronous compute on XBone
>Nvidia pays developer
>Developer removes asynchronous compute from PC version
The way it's meant to be played!
This. I've got two 390Xs and my roommate has two 980s, and the additional VRAM of the 390s is a big bonus. The only real downside of the dual 390s is the heat generation/power consumption.
You're better off just living with the iGPU in the Skylake until you have enough cash for the 950.
You're already getting better-than-730 performance from the iGPU, enough to hold you over for a bit.
Dude, I've got an i5 4460 and a GTX 950 with 8GB of RAM, and it runs damn near anything on high to max settings at 1080p with at least 50fps, usually 60. The only thing it has trouble with is The Witcher 3.
random 750ti owner here,
It's an okay card, but don't expect any miracles. It deals with DX10/11 games better than DX9 games. An i3 won't multitask as well, but you can get one with a decent clock speed for much cheaper than an i5 at that speed. It shouldn't be a problem if you just run a game without much in the background.
Okay? And AMD is still missing DX12 features on their current cards too. You have to wait till next generation to get full DX12 support. But right now both cards struggle with DX12, with AMD still coming out behind if the game features any tessellation or if you have a weak CPU. The Division performs pretty badly on AMD cards compared to Nvidia ones, and it's a DX12 title. Not even shilling; Nvidia does some pretty anti-consumer shit, while AMD never actually follows through on their tech.
Implemented in like 6 games.
>but it's used in DX12 and Vulkan
Only a part of either, and AMD no longer supports Vulkan development, while Nvidia actually does.
Whatever, AMD shill. Both companies are shit.
Sure, on price-performance ratio, AMD is king.
But that's been the fucking selling point since the 6-series, hasn't it?
I'd gladly pay more to not have to put up with the inconvenience I've had with AMD. I had Flash and mouse-cursor corruption AT LEAST once a week. Shit was annoying, yo.
Also, most of the high end cards are loud as fuck, and run hotter than the surface of the sun. I'm OCD as fuck when it comes to noise.
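On the price-performance point: it's just fps per dollar, which is easy to sanity-check yourself. A quick sketch (the card names, prices, and fps figures below are placeholders, not real benchmark results):

```python
# Toy fps-per-dollar comparison between two hypothetical cards.
# All prices and fps figures here are made up -- substitute numbers
# from real reviews before drawing any conclusions.

cards = {
    "card_a": {"price_usd": 200, "avg_fps": 60},
    "card_b": {"price_usd": 330, "avg_fps": 75},
}

for name, c in cards.items():
    ratio = c["avg_fps"] / c["price_usd"]
    print(f"{name}: {ratio:.3f} fps per dollar")
```

With these placeholder numbers the cheaper card wins on value (0.300 vs 0.227 fps/$) even though the pricier one is faster in absolute terms, which is the whole AMD-vs-Nvidia value argument in miniature.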
You're better served by getting a used gpu if you don't have $200+ to spend on it alone.
On a budget I'd pick up a used 760 or 270X for around $125. They're much better and only a little more expensive than the 750 Ti.
The cool thing for the 750ti is if you have a shitty prebuilt you can pop it in and have a workable gaming machine.
I have a 760K/750 Ti combo and it does well. Playing Blade and Soul atm (CPU-intensive), and as long as I only have the game, VoIP, and maybe one Chrome window open, I can run it at 50+ fps at medium-high settings at 720p. I've never hooked it up to a 1080p monitor, but it runs everything I've ever thrown at it.
I only turn down settings during huge battles in BnS/WoW/FF14. I think I spent $475 on the whole thing after salvaging hard drives.
No. I was making fun of the "overpriced and underpowered" line. There's nothing underpowered about any card in the upper price points. A GTX 980 is too pricey, but it's not slow. While AMD's offerings at the ~$500 price point are faster than the 980, the vast majority of PC users wouldn't notice any difference, because anything past a 390/290/970 is already limited by their monitor's refresh rate.
Buying a Titan X or 980 is a waste of money most of the time, but it won't be noticeably worse most of the time either. They're in dickwaving territory.
>cuda is a meme
Go to bed, kid.
2:00pm is past your bedtime
Honestly, if you don't have enough money to buy a 970, you shouldn't consider Nvidia at all. Their low-end options are just pure shit. And even once you reach the 970, you could pick up an R9 390X for the same price on discount that beats it.
>AMD is still missing DX12 features on their current cards too
Two features that are optional because they aren't high-impact, and the work they do can be done just as efficiently in other ways.
Asynchronous compute was the big thing that DX12 brought to the table. AMD has been saying for years that the lack of multithreaded workloads to the processor has been hurting them. We had evidence showing this then, and we know for a fact now that lazy devs were a major hindrance to AMD.
AMD makes MUCH less money than Nvidia. They have less money for R&D and less money for marketing... and you're here like "OMG, they're such dickheads for spending money on an open technology when nobody else would. Look at them allowing Nvidia to adopt it without charging for license fees. Fucking dicks. Wow, they're not even paying to continue development now that everyone gets to use it for free. Fuck them they're a piece of shit company."
Honestly, do you think about the words you string together?
...what? If you have a 60-75hz monitor and your graphics card is pushing out 200 frames a second, only the first 75 frames actually matter.
After that you're just asking for tearing, which means those extra frames might actually make your shit look worse. So you then turn on vsync to prevent tearing... and your card is rendering 125 frames for no reason at all.
Unless you have a 144hz monitor. Which was the point that anon was trying to make.
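The arithmetic behind that refresh-rate point is simple enough to sketch (a toy illustration; the function name and numbers are made up, and it ignores frame pacing and adaptive sync):

```python
# How many rendered frames per second a fixed-refresh monitor can
# never fully display. Assumes plain rendering with no G-Sync/FreeSync;
# frames beyond the refresh rate either tear or get discarded by vsync.

def wasted_frames(rendered_fps: float, refresh_hz: float) -> float:
    """Frames per second rendered beyond what the panel can show."""
    return max(0.0, rendered_fps - refresh_hz)

print(wasted_frames(200, 75))   # a 75Hz panel wastes 125 of 200 fps
print(wasted_frames(200, 144))  # a 144Hz panel wastes far fewer
```

Which is why a faster card mostly pays off once you pair it with a high-refresh monitor.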
>it's okay when amd does it!
Look, either AMD supports freedom or it doesn't. AMD has poor Linux support and no longer supports Mantle, FreeSync, Vulkan, or OpenGL. Meanwhile, Nvidia has great Linux support, supports Vulkan and OpenGL, and doesn't abandon its own technologies. Besides, once AMD jumps to its new architecture, it'll drop support for its older gens so fast. Just like they did with the 4 series.
You have no idea what you're talking about, you little cutie.
AMD spent who knows how much money creating Mantle. AMD made it open source so that others could take it over and maintain it once it was done. The ONLY REASON AMD DID THIS is so that multithreading would finally get some support AFTER OVER TEN FUCKING YEARS.
Once the API was completed, AMD let others take the work and use it in their own projects, free of charge. NOW EVERYONE GETS MULTITHREADING.
And AGAIN, we have a shithawk like you not understanding that AMD just spent money for everyone's benefit.
Do you want them to come give you a fucking blowjob, too?
>AMD is shilling on this board so hard this week
wait to find a deal on a gtx950 at the very least if you're going to be a poorfag
I want AMD to be able to actually compete with Nvidia and stand by what they say. I don't care if they keep throwing away their money; it's doing nothing for them except bringing them closer to being bought by Samsung. Which would be awful.
Are you retarded? Why the fuck do you think they're forcing driver updates in GFE (i.e. not making them optional, in addition to removing direct downloads for day-one drivers)? It's exactly for this reason: so the gimp becomes real. Their whole business model is predicated on selling GPUs, and they have almost no competition. They have every reason to make people think they need to upgrade.