I got myself a 55" 4k tv to use as a monitor. It unfortunately has no displayport input, but it supports 4k at 60Hz over hdmi.
The problem is that my current vidcard is only a 750ti.
Speaking purely in terms of graphics cards, what is the recommended minimum for gaming at 4k on medium settings at 60fps in most games? Note that I do not play pretty much any AAA games (i.e. horribly optimized EA/Ubishit). I play counterstrike, souls series, metal gear, etc.
nah its fine
it's quite immersive
There are no games or broadcasts that are in 4K.
All you are doing is upscaling them, and it will only make them look more pixelated. Only Japan currently has anything broadcast in 4K. So really, anywhere but there, 4K is just a money sink and hype.
Just stick with the 750ti until the next gen of GPUs drops later this year, which is supposed to be a huge step up. A lot of the best pc games aren't even graphically taxing at all and could be run at 4k with a 750ti.
Even good looking games that are designed simply and optimized well work with the 750ti at 4k
Could probably play games like DMC4 at 4k super easy too
Welcome to the true master race anon. I suggest a single GTX 980 as it's been a pretty good experience, throwing a second one in was just amazing. That or wait a couple more months, pascal is soon.
Also keep in mind -most- 4k TVs can also be run at 1080p/120hz for games you can't push to 4k.
TVs are always going to have additional input lag. There's no way around it. It doesn't matter what kind of TV you have; it's just the nature of how TVs are built vs monitors. If input lag is a concern to you then your choice is always going to be a monitor. At first I didn't like the additional input delay on my TV, but after a while you kind of get used to it in most games. If competitive multiplayer is your thing though, you're better off just sticking with a monitor.
Also, it's not a response time issue (a lot of TVs have response times as good as most monitors'), so inb4 response time lel fags
It's actually the background picture, some retard took a good picture and put a sepia filter on it.
Run it through GIMP and use auto levels to fix the contrast.
>I DONT KNOW WHAT THE FUCK IM TALKING ABOUT SO IM GOING TO SHIT POST. LOOKATME.
You're a fucking moron anon-kun. The best 'gaming monitors' on the market right now average 20-25ms input delay but you're getting 144~165hz and Free-Sync/G-Sync as a trade off.
There are 4k TVs now as low as 17ms input delay. So really you're choosing resolution or new features vs minor changes in input delay. Also keep in mind most of YOUR PERIPHERALS... have a HIGHER input delay than your DISPLAY.
IE: You're a fucking idiot.
do your eyes not hurt after a couple of minutes? this shit looks uncomfy.
i would totally play dragons dogma on this a couple feet away with a wireless xbone controller tho. that'd be comfy
>CAPS LOCK IS CRUISE CONTROL FOR COOL
It takes a lot of skill to use Google and resummarize what someone else said into your own words and take credit, doesn't it?
Never heard of this site, don't give a fuck. Also doubt they go into the fact that your games/peripherals are more of an input delay bottleneck than your display.
Thanks for trying, please play again faggot.
I had to Google nothing. Anyone who keeps up with technology knows that the HDMI 2.0 standard was designed for the sole purpose of 4K output, as HDMI 1.4 lacked the bandwidth to output a 4K resolution at a rate of up to 60 FPS.
Also, you should probably throw away your HDMI 1.4 cables and go out and buy new HDMI 2.0a cables if you haven't already.
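The bandwidth claim is easy to sanity-check with back-of-the-envelope arithmetic. A rough sketch (raw active-pixel rate only, ignoring blanking intervals and TMDS encoding overhead; the 10.2 and 18 Gbit/s figures are the published link maximums for HDMI 1.4 and 2.0):

```python
# Back-of-the-envelope check of why HDMI 1.4 can't carry 4K at 60 Hz.
# Simplification: counts only active video data, not blanking or
# encoding overhead (the real requirement is somewhat higher).

def video_data_rate_gbps(width, height, fps, bits_per_pixel=24):
    """Raw active-pixel data rate in Gbit/s (8-bit RGB by default)."""
    return width * height * fps * bits_per_pixel / 1e9

rate = video_data_rate_gbps(3840, 2160, 60)
print(f"4K60 8-bit RGB needs ~{rate:.1f} Gbit/s of active video alone")
print(f"fits in HDMI 1.4 (10.2 Gbit/s)? {rate < 10.2}")   # False
print(f"fits in HDMI 2.0 (18.0 Gbit/s)? {rate < 18.0}")   # True
```

Even before overhead, ~11.9 Gbit/s already exceeds HDMI 1.4's total capacity, which is why 1.4 tops out at 4K30 and 2.0 was needed for 4K60.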
Let's see how hard you have to search to understand that the improvement from HDMI 1.4 to HDMI 2.0a is not in the cable but is instead done through firmware updates.
Shit. I'd better go retrieve all those HDMI 1.4 cables I just threw away.
i did this shit with a 37 inch tv. it's retarded. get a good pc monitor instead m8, 27in 1440p 144hz or a nice ips or something. you can always run an hdmi cable to the tv and do shit with it.
>Trying this hard to save face on the internet
It's okay anon, your google-fu is adequate you can stop now. You might as well just paste the paragraphs from the hdmi 2.0 wiki next.
>bought a 55 inch 4k television to play Counter Strike
What's it like to be retarded? PC autists, ladies and gentlemen.
So ITT I see like two people who gave OP actual info related to his Qs. Some faggot went on a tangent about HDMI for no reason, another faggot thinks input delay is an issue, especially on a 4k samsung which is like sub-30ms, and holy shitposting galore.
/v/ why are you such a bunch of insecure little twats? Props to the people who actually contributed.
>get G-SYNC 144hz monitor for Christmas
>mfw all this fucking smoothness
Now I don't care if TW3 drops below 60, it doesn't make much of a difference. I now don't feel like I need to upgrade my 970 for a long while.
>Even with all the sources posted here, this faggot thinks there is anything wrong with that choice.
Hoollly shit it doesn't stop. Are you from 2005 anon? The only reason for monitors being > TV for gaming no longer exists.
>ask retarded question you could easily figure out with google
>thread is filled with shitposting
wow. how can i find a way to feel superior to everyone else in this thread???????
>The best 'gaming monitors' on the market right now average 20-25ms input delay
Yo dude, how's it feel still living in 2010?
I just want to know why he thinks a TV is better than a monitor for gaming. Sometimes I'd much rather game on a TV than a monitor (comfiness, laziness, etc) but I'd never argue they're better.
OP here, sorry I've been away
Go to google and look up "input lag database"; it's a whole site that ranks tvs on this specific aspect.
For this model in particular (Samsung uh55ju6700) there's a very low amount, and once i set it to "game" mode there is absolutely none.
I would like to say that wallpaper is beautiful anon.
okay, "absolutely none" is an exaggeration. But I don't notice any, and i'm pretty sensitive to it.
"Game mode" does reduce visual quality very slightly, but it's still fucking 4k and looks better than anything I've ever seen.
>but it's still fucking 4k and looks better than anything I've ever seen.
yeah if you're standing closer than 4 feet away, which i guess you are. shoulda got a 1440p 144hz monitor
pretty great, but i'm getting a tiny bit of subpixel bullshit (looks like a little chromatic aberration). I need to get MacType or reconfigure ClearType.
for comparison, the monitor i was using before this is 1080p at 27", making this 55" 4k tv almost exactly the same as just arranging four of those in a grid. It's pretty much the exact same pixel density, just ridiculously more real estate.
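The four-panel comparison checks out numerically. A quick sketch of the density math (panel sizes taken from the post above):

```python
import math

# Pixel density (PPI) = diagonal pixel count / diagonal size in inches.
# Comparing the old 27" 1080p monitor against the 55" 4K TV.

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch along the panel diagonal."""
    return math.hypot(width_px, height_px) / diagonal_in

monitor = ppi(1920, 1080, 27)   # ~81.6 PPI
tv      = ppi(3840, 2160, 55)   # ~80.1 PPI
print(f'27" 1080p: {monitor:.1f} PPI')
print(f'55" 4K:    {tv:.1f} PPI')
```

The two densities land within about 2% of each other, so the 55" 4K panel really is close to a 2x2 grid of 27" 1080p monitors (which would total 54" diagonal).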
I enjoy videogames in the ways previously described, but i'm also an engineer (CATIA and spreadsheets), making all of this screen real estate fucking radical. I also like muh netflix.
I also eventually intend to get a couch and mount this tv on the far wall, and switch between hooking my pc up to a regular monitor setup (144hz? perhaps!) and connecting it to this tv.
CRT vs LCD. If you don't know the difference you should just stop posting.
Also modern TVs have hideous amounts of postprocessing and shitty slow scalers regardless of display technology.
>TVs have hideous amounts of postprocessing and shitty slow scalers regardless of display technology.
i think you can turn them off though. that's what game mode is supposed to do anyways.
it's not like you need anti-aliasing at that point
there is no information being aliased that needs anti-ing
I don't care too much about other settings apart from LOD and draw distance