So, pcfags: what is 60fps? How do you define playing in 60fps?
Is "60fps" gameplay when I have the framerate locked at exactly 60fps no matter what's happening? Or can the player have drops to 50fps on some heavier actions in the game?
I just wonder if there are people with machines so powerful that they get exactly 60fps no matter what, or is everybody just referring to 60fps as "most of the time, I have 60fps"?
Getting exactly 60 FPS is almost impossible unless you're using Vsync (which causes fucking insane input lag) or Gsync which is incredibly expensive right now. So for most people, it's 60 FPS or more with occasional drops into the 50s.
"most of the time I have 60" I guess
Most people playing leave themselves an error margin, so a possible framerate drop won't be a problem. It's personal preference. Some get really annoyed when they see an actual framerate drop, others can live with them if they're short or insignificant.
Honestly I don't even see a difference between 120fps and 240fps. Personally I'm OK as long as the FPS doesn't fall below 50, so I tune my settings for at least a stable 60-80.
Most high end computers can get 120fps or 144fps. And that's the full fps all the time. You would only see drops in fps when the graphics card or the CPU cannot process the game fast enough, which if we're talking about "high end computers" then that won't happen.
A drop of 1fps to sync the computer and the monitor every minute or so is not noticeable, although it is a sign of a poorly designed graphics card or monitor. It should not be happening.
60fps is just what it sounds like: 60 frames per second. A lot of computer games are not locked at 60fps and can in fact go higher, although 120fps may not be consistent.
I have a 144hz monitor,.. so I game at 144fps.
Mostly because of CSGO.
The difference between 60 fps and 144 is actually noticeable for a game such as CS. But anything else it's not too much of a factor.
But to answer your question, it feels smooth.
The whole idea is that frames go fast enough to never see an actual frame transition. 50 is really enough and the higher you get the less likely you are to see a "skip" or transition. Past 60 is pointless honestly so 60 fps is basically ensuring you never see it.
The way things are filmed, like your picture, is not the way games work, so it's not like seeing a movie at 30 vs 60, since they blur differently. This is why the cinematic experience as an excuse for lower frames is bullshit; they just want to be able to show nicer looking things, since that looks better for advertisements and such.
I suppose, theoretically, when you have to make sure you are seeing everything with no skip it could make a difference. I don't think it's been documented that people can actually notice that, and I don't want to get into the "human eye" argument because our eyes process things differently, but I wonder if things like that are more of a placebo, honestly. There is that stupid 30 vs 60 website, but it's pointless since it tells you which one is which.
Not trying to come off as doubting you, but I'd like to see footage where it doesn't tell you what frame it is and see if you can really notice anything.
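For reference, the numbers behind "fast enough to never see a transition" are just the per-frame time budget; a quick sketch (Python, purely illustrative):

```python
# Per-frame time budget: budget_ms = 1000 / fps.
# The higher the framerate, the smaller the window in which a frame
# transition ("skip") could even be visible.
def frame_budget_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (30, 50, 60, 120, 144):
    print(f"{fps:>3} fps -> {frame_budget_ms(fps):5.2f} ms per frame")
```

So the jump from 30 to 60 halves the visible window (33.3 → 16.7 ms), while 120 to 144 only shaves off about 1.4 ms, which is part of why the gains feel smaller the higher you go.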
>what is 60fps?
60 frames per second.
>How do you define playing in 60fps?
Have the game render at, and the screen display at, 60 frames per second.
>Is "60fps" gameplay when i'm having framerate locked on exactly 60fps no matter what's happening?
>Or can player have drops to 50fps on some heavier actions in game?
That would be 50fps, or 50 frames per second.
You're an idiot.
Past 60fps is noticeable, although you won't consciously notice it as a better FPS situation. It is most pronounced in very fast games, though, as a slower game like an RPG or visual novel can get away with 60fps or even less because there isn't as much quick movement.
It's when your monitor is a 60Hz one and the game keeps up with that frame rate.
It's the expected frame rate of video games, the highest that your monitor allows.
If it drops to 50FPS then the FPS is 50FPS for the moment, simple as that.
There's no need whatsoever to arbitrarily classify the game at some FPS, since modern video games aren't made with a specific frame rate in mind.
Most of the time people have machines powerful enough to run most games at 60FPS at whatever graphical quality they choose. Some people don't care about some FPS drops, some people go to great lengths to sacrifice graphics to maintain a smooth frame rate and pleasant gameplay.
>mfw just got a 144hz monitor
>mfw first moving my mouse
Can confirm this, I have a GTX 760, so not as good but comparable. I have had optimization issues before, most recently with games like Far Cry 3, or especially Borderlands 2, which was optimized pretty shittily for AMD hardware. I can run 90%+ of games at 60fps on ultra with no frame drops, and in most cases possibly higher if I don't set a frame cap, but I wouldn't be able to notice because of my scrub 60Hz monitor.
no, again, both of those are telling you which ones are which, and I'm not saying there isn't a difference between 30 vs 60. I'm saying I'm not sure that past 60 makes a real noticeable difference the majority of the time.
I just wonder how much is placebo
For years I played Halo PC with a shitty pc and monitor, then I upgraded my pc and I'd get anywhere between 100 and 300fps (as displayed by fraps), and there's a huge difference between all 3 states (shitty framerate, good framerate, through the roof framerate). I didn't change monitors until much later though, and only ever had 75hz at the maximum.
Am I talking about something else when I mention what fraps displays, or are we talking about the same thing here ? I don't understand. I know for sure 60fps, for this game in particular, was considered a pretty bad framerate and you could definitely notice a difference between a locked 60 and no vsync at much higher fps.
Which one is 30fps and which one is 60fps, then? I can clearly tell the difference between the two. I can clearly tell which is 30 and which is 60. There is clearly a difference. They don't need to be marked to tell apart.
Consoles had 60 fps since the PS2, I don't see why people are celebrating this like something new.
If anything it's more of a return to the old ways, after we let graphics whores take over the industry last gen
Vsync doesn't push framerates up, you dimwit. It tells the system to wait on drawing a frame until the screen is finished drawing so as to reduce GPU load spamming out frames when the monitor can't display as many as is being drawn.
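That wait can be sketched with a toy model; assuming a fixed 60Hz refresh and treating vblanks as multiples of the refresh interval (Python, illustrative only, not a real swapchain API):

```python
import math

REFRESH_MS = 1000.0 / 60.0  # a vblank every ~16.67 ms on a 60 Hz screen

def present_time_ms(render_done_ms: float) -> float:
    """With vsync on, a finished frame is held until the next vblank."""
    return math.ceil(render_done_ms / REFRESH_MS) * REFRESH_MS

# A frame finished at t=5 ms is held until the ~16.67 ms vblank;
# one finished at t=20 ms has to wait for the ~33.33 ms vblank.
print(present_time_ms(5.0), present_time_ms(20.0))
```

The GPU stops spamming frames the display can't show, at the cost of frames sitting around waiting for the next vblank.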
I've been playing The Witcher 2, and I sometimes get low framerates (around 40fps, down from the usual locked 60fps) in towns and taverns. I've checked, and none of my CPU cores are at over 80% load, and my GPU didn't go over 70% (during the period when the framerate was low; at other times it went up to 100%). My temps are fine as well. What might cause that low framerate?
Most monitors have a 60hz refresh rate so 60fps usually means you're maxing out your refresh rate, getting the most frames possible on your display. When people say 60fps they mean that a game will render at 60fps, and it depends on the person, the machine and the game whether they mean it's rendering a solid 60fps or just 'up to' 60fps. I built a new PC recently so basically everything I throw at it will run at solid 60.
I was watching my younger brother play Payday 2 recently and for some reason he was getting a low framerate. We looked at the video settings and he had it locked at 30fps, which was weird.
That's because vsync caps the framerate to your monitor's refresh rate you idiot. There's no point in rendering higher than 60fps if your monitor is 60Hz.
Vsync only causes noticeable input latency in shit-tier engines that aren't coded properly.
But why would you want texture detail that low?
The human eye works mainly by recognizing outlines.
When textures are softer, vertex outlines stand out more. Your vision has to work less when looking for the enemy on the screen, making reacting easier.
I know, I'm not talking about 30 vs 60 again, I don't feel like you are reading the posts entirely. I'm only questioning how MUCH of a difference there is beyond 60, I know there is one, but I just wonder how much of it is really there if we don't know what the fps is there.
Triple buffering paired with adaptive vsync fixes the problem with vsync where it would normally cause FPS drops whenever the frame rate would drop below the refresh rate.
Input delay remains a problem. Gsync is the only way to avoid both input delay and screen tearing, so it's no wonder it's being pushed.
60fps means the framerate is 60 or higher 90%+ of the time.
Generally your average framerate should be around 80fps or higher, so with vsync on even when there are dips it never goes below 60.
>4k monitors still well over £600
>1080p 144Hz monitors around £300 for a good one
>single cards can't quite do 1080p ultra at 144fps just yet
Come on technology, not much further now.
I wonder how much changing things like that starts to fall under cheating. I mean I don't care, but it'd be nice if everyone was on the same playing field if you were competitive about it.
>>got a 1080p 144Hz monitor for €200
Official competitive servers aren't serious enough for that kind of detail with the lag and the low tickrate anyways.
But I believe the high level tournaments forbid it, same for the color tweak that just changes your GPU setting to 2x saturation - CSGO is so void of color that it doesn't even look remotely strange in-game.
If you ask me, the actual problem is with the game design. Players shouldn't feel the need to modify the game like this, nor should they gain any real advantage from these kinds of little tweaks.
My mistake then, most common cause for that sort of problem is just people overrating their hardware.
No idea about the specifics of W2 either, but I haven't heard anything about it being too unoptimized.
>16ms input delay
>making mouse controls extremely jarring
what the shit
are you drunk
sorry to say if you notice input lag you're well past the 30~50ms mark anon
Exactly. I don't get the hype for PC.
That's not how it works.
My average reaction speed is around 170ms, but that means jack shit here when I can easily tell apart a monitor with 1ms of input delay from one with 20ms. Reaction time doesn't mean that humans can only perceive the world in chunks of that duration.
Go on, open up CS, go into an empty server and enable vsync and see how it goes. If you can't actually tell the difference then it's as if you're on tranquilizers all the time.
Son of a bitch, €249 (£197) is already cheaper than almost any of the options on the UK Amazon, fucking mainlanders man.
Even that exact model of monitor is £260 on Amazon and not below £240 anywhere else.
In any game where visibility of players might be even remotely difficult, tourneyfags will try to break the game so that enemies are easier to see. QuakeLive colors all enemies bright green by default, but maybe part of CS's game design is for enemies to not stand out so much? I know I've got some clever wins out of using the terrain to camouflage myself.
It does, which is why I use it. I was just saying that the screen tearing without vsync is a lot more noticeable to me than the input lag when using it, which is why I use it.
Every competitive FPS with players going pro on it has had this ability forever. The first time I saw it was on Quake 3 (and everything Quake 3 engine related), where I couldn't believe what people were reducing it to, but the point was apparent at first sight: a great fps boost (which on the Quake 3 engine affected many things, including jump height), and it made shit easier to notice at the extreme speeds these guys were playing at. Honestly, that sort of thing is what makes a truly great game.
vsync generates input lag if you get LESS than the capped framerate, because the screen is going to keep the old image onscreen while waiting for the next rendered frame to display. obviously, you won't be able to see your input if the screen didn't refresh.
if you get a higher framerate than the one the cap is at, additional frames get discarded and you lose absolutely nothing.
Or well, you are supposed to lose nothing. Read up on double and triple buffering if you want to know more, especially on the good and bad kinds of triple buffering.
Is there any way to not get screen tearing without vsync? There are a few games I've noticed that have pretty much no screen tearing. When I first played the original Far Cry I was getting a few hundred frames per second and with no screen tearing. In Sleeping Dogs I only have to use the framerate limiter and not vsync and there's no screen tearing. Those are the only two games that this happens in though and I have no idea why.
>eat moldy bread
>it's better than starving
>see rich people walk out of restaurant complaining that their steak was cooked well done instead of rare
fuck off food whores. no one gives a shit about good tasting food
The Quake engine is special in that certain actions are tied to frame rate. But the game is also designed to be run at 125, 250, 375 etc.
CS:GO doesn't allow r_picmip style commands.
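Quake's exact sweet-spot framerates come from its own integer-millisecond timestep, so those numbers are engine-specific; but the general effect is easy to demo. With a naive explicit-Euler integrator, jump height genuinely depends on the framerate (Python toy, not Quake's actual code; whether higher fps jumps higher or lower depends on the integrator):

```python
def jump_apex(fps: float, v0: float = 5.0, g: float = -10.0) -> float:
    """Highest point of a jump when physics steps at 1/fps seconds.

    Explicit Euler (position updated with the old velocity) makes the
    apex depend on the timestep; the analytic answer is v0^2/(2|g|) = 1.25.
    """
    dt = 1.0 / fps
    y, v, apex = 0.0, v0, 0.0
    for _ in range(int(2 * fps)):  # ~2 simulated seconds, well past the apex
        y += v * dt
        v += g * dt
        apex = max(apex, y)
    return apex

print(jump_apex(30), jump_apex(125))  # same jump, different heights
```

Both apexes overshoot the analytic 1.25 m, and by different amounts, so two players at different framerates literally play slightly different physics unless the engine uses a fixed timestep.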
It shouldn't be a question in CS. I get that it's in slower games, but this just isn't one where low visibilty causes problems.
>these threads come up every so often
>start learning about it all
>go several layers deep through random forum links
>all the while the comments on everything keep arguing whether the article is correct
>then suddenly everything is game dependant and relies on a dev doing a proper job rather than the principle employed
>end up none the wiser with a vague understanding of this shit
60fps or better, all the time, no matter what.
Unless we're talking about Planetside 2; that game's just bad optimization-wise. Pretty much no one can maintain a solid 60fps in that game at all times.
Well without that.
I have that issue too. I found a fix by making a custom resolution with a refresh rate of 60.001, but that sometimes caused stuttering while using the computer, so I stopped using it. It should be something like 59.95fps though, and this might be overridden in video games.
>vsync generates input lag if you get LESS then the capped framerate
No, it's going to cause it anyways. It limits the FPS to the Hz rate of the screen. The delay is caused as the system waits for the screen to be completely drawn before showing the next image; in other words, it waits until the next actual frame starts to happen before even showing the previous frame.
Without vsync, it would all be slapped on the screen right away, be it complete or not, which causes the tear.
You have not lived until you've played vidya at 4k 120fps.
Peasants will never know this feel.
>W-why don't you have a response to my bad analogy!? :(
60 fps is different, not better. Enjoy your soap opera graphics.
If you have a framerate above the capped one the delay will be absolutely negligible as compared to being below it.
That's why I said you shouldn't lose anything if you're above the cap. To reduce that effect even further, buffering exists, but most engines have absolutely shit triple buffering, which would otherwise be the best of both worlds.
>60 fps is different, not better
please just tell me whether or not you're trolling. i can't handle the pain of thinking that you're serious
>So, pcfags: what is 60fps?
A low as fuck framerate that looks awful on my 144Hz monitor.
As for your question about framerate locking, a higher framerate is almost ALWAYS better, whether your monitor can display 100% of the frames or not.
it's not 120hz. it's 60hz with interpolation / backlighting tricks. they're very different things, and the companies that make them aren't under any obligation to be truthful about it
Triple buffering essentially means that your game doesn't have its pace dictated by your framerate when using vsync, so you can do an input halfway through a frame and it will be read and executed. You won't be able to see the result until the next frame is drawn, but it feels a lot smoother since your command isn't either delayed by a new frame coming in or lengthened by the current frame being swapped out.
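A toy comparison of what that buys you when render time slightly exceeds the refresh interval, under the simplifying assumptions of a 60Hz screen, a fixed 20ms render time, and an idealized triple buffer (Python, illustrative only):

```python
import math

REFRESH = 1000.0 / 60.0  # 60 Hz vblank interval, ~16.67 ms
RENDER = 20.0            # each frame takes 20 ms to draw

def frames_double_buffered(duration_ms: float = 1000.0) -> int:
    """Vsync + double buffering: rendering blocks until the swap at a vblank,
    so a 20 ms frame misses every other vblank and fps drops to ~30."""
    t, frames = 0.0, 0
    while True:
        done = t + RENDER                        # frame finishes drawing...
        t = math.ceil(done / REFRESH) * REFRESH  # ...then waits for a vblank
        if t > duration_ms:
            return frames
        frames += 1

def frames_triple_buffered(duration_ms: float = 1000.0) -> int:
    """Idealized triple buffering: the GPU renders back-to-back and each
    vblank shows the newest completed frame, so ~50 distinct frames/s."""
    completed = int(duration_ms / RENDER)
    vblanks = int(duration_ms / REFRESH)
    return min(completed, vblanks)

print(frames_double_buffered(), frames_triple_buffered())
```

Roughly 30 presented frames per second versus roughly 50: the third buffer keeps the GPU from stalling on the swap, which is the "pace not dictated by the framerate" point above.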
>If you have a framerate above the capped one the delay will be absolutely negligible as compared to being below it.
Vsyncing IS capping, even if you could easily go above the Hz cap. And it always causes a delay of one frame.
The only reasonable way to avoid tear & delay would be to constantly run a game at 2x Hz, and always show the previous frame, which would still be ready faster than the actual screen could update. But that's not how vsync is designed.
This. On a 60hz monitor 50 fps in CS:GO feels better than 40, 60 feels better than 50, 90 feels better than 60, 150 feels better than 90. You can argue all you want, but you don't know shit until you try it. It's glorious if you can sustain stable 150-200 fps on a 144hz monitor, though. That is what's really silky smooth™.
>using examples of games that havent come out
>implying consoles dont get errors
>implying console graphics are as good as press release shots
>implying consoles have hardware equal to most peoples gaming PC's
The thing people don't realize is that even if your monitor can't display all the frames, rendering the game internally at higher framerates will improve response times in the game. This is really noticeable in precision-input-based games like Street Fighter, where if you run the game unlocked at 150fps or whatever for a while and then go turn on Vsync you will notice how sluggish your inputs feel, how many links and inputs you end up dropping... The game will LOOK the same on your monitor because you can only SEE 60Hz, but the game will FEEL more responsive and accurate at the higher framerates.
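The responsiveness claim can be put roughly in numbers: if input is read once per simulation frame, a new input waits on average half a frame before the game even sees it (toy model, assuming uniform input arrival; real engines differ):

```python
# Average time a fresh input sits unread when polled once per frame:
# half the frame interval, i.e. (1000 / fps) / 2 milliseconds.
def avg_input_wait_ms(sim_fps: float) -> float:
    return (1000.0 / sim_fps) / 2.0

print(avg_input_wait_ms(60))   # ~8.3 ms average wait at 60 fps
print(avg_input_wait_ms(150))  # ~3.3 ms when simulating at 150 fps
```

That ~5 ms difference is invisible on a 60Hz screen but can matter for one-frame links in a fighting game.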
Also Quake Live (I don't know if this was possible in Q3) allows you to make all enemies bright pink and shit, which makes them much more visible compared to the dark brown maps.
Hey, you're the guy I really need now.
I have an old Flatron 1280x1024 4:3 75hz monitor and I'm changing it for the Asus VG 16:9 1920x1080 144hz one, but I decided I'd rather keep the old one and set it by the new one in a dual setup.
Would this drastic difference setup work or would the old monitor bottleneck the new one and cap something (resolution, refresh rate or whatever)?
>lying on the internet
>where people can just google "144hz monitor price" and see 200 euro / 280 dollaroos price tags
Oh so you mean every fucking AAA engine being used right now? No shit
This is why the eastern europeans are PC master race ancestors. With no resources they make better looking and performing engines
>PS4 games are high quality restaurant food
Console games are McDonald food at best.
Easily accessible, simple, low quality.
If I tweak my shit system with the correct configurations, I can achieve a steady 60fps in any game. Now in games like strategy or RPGs, I don't find 60fps necessary, but when it comes to a shooter it's like night and day. It's really only something you can experience on your own; nobody can describe it to you since your eyes just haven't seen animation that smooth.
It's not as big of a deal with certain shit, but for some games it seriously bumps the quality, and it's sad when you see games that have a dumb 30fps lock when they'd look and play like silk without it.
>Its really only something you can experience on your own. Nobody can describe it to you since your eyes just haven't seen animation so smooth ever
The real reason is that the visual smoothness is only half the story. The control responsiveness is a huge part of why 60fps is superior, and you can't SEE that, you can only FEEL it when you play yourself.
>you won't notice higher than 60FPS unless you have a monitor with a refresh rate higher than 60Hz
This is false. Input response times change at higher framerates. You might not be able to see the frames, but the game rendering more frames internally means there are more times per second where your input can be recognized and used. A 120fps game will feel more responsive than a 60fps game, even if you only have a 60Hz monitor.
>upscale HD to 4k
does this mean it's capable of upscaling 1080p input to 4K output, or that it won't accept 4K input and is in fact a cheap piece of shit 1080p screen that upscales everything?
I'm 26 and I will play this; I've been playing Brawl for years with friends.
If you're the kind of person who thinks quake is the best game ever made and play lots of shooters, probably option 1.
Unless you have a crazy good PC you won't get 120fps in modern games (without lowering some settings anyway)
This is the stupidest shit I've ever heard, way to go anon. Yes, most games can do a steady 60fps if you have the hardware for it. Frames only drop when the game is a shit port, and even then it's not much, assuming you have the hardware for it.
You only activate Vsync when the game doesn't have its own internal FPS cap and your CPU/GPU renders more frames than needed (i.e. more than 60fps when you only have a monitor capable of a 60Hz refresh rate, if you couldn't understand that, console tards), which may lead to screen tearing or coil whine. A great example of this is The Sims 3.
>The Sims 3 runs at a maximum of 30fps and doesn't have a limit to stop your hardware from rendering extra.
>I have an i5 3570K / 7950, and The Sims 3 will try to render the maximum number of frames your hardware can manage; in my case it went to about 2,200 frames in the main menu.
>My monitor is only 60Hz, so I can't notice anything extra.
>So that's 2,140 extra frames being rendered for no reason.
>This causes immense stress on your GPU/CPU and may lead to coil whine.
>You need to use external software to cap The Sims 3 at 30fps/60fps (it runs at 30 max though), because The Sims 3 is a poorly optimized piece of shit by EA, who has always been shit.
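An external limiter like that does essentially this: measure how long the frame's work took, then sleep away the rest of the frame budget (Python sketch under the assumption of a simple sleep-based cap; the game-loop names are hypothetical):

```python
import time

def remaining_budget_s(elapsed_s: float, cap_fps: float = 30.0) -> float:
    """How long to sleep so the whole frame takes 1/cap_fps seconds."""
    return max(0.0, 1.0 / cap_fps - elapsed_s)

def limit_frame(frame_start: float, cap_fps: float = 30.0) -> None:
    time.sleep(remaining_budget_s(time.perf_counter() - frame_start, cap_fps))

# Hypothetical game loop:
# while running:
#     start = time.perf_counter()
#     update_and_render()        # the game's own work (hypothetical name)
#     limit_frame(start, 30.0)   # never render more than 30 fps
```

Capping this way keeps the GPU from burning through thousands of invisible frames, which is why it stops the coil whine.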
nah, i'm more focused on playing some nice shooters and space games on ultra. i'm not bothered about 60+ fps really so i'm tempted by the second option. I'm thinking of getting elite dangerous and star citizen but i don't want to pay out the ass for early access. But let's not get into that in this thread.
I'll be building it soon but it's going to be pretty good, here's the build. http://uk.pcpartpicker.com/p/rpG8bv Prices aren't set; I can probably get it down to about £850, so including a monitor I'd like to stay below £1150.
currently playing on a fairly shit laptop with integrated graphics but i play a lot of old shooters and rts. i'm not going to be too discerning when it comes to fps, i just want a great gaming experience, so i honestly don't know which to get.
there is also the matter of bigger desktop resolution, i'll be using the pc for work as well using 3D programs and stuff so a bigger resolution might be nice.
You, as a console faggot, are so fucking retarded, you are no longer able to differentiate between a mid-range PC and a high-end PC.
Fucking laughing so hard right now.