What's up with this new trend or buzzword/meme whatever of 60fps?
Is there really a difference? Why is this being done now?
Consolefags have had their vision destroyed by consoles. I asked a console friend of mine if he could see the difference between 30 and 60 FPS, and he honestly couldn't.
Protect your child's eyesight. Ban consoles now.
>What's up with this new trend or buzzword/meme whatever of 60fps?
I really hate it when young people and idiots come to this board to ask obvious, long-known questions.
This place was better when it was a secret club.
It's not new, it's been that way for a while. Devs have just been sacrificing smooth fps for more graphics, because fps doesn't sell games. Screenshots do, and you can't just ride on the bullshots from E3 forever.
>just 60 fps
I need all the fps I can get, especially in competitive games
Also, most monitors are 60hz so that's why it's the standard
>implying any game that is locked to 30 fps doesn't dip constantly
>implying frame loss between 60 and 30 is anywhere near as jarring as 30fps to 25fps
Extremely shaky 60 fps > shaky 30 fps
blame le asscancer man for creating this meme
when he dies (which will thankfully be soon) pcucks will finally shut up about "muh fps"
But then we have the inevitable choice of Ultra settings or 144fps with less AA and maybe some other things lower
>has 1440p monitor and quad titan X
>plays at 1080/60fps
I don't understand this. He probably just does it for the video's sake, but I never see him using a higher res in any of the billion options menus. He has Nvidya, he can DSR to 4K.
>people won't play one of the best games of 2015 just because le 30 fps
It used to be standard, then games became "cinematic experiences" and normies & reviewers who don't know shit started thinking that devs being lazy and locking games to 30 fps was a good thing because they are too stupid to get into the film industry.
replaced this bad boy with a proper desktop and a 980, it finally cooked itself to death shortly after. I'm almost nostalgic for it now, almost. Need to pull out the drive to recover muh porn though.
anything above ~120fps should be practically imperceptible to the human eye; at that point the frames are flashing faster than your brain can process into an image. Your eyes physically can't send you information fast enough.
So yeah once you get to 144fps it's time to move on to optimising other things.
I'm afraid to get a monitor that goes above 60 Hz because it might ruin me. I probably wouldn't want to go back to 60 fps. For now I'll just stick to higher resolutions at 60fps
Why do consolefags claim to want 30 FPS? You can want anything. It's like choosing to want $5 when you can instead choose to want $1,000,000, or any amount for that matter.
Why 30? It's so arbitrary. It's not any kind of a real standard, and it ruined 2 generations of games. All of these games are going to have to be remade.
I honestly don't mind 30fps. In fact, I can't see anything bad or irritating with 30fps. I don't mind it at all.
But let me fucking tell you that I can tell when something is in 60fps, and it plays and feels so much better than 30fps, holy shit.
The webm is a bit deceptive though. That kind of smoothness is closer to 120fps. 60fps is not that smooth. There's a video rendering illusion going on there called interlacing. There is a clear difference between 30 and 60 fps but it's not quite that stark in reality.
It's called confabulation, among other names, in psychology. It's when you get less than what you bargained for but instead of getting mad (and accepting you made a mistake buying that) you find reasons why you like it even though you don't.
I call it being a cuck. Imagine they find somebody fucking their wives, but instead of getting mad (and accepting they made a mistake marrying that person) they start finding reasons why it's alright.
These are people who don't like conflict, and much less admitting they made the wrong decision.
UNPOPULAR opinion incoming
I feel 60FPS is way too fluid to look "natural".
30fps looks more 'cinematic' -- don't get me wrong, cinematic in the right way. If I watch a movie in 60fps, I hate it because it feels as if the actors are just in front of me rather than in the screen, and somehow it breaks muh immersion.
So even though I could play everything in 60fps, I cap it at 30 when given the chance.
Yes, there is a difference. There are literally more frames being generated per second, allowing for smoother, more consistent animation. How could you even ask that? >>323793706
Even if the frame rate dips from 60 it would still be higher or would match 30 fps??? How could you ask that????
Only the consolefags can truly dislike him. The whole 60 fps thing is so minuscule and people are blowing it way out of proportion. If a game runs below 60 fps on PC, it makes me not want to play it - the same doesn't apply to the same degree on console, so it's actually less of an issue for them.
60fps standard has been a thing for longer than you've been on the internet, son.
It is a fairly recent marketing invention to accept 30fps because of substandard console hardware.
No, your post is bad and you should feel bad.
This is interlacing.
Films are shot at 24fps because with film, higher framerates mean you physically need more film, and 24fps is basically the minimum where it doesn't feel like a slideshow.
The main difference with vidya is that film captures natural motion blur which blends fast motion between frames. Vidya, being rendered, does not have that.
The effect can be reproduced, but it's too computationally expensive to do in real-time. The "motion blur" settings you see in vidya attempt to simulate it using computationally cheap post-process filters, but they just make it look like someone smeared vaseline on the screen.
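For the curious, here's roughly what the cheapest version of that post-process trick boils down to -- a minimal sketch assuming a simple frame-accumulation approach; the function name and blend weight are made up for illustration, and real engines usually use per-pixel velocity buffers instead:

```python
import numpy as np

def cheap_motion_blur(current_frame, previous_frame, blend=0.35):
    """Naive post-process 'motion blur': blend the previously rendered frame
    into the current one. This accumulation trick is about the cheapest
    possible version and is exactly what produces the smeared-vaseline look."""
    current = current_frame.astype(np.float32)
    previous = previous_frame.astype(np.float32)
    blended = (1.0 - blend) * current + blend * previous
    return blended.clip(0, 255).astype(np.uint8)
```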
>tfw 1440p IPS panel @ 96hz
I ain't going back
Never going back
Holy fucking shit. Why would you want your cutscenes to be capped at 30? It completely ruins the immersion within the game. It's like when gameplay is 60 fps and the pause menu is 30.
>make a recording of a youtube video
>check the metadata
>"hey look! The file isn't interlaced!"
I think cutscenes should be 60fps because I think they should be in engine. The best part of unlocking silly costumes is when you can use them to ruin otherwise serious parts of the story.
lol ur retarded hobbit was shot @ 48 HFR
DONT U FEEL STUPID :^)
I hated when Witcher 3 did that for a good bit of the in-game cutscenes, especially at the beginning. I thought my game was running like shit despite my rig being very capable of running it. Then I found out it was just for certain cutscenes.
try 1. if you'd have paid attention in physics class you'd know that time doesn't update more often than once every second, which is why it's the lowest measurement of time.
If it has the HDMI port it's the locked version. The DVI-D only model is the one you can overclock. I only stopped at 96hz because I didn't feel like fucking with the EDID timings.
If you want a 'cinematic experience' just watch a fucking movie and fuck off from pushing such pitiful framerates on others
>make a thread about literally anything
>include the words meme and or buzzword
>guaranteed 500 replies thread
Literally all games 2+ decades ago ran at 60 fps; now 60 fps is considered a feature in modern gaming (aside from pc gaming ofc). Can't really blame the devs, the hardware is just not there anymore. Technology moved too fast and, even though I hate saying this, console hardware is literally holding back gaming. I think they should make consoles customizable, where you have a base model console and then the gpu is upgradable for more money. Kind of like a modular pc. Only thing is that for average casuals, this will only confuse them.
It's the preceding frame and the current one. If there's motion blur (difficult to tell if there is from that webm) then there's also bits from that. It LOOKS a lot smoother than actual 60 fps because you're not looking at individual frames like you would if it were being rendered in real time on a computer. The frames are blended, and not in the same way that motion blur does it (motion blur interpolates changes between two frames as opposed to combining them).
> I think they should make consoles customizable, where you have a base model console and then the gpu is upgradable for more money.
Why do you need customizable hardware? Consoles could push good resolutions and framerates if you turned down other settings. There's no reason not to have graphics settings for power users.
Yes, what it's saying essentially, is that interlaced video is smoother because each field is a different frame, e.g. it's effectively 60fps instead of 30.
Interlacing was a solution to a problem that no longer exists, which is why almost everything is now progressive scan.
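If anyone wants to see what "each field is a different frame" actually means, here's a toy sketch of weaving two fields back into a full frame -- assuming the fields are stored as numpy arrays; the names here are made up for illustration:

```python
import numpy as np

def weave_fields(field_even, field_odd):
    """Weave two half-height fields (captured ~1/60 s apart) into one frame:
    even scanlines from the first field, odd scanlines from the second.
    Each 30fps interlaced frame therefore contains motion samples from two
    different instants, which is why it looks smoother than true 30fps
    progressive."""
    height, width = field_even.shape[0] * 2, field_even.shape[1]
    frame = np.empty((height, width) + field_even.shape[2:], dtype=field_even.dtype)
    frame[0::2] = field_even  # lines 0, 2, 4, ... sampled at time t
    frame[1::2] = field_odd   # lines 1, 3, 5, ... sampled at time t + 1/60
    return frame
```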
Hmm, I thought about that too. Every time I'm playing a console game and I see graphics settings under options, I get a little excited lol. Either way, I know for sure these consoles were a test run for better mobile GPUs. The next batch of consoles will have much better performance at a reasonable price.
60 fps serves as an industry standard really. Yes it's more taxing graphically, but it's nothing gpus today aren't capable of. The human eye, depending on the distance of the viewed object and barring the effects of motion blurring (more on this later), is nearly limitless in what it can perceive.
The idea of our eyes seeing only 28.something fps is a myth; that number actually came from a standardization in the film industry once viewing machines could automatically turn the reels. Before then, the reels would be hand cranked and each film would have variations in fps. This is why when you view an old timey reel on a machine, everyone appears to be moving at sanic speeds. If things were filmed and left unaltered, they would look choppy as all hell at approximately 30 fps. Compare this to waving your hand in front of your face and seeing it blur. Since cameras only catch light, all you are seeing, for one particular spot on a film, is a change in color. Since that one spot isn't in motion, you don't get blur. To account for this, films will add in motion blur (this makes up a bulk of post production work). That is why that 28.whatever fps will appear natural to you, and where the "our eyes can only see this much" myth stems from.
In video games, a 30 fps game will look choppy compared to a 60 fps game unless you turn on motion blur. However, motion blur is more work your gpu has to do, so you will either play at a lowered graphical fidelity, or suffer stuttering. So, as the argument goes, you might as well design for 60fps so the option is there for anyone who wants to squeeze it.
He does. Rendering that many frames in a game like Tomb Raider is pointless and is only more strenuous on your GPU.
Rendering that many frames in a source game like CSGO though isn't pointless at all, as it reduces input lag by a few milliseconds
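Back-of-the-envelope numbers for what "a few milliseconds" means, assuming latency scales with a single frame's render time (real input lag chains involve more than just that):

```python
def frame_time_ms(fps):
    """Time spent on a single frame at a given frame rate."""
    return 1000.0 / fps

for fps in (30, 60, 144, 300):
    print(f"{fps:>4} fps -> {frame_time_ms(fps):5.1f} ms per frame")

# 30 fps -> 33.3 ms, 60 fps -> 16.7 ms, 144 fps -> 6.9 ms, 300 fps -> 3.3 ms:
# pushing a source game from 60 to 300 fps shaves roughly 13 ms off one frame of delay.
```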