Despite the many faults in gameplay, EA's Battlefront is arguably the most photorealistic game ever made (yes, more photorealistic than Crysis with mods even).
This got me wondering though: what's left for true photorealism in games? How much further can graphics improve from Battlefront?
>what's left for true photorealism
A fucking lot.
>Real time Global Illumination
>Ray-tracing for all objects at 144Hz
>Full object destructibility/deformation for all geometry within render space
>maximum LoD at all distances
>per-object physics simulation
>Beaufort wave simulation for all fluid dynamics systems
>volumetric particles for all particle-type objects: foam, smoke, clouds, fire, etc.
and the list goes on. Battlefront looks great in specific circumstances, but it's nowhere close to being photorealistic--and consoles will not achieve any of the above listed for at least another 25-30 years. PCs will achieve it within the next 20-25 years.
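On the "maximum LoD at all distances" point: a minimal sketch of the distance-based LoD selection that engines do today, which that bullet is asking to eliminate. The thresholds here are made up purely for illustration, not from any real engine.

```python
# Sketch of typical distance-based LoD selection. "Maximum LoD at all
# distances" means skipping this step entirely and always rendering lod 0.
# Threshold distances below are invented for illustration.

def select_lod(distance_m, thresholds=(10.0, 40.0, 120.0)):
    """Return the LoD index (0 = full detail) for an object at this distance."""
    for lod, limit in enumerate(thresholds):
        if distance_m < limit:
            return lod
    return len(thresholds)  # lowest-detail mesh beyond the last threshold

print(select_lod(5.0))    # close object -> full detail (lod 0)
print(select_lod(500.0))  # distant object -> lowest detail
```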
game physics is still extremely hard to implement both accurately and fast. Physics covers everything from animation and water simulation to dynamic (e.g. destructible) environments.
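A toy illustration of that accuracy-vs-speed tension: integrating a stiff spring with plain explicit Euler gains energy and blows up at coarse game-sized timesteps, which is one reason physics engines favor semi-implicit (symplectic) Euler. The spring constant and timestep below are arbitrary; this is a sketch, not any engine's actual integrator.

```python
# Stiff spring (omega = sqrt(k) = 10 rad/s) stepped at a coarse dt.
# Explicit Euler multiplies the oscillator's energy by (1 + k*dt^2)
# every step, so it diverges; semi-implicit Euler stays bounded.

def step_explicit(x, v, k, dt):
    return x + v * dt, v - k * x * dt

def step_semi_implicit(x, v, k, dt):
    v = v - k * x * dt       # update velocity first...
    return x + v * dt, v     # ...then position with the NEW velocity

k, dt = 100.0, 0.05          # stiff spring, coarse physics step
xe, ve = 1.0, 0.0            # explicit Euler state
xs, vs = 1.0, 0.0            # semi-implicit Euler state
for _ in range(200):
    xe, ve = step_explicit(xe, ve, k, dt)
    xs, vs = step_semi_implicit(xs, vs, k, dt)

print(ve**2 + k * xe**2 > 1e12)  # True: explicit Euler's energy exploded
print(vs**2 + k * xs**2 < 1e4)   # True: semi-implicit stayed bounded
```

Doing this stably, at scale, for thousands of interacting bodies per frame is where the cost comes from.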
Go watch any SIGGRAPH video from the past few years. Making something look good in a screenshot is a far cry from it actually being photorealistic in motion.
still a lot of work to do, but the biggest thing is going to be replicating that 'haze' properly. it looks dumb even in the screenshot, and it's looked dumb in every game so far
texture detail at a distance as well, probably. even in the provided screenshot the trees further to the right look flat and bland
graphics can improve, but it's just going to take a very long time. i mean, the jumps between console generations used to be ridiculous, until last gen and current gen when the improvement was very minimal. but it's been like that in everything
until stuff like 4k is cost efficient and hardware can run it easily there won't be much improvement in much of anything desu
and considering 4k is an absolute struggle to get anywhere near 60 fps even with multiple video cards and extremely powerful cpus, the next graphics 'jump' is probably maybe 10 years away? could be even more pessimistic and say 20-25 like the first anon said
there just haven't been many processor/graphics leaps for quite a while. right now it seems they're focused on size and power efficiency, which usually comes at the cost of performance early on
>and considering 4k is an absolute struggle to get anywhere near 60 fps even with multiple video cards and extremely powerful cpus, the next graphics 'jump' is probably maybe 10 years away? could be even more pessimistic and say 20-25 like the first anon said
why are you so retarded? a single gpu can play BF at 4k 60fps, let alone multiple.
you can play 99% of games at 11520x2160 with playable fps -- yeah, 3x 4k.
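Quick sanity check on that resolution claim -- 11520x2160 really is exactly three 4K (3840x2160) panels side by side in raw pixel count:

```python
# Pixel-count arithmetic for triple-4K surround vs a single 4K panel.
uhd = 3840 * 2160        # one 4K panel
triple = 11520 * 2160    # three panels side by side
print(triple // uhd)     # 3 -- exactly triple the pixels per frame
print(triple)            # 24883200 pixels to shade every frame
```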
Yeah, but Battlefield is an example of a very well optimized game. You have to take into consideration all the games like Asscreed Unity/Syndicate that are horribly optimized and need four Titans to hit 4k/60fps.
Real-time rendering will always be behind pre-rendered graphics. The CGI industry has already shown that the vidya industry is nowhere near the peak of what's possible offline, so it's silly to think video games will reach perfect realism any time soon.
The thing is, it doesn't matter if there are new CGI advancements the industry could implement, because consumer hardware isn't focused on raw horsepower anymore. 4K and VR are examples of this.
Real-time rendering has been more advanced than CGI for years though.
When you have games with crazy real-time action and physics running at 60fps with visuals that rival CG films, how isn't that being ahead of CGI?
>textures at a distance
This can be rectified by Anisotropic Filtering. But no console developer will implement it above 2x because of the performance cost, which they'd rather spend pushing more effects. It's a matter of priority, not a lack of hardware.
>4K becomes cost efficient
4K is already cost efficient, but developer priorities are warped. Games should be made in the following priority order: 60fps minimum > gameplay (which includes animation/physics/effects) > story. Right now framerate is the lowest priority when making a game. If PS VR becomes a thing, if Microsoft HoloLens becomes a thing, and if SteamVR (Vive), OculusVR (Rift) and other solutions take off, 30fps will not be acceptable and 60fps will be barely tolerable. 90fps will be the new baseline and 120-144fps will be ideal.
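Worth spelling out what those framerate targets cost in frame time -- going from 30fps to the 90fps VR baseline cuts the per-frame budget to a third:

```python
# Per-frame time budget for each target framerate mentioned above.
budgets = {fps: 1000 / fps for fps in (30, 60, 90, 120, 144)}
for fps, ms in budgets.items():
    print(f"{fps:>3} fps -> {ms:.2f} ms per frame")
# 30fps leaves ~33ms per frame; 90fps leaves ~11ms; 144fps under 7ms.
```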
>4K60 is ten years away
No. 4K60 is 6-8 months out, tops. Next-gen GPUs are shrinking down to 16nm. Current-gen powerhouses like the FuryX and 980Ti are still sitting at 28nm. Shrinking from 28nm to 16nm cuts transistor area to roughly (16/28)^2, about a third of what it was, so by ideal scaling you can cram around 3x the transistors into the same space (real-world density gains are closer to 2x). That in turn translates to around a 30-40% increase in performance.
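The back-of-the-envelope behind that shrink claim, under the ideal assumption that transistor area scales with the square of the feature size (real 16nm FinFET vs 28nm planar processes fall short of this):

```python
# Ideal area scaling for a 28nm -> 16nm process shrink.
old_nm, new_nm = 28, 16
area_ratio = (new_nm / old_nm) ** 2     # fraction of the old transistor area
density_gain = (old_nm / new_nm) ** 2   # ~3.06x transistors per unit area, ideally
print(f"area shrinks to {area_ratio:.0%} of the original")
print(f"ideal density gain: {density_gain:.2f}x")
```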
The 980Ti can do 4K at 40fps fairly well at high settings in Crysis 3. The game is well optimized, but it's also very effects- and performance-heavy. A meager 30% improvement puts the new-gen GPUs at 52fps. Assuming a middle ground of 35%, you're looking at 54fps. Factoring in HBM2 next gen, 40% is more likely than not, which means 56fps.
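The projection above in one place -- note it assumes fps scales linearly with raw GPU performance, which is itself an approximation:

```python
# Project next-gen fps from the 980Ti's ~40fps at 4K in Crysis 3,
# scaling linearly by each assumed per-generation uplift.
base_fps = 40
projected = {u: base_fps * (1 + u) for u in (0.30, 0.35, 0.40)}
for uplift, fps in projected.items():
    print(f"+{uplift:.0%} -> {fps:.0f} fps")
# +30% -> 52 fps, +35% -> 54 fps, +40% -> 56 fps
```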
Of course this assumes linear scaling and disregards any architectural improvements and new rendering paradigms altogether. So 4K60 within 6 months is realistic, and 2560x1440 at 90fps is pretty much guaranteed (which is what the baseline for VR will be in the coming years).