Why don't we have motion smoothing on current consoles?
Probably a dumb question but:
With the upcoming GTA 6, which people speculate might only run at 30 FPS, I was wondering why there isn't some setting on current-gen consoles to turn on motion smoothing.
For example my 10 year old TV has a setting for motion smoothing that works perfectly fine even though it probably has less performance than someone's toaster.
Something like this is already being integrated for NVIDIA and AMD cards, e.g. DLSS Frame Generation and AMD Fluid Motion Frames, but those are only compatible with a limited set of games.
But I wonder why this can't be globally integrated into modern tech, so that we don't have to play anything under 60 FPS anymore in 2025. I honestly couldn't play at 30 FPS; it's so straining and hard to see things properly.
Higher framerates don't just improve the experience by looking better; they also make the game feel more responsive, because your input is reflected in-game a fraction of a second sooner.
Increasing the framerate while incurring higher latency might look nicer to an onlooker, but it generally feels a lot worse to actually play.
It might be fine for non-interactive stuff where you can get all the frames in advance, like cutscenes. For anything interactive though, it just increases latency while adding imprecise partial frames.
And that's ignoring the extra processing time of the interpolation itself and the asynchronous workload. The combined delay can get long enough that if you wiggle your joystick 15 times per second, the image on screen can end up moving in the opposite direction to your input.
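To make the joystick claim concrete (numbers are illustrative, not measured from any real interpolator): at 30 FPS one frame lasts 1/30 s, which is exactly half the period of a 15 Hz wiggle, so a single frame of added delay shifts the displayed motion half a cycle out of phase, i.e. opposite to your input:

```python
import math

# One frame of delay at 30 FPS vs. a 15 Hz joystick oscillation.
wiggle_hz = 15
delay_s = 1 / 30  # one frame of added latency at 30 FPS

# Phase shift caused by the delay, in radians.
phase_shift = 2 * math.pi * wiggle_hz * delay_s
print(phase_shift / math.pi)  # 1.0, i.e. exactly half a cycle (180 degrees)

# Sample the stick position and the (delayed) on-screen position.
t = 0.01
stick = math.sin(2 * math.pi * wiggle_hz * t)
shown = math.sin(2 * math.pi * wiggle_hz * (t - delay_s))
print(stick, shown)  # opposite signs: the image moves against the input
```

With more than one frame of buffering plus interpolation compute time, you cross that half-cycle threshold at even slower wiggles.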
Input latency, for one: the interpolator has to wait for the next real frame before it can generate the in-between frame, so every real frame is held back and displayed later than it otherwise would be.
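A rough sketch of the cost (all numbers made up; this assumes the simplest scheme, where one generated frame is inserted between each pair of real frames and the real frame is pushed back by half an output interval):

```python
# Minimum added display latency from frame interpolation, under the
# assumption that the generated frame occupies half an output interval
# before the delayed real frame finally appears on screen.

def added_latency_ms(base_fps, interp_cost_ms=0.0):
    frame_time = 1000.0 / base_fps
    # The real frame shows half an output interval later than it could
    # have, plus whatever the interpolation pass itself costs.
    return frame_time / 2 + interp_cost_ms

print(added_latency_ms(30))       # ~16.7 ms extra at a 30 FPS base rate
print(added_latency_ms(30, 5.0))  # ~21.7 ms with a made-up 5 ms interp cost
```

That's on top of the game's existing input-to-photon latency, which at 30 FPS is already high.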
And image quality. The generated frame is, as I said, interpolated. Whether it's produced by a classical algorithm or by machine learning, it's not even close to being accurate (at this point in time).
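A toy illustration of why naive interpolation is inaccurate (real interpolators estimate motion vectors rather than just blending pixels, but they can still guess wrong): blend two frames per pixel and a moving object becomes two ghosts instead of one object halfway along its path.

```python
# 1-D "frames": a bright object (value 1.0) sits at pixel 2 in frame A
# and has moved to pixel 6 in frame B.
frame_a = [0.0] * 9
frame_a[2] = 1.0
frame_b = [0.0] * 9
frame_b[6] = 1.0

# Naive per-pixel average, with no motion estimation.
blended = [(a + b) / 2 for a, b in zip(frame_a, frame_b)]
print(blended)
# The correct in-between frame would show the object at pixel 4.
# Instead you get two half-brightness ghosts at pixels 2 and 6:
# the classic "ghosting" artifact of frame blending.
```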