For me, and I’m sure for many other PC gamers, turning off motion blur is a first-launch ritual. It’s the most dreaded post-processing effect in videogames. So why do so many games include it? And why is it enabled by default?
It’s surprisingly hard to get game developers to tell you: none of the studios I asked responded. Perhaps they don’t feel it’s worth their time to comment on individual effects, or perhaps they’re in the pocket of Big Blur. After a few months of searching for answers on my own, here’s what I learned:
- There are good reasons for motion blur to exist in videogames
- There are also good reasons to hate it
- It will probably become less common as framerates and refresh rates increase
What is motion blur?
In photography and film, motion blur exists because cameras can’t capture images instantaneously. Film is exposed for a period of time, say 1/48th of a second, and if an object moves across the frame during the exposure, it appears blurred in its direction of travel. If the camera itself moves, the whole scene blurs.
That’s not ideal if the goal is to clearly show fast-moving action, but at standard frame rates video “needs a certain amount of blur” to look natural, according to a summary from Magix Software, maker of the Vegas editing software.
That’s especially true for films shot at 24 frames per second. A baseball flying through a shot might only appear in a handful of frames, and we wouldn’t want each of those frames to contain a crisp, perfectly round white ball in a new location: it would strobe unnaturally across the screen. Instead we see an oval blur, which is all the information the film recorded about the ball over each exposure.
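To put rough numbers on that (my own back-of-the-envelope figures, not anything from the film or the sources above): a fastball travels on the order of 40 metres per second, so during a 1/48th-of-a-second exposure it covers most of a metre, a noticeable fraction of a typical frame’s width.

```python
# Hypothetical back-of-the-envelope numbers, for illustration only.
ball_speed  = 40.0      # m/s, roughly a 90 mph fastball
exposure    = 1.0 / 48  # seconds each frame is exposed (24 fps, 180-degree shutter)
frame_width = 4.0       # metres of scene visible across the shot (assumed)

streak = ball_speed * exposure
print(f"streak length: {streak:.2f} m")                         # ~0.83 m
print(f"share of the frame width: {streak / frame_width:.0%}")  # ~21%
```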
The scene above from King Kong (1933) is a good example. As computer graphics researcher Dr Andrew Glassner pointed out in a 1999 article, Kong’s motion looks jerky because, unlike the rest of the movie, he is animated with a series of still photographs that contain no motion blur. Stop-motion animators today add motion blur when they want a natural look.
Why do games use motion blur?
Multisampling, temporal anti-aliasing, and other techniques can help eliminate visual anomalies in games and produce more natural-looking movement, but they aren’t always enough, especially when framerates are low. My sympathies to anyone playing Redfall on a console: 30 fps is rough. Quickly rotating the camera turns the world into an eerie slideshow.
Enter motion blur, a cure that many PC gamers consider worse than the disease. Developers have recommended it for years: in an article in the 2007 book GPU Gems 3, Gilberto Rosado of Rainbow Studios (makers of the MX vs. ATV games) wrote that motion blur effects can help create a “realistic” sense of speed, and can also help “smooth out a game’s appearance,” particularly for games that render at less than 30 frames per second.
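The basic recipe is a post-process: for each pixel, work out how far it moved since the previous frame, then average colour samples along that path so fast-moving pixels get smeared in their direction of travel. Here is a toy CPU-side sketch of that sampling step, my own illustration rather than Rosado’s code; real engines do this per pixel in a fragment shader.

```python
import numpy as np

def blur_pixel(frame, x, y, velocity, samples=8):
    """Average colours along a pixel's screen-space motion vector.

    A toy, CPU-side sketch of the idea behind post-process motion blur: real
    engines do this in a fragment shader and get `velocity` from a velocity
    buffer, or reconstruct it from the depth buffer and the previous frame's
    camera matrix. Here `frame` is just an (H, W, 3) array and `velocity` is
    how far this pixel moved since the last frame, in pixels.
    """
    h, w, _ = frame.shape
    vx, vy = velocity
    colour = np.zeros(3)
    for i in range(samples):
        t = i / max(samples - 1, 1)              # step from 0.0 to 1.0 along the motion path
        sx = int(np.clip(x - vx * t, 0, w - 1))  # walk back along the direction of travel
        sy = int(np.clip(y - vy * t, 0, h - 1))
        colour += frame[sy, sx]
    return colour / samples                      # the streaked, averaged colour
```

Pixels that moved a lot get averaged with their neighbours along the motion path, which is exactly the smear you see when you whip the camera around with motion blur enabled; pixels that barely moved stay sharp.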
In a 2013 study, researchers from MIT and Disney likewise stated that motion blur effects can be useful for “reducing artifacts” and “achieving a more realistic look” in videogames.
After some testing in Half-Life 2, I largely agree that motion blur can “smooth out” low framerates. Whipping the camera around in Half-Life 2 looks bad even at 60 frames per second. With motion blur on, however, it appears smoother in the sense that it looks more like what I’d expect to see if an actual camera were being rotated: a smear, rather than jittery leaps between frames.
I still wouldn’t actually play Half-Life 2 with motion blur on, though.
Why gamers hate motion blur
I did finally get one developer’s perspective on motion blur: Adam Sanders, an indie game developer whose studio, Red Slate Games, is working on an unannounced project. He offers a simple explanation for motion blur’s low standing among gamers.
In movies, motion blur can be used to express speed and to isolate the subject of a scene by blurring its surroundings, roughly simulating an eye’s focus. But in videogames, where players typically control the camera themselves, Sanders thinks motion blur tends to do the opposite: instead of making a subject clearer, it’s likely to “obscure information players are actively attempting to perceive.”
“It comes down fundamentally to the different needs of the media,” Sanders says. “Films should use clever tricks to focus the attention of the audience on specific areas in the frame. Post-processing effects such as motion blur can interfere with the ability of players to frame their own scenes in games.”
Motion blur is most problematic when it’s triggered by the player’s own camera movement, as in the Half-Life 2 example above. The blur does smooth out camera rotation, but it also swallows the enemies I’m trying to center in my view. In games with egregiously heavy motion blur, the screen turns to mud the moment you try to get oriented.
Sanders points out that players don’t tend to complain about localized motion blur, like the streak of a sword swipe in an attack animation. He says this is a sign that motion blur is being misapplied when it’s slapped across the whole screen as a shader.
It also ruins any cool screenshots you take, which is one more strike against it.
Should you ever turn on motion blur?
If PC gamers were required to swear an oath to join the hobby, renouncing motion blur would probably be part of it. But honestly, as much as I dislike heavy motion blur, I don’t always notice it when it’s subtle, at least not until I look at my screenshots folder and discover that they all suck.
In racing games, I don’t mind if things get streaky when I hit the nitro, as long as the track is clear and I can still see other cars. That said, the MIT and Disney study mentioned above concluded that motion blur did not “significantly improve the player experience” for the racing game they showed their test subjects. So if you ever get into a debate about motion blur, you can now point to legitimate academic research suggesting that people just don’t like it.
Whether emulating the look of video is a valid use of motion blur is also a matter of taste. Since the “real” things we see on screens are shot with cameras (news footage, for example), mimicking camera effects can create the impression that we’re looking at real life. A recent example is Unrecord, an unsettlingly real-looking Unreal Engine 5 game that simulates a bodycam perspective. Motion blur is a key part of that illusion, along with other video effects like lens distortion and bloom.
In rare cases, like Unrecord’s, I think motion blur can contribute to an overall effect that outweighs the loss of clarity. But when the goal is simply natural-looking motion, motion blur becomes less and less necessary as framerates and refresh rates climb. Spinning around in Half-Life 2 at 60 fps looks a little ugly without it, but at 144 fps on my 144Hz monitor it looks much better, and I’d expect it to look better still at even higher refresh rates. The more frames there are in a second of video, the less blurring is needed to convey generalized information about motion.
And you should ignore anyone who claims we can’t “perceive” anything beyond 60 Hz. We’ve talked to experts about that, and it’s nonsense: one psychologist suggested that 200 Hz could be a good target if we want video motion to feel close to the way we perceive motion in real life. Researchers were making a similar point a decade ago, finding that the reduction in motion blur makes 100 Hz video look noticeably different, and better, than 60 Hz video.
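To make that concrete with some numbers of my own (nothing from the studies above): consider how far an object that crosses the screen in one second jumps between consecutive frames at different framerates.

```python
# Hypothetical numbers illustrating why higher refresh rates need less blur:
# an object crossing a 1920-pixel-wide screen in one second jumps this far per frame.
screen_width_px = 1920
seconds_to_cross = 1.0

for fps in (30, 60, 144, 240):
    step = screen_width_px / (seconds_to_cross * fps)  # pixels of movement per frame
    print(f"{fps:>3} fps: ~{step:.0f} px between consecutive frames")
# 30 fps -> ~64 px leaps, 144 fps -> ~13 px, 240 fps -> ~8 px
```

At 30 fps each frame is a 64-pixel leap, which reads as strobing unless something smears it; at higher framerates the gaps shrink to the point where little or no added blur is needed.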
As long as 30 and 60 fps remain acceptable framerates for videogames, though, we’ll have to keep going into the settings to turn off motion blur. The good news is that although it’s almost always on by default for some reason, it’s also one of those post-processing effects that can almost always be switched off. Maybe Big Blur hasn’t gotten to everyone just yet.