Extra Credits. September 2015.
In film, the standard frame rate is 24fps, but that is actually a slow rate: it creates effects like motion blur that our brains have been trained to recognize as cinematic. Video games aim for a minimum of 30fps, however, because interactivity means a lower frame rate can make the game feel laggy. While developers sometimes hit higher frame rates, we usually only hear 30fps and 60fps discussed because TV and computer monitors refresh at multiples of 30Hz, although we can turn off vertical sync (vsync) to get somewhat closer to the frame rates pumped out by our graphics cards. … developers have to trade off other aspects of the game's performance, and industry studies show that better graphics trump a better frame rate when it comes to sales.
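The vsync behavior described above can be sketched as a simplified model: with vsync on, the game must wait for the next display refresh, so on a 60Hz monitor a frame that misses the ~16.7ms budget slips to the next refresh and the effective rate drops to 30fps, then 20fps, and so on. The function names below are hypothetical, and the model ignores real-world complications like triple buffering and variable refresh rate.

```python
def frame_budget_ms(fps):
    # Time budget per frame, in milliseconds, for a target frame rate.
    return 1000.0 / fps

def vsynced_fps(render_ms, refresh_hz=60):
    # Simplified vsync model (hypothetical helper): the game runs at the
    # highest divisor of the refresh rate whose frame budget still fits
    # the time it takes to render one frame.
    divisor = 1
    while frame_budget_ms(refresh_hz / divisor) < render_ms:
        divisor += 1
    return refresh_hz / divisor

# A 10ms frame fits the 60Hz budget; a 20ms frame misses it and
# falls back to every second refresh, i.e. 30fps.
print(vsynced_fps(10))  # 60.0
print(vsynced_fps(20))  # 30.0
print(vsynced_fps(40))  # 20.0
```

This is why 30fps and 60fps dominate the discussion on 60Hz displays: with vsync on, the rates in between are simply not reachable.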
Refresh rate is a measure of how often the display hardware redraws the image on screen, and is typically measured in hertz (Hz).
While some displays support multiple refresh rates, the complexity of the game has no impact on the refresh rate.
Frame rate is a measure of how often the game can render a new image to send to the display, and is typically measured in frames per second (fps).
The complexity of your game, along with the computing power of the gaming hardware, influences the frame rate.
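The distinction above can be made concrete by averaging frame rate over a run of per-frame render times: the display's refresh rate stays fixed, but a heavier scene makes each frame take longer and the frame rate falls. This is a minimal sketch with made-up durations; the function name is an assumption, not from either source.

```python
def average_fps(frame_durations_s):
    # Average frame rate: frames rendered divided by total elapsed time.
    return len(frame_durations_s) / sum(frame_durations_s)

# A simple scene: every frame renders in 1/60 s, so the game hits 60fps.
simple_scene = [1 / 60] * 120
# A more complex scene on the same hardware: each frame takes twice as
# long to render, halving the frame rate even though the display's
# refresh rate is unchanged.
complex_scene = [1 / 30] * 120

print(average_fps(simple_scene))   # 60.0
print(average_fps(complex_scene))  # 30.0
```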
Introduction to Game Development
Michigan State University. Coursera. June 2016