Using AI or algorithms to generate entirely new frames between rendered ones, effectively doubling or tripling your frame rate.
Frame generation is the latest evolution in the quest for higher frame rates without a proportional GPU cost. NVIDIA's DLSS 3 Frame Generation and AMD's FSR 3 Fluid Motion Frames analyze two consecutively rendered frames and synthesize one or more new frames to insert between them. The GPU renders only half (or fewer) of the frames you actually see, with AI filling the gaps by tracking motion between frames and interpolating pixel data. The result can turn 60fps into 120fps or higher. The tradeoff is added input latency: the renderer must hold back the newest frame until the intermediate frame is ready, and the synthetic frames reflect past input rather than your current one. Fast-moving objects and rapid scene changes can also produce occasional visual artifacts.
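The core idea of interpolating between two rendered frames can be sketched in a few lines. This is a deliberately naive illustration, not how DLSS 3 or FSR 3 actually work: real frame generation relies on motion vectors and optical flow rather than a per-pixel blend, and all names below are hypothetical.

```python
def interpolate_frame(prev_frame, next_frame, t=0.5):
    """Synthesize an intermediate frame between two rendered frames.

    prev_frame / next_frame: flat lists of pixel intensities.
    t: temporal position of the generated frame (0.5 = midpoint).
    A naive linear blend; real implementations warp pixels along
    estimated motion vectors instead of blending in place.
    """
    return [(1 - t) * a + t * b for a, b in zip(prev_frame, next_frame)]

# Two consecutive rendered frames (tiny 4-pixel buffers for the demo).
frame_a = [0, 100, 200, 50]
frame_b = [20, 120, 180, 90]

# Inserting one synthetic frame between each rendered pair doubles
# the displayed frame rate while the GPU still renders only half.
mid = interpolate_frame(frame_a, frame_b)
print(mid)  # [10.0, 110.0, 190.0, 70.0]
```

Note why this adds latency: `frame_b` must already exist before `mid` can be computed, so the newest rendered frame is held back by one display interval before you see it.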
Example
Cyberpunk 2077 with DLSS 3 Frame Generation enabled can turn a GPU-crushing 30fps path-traced experience into a smooth 60fps or higher, making the game's most demanding visual mode actually playable. Alan Wake 2 leans heavily on frame generation to maintain performance alongside its demanding ray tracing. Microsoft Flight Simulator 2024 uses frame generation to smooth out CPU-limited scenarios where the GPU would otherwise sit idle. The technology has enabled 4K ray tracing at playable frame rates on hardware that otherwise could not handle it.
Why it matters
Frame generation is controversial because it fundamentally changes what 'frame rate' means. Purists argue that generated frames add latency and are not real performance. Pragmatists point out that the visual smoothness is real and the latency is manageable for most players. Either way, it represents a shift from brute-force rendering toward intelligent frame synthesis that will define the next era of GPU technology.
Related concepts