Post
Showing different versions of a game to different players to see which makes more money.
A/B testing in games means serving different experiences to different player segments and measuring which version performs better on key metrics like retention, spending, or session length. A studio might test two different tutorial flows, two different pricing structures for an in-app purchase, or two different difficulty curves. The data decides the winner, not designer intuition. Mobile gaming pioneered this approach, and it has become standard in any game with live operations. The tension is between optimizing for metrics and optimizing for player experience, because those goals do not always align.
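The mechanics can be sketched in a few lines. This is a minimal illustration, not any studio's real pipeline: the helper names (`assign_variant`, `conversion_rate`) and the numbers are hypothetical. Players are hashed into stable buckets so each one always sees the same variant of a given experiment, and a key metric such as purchase conversion is then compared per bucket.

```python
import hashlib

def assign_variant(player_id: str, experiment: str, variants=("A", "B")) -> str:
    """Deterministically bucket a player into a variant.

    Hashing player_id together with the experiment name means a player
    always sees the same version of this experiment, while different
    experiments split the population independently of each other.
    """
    digest = hashlib.sha256(f"{experiment}:{player_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

def conversion_rate(converted: int, exposed: int) -> float:
    """Fraction of exposed players who hit the target metric (e.g. bought the offer)."""
    return converted / exposed if exposed else 0.0

# Hypothetical results from a tutorial-flow test: variant A converted
# 120 of 10,000 players, variant B converted 150 of 10,000.
rate_a = conversion_rate(120, 10_000)
rate_b = conversion_rate(150, 10_000)
winner = "B" if rate_b > rate_a else "A"
```

In practice a studio would also run a significance test before declaring a winner, since a gap this small can easily be noise at low sample sizes.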
Example
Supercell is famous for testing everything in Clash Royale, from card balance to shop offers, across player segments before rolling changes out globally. King runs hundreds of A/B tests simultaneously in Candy Crush Saga, optimizing everything from level difficulty to the timing of purchase prompts.
Why it matters
A/B testing is why two players can have meaningfully different experiences in the same game. It also explains why some monetization feels perfectly calibrated to your weak points: it literally was calibrated, through testing on millions of players. Understanding this practice reveals just how data-driven modern game design has become.
Related concepts