October 26, 2023
Video gaming has changed a lot since the early days of coin-operated arcade machines in the 1970s. The industry moved first to home consoles that accepted cartridges in the early 1980s, then through the PC gaming revolution of the late '80s and '90s, and on to the fully mobile, digitally distributed games of the modern era; games are radically different today. When smartphones came on the scene, video games exploded in popularity: everyone with a smartphone can now play anywhere, anytime.
Video games used to be more monolithic in terms of how they were monetized. Beginning in the 1980s, the traditional path of monetization was that the game would be offered for sale (usually for around $60), and you’d play it until you were done. Games were a thing that you bought and used.
From there, we moved into the age of ads and microtransactions. This began in 2006, when Bethesda offered downloadable horse armor for The Elder Scrolls IV: Oblivion for $2.50. The armor provided no significant bonus in the game; it merely made your in-game horse look cool, and people paid for it in droves. That spawned the era of small in-game purchases.
Microtransactions are the small, in-game purchases that typically come in the form of downloadable content (DLC). DLC can take many forms, from new clothing, skins or armor for your game character to other cosmetic or more substantive enhancements.
Fast forward to today, and online and downloadable games have changed the model again. Games are beginning to look more like a service, where new offerings can be layered onto the basic product and sold incrementally.
The gaming industry is surprisingly large. In terms of annual revenue, gaming is bigger than the music and film industries combined.
Most gaming revenue is driven by mobile and casual games, and it keeps growing. Especially for free-to-play games, the game's launch is merely the beginning of its monetization lifecycle. Even free-to-play games can be big business if a large community of players grows and keeps playing and paying for new content year after year. Call of Duty Mobile, for instance, is free to play and monetized entirely by in-game purchases. Those tiny transactions have added up to more than a billion dollars since its launch in 2019.
Shorter games that take only an hour or two to finish may not feel worth the typical $60 price tag of a console game, especially compared to epic adventures that can take 50 hours or more, to say nothing of massively multiplayer online role-playing games (MMORPGs), which can theoretically go on indefinitely.
As a result, developers of shorter games have been painted into a corner: it's hard to justify a full retail price for a game below a certain length. These games have adopted different monetization strategies, and others have followed suit. Let's look at a few common ones.
A loot box is essentially a pack of in-game items that you can purchase inside a game. Typically, there are anywhere from three to six or more items in a box. Every aspect of the loot box user experience is designed to deliver the same dopamine reward cycle you get from activities like gambling. As a result, the first items you interact with are the more common ones, with the rarest revealed last.
For example, every loot box will contain some common items (most of which are not especially valuable), and around 90 percent of boxes will contain a rare item. Rarer “epic” items appear in roughly 15 to 20 percent of boxes, and “legendary” items (the most highly coveted) appear in fewer than 10 percent. There’s some obvious psychology at play here: since everyone wants legendary items, many people are willing to buy loot boxes for another chance to get one.
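The exact odds vary by title and are rarely disclosed in full, but the tiered-rarity draw is straightforward to sketch. The short Python example below is a rough illustration, not any real game's implementation: it treats each rarity tier as an independent weighted roll, using made-up probabilities loosely based on the rough figures above, and reveals items from most common to rarest.

```python
import random

# Illustrative drop rates loosely based on the rough figures above;
# real games set (and sometimes disclose) their own odds per title.
TIER_ODDS = [
    ("legendary", 0.05),  # fewer than 10 percent of boxes
    ("epic",      0.18),  # roughly 15 to 20 percent
    ("rare",      0.90),  # around 90 percent
]

def open_loot_box(num_common=3):
    """Simulate one loot box: a few common items plus any higher-tier drops."""
    items = ["common"] * num_common
    for tier, probability in TIER_ODDS:
        if random.random() < probability:
            items.append(tier)
    # Reveal common items first and the rarest last, mirroring the
    # suspense-building reveal order described above.
    rarity_order = {"common": 0, "rare": 1, "epic": 2, "legendary": 3}
    return sorted(items, key=lambda item: rarity_order[item])

if __name__ == "__main__":
    boxes = [open_loot_box() for _ in range(10_000)]
    legendary_rate = sum("legendary" in box for box in boxes) / len(boxes)
    print(f"Boxes containing a legendary item: {legendary_rate:.1%}")
```

Over many simulated boxes, the legendary rate converges on whatever small probability the designer chose; the rarity and the reveal order do the psychological work described above.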
Loot boxes are far more prevalent in mobile games than they are in console or PC games, which tend to monetize differently.
Some games offer in-game currency, such as Fortnite's V-bucks or Roblox's Robux, that you can purchase offline. This means a Fortnite player can buy a pre-loaded gift card at a brick-and-mortar store like GameStop, Target or Walmart that is redeemable for Fortnite's in-game currency.
In-game currency is purposely not valued one-to-one with real currencies, which makes it mentally harder to compare in-game costs with real costs. For instance, a card worth 1,000 Fortnite V-bucks costs around $7.99 at retail, while 5,000 V-bucks will cost you $31.99 (buying in bulk lowers the per-V-buck price). In Fortnite, V-bucks can be used for a variety of things, from cosmetic upgrades like wraps and outfits to other items like pickaxes, gliders or premium Battle Passes.
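To see how opaque the conversion can be, here is a quick back-of-the-envelope calculation in Python using the retail prices quoted above (prices vary by region and over time, and the 1,500 V-buck outfit is just a hypothetical item price).

```python
# Effective dollar cost per V-buck at the retail prices quoted above
# (illustrative figures; actual prices vary by region and change over time).
bundles = {1_000: 7.99, 5_000: 31.99}

for vbucks, price_usd in bundles.items():
    per_vbuck = price_usd / vbucks
    print(f"{vbucks:>5} V-bucks for ${price_usd:.2f} "
          f"-> ${per_vbuck:.4f} per V-buck")

# Real-money cost of a hypothetical 1,500 V-buck outfit at each bundle's rate
for vbucks, price_usd in bundles.items():
    print(f"1,500 V-buck outfit at the {vbucks}-pack rate: "
          f"${1_500 * price_usd / vbucks:.2f}")
```

At roughly eight-tenths of a cent per V-buck in the small bundle and about two-thirds of a cent in the large one, an in-game price tag of 1,500 V-bucks works out to somewhere between about $9.60 and $12.00 depending on which bundle the currency came from, a conversion few players do in their heads at the moment of purchase.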
All of these small purchases help gamers customize their experience, which tends to keep them playing because the game feels more satisfying and personalized. There's also a sunk cost effect at work: people are more likely to stick with a game they've already invested in through in-game purchases. With more than 350 million players worldwide and counting, Fortnite has captured the imagination of many, and they are willing to pay for customized experiences.
Subscription-based gaming has become a more popular monetization strategy in recent years. Instead of buying specific games, a game company offers access to a catalog of games, and gamers typically pay a monthly fee for unlimited access to that catalog. Popular examples include Microsoft’s Xbox Game Pass, Sony’s PlayStation Plus, and Apple Arcade.
There are numerous methods for monetizing games, and the three categories mentioned here are some of the most popular. Much evolution has happened in the past 50 years: we started by feeding quarters into the coin slot of a stand-up arcade cabinet and now have loot boxes and V-bucks.
As the industry continues to grow and diversify, it remains to be seen which methods will become the most popular. It's likely that more enhancements will follow in these major areas, such as social microtransactions that let friends trade loot box items with one another.
Even though console-based gaming is still very much part of this market, the gaming category has expanded to include other methods like mobile gaming, each of which has monetization strategies that suit it better than others. Watch this space: as the gaming industry continues to evolve and grow, we can expect it to spawn more monetization models as well.
Dr. Aldis Sipolins is our Senior Manager, Advanced Research & Development. He's a virtual reality (VR) and augmented reality (AR) researcher who applies principles of design and psychology to influence human behavior. Aldis earned a Ph.D. in Visual Cognition and Human Performance from the University of Illinois at Urbana-Champaign, studying brain imaging and stimulation. During that time, he founded a startup in San Francisco developing VR brain training games. He later joined IBM Research as Head of Virtual Reality and Game Design, where he worked on combining VR/AR, sensor data and machine learning techniques to enhance learning. After IBM Research, Aldis led VR/AR research and development at Draper Laboratories, a not-for-profit R&D firm. With over 10 years of experience conducting VR/AR research, Dr. Sipolins has identified brain imaging markers of memory encoding in VR, used eye tracking in VR to track attention and cognitive load, and even designed and tested AR interfaces for SOCOM Special Forces. His work at Adeia focuses on VR, AR and gaming.