"Nearly all multiplayer games rely on some kind of market to give players incentives, even if the currency is just points. But when designers layer real-money markets on top of that, the results can be hard to predict. As Varoufakis told us, 'once we start trading items or assets with each other, there is the possibility of arbitrage, there are bargaining instances, there's even room for futures markets. Suddenly the game itself becomes immaterial.' You may be trying to level up your World of Warcraft character, but the girl next door is stockpiling entry-level armor and selling it off for cash.
"It's particularly challenging because gamers have so much practice exploiting these systems, even when they're just accumulating 1UPs and experience points. World of Warcraft has famously struggled with gold farmers for years, and some estimate they've taken nearly 12 million gold units out of the game economy, walking the line between stat-pumping and fraud. Just last week, the space-themed MMORPG EVE Online saw a group of players game its newly implemented loyalty system for $175,000 worth of in-game currency. If there's a weak point anywhere in a game, users will find it, and every time the designers add a new asset class, a new set of markets springs up with a new set of loopholes."
Winchell Chung originally shared this post:
It's an ongoing story, but one thing is already clear: they would have been better off with an economist on board.
A surprising number of companies already have one. In-game markets are an increasingly popular (and lucrative) part of the industry, especially with the rise of free-to-play games like Valve's Team Fortress 2, which rely entirely on in-game transactions to pay the bills. But if a company can't keep those markets running smoothly, its product will have a hard time breaking even.
As a result, macroeconomists have never been more in demand.
The game industry is hiring a new class of central bankers — but not in time to save Diablo III.