To make in-game spending seem less real, many game developers opt for a virtual currency in their games. This makes consumers feel like they are spending less than they actually are, and with often deliberately confusing exchange rates it can be difficult to pin down exactly how much you're spending in a single transaction. Some games even have multiple types of currency to spend in different ways.
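The obscuring effect of those exchange rates can be made concrete with a little arithmetic. The sketch below uses entirely hypothetical bundle prices and gem amounts (not figures from any real game) to show how the same virtual-currency purchase maps to different real-money costs depending on which bundle funded it:

```python
# Illustrative only: hypothetical bundle prices, not taken from any real game.
bundles = {
    # bundle name: (price in USD, gems received)
    "starter": (4.99, 500),
    "popular": (19.99, 2400),
    "best_value": (99.99, 14000),
}

def cost_per_gem(price_usd, gems):
    """Real-money cost of a single gem for a given bundle."""
    return price_usd / gems

def real_cost(item_gems, bundle):
    """Approximate real-money cost of an item priced in gems,
    assuming the gems came from the named bundle."""
    price_usd, gems = bundles[bundle]
    return item_gems * cost_per_gem(price_usd, gems)

# The same 1,500-gem item has a different real price depending on
# which bundle the gems came from -- the ambiguity the exchange
# rate creates for the consumer.
for name in bundles:
    print(f"{name}: ${real_cost(1500, name):.2f}")
```

Because the consumer rarely does this calculation at purchase time, the gem price reads as a small, abstract number rather than the dollar figure it actually represents.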
Free-to-play games rely on this in-game currency to fund the game and its servers and updates, while AAA games often use the model to make a title that would normally only be populated for a year generate much more revenue, allowing bigger teams to keep patching and updating the game.
The argument over whether this is good or bad is often about how games used to be. Nowadays sequels are few and far between, usually released years after they were expected. Video game releases are all about marketing; it's no longer beneficial for a developer to take risks. Just like in the film industry, a title needs guaranteed sales and players for the next three to five years or it's not a viable option to develop. This means much more time and detail is put into the big releases, while games that don't follow this model are left to the indie developers.
Applying Loss Aversion and the Sunk-Cost Effect to Games
Good insight on the evolution and development of currencies
https://nativex.com/blog/free-to-play-monetization-a-lesson-on-virtual-currency/
Practical application of currency best practice