There has been a notable increase in studios in the early stages of development self-branding their upcoming games as AAA
While there is no official definition of what makes a game “AAA”, the designation has historically been reserved for multi-year projects with industry-leading budgets ($80m+), sizable development teams, and significant gameplay depth. In short, AAA is an assumed stamp of quality granted based on the resources that go into developing and marketing a game, not the success it ultimately achieves in the market. These games are generally created by the industry’s most prestigious studios and are highly anticipated each year.
Contrary to the very nature of this term, there has been a notable increase in the number of studios that are still in the early stages of development yet are self-branding their upcoming games as AAA. Although it may seem trivial to discuss a relatively arbitrary label for pre-release content, this trend has a few meaningful implications for the next generation of video games.
Origins of the Term: AAA
There is not necessarily one universally agreed upon origin for the designation of AAA games, but it is often seen as a reference to bond ratings. Just as AAA bonds are the most creditworthy debt instruments, video games bearing this classification are believed to have the highest likelihood of commercial success. It should be noted that “commercial success” is specifically related to sales, not how the game is received by the community.
Arguably the first AAA game was Final Fantasy VII (FF7), released by Square in 1997. Rather than focusing on costs, the company pursued an ambitious strategic plan that prioritized quality. For the seventh iteration of the already popular Final Fantasy series, this meant quadrupling the team to 150+ employees. The game’s development costs totaled ~$40m, with another ~$40m going to global marketing (collectively ~$150m in today’s dollars) (Eurogamer).
FF7 is a prime example of how large content budgets enabled the creation of new technology that pushed the industry forward. The evolution of video games was not only about creating compelling storylines and gameplay, but also about improving the quality of visuals (3D graphics) and innovating the way narratives are delivered (with extensive computer-generated imagery cutscenes, for example). Technological advancements served as a competitive edge and helped create novel experiences. In Square’s case, approximately 25% ($10m) of the FF7 development budget was allocated solely to computer graphics (Polygon).
This was a time when game development infrastructure was relatively fragmented and required significant customization to push the capabilities of games forward. Developers would pour time and resources into custom game engines that provided a technical advantage, such as id Software’s id Tech engines and Epic’s originally proprietary Unreal Engine.
Although a number of studios and publishers still opt to use their own proprietary tech stack, the rise of advanced publicly available game engines such as Unity and Unreal has resulted in broader access to powerful development tools over the last decade. As these engines continue gaining market adoption, indie developers that used to be at a fundamental technical disadvantage are now often working with the same (or comparable) toolkits leveraged by productions with substantial budgets.
The Problem
The democratization of high-fidelity graphics and performant development tools is a positive for the gaming industry. Smaller teams are able to punch above their historical weight class and deliver content that can look and feel like something built by a team many times their size. This is increasingly leading indie developers and startups to declare that their games are AAA. The claim is particularly prevalent in (but not limited to) blockchain gaming, where it is used in an effort to bolster credibility.
Historically, though, AAA has not been an assessment of quality or success, but rather a measure of capabilities determined by engineering resources, marketing budgets, distribution capabilities, and IP ownership / licensing. One of the most notable recent examples of this was Cyberpunk 2077, which was estimated to cost $330m while still releasing with a number of major bugs and performance issues (NY Times).
In an increasingly crowded market for content, it is understandable why teams want to associate themselves with AAA titles. Connotations of the label include broad market appeal, captive audiences, and long-term revenue streams; but modern AAA titles also often avoid taking big risks, fail to meaningfully innovate, and pursue incremental progress rather than category defining ambition as the cost of failure is simply too high.
This mindset is dangerous and antithetical to the mission of most startups and emerging studios, and it has the potential to stifle creativity and growth. We see two issues arising: the label misrepresents the risk profile of pre-release content, and it pressures emerging studios toward the same risk-averse, incremental approach that increasingly defines incumbent AAA production.
Our point is not that emerging studios should avoid competing with the industry’s largest players. We believe leading games in the most popular genres can be created by up-and-coming studios, and one of our core goals is to continue funding the platforms, infrastructure, and technology that make high-quality development more attainable. Rather, the oversaturated use of the AAA label misrepresents the risk profile associated with most content.
Returning to the bond comparison, relabeling a debt instrument’s rating does not change the fundamental value or volatility of the underlying asset. It only changes investors’ expectations of its risk and return.
Takeaway: The concept of AAA gaming has been around since the 1990s and has generally been applied to content with the most resources. However, as advanced development technology becomes more widely available, we are increasingly seeing “AAA quality” become a commoditized term for any game with good graphics. We believe this is counter-productive to content creation and misrepresents the risk associated with most games.