Why didn't the arcade machines keep on "scaling" as the years went by?

Solution 1:

Why didn't arcade games keep scaling? Because the problems of making a game stand out scaled in a different direction than the benefits arcade cabinets provide.

Decades ago, as mentioned in the question, a game could stand out simply by having 3D graphics, or by doing other things that required hardware that could not practically be included in consumer home gaming devices. As technology improved, that became less and less the case, and now there is essentially nothing you can make a video game do that consumer hardware can't handle. Nicer hardware might get you more polygons, better lighting effects, or even better physics simulations, but you can't get the same kind of qualitative leap that 3D graphics were in a world of otherwise 2D games.

Once we reached that point, the main challenges of developing impressive new games shifted to things like innovating on mechanics, making game worlds bigger and deeper, and improving 3D model fidelity. These things do not necessarily demand more of the hardware, but they do require far more development resources.

Considering those shifts, it is often not worthwhile to make an arcade machine, in terms of both user experience and profit.


Let's say you wanted to try the "expensive hardware" strategy of building an arcade cabinet with today's technology. You get the best graphics money can buy: a 4K monitor driven by a pair of RTX 2080 Ti cards linked over NVLink. You get one of the most powerful processors on the market: an Intel Core i9-10980XE. You get far more memory than any game will use: 64 GB. And you fill it out with more storage than you could possibly fill with one game. This adds up to several thousand dollars worth of hardware, and you can do real-time ray tracing in 4K, which is pretty cool.
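
To put a rough number on "several thousand dollars," here's a back-of-the-envelope tally in Python. The part prices are loose circa-2020 assumptions for illustration only, not real quotes.

```python
# Back-of-the-envelope tally of the hypothetical cabinet build above.
# Prices are rough circa-2020 ballpark assumptions, not real quotes.
parts = {
    "2x RTX 2080 Ti": 2 * 1200,          # roughly $1,000-1,200 each at launch
    "Core i9-10980XE": 1000,             # roughly $1,000 for the HEDT CPU
    "64 GB RAM": 300,
    "4K monitor": 500,
    "Motherboard, PSU, SSD, case": 800,
}

total = sum(parts.values())
for name, price in parts.items():
    print(f"{name:>28}: ${price:,}")
print(f"{'Total':>28}: ${total:,}")      # several thousand dollars, as claimed
```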

Now you need to make a game that actually uses that hardware. You're going to need designers, modelers, artists, animators, programmers, and others to spend a lot of time making a game with the fidelity to take advantage of it. And in the end, this is all still commercially available gaming hardware, so you can provide a lower-resolution graphics option and the game will run on people's computers at home. And once you sell the game to people directly, most people probably won't consider it worthwhile to travel somewhere and pay more money to play the same game with marginally better graphics than they can get in the comfort of their own home. Plus, if people buy your game to play at home, you don't have to handle selling hardware.


I recently went to a Dave & Buster's, which is probably one of the best places to see modern arcade games. Almost every game there distinguished itself not by being more visually impressive, but by having an input device that you would rarely see on consumer gaming setups, like a steering wheel and pedals, or a gun, or a dance pad, or a single big button. For the most part, they were designed to be easy to understand, and quick enough to finish that other people could get a turn.

Solution 2:

The short answer is Moore's Law: the observation that transistor counts roughly double (and the cost per transistor roughly halves) every couple of years.
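
As a rough illustration of how quickly that compounding erases a fixed hardware lead, here's a small Python sketch. The two-year doubling period is the textbook idealization, and the 8x "arcade lead" in 1993 is a made-up starting gap, not a measured one.

```python
# Idealized Moore's Law: hardware capacity doubles every ~2 years.
# The "8x arcade lead" in 1993 is an invented number, purely for illustration.
DOUBLING_PERIOD_YEARS = 2.0

def growth(years_elapsed: float) -> float:
    """Consumer hardware capacity relative to its 1993 starting point."""
    return 2 ** (years_elapsed / DOUBLING_PERIOD_YEARS)

arcade_lead = 8.0  # pretend a 1993 arcade board is 8x a 1993 home console
for year in range(1993, 2001):
    gain = growth(year - 1993)
    note = "consoles have caught up" if gain >= arcade_lead else "arcade still ahead"
    print(f"{year}: consoles at {gain:4.1f}x their 1993 power -> {note}")
```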

In the 80s and early 90s, arcade games had hardware that was too expensive for home consoles. That changed in the mid-90s. Here are a few things 90s consoles started getting that used to be arcade-only:

  • The Super Nintendo got hardware scaling and rotation via Mode 7
  • Star Fox on the Super Nintendo shipped with the Super FX chip in its cartridge to render 3D polygon graphics
  • Saturn, PlayStation, and Nintendo 64 all added 3D graphics hardware
  • The Nintendo 64 introduced analog stick controls. This was different from previous console joysticks and D-pads, which were digital: each direction was either on or off.

Of course, Moore's Law didn't stop there and eventually consoles could basically do everything except what the most esoteric arcade games needed.

Solution 3:

I haven't got any proof, but here is my analysis.

  • Arcades were the bomb in the 80s because home versions didn't really exist.
  • Arcades started to dip in the 90s because, yes, home versions existed, but arcades usually still had an edge: better graphics, the possibility of 8 players, or extra peripherals like the House of the Dead light guns. But I do think they got hit hard by the fact that you had to pay a buck per life, whereas with the home product you paid up front and then had nothing extra to pay to play.
  • Arcades really took a hard hit in the 00s because the home product became on par with what you got at the arcade, so they lost one of the three advantages above (better graphics). Really, it felt like starting in the 2000s, the only big arcade video games were light gun shooters and DDR.
  • And then arcades just continued to fade in the 10s because of the lack of quality games, and probably also because those machines were costly to run and maintain.

But then, why did the big arcade video game producers of the 90s seem to just stop caring? As always, no proof, but here goes:

Creating an arcade game was not much more costly than creating a normal game. But the costs related to running and publishing the game were way higher than for an at-home video game.

For an at-home game, you pay maybe $30 to make the cartridge, sell it to a shop for $55, the shop sells it for $60, and then it's done. You make your profit on the sale.

For an arcade game, you have to sell a machine that is probably something like 3-4 grand, and then, as the publisher, you see no further revenue (unless there was a revenue split on every quarter dropped in, but it doesn't feel like there was). And when you create a new game, you have to sell a brand-new cabinet for another 3-4 grand. That's a much harder sale to make.
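
To make the comparison concrete, here's a minimal Python sketch using the same ballpark numbers. The $3,500 cabinet price and the no-revenue-share assumption are just the guesses from above, not real industry figures.

```python
# Publisher revenue per unit: home cartridge vs. arcade cabinet.
# All numbers are the rough guesses from the paragraphs above, not real data.
cartridge_wholesale = 55   # publisher sells each copy to a shop for ~$55
cartridge_cost = 30        # ~$30 to manufacture a cartridge
cabinet_price = 3500       # one-time cabinet sale, middle of the "3-4 grand" range

margin_per_cartridge = cartridge_wholesale - cartridge_cost
cartridges_per_cabinet = cabinet_price / margin_per_cartridge

print(f"Profit per cartridge: ${margin_per_cartridge}")
print(f"One cabinet sale is worth roughly {cartridges_per_cabinet:.0f} cartridge sales")
# ...but every additional home copy earns the publisher another $25, while the
# cabinet earns nothing more unless the quarters are revenue-shared.
```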

Also, while I was writing these lines, I came to a realisation. Maybe, since cabinets cost a couple of grand each, when an arcade invested in a few of them, it made sure to milk every penny it could out of them and keep the games on the floor for as long as possible. That might also be why the library of games at the local arcade seems to have stayed pretty much the same for 10 years.