It all came down to the quarter, [video game expert Mark] Cerny said. Arcade games had to squeeze enough money out of people to be worthwhile for arcade owners and game makers. But they also had to deliver enough play time to make it worthwhile for gamers to drop in their money. ... What remains today, Cerny says, is Japan as the single shelter for arcade gaming. And that boils down to their 100-yen coin.
Of course, not everyone agrees, pointing to consoles, the internet, the existence of the loonie in Canada (and yet, no more arcades), and LAN parties.
I was a huge video game fanatic from the late Atari era through the early N64 years, and I have my own theory to add to the mix: console/PC technology eventually allowed home games to do things you can't do in the arcade model, and those new, irreplicable features became ubiquitous in video games.
Consider, for instance, the first Nintendo console, the NES. When it released with Super Mario Bros., there was also an arcade version of that game with slightly better graphics, just as there had been with Galaga and Pac-Man in the Atari era. Same thing with Excitebike and other sports games. I suspect this was an intentional arrangement so video game makers could keep both revenue streams going, and it was the model for a long time.
Then Nintendo released The Legend of Zelda and Metroid. These games were distinct from their predecessors in that they used methods of saving and recalling progress to let a player navigate a huge map incrementally over time. These two games felt bigger and more immersive than anything that had yet come out, and they became two of the best-selling, most popular franchises in the history of video games.
And the arcade was utterly locked out.
You may remember, then, that the arcade adapted, focusing on the things it could offer, specifically having lots of people in one place and having the space and equipment to house elaborate setups. Thus, arcades around the country stocked up on competitive fighting games (Street Fighter II and Mortal Kombat), four-person cooperative short adventure games (The Simpsons, X-Men), and simulators (Cruis'n USA). Meanwhile, they ceded the market for long-form adventure games and the nascent role-playing genre (Final Fantasy) to consoles, and the strategy genre (Command & Conquer) to PCs. From my perspective, it looked like arcades had survived and were doing fine.
Then 64-bit systems started coming out, and it became apparent that the gap in graphics quality between arcade games and home games had disappeared, and may even have inverted. Gran Turismo was probably a more beautiful game than any arcade racing game. GoldenEye could match any arcade shooter.
Then came the advent of internet gaming. People could log into services like Blizzard's Battle.net on their computers, and then even on consoles, to play with or against other people. This new model meant that not only could people play four players splitting one screen like at the arcade, but one could play with any number of other players online, each having their own full screen. Competitive gamers found they could get more satisfaction from online first-person shooters like Halo than from Tekken-style combat games, and the people who liked four-player cooperative play really liked the new Massively Multiplayer Online RPGs like World of Warcraft, which supports five-, 10-, 25-, and even 40-person cooperative play.
Thus gamers, the kind of people who would go somewhere to blow $10 in a sitting over a couple of hours on video games, got hooked on game styles you can't replicate in an arcade setting. Look at the games people are playing now: WoW and Rift and Call of Duty: Black Ops and Killzone and Dragon Age 2 and Just Dance 2. Only the last one could feasibly be rendered in an arcade setting, and you'd have to use a mat rather than the Wii controller. Also, you'd have to dance in public.
Alternatively, one could point to Angry Birds and Plants vs. Zombies, but you can buy those games for less than $5 on your smartphone. Who would pay to play them in an arcade?