Gaming In The 1980s

Abstract

Problem: Modern gamers (primarily aged 25–35) have no frame of reference for what gaming was like in the 1980s, making it harder to understand current industry trends.

Approach: Tim Cain draws on his personal experience — entering the game industry in 1981 at age 16 — to paint a picture of 1980s entertainment and gaming culture, comparing it to today.

Findings: Gaming in the 1980s was defined by scarcity, deep engagement, fragmented hardware, and a catastrophic industry crash in 1983. The shift to today's abundance of content has fundamentally changed player behavior, game design philosophy, and what it means to be a "gamer."

Key insight: The explosion of entertainment options has inverted the core problem — from scarcity (few games, deep engagement) to abundance (endless games, shallow commitment) — and this directly drives modern trends like shorter games, flatter learning curves, and high abandonment rates.

Source: https://www.youtube.com/watch?v=3KMUaVsp9Zs

1. The Entertainment Landscape of the 1980s

Entertainment in the early 1980s was deeply social and physical. Going to the movies meant driving to the theater, waiting in line, and sometimes discovering your showing was sold out. VHS tapes cost around $50 ($175+ in 2025 dollars), making rentals the only viable option for most families. Popular books had to be ordered in advance at physical stores. Everything involved going somewhere, usually with friends; entertainment was inherently communal.

This context matters because games existed within that same ecosystem. There were far fewer releases per year, spread across a fragmented landscape of competing platforms.

2. Hardware Fragmentation and Specialist Culture

PC gaming was a specialist hobby. When a new game came out, you might need to buy an expensive video card just to run it at all, not for better performance but for basic compatibility. You might have to assemble the entire computer yourself, and games would ask you about your specific chipset version and interrupt (IRQ) settings.

Tim's first computer, an Atari 800 purchased in 1979, would cost over $3,500 in today's dollars. His mother told him later it was "an investment in his future" — and it paid off, as his game industry job funded his college education.

Console gaming was simpler: cartridges meant instant play and near-zero bugs, since games couldn't be patched and had to pass rigorous certification. But this simplicity existed alongside an overwhelming number of competing consoles (the Atari 2600, ColecoVision, Nintendo Entertainment System, TurboGrafx-16, Magnavox Odyssey, and more), none of which ran the same software.

3. The 1983 Crash

In 1983, the American video game console industry collapsed. Virtually every US console developer lost their job. Tim witnessed this firsthand as he graduated high school and started college; he was spared only because he worked at a PC game company (Pegasus Software / Cyborg). The crash all but wiped out the console sector, and the US market did not recover until Nintendo launched the NES in 1985.

This wasn't like modern layoffs — it was a total extinction event for the console sector.

4. Teenagers as a New Demographic

The 1980s marked the first time entertainment was targeted squarely at teenagers. Previous generations (Boomers) had movies, books, and TV made for adults or children — not the in-between. Gen X teenagers got MTV, John Hughes films like The Breakfast Club, and games designed for them. This was unprecedented and created the foundation for the youth-targeted entertainment industry that exists today.

5. Deep Engagement vs. Modern Abundance

With fewer games available, players engaged deeply. Tim describes playing a single RPG for an entire summer, three or four months. Players read every line of in-game text, pored over thick manuals, and immersed themselves in lore without any internet to look things up.

Today, with roughly 70 games releasing on Steam per day and 25,000+ per year, the dynamic has inverted. Players buy games and never play them. They hop from title to title. Tim compares it to the early days of cable TV: "100 channels and nothing good on."

This shift from scarcity to abundance has produced measurable design consequences:

  • Flatter learning curves — Games must hook players within about 15 minutes or lose them to the next option
  • Shorter games — Developers can't assume 100–200 hour commitments anymore
  • High abandonment rates — Even beloved games like Baldur's Gate 3 are never finished by a majority of players
  • Discoverability problems — Good games exist for every taste, but finding them in the flood is paralyzing

6. The Redefinition of "Gamer"

What constitutes a "hardcore gamer" has shifted. Players who love a game and try multiple characters but never finish it still consider themselves hardcore. Expectations around monetary and time investment have also fundamentally changed: a $50 game in the 1980s cost about $175 in today's money, making modern $70 games historically cheap by comparison.
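
For reference, these price comparisons are simple inflation (CPI-style) conversions. A minimal sketch, assuming a rough 3.5x multiplier from mid-1980s to 2025 dollars (the multiplier and the Atari 800's roughly $1,000 launch price are approximations, not figures from the talk):

    # Rough inflation adjustment from mid-1980s dollars to 2025 dollars.
    # The 3.5x multiplier approximates CPI growth over that span; use a
    # year-specific CPI ratio if you need precision.
    CPI_MULTIPLIER_1985_TO_2025 = 3.5  # assumption, not a figure from the talk

    def to_2025_dollars(price_1980s: float) -> float:
        """Convert an approximate mid-1980s price to 2025 dollars."""
        return price_1980s * CPI_MULTIPLIER_1985_TO_2025

    print(to_2025_dollars(50))    # ~175.0: a $50 game or VHS tape
    print(to_2025_dollars(1000))  # ~3500.0: roughly the Atari 800's launch price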

The commitment model has moved from a large majority of deeply invested players to a smaller percentage, with most players treating games as more disposable entertainment. This is neither purely good nor bad, but it is a fundamental structural change in how games are consumed.

7. References

Cain, Tim. "Gaming In The 1980s." YouTube. https://www.youtube.com/watch?v=3KMUaVsp9Zs