What makes the IBM PC so significant in the history of personal computing? Its impact goes far beyond being a simple product – it’s a symbol of standardization and accessibility that reshaped the industry for decades to come.
Think of it in today's terms: what differentiates a gaming desktop from a home console? Is it the freedom to choose any parts and peripherals you want? Or perhaps support for games created decades before the machine itself? Back in the day, the main visible difference was often just the keyboard.
These days, Apple is the only company that produces computers with exclusive software and peripherals. But in the early 1980s, it was the norm. Back then, home computers were seen as upgraded consoles, primarily used for gaming and coding. In many cases, they were even made by the same companies. The IBM PC approach changed the very definition of what a personal computer could be.
By embracing third-party hardware and software, the IBM PC standardized the computer market, lending its name to the entire industry and granting the Microsoft and Intel duopoly (later also AMD) decades of dominance.
TechSpot’s Legends of Tech Series
The iconic tech gadgets that shaped our world. From groundbreaking gaming consoles to revolutionary mobile devices and music players, discover the legends of technology.
The Art of Compromise
In the 1970s, IBM was the largest computer company in the world. However, it focused on creating room-filling mainframe computers for governments, universities, and corporations, rather than serving home users.
That began to change with the release of the spreadsheet app VisiCalc in late 1979. VisiCalc turned home computers from enthusiast gadgets into something that most people could find useful. That was Big Blue's cue to get involved.
IBM’s biggest obstacle was itself. With a rigid 9-to-5 work culture and an excessive number of approval committees, any home computer it could make would become obsolete by the time it reached the market. IBM employees famously joked that if the company wanted to ship an empty box to stores, it would still take nine months to do so.
IBM executive Bill Lowe was tasked with determining the company’s strategy for the home market. His initial plan was bold: acquire Atari, which had successfully entered the home computing space with its 8-bit computers.
However, IBM Chairman Frank Cary rejected the idea, preferring instead to create a small, “independent business unit” within IBM. Lowe came up with a plan, nicknamed Project Chess, to create a home computer within a year.
IBM chairman Frank Cary
With such a tight schedule, IBM decided to design only essential components, such as the case and motherboard, while outsourcing everything else. Philip “Don” Estridge was put in charge of the project, and the computer was codenamed Acorn.
The Birth of Compatibility
The choice of a CPU for IBM’s first personal computer seemed straightforward at the time, but ultimately shaped the future of modern computing. IBM chose Intel’s 8088 microprocessor over the more powerful 16-bit 8086, a decision driven primarily by cost, time constraints, and compatibility with existing 8-bit hardware. While many engineers criticized this choice as a compromise, it turned out to be a pivotal moment in technology history.
The 8088 was based on the 16-bit Intel 8086 and used the same x86 instruction set, but it featured an 8-bit external bus. This design allowed IBM to utilize cheaper, widely available components, such as those compatible with the Intel 8085-based System/23 Datamaster. The decision had massive implications: the x86 architecture of the 8088 became the foundation for future generations of processors, including today’s CPUs.
Also, to avoid reliance on a single supplier, IBM required Intel to license its x86 processors to another manufacturer, which led to a partnership with AMD.
Intel 8088 made by AMD. Image: AMD pre-historico
Choosing an operating system was a tad more complicated. The natural candidate was CP/M, created by Gary Kildall and used on many home computers at the time. Many stories have been told about an IBM visit gone wrong, but the real issue was that creating an x86 version of CP/M would have taken Kildall's company (Digital Research) too long, and would have made the computer too expensive. IBM needed an alternative, fast.
The solution came from Seattle Computer Products (SCP). Tim Paterson, a programmer from SCP, had developed an x86-compatible operating system unofficially called QDOS (Quick and Dirty Operating System). It used CP/M’s publicly available application programming interface (API), making it easy for developers to port their applications.
Like most home computers at the time, the IBM machine would include a ROM (read-only memory) chip with a version of Microsoft BASIC, so users could create their own apps – the easiest way to get apps in the early days.
Philips P2000C running CP/M. Image credit: tony_duell
IBM turned to Microsoft to handle the OS negotiations. Microsoft struck a deal with SCP to license QDOS, initially paying only $25,000 in royalties because IBM was their sole client. Later, Microsoft hired Paterson to adapt QDOS for IBM's needs, including support for the newer 5.25-inch floppy diskettes. Recognizing its potential, Microsoft purchased QDOS outright from SCP for $50,000, securing exclusive rights.
At this point, creating a computer may sound simple. You might wonder: why didn’t dozens of electronics companies follow IBM’s approach? The key difference lay in one crucial innovation: IBM’s Basic Input/Output System (BIOS).
The BIOS was stored on a ROM chip and acted as a bridge between hardware and software. It allowed the computer to run an operating system that wasn’t specifically written for its hardware. This layer of abstraction was a game-changer, enabling software compatibility across machines and laying the groundwork for the future of personal computing.
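The idea can be sketched with a loose modern analogy (the class and method names below are illustrative, not real BIOS entry points — the actual BIOS exposed its services through x86 software interrupts such as INT 10h for video and INT 16h for the keyboard):

```python
# A conceptual sketch of the BIOS abstraction layer: the "OS" code at the
# bottom never touches hardware directly -- it only calls a fixed contract
# of services, and each machine ships its own implementation in ROM.

class BiosServices:
    """The stable contract every machine's ROM must implement."""
    def read_keystroke(self) -> str: ...
    def write_char(self, ch: str) -> None: ...

class VendorABios(BiosServices):
    """One vendor's hardware-specific implementation."""
    def __init__(self):
        self.screen = []                      # pretend video memory
    def read_keystroke(self) -> str:
        return "a"                            # pretend keyboard controller
    def write_char(self, ch: str) -> None:
        self.screen.append(ch)

class VendorBBios(BiosServices):
    """A different vendor, different 'hardware', same contract."""
    def __init__(self):
        self.framebuffer = ""
    def read_keystroke(self) -> str:
        return "b"
    def write_char(self, ch: str) -> None:
        self.framebuffer += ch

def echo_once(bios: BiosServices) -> None:
    # "Operating system" code: identical on every machine, because it
    # only ever calls through the BIOS contract.
    bios.write_char(bios.read_keystroke())

a, b = VendorABios(), VendorBBios()
echo_once(a)   # the same OS routine runs unmodified on both "machines"
echo_once(b)
```

This is exactly why a single operating system like PC DOS could run on wildly different boards, and why cloning the PC meant cloning this contract rather than IBM's circuitry.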
Big Blue Sea of Clones
The IBM Personal Computer (model 5150) was unveiled at a press conference in August 1981 and quickly became known as the IBM PC. The launch was accompanied by a clever advertising campaign featuring Charlie Chaplin’s character, the Little Tramp. This signaled IBM’s first major foray into selling computers primarily through retail stores.
Technically, you could get an IBM PC for $1,565 (more than $5,000 today), but if you wanted to save your work, you’d need to connect it to a cassette recorder. However, it came with the Model F keyboard, still revered today for its mechanical quality. Another popular peripheral was the high-resolution 5151 monochrome monitor, offering an impressive resolution of 720 x 350 pixels.
For almost twice the money, you could upgrade to a version of the 5150 with two floppy drives (a built-in hard disk wouldn't arrive until the 5160 model). CP/M-86 was eventually offered as an alternative operating system, but at $200 more than the rebranded PC DOS, it was largely ignored.
IBM estimated it would sell 250,000 units of the PC, mostly to small businesses, over five years. Instead, it sold 750,000 units in two years, with many purchased for home use. Nobody thought so many people wanted an expensive computer at home. It was certainly a good computer, but perhaps more importantly, it was an IBM computer.
In 1982, Time magazine broke with its Man of the Year tradition and named the personal computer its "Machine of the Year." This decision marked a cultural turning point, recognizing the PC's profound influence on society and the economy.
The Rise of Compatibility
Within a year of its launch, hundreds of applications were available for the IBM PC. Seeing the opportunity, many competing computer manufacturers adopted x86 processors and the rebranded MS-DOS, hoping to tap into IBM’s growing ecosystem. Microsoft capitalized on this by licensing its OS on a per-machine basis, a deal it did not offer IBM.
However, a significant hurdle emerged: some software bypassed the operating system and communicated directly with IBM’s BIOS. Programs like the revolutionary Lotus 1-2-3 spreadsheet and Microsoft Flight Simulator wouldn’t run properly on these early clones.
The IBM PC, PC Portable, and Compaq Portable II. Image credit: Marcin Wichary
The solution arrived in 1983 with the Compaq Portable – a suitcase-sized machine that became the first true PC-compatible computer.
IBM’s decision to use off-the-shelf components meant they could not patent the design of the IBM PC, making it relatively easy for competitors to reverse-engineer the machine and build “IBM-compatible” clones. Case in point, Compaq achieved compatibility by reverse-engineering IBM’s BIOS using “clean-room” methods: programmers who had never seen the copyrighted source code recreated its functionality.
In 1984, Phoenix Technologies followed suit, licensing its reverse-engineered BIOS to other manufacturers. This breakthrough allowed smaller companies to create faster, cheaper, and more innovative PC clones, outpacing IBM itself.
Legends Never Die
IBM tried to differentiate its computers from the growing wave of PC clones with its Personal System/2 (PS/2) series, launched in 1987. This series popularized 3.5-inch floppy diskettes and connectors you may be familiar with, like VGA (Video Graphics Array) and the PS/2 mouse/keyboard connector.
However, IBM’s decision to use proprietary Micro-Channel Architecture (MCA) expansion slots in its higher-end models alienated users. MCA was incompatible with existing Industry Standard Architecture (ISA) cards, effectively isolating IBM from the broader PC market. Both standards were eventually replaced by PCI, the predecessor to today’s PCIe, in the 1990s.
Within a few years, clone makers flooded the market and undercut IBM's own sales. The company had inadvertently created a new standard – the "IBM-compatible PC" – but ultimately lost control of the market it pioneered.
While IBM faltered, PC clones running DOS and, later, Windows surged ahead. By the early 1990s, competitors like Packard Bell, HP, and Dell dominated the market, pushing out alternative systems like the Motorola 68000-based Commodore Amiga and Atari ST. Only the Apple Macintosh remained a viable alternative.
The ThinkPad brand is still used by Lenovo. Image: Jarek Piórkowski
Motorola joined IBM and Apple to form the AIM alliance in 1991, intending to create a competing platform to “Wintel.” The result was the PowerPC architecture and the processors that powered Apple computers between 1994 and 2006. Later on, the architecture was also used to power home consoles like the Nintendo Wii, PlayStation 3 and Xbox 360. However, Windows users stuck with x86.
IBM eventually exited the PC business, selling the division to Lenovo for $1.75 billion in 2005. A year later, even Apple switched to Intel CPUs (a decision that was reversed some 15 years later).
A Legacy That Endures
In the 1990s, AMD became Intel’s main x86 rival after a court ruling allowed it to sell its own x86 processors. AMD’s introduction of x86-64 – a 64-bit extension of the x86 architecture – cemented its relevance.
While Intel tried to replace x86 with its Itanium architecture in the early 2000s, consumers preferred AMD’s backward-compatible solution, eventually forcing Intel to adopt it as well. To this day, this architecture powers not only most desktops but also modern PlayStation and Xbox gaming consoles.
Meanwhile, the Arm architecture began dominating handheld devices in the early 2000s, including smartphones, tablets, and Nintendo’s portable consoles. Apple adopted Arm processors in its desktops and laptops between 2020 and 2022.
To imagine a world without the IBM PC, look no further than an Apple Mac. It's sleek but locked down, with hardware you can't upgrade and compatibility limited to apps from recent years (Macs can struggle to run macOS apps from as recently as the 2010s).
Without the IBM PC, home computers might have all followed this closed, proprietary path – the PC was a historic win for tech enthusiasts.