Disrupting Intel
Andy Grove pivots Intel from memory to microprocessors: a textbook strategic reinvention.
In the middle of 1985, Andy Grove sat in his office on Intel’s Santa Clara campus and asked Gordon Moore a question that neither man could let go of. The financial picture in front of them was catastrophic. Intel’s earnings per share for the year would round to a single penny. The company that had brought the first commercial dynamic random-access memory chip to market, the product that for fifteen years had defined Intel’s identity and most of its revenue, was being beaten in its own market by Japanese producers whose yields, prices, and quality had become impossible to match. Senior managers in the conference rooms down the hall had been arguing in circles for over a year. Memory was Intel’s “technology driver,” they said. Memory taught the company how to manufacture every other product. Customers expected Intel to offer a full line. To stop making memories was to stop being Intel.
Grove looked across the table at Moore, the chairman, the namesake of the law that governed the industry, the man who along with Robert Noyce had hired Grove on Intel’s incorporation day in 1968 and made him their right-hand operator ever since. According to Grove’s own account, written a decade later in Only the Paranoid Survive, what came next was the question that ended the argument. If we got kicked out, Grove asked, and the board brought in a new CEO, what do you think he would do? Moore answered without hesitation. He would get us out of memories. Grove paused, then said the line that has since echoed through every business school case study of strategic reinvention. Why shouldn’t you and I walk out the door, come back, and do it ourselves?
They did not literally leave the room. But the question reframed everything. In one stroke, it stripped away the sentiment, the sunk cost, the loyalty to a product line that the founders themselves had pioneered. It substituted the only test that mattered: what would a stranger, free of Intel’s history, decide on the merits? The answer was so obvious that they had been refusing to look at it.
Within twelve months Intel would post a $173 million loss, close eight manufacturing plants, eliminate roughly 7,200 jobs, and walk away from the business that had built it. Within five years, the company would be the most valuable semiconductor maker in the world. The pivot from memories to microprocessors became the textbook case of strategic reinvention because almost no other major firm of comparable size had ever done it cleanly, in real time, against the resistance of its own people, and lived to dominate the market it pivoted into.
Grove, by 1985, had been the operator at the center of Intel for seventeen years. The Hungarian refugee who had walked across the Austrian border in 1956 was now the president of the company Noyce and Moore had founded, the discipline behind its manufacturing reputation, and the loud one in the trinity at the top. The system he had imposed on Intel meetings, what the company called constructive confrontation, allowed any employee at any level to attack any idea so long as the attack was on the idea and not the person, and so long as the attacker was prepared to defend a better one. Latecomers were sent away. Engineers who answered “I don’t know” without a follow-up plan were told to leave and come back when they did. By then, Grove was the only person at Intel who could ask Moore a question that would end a year of corridor argument.
The operator’s reflexes mattered, because by the early 1980s the company that Grove was operating was running into a wall it had not seen coming. Intel had introduced the first commercial DRAM, the 1103, in 1970, and for most of the decade after that the memory business was the heart of Intel. Memory chips drove the cleanroom volumes that made every other product possible. They generated the cash that funded the microprocessor work. They were what customers asked for first. As late as 1974 Intel held more than eighty percent of the world DRAM market. By 1984 Intel’s share had collapsed to roughly one percent.
The reason was Japan. Through the late 1970s and early 1980s, the great Japanese electronics conglomerates, Hitachi, NEC, Toshiba, Fujitsu, Mitsubishi, had poured capital into memory at a scale American firms could not match, backed by patient bank financing and by MITI’s coordinating hand. They had taken the lessons of statistical process control further and faster than the Americans had, and they were achieving wafer yields that left U.S. producers staring at their numbers in disbelief. By the early 1980s, Hewlett-Packard’s incoming-quality data showed that the worst Japanese DRAMs were better than the best American ones. By the middle of 1985, with the global PC market softening, Japanese firms were unloading 256-kilobit DRAMs into the U.S. market at prices that Intel’s general counsel publicly called predatory. A Hitachi memo, surfaced later in trade complaints, instructed its U.S. sales force to bid roughly ten percent below the lowest American price, whatever that price turned out to be. American memory makers had no way to fight that without bleeding capital they did not have. Intel was bleeding it anyway.
Grove had already been watching the bleeding inside his own factories. By the time the conversation with Moore happened, Intel ran eight silicon fabs, and only one of them was still being used to produce DRAMs. The rest had been quietly redirected, wafer batch by wafer batch, by middle managers running the production schedules. Nobody had ordered the reallocation. The plant managers were simply refusing to waste good capacity on a product whose margins had gone underwater. This was the phenomenon that the Stanford strategy scholar Robert Burgelman, after twelve years of fieldwork inside Intel, would come to call internal selection: the idea that resources in a healthy company drift, almost autonomously, toward whatever is making money, regardless of what the published strategy says. Intel’s published strategy in 1984 was that memory mattered. Intel’s actual strategy, as expressed in capacity allocation, had already shifted. Top management was the last to admit it.
There was a reason for that lag, and the reason was more sentimental than strategic. Memory was Intel’s first child. Moore had personally led the early DRAM work. Senior engineers had built their careers on it. The people most committed to staying in memory were the ones who had been there since 1970. Grove understood this. He also understood that knowing it intellectually was not the same as fixing it. For most of 1984 and into the first half of 1985, he had presided over what he later described in Only the Paranoid Survive as a “valley of death” series of meetings, in which the same people reopened the same arguments, citing the same canonical reasons memory could not be abandoned, while the financials got steadily worse. He called it wandering. He said later that the worst part of a strategic inflection point was not the decision; it was the year before the decision, when the leaders had lost confidence in their old plan and had not yet found a new one.
The arguments Grove kept hearing had a logic to them, which was why they would not die. Memory, the senior engineers said, was the discipline that taught a fab how to make every other product. DRAMs ran in volumes high enough that a process problem showed up immediately and could be solved at scale. Microprocessors, by comparison, were lower-volume, higher-margin oddballs. Without DRAMs to push the technology, the argument ran, Intel’s process learning curve would flatten and Intel would, within a few generations, fall behind on microprocessor manufacturing too. There was also a customer-facing argument. Buyers who bought DRAMs from Intel often bought logic chips and microprocessors at the same time. Lose memory, lose the relationship. These were not stupid claims. In 1985 they were the conventional wisdom of the industry, and they were the reason almost every other major American chipmaker was still pouring money into DRAM long after the Japanese had won. Grove’s eventual answer to them was empirical rather than rhetorical. The technology-driver argument failed, he came to believe, because the Japanese were now driving the technology, not Intel; staying in memory at sub-scale was not driving anything. The customer argument failed because microprocessor customers, increasingly the PC clone makers, were not the same buyers as memory customers anyway.
The Moore conversation cut through. Once it happened, Grove began drafting the operational sequence in his head. The decision had to be communicated all the way down, and it had to be defended against the predictable objections from sales, from customers, and from middle managers who would interpret hesitation as an opportunity to relitigate. It had to be paired with a positive direction so that the cuts did not feel like surrender. And the cuts themselves had to be deep enough to bring Intel’s overhead structure, sized through the boom years for a company growing toward two or three billion dollars, back into line with what Intel was about to become.
The execution, when it came, was brutal. In late 1985 Intel announced it was exiting the DRAM business. Internally, the company began closing fabs in the Philippines, in Oregon, in California, and in Puerto Rico. Eight manufacturing plants shut. About 7,200 jobs vanished, roughly a third of the workforce, through a combination of layoffs and attrition. Salaried employees took a ten percent pay cut. Grove and Moore took larger ones. Intel’s 1986 annual report, which Grove and Moore would sign together, opened with a sentence that did not bother to soften the picture. We’re pleased to report 1986 is over, they wrote. It was, without question, the toughest year in Intel’s history. The company had been left, the report said, “with an overhead structure appropriate to the $2 to 3 billion company we aimed to be rather than the $1.0 to 1.5 billion company we were becoming.” The full-year loss came to $173 million. Intel had never lost money before.
Saying goodbye to memory in public was harder than the financials. Customers had to be told that the company they had bought DRAMs from for fifteen years no longer made them. Sales engineers, who had built their careers on cross-selling memory and microprocessors as a matched set, had to absorb that the cross-sell was over. Inside, employees who had spent decades viewing memory as Intel’s spine had to be told that the spine had never been the product at all; it had been the manufacturing competence behind the product, and the manufacturing competence was perfectly transferable.
The positive direction, the second half of the pivot, was the part the textbooks remember less and Intel needed more. In the same window of months that Grove was closing memory fabs, he was committing the company’s future to a single bet: the new 32-bit microprocessor that the Oregon design team had been finishing. On October 17, 1985, only a few months after the conversation with Moore had run its course inside the company, Intel publicly announced the 80386. It was a startling piece of silicon. It crammed 275,000 transistors onto a die, ran at 16 megahertz, and could address four gigabytes of memory directly, orders of magnitude beyond what most personal computers in 1985 would have known what to do with. Intel coordinated the announcement across San Francisco, London, Paris, Munich, and Tokyo on the same day. The architectural drawings were eventually displayed at the Museum of Modern Art in New York. One of the engineers on the design team, employee number four on the project, was a young Pat Gelsinger, who would, decades later, return to run the company in a very different crisis.
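The four-gigabyte figure was not marketing; it falls straight out of the address width. A sketch of the arithmetic, using standard architecture facts rather than anything from Intel’s announcement: a 32-bit address selects one of

$$2^{32} = 4{,}294{,}967{,}296 \text{ bytes} \approx 4\ \text{GB},$$

where the 286 before it, with 24-bit physical addresses, had topped out at $2^{24} = 16{,}777{,}216$ bytes, or 16 MB.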
The 386 mattered for a reason that had little to do with its raw transistor count. It was binary-compatible with the 8086 and 286 chips that already ran most of the world’s IBM-compatible PCs, which meant every program written for the prior generation would run on the new one. It was the natural upgrade path. And in a decision whose consequences would shape the next twenty years, Grove decided that, this time, Intel would refuse to license it.
Through the prior generation, Intel had grudgingly maintained second-source agreements with companies including Advanced Micro Devices, allowing them to manufacture and sell Intel’s 16-bit chips. The arrangement had been forced on Intel by IBM, which insisted in the early 1980s that any chip going into its personal computers be available from at least two manufacturers, so that IBM was never hostage to a single supplier. Without the second-source guarantee, the original IBM PC contract that Operation Crush had won in 1980 would not have happened. By 1985, however, the bargaining position had inverted. IBM was losing control of the PC standard to the clone makers. The clone makers wanted whatever was fastest. And Intel had a chip, the 386, that nobody else could yet make.
Grove decided to keep it that way. AMD, which had expected the 386 license under the existing agreement, did not get it. The decision touched off years of arbitration and litigation that would not finish until the mid-1990s. It also achieved its strategic purpose. For the rest of the decade, every PC that wanted the 386’s performance had to buy the chip from Intel, at a price Intel set, on a schedule Intel controlled. To insure against any single-point manufacturing failure, Grove distributed 386 production across three geographically separated fabs: Santa Clara, Hillsboro, and a new site in Chandler, Arizona. The company that a year earlier had been a memory house with a sideline in microprocessors was reorganizing itself, all at once, into a sole-source microprocessor monopolist with a mature manufacturing base.
The bet still needed a buyer. IBM, as it happened, declined to be one. Through the first half of 1986, Big Blue’s executives signaled that they were in no hurry to adopt the 386. Internally, IBM was wrestling with a different problem: a new chip in mid-range PCs would cannibalize sales of the company’s higher-margin minicomputer line, the System/36 and System/38. IBM had also taken a manufacturing license for the slower 286 and intended to keep building those for years. From Intel’s point of view, IBM’s hesitation was a near-disaster. The IBM logo on a new PC was, in 1986, still understood by most corporate buyers as the signal that mattered.
The breakthrough came from a much smaller company in Houston that decided not to wait. Compaq Computer, founded in 1982 by three former Texas Instruments engineers, had built itself into a half-billion-dollar maker of IBM-compatible portables and desktops by being relentless about being first to follow. In January 1986, a Compaq engineer named Hugh Barnes came back to chief executive Rod Canion with a piece of intelligence: Intel had told him that IBM was not going to use the 386 anytime soon. Canion saw the opening immediately. If Compaq could ship a 386 PC ahead of IBM, it would, for the first time, be defining the PC standard rather than chasing it.
Compaq spent the next eight months in a sprint, working closely with Intel’s engineers to build a machine around a chip that had barely entered mass production. On September 9, 1986, at a New York City gala, Canion took the stage with Andy Grove and Gordon Moore beside him to introduce the Compaq Deskpro 386. Bill Gates of Microsoft and Compaq chairman Ben Rosen sat in the front rows. The base model retailed at $6,499, with higher configurations approaching $9,000. By the end of the first quarter after launch, Compaq had moved 25,000 units. By the second quarter of 1987, it had shipped roughly 90,000. PC Magazine, in its review, described it tersely as a screamer. IBM’s competing 386 machine, the Personal System/2 Model 80, did not arrive until July 1987, ten months later.
It is hard, looking back, to overstate what that gap meant. Bill Gates would say, years later, that the Deskpro 386 launch was the moment the industry realized IBM was no longer the one setting the standards. For Intel, the lesson was simpler and more immediate. The most valuable customer relationship the company had developed since the IBM PC contract in 1980 was now with a clone maker, not with IBM. The architecture mattered more than the brand on the box. And so long as Intel held the architecture, the brand on the box was negotiable.
By the start of 1987, the worst was past. Intel’s revenues began to climb again. The 386 was selling into the most profitable PC segment of the industry. Costs had come down with the layoffs. The company would post a profit for 1987 and would never, for the rest of Grove’s tenure, lose money again. Inside the boardroom, the succession that had been pending for years became formal. Moore, chief executive since 1975, handed Grove the title in 1987 and stayed on as chairman; Robert Noyce, long since removed from day-to-day operations, would soon leave to run SEMATECH, the consortium that the U.S. government and industry had assembled to fight the Japanese. From 1987 on, the company belonged to Grove in title as well as in fact.
What Grove had built into Intel along the way was a doctrine, not just a product line. He would later articulate it in the 1996 book that gave the doctrine its name, but the substance was already in place by the late 1980s. A strategic inflection point, in Grove’s framing, was a moment when the fundamentals of an industry shifted by an order of magnitude on some axis: cost, technology, regulation, competitor capability. Past the inflection, the old strategy did not need to be improved. It needed to be replaced. The danger was that incumbents always saw the early signals first and trusted them last, because the early signals were ambiguous and the old strategy had decades of organizational antibodies behind it. Most companies, Grove argued, did not survive their inflection points. Those that did had usually done what he and Moore had done in his office in 1985: they had imagined themselves replaced and then acted on what their replacements would have done.
There was an underside to the doctrine that became visible only later. Sole-sourcing the 386 turned Intel into a near-monopoly supplier of the world’s most important commodity processor architecture, and the cash that came back funded a level of investment in process technology that no rival could match through the 1990s. But it also taught Intel a lesson that would, in time, become a liability. Vertical integration, full ownership of design and manufacturing, exclusive control of the architecture: these were the things that had saved the company in 1985 and made it dominant in 1995. Intel would carry that conviction into a future in which other companies, in Taiwan, would be quietly developing a different theory entirely. For now, in the late 1980s, the conviction looked like simple sense. Real men have fabs, as AMD’s Jerry Sanders would later put it. Intel had eight fabs, then six, then more, and they were all making microprocessors.
The company that had been synonymous with memory was about to become synonymous with the personal computer. The doctrine that several generations of American managers would memorize had been forged in the same place: a small office in Santa Clara, on a bad day in 1985, with Grove staring across the desk at the only cofounder he fully trusted, looking for a question that would let them save what they had built. The question turned out to be, what would somebody else do? The answer, once they let themselves hear it, did the rest.