Apple and the Ghosts of Companies Past – Stratechery by Ben Thompson

Apple is not doomed, although things were feeling pretty shaky a couple of weeks ago, when the so-called “Liberation Day” tariffs were poised to make the company’s manufacturing model massively more expensive; the Trump administration granted Apple a temporary reprieve, and, for the next couple of months, things are business as usual.

Of course that’s not Apple’s only problem: a month ago the company had to admit that it couldn’t deliver on the AI promises it made at last year’s WWDC, leading John Gruber to declare that Something Is Rotten in the State of Cupertino. Still, the fact that Apple can’t ship a Siri that works is not necessarily cause for short-term concern: one of the Siri features Apple did ship was its ChatGPT integration, and you can run all of the best models as apps on your iPhone.

So no, Apple is not doomed, at least not for now. There is, however, real cause for concern: just as tech success is built years in advance, so is failure, and there are three historical examples of once-great companies losing the future that Apple and its board ought to consider carefully.

Microsoft and the Internet

I bet you think you already know the point I’m going to make here: Microsoft and the Internet is like Apple and AI. And you would be right! What may surprise you, however, is that I think this is actually good news for Apple, at least in part.

The starting point for the Internet is considered to be either 1991, when Tim Berners-Lee created the World Wide Web, or 1993, when Mosaic, the first consumer-accessible browser, was released. In other words, Bill Gates’ famous memo about The Internet Tidal Wave was either two or four years late. This is from his opening:

Developments on the Internet over the next several years will set the course of our industry for a long time to come. Perhaps you have already seen memos from me or others here about the importance of the Internet. I have gone through several stages of increasing my views of its importance. Now I assign the Internet the highest level of importance. In this memo I want to make clear that our focus on the Internet is crucial to every part of our business. The Internet is the most important single development to come along since the IBM PC was introduced in 1981. It is even more important than the arrival of the graphical user interface (GUI). The PC analogy is apt for many reasons. The PC wasn’t perfect. Aspects of the PC were arbitrary or even poor. However a phenomena grew up around the IBM PC that made it a key element of everything that would happen for the next 15 years. Companies that tried to fight the PC standard often had good reasons for doing so but they failed because the phenomena overcame any weaknesses that resisters identified.

It’s unfair to call this memo “late”: it’s actually quite prescient, and Microsoft pivoted hard into the Internet — so hard that just a few years later they faced a DOJ lawsuit that was primarily centered around Internet Explorer. In fact, you could make a counterintuitive argument that Microsoft actually suffered from Gates’ prescience; this was what he wrote about Netscape:

A new competitor “born” on the Internet is Netscape. Their browser is dominant, with 70% usage share, allowing them to determine which network extensions will catch on. They are pursuing a multi-platform strategy where they move the key API into the client to commoditize the underlying operating system. They have attracted a number of public network operators to use their platform to offer information and directory services. We have to match and beat their offerings including working with MCI, newspapers, and other who are considering their products.

Microsoft beat Netscape, but to what end? The client was in fact commoditized — the Internet Explorer team actually introduced the API that made web apps possible — but that was OK for business because everyone used Windows already.

What actually mattered was openness, in two regards: first, because the web was open, Microsoft ultimately could not contain it to just its platform. Second, because Windows was open, it didn’t matter: Netscape, to take the most pertinent example, was a Windows app; so was Firefox, which dethroned Internet Explorer after Microsoft lost interest, and so is Chrome, which dominates the web today.

That’s not to say that the Internet didn’t matter to Microsoft’s long-term prospects, because it was a bridge to the paradigm that Microsoft actually fumbled, which was mobile. Last fall I wrote The Gen AI Bridge to the Future, where I made the argument that paradigm shifts in hardware were enabled by first building “bridges” at the application layer. Here is the section on Windows and the Internet:

PCs underwent their own transformation over their two decades of dominance, first in terms of speed and then in form factor, with the rise of laptops. The key innovation at the application layer, however, was the Internet.

The Internet differed from traditional applications by virtue of being available on every PC, facilitating communication between PCs, and by being agnostic to the actual device it was accessed on. This, in turn, provided the bridge to the next device paradigm, the smartphone, with its touch interface.

I’ve long noted that Microsoft did not miss mobile; their error was in trying to extend the PC paradigm to mobile. This not only led to a focus on the wrong interface (WIMP via stylus and built-in keyboard), but also an assumption that the application layer, which Windows dominated, would be a key differentiator.

Apple, famously, figured out the right interface for the smartphone, and built an entirely new operating system around touch. Yes, iOS is based on macOS at a low level, but it was a completely new operating system in a way that Windows Mobile was not; at the same time, because iOS was based on macOS, it was far more capable than smartphone-only alternatives like BlackBerry OS or PalmOS. The key aspect of this capability was that the iPhone could access the real Internet… that was the key factor in reinventing the phone, because it was the bridge that linked a device in your pocket to the world of computing writ large.

To reiterate Microsoft’s failure, the company attempted to win in mobile by extending the Windows interface and applications to smartphones; what the company should have done is “pursu[e] a multi-platform strategy where they move the key API into the client to commoditize the underlying operating system.” In other words, Microsoft should have embraced and leveraged the Netscape threat, instead of trying to neutralize it.


Apple and the iPhone is analogous to Microsoft and Windows, for better and for worse: the better part is that there are many more smartphones sold than PCs, which means that Apple, even though it controls less than half the market, has more iOS devices in use than there are Windows devices. That’s the “for worse” part, however: Apple exerts more control over iOS than Microsoft ever did over Windows, but also doesn’t have a monopoly like Microsoft did.

The most obvious consequence of the smartphone duopoly is that Apple can’t unilaterally control the entire industry’s application layer like Microsoft wanted to. You can, however, look at this a different way: Microsoft wouldn’t have dared to exert Apple-like control over Windows precisely because it was a monopoly; the Windows API was, as I noted above, an open one, and that meant that the Internet largely happened on Windows PCs.

Consider this in the context of AI: the iPhone does have AI apps from everyone, including ChatGPT, Claude, Gemini, DeepSeek, etc. The system-wide assistant interface, however, is not open: you’re stuck with Siri. Imagine how much more attractive the iPhone would be as an AI device if it were a truly open platform: the fact that Siri stinks wouldn’t matter, because everyone would be running someone else’s model.

Where this might matter more is the next device paradigm: the point of The Gen AI Bridge to the Future is in the title:

We already established above that the next paradigm is wearables. Wearables today, however, are very much in the pre-iPhone era. On one hand you have standalone platforms like Oculus, with its own operating system, app store, etc.; the best analogy is a video game console, which is technically a computer, but is not commonly thought of as such given its singular purpose. On the other hand, you have devices like smart watches, AirPods, and smart glasses, which are extensions of the phone; the analogy here is the iPod, which provided great functionality but was not a general computing device.

Now Apple might dispute this characterization in terms of the Vision Pro specifically, which not only has a PC-class M2 chip, along with its own visionOS operating system and apps, but can also run iPad apps. In truth, though, this makes the Vision Pro akin to Windows Mobile: yes, it is a capable device, but it is stuck in the wrong paradigm, i.e. the previous one that Apple dominated. Or, to put it another way, I don’t view “apps” as the bridge between mobile and wearables; apps are just the way we access the Internet on mobile, and the Internet was the old bridge, not the new one.

The new bridge is a user interface that gives you exactly what you need when you need it, and disappears otherwise; it is based on AI, not apps. The danger for Apple is that trying to keep AI in a box in its current paradigm will one day be seen like Microsoft trying to keep the Internet locked to its devices: fruitless to start, and fatal in the end.

Intel and the Foundry Model

Intel was the other company that dominated the PC era: while AMD existed, they were more of an annoyance than an actual threat (thanks in part to Intel’s own anticompetitive behavior). And, like Microsoft, Intel also missed mobile, for somewhat similar reasons: they were over-indexed on the lessons of the PC.

Back in the 1980s and 1990s, when PCs were appearing on every desk and in every home, the big limitation was performance; Intel, accordingly, was focused on exactly that: every generation of Intel chips was massively faster than the previous one, and the company delivered so regularly that developers learned to build for the future, and not waste time optimizing for the soon-to-be-obsolete present.

Mobile, however, meant battery power, and Intel just wasn’t that concerned about efficiency; while the popular myth is that Intel turned Apple down when it came to building chips for the iPhone, Tony Fadell told me in a Stratechery Interview that they were never under consideration:

The new dimension that always came in with embedded computing was always the power element, because on battery-operated devices, you have to rethink how you do your interrupt structures, how you do your networking, how you do your memory. You have to think about so many other parameters when you think about power and doing enough processing effectively, while having long battery life. So everything for me was about long, long battery life…when you take that microscopic view of what you’re building, you look at the world very differently.

For me, when it came to Intel at the time, back in the mid-2000s, they were always about, “Well, we’ll just repackage what we have on the desktop for the laptop and then we’ll repackage that again for embedding.” It reminded me of Windows saying, “I’m going to do Windows and then I’m going to do Windows Mobile and I’m going to do Windows embedded.” It was using those same cores and kernels and trying to slim them down…”We’re just going to have Moore’s Law take over” and so in a way that locks you into a path and that’s why Intel, not under the Pat days but previous to the Pat days, was all driven by manufacturing capability and legal. It wasn’t driven by architectural decisions.

Missing mobile was a big problem for Intel’s integrated device manufacturing model: the company, in the long run, would not have the volume and the associated financial support of mobile customers to keep up with TSMC. Today the company is struggling to turn itself into a foundry — a company that manufactures chips for external customers — and would like nothing more than to receive a contract from the likes of Apple, not for an Intel chip, but for an ARM-based one.

What is notable about this example, however, is how long it took to play out. One of my first Articles on Stratechery was 2013’s The Intel Opportunity, where I urged the company to get into the foundry business, a full six years after the iPhone came out; I thought I was late. In fact, Intel’s stock nearly reached its dot-com era highs in 2020, after steady growth in the seven years following that Article.

The reason for that growth was, paradoxically enough, mobile: the rise of smartphones was mirrored by the rise of cloud computing, for which Intel made the processors. Better yet, those Xeon processors were much more expensive than PC processors (much less mobile ones), which meant margins kept growing; investors didn’t seem to care that Intel’s decline — so apparent today — was already locked in.


While Microsoft and the Internet is more directly analogous to Apple and AI, it’s the collective blindness of Intel shareholders and management to the company’s long-term risks that offers a lesson for the iPhone maker. To summarize the Intel timeline:

  • Intel missed mobile because it was focused on the wrong thing (performance over efficiency).
  • Intel failed to leverage its greatest strength (manufacturing) into an alternative position in mobile (being a foundry).
  • Intel’s manufacturing fell behind the industry’s collective champion (TSMC), which raised challenges to Intel’s core business (AMD server chips are now better than Intel’s).

Now, a decade-and-a-half after that first mistake, Intel is on the ropes, despite all of the money it made and stock market increases it enjoyed in the meantime.

If a similar story unfolds for Apple, it might look like this:

  • Apple misses AI because it’s focused on the wrong thing (privacy).
  • Apple fails to leverage its greatest strength (the iPhone platform) into an alternative position in AI (being the platform for the best model makers).
  • Apple’s platform falls behind the industry’s collective champion (Android, or perhaps a player TBD), which raises challenges to Apple’s core business (AI matters so much that the iPhone delivers a worse user experience).

The questions about Apple’s privacy focus being a hindrance in AI are longstanding ones; I raised them in this 2015 Update when I noted that the company’s increasingly strident stance on data collection ran the risk of diminishing product quality as machine learning rose in importance.

In fact, those fears turned out to be overblown for a good long while; many would argue that Apple’s stance (strategy credit or not) was a big selling point. I think it’s fair to wonder, however, if those concerns were not wrong but simply early:

  • An Apple completely unconcerned with privacy would have access to a vast trove of exclusive user data on which to train models.
  • An Apple that refused to use user data for training could nonetheless deliver a superior experience by building out its AI as a fully scaled cloud service, instead of the current attempt to use on-device processing and a custom-built private cloud compute infrastructure that, by necessity, has to rely on less capable models and worse performance.
  • An Apple that embraced third party model providers could, as noted above, open up its operating systems so that users could replace Siri with the model of their choice.

Apple’s absolutist and paternalistic approach to privacy has taken all of these options off the table, leaving the company to provide platform-level AI functionality on its own with one hand tied behind its back, and to date the company has not been able to deliver; given how different AI is from building hardware or operating systems, it’s fair to wonder if it ever will.

And, critically, this won’t matter for a long time: Apple’s AI failures will not impact iPhone sales for years, and most AI use cases will happen in apps that run on the iPhone. What won’t happen, however, is the development of the sort of platform capabilities that will build that bridge to the future.

This, in the end, was Intel’s ultimate failing: today there is massive demand for foundry capacity, but not for mobile; what the world wants is more AI chips, particularly from a company (Nvidia) which has regularly been willing to dual source its supply. Intel, though, has yet to meet the call; the cost of the company not opening itself up after its mobile miss is that it wasn’t prepared for the next opportunity that came along.

Apple and China

This last analogy is, I admit, the shakiest, but perhaps the most important: it’s Apple itself. From the New York Times:

In 1983, Mr. Jobs oversaw the construction of a state-of-the-art plant where the new Macintosh computer would be built. Reporters who toured it early on were told that the plant, located just across San Francisco Bay from Apple’s headquarters, was so advanced that factory labor would account for 2 percent of the cost of making a Macintosh. Ultimately, the Macintosh factory closed in 1992, in part because it never realized the production volume that Mr. Jobs had envisioned — such sales numbers for the Mac would only come later…

That failure taught Mr. Jobs the lesson. He returned to Apple in 1997, and the next year, he hired Tim Cook as Apple’s senior vice president for worldwide operations. Mr. Cook had mastered the art of global manufacturing supply chains, first in IBM’s personal computer business and then at Compaq Computer.

It was admirable that Jobs wanted to build in America, but realistically the company needed to follow the rest of the tech industry to Asia if it wanted to survive, much less thrive, and Cook, just as much as Jobs, both saved the company and set it on the course for astronomical growth.

The challenge today is that that growth has been mirrored by China itself, and the current administration is determined to decouple the U.S. from China; that potentially increases Apple’s most existential threat, which is a war over Taiwan. This is a very different problem than what has long concerned Cook; from a 2008 profile in Fortune:

Almost from the time he showed up at Apple, Cook knew he had to pull the company out of manufacturing. He closed factories and warehouses around the world and instead established relationships with contract manufacturers. As a result, Apple’s inventory, measured by the amount of time it sat on the company’s balance sheet, quickly fell from months to days. Inventory, Cook has said, is “fundamentally evil,” and he has been known to observe that it declines in value by 1% to 2% a week in normal times, faster in tough times like the present. “You kind of want to manage it like you’re in the dairy business,” he has said. “If it gets past its freshness date, you have a problem.” This logistical discipline has given Apple inventory management comparable with Dell’s, then as now the gold standard for computer-manufacturing efficiency.

There are things worse than dairy going bad, like the cows being blown up. Evil? Absolutely. Possible? Much more so today than at any other point in Cook’s tenure.

This, then, is the analogy: the Apple that Cook arrived at in 1998 was at existential risk from its supply chain; so is Apple today. Everything else is different, including the likelihood of disaster: Apple’s China risk may be elevated, whereas Apple’s bankruptcy in the 1990s seemed a matter of when, not if.

At the same time, that means that Apple has cash flow, and power; what is necessary now is not making obvious choices out of necessity, but making uncertain ones out of prudence. Cook built the Apple machine in China; the challenge now will be in dismantling it.

The Cook Question

Cook is the common variable across all of these analogies:

  • Cook has led the company as it has continually closed down iOS, controlling developers through the stick of market size instead of the carrot of platform opportunity.
  • Cook has similarly been at the forefront of Apple’s absolutist approach to privacy, which has only increased in intensity and impact, not just on third parties but also on Apple itself.
  • Cook, as I just documented, built Apple’s dependency on China, and has adroitly managed the politics of that reality, both with China and the U.S.

All of these decisions — even the ones I have most consistently disagreed with — were defensible and, in some cases, essential to Apple’s success; Cook has been a very effective CEO for Apple and its shareholders. And, should he stay on for several more years, the company would probably seem fine (assuming nothing existential happens with China and Taiwan), particularly in terms of the stock price.

Tech fortunes, however, are cast years in advance; Apple is not doomed, but it is, for the first time in a long time, fair to wonder about the long-term: the questions I have about the company are not about 2025, but 2035, and the decisions that will answer those questions will be made now. I certainly have my point of view:

  • Apple should execute an AI Platform Pivot, enabling developers to build with AI instead of trying to do everything itself; more broadly, it should increase the opportunities for developers economically and technically.
  • Apple should not abandon its privacy brand, but rather accept the reality that all of computing is ultimately about trust: the device will always have root. To that end, users do trust Apple, not because Apple is so strident about user data that they make their products worse, but because the company’s business model is aligned with users, with a multi-decade track record of doing right by them; in this case, doing right by users means doing what is necessary to have an actually useful AI offering.
  • Whereas I once thought it was reasonable for Apple to maintain its position in China — the costs of hedging would be so large that it would be better to take the minuscule risk of war, which Apple itself minimized through its position in China — that position no longer seems feasible; at a minimum Apple needs to rapidly accelerate its diversification efforts. This doesn’t just mean building up final assembly in places like India and Brazil, but also reversing its long-running attempts to undercut non-Chinese suppliers with Chinese alternatives.

All of these run counter to the decisions Cook has made over nearly three decades, but again, it’s not that Cook was wrong at the time he made them; rather, times change, and Apple needs to change before the time comes when the necessity for change is obvious, because that means the right time for that change has already passed.
