Sunset on Intel

Rodolfo Rosini
3 min read · Nov 17, 2020


The transition away from the x86 architecture is a discussion I have heard all my life. Even back when I started, x86 was considered sub-optimal and in need of a rewrite from the ground up, but because demand for DOS/Windows was skyrocketing, that rewrite could not be attempted. And this problem of a shaky foundation remained for literally decades. At the same time Intel was constantly lagging behind on die shrinks, and the performance gains it did achieve came not from architectural advances but from more processors, more clock speed, and tricks like speculative execution (which ended up causing unfixable security issues). Basically x86 was running into a performance wall, and Intel was brute-forcing its way through it.

Market dominance was Intel's biggest advantage, but also the biggest barrier to developing a leaner alternative.

In new markets, they did develop an alternative. As server computing grew, Intel developed Itanium, which was IA-64 and not x86, but its development was based on old assumptions (or old tests, depending on who you ask): it was optimized for the previous generation of supercomputers, which were mostly about scientific computation. The server market that was actually growing was cloud computing, where IA-64's performance just was not there, so they ended up releasing an expensive server processor that did not perform better than their previous architecture and that the x86 crowd didn't want anyway. Itanium was finally killed last year.

Around the mid-2000s, for reasons known only to them, just as the market for smartphones was booming, they decided to sell their mobile chipset division (and again, "mobile" meant "how to squeeze PC performance and power consumption into a phone" rather than developing a new product designed around the constraints of a new device).

Better, leaner, lower-power alternatives were actually being funded (Transmeta being one), but because they targeted an existing market, they all faced the headwind of developers not wanting to recompile programs for something that would ship 100,000 devices at best. With the release of the iPhone this changed: Apple's ARM architecture got better with every cycle, until reaching the point (and this is MIND BLOWING) of outperforming x86 even when running x86 code in emulation.

I suspect (and this is pure speculation) that Jim Keller left when he told Intel that the only solution was to ditch x86 and design something else.

The monopoly days of Intel CPUs in desktops are over. Microsoft adjusted to going from ~90% computing market share to ~8% by moving to the cloud, and they did it extremely successfully. I don't think Intel can pull off the same feat.

Intel's biggest issues, in my opinion, were:

Not funding external innovation
Intel did not invest in the competition. Finding a VC willing to back a silicon startup was very hard in the Bay Area. Ironically, it was one of the areas where more funding was available in Cambridge (which is almost never the case).
EDIT: Intel did have a CVC arm, but it only pursued strategic investments that grew the core business (like infrastructure to increase demand for computation), not things that would disrupt it (semiconductors). Also, by all accounts, they were being dickish to entrepreneurs about it: they would not follow on in future rounds, regardless of the financial success of the company, if the mothership decided that that particular market was no longer strategic.

Not investing in mobile
Intel decided against investing in the R&D to win the iPhone business, because it was too expensive. Around 2005–2010 (IIRC) there were about 750m PCs and 2bn phones. PC shipments were slowing down for the first time in history, and it was clear that within a decade every phone would have a dual processor. The era of the PC was not over; it's just that mobile was going to be a bigger market. By then it was also clear that Itanium was not going to be the solution for commercial cloud computing, web hosting, etc.

Getting out of the GPU gaming business
This one was harder to foresee, but (to some extent correctly) they got out of gaming. Alas, it turned out that a lot of problems besides gaming use linear algebra (machine learning, crypto), and it was clear by the early 2000s that some sort of programmable GPU could offload computation and speed up programs massively. In the early 2000s programming for it was clunky, so I can understand the skepticism, and even in 2010 it was not obvious how big deep learning was going to be (ironically not even to NVIDIA themselves).
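To make the linear algebra point concrete, here is a minimal sketch of offloading a dense matrix multiply to a GPU, assuming a CUDA-capable card and the CuPy library are available (names and sizes here are illustrative, not anything Intel or NVIDIA shipped); the same handful of lines is roughly what a graphics transform or a neural-network layer boils down to.

# Minimal sketch of GPU offload for dense linear algebra.
# Assumes a CUDA-capable GPU and the `cupy` package are installed.
import numpy as np
import cupy as cp

n = 4096
a = np.random.rand(n, n).astype(np.float32)
b = np.random.rand(n, n).astype(np.float32)

# CPU: NumPy runs the matrix multiply on the host.
c_cpu = a @ b

# GPU: copy the operands to device memory, multiply there, copy back.
a_gpu = cp.asarray(a)
b_gpu = cp.asarray(b)
c_gpu = cp.asnumpy(a_gpu @ b_gpu)

# Same result either way; only where the work ran differs.
print(np.allclose(c_cpu, c_gpu, atol=1e-3))

The point is that the program does not change, only where the arithmetic runs, which is why "offload" workloads like machine learning mapped onto GPUs so naturally once the tooling stopped being clunky.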

TL;DR: none of those decisions can be reversed in time now that the monopoly is over. A company like Intel is huge, and it is a key national defense supplier, so it can keep going for decades, but the decline is unstoppable.

Written by Rodolfo Rosini

CEO and founder, stealth. Also working with Conception X helping PhD students become venture scientists.
