The x86 Exodus
Analysis
By Yanai Levy, March 25th, 2023
Processors have become part of almost every appliance, accessory, and, of course, the smartphones in our pockets. Your smart fridge has a processor in it, your TV has one, and even your robot vacuum has one inside. We normally think of processors in the context of computers, as the “brain” of the black box sitting next to your monitor or inside your laptop. However, there has been a paradigm shift in recent years that is already affecting the choices you have when buying electronics: the move to ARM.
Let’s back up a little to when PC processors really were only used in computers. In the 1980s and early 1990s, home computers were a burgeoning class of products, rapidly moving beyond their limited existence in labs and universities. Some of the first home computers, such as the Apple II and Commodore 64, used specially selected processors and software, which sharply limited which other kinds of computers could understand their software. If you had an Apple II, you were mostly limited to sharing software with other Apple II owners.
The advent of Intel’s 8086 processor and the later 8088 chip, with their accompanying instruction set, started a change in the processor world that can be credited with most of our PCs today. An instruction set is essentially the set of operations a computer has built into it, at both the physical hardware level and a basic software level. For example, “add” is an instruction that combines two numbers a computer has saved to produce a result. In the early days of computing, even instructions for basic arithmetic were not a given across different computing brands.
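To make that concrete, here is a minimal sketch in C (the function name is mine and purely illustrative): the same one-line addition in source code gets translated into whichever “add” instruction the target processor’s instruction set defines.

    /* The same addition, compiled for different instruction sets:
     * on x86-64 a compiler typically emits an add (or lea) instruction,
     * while on 64-bit ARM it emits ARM's own add instruction.
     * Same source code, different machine code underneath. */
    int add_two(int a, int b) {
        return a + b; /* the "add" instruction in action */
    }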
The Intel 8086 paved the way for what we call the x86 architecture today, which most high-performance desktop and server processors still use. Intel’s and AMD’s consumer processor lineups, as well as both companies’ server processors, are built on it, and their duopoly ensures that most computer processors remain x86-based.
There is, however, another option that is gaining traction at an incredible rate. ARM, or “Advanced RISC Machines,” is another goliath, founded in 1990, which creates processor designs for almost every other type of electronics product besides the computer. Smartphones? ARM. Tablets? ARM. Car infotainment? ARM. Even the tiny accessory boards inside things like hairdryers typically use some form of low-end ARM product.
Unlike AMD and Intel, ARM does not manufacture products for consumer use. Instead, it creates designs that it owns and then licenses those designs out to other companies. This approach has a few key advantages. Licensees save on R&D costs, because creating a processor from scratch is extremely difficult and expensive. They can also be much more specialized, borrowing one design for a specific function they need to perform and using other designs for other requirements.
For example, Apple’s T2 chip in its Intel-era Macs handles only specific jobs like disk encryption and video encoding, and it is built on the architecture Apple licenses from ARM. Another way ARM designs can be beneficial is in making much more power-efficient chips, such as the ones in smartphones. The battery in a phone is much smaller than a laptop’s and a phone cannot be cooled as easily, necessitating processors that are hugely more efficient than a desktop computer’s.
The solution is to incorporate different types of processing components, called cores. A processor can be made up of many cores, but we will stick with six for simplicity. The processor in Apple’s latest phones is made up of four efficiency cores and two performance cores. The efficiency cores do not generate much heat or need much power, but they do not perform very well. The performance cores work the opposite way: they generate lots of heat and draw lots of power, so they are only used when necessary.
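As a rough illustration of that trade-off, here is a toy model in C. The core types are real concepts, but every number in it is made up for the example; it simply keeps light work on an efficiency core and wakes a performance core only when the job is heavy.

    #include <stdio.h>

    /* Toy model of a heterogeneous processor: illustrative numbers only. */
    typedef struct { const char *name; double speed; double watts; } core_type;

    int main(void) {
        core_type efficiency  = { "efficiency",  1.0, 0.5 };  /* slow but frugal */
        core_type performance = { "performance", 3.0, 4.0 };  /* fast but hungry */

        double task = 2.0;                         /* arbitrary units of work */
        core_type chosen = (task > 1.5) ? performance : efficiency;

        double seconds = task / chosen.speed;
        printf("run on %s core: %.2f s, %.2f joules\n",
               chosen.name, seconds, seconds * chosen.watts);
        return 0;
    }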
Now that we have laid out some of the reasons why ARM processors can be better in many applications, we can move on to how they are changing the landscape of the electronics marketplace. Throughout the 2000s and most of the 2010s, companies such as Apple, Microsoft, Samsung, and just about any other electronics manufacturer you care to mention would go to processor manufacturers and buy premade processors to use in their own products. Phones and tablets are a bit different because they are generally powered by something called a system-on-chip, or SoC. The difference between this and a regular computer processor is that other components such as system memory and storage controllers, cellular modems, and graphics processors are all baked into one chip.
So Apple bought i7 processors from Intel, Samsung bought Snapdragon SoCs from Qualcomm, Google bought tablet SoCs from Nvidia, and so on. Apple and Nvidia had a famous falling out over faulty graphics chips that Nvidia supplied for Macs between 2008 and 2014, and while Apple used AMD graphics to replace them until 2020, I believe that experience started Apple’s plans to shift away from being a processor customer and start making its own chips for the computer market. Now, the word “making” is a bit of a misnomer here. Apple does not manufacture its own processors and SoCs; it manufactures almost nothing itself.
Apple and other consumer electronics companies contract that work out to companies like Foxconn and TSMC to make their hardware, and that is exactly how Apple has produced the ARM-based chips in its products going back to the original iPhone in 2007. ARM creates core designs, some aimed at high performance, some at high efficiency, and some at a specific task such as video decoding, and companies pick and choose which cores, and how many, they want in a given product. Once they have made those crucial decisions, they send their overarching design to a fabricator such as TSMC to physically manufacture it, including the licensed core designs from ARM.
This kind of customizability comes with huge advantages. Since companies do not have to take a general-purpose processor from a company like Intel and build their designs around it, they can much more tightly integrate their hardware with the software they intend to run on it. Google was able to create its Tensor lineup of SoCs with far more AI and machine learning power than a standard chip, which lets its camera app remove whole people from photos. Tesla can make a chip for its cars that is specially built to process video and determine a path forward. The efficiency gains are huge as well. Due to the specialization of the hardware, a task that would have required a heavy-duty, power-hungry processor before can now be completed by a specialized processor that does it so well it needs an eighth of the power.
Companies such as Intel and AMD that still use x86-based architectures are aware of this shift and the threat it poses to their existence. While Intel may have been resting on its laurels for much of the 2010s, renewed competition in and out of the x86 processor space has woken the slumbering giant, and it is firing on all cylinders to innovate. The hybrid architecture Intel introduced in 2021 uses performance and efficiency cores in the same way ARM’s big.LITTLE designs have for years, and it pushed core counts to unheard-of numbers in a mainstream desktop processor. Intel’s flagship i7 gaming processors had four (performance-only) cores from 2008 until 2017, but pressure from ARM and AMD has pushed that number to 16 total (8 performance + 8 efficiency) cores in the current i7-13700K. That is what competition can do for the consumers of a market.
This is an extremely exciting time to be interested in processors. The x86 paradigm is seemingly fighting for its continued existence against a growing list of custom processors developed by companies for their specific products. Intel and AMD are not only massive entities, but they also have some of the most brilliant chip-design minds out there in their ranks. I do not see either of them fading into history in the style of PowerPC anytime soon; rather, I think that if RISC architectures like ARM continue to rise in popularity, they will adapt and produce those processors as well. The processor industry is an arms race, a battle of influence, and a regulatory tightrope walk all at once. It is definitely an area to keep an eye on in the coming years.
Thanks for reading!