On the eve of the
Consumer Electronics Show, the game changed. Well, at least there was an
announcement that could completely change the PC game. Microsoft CEO
Steve Ballmer announced on Jan. 5 that the next version of Windows would
run on ARM-based processors. The same day, Jen-Hsun Huang, Ballmer’s
counterpart at chip maker Nvidia, revealed Nvidia’s plans to deliver the
first soup-to-nuts family of ARM chips for computing.
This news, reminiscent of Apple’s shift to the ARM-based iOS on the iPad rather than Mac OS X, has shaken old structures to their foundations. What will emerge when the smoke clears is anyone’s guess, but expect to see lower-power, lower-cost versions of all types of computers — from notebooks to supers — built with ARM-based systems-on-chip.
ARM: ARM is a 32-bit reduced instruction set computer (RISC) instruction set architecture (ISA) developed by ARM Holdings. It was previously known as the Advanced RISC Machine and, before that, as the Acorn RISC Machine. The ARM architecture is the most widely used 32-bit ISA in terms of units produced. ARM processors were originally conceived by Acorn Computers as processors for desktop personal computers, a market now dominated by the x86 family used in IBM PC-compatible and Apple Macintosh computers. The relative simplicity of ARM processors makes them well suited to low-power applications, which has made them dominant in the mobile and embedded electronics markets as relatively low-cost, small microprocessors and microcontrollers.
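To make the RISC idea concrete, here is a toy load/store machine in Python. It is purely illustrative (a sketch of the style, not real ARM code or the actual instruction set): arithmetic operates only on registers, and memory is reached only through explicit load and store instructions, which keeps each instruction simple and cheap to implement.

```python
# Toy load/store (RISC-style) machine, for illustration only -- not ARM assembly.
# Arithmetic works register-to-register; memory is touched only via LOAD/STORE.

def run(program, memory, num_regs=16):
    regs = [0] * num_regs                     # register file, r0..r15
    for op, *args in program:
        if op == "LOAD":                      # LOAD rd, addr   -> rd = memory[addr]
            rd, addr = args
            regs[rd] = memory[addr]
        elif op == "STORE":                   # STORE rs, addr  -> memory[addr] = rs
            rs, addr = args
            memory[addr] = regs[rs]
        elif op == "ADD":                     # ADD rd, ra, rb  -> rd = ra + rb
            rd, ra, rb = args
            regs[rd] = regs[ra] + regs[rb]
        else:
            raise ValueError(f"unknown instruction: {op}")
    return regs, memory

# Compute c = a + b, with a at address 0, b at address 1, and the result at address 2.
memory = [3, 4, 0]
program = [
    ("LOAD", 0, 0),    # r0 = memory[0]
    ("LOAD", 1, 1),    # r1 = memory[1]
    ("ADD", 2, 0, 1),  # r2 = r0 + r1
    ("STORE", 2, 2),   # memory[2] = r2
]
regs, memory = run(program, memory)
print(regs[:3], memory)   # [3, 4, 7] [3, 4, 7]
```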
That shift alone has long-term repercussions for everything from the emerging tablet market to the billion-dollar data centers starved for electricity in a world going “green.” It will even bring PCs to households that would never otherwise have seen them. The socioeconomic implications will ripple far beyond computing. But the implications for the electronics industry are truly significant.
Ultimately, Intel, the world’s largest semiconductor maker, won’t be able to sustain average selling prices dozens and sometimes hundreds of dollars greater than those of other chip makers. Neither will it be able to sustain its vertically integrated business model, in which it is the world’s largest operator of chip fabrication plants.
There is also some inherent danger for the chip industry as a whole. Intel helped pioneer extreme-ultraviolet lithography, the long-delayed printing technology expected to drive the industry to ultrafine chip patterns. It is also leading the shift to larger wafers. What will happen when no one company has both the motivation and the financial clout to plow through the enormously difficult task of delivering new chip processing technologies?
Wafer: a thin slice of semiconductor material, such as a silicon crystal, used in the fabrication of integrated circuits and other microdevices. The wafer serves as the substrate for microelectronic devices built in and over the wafer and undergoes many microfabrication process steps, such as doping or ion implantation, etching, deposition of various materials, and photolithographic patterning. Finally, the individual microcircuits are separated (diced) and packaged.
The need for many small players to collaborate on the future of chip technology will complicate the financial and social aspects of technical tasks that are already nearly insurmountable. Expect progress to slow just as CMOS hits atom-sized limits in physics and demands radically new approaches. And what about systems companies? Hewlett-Packard is the world’s largest user of semiconductors, most of them going into the x86 PCs and servers that are often defined at the motherboard level by Intel and Advanced Micro Devices. It is a truly commoditized world, and Intel has been a leading engineering source.
CMOS: Complementary metal-oxide-semiconductor (CMOS) is a technology for constructing integrated circuits. CMOS technology is used in microprocessors, microcontrollers, static RAM, and other digital logic circuits. The name "complementary" refers to the fact that the typical digital design style with CMOS uses complementary and symmetrical pairs of p-type and n-type metal-oxide-semiconductor field-effect transistors (MOSFETs) for logic functions. Two important characteristics of CMOS devices are high noise immunity and low static power consumption. Significant power is drawn only while the transistors in a CMOS device are switching between on and off states. Consequently, CMOS devices do not produce as much waste heat as other forms of logic, such as transistor-transistor logic (TTL) or NMOS logic. CMOS also allows a high density of logic functions on a chip. This is the main reason CMOS won the race in the eighties and became the most widely used technology for very-large-scale integration (VLSI) chips such as microprocessors.
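To make the “complementary” idea concrete, here is a rough logic-level sketch in Python (an illustration of the concept, not an electrical simulation) of a two-input CMOS NAND gate. The PMOS pull-up network and the NMOS pull-down network are never on at the same time, so the output is always actively driven and there is no static current path from the supply to ground, which is why CMOS draws significant power only while switching.

```python
# Logic-level model of a two-input CMOS NAND gate (illustrative sketch only).
# PMOS devices conduct when their gate input is 0; NMOS devices conduct when it is 1.

def cmos_nand(a: int, b: int) -> int:
    # Pull-up network: two PMOS transistors in parallel between VDD and the output.
    pull_up_conducts = (a == 0) or (b == 0)

    # Pull-down network: two NMOS transistors in series between the output and ground.
    pull_down_conducts = (a == 1) and (b == 1)

    # The networks are complementary: exactly one conducts for any input pair,
    # so there is never a direct (static) path from VDD to ground.
    assert pull_up_conducts != pull_down_conducts
    return 1 if pull_up_conducts else 0

# Truth table: the output is 0 only when both inputs are 1.
for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", cmos_nand(a, b))
```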
Then there’s the effect on PC makers such as HP, Dell, and Lenovo. All have slashed their ranks of chip and board engineers; now they will have to rebuild them. They will once again have to evaluate perhaps a dozen competing SoCs as they define differentiating board designs. PC makers may even need to design ASICs.
SoC: System-on-a-chip (SoC) refers to integrating all components of a computer or other electronic system into a single integrated circuit (chip). It may contain digital, analog, mixed-signal, and often radio-frequency functions — all on a single chip substrate. A typical application is in the area of embedded systems.
ASIC: An application-specific integrated circuit (ASIC) is an integrated circuit (IC) customized for a particular use rather than intended for general-purpose use. For example, a chip designed solely to run a cell phone is an ASIC. Modern computers are typically made up of a dozen or so of these specialized ICs, running everything from the network to the disk drives to the sound.
Taiwan’s OEMs will have the advantage, as they have already nimbly crossed over to designing and making ARM-based handsets. The Windows-on-ARM shift is their chance to show off their engineering prowess and break out of their role as silent partners to the big-brand companies like IBM, Dell, and HP.
None of this will happen overnight. It could take Microsoft years to deliver a solid version of Windows for ARM chips. Windows itself largely flopped until it hit version 3.1, and even once it became an industry behemoth, Microsoft was still apt to fall flat when extending its franchise. Remember Vista?
That said, Microsoft had to make this move or risk losing its core franchise to the Linux variants, such as Android and Chrome OS, that are already climbing up from smartphones into notebooks and even ARM-based servers. Once Microsoft delivers, ARM chip makers must step up to the plate. They are practiced at designing 8- to 32-bit embedded SoCs for handsets and other systems, but none of them has a 64-bit design; indeed, ARM doesn’t yet offer a core on which to base one.
What’s more, none of the ARM SoC makers have dealt with supporting the complexity of a software stack like Windows, with its millions of lines of code. Just as daunting is the need to support the world of PC peripherals with their often poorly written drivers — a thorn in the side of Microsoft for years. ARM SoC makers also lack experience supporting chips that might sell in volumes of tens of millions, though they’d likely welcome that challenge.
It will be a decade before all the implementation and execution issues play out for Windows on ARM. Nonetheless, the shift has officially begun.
I’ve got my popcorn and a comfortable chair. It is going to be quite a show!
Originally written on January 31, 2011.