Mobile Computing and Parallel Processing Convergence

Posted on: February 1, 2010
Posted in Mobile, Strategy

Near the turn of the century, Intel predicted it would be running processors at 15 – 20 GHz by 2010. That prediction never panned out, and today Intel sells multi-core processors, with each individual core operating at around 1 – 2 GHz. It’s worth noting that those clock speeds were achieved on single-core processors over 10 years ago.

So what is multi-core, and why the shift in strategy? To use an analogy, a multi-core processor is like a meal cooking in separate pots on 4 different stovetop burners instead of in 1 supersize pot over a large flame. Each pot cooks independently, which lets the chef better manage ingredients and temperature. For example, a good cook can turn off a couple of burners, or move food around, and focus on different portions of the meal.
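To make the analogy concrete, here is a minimal sketch of the same idea in code. It is purely illustrative and not drawn from Intel or anyone else: Python’s multiprocessing module stands in for the four burners, and the simmer function is a made-up stand-in for any CPU-bound chunk of work.

```python
import multiprocessing as mp

def simmer(chunk):
    """Pretend 'cooking': a CPU-bound reduction over one chunk of work."""
    return sum(x * x for x in chunk)

if __name__ == "__main__":
    ingredients = list(range(1_000_000))

    # One supersize pot: a single core does all the work.
    one_pot = simmer(ingredients)

    # Four burners: split the work into chunks, one per core.
    chunks = [ingredients[i::4] for i in range(4)]
    with mp.Pool(processes=4) as pool:
        four_pots = sum(pool.map(simmer, chunks))

    assert one_pot == four_pots
```

The answer is the same either way; what changes is that the four-burner version can use four cores at once, provided the work actually splits cleanly.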

Around 2005, Intel publicly backed away from its efforts to break the 15 GHz barrier and instead dubbed a Fundamental Theorem of Multi-Core Processors. This affirmed that an equivalent multi-core processor would provide better performance per watt of power dissipated than a single fast processor would. Faster was not better. Interestingly, Intel had never really had to worry about power dissipation before this time. Not only were desktop PCs plugged into the wall, but notebook computers hadn’t yet evolved to the point where users demanded much performance or feature differentiation.

As the parallel processing movement gained steam, problems arose – each chip had to employ its own on-board “cook” to manage the separate cores. By nature, multi-core processors must handle inter-processor communication (IPC), and software must shoulder the added complexity of splitting tasks between cores. Standard software design techniques are not well suited to these inherent complexities. In fact, John Hennessy, of Stanford and MIPS lore, claims parallelism is the biggest challenge computing has faced since the advent of high-level programming languages some 50 years ago.
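To give a flavor of what that complexity looks like in practice, here is a hedged sketch – mine, not Hennessy’s or Intel’s, and again in Python purely for illustration – of the coordination a programmer takes on the moment work is split across cores: explicit queues for inter-process communication, sentinel values for shutdown, and results that arrive in no particular order.

```python
import multiprocessing as mp

def worker(task_q, result_q):
    # Each core runs one of these; results travel back over explicit IPC.
    while True:
        item = task_q.get()
        if item is None:              # sentinel value: no more work
            break
        result_q.put(item * item)

if __name__ == "__main__":
    task_q, result_q = mp.Queue(), mp.Queue()
    workers = [mp.Process(target=worker, args=(task_q, result_q)) for _ in range(4)]
    for w in workers:
        w.start()

    for i in range(100):
        task_q.put(i)
    for _ in workers:                 # one sentinel per worker so each shuts down
        task_q.put(None)

    results = [result_q.get() for _ in range(100)]   # arrival order is not guaranteed
    for w in workers:
        w.join()

    print(sum(results))
```

None of this bookkeeping exists in the single-core version of the same loop, which is exactly the kind of added burden standard software design techniques were never built to carry.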

In 2005 there wasn’t a ton of innovation coming out of the software incumbents to deal with these issues. Part of the reason was that we’d become accustomed to 15 – 20 years of speeds doubling roughly every 18 months, courtesy of Moore’s Law. You couldn’t get anybody’s attention inside Microsoft by saying the future was parallelism back in 2005. This typifies why Windows is a relic of another generation, and it explains some of the performance inadequacies of Windows XP and Vista.

Meanwhile, the mobile computing market was having its own “parallel” revolution. The de facto processor of choice had become a licensed architecture called ARM, which any company could pay to use. This removed the barriers to entry that Intel had famously defended at all costs in the PC market. In embedded (non-PC) markets, “heterogeneous processors” dominated: devices containing cores optimized for specific classes of tasks – e.g. a main processor, an HD video encoder, an imaging engine, and an audio processor. Since the embedded market had already embraced a parallel model, there was an abundance of optimized chips that hit very specific applications and price points.

Intel had decided to storm ahead as well, and by 2007 it had reached major commercial success with its quad-core processors. However, these devices were still aimed at PC and server markets, and Intel had failed repeatedly to penetrate the mobile market. In fact, it exited that market entirely in 2006, selling off its own ARM computing division (to Marvell). Today, Intel has essentially 0% of the smartphone and portable device market. ARM has over 95%.

Andy Grove, one of Intel’s founders, is famous for the quote “only the paranoid survive.” Andy knew that a company must constantly guard against strategic inflection points – times when the fundamentals of a business or market are about to change. The problem is that these inflection points are often very difficult to measure objectively as they occur. Clay Christensen’s disruptive technology theory, along with Chris Dixon’s comparison of disruptive technologies to toys, explains why this phenomenon tends to recur.

Today, the line between mobile computing and PCs is substantially blurred. Intel, for all its competencies in processor design, appears to have realized this too late. Apple’s iPad looks like it will finally bring tablet computing out of the trough of disillusionment and into the mainstream, and it contains zero Intel content. It’s very likely that Intel will suffer as the PC market plateaus and more and more people turn to “non-PC” computing devices. If Andy Grove were still at the helm, this inflection point would surely cause him to lose sleep at night.
