Decisions of the "00s": Will the Past Define the Future?

There are revolutionary products that reshape the information technology industry and our lives. But before ground-breaking products emerge, world-shattering decisions are made; or perhaps certain decisions change the market so drastically that they become revolutions in themselves. In this article we take a look at the decisions of the recent decade that changed the personal computing industry almost completely.

by Anton Shilov
12/31/2011 | 06:57 AM

A lot of innovative devices have the power to change the future of many industries. The changes, however, do not happen overnight: a lot of work has to be done to materialize an idea or a concept and then leverage it until a strategic inflection point is reached. Sometimes those market-changing events happen as a result of strategic shifts; sometimes the events themselves create the preconditions for strategic changes.

 

In this editorial we will examine a number of decisions by various companies and various people that were, in our view, the most important for the personal computing industry in the last decade. Some decisions created giant companies, others reshaped markets, yet others set industry trends. All of those decisions eventually formed today's IT industry and will continue to influence the market for years to come.

2000: Nvidia Acquires 3dfx: a New Era in PC Graphics Begins

The de facto bankruptcy of 3dfx and the acquisition of its assets by Nvidia Corp. may not rank among the greatest decisions of the last decade, but it is certainly an important one. The event bridges the late nineties and the early oughts on many levels: the takeover of 3dfx marked the beginning of the "gathering of stones" era of massive strategic acquisitions and the creation of mega-corporations.

3dfx Interactive was established in 1994 and released its first commercial product in 1996. The legendary 3dfx Voodoo graphics accelerators were only compatible with the company's proprietary Glide application programming interface (API), which ensured maximum performance. The creativity and pragmatism of the people developing the 3dfx hardware were admired at first and resented later.

The problems at 3dfx began with the rise of open standards like Direct3D and OpenGL. Those standards were supported by the other major IHVs [independent hardware vendors], and the forces aligned against 3dfx were far greater than the strength of the company itself. But this was a foreseeable problem; the real economic problem of 3dfx was the purchase of the graphics board maker STB. While the takeover allowed 3dfx to make high-quality graphics cards itself, it put the company up against all the Taiwan- and China-based manufacturers, who had been using both 3dfx and Nvidia chips. Besides, there were obvious headwinds from ATI and Nvidia, along with game developers, against Glide and stopgaps like miniGL (the so-called miniport driver). The company was bleeding heavily in terms of both revenues and profits.
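To make the "open standard" argument concrete, below is a minimal sketch of what vendor-neutral OpenGL code of that era looked like. It is a trivial GLUT program written purely for illustration, not anything 3dfx or its rivals shipped: the same source compiles and runs on any IHV's hardware with a conforming OpenGL driver, whereas a Glide title was tied to Voodoo boards.

/* triangle.c - a minimal, vendor-neutral OpenGL/GLUT program.
 * Typical Linux build: gcc triangle.c -lglut -lGL -o triangle */
#include <GL/glut.h>

static void display(void)
{
    glClear(GL_COLOR_BUFFER_BIT);
    glBegin(GL_TRIANGLES);
        glColor3f(1.0f, 0.0f, 0.0f); glVertex2f(-0.5f, -0.5f);
        glColor3f(0.0f, 1.0f, 0.0f); glVertex2f( 0.5f, -0.5f);
        glColor3f(0.0f, 0.0f, 1.0f); glVertex2f( 0.0f,  0.5f);
    glEnd();
    glFlush();
}

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_SINGLE | GLUT_RGB);
    glutInitWindowSize(640, 480);
    glutCreateWindow("Any OpenGL-capable GPU can draw this");
    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}

Code written against Glide, by contrast, called 3dfx-specific entry points and had no path to competitors' hardware - exactly the lock-in that developers eventually refused to accept.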

To make matters even worse, the VSA architecture that 3dfx announced back in 1999 - without hardware T&L [transform and lighting, a feature game developers were eager to use] but with chip-level scalability - trailed ATI's and Nvidia's offerings in terms of both features and cost.

In 2000, all the headwinds facing 3dfx simply blew the company off the scene. The outcome was predictable: Nvidia, with its financial power, acquired all of 3dfx's assets, along with its talent and patents.

The end of 3dfx sent a clear signal to the industry: on the PC market it is impossible to impose proprietary standards, rely on sales of own-brand add-in cards and stand against the whole industry at the same time.

2001: Sega Exits Video Game Console Market

It is hard for a pure gaming company to compete against hardware and software giants when it comes to designing consoles that can succeed throughout long life-cycles. In 2001 it became apparent that Sega had made too many mistakes on the game console market and had to cease making consoles and concentrate solely on game publishing.

For many years throughout its history Sega relied on sales of various coin-operated amusement machines such as jukeboxes, slot machines and, later on, arcade video game machines. The devices were rather expensive, but were in very high demand. Over the years Sega faced many challenges and changes, but in the nineties its core businesses were video games, video game consoles and declining coin-operated arcade machines. All the missteps came together in the Dreamcast.

The first decision that doomed Sega's console business was to price its Saturn game console at a whopping $399 in 1995. This price was too high [adjusted for inflation, roughly $600 in 2011 dollars] and the console itself was not considerably better than its direct rivals, the Sony PlayStation and Nintendo N64. The price not only stopped Sega's core fans from buying the device, but also discouraged developers from rolling out new games for the platform. The Sega Saturn was struggling. But the problems were only beginning.

On November 27, 1998, Sega launched the Dreamcast game console in Japan. The Dreamcast used off-the-shelf components and was therefore priced more or less competitively at $199 in the U.S. Unfortunately, it did not offer a "next level" of quality for video games compared to the Saturn. Moreover, its relatively complex architecture, which paired a Hitachi SH-4 processor with PowerVR2 CLX2 graphics, dramatically increased development time for game creators from about six months to as much as two years. At the time, this caused fury among software makers.

Moreover, the Saturn had a really short life of three years, which drove developers away from Sega towards Sony and, later, Microsoft. The core audience drifted away as well: the new console offered only somewhat better graphics and lacked a rich library of brand-new games. There were issues with title availability too: it cost more to develop a title for the Sega Saturn than to design a game for another platform. The Saturn simply became irrelevant.

Keeping in mind its own lack of engineers and the economic power of Microsoft and Sony, Sega decided to quit the home game console market and leave it to Nintendo, Microsoft and Sony. The era of mobile devices was approaching, and without support from software makers and its own hardware developers it made no sense to stay in the game. Nonetheless, Sega now publishes plenty of good video games for all platforms!

2002: HP Acquires Compaq and the PC Goliath Emerges

Some events that took place in the late nineties and the early 2000s continue to plague Hewlett-Packard even today. Still, HP is now the strongest IT company in the world, thanks to its acquisition of Compaq and a great expansion of its markets.

After the acquisition spree performed by Eckhard Pfeiffer, Compaq's chief executive in the nineties, which involved takeovers of Tandem Computer (NonStop servers) and Digital Equipment Corp. (designer of the Alpha server processors), Compaq found itself in a difficult situation. On the one hand, it was in the midst of a price war against all the other players on the PC market; on the other hand, it had precious mission-critical server assets and three different corporate cultures within itself. Even after Pfeiffer was forced out of the company in 1999, the new chief exec Michael Capellas could not turn the company around.

After some struggle, Compaq and Hewlett-Packard announced the world's largest corporate IT merger in September 2001, under which HP would acquire Compaq in an all-stock purchase valued at $25 billion. Numerous large HP shareholders, including Walter Hewlett (the son of Bill Hewlett), openly opposed the transaction, which resulted in an emotional public proxy battle between those for and against the deal. Finally, after an eight-month period ending in May 2002, the merger passed shareholder and regulatory approval, with the end result being one company. After another public clash with Walter Hewlett, Carly Fiorina, chief executive of HP, ordered the combined company to be branded simply as HP. Following the acquisition, HP became a major player in desktops, laptops, PDAs and servers across many different markets.

"People are declaring the PC business dead because it has had a couple of rough quarters. That's incredibly shortsighted. It's clear that this is a critical part of the ability for consumers to do interesting things in their homes. But the reason for buying isn't going to be to get the hottest box at the lowest price. You've got things like digital imaging, digital music. It's something that does something for a consumer. This is what the industry is missing. It's innovation. That's what Dell can't do," said Carly Fiorina at the time.

Although a lot of Compaq's product lines were discontinued by HP, the company still offers support for PDP-11, VAX and AlphaServer machines. Besides, the acquisition of Compaq clearly showed HP the path for further expansion, and in the following years the company continued to take over smaller firms. Some of the large acquisitions (EDS, 3Com) proved to be huge successes; some of the smaller ones (VoodooPC, Palm) were clear failures.

It is clear that HP is now the No. 1 maker of PCs and has enormous corporate, financial and buying power. Nonetheless, there are issues with synergies, major execution challenges, as well as a lack of various ingredients necessary to compete in the 21st century. HP's journey into the new world only began with the purchase of Compaq, and only part of it has been completed so far.

2002: Jerry Sanders Appoints Hector Ruiz as CEO

Advanced Micro Devices has had a number of incredibly important events throughout its long history: the conversion from a maker of RAM and basic logic chips into a maker of x86 microprocessors, the merger with ATI Technologies and so on. But one of the less-noted transitions was the appointment of Hector Ruiz as the company's chief executive in 2002.

Hector Ruiz worked at Texas Instruments for six years and at Motorola for 22 years, after which he was hired by Jerry Sanders to become AMD's president and chief operating officer in 2000 and eventually succeed him as chief executive officer and chairman of the world's second-largest maker of x86 microprocessors. Mr. Ruiz was supposed to bring stability, consistency and great execution to AMD, goals that were only partly achieved. AMD is no longer an underdog CPU supplier but a rather competitive player on the market. Unfortunately, Hector Ruiz himself was ousted from the company after a series of strategic and tactical mistakes.


Jerry Sanders

AMD had a winning K7 micro-architecture in 2000, thanks to the acquisition of a chip design team from Digital/DEC, and was readying the even more promising x86-64 micro-architecture known as K8. From the roadmap and technology standpoints the company was very well set back in 2000: it had plans, it had engineers with innovative ideas, it had leading-edge manufacturing facilities. What AMD did not have was a reputation, a history of flawless execution and internal stability. So, in 2002, Jerry Sanders stepped down and Hector Ruiz became the CEO of AMD.

Hector Ruiz was well known in the semiconductor world thanks to his work at Motorola, and he was among the perfect candidates to persuade new partners, particularly large PC and server makers, to start using AMD microprocessors. Besides, he was expected to ensure AMD's stable and sustainable growth for many years to come. Thanks to the progressive AMD64 micro-architecture and the performance of the Athlon 64 and Opteron central processing units, AMD's market share started to increase from 2004 onwards.


Hector Ruiz. Image from generation-nt.com

Mr. Ruiz achieved quite a lot at AMD: he increased market share, scored a number of major contracts with the largest server makers (IBM, Dell, HP, Sun Microsystems and others), expanded design centers and manufacturing facilities, started a platform-oriented strategy with the acquisition of ATI Technologies, built relations with key customers and transformed AMD into a respected supplier of commercial and enterprise-class components.

But there were failures too. Even though under Mr. Ruiz AMD managed to increase its market share considerably, from about 18% to approximately 23%, the key technologies that transformed AMD from an underdog into a leading CPU maker - HyperTransport and the K7 and K8 micro-architectures - were developed under the supervision of his predecessor, AMD's founder Jerry Sanders.

Hector Ruiz also could not fix several fundamental issues that AMD has always had. The ramp-up of new processor micro-architectures has always taken a considerable amount of time at AMD, and the delays of chips based on the K10 micro-architecture with up to four cores are good proof of that. Additionally, transitions to new manufacturing technologies have also been slow at AMD, and Mr. Ruiz did not manage to change that. In fact, the biggest mistake of Hector Ruiz at AMD was to slow down the transition from 90nm to 65nm. That transition took AMD three years (from October 2004, when the first 90nm chips were launched, to September 2007, when the first 65nm high-end chips were released) and left AMD completely uncompetitive against Intel in terms of performance. Instead of adopting something like Intel's tick-tock model, AMD had to sell off its fabs in the 2008 - 2009 timeframe in order not to go bankrupt. But that is a completely different story...

2003: Steve Jobs Lets iPod + iTunes into Windows World - Apple Breaks Through

The Apple iPod digital media player has indisputably changed the music industry and significantly influenced the media industry in general. But what few people realize is that the iPod and its mass availability essentially transformed Apple from a computer maker into a leading maker of consumer electronics. In fact, Apple's whole roadmap in the last decade was catalyzed by one decision: enabling the iPod and iTunes on Windows.

Although Steve Jobs was a great innovator, he was also a big fan of closed ecosystems. When Apple designed the iPod in 2001, the device was meant to be a gadget for Macintosh users. In fact, many, Steve Jobs included, thought that it would drive sales of Macs upwards, something that was extremely important for the company in 2001 - 2002, as Mac sales had been on the decline for a while. Steve Jobs believed that ceasing to advertise Macs and shifting those resources to the iPod would catalyze sales of both iPods and Macintosh computers.

"I had this crazy idea that we could sell just as many Macs by advertising the iPod. In addition, the iPod would position Apple as evoking innovation and youth. So I moved $75 million of advertising money to the iPod, even though the category didn’t justify one hundredth of that. That meant that we completely dominated the market for music players. We outspent everybody by a factor of about a hundred," Steve Jobs later explained to his biographer Walter Isaacson.


Steve Jobs. Photo by ibitimes.com

Although sales of iPod digital media players gradually increased, especially after the company launched the iTunes online music store in 2003, their success was constrained by the fact that the iPod and iTunes were compatible only with Macintosh systems, whose sales were still very low. When top executives at Apple at the time - Phil Schiller, Jon Rubinstein, Jeff Robbin and Tony Fadell - approached Steve Jobs with the proposal to port iTunes to Windows and enable the iPod to work with it, he reportedly was furious, as Windows and even a relatively open ecosystem went against his nature. Moreover, Steve Jobs continued to think that a Mac-exclusive iPod could help boost sales of Macintosh PCs.

"Until you can prove to me that it will make business sense, I’m not going to do it," he said back then.

The Apple team of executives developed a spreadsheet, and under all scenarios there was no amount of cannibalization of Mac sales that would outweigh the gains from iPod sales. So, after a while, Steve Jobs finally agreed that it made great sense to make the iPod compatible with Windows, and afterwards even insisted on porting iTunes to Windows PCs. iTunes for Windows, released in October 2003, as well as the iPod mini, launched in January 2004, helped sales of Apple's players skyrocket from hundreds of thousands per quarter in late 2003 to millions per quarter in early 2005.
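The shape of that spreadsheet argument is easy to reconstruct. The sketch below is purely illustrative: every figure in it is a hypothetical placeholder (the article does not disclose Apple's margins or volumes), but it shows why, at any plausible cannibalization rate, the extra iPod profit outweighs the Mac profit lost.

/* cannibalization.c - a toy version of the iPod-on-Windows break-even model.
 * All numbers are invented placeholders, not Apple's actual figures. */
#include <stdio.h>

int main(void)
{
    const double mac_profit  = 300.0;      /* assumed profit lost per "cannibalized" Mac, USD */
    const double ipod_profit = 80.0;       /* assumed profit gained per iPod sold, USD        */
    const double extra_ipods = 1000000.0;  /* assumed extra iPods per quarter from Windows    */

    /* Sweep the share of extra iPod buyers who would otherwise have bought a Mac. */
    for (double rate = 0.0; rate <= 0.201; rate += 0.05) {
        double gained = extra_ipods * ipod_profit;
        double lost   = extra_ipods * rate * mac_profit;
        printf("cannibalization %4.0f%%: gained $%11.0f, lost $%11.0f, net $%11.0f\n",
               rate * 100.0, gained, lost, gained - lost);
    }
    return 0;
}

Even at a pessimistic 20% cannibalization rate, the hypothetical numbers above still come out in the iPod's favour, which is essentially the conclusion the executives presented to Jobs.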

The success of the iPod among hundreds of millions of Windows users clearly showed Apple the path to success on the mass market, opened the door for all of Apple's other products, including the iPhone, the iPad and new generations of Macintosh systems, and changed the paradigm of how Apple products are designed and made. Macs were made for hundreds of thousands or millions of users; the iPhone and iPad are aimed at hundreds of millions of consumers - a radical change for Apple and a major shift for the industry.

Later on, iTunes became a universal store for everything: music, videos, movies, books and apps, marking another shift for Apple and setting an example of a large legal content hub.

2004: IBM Spins Off PC Business: The War of the Clones Is Over

On the 12th of August, 1981, International Business Machines introduced the IBM Personal Computer, a system that changed the world and the history of mankind. The computer designed for business, school and home eventually paved the way for new industries, new opportunities and a new quality of life. In 2004, IBM decided to get rid of the PC business, which ended the era of IBM PC-compatibles and kicked off a new one.

The main idea behind the very first IBM PC, the model 5150, was the reason why personal computers eventually became a commodity and IBM had to sell the business unit. Unlike many computers of the early eighties, the IBM PC was modular, which made it possible to quickly configure systems to users' demands and greatly reduced their price compared to competing offerings. Another thing IBM did was allow third parties to develop software for the IBM PC. Off-the-shelf hardware and software components, as well as the open architecture, enabled other computer makers to produce IBM PC-compatible clones. The IBM PC became a standard, and the personal computer industry, as well as a new age in history, was born. Naturally, IBM tried to stop the cloning of its systems with various methods, from the introduction of PS/2 systems in the eighties to expensive innovations like titanium skeletons for ThinkPad notebooks. But the clones prevailed.


IBM ThinkPad. Photo by laptopenterprise.com

While back in the eighties and nineties the production of computers and their components was complex and expensive (it was located in the U.S., Europe, Mexico, Japan, etc.), towards the end of the nineties and in the early 2000s manufacturing became much cheaper (as it was transferred to China, Taiwan, Thailand, the Philippines, etc.). As a result, even small white-box makers without significant capital could build PCs locally while acquiring components from manufacturers in Southeast Asia. Needless to say, price wars started immediately across the industry.

By 2003, it became clear that companies like Intel Corp. considered the PC market to be growing too slowly for them, and they started to architect fully integrated chipset platforms that allowed PC makers to add premium features to systems without increasing their prices too much. As a result, systems with Wi-Fi support, previously considered premium, quickly became standard. Moreover, as chip designers developed their platforms further, other premium features and capabilities migrated from high-end systems into the mainstream. For example, with ultrabooks, ultra-thin systems are available from premium manufacturers and makers of low-cost systems alike at pretty much the same price points.

Another issue for IBM's PC business back in 2002 - 2004 was the fact that manufacturers from China and Taiwan had gained a lot of the advanced technology needed to build high-end systems and components and were able to quickly drop prices (sometimes at the expense of R&D) in order to sustain their revenue growth. Needless to say, by 2004 it was clear: PCs were becoming part of the everyday lives of a billion people - a commodity.

IBM was clearly unable to compete on a commodity market, as the whole company is tailored for high-end innovation; moreover, for IBM, personal computers by 2004 were something of a complementary business. The firm sold high-end servers, enterprise-class systems and mainframes to large businesses, and PCs were sold to those companies as an add-on, which limited the sales growth of the ThinkPad and ThinkCentre lines. Nor could IBM compete against aggressive rivals like Acer, Asustek and others (which essentially produced IBM PC clones) on the consumer PC market. As a result, on the 8th of December, 2004, IBM and Lenovo announced a transaction under which the latter acquired the former's PC business unit.

It is clear that IBM helped to establish the personal computer market, and PCs changed a lot in the world. However, IBM itself could not compete against the crowds of makers who turned the PC into a commodity product. While Lenovo was able to keep the ThinkPad and ThinkCentre product lineups, nowadays those computers no longer represent the pinnacle of innovation. The top-of-the-range ThinkPad X1 laptop has a worse screen than the three-year-old ThinkPad X301 and lacks default WWAN support as well as other must-have features for an ultra-high-end 2011 notebook that claims to be the absolute best. Nonetheless, the heritage of IBM innovation lives on in the spill-resistant keyboard, the roll-cage design and other important parts of the ThinkPads.

2005: Steve Jobs Shakes Hands with Paul Otellini: x86 Now Inside Apple 

For many years Apple based its Macs on the not-so-popular PowerPC architecture, with chips made by IBM and Freescale/Motorola. Even though there was some competition between IBM and Motorola, it did not result in frequent product updates, massive performance improvements or high-volume availability. As a result, Apple needed to find a new chip partner, and Intel was a perfect fit.

“Our goal is to provide our customers with the best personal computers in the world, and looking ahead Intel has the strongest processor roadmap by far. It's been ten years since our transition to the PowerPC, and we think Intel's technology will help us create the best personal computers for the next ten years,” said Steve Jobs, Apple’s CEO back then.

Although PowerPC chips made by Freescale/Motorola and IBM powered Apple Macintosh systems for about a decade, by the early 2000s their performance was clearly lower than that of the AMD Athlon 64 or Intel Pentium 4. Moreover, the absence of frequent chip updates slowed down Apple's own product refreshes, something very critical in a rapidly changing world. Finally, neither IBM nor Motorola had a product lineup that covered everything from smartphones to tablets to notebooks to desktops to high-end workstations. Not that Intel has a lineup like that now, but it did have a clear roadmap back then and is on track to address nearly every possible market with its low-power and high-performance central processing units.


Paul Otellini in a "bunny" suit hands a wafer with Intel microprocessors to Steve Jobs. Photo by clubic.com

So Apple plotted its transition to x86 starting with the very first versions of Mac OS X back in 2001: the operating system itself could run on both PowerPC microprocessors and Intel x86 chips. An issue for Apple, and for Steve Jobs personally, was price: the company wanted an exclusive deal with Intel in exchange for discounts. Finally, the deal was reached in 2005.
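For developers, the mechanism that made the eventual switch relatively painless was the "universal" (fat) binary: one executable carrying both PowerPC and x86 code, with the OS picking the matching slice at launch. The sketch below is illustrative and assumes Apple's gcc toolchain of the Mac OS X 10.4/10.5 era, which accepted multiple -arch flags.

/* hello_universal.c - reports which architecture slice of a fat binary is running.
 *
 * Illustrative build on a Mac OS X 10.4/10.5 system with Apple's gcc:
 *   gcc -arch ppc -arch i386 -o hello_universal hello_universal.c
 *   lipo -info hello_universal    (should report both ppc and i386 slices)
 */
#include <stdio.h>

int main(void)
{
#if defined(__ppc__) || defined(__ppc64__)
    printf("Running the PowerPC slice of this universal binary.\n");
#elif defined(__i386__) || defined(__x86_64__)
    printf("Running the Intel x86 slice of this universal binary.\n");
#else
    printf("Running on an architecture this sketch does not label.\n");
#endif
    return 0;
}

The same Mach-O file contains both compiled slices, so a single boxed application could span the PowerPC-to-Intel switch without the user ever noticing which CPU was inside.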

"Apple and Intel, together at last,” flashed on the big screen while Steve Jobs and Paul Otellini hugged.

Apple received an astounding number of benefits from collaborating with Intel: higher supply volumes; more rapid performance growth; product design and pricing flexibility thanks to Intel's broad product portfolio; and lower build costs thanks to standard components. Apple and Intel can now also collaborate on leading-edge technologies: this year, for example, they rolled out Thunderbolt, with more to come. Finally, nobody can now claim that Apple's systems offer lower performance than Windows-based PCs, as they are based on the same microprocessors. Apple's current market share worldwide is still a rather modest 5.5%, but it is a couple of universes ahead of where it was back in the mid-2000s.

What is even more interesting is that Apple is looking forward to unifying Mac OS and iOS several years from now. As a result, it will either have to keep ARM inside mobile devices and x86 inside more powerful machines, or enable applications to run on both the ARM and x86 architectures, causing another major tectonic shift for the industry.

2006: AMD Acquires ATI: Future Is Fusion!

Although the takeover of ATI - a decision that may arguably be called the decision of the decade - saved AMD during hard times, the acquisition wrecked AMD's own plans and ruined a lot of ATI's prospects, while causing massive ramifications across the industry.

By the end of 2005 it had become clear that GPU-accelerated computing had massive potential for various workloads (most importantly, high-performance computing [HPC], where AMD's chips were popular) and that high-definition visuals would prevail over the simplistic user interfaces of the first half of the 2000s. AMD realized that ten years down the road its chips might not scale in performance as easily and as quickly as graphics processors, and that it would therefore have to share the HPC space with special-purpose GPU-based accelerators. Besides, given that it had no chipsets and graphics processors of its own, eventually it would not be able to compete against Intel when it came to platforms, as it was clear that the graphics core was heading under the hood of the microprocessor. So in late 2005 AMD started to negotiate with ATI Technologies about a merger or acquisition.

There was a great rationale for AMD to buy ATI at the time: AMD was growing in terms of both CPU sales and market share; it had massive manufacturing capacity expansion plans for 2010 and beyond; it needed its own computing platform with chipsets and embedded graphics cores; and it needed to enter new markets, most importantly GPUs and consumer electronics. Furthermore, AMD knew it would lose the performance crown to Intel in late 2006 and clearly needed profitable sources of revenue.

ATI was in a different situation. The company had always said that it was tailored for product portfolio expansion. In mid-2006, 77% of ATI's revenues came from various PC segments (GPUs, chipsets), while about 23% came from its consumer electronics divisions. While the competition between ATI and Nvidia was intensifying back then, ATI had bright prospects on the DTV and ultra-mobile graphics markets. There was a threat that Intel would enter the market for discrete graphics adapters, which would have hurt ATI's sales, but this was certainly not the primary reason why ATI decided to become part of AMD. Perhaps the prospect of forming an extremely competitive and extremely large corporation inspired the move, perhaps there were other incentives, but the companies officially became one in late October 2006.

Unfortunately, the merger did not proceed smoothly. After ATI began to lose market share and sales volume and failed to deliver its DirectX 10 lineup on time, questions arose as to whether the merger had actually been necessary. Moreover, AMD-ATI reconsidered the Fusion central processing unit (CPU) roadmap a number of times, which made the merger look even more questionable. The year 2007 was a complete disaster for the "new" AMD. The company delayed the release of its 65nm quad-core AMD Opteron "Barcelona" processor and the subsequent AMD Phenom X4 "Agena" chip by around three quarters, and when it did release them, it emerged that the design had an erratum, which essentially stalled shipments. Moreover, the ATI Radeon HD 2000 failed to beat the Nvidia GeForce 8000-series in terms of performance, and then the ATI Radeon HD 3000 did not manage to offer performance advantages either. The graphics business unit - ATI - was back on track as early as 2008 with the award-winning Radeon HD 4000-series; meanwhile, AMD still fails to deliver CPUs that beat Intel's without compromise.

But the worst thing that happened to the combined AMD was the necessity to sell its ultra-low-power GPU assets, its DTV assets and its manufacturing facilities. At present AMD simply cannot compete on the consumer electronics market, and it will take years for AMD to develop new ultra-low-power graphics technology for smartphones and tablets and to adapt the x86 micro-architecture to those devices. Moreover, without actual in-depth knowledge of the Android or Windows Phone platforms, the company will not be able to compete against already established players like Nvidia, Qualcomm, Samsung or Texas Instruments. Finally, without its own production facilities, the company cannot quickly tailor process technologies to boost clock speeds and improve yields of its current CPUs.

While AMD continues to have bright ideas about the future of CPU and GPU technologies, it lacks many of the assets needed to succeed in the next generation of devices.

2007: Google Founds Open Handset Alliance: The "Next Windows" Is Born

Microsoft Corp. cannot be challenged on its home turf, since the Windows operating system is the de facto industry standard for personal computers. So Google decided to challenge Microsoft on completely new ground: smarter mobile devices that are more aware of their owner's location and preferences. But Google decided not to go to war alone: in 2007 it formed the Open Handset Alliance with cellphone manufacturers, network carriers and others to develop the Android operating system.

The OHA made the launch of the Android operating system possible, and the OS was released commercially in 2008. But the history of Android started a lot earlier, sometime in 2003, when Andy Rubin (co-founder of Danger), Rich Miner (co-founder of Wildfire Communications), Nick Sears (a VP at T-Mobile) and Chris White founded Android, a company set up to develop an operating system for smart ultra-portable devices with rich multimedia capabilities and a permanent connection to the Internet. After a series of financial struggles, Android was acquired by Google in 2005.

But while Google was already a strong player on the market, it clearly understood that without backing from smartphone manufacturers, operators, semiconductor companies and software developers, Android would never become really popular. The Open Handset Alliance united all of those parties and let everyone influence the operating system without dictating to anyone exactly what to do.

As a result, there are hundreds of different Android-based smartphones and tablets from dozens of manufacturers across the world. While it is true that the Android market is heavily fragmented, whereas on Apple iOS and Microsoft Windows Phone everything is stable, there is also something for everyone. It is not surprising that Android is now the most popular operating system for smartphones and the second most popular operating system for media tablets.

Today, over 700 thousand Google Android-based devices are activated daily. For comparison, about a million Microsoft Windows-powered PCs are purchased every day across the world. Therefore, Google has every right to call Android the next Windows, which it effectively is.
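To put those two daily figures on the same scale, the trivial sketch below simply annualizes them (both inputs come from the paragraph above; everything else is plain arithmetic).

/* runrate.c - annualize the daily activation/purchase figures quoted above. */
#include <stdio.h>

int main(void)
{
    const double android_per_day = 700000.0;   /* Android devices activated per day (quoted) */
    const double windows_per_day = 1000000.0;  /* Windows PCs purchased per day (quoted)     */

    printf("Android: roughly %.0f million devices per year\n", android_per_day * 365.0 / 1e6);
    printf("Windows: roughly %.0f million PCs per year\n", windows_per_day * 365.0 / 1e6);
    return 0;
}

That works out to roughly a quarter of a billion Android devices against some 365 million Windows PCs a year - the same order of magnitude, which is why the "next Windows" comparison is not an exaggeration.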

2008: Warner Bros. Drops HD DVD Support, Blu-ray Becomes the New DVD

Sometimes one company's decision defines the future of a whole industry. Warner Bros.' decision to choose Blu-ray as its primary high-definition home entertainment format for the future caused all the other market players to switch from HD DVD to Blu-ray.

Despite the fact that the Blu-ray disc had a larger storage capacity than HD DVD, during the 2006 - 2007 war between the high-def formats capacity was not something that played a critical role. HD DVD had its own advantages, the most important of which were price and a finalized specification. The most important factor for any format or platform aimed at consumers, however, is support from content owners.

Among the major Hollywood studios, HD DVD was exclusively supported by Universal Pictures, whereas Sony Pictures, Twentieth Century Fox and Walt Disney exclusively backed Blu-ray. Paramount Pictures and Warner Bros., as well as the studios controlled by the two companies, remained format-neutral and released both BD and HD DVD titles. In a bid to establish the leadership of their formats, Sony and Toshiba had to provide incentives to "convert" studios into exclusive supporters.

In mid-2007, after it transpired that HD DVD was winning the game in Europe, Paramount decided to become HD DVD exclusive (for some $150 million in incentives), which provided a huge benefit to the format in the form of the highly popular "Transformers" movie. In September 2007 Walt Disney approved the 51GB HD DVD disc specification at the DVD Forum, and Twentieth Century Fox started talks with Toshiba and Microsoft about becoming HD DVD exclusive, provided that Warner Bros. also went HD DVD exclusive. All in all, Warner Bros. became the key to winning the format war for both rival camps, which meant that the prize would go to the highest bidder.

Various rumours around the Internet claimed that the Blu-ray Disc Association provided "incentives" worth $500 - $620 million to Time Warner, the parent company of several studios, including Warner Bros. and New Line Cinema, but Time Warner denied those allegations. Another rumour suggested that Warner's Blu-ray exclusivity would only last until Q1 2009 and that the company got $450 million for it.

At the Consumer Electronics Show 2008, Warner dropped a bomb on HD DVD by announcing plans to become Blu-ray exclusive starting in May 2008, which left Toshiba with only Paramount and Universal as backers of the format. Several weeks after Warner's announcement, Toshiba itself said it would cease manufacturing HD DVD hardware, which marked the end of the format war as well as of the HD DVD technology.

Perhaps there was a lot of rationale for supporting Blu-ray instead of HD DVD, since BD's higher capacity and bandwidth eventually enabled such things as stereo-3D Blu-ray 3D and 7.1-channel DTS-HD Master Audio. But it was all the result of one company's decision.

2009: Globalfoundries: Real Men No Longer Have Fabs

Jerry Sanders, the founder of Advanced Micro Devices, once said that real men - those who can actually compete on the x86 microprocessor market - have fabs. Unfortunately, a series of mistakes and an economic downturn left AMD without manufacturing capacities of its own.

Hector Ruiz did a lot of good things for AMD. Moreover, he understood very well that two of the major reasons why Intel is so far ahead of AMD are its ability to produce huge volumes of microprocessors thanks to massive production capacity and its leadership in manufacturing technologies.

Hector Ruiz negotiated with authorities in Germany and New York about expanding manufacturing capacities and constructing a new fab. According to mid-2000s estimates, by 2010 the Sunnyvale, California-based chipmaker was to have three fabs producing chips on 300mm wafers with a total output of up to 67,500 wafers per month, which would have meant increasing its capacity more than fivefold from 2005 to 2010. Unfortunately, those plans were never fulfilled.

AMD's chief executive decided to somewhat slow down the transition to 65nm process technology, since sales of 90nm chips were very high back in 2005 thanks to their performance advantages over Intel's processors. However, the three-year life-cycle of the 90nm technology left AMD completely uncompetitive against Intel in 2007. Moreover, design issues with the famous "Barcelona" product based on the K10 micro-architecture left AMD without premium CPU sales in 2008. Fortunately for AMD, Dirk Meyer was instrumental in fixing the situation with both the 45nm process and the K10.5 micro-architecture, so AMD could win back some positions when it launched its Phenom II chips in early 2009.

But poor sales and losses throughout 2007, 2008 and 2009, caused by uncompetitive products and the economic crisis, forced AMD to make a deal with Advanced Technology Investment Company (ATIC), an Abu Dhabi state investment vehicle, to spin off its manufacturing facilities and form Globalfoundries, a contract maker of semiconductors.

While initially AMD had massive influence over Globalfoundries - its 45nm manufacturing technology was developed under AMD's own supervision and was tailored for its microprocessors - the company eventually sold the majority of its GF stock to ATIC and lost its leading position on GF's board of directors. Without hands-on control at Globalfoundries, AMD ran into massive problems with the 32nm fabrication process. Instead of shipping its highly anticipated Bulldozer products in 2010, the company had to delay them by about a year and only initiated shipments in late September 2011. To make matters worse, AMD fired its chief executive in early 2011, and for nine months there was no executive to tackle the problem and its possible consequences. Going forward, AMD will likely solve its problems with fabrication processes, yields and so on. But it will no longer be the AMD of 2000 - 2006, with the performance crown on its head and limitless prospects down the road.

But while the real men from AMD no longer have fabs, the industry has gained a new large contract maker of semiconductors that is able to compete against Taiwan Semiconductor Manufacturing Company in terms of both process technologies and volumes, thanks to strong financial backing from ATIC and Mubadala.

2010: Microsoft Embraces ARM Micro-Architecture for Major Windows Operating Systems

Since 1997 Microsoft and ARM have worked together on software and devices across the embedded, consumer and mobile spaces, enabling many companies to deliver user experiences on a broad portfolio of ARM-based products. In July 2010, Microsoft announced that it had licensed the ARM architecture and gained closer access to ARM's intellectual property (IP), which signaled the beginning of the Windows-on-ARM initiative. At CES 2011 the world's largest software maker confirmed plans to let Windows 8 run on ARM microprocessors.

Microsoft Windows 8 on ARM will be primarily designed for tablets as well as ultra-low-power notebooks or desktops that do not require truly high performance. Applications designed for x86 hardware will not be able to run on ARM-powered system-on-chips, but Microsoft did imply that apps designed for the Metro interface of Windows 8/Windows Phone will run on any device with any processor, provided that it runs the appropriate Microsoft operating system. Besides, Microsoft is very likely to make its own software, for example MS Office, compatible with both ARM and x86 systems.

The "WARM" pact between ARM and Microsoft is a clear breakthrough not only for ARM, but also for the whole computing and consumer industries. For over a decade ARM only powered devices with limited capabilities and low power consumption. Thanks to "WARM", ARM breaks through onto feature-rich Windows 8 tablets, notebooks and even low-power desktops that can tolerate limited performance of ARM architecture. Naturally, Microsoft also opens itself doors onto yet untapped markets of consumer electronics, something that Windows XP Media Center Edition failed to do.

The close work with ARM may perhaps even lead to the emergence of Microsoft's own hardware, such as smartphones and tablets (should Microsoft be willing to compete against its own partners). Besides, rumour has it that Microsoft's Xbox Next (Xbox Loop) will also contain an SoC with ARM general-purpose cores, custom AMD Radeon HD graphics and other components.

Obviously, the WARM pact threatens numerous companies, including hardware makers like AMD and Intel as well as software developers like Apple and Google (and emerging platforms like Tizen). Microsoft does not care about Atom or Athlon; what it cares about is deployments of its own operating systems. ARM support and a proper Windows implementation allow Microsoft to compete in emerging device markets while maintaining its leadership in the desktop/notebook and server markets.

Going forward, it is logical to expect Microsoft to fully unify the kernels of its Windows and Windows Phone operating systems, another step towards a unified customer experience across a range of devices. In any case, support for ARM in "big" Windows 8 in 2012 clearly disrupts many industries, even though it stems from just one big decision.

2011: The Year of Shifts 

The year 2011 is about to end. We do not yet know which of the decisions made this year will actually disrupt the consumer or computing industries. We have picked three decisions that clearly have long-term consequences and can exert influence well beyond the companies that made them. Coincidentally, all of those decisions were made early in the year.

2011: AMD Fires Dirk Meyer

In early January 2011, the board of directors of Advanced Micro Devices decided to oust Dirk Meyer from the company over disagreements about his strategy for its further development. Perhaps his plan was not the right one, but it is unclear whether the new chief executive can actually form a winning strategy and tactics and then execute his roadmap flawlessly.

Dirk Meyer is a person whose role at AMD is hard to overestimate. He personally led the development of the legendary Athlon processor that completely changed AMD as a company. He also participated in the design of AMD's x86-64 architecture as well as the K8 processors known as Opteron and Athlon 64, the chips that drove AMD into high-end desktops and the datacenter. Besides, starting in 2008 he managed to get AMD back on track during the worst economic downturn of the recent decade.

Based on what we do know, the reason AMD's directors decided to oust Mr. Meyer was his concentration on high-end servers and datacenters while not properly addressing the market for ultra-low-power devices. Mr. Meyer justified that focus by stating that the growing mobile and consumer electronics markets do not shrink the traditional ones. The former chief executive of AMD wanted to address the ultra-low-power markets later, when AMD was prepared to dedicate sufficient resources to address them well. But the board of directors decided that it needed a clear roadmap for ultra-portables.

AMD hired ex-Lenovo executive Rory Read to address mass markets with AMD's chips. Mr. Read is not an engineer and cannot solve complex hardware-related problems hands-on. It remains to be seen whether he will be as successful at AMD as he was at IBM and Lenovo.

2011: Nokia Chooses Windows Phone, Inventor of Smartphones Loses Crown

In February Nokia dropped a bomb on its own sales by announcing plans to adopt the Windows Phone operating system and phase out its own Symbian. Sales of Nokia's existing smartphones quickly declined, whereas Microsoft Windows Phone failed to get much attention from end-users.

Stephen Elop, a long-time Microsoft executive, became chief executive of Nokia in September 2010 after the board of directors decided that his predecessor had failed to reclaim lost market share in the U.S. Generally speaking, his mission was to address the North American market without losing share in Europe and other regions. Unfortunately for Nokia, pre-announcing the transition to Windows Phone 7.5 "Mango" early in the year harmed not only Nokia's sales, but also its revenues and stock price.

Nokia had been criticizing the Google Android platform for a while, claiming that software limitations restricted differentiation between vendors. At the same time, Microsoft puts a cap on both the hardware and software capabilities of Windows Phone, which is why the platform is not popular among manufacturers, carriers or customers. While about 700 thousand Google Android devices and 300 thousand Apple iOS devices are activated daily, Microsoft keeps quiet about the success of its WP OS. Gartner, nonetheless, claims that Microsoft Windows Mobile and Windows Phone were installed on 1.5% of smartphones sold in Q3.

Nokia's announcements helped drive sales of Android devices and the iPhone upwards and even helped make Apple and Samsung the leaders of the smartphone market. It remains to be seen whether Nokia will ever become the No. 1 smartphone vendor again.

2011: Oracle Drops Itanium Support, Loses Interest in x86 Chips

It looks like there is no place for proprietary processor architectures even in the mission-critical segment of the market.

Oracle, the world's largest designer of business software, said in March that it had stopped development of all software for Intel Corp.'s Itanium platform. The company said that the decision was made because of Intel's focus on x86 and the fact that Itanium was nearing the end of its life. However, Oracle had its own clear reasons not to support Itanium: after the acquisition of Sun Microsystems, it obtained its own mission-critical server platform. Towards the end of the year, Oracle said it was not interested in selling x86 servers either.

What are the ramifications for the industry? At present there are three highly integrated mission-critical software/hardware platforms on the market: IBM Power, Oracle SPARC, and HP Superdome/Integrity together with HP NonStop, both powered by Intel Itanium. Until early this year, Oracle supported all three of them with its Oracle DB and other high-end enterprise software. Now, however, there will be no more Oracle software for Intel/HP business-critical platforms.

Plenty of software and hardware partners have been dropping support for a while now. Many server makers abandoned the Itanium platform, including Dell and IBM, and so did plenty of software developers, including Microsoft and Red Hat. Even SGI reduced its focus on Itanium back in 2009. Essentially, this means that Intel's plan to make proprietary IA64 mission-critical systems widely available has failed.

The world's largest chipmaker understands that Itanium will never shine and will never replace either Power or SPARC. As a result, Intel has been adding RAS (reliability, availability, serviceability) features to its Xeon microprocessors, and at present Itanium and Xeon are equally reliable in terms of hardware. However, Itanium is also backed by the HP-UX, OpenVMS and NonStop platforms, which were made for business-critical machines. It is possible for HP to rework those operating systems to run on Intel Xeon chips (HP's Project Odyssey comes very close to that), but the question is whether Oracle is actually interested in supporting HP-UX and NonStop at all.

It looks like Itanium has a number of years ahead of it, but the future belongs to Intel's x86 solutions when it comes to enterprise-class systems. Perhaps this is exactly what Oracle is afraid of. After its comments about the negligible share of x86 servers in its own shipments, rumours about dropping support for non-Oracle Linux distributions, and the cessation of new software development for Itanium, it is clear that Oracle wants to concentrate on its own SPARC platform for different classes of servers and make it the best in the world.

Why? Because once Itanium is dead, the market for mission-critical machines will be redefined again. In fact, there are already signs of this happening. Oracle wants to meet the new challenges fully armed and without established rivals like Itanium. Either way, changes are coming to the universe of mission-critical computing.