Breakthroughs of the Decade: Products and Technologies That Changed the World

Every day thousands of electronics products are born and vanish into oblivion. Some devices and technologies capture a significant market share; some just become one of many. But there are devices that we call "breakthroughs": those that not only play a significant role on the market, but become inflection points for the high-tech consumer industry at large. Today we are taking a look at the digital products that impacted our lives over the last ten years.

by Anton Shilov
12/25/2010 | 11:43 AM

Each year end-users acquire hundreds of millions of electronic gadgets worth hundreds of billions of dollars. The absolute majority of companies have broad product lines with models aimed at different types of customers with diverse requirements and incomes. But it does not take hundreds of devices to change the future of the market forever: the future can be changed by one innovative device, one right or wrong decision, or simply by progress itself.

 

While there are market-changing events and products, they do not emerge overnight. They are not traditional milestones, but rather the results of evolutionary chains of events that turn into revolutions.

In this article we are going to take a look not only at the products that eventually became strategic inflection points for the industry and catalyzed the creation of new types of companies and the emergence of new rivals, but also try to understand why those products materialized at all and which changes they brought.

2000 - Athlon and Duron Remake AMD

For many years Advanced Micro Devices was a second source of x86 microprocessors for makers of personal computers and basically lived in the shadow of its larger rival, Intel Corp. The company made clones of Intel-designed processors and could hardly imagine itself as a rival that would offer high-performance leading-edge products. But the last decade changed everything: even though AMD is still a lot smaller than Intel, its microprocessors now compete head-to-head against Intel's chips.

Founded in 1969 by a group of ex-Fairchild employees headed by Jerry Sanders, AMD made random access memory (RAM), simple logic chips and, eventually, clones of Intel microprocessors: in 1982 International Business Machines demanded a second source for x86 processors developed by Intel, and the chip designer signed an agreement that allowed AMD to produce clones. But in 1986 the larger rival refused to provide technical details about the 80386 microprocessor, and after numerous disputes in court the company had to reverse-engineer both the 80386 and 80486 series of microprocessors to create its own. Quite unsurprisingly, as design cycles became shorter, reverse-engineering became inefficient, and it was clear that the company needed to design its own chips. The first two generations of AMD's own microprocessors, the K5 and K6, were not exactly competitive in terms of floating point performance; yet they were still an inexpensive alternative to Intel's chips thanks to the fact that they could be installed into the same sockets.

The seventh-generation AMD microprocessor had to not only offer performance on par with or higher than Intel's Pentium, but also be fast enough to make mainboard manufacturers produce motherboards compatible only with AMD chips. Knowing that, Jerry Sanders, the co-founder and chief executive officer of AMD, had been hiring engineers from outside the company since the mid-nineties. Following his "people first, products and profits will follow" motto, Mr. Sanders hired Dirk Meyer and a team of talented engineers who had co-developed the Alpha-series processors at DEC in 1996. That team created the processor announced on June 23, 1999: the AMD Athlon.

AMD started to ship its Athlon processor in August 1999, and the world quickly discovered that the chip could outperform Intel's Pentium III at the same clock-speed and had better frequency potential. Moreover, Intel's 820 core-logic, compatible with the latest Pentium III "Coppermine" processors, used expensive RDRAM memory developed by Rambus, which did not provide any tangible performance benefits. A version of the core-logic with a memory translator hub (MTH), which allowed the use of traditional PC100 SDRAM, turned out to be extremely buggy. The result was clear: AMD systems were faster, less expensive and more reliable.

Unfortunately for AMD, the new central processing unit came in a slot form-factor, which meant that mainboards for the Athlon were expensive to make. In addition, AMD's 750 core-logic was far from the best chipset ever designed, and Via Technologies' KX133 was nowhere to be found. To make matters worse, motherboard manufacturers were reluctant to produce platforms for the Athlon (since they did not want to ruin their relationship with Intel, an explanation I would hear again in 2005 from a server maker that at the time did not build Opteron-based servers), which greatly slowed down adoption of the chip by the masses. But the main task was achieved: AMD's CPUs could outperform Intel's without any technical problems.

The Sunnyvale, California-based AMD understood the issues with the first-generation Athlon very well and did its homework with the release of the second-generation Athlon, code-named Thunderbird, in mid-2000.

But AMD did not just launch a better Athlon; it also released the AMD Duron, a low-cost chip with a severely cut-down L2 cache that not only blew away the Celeron, but actually managed to rival the much more expensive Pentium III.

The launch of the AMD socket A platform was a huge win. End-users gained great confidence in AMD and its products. A number of leading computer makers adopted both the Athlon and the Duron. Seeing AMD's potential, software designers started to tune their applications for the company's chips. Mainboard manufacturers initiated production of high-performance motherboards for AMD microprocessors, a never-before-seen scenario. By late 2000 AMD Athlon processors had received over 70 awards from journalists and market analysts.

The financial results were terrific. In early January 2001, AMD reported annual sales for 2000 of $4.644 billion, a 63% increase year-over-year, as well as net income of $983.026 million, another record for the company.

"Our technology and manufacturing organizations distinguished themselves from the competition by executing nearly flawlessly. [...] Despite the PC slowdown, AMD gained market share in the PC processor arena, with more than 26.5 million total units sold in 2000. We believe we gained three points of market share in units during the year, to approximately 17% of the worldwide market for PC processors," said W.J. Sanders III, chairman and chief executive officer of AMD back then.

"Athlon and Duron processors in the socket A version were just loved by enthusiasts. Probably, they, along with Intel's Celeron at the time may be called the beginning of the era of mass overclocking - and partly due to this fact AMD owes its success. Socket A processors were very friendly to overclocking both because they were easy to overclock and because they were really overclockable since AMD artificially limited their frequency. Many remember the famous 'golden bridges' on the surface of the processors' substrate, modification of which could easily and naturally change the multiplier of these processors," said Ilya Gavrichenkov, CPU analyst at X-bit labs.

It would take Intel a new processor micro-architecture called Netburst, more than a year of waiting for it to show its potential, dropping exclusive support for RDRAM and reconsidering a number of other things to reclaim the performance crown in mid-2002. AMD would have to face numerous new challenges on the market and completely reinvent itself several times in the following years. There would be wins, losses and exciting introductions that would bring further major changes.

But the year 2000 was the year of triumph for AMD and for the desktop microprocessor market: no more top Intel chips priced at over $1000 per unit; no more "Intel only" attitude among computer enthusiasts; no more claims about the low performance of AMD processors. The world now had two suppliers of microprocessors that were equally good.

2001 - Apple iPod: Digital Music Revolution Officially Happens

Throughout its long history, which started in 1977, Apple has always been a pretty creative company that has constantly taken untraditional approaches to technology and software. But as time went by, Apple consistently failed to meet the price-points demanded by end-users and did not use the Windows operating system, and the company's market share shrank as a consequence. When Steve Jobs returned to Apple in 1997, the company was struggling for survival and its stock was at a twelve-year low.

Thanks to the return of Jobs to the position of chief executive officer, from 1997 to 2001 Apple launched a number of interesting products, including the iMac, iBook and Power Mac G4. All of these featured aesthetic design as well as new technologies, such as Wi-Fi, not available from competitors back then. But the real return of Apple as a provider of solutions for the masses came with the iPod personal digital media player.

By the late nineties the MP3 music format had become popular among enthusiasts and allowed them to quickly get music from the Internet. One of the main problems, however, was that the majority of such downloads were illegal, and consequently audio files, especially new tracks, were not easy to find. Moreover, where MP3 music stores existed, they lacked a broad choice of music. In general, while the digital music revolution had already happened, it had not yet happened for the masses. In addition, there were very few applications that allowed users to convert files from one format to another, organize music on players or even enhance the quality of MP3s.

As file-sharing technology quickly became popular, portable MP3 players emerged on the market. Like all new devices, those products were still in the process of finding the right balance between size, battery life, features, design and so on. For example, many first-generation MP3 players used CDs to store music, which made them bulky, or NAND flash memory, which was too expensive to be installed in large quantities, so those players could not store many songs. Naturally, none of the players was ideal. As a result, the masses were reluctant to acquire sub-par devices.

Evidently, Steve Jobs recognized the power of digital music and digital music downloads altogether. He formed a team within Apple headed by Jon Rubinstein, which created a player in less than a year. The player was based on technologies originally designed by PortalPlayer, but Apple installed a portable hard disk drive instead of flash memory or a CD drive, developed an easy-to-use interface (which Creative claimed was copied from its Nomad music players introduced earlier, and the two companies eventually had a legal dispute about this) and ensured simple usage as well as a visually attractive design. In October 2001 the iPod was officially introduced and went on sale, but its reliance on the FireWire interface, its $399 price-point and some other factors kept it from becoming popular from day one. Still, its success was evident: Apple sold 125 thousand first-generation iPods by the end of the year.

Before releasing the iPod, Apple developed a special software suite called iTunes (launched in early 2001) that could easily convert files to and from the MP3 format, organize music libraries, create playlists and mixes, and so on. The software eventually gained support for the iPod as well as Rio music players.

The combination of the iPod and iTunes gained some popularity among Macintosh users, and in mid-2002 the company released the second-generation iPod, which sported a new touch-sensitive scroll wheel, had a better visual design and added Windows compatibility (initially through third-party software, since iTunes would not reach Windows until late 2003), but it was still expensive and relied on the FireWire interface.

In April 2003 the company announced the third-generation iPod with USB support as well as the iTunes Music Store, which offered one of the largest collections of legal music downloads available. After that, sales of the iPod began to skyrocket. Apple announced that the two millionth iPod had been sold in early 2004, and in May 2004 shipments of the players hit the 3 million mark.

The very first iPod, introduced in 2001, was, just like the other players on the market, not ideal. But it was the player that took a step in the right direction. It represented the official beginning of the digital music revolution.

2002 - ATI Radeon 9700 Pro: With DirectX 9 to the Stars

Founded in 1985, ATI Technologies had always been an innovative, yet relatively conservative company. After numerous industry "firsts" as well as numerous market failures, the year 2002 brought a major change both for ATI and for the consumer graphics card market.

For many years ATI sold its graphics boards either to OEMs or under its own brand. In addition, the company was so committed to the demands of computer makers that its first generations of 3D graphics accelerators could not match those from companies like 3dfx, but could match customers' price-points. On the other hand, following usual business logic, ATI diversified its product lines: it introduced the world's first mobile graphics chip with 3D acceleration, it made the first graphics adapter with an integrated TV tuner as well as video-in and video-out connectors, and it developed one of the first graphics accelerators for mobile phones. The company consistently acquired smaller market players and startups to boost its resources with new engineers and technologies. For example, in early 2000 the company bought the little-known ArtX, which had previously worked on graphics processors for the Nintendo 64 and Nintendo GameCube, for a whopping $400 million in stock and appointed the head of that company, David Orton, to the position of president and chief operating officer. The acquisition later proved to be the best investment the company ever made.

But in spite of its innovative DNA, there was one thing ATI consistently failed at: introducing new graphics accelerators for gamers on time. Starting with the introduction of the GeForce in late August 1999, Nvidia was consistently the first to launch new chips with new DirectX functionality and new levels of performance. Thanks to successful products like the GeForce and GeForce 2, it managed to crush rivals S3 and 3dfx. With the GeForce 3 arriving in early 2001, Nvidia became enemy No. 1 for ATI.

While the DirectX 8-supporting GeForce 3 was an important product, ATI understood well back in 2000 that its more advanced Radeon 8500 (R200) would be late to market. Instead of speeding up its introduction or boosting its performance, the company concentrated on development of the chip code-named R300. Meanwhile, Dave Orton transformed the company from an add-in card vendor into a graphics chip designer with many partners building graphics cards powered by its GPUs.

Some say that ArtX had developed the R300 almost completely by the time it was acquired by ATI Technologies; others claim that, thanks to the new blood it received as a result of the takeover, the Markham, Ontario-based company formed a second GPU development team, which eventually created the legendary Radeon 9700. In any case, the ATI R300 was not just a blow to Nvidia: it was a massive offensive on the market of high-performance graphics cards for enthusiasts, a market mostly created by 3dfx and Nvidia.

From day one, the ATI Radeon 9700 Pro and its derivatives were the fastest graphics cards on the market, dramatically outperforming the competing GeForce 4-series chips. Nvidia hinted (by releasing OpenGL extensions for its NV30) that its forthcoming DirectX 9 GPU supported more features, but months passed and there were no products from the former leader. When the GeForce FX 5800 Ultra finally emerged in March 2003, it was slower than the Radeon 9700 Pro, especially under heavy usage of shaders, something that set Nvidia back considerably at the time.

A year after the launch of the R300, its successor, the Radeon 9800 XT, was the graphics card of choice for gaming enthusiasts as well as computer makers, and starting from June 2004 David Orton was the chief executive officer of ATI Technologies.

The success of the R300 was a result of many events and decisions that have no direct relation to each other. Firstly, ATI bought ArtX, a little-known company; secondly, the company changed its business model from selling graphics cards to selling graphics processors; thirdly, the company decided to concentrate on designing the R300 instead of speeding up the R200; fourthly, it made a number of design decisions that enabled the very high performance of the R300 chip. There are probably tens of factors that clearly influenced the success of the Radeon 9700 and ATI, and hundreds of other factors that led to the merger between ATI and AMD in 2006 and the demise of the ATI brand-name in 2010. But for those who lived through 2002, there was only one unambiguous result of the R300's success: the market for gaming graphics no longer belonged to Nvidia alone.

2003 - AMD Opteron Revolutionizes Server Market

Even though Advanced Micro Devices attempted to enter the server market with its Athlon MP microprocessors, the success was less than moderate. While some institutions did adopt the platform, the majority unsurprisingly preferred proven platforms based on Xeon-series chips from Intel. But the Opteron managed to convince even the largest server makers that AMD could design not only high-performance desktop processors, but also speedy and reliable chips for multi-socket servers.

After the AMD Athlon central processing unit became an indisputable hit on the desktop CPU market, every observer on the planet was expecting AMD's x86-64 micro-architecture, which promised 64-bit capability alongside native 32-bit execution, an integrated memory controller, a new pipeline design, a new processor system bus and many other innovations. Without doubt, the new processors were the most innovative chips ever built by AMD and represented an even bigger technology leap than the original Athlon had been compared to the K6.

AMD demonstrated systems with Opteron central processing units (CPUs) code-named SledgeHammer at the Computex 2002 trade-show in June 2002 (AMD originally promised to "ship" the chips in 2002) and promised to start commercial shipments in Q4 2002. However, the chips needed a redesign, and sometime in October or early November of 2002 the company delayed them to Q2 2003. The following five months were a disaster for AMD: its losses were massive and sales could not improve, as Intel finally managed to clock its Netburst-based processors at speeds that left AMD's competing chips far behind.

But on April 22, 2003, the Sunnyvale, California-based chip company finally introduced its long-awaited Opteron processor. Among the number of pretty big names that supported the Opteron at launch, there was one that was really important: IBM, which announced plans to use the AMD Opteron inside its servers designed for high-performance computing (HPC) applications. Later on AMD gained all the other leading server makers as partners, including HP, Dell and many more. But that day was a triumph for AMD: the most trusted server maker had adopted the company's newest chip.

"AMD first began to think about the necessity of server x86 processor architecture that would create a scalable multiprocessor solutions. At that time Intel did try to make high-end multiprocessor servers rely on its Itanium. AMD simply offered what was lacked on the market:  a 32-bit and 64-but CPU, which could be easily and without compromising performance combined into multi-socket configurations and which allowed the use of large amounts of RAM," explained Ilya Gavrichenkov of X-bit labs.

The Opteron processor revolutionized the market of servers and multi-socket systems in many ways.

It took AMD years to gain a significant share of the server market, and starting from 2007 the company's server market share has been gradually declining; at present it is just about 6.3%. Still, the original Opteron introduced in 2003 won its war: AMD is now a visible name on the server market.

2004 - PCI Express: Interconnection For the Next Decades

The start of the transition from the PCI bus to PCI Express in 2004 was in itself an unremarkable event, as the industry knew it was coming. However, the significance of the bus, the prospects it opened for the industry, and the changes its deployment caused are hard to overestimate.

The original PCI (Peripheral Component Interconnect) bus introduced in 1993 was a revolution for its time. While the PCI bus is still used by many devices, it became obvious as early as 1997 (when AGP was introduced) that its bandwidth was not enough for graphics accelerators; moreover, the scalability of PCI's bandwidth was rather limited and could not grow as fast as the demands of various peripherals. While PCI-X solved the problem for servers, it was a rare guest in desktop computers and could not be used in notebooks. Finally, the parallel approach to data transfer, which requires a large number of relatively slow links, meant that add-on cards were fairly large because of the big connector. So, sometime in 1999 engineers from Intel as well as Compaq (now HP), IBM and Dell started to design a new serial high-performance interconnect code-named Arapahoe.

The PCI Express bus is based on point-to-point serial links and hence is very scalable, both in terms of the number of lanes and in terms of clock-speed. It features a number of innovations not previously available on any internal bus. Nonetheless, PCIe fully maintains software compatibility with conventional PCI. Already the first version of the standard provided 250MB/s of bandwidth per PCI Express lane in each direction, compared to the 133MB/s of total bandwidth provided by 32-bit/33MHz PCI.
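
To put the scaling into perspective, here is a minimal, illustrative Python sketch (not from the original article) that computes peak per-direction bandwidth for a few PCIe generations and lane counts from the commonly quoted transfer rates and encoding overheads, and compares the result with classic 32-bit/33MHz PCI.

```python
# A minimal sketch estimating peak PCI Express bandwidth per direction from the
# per-lane transfer rate and line-encoding overhead. PCIe 1.x/2.0 use 8b/10b
# encoding; PCIe 3.0 uses 128b/130b.

PCIE_GENERATIONS = {
    # generation: (transfer rate in GT/s per lane, encoding efficiency)
    "1.x": (2.5, 8 / 10),
    "2.0": (5.0, 8 / 10),
    "3.0": (8.0, 128 / 130),
}

def lane_bandwidth_mb_s(gen: str) -> float:
    """Peak bandwidth of a single lane in MB/s, one direction."""
    rate_gt_s, efficiency = PCIE_GENERATIONS[gen]
    # 1 GT/s = 1e9 single-bit transfers per second; divide by 8 to get bytes.
    return rate_gt_s * 1e9 * efficiency / 8 / 1e6

if __name__ == "__main__":
    for gen in PCIE_GENERATIONS:
        for lanes in (1, 4, 16):
            bw = lane_bandwidth_mb_s(gen) * lanes
            print(f"PCIe {gen} x{lanes}: ~{bw:,.0f} MB/s per direction")
    # For comparison: classic 32-bit/33MHz PCI tops out at ~133 MB/s shared by all devices.
    print(f"PCI 32-bit/33MHz: ~{32 * 33.33e6 / 8 / 1e6:.0f} MB/s total")
```

A single PCIe 1.x lane thus matches the whole shared PCI bus roughly twice over, and a x16 slot offers dozens of times more bandwidth.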

The increased bandwidth and the ability to configure and reconfigure the number of lanes dynamically not only increased the performance of personal computers, but also opened the door to a number of technologies and devices previously impossible on the desktop.

The PCI Express bus allowed graphics chip designers to return multi-GPU setups to the market, something impossible with AGP. Makers of advanced solid-state drives and high-end RAID controllers are no longer bandwidth-limited. It is also now possible to create 40Gb Ethernet or 100Gb Ethernet cards. There are a number of other bandwidth-hungry applications that benefit greatly from PCI Express versions 1.x and 2.x, and version 3.0 is projected to bring a number of protocol-related improvements, which should particularly help to improve the efficiency of GPGPU-based applications.

But not all companies could benefit from the arrival of PCI Express. For example, Creative Technology, once the driving force of high-quality audio on the PC, failed to deliver audio processors that would benefit from PCIe innovations on time, assuming that extra bandwidth was not needed for audio. As a result, at some point the company discovered that its expensive audio cards could not compete against inexpensive integrated solutions, as their advantages were not obvious to the majority of end-users. Obviously, the PCIe bus alone could not have helped Creative much, but at least it could have paved the way for some creative thinking within the company and resulted in new features. A number of other significant names of the past also could not benefit from the PCI Express revolution and faded into oblivion.

Starting in 2011, some chipsets from Intel Corp. will lose support for PCI, which will mean the beginning of the end of life for the bus after 17 years of existence (for 11 of which it was the primary expansion bus for PCs). By contrast, PCI Express technology is still being actively developed in its sixth year on the market. As a result, it may eventually outlive its predecessor by a significant margin.

2005 - Dual-Core Microprocessors: Core Wars Initiated

For decades the performance of microprocessors was determined by their clock-speeds as well as by micro-architectural improvements. But as a result of rapidly increasing power consumption due to leakage and other factors, it became clear that further rapid increases of processor frequencies were impossible. As a result, both AMD and Intel chose another way of boosting the performance of central processing units: increasing the number of cores.

The traditional usage model of personal computers under DOS (disk operating system) and early versions of Windows was limited to one task at a time. For example, back in the early nineties it was impossible to run Excel, Word and an antivirus program in the background at the same time. As a result, it made great sense to improve single-threaded performance. While Windows eventually gained proper multi-tasking, the performance demands of many tasks were so high that users disabled some of them while running others. Moreover, seeing that the single-thread performance of client CPUs was rising rapidly, software designers continued to create applications that could eat up the majority of available resources.

The natural consequence of the ever-increasing demand for single-thread performance was the creation of a micro-architecture that could quickly gain clock-speed and could also utilize the otherwise idle resources of the chip by executing two threads of code in parallel. As a result, Intel developed its Netburst micro-architecture with Hyper-Threading technology.

The main peculiarity of the Intel Netburst micro-architecture was its very long pipeline: 20 stages for the chip code-named Willamette and 31 stages for the processor code-named Prescott, up considerably from the 10-stage pipeline of the Intel Pentium III central processing unit (CPU). On the one hand, long pipelines allow processors to run at extreme clock-speeds; on the other hand, they increase branch misprediction penalties, which means that software has to be developed with the microprocessor design in mind. Intel hoped that it could gradually raise clock-speeds and offer competitive performance no matter how competitive AMD became. Nonetheless, it turned out that with the 31-stage Prescott core performance gains grew much more slowly than power consumption. It became apparent that Intel was unable to deliver not only 10GHz chips, but even 4GHz processors.
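
To make the misprediction argument concrete, here is a rough back-of-the-envelope Python sketch. The workload mix, the misprediction rate and the assumption that the flush penalty roughly equals the pipeline depth are illustrative assumptions made for this article, not measured Netburst figures.

```python
# A rough, illustrative model of why deeper pipelines hurt: the cost of a
# mispredicted branch grows roughly with pipeline depth, so the same
# misprediction rate costs more cycles per instruction on a deeper design.

def effective_cpi(base_cpi: float, branch_fraction: float,
                  mispredict_rate: float, flush_penalty_cycles: int) -> float:
    """Average cycles per instruction including branch-misprediction stalls."""
    return base_cpi + branch_fraction * mispredict_rate * flush_penalty_cycles

if __name__ == "__main__":
    # Assumed workload: ~20% branches, 5% of them mispredicted; penalties scale
    # loosely with the 10-, 20- and 31-stage pipelines mentioned above.
    for name, penalty in (("~10-stage (P6-class)", 10),
                          ("~20-stage (Willamette-class)", 20),
                          ("~31-stage (Prescott-class)", 31)):
        cpi = effective_cpi(base_cpi=1.0, branch_fraction=0.20,
                            mispredict_rate=0.05, flush_penalty_cycles=penalty)
        print(f"{name}: effective CPI ~ {cpi:.2f}")
```

Under these assumed numbers the deepest pipeline loses roughly a third of its throughput to mispredictions alone, which is why the extra clock-speed had to be large just to break even.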

In early May 2004 Intel said that its future central processing units would not be based on Netburst, but would be derivatives of the low-power Pentium M micro-architecture. Unofficial sources suggested at the time that Intel would release chips with multiple cores, and already in October 2004 the company cancelled the 4.0GHz version of the Pentium 4 and announced plans to release dual-core chips.

Advanced Micro Devices did not concentrate on maximum clock-speeds when designing its AMD64-architecture processors; instead, it built in a number of capabilities that allowed multi-core processors to be created relatively easily. As a result, the company made it clear as early as September 2003 that it would release dual-core Opteron chips going forward. In April 2004 AMD officially confirmed plans to release a dual-core chip lineup in 2005.

Dual-core microprocessors for desktops and servers were successfully released in mid-2005 by both AMD and Intel. The launches became an inflection point for many industries adjacent to client computers: the CPU industry changed its vector of development; the software industry changed dramatically, as programs that could not take advantage of multi-threading either ceased to exist or lost popularity; sales of servers and workstations with more than two sockets decreased considerably; and end-users stopped paying attention to clock-speeds alone and started to mind the core-count.
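
As a simple illustration of the shift software had to make, here is a minimal Python sketch (an editor's illustration, not code from any vendor) that spreads an embarrassingly parallel workload across the available cores using only the standard library. A program written as one sequential loop gains nothing from a second core, while this version scales with the core count.

```python
# Minimal sketch: the same CPU-bound work done sequentially and spread across cores.
# Only the Python standard library is used; the workload itself is a stand-in.
import os
import time
from concurrent.futures import ProcessPoolExecutor

def heavy_task(n: int) -> int:
    """A stand-in CPU-bound task."""
    return sum(i * i for i in range(n))

def run_sequential(jobs):
    return [heavy_task(n) for n in jobs]

def run_parallel(jobs):
    # One worker process per core; each process can run on its own CPU core.
    with ProcessPoolExecutor(max_workers=os.cpu_count()) as pool:
        return list(pool.map(heavy_task, jobs))

if __name__ == "__main__":
    jobs = [2_000_000] * 8
    for label, runner in (("sequential", run_sequential), ("parallel", run_parallel)):
        start = time.perf_counter()
        runner(jobs)
        print(f"{label}: {time.perf_counter() - start:.2f} s on {os.cpu_count()} cores")
```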

The emergence of dual-core chips paved the way for the long-term development of central processing units by exposing the weak spot of continuous clock-speed evolution: it is impossible to keep boosting a single characteristic of a chip indefinitely and obtain a linear increase in performance. Going forward, both AMD and Intel plan to integrate graphics processing units (GPUs) into their CPUs in order to accelerate massively parallel applications, and to build in special-purpose accelerators that improve performance in specific applications. Such an approach is called heterogeneous multi-core.

2006 - Nintendo Wii: Video Games Start to Feel Motions

Nintendo has been producing various games since 1889. Throughout its history the company has had its ups and downs, but its gaming nature has always helped Nintendo find its own unique path to success. The motion controller of the Nintendo Wii allowed the company to sell over 80 million consoles in just about four years.

Unlike companies like Sony Computer Entertainment and Microsoft Corp., Nintendo has never tried to create the most technologically advanced video game system. Its consoles always lacked a number of technological features others considered common (the Nintendo 64 used cartridges, whereas the original PlayStation used CDs; the GameCube used proprietary discs and did not support CD or DVD playback), and the company concentrated instead on the creation of successful game franchises like Super Mario and Zelda. But slow sales of the GameCube made it clear to the company that it needed some kind of innovation to succeed in the next round.

In 2002 Nintendo's long-time leader, Hiroshi Yamauchi, stepped down as president and Satoru Iwata took the reins. The new head of the company reorganized the firm and its development teams quite substantially. But even under the new leadership Nintendo was confident that video game consoles should be inexpensive and should either earn money from day one or at least cause minimal losses. As a result, going with high-definition graphics and complex processing units was not really an option for Nintendo. Instead, it needed to find something that no one else would have, something that would immerse players in games without high-quality graphics or audio. Being unable to "address" the eyes and ears, choosing motion-based gaming was really the most logical option.

As a result, the company decided to take a risk and employ a motion-sensing game controller. In fact, the firm had first started "playing" with the technology back in 2001 (the first prototype of a motion-sensing controller was made for Nintendo by Gyration back then), so the decision was a well-considered one. No popular video games had used motion-sensing controllers before the Wii, and the innovative Wiimote and Nunchuk were poised to attract the attention of gamers. Moreover, the simple gameplay could also get non-gamers to play. After all, "traditional" games like basketball, bowling or football are about moving the whole body, not just the fingers.

By the middle of the decade video games had become so complex that a non-gamer could not play them. By contrast, moving one's hands and pressing one or two buttons when needed was simple enough to achieve the goals of a game. The simple motion-sensing controller appeared to be just what the doctor ordered: not only were hardcore gamers excited about motion-sensing gaming, but people who had never held a gamepad in their lives became addicted to the Wii!

At first, neither Microsoft nor Sony really believed in the success of the Wii game console, since both companies were still trying to immerse players in virtual worlds using complex graphics, advanced visual effects, vibrant sound and thrilling plots. Deep inside those companies people naturally worked on things like Kinect, Move, EyeToy and others, but the giant corporations did not take the risk of betting on motion-sensing gaming at the launch of the PlayStation 3 and Xbox 360. After four years of hard work, and seeing how the Wii outsold the PS3 and Xbox 360 combined, Microsoft and Sony also entered the world of motion-sensing gaming.

Video games are all about human senses, and it is apparent that they will continue to develop in different directions. The Nintendo Wii and its controller simply added one more way to be in the game. As we can see, its competitors decided to add motion-based gaming to their current platforms rather than wait for their successors, undeniable proof that the Wii changed the industry.

2007 - Apple iPhone: When PDA Weds Smartphone

Even though by 2007 Apple was one of the largest music retailers on the planet thanks to its iTunes music store, the company wanted more: it wanted to be with its customers all the time and be ready to sell them both content and software. To achieve that, it developed the iPhone smartphone.

The first personal digital assistant (PDA) is officially believed to be the Apple MessagePad (the Newton platform), formally unveiled in 1992 (some believe that the first PDA was the Casio PF-3000 released in 1983, but its functionality was just too limited: it could only store phone numbers, addresses and memos, and its memory capacity was tiny), but it never became a mass-market product for various reasons. It was companies like Compaq, HP and Palm that actually made PDAs popular in the early aughts, only to lose the market to smartphones in the middle of the decade.

By 2007 it was obvious that the best features of PDAs (large touch-sensitive screens, high-performance CPUs) and smartphones (connectivity) should be combined in one device. Apple was not the first company to formally unveil a phone with a touch-screen: for example, the LG Electronics Prada phone was shown off by various web-sites back in December 2006. But Apple had learned well with iTunes and the Mac that the general public likes to get plenty of bundled programs as well as additional services, and that is what the iPhone was all about.

Like pretty much all first-generation devices from Apple, the first-gen iPhone was far from perfect. Although it relied heavily on the Internet, it lacked 3G; its recessed headphone jack did not work with many standard headphones; it lacked an FM radio and had many other minor and major disadvantages. But back in 2007 the iPhone was the first touch-screen smartphone that could play back a variety of video content and made it fairly easy to read long texts; it had advanced synchronization technologies as well as numerous capabilities not available on other smartphones at the time.

"Many call it a joy, but the iPhone is an excellent toy," was one of the comments from Nokia users who managed to buy an iPhone back in 2007.

With the launch of the Apple iPhone, the smartphone market was suddenly and completely reshaped. Where previously smartphones had been used primarily by business customers, the iPhone was quickly adopted by consumers. The new Apple App Store quickly became the largest repository of applications that could ease the lives of end-users. All in all, the iPhone was another inflection point for the industry.

2008 - Intel Atom: High Performance No Longer Needed

Throughout the history of computing, manufacturers of central processing units (CPUs) advertised only one characteristic of their microprocessors: performance. While sometime around the middle of the decade both AMD and Intel started to point to new qualities of their chips, such as energy-efficiency, performance-per-watt and platform cost, the majority of consumers still believed in pure performance. With the Atom chip introduced in 2008, the world's largest maker of chips simply ignored the performance factor: the only thing the Atom had to deliver was good-enough speed in basic computing applications.

Although the performance of central processing units had been increasing rather rapidly for many years, a lot of consumers did not need that performance to check email, browse the Internet and do other basic tasks. What they did need, however, were small form-factors, low weight, low prices and the ability to perform their simple daily tasks. Intel understood this demand and designed its Atom chip from the ground up to meet those requirements.

Instead of down-clocking existing processors and reducing their cache sizes, Intel took an in-order Pentium-class core (dating back to 1993) as a starting point, added a number of modern extensions and Hyper-Threading technology, raised clock-speeds to 800MHz - 1.86GHz and added numerous power management techniques, such as Intel Deep Power Down Technology (C6), CMOS mode, split I/O power supply and so on. As a result, the company got a chip with 47 million transistors produced using 45nm process technology, which cost a few dollars to build and could run Windows, an email client, a browser, a text processor, a music player and other basic applications.

Thanks to its very low cost ($40 - $60) and power consumption of around 4W - 6W, the Intel Atom quickly gained popularity among makers of small form-factor low-cost notebooks, which became known as netbooks. In 2008 the company sold tens of millions of Atom chips, and they established a completely new category of products.

The launch of the Intel Atom in 2008 marked a split in the trend of mainstream CPU evolution. One line of microprocessors continues to progress rapidly and conquer new performance heights, while another breed of chips advances slowly in order to fulfill basic computing demands at low power and low cost. Moreover, by unveiling the Atom core Intel clearly declared its plans to enter the smartphone market with system-on-chips based on that technology.

2009 - Solid-State Drives Begin to Reach Consumers

Solid-state drives (SSDs) have existed for decades, and the first NAND flash-based SSDs were introduced by M-Systems in 1995. The main benefit of solid-state storage was its reliability compared to traditional rotating magnetic media; another advantage was performance. Consequently, SSDs were used by the aerospace, military, oil and gas and other industries requiring reliability in harsh environments.

Early in the aughts numerous companies suggested that solid-state storage could greatly improve the performance of consumer PCs as well as enterprise servers. But back then NAND flash memory was so expensive that no one really considered making such drives available in volume. As time went on, prices of NAND flash dropped, multi-level cell (MLC) flash became more reliable, controllers became smarter, and the benefits compared to hard disk drives (HDDs) became even more significant. Sometime in the middle of the decade a number of different companies started to demonstrate their SSD products at trade-shows, touting a revolution in desktop storage.

Several flash-centric and memory-centric companies introduced SSDs for mobile and desktop computers back in 2007, but the prices were so high that actual sales were very low; even enthusiasts seeking maximum performance did not install them. But in 2008 the price of NAND flash memory dropped further, and loads of companies serving PC enthusiasts successfully introduced SSD lineups that were affordable enough to be adopted by consumers.

As usually happens, it took another couple of years for SSD sales to finally skyrocket. According to market tracking agency iSuppli, global sales of SSDs reached 1.4 million units and generated revenue of about $127 million in 2008. In 2009, manufacturers of SSDs managed to sell 5.8 million solid-state drives, four times more than in the previous year, and their revenue soared to $883 million.

The year 2009 clearly demonstrated that a lot of customers - both consumers and enterprises - demand the performance or the reliability of flash-based storage. This is clearly a turning point for the storage industry at large. Going forward, the demand for SSDs will only increase, and HDD makers will have to find ways to massively improve the performance of their drives to stay competitive.

Perhaps HDD makers should integrate large amounts of flash storage into traditional hard drives and create ultra-fast hybrid storage devices, or invent new magnetic recording mechanisms and heads that could either deliver high performance themselves or work stably at extreme spindle speeds (which is not an option for mobile systems, as higher speeds mean higher power consumption too). Of course, there are sports cars and there are family vans; hard drives will therefore remain in desktops, where storage space is crucial, but they can easily sit next to ultra-fast SSDs. In mobile computers, solid-state storage simply has too many advantages over hard drives, and it is inevitable that the total available market for mobile HDDs will eventually shrink.

The year 2010 is only just about to end. A number of products were released this year that, in our view, made substantial contributions to the future of personal computing. We do not know which one will make the biggest impact on the industry in the long term, so we decided to name three products whose release can be a strategic inflection point for the world (simply because we think so). Perhaps the year 2010 was productive enough to create three micro-revolutions in the technology industry, but it will take several years for us to know for sure.

2010 - Apple iPad: Small Device Makes Big Changes

Although netbook computers sport the same clamshell form-factor as fully-fledged notebooks and have keyboards, it is impossible to do anything serious on them, which essentially means the keyboard is useless for those who just want to browse web-sites, read electronic books and consume different kinds of media. Apple understood this well and released a tablet PC without any keyboard, designed just for consuming content, not creating it. The iPad clearly has a massive number of drawbacks (as do all first-gen gadgets from Apple). But it is not only a logical extension of the iPhone that allows the same tasks to be done more comfortably than on the smartphone; it is also a rational form-factor for the consumption of various media and content, and it can be used in commercial, medical and other environments where access to documents is much more important than editing them.

The idea of a slate-type device was clearly in the air: as far back as last year, a lot of companies talked about tablets and planned to introduce such products in 2010. However, Apple and Samsung were among the very few companies that actually managed to release their slates this year. As a result, Apple was the company that pushed consumers and the industry to reconsider the role, functionality and usage models of netbooks.

The tablet is clearly not only a new PC form-factor; it is a new battleground for x86 and ARM, for Windows and Linux, for AMD and Intel, and for Apple, Samsung and Sony. Those clashes between various companies on the slate market will definitely create new strategic inflection points for the whole industry.

2010 - Microsoft Kinect: Console Obtains Eyes

Nintendo started a revolution in 2006 with the Wii game console and its motion-sensing controllers. Microsoft attempted to craft another revolution in 2010 with Kinect, a device that is not a motion-sensing controller, but a motion sensor. "You are the controller," claims Microsoft.

The technology behind Microsoft Kinect is conceptually simple: the device has an RGB camera as well as a depth camera, in addition to an array of microphones. The combination of RGB and depth data makes it possible to detect the motions of a human being and reflect those actions on the screen. At present the technology has a number of limitations because of software and firmware, but in spite of the drawbacks there is great interest in Kinect from all around the industry.
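
As a rough illustration of how a depth stream simplifies the problem (an editor's sketch using synthetic data, not Microsoft's actual algorithm), the Python snippet below separates a "player" from the background simply by thresholding per-pixel distances, something that is much harder to do robustly with an RGB image alone, and then tracks the centroid of the resulting silhouette between frames.

```python
# Minimal sketch: segmenting a foreground "player" from a depth frame and tracking
# the silhouette's centroid. The frames here are synthetic; a real sensor would
# supply a per-pixel distance map (e.g. in millimetres) many times per second.
import numpy as np

def segment_player(depth_mm: np.ndarray, near: float = 500, far: float = 2500) -> np.ndarray:
    """Boolean mask of pixels whose distance falls inside the expected play range."""
    return (depth_mm > near) & (depth_mm < far)

def centroid(mask: np.ndarray):
    """Mean (row, col) of the silhouette, or None if nothing was detected."""
    ys, xs = np.nonzero(mask)
    if len(xs) == 0:
        return None
    return float(ys.mean()), float(xs.mean())

if __name__ == "__main__":
    # Two synthetic 120x160 depth frames: background at ~3m, a blob at ~1.5m that moves.
    frames = []
    for shift in (0, 12):
        frame = np.full((120, 160), 3000.0)
        frame[40:90, 60 + shift:100 + shift] = 1500.0  # the "player"
        frames.append(frame)

    previous = None
    for i, frame in enumerate(frames):
        pos = centroid(segment_player(frame))
        if previous is not None and pos is not None:
            dy, dx = pos[0] - previous[0], pos[1] - previous[1]
            print(f"frame {i}: silhouette moved by ({dy:+.1f}, {dx:+.1f}) pixels")
        previous = pos
```

Real skeleton tracking is of course far more sophisticated, but the depth map is what makes the whole body-tracking problem tractable in a living room.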

As noted above, the Nintendo Wii immersed players in the game by using motion-sensing controllers. Microsoft Kinect plunges players into virtual worlds by detecting their own motions, a concept that essentially puts gamers inside the imagined world. At least in theory, if not limited by technology, this could be almost the Holy Grail of gaming.

Kinect is an indisputable success, not only because a million consumers bought the accessory in the first 25 days of its commercial life, but also because enthusiasts hacked it within the very first days of that life. This kind of "interest" is a clear indicator of attention from computer enthusiasts, who are usually visionaries. Potentially, this means that Microsoft has just hit a trigger that can transform game controls as well as create a new type of interaction with the user interface.

What happens next remains to be seen. But at this point Kinect is not only innovative, it is also a product that is clearly different from all the other motion-related approaches. The interest in Kinect shown by Xbox 360 users and enthusiasts is clear proof that the technology arrives at a point where the gaming (and, in fact, not only gaming) industry is reconsidering the way it moves forward and its approach to game design in general.

2010 - Nvidia Fermi: The First GPU to Power the World's Highest-Performing Supercomputer

General-purpose computing on graphics processing units (GPGPU) is an idea that is almost ten years old. Thanks to their massively parallel architecture, GPUs are great for parallel computing. This year Nvidia supplied the compute cards that power the world's most powerful supercomputer.

Back in 2006 - 2007 both ATI, the graphics business unit of Advanced Micro Devices, and Nvidia introduced specially-designed compute cards aimed at high-performance computing (HPC) markets. In addition, Nvidia spent a lot of resources on the creation of its CUDA platform for GPU computing and also integrated a good deal of compute-specific logic into its Fermi-series graphics processors (for example, the Fermi architecture is tailored to deliver maximum double-precision floating point performance, and Fermi's SIMD processors can both read from and write to the unified L2 cache, something that is needed for compute but less so for graphics). The approach of ATI-AMD was much more pragmatic: instead of creating a proprietary GPGPU platform, the firm decided to rely on industry standards like DirectCompute and OpenCL; instead of building excessive compute-specific functionality into Cypress (Radeon HD 5000), the company chose to integrate only the most important features to keep the size of the chip smaller, and to wait for new process technologies before integrating new features.
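
For readers unfamiliar with what "GPU compute" workloads look like, here is a tiny Python/NumPy sketch (an editor's illustration, not vendor code) of the kind of double-precision, element-wise arithmetic that GPGPU platforms such as CUDA, DirectCompute or OpenCL map onto thousands of lightweight GPU threads, one or a few array elements per thread.

```python
# Minimal sketch: a double-precision AXPY operation (y = a*x + y), a classic
# building block of HPC codes. On a CPU NumPy evaluates it as a vectorized loop;
# a GPGPU runtime would instead launch one lightweight GPU thread per element.
import numpy as np

def daxpy(a: float, x: np.ndarray, y: np.ndarray) -> np.ndarray:
    """Return a*x + y in double precision (float64)."""
    return a * x + y

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 1_000_000
    x = rng.random(n)            # float64 by default
    y = rng.random(n)
    result = daxpy(2.5, x, y)
    # Each output element depends only on the matching input elements,
    # which is exactly why this kind of work parallelizes so well.
    print(result[:3], result.dtype)
```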

As the more GPGPU-focused player, Nvidia was also the first to reap the fruits of its efforts. In 2010 three of the top five supercomputers in the world, including the #1 Tianhe-1A with 2.56 petaFLOPS of performance, were powered by Nvidia Tesla 20-series compute cards. Moreover, the same cards were used in many other HPC systems.

The market for GPU-based accelerators for HPC is not large; financially, Nvidia still earns the majority of its revenue by selling consumer GeForce-branded graphics cards and Quadro-branded professional graphics accelerators. But what is important is that there is a general belief that supercomputers capable of performing at least a quintillion double-precision floating point operations per second (10^18 FLOPS, or one exaFLOPS) should be heterogeneous, i.e. employ both highly parallel and high-performance serial processors. The Tianhe-1A is among the first heterogeneous supercomputers, and the fact that it is already the world's most powerful supercomputer may not be a turning point in itself, but it is a clear signal for everyone in the HPC industry: GPUs and Nvidia have already penetrated the supercomputer market, and the entry has been successful, to say the least.

We hope for revolutions, but we always rely on evolution. What we have to understand is that any technology has a lot of ramifications, and those ramifications may result in further development of products by companies we have never heard of before. What we also have to consider is that most of the technologies that actually change the world or make it different come from big companies. The exceptions just prove the rule.