by Anton Shilov
01/03/2011 | 08:11 AM
UPDATE: Typos and inaccuracies were corrected on January 5, 2011. We are sorry: a beta version of the article was published on the 3rd by mistake.
The year 2010 promised a lot, given that by 2009 all the trends of the decade had become very clear. However, 2010 saw few breakthrough products, and many of them were delayed until 2011. We did not see a lot of tablets, GPGPU-based applications or innovative devices in general. But this makes us believe that 2011 will bring the highly-anticipated changes to the industry.
The personal computing industry is getting broader as the devices around us become smarter. This trend enables a broad set of electronic products and mobile gadgets, but as they get more advanced, the differences between them become less clear as their functionality converges. As a result, the market for digital content and games continues to expand at a rapid pace, mainly thanks to mobile devices.
So, keeping in mind the trend towards mobility, let's see what we can expect from 2011. Believe us, we have a lot of expectations!
Personal computers in the slate form-factor seem to be the logical extension of almost every reading device humankind has ever used: from clay tablets to magazines to electronic book readers. In 2011 at least 70 different tablet PC models will be released and tens of millions of units shipped.
It is hard to say who invented tablet personal computing as a concept. Touch-screens (in one form or another) have existed for decades and have been used in many applications. The first "tablet PC-like" digital devices emerged back in the mid-fifties, but those devices were not computers and were hardly available in mass quantities. Various tablets with handwriting recognition were released by IBM and others in the eighties, but given their limited functionality we cannot call them PCs either.
Fujitsu Stylistic 1000. Image by the Winhistory.de web-site
Microsoft Corp. released Windows for Pen Computing back in 1991 with the aim of enabling input with a stylus (or pen) instead of a keyboard or mouse. A number of products, including the Fujitsu Stylistic 1000 tablet with Windows 3.11, shipped with Windows for Pen Computing. Around the same time the little-known Go Corp. released its PenPoint OS, which was used on the IBM ThinkPad, and IBM eventually developed a number of other pen-based input systems for a variety of its products (e.g., the ThinkPad 730 TE). Microsoft updated WPC to version 2.0 with the launch of Windows 95, but, just like the previous version, the new one did not become widely popular and was only used on special-purpose devices for vertical markets.
The first real attempt to popularize tablet personal computers occurred in 2001, when Microsoft unveiled its Windows XP Tablet PC Edition operating system. Convertible tablet PCs running Windows did not become really popular, but they were at least used by both specialists and consumers. In early 2006 a number of PC makers introduced so-called ultra-mobile PCs (UMPCs), which sported no keyboards at all but supported every application ever developed for Windows. Those products - available from companies like Asustek Computer, Samsung Electronics and Sony Corp. - were rather bulky because of the power-hungry processors inside and did not become popular among consumers either. Even though Windows Vista included the functionality of XP Tablet PC Edition, the only well-known tablet device based on it was the HP TouchSmart tx2-series.
Since Windows for personal computers relied on x86, and it was impossible to make a really sleek and slim tablet based on any x86 chip with proper performance, the natural choice for tablets was ARM-based chips together with a non-Microsoft operating system. Recognizing that, Apple released its iPad slate with a modified iOS operating system in Q2 2010. Although the iPad is still pretty heavy, it quickly gained popularity among those who did not need netbooks and officially became the first tablet PC in history to sell in the millions. The success of the tablet was predictable, and intentions to launch "over-sized" PDAs in 2010 were visible. To our great surprise, only Apple and Samsung actually managed to release popular tablets this year.
What we do know is that, according to Craig Ellis, an analyst with Caris & Company, at least 69 tablets from various manufacturers will be on display at the Consumer Electronics Show next week. The analyst predicts that there will be 18 slates based on Intel Atom system-on-chips (SoCs), 14 Nvidia Tegra-based tablets, 10 Freescale Semiconductor-powered devices and 6 slates with SoCs from Texas Instruments shown at the CES. Other tablets are projected to feature SoCs from Marvell, Qualcomm and others. Intel itself said it had 35 design wins with its Oak Trail and Moorestown (Atom Z600) system-on-chips for 2011. Therefore, at least 86 different tablets will be available in 2011 (the 51 non-Intel slates at the CES plus Intel's 35 design wins), and we suspect that the actual number will be over 100.
At present all the leading makers of notebooks, mainboards and various high-tech devices - including Asustek Computer, Acer Group, Cisco, Dell, Elitegroup Computer Systems (ECS), Fujitsu, HP, Lenovo Group, LG Electronics, MicroStar International (MSI), Research in Motion, Toshiba and many others - are working on slates that will be available next year.
Since tablet PCs will be based on very different hardware - Intel SoCs, multi-core ARM-based SoCs, high-speed single-core ARM-based SoCs, various generic SoCs - the actual devices will differ greatly from each other. Some of them will be tailored to deliver maximum performance and the Windows experience, but their battery life will not be long; others will deliver lower performance but longer battery life; a third type of slate will have extreme battery life, but compromised performance and functionality. In short, the tablet market will be heavily segmented from day one, and almost everyone will be able to pick a product with an almost unique balance of price, performance and features. 2011 is the year when the tablet market is set to explode.
The emergence of slate PCs will naturally create a micro-war between Intel and the ARM camp, a battle between suppliers of various x86 and ARM-based chips. Indirectly, this situation has a good chance of transforming into a war between standards.
Microprocessors with the x86 architecture are capable of delivering very high performance, and they power the vast majority of data centers and servers all around the world. However, they cannot deliver high performance per watt, something that mobile devices require nowadays and that will be a part of next-generation low-power servers. ARM-based chips offer low power consumption, but not a lot of other features; for example, they lack even basic 64-bit capability. Still, the fact is that power consumption is the most important factor for mobile devices, the devices that will be the most important driving force for the tech industry.
For example, smartbooks have definitely lost the battle against netbooks. But the forthcoming battle between tablets, high-performance slates, netbooks and notebooks is completely unpredictable. Naturally, some companies will want to revive smartbooks or redevelop netbooks with all the high-performance technologies they are going to obtain in 2011.
ARM and x86 are completely different platforms. They use different processor system buses as well as input/output systems. Any kind of opposition here will lead to a war of standards and may eliminate PC-only I/O standards such as FireWire/IEEE 1394, USB 3.0 or even Light Peak. Moreover, x86 uses Windows, whereas ARM uses anything but Windows.
What exactly is going to happen is completely unpredictable, as is the outcome of this war in 2011. The only clear thing is that it will start.
Performance of consumer applications has been determined by the performance of microprocessors for decades. But with the arrival of accelerated processing units (APUs) the situation may change drastically, as at least some programs will begin to use massively-parallel graphics processing units and thus deliver a better customer experience thanks to new levels of performance.
The development of APUs began back in 2006, when Advanced Micro Devices acquired ATI Technologies. Even though it was initially believed that integrating a graphics engine onto a microprocessor would hardly be a problem, the process took a lot longer. AMD not only had to ensure that the hardware actually functioned, but also had to develop proper OpenCL 1.0/DirectX 11-compatible graphics cores and appropriate x86 cores, as well as wait until the industry actually adopted the standards required for efficient APUs. As a result, the outcome of the project will only become visible in 2011.
Both AMD and Nvidia have worked hard with numerous software developers to ensure that their programs can take advantage of the capabilities of ATI Radeon and Nvidia GeForce graphics processors, and many applications do use GPUs now. Software that can utilize conventional graphics chips is naturally capable of taking advantage of APUs, provided that the apps use industry standards like OpenCL or DirectCompute, not proprietary technologies like CUDA. But a big problem is that consumers have to find the programs that actually utilize graphics chips for general-purpose computing (GPGPU).
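To make the standards point concrete, here is a minimal sketch of what vendor-neutral GPGPU code looks like: an OpenCL C kernel (shown as the string a host application would submit to the driver for compilation on whatever device is present) alongside a pure-Python reference of the same data-parallel computation. The kernel and function names here are illustrative, and executing the kernel itself would require an OpenCL runtime, so only the Python reference is meant to run.

```python
# Sketch of the data-parallel model exposed by open standards such as
# OpenCL/DirectCompute: the application ships a small kernel that the
# driver compiles for whatever device (CPU, GPU or APU) is present --
# unlike proprietary CUDA code, which targets only Nvidia hardware.

SAXPY_KERNEL = """
__kernel void saxpy(const float a,
                    __global const float *x,
                    __global float *y)
{
    int i = get_global_id(0);   /* one work-item per array element */
    y[i] = a * x[i] + y[i];
}
"""

def saxpy_reference(a, x, y):
    """Pure-Python reference: what every work-item computes, serially."""
    return [a * xi + yi for xi, yi in zip(x, y)]

if __name__ == "__main__":
    result = saxpy_reference(2.0, [1.0, 2.0, 3.0], [10.0, 10.0, 10.0])
    print(result)  # [12.0, 14.0, 16.0]
```

On an APU the same kernel would simply be dispatched to the integrated graphics engine, which is exactly why standards-based software gets the acceleration "for free".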
At present Nvidia offers a set of links on its web-site that help to find GPGPU-enabled applications for various tasks. AMD will also offer a similar kind of service in the near future. However, it is obvious that this is not enough for general consumers.
AMD is currently working on a number of APU-oriented software advertising and delivery mechanisms. In addition, at least one computer maker, which is set to introduce AMD Brazos-based computers at the CES, will also offer an application store that specifically marks software that takes advantage of APUs.
The availability of programs that utilize graphics processing engines for general-purpose computing will play a critical role in the popularity of AMD's APUs. Both the Ontario/Zacate and Llano accelerated processing units will be slower than Intel's Sandy Bridge offerings in terms of x86 performance, but will offer tangible advantages in GPGPU-enabled applications.
But do not expect the majority of programs to be APU-aware in 2011. The most important thing is that next year the second largest designer of x86 chips will give a clear signal about its commitment to a GPGPU-accelerated future, the era of accelerated computing.
The emergence of accelerated processing units with built-in high performance graphics and a number of other premium features will also diminish the difference between netbooks and notebooks.
At present the technical specifications of Atom-based netbooks are somewhat artificially limited by Intel to ensure that they do not compete against more advanced notebooks with slightly more expensive Celeron or Pentium chips inside. For example, there are no Atom-based netbooks with 12" or larger screens, even though netbooks with 7" displays are already virtually dead. AMD promises not to impose any artificial limitations on computers powered by Ontario or Zacate processors. Therefore, they will power both 10" ultra-portables and 14" office-class machines.
"Ontario and Zacate will deliver great experiences and raise the bar at price points occupied by function limited netbooks and low-end notebooks today. Llano will deliver even more performance and better experiences beyond Zacate and Ontario-based products. AMD will not create any artificial barriers for our customers and encourages them to innovate with our Fusion APUs," said Godfrey Cheng, director of client technology unit at AMD.
In fact, the Ontario and Zacate APUs will fit between Intel's Atom and Celeron/Pentium in terms of performance. Thanks to advanced features like DirectX 11-class graphics, GPGPU support, high-definition video playback and others, they will indeed deliver an even better experience than Celeron-based systems do. As a result, it is logical to expect low-cost PCs with Atom, Celeron or Pentium inside to gain special accelerators for HD video, high-resolution displays and other bells and whistles so as to stay competitive and up-to-date.
All in all, the only difference between netbooks and notebooks in 2011 will be performance, not the number of features or the functionality. Of course, we will not see netbooks with high-end security capabilities, Blu-ray disc players and other expensive things, but inexpensive mainstream systems will be very similar.
The personal data storage industry is very capacious, yet very dependent on the performance and nature of the PC market in general. The rise of tablets and SSD-based PCs will change it forever.
Shipments of slates - all of which are going to use NAND flash memory for storage - will skyrocket in 2011, and sales of smartphones are projected to grow dramatically as well. As a result, the structure of the storage market will change in 2011.
Devices used by consumers will increasingly use SSD technology (even when it is not needed), but remote storage systems will start to use HDDs in massively larger quantities. Consumers with SSDs will rely on remote storage systems, such as network attached storage (NAS) at home and Internet-based storage everywhere else.
In any scenario, for every SSD a consumer gets, he or she will acquire one or more HDDs. But this time those hard drives will not be installed inside PCs; they will be located either somewhere in the digital universe or within a local area network.
Stereoscopic high-definition televisions clearly have not made it to the mass market in 2010. In fact, they received such a tepid welcome that it is hardly visible that the market for stereo-3D (S3D) equipment and content actually exists. S3D will not go mass-market even in 2011. What will happen is that the vast majority of premium-class HDTVs and Blu-ray disc players will become stereo-3D capable.
At present stereo-3D equipment is simply too expensive for the mass market, and with the practical absence of content there is not much point in upgrading a high-quality LCD or plasma HDTV to a newer one that supports S3D. Another obvious drawback of current stereoscopic 3D screens is that they require users to wear glasses (meaning that in some cases it will be impossible for certain customers who wear prescription glasses to watch movies in 3D) and to sit at particular angles to the TV. Those glasses may cause headaches and can affect the development of children's and potentially teenagers' eyes.
But despite the drawbacks of the S3D technology itself, consumer electronics (CE) makers will have to introduce new models anyway so as to be able to sell them at premium prices. The new HDTVs will have not only LED backlights, but also S3D-capable panels and 120Hz refresh rates, whereas the new Blu-ray disc players will play back both 3D and 2D content, support the MPEG-4 Multiview Video Coding (MVC) codec (an extension of the ITU-T H.264 Advanced Video Coding (AVC) codec currently supported by all Blu-ray disc players) and sport appropriate outputs. Prices will correct themselves somewhat, and stereoscopic 3D will basically come for free: it will be supported by default on higher-end TVs ($1000 in the U.S.).
Manufacturers of CE products understand very well that the transition to S3D equipment is currently a very costly task for consumers and there are not a lot of drivers for it. For example, although the Microsoft Xbox 360 and Sony PlayStation 3 game consoles can output in S3D mode, there are fewer than ten games across both platforms (in fact, Sony has a clear lead here) that use the technology. The list of Blu-ray 3D movies available on Amazon.com in the U.S. currently includes about ten titles. S3D TV channels are scarce and do not offer a lot of different content. Producers will only start making more stereoscopic 3D content when there is a larger installed base. In order to boost it, CE makers will slash prices on 3D TVs, and they will be available not only in the ultra-premium segment, but also in the premium and even higher-end mainstream segments.
Sales of Kinect motion-sensing controllers (and, to a lesser degree, Sony Move devices) demonstrated clearly that consumers want motion sensing in general.
While we are not going to get a new application programming interface for natural user interfaces à la Minority Report in 2011, we are going to get a large number of devices based on motion sensing.
For example, Mark Karayan of Movea, a company that develops motion-sensing solutions, said that the company and its partners plan to unveil a number of new motion-sensing products at the CES.
"Movea will announce new partnerships with brand-name consumer electronics companies to develop next generation motion-control & motion-sensing peripherals targeting the digital & connected/interactive TV, gaming, & sporting markets," said Mr Karayan.
The company's MotionIC platform is projected to be integrated into air mice, air keyboards and various remote control devices, enabling in-air operation through motion recognition.
"Imagine browsing Netflix on your HDTV from the couch with a wireless mouse that operates in air, but it's also your remote; or - instead of slowly selecting each letter on your HDTV with an arrow key, quickly type on a full keyboard on a remote control, making it easy to type in the name of the movie you'd like to watch on Hulu," said Mark Karayan.
Microsoft is not the only company with 3D and RGB cameras and appropriate software. Moreover, hackers have already managed to make Kinect work on the Windows PC platform. As a result, it is inevitable that not only will motion-sensing controllers become more popular, but additional motion sensors are likely to emerge.
At present Nvidia Corp. is the only company that remains without a definite x86+GPU platform. The company will have to transform itself dramatically in 2011 to survive.
Nvidia has been selling graphics processors for many years now and has long been trying to improve them with the help of GPGPU technology, e.g., by developing its proprietary CUDA platform. But this time the billion-dollar company will have to not just change, but reinvent itself.
AMD and Intel are both launching microprocessors with GPUs on die; hence, there will be no more core-logic sets with integrated graphics and no more low-end GPUs.
Nvidia will continue to sell GeForce graphics chips for desktops and notebooks. However, it will not be able to address the x86-compatible market, e.g., to offer x86-based APUs with integrated GeForce graphics. But the company will be fully able to address the ARM-compatible market and will continue to pursue the GPGPU and supercomputer markets. Eventually, the technologies in all of those markets will collide, and if Nvidia's product planning is wise, it will be able to compete against x86-based APUs.
What will happen in 2011 exactly? We do not know.
Amazon and Barnes & Noble announced late in 2010 that their sales of electronic books had overtaken sales of paperbacks. It looks like the same will happen around the world.
Barnes & Noble, the world’s largest bookseller, said late in December that with millions of Nook electronic book readers sold, the line has become the company’s biggest bestseller ever in its nearly 40-year history. Amazon also indicated that sales of the Kindle exceeded its expectations, but likewise named no numbers. Unofficial sources said that Amazon might have sold eight million Kindles in 2010.
B&N also said that the new Nook Color e-book reader, introduced just eight weeks before Christmas, was the company’s number one selling gift of the holiday season. Barnes & Noble also announced that it now sells more digital books on BN.com, the world’s second largest online bookstore, than physical books from its large and growing physical book business.
While it is easy to get a book from an electronic store, not all books are physically available everywhere. As a consequence, many of them are bought electronically.
The market for books will not just change, it will definitely transform. What exactly it will transform into, only time will tell.
Nokia's Symbian has dominated the market of mobile operating systems for years. However, Google's Android is on track to replace it and become the main operating system for mobile devices in 2011.
Already in the third quarter of 2010 Android accounted for 25.5% of worldwide smartphone sales, making it the No. 2 operating system and particularly dominant in North America, according to Gartner. In Q3 about 20.5 million smartphones with Android were purchased worldwide (around 227 thousand daily). Nokia's Symbian was still more popular than Android in the third quarter: around 29.48 million Symbian-powered smartphones were sold in Q3 2010, or about 327.5 thousand daily. However, in early December Andy Rubin, a vice president at Google, said that over 300 thousand Android phones were being activated daily, which means that by the end of Q4 2010 the platform from Google was very close to Nokia's Symbian.
Of course, sales of mobile phones with Symbian are growing too (29.48 million units shipped in Q3 2010, an increase of 61% compared to 18.3 million in Q3 2009), but Symbian's market share is dropping fast (36.6% in Q3 2010, down from 44.6% in Q3 2009). Moreover, sales of competing platforms, including Google Android and Apple iOS, are growing even faster. For example, sales of Android-based phones were up 14.4 times year-over-year in Q3 2010, and the market share of the OS increased from 3.5% in Q3 2009 to 25.5% in Q3 2010, according to Gartner.
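The per-day and growth figures cited above follow directly from Gartner's quarterly totals; a quick back-of-the-envelope check (assuming, for simplicity, a 90-day quarter):

```python
# Back-of-the-envelope check of the smartphone figures cited above,
# assuming a 90-day quarter.

DAYS_PER_QUARTER = 90

android_q3_2010 = 20_500_000   # Android units sold, Q3 2010 (Gartner)
symbian_q3_2010 = 29_480_000   # Symbian units sold, Q3 2010
symbian_q3_2009 = 18_300_000   # Symbian units sold, Q3 2009

android_daily = android_q3_2010 / DAYS_PER_QUARTER      # ~227.8 thousand/day
symbian_daily = symbian_q3_2010 / DAYS_PER_QUARTER      # ~327.6 thousand/day
symbian_growth = symbian_q3_2010 / symbian_q3_2009 - 1  # ~61% year-over-year

print(round(android_daily), round(symbian_daily), round(symbian_growth * 100))
```

At Rubin's 300 thousand activations per day, Android would already be within about 10% of Symbian's Q3 daily run rate, which is the basis for the "very close by the end of Q4" claim.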
The problem for Symbian is not that it is less progressive than Android or iOS, but that Nokia is struggling to deliver competitive phones on time and cannot figure out the right feature-set for various models. While Nokia's overall product lineup seems balanced, many models lack a number of features because Nokia wants to position them clearly for certain audiences. Meanwhile, Android and iOS offer similar features across a range of devices, and while some functionality may be very uncomfortable to use, it is at least present.
Unlike iOS, Google's Android is more open and, most importantly, is available on a wide range of mobile phones that are priced differently and come in different designs. Further development of Android will be a threat not only to Symbian, but also to iOS, since the two platforms have a lot in common.
In any case, in 2011 the operating system from Google will become the No. 1 OS on the smartphone market. This fact has a number of ramifications. For example, Samsung Electronics (the No. 1 maker of Android-based phones) will come closer to Nokia on the overall mobile phone market in terms of unit sales and market share. Companies like HTC, ZTE and similar will increase their sales, and the positions of LG Electronics and Motorola will likely become stronger.
If you buy a $100 central processing unit (CPU) and a special mainboard in order to unlock the chip's "hidden potential" and get the performance of a $200 microprocessor, then Intel will make your life a lot harder with its new Core i-series "Sandy Bridge" generation of chips.
Overclocking is a practice that has been known and used in the industry for a long time. Initially, overclocking was relatively easy, as neither AMD nor Intel locked the multipliers of their microprocessors, and clock-speeds could be picked simply by changing multiplier settings on mainboards. But starting in the late nineties, Intel began to lock multipliers in order not to let third parties re-mark low-cost chips and sell them at higher price points; AMD eventually did the same with its Athlon chips. As a result of locked multipliers, overclockers had to alter bus frequencies to change clock-speeds. Since the clock-speeds of the various peripheral buses and the processor system bus (PSB) were set by different clock generators, modifying the PSB speed did not affect any other speeds.
The platform based on the mainstream Sandy Bridge integrates the clock generator into the chipset. As a consequence, the PSB is simply impossible to overclock, since doing so affects the clock speeds of Serial ATA, USB and others, which means glitches even with a minimal performance increase. Intel's planned "solution" is to offer special chips with a "K" moniker that have unlocked multipliers and cost more than an average LGA 1155 processor. The good news is that the new processors will have a much more advanced Turbo Boost technology that overclocks them automatically and thus eliminates some of the need for manual overclocking.
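The arithmetic behind the problem is simple: on a mainstream Sandy Bridge platform a single ~100 MHz base clock feeds the CPU multiplier and the peripheral buses alike, so raising it for CPU speed drags everything else out of spec. The sketch below uses round illustrative figures, not exact Intel specifications:

```python
# Why base-clock overclocking breaks on Sandy Bridge: one clock generator
# in the chipset feeds everything, so the CPU frequency cannot be raised
# via the base clock without also raising the peripheral clocks derived
# from it. (Round illustrative figures, not exact Intel specifications.)

BASE_CLOCK_MHZ = 100.0  # the single shared base clock

def platform_clocks(bclk_mhz, cpu_multiplier):
    """Derive the main platform clocks from the shared base clock."""
    return {
        "cpu_mhz":  bclk_mhz * cpu_multiplier,
        "sata_mhz": bclk_mhz * 1.0,  # SATA/USB references scale with BCLK too
    }

stock = platform_clocks(BASE_CLOCK_MHZ, 34)               # e.g. a 3.4 GHz chip
overclocked = platform_clocks(BASE_CLOCK_MHZ * 1.05, 34)  # a mere +5% BCLK

# The 5% bump already pushes the SATA/USB reference 5% out of spec --
# that is exactly where the glitches come from.
print(round(stock["cpu_mhz"]), round(overclocked["cpu_mhz"]),
      round(overclocked["sata_mhz"], 1))
```

A "K"-series chip sidesteps this by letting the multiplier rise instead, leaving the base clock, and therefore SATA and USB, untouched.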
"Locking" mainstream chips to prevent overclocking is a strange move, to say the least, from a business standpoint. Overclockers and gamers who purchase CPUs just in order to overclock them hardly number more than tens of thousands in a large region. People who build their own computers for entertainment and then overclock CPUs and GPUs to get better performance simply buy what they can afford; hardly anyone from this group will acquire more expensive chips. Perhaps the ASPs of CPUs will rise, but the reputational losses from the move will be much higher than the financial gains. Of course, such PCs will become only a very little less affordable, but the problem is that those who care about $10 never notice anything but the price.
Microprocessors code-named Sandy Bridge E and the Patsburg-family chipsets do not have those limitations. But those platforms will not be truly affordable either, since mainboard makers will probably get their premium and Intel will get the premium it wants.
In the end, overclocking has been getting more expensive for ten years now, but this time it will become a truly money-demanding task.