Personal Computers of 2010: Ten Predictions

Technology in 2010 will bring rather revolutionary changes that are likely to significantly transform the whole market. X-bit labs believes that in 2010 electronic books will become much more popular, quad-core microprocessors will become mainstream, the DVD format will virtually die, 64-bit applications will cease to be exotic, and we also expect revolutions in gaming, input technologies and other high-tech spheres.

by Anton Shilov
01/04/2010 | 08:26 AM

The year 2009 was exceptionally interesting. On the one hand, for the first time in many years sales of personal computers declined; on the other hand, 2009 summed up all the trends of the decade: mobility, style, high definition, energy efficiency, platformization and others. The year also set a number of new trends that will only become apparent in the new decade.

 

Today we are going to share our expectations for 2010, which has a chance to be better than its predecessor for the economy. Still, the biggest winners will hardly be hardware companies. Online content sellers are going to see dramatic growth in their businesses: electronic books, video games, music and movies will all be in demand thanks to the development of the Internet.

Let’s see what we can expect from 2010.

 

E-Book Readers’ Market Set to Explode

The Sumerians were writing on clay tablets some 5500 years ago, the ancient Egyptians used papyrus around the same time, paper has been in use for about 1900 years, and electronic screens have been seriously competing with paper for about 15 years now. But will electronic books replace paper books after more than 550 years of the printed book’s dominance?

The first electronic book readers were launched back in 1998. They were bulky, they could not store a lot of information, their battery life was short and there were few e-books available in general. Sony Corp. presented its Librié e-book reader with an e-ink screen for the Japanese market only in 2004, six years after the first-generation e-book readers emerged and about two years before the firm unveiled its Reader family for the rest of the world. The new-generation electronic book readers were thinner and lighter, sported higher capacities and longer battery life, and there were more electronic books available on the market. However, it looks like only in 2010 will e-book readers finally gain strong market acceptance.

By early 2010 two large booksellers – Amazon and Barnes & Noble – had introduced their own electronic book readers that are available for around $300, quite a large sum of money. However, those devices have a tangible advantage: new books can be acquired wirelessly from anywhere, without the need to go to a store or wait for delivery. In addition, electronic books are cheaper than hardback bestsellers.

Based on unofficial information, there are at least ten high-tech companies looking to enter the e-book reader market in 2010, among them Apple, Asustek Computer, Lenovo Group, Micro-Star International (MSI) and a number of others. Obviously, with competition on the e-book reader market intensifying, a price war is inevitable, and it looks like in the second half of the year more affordable e-book readers will be rapidly gaining market share.

With the increased number of electronic book readers, services that distribute electronic books will also emerge. The question here is whether traditional booksellers have a huge advantage over newcomers or whether there will still be real competition between them. Content owners would obviously like to keep the price of a single bestseller at around $10/€10, which may make sense in the U.S., but makes little sense in Eastern Europe, where a new hardback paper book costs about the same amount of money. Moreover, in order to target those markets, makers like Amazon, Barnes & Noble and Sony will have to add support for Cyrillic and other character sets.

In any case, 2010 will finally see the rise of the e-book. Electronic books will still trail paper books, but their market will increase dramatically this year.

Quad-Core Chips to Become Mainstream

For about twenty-five years the leading microprocessor suppliers waged the so-called megahertz war. The clock speed of central processing units (CPUs) was the most important indicator of their performance, and companies like Advanced Micro Devices, Cyrix and others even invented so-called performance ratings that were meant to indicate speed comparable to Intel’s chips despite lower frequencies. However, everything changed in 2005, when both AMD and Intel unveiled their dual-core processors. So began the great core war!

Dual-core central processing units have become mainstream solutions since then, and in 2010 their successors – quad-core CPUs – will conquer the mainstream market. There is rather clear evidence for that: Microsoft Windows 7 and DirectX 11 can take advantage of multi-core chips, and a lot of other consumer-oriented software will follow. Since quad-core chips boast decent clock speeds nowadays, their performance will be enough for mainstream applications. Hence, demand for quad-core chips will definitely be there in 2010.
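To illustrate the point, here is a minimal sketch of our own (not code from any of the applications mentioned above) of how a consumer program can scale across however many cores the CPU provides: it queries the core count at run time and splits a data-parallel job into that many worker threads. POSIX threads are assumed; a Windows program would use the equivalent Win32 calls.

/* Minimal multi-core sketch: query the core count, split the work.
 * Assumes a POSIX system (compile with -pthread). */
#include <pthread.h>
#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>

#define N 1000000

typedef struct { const float *in; float *out; size_t start, end; } job_t;

static void *worker(void *arg)
{
    job_t *j = (job_t *)arg;
    for (size_t i = j->start; i < j->end; i++)
        j->out[i] = j->in[i] * 2.0f;   /* stand-in for real per-item work */
    return NULL;
}

int main(void)
{
    long cores = sysconf(_SC_NPROCESSORS_ONLN);   /* 4 on a quad-core CPU */
    if (cores < 1) cores = 1;

    float *in  = malloc(N * sizeof *in);
    float *out = malloc(N * sizeof *out);
    for (size_t i = 0; i < N; i++) in[i] = (float)i;

    pthread_t *threads = malloc((size_t)cores * sizeof *threads);
    job_t     *jobs    = malloc((size_t)cores * sizeof *jobs);
    size_t chunk = N / (size_t)cores;

    for (long t = 0; t < cores; t++) {
        jobs[t].in = in; jobs[t].out = out;
        jobs[t].start = (size_t)t * chunk;
        jobs[t].end   = (t == cores - 1) ? N : jobs[t].start + chunk;
        pthread_create(&threads[t], NULL, worker, &jobs[t]);
    }
    for (long t = 0; t < cores; t++)
        pthread_join(threads[t], NULL);

    printf("Processed %d items on %ld threads\n", N, cores);
    free(in); free(out); free(threads); free(jobs);
    return 0;
}

On a quad-core chip the loop runs in four threads, on a dual-core chip in two, which is exactly why mainstream software can benefit from extra cores without being rewritten for every new CPU generation.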


AMD Athlon II X4 microprocessor

In fact, AMD has already introduced quad-core chips priced below $100, and it does not take a prophet to say that more and more quad-core chips from both leading microprocessor suppliers will become available in the sub-$100 price envelope. It remains to be seen when Intel starts selling low-cost quad-core processors and under which brand. At present Intel Core 2 processors, even the dual-core versions, cost over $100, and everything below that is shipped under the Celeron or Pentium names.


Intel Core 2 Quad microprocessor

Six-core processors from both AMD and Intel will serve the premium segment of the market, and naturally quad-core chips will lose part of their price premium and inevitably become much more affordable than they are today, especially in the second half of 2010. The lower the price, the higher the demand.

DirectX 11 to Revolutionize PC Gaming

In 2010 DirectX 11 will do what DirectX 10 should have done: change the PC gaming experience.


Screenshot from the Dirt 2 game with DirectX 11 enhancements

The release of the DirectX 10 application programming interface (API) back in late 2006 did not bring any substantial changes to the market of PC games due to a number of reasons:

With DirectX 11 the situation seems to be a whole lot different:

In addition, DirectX 11 not only supports new features, but also handles DirectX 10 capabilities on DirectX 10 hardware better. Therefore, DX11 will not only improve the quality of video games on newer hardware, but will also boost performance on the installed base of hardware.


Screenshot from the Unigine benchmark, which utilizes numerous DirectX 11 features, including tessellation

Analyst Dean McCarron from Mercury Research has already said that the transition to the DirectX 11 API will be the fastest in history, and he seems to be completely correct. The first games that take advantage of DirectX 11 were out just a couple of months after the hardware and software launches, and more are coming in 2010. It looks like DirectX 11 will finally improve the PC video game experience.

Solid-State Drives: Gradually Gaining Market Share

Solid-state drives (SSDs) have indisputably gained market acceptance in the most recent twelve months. However, they will remain niche products in 2010, albeit dramatically more popular than back in 2009.

The main problem of SSDs – the cost per gigabyte – will, without doubt, migrate from 2009 to 2010. At present, a high-end WD VelociRaptor hard drive with a 10,000rpm spindle speed and 300GB capacity costs around $200 in the U.S., whereas premium 256GB solid-state drives from companies like Corsair Memory or OCZ Technology Group cost around $800 or more. Although they do provide excellent performance, that amount of money is still too large for the upgrade market; in the end, a graphics sub-system for that price will offer a much more tangible benefit. So, we foresee premium SSDs gaining market share slightly in 2010 as a result of the recovering economy, but not because they will be getting tangibly more affordable. Obviously, manufacturers will make existing models cheaper as they introduce new ones, but the demands of end users will get higher in 2010 too.
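For a quick sense of the gap, here is a back-of-the-envelope calculation using the street prices quoted above (the figures are this article’s examples, not a market survey):

/* Rough cost-per-gigabyte comparison based on the prices quoted above. */
#include <stdio.h>

int main(void)
{
    double hdd_price = 200.0, hdd_gb = 300.0;   /* WD VelociRaptor 300GB   */
    double ssd_price = 800.0, ssd_gb = 256.0;   /* premium 256GB SSD       */

    printf("HDD: $%.2f per GB\n", hdd_price / hdd_gb);   /* about $0.67 */
    printf("SSD: $%.2f per GB\n", ssd_price / ssd_gb);   /* about $3.13 */
    return 0;
}

Roughly $0.67 per gigabyte versus more than $3 per gigabyte: even a steep SSD price cut in 2010 will not close a gap of that size.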


Solid-state drives by Toshiba Corp.

Lower-cost SSDs that are based on multi-level cell (MLC) flash, come in the 2.5” form factor and do not feature particularly advanced technologies will get even more popular than they are today. The price of a mainstream or entry-level 120GB – 250GB solid-state drive should get lower in 2010 thanks to mass production of 3-bit-per-cell and 4-bit-per-cell flash chips by various manufacturers. So, traditional suppliers of flash-based products will try to push entry-level SSDs more aggressively. Some desktop makers may even install such drives into their machines in order to make them quieter and greener.

It should be noted that performance is not the only benefit of SSDs. Low power consumption is important for mobile computers and servers, so demand for solid-state drives from manufacturers of enterprise-class systems as well as mobile computers will probably be very high this year. The price premium of an SSD will not be noticeable in higher-end notebooks, hence expect expensive mobile computers to transition to SSDs this year. In fact, we believe a number of mainstream systems will also start using SSDs.

From the volume perspective, enterprise-class systems seem well positioned to become the largest market for solid-state drives thanks to their natural advantages of performance, low power consumption and reduced storage-system complexity. Considering that Seagate, Western Digital and Hitachi GST are gearing up to start volume shipments of enterprise SSDs, we would expect the overall SSD market to grow significantly.

To sum it up, while the absolute majority of gigabytes of information will continue to be stored on traditional hard drives, the share of SSDs will increase. It will not grow enough to pose any threat to HDDs, but solid-state drives will get more noticeable in 2010.

Stereoscopic 3D Blu-Ray to Get Tepid Welcome

Despite many doubts, the Blu-ray 3D specification was ratified just before the holidays by the Blu-ray Disc Association (BDA). However, it is hard to expect stereoscopic 3D Blu-ray in particular, or any other stereo 3D technology in general, to become popular in 2010.

The Blu-ray 3D standard itself is pretty liberal: players should support full HD 1080p (1920x1080, progressive scan) resolution for each eye, playback of both 3D and 2D content, the MPEG-4 Multiview Video Coding (MVC) codec (an extension of the ITU-T H.264 Advanced Video Coding (AVC) codec currently supported by all Blu-ray disc players) and appropriate outputs. Blu-ray 3D content should play back on 2D BD players in two-dimensional mode.

Moreover, Blu-ray 3D is display agnostic, meaning that Blu-ray 3D products will deliver the 3D image to any compatible 3D display, regardless of whether that display uses LCD, plasma or another technology and regardless of what 3D technology the display uses to deliver the image to the viewer’s eyes. The compulsory requirement for stereoscopic 3D is that those screens support 120Hz or higher refresh rates, since the frames for the left and right eye are shown in alternation and each eye should still get at least 60 frames per second. And here lies a problem.

Even though the vast majority of TV sets on the market support 720p or 1080p high-definition resolution, the transition to HDTVs still has not been completed. The problem here is that far from all modern HDTVs or computer monitors support 120Hz or higher refresh rates. Therefore, while Blu-ray can address the whole installed base of HDTVs, Blu-ray 3D can only address a tiny fraction of installed HDTVs.

Moreover, those HDTVs that do support 120Hz or higher refresh rates and offer some kind of stereo 3D capability use different stereoscopic 3D technologies: shutter-glasses-based or polarized-glasses-based. Considering that those technologies provide a different kind of experience for different people, there will naturally be a lot of confusion, and the majority will settle for “good old” 2D equipment.


Blu-ray 3D demo by Sony Corp. Image by CTV News

There is great interest in stereoscopic video gaming and movies; however, with all the complexities of currently available technologies, 2010 will not be the year of stereo 3D. On the other hand, 2010 will be the year of 2D Blu-ray as consumers continue to migrate to high-definition video.

GPGPU Gets Ready for Prime Time

Enthusiasts have been talking about general-purpose processing on graphics processing units (GPGPU) for about eight years now, and some progress in that direction has been made in the last two or three years. It looks like in 2010 there will finally be a critical mass of consumer applications that take advantage of GPGPU technologies.


Screenshot from vReveal by MotionDSP, a program that can take advantage of the computing capabilities of Nvidia GeForce hardware

Until recently all GPGPU-accelerated software was developed for specific hardware using specific tools, e.g. ATI Stream or Nvidia CUDA; there was no unification or standards-based approach. As a result, the vast majority of GPGPU programs are exclusive to the Nvidia CUDA architecture and work only on Nvidia GeForce graphics processors. However, since the Microsoft DirectCompute and Khronos OpenCL standards were finalized in 2009 and appropriate software development kits are available from both ATI/AMD and Nvidia, standards-based GPGPU programs should emerge in 2010.
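To give an idea of what “standards-based” means in practice, below is a minimal OpenCL sketch of our own (not taken from any shipping application): the same kernel source is compiled at run time by whatever OpenCL driver is installed, so it runs unmodified on ATI/AMD and Nvidia GPUs alike. Error handling is trimmed for brevity.

/* Minimal vendor-neutral OpenCL example: double every element of a buffer. */
#include <CL/cl.h>
#include <stdio.h>

static const char *src =
    "__kernel void scale(__global float *data, float factor) { "
    "    size_t i = get_global_id(0); "
    "    data[i] = data[i] * factor; "
    "}";

int main(void)
{
    float buf[1024];
    for (int i = 0; i < 1024; i++) buf[i] = (float)i;

    cl_platform_id platform; cl_device_id device;
    clGetPlatformIDs(1, &platform, NULL);
    clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, NULL);

    cl_context ctx = clCreateContext(NULL, 1, &device, NULL, NULL, NULL);
    cl_command_queue queue = clCreateCommandQueue(ctx, device, 0, NULL);

    cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, NULL);
    clBuildProgram(prog, 1, &device, NULL, NULL, NULL);
    cl_kernel kernel = clCreateKernel(prog, "scale", NULL);

    cl_mem mem = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                sizeof(buf), buf, NULL);
    float factor = 2.0f;
    clSetKernelArg(kernel, 0, sizeof(mem), &mem);
    clSetKernelArg(kernel, 1, sizeof(factor), &factor);

    size_t global = 1024;
    clEnqueueNDRangeKernel(queue, kernel, 1, NULL, &global, NULL, 0, NULL, NULL);
    clEnqueueReadBuffer(queue, mem, CL_TRUE, 0, sizeof(buf), buf, 0, NULL, NULL);

    printf("buf[10] = %f\n", buf[10]);   /* expect 20.0 */

    clReleaseMemObject(mem); clReleaseKernel(kernel); clReleaseProgram(prog);
    clReleaseCommandQueue(queue); clReleaseContext(ctx);
    return 0;
}

The key point is that the program names no vendor anywhere: whichever OpenCL driver is present compiles the kernel for its own GPU, which is what finally makes cross-vendor consumer GPGPU software practical.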

Thanks to industry standards, we expect a lot of programs to finally start using graphics processing units for general-purpose computing, which will, without doubt, greatly increase the importance of GPUs and transform them into universal accelerators. This will help ATI, the graphics business unit of Advanced Micro Devices, and Nvidia Corp. boost sales of their advanced GPUs this year.

Mobility Becomes Necessity: New Devices Incoming, Existing Categories to Grow

There is a time to scatter stones and a time to gather stones. Nowadays it is definitely the time for scattering stones on the market of mobile devices. Just several years ago smartphones combined the functionality of cell phones and personal digital assistants (PDAs), but in 2010 we expect a range of new mobile devices to emerge.


Nokia N900

Here is what we can expect:

Obviously, a lot of the product categories that come into sight in 2010 will eventually become extinct and their functionality will be found in highly integrated devices. However, in the next two to three years their market share will be growing.

High-Definition Everywhere, Standard-Definition Approaches End-of-Life

The first ten years of the new millennium were definitely the decade of high definition. The mass availability of multi-megapixel digital cameras since 2002, the emergence of 720p (1280x720 resolution, progressive scan) and eventually 1080p/i (1920x1080 resolution, progressive or interlaced scan) television in the middle of the decade, the launch of the high-definition Blu-ray and HD DVD formats in 2006 and the surfacing of multiple high-definition on-demand video stores towards the end of the decade are all evidence of that. The new decade can easily become the decade of ultra high definition. But first, there will be high definition everywhere!


Microsoft Zune HD

There are Blu-ray players and drives to enjoy premium, no-compromise high-definition video; there are Internet-based services that allow watching videos with quality higher than that of DVD, but not as high as that of Blu-ray; and there are cameras that shoot high-definition photos and videos. All of that content has to be watched on proper screens with all the details. Of course, it does not make a lot of sense to install an HD screen on a tiny personal digital media player (PDMP) or cell phone, but such a device should support output of high-definition video to external HD screens using HDMI or similar interfaces. The Microsoft Zune HD and several less known players already support 720p output, unlike the Apple iPod, but in the coming year there will be many more HD-capable PDMPs and cell phones.

Obviously, with further adoption of HDTVs, traditional standard-definition DVDs will just become less appealing. Moreover, as BD players get cheaper, and so will the actual movies, the retail market share of Blu-ray discs will grow exponentially. To make matters even worse for DVD, online HD-like services will offer a better experience at lower cost, hence there will be even fewer reasons to buy the outdated packaged media format.

So, with HD heading everywhere, SD and DVD are headed nowhere.

New Types of Input on the Offensive: from Multi-Touch Pads to Project Natal

For over two decades the world has used the keyboard + mouse type of input on the PC. Maybe it’s time for a change?

Although traditional input seems rather ideal for two-dimensional operating systems, with the increased amount of information nowadays some may find contemporary types of input inefficient or unintuitive. There have been numerous attempts to change the input of various devices, but so far we still use keyboard + mouse to control personal computers, as well as remote controls for TV sets.

Back in 2006 Nintendo introduced its Wii game console with motion-based input, and in 2008 Asustek Computer introduced Wii-like controllers for personal computers. Around 2007 many notebooks got multi-touch pads that could recognize certain gestures to perform certain operations (obviously, scrolling had been supported by touch-pads for years back then). Late in 2008, Toshiba introduced notebooks with the SpursEngine processor that could be controlled using gestures (when special software was installed). Finally, in 2009 the new operating system from Microsoft – Windows 7 – started to support touch-based input. Obviously, with four or more new input technologies competing, none of them can really gain traction and become an alternative to traditional devices.

We do not really expect any unconventional input technology to become a de facto standard in 2010, but we are looking forward to the popularization of certain new input methods.


Microsoft Natal motion sensor

Firstly, with the introduction of Microsoft Natal and the Sony motion controller, motion-based input will become an official standard on video game systems. Nintendo Wii will have to prove that its platform is still better than those from rivals: even with the improved precision of the Wiimote, the console still lacks modern graphics and multimedia functionality and no longer enjoys any price advantage. At this point it remains to be seen how good the Natal motion sensor actually is, since while gesture-based input does work for HDTV control, it may not be precise enough for dynamic video games.

Secondly, with the introduction of any new input technology, mice and keyboards are not going to disappear overnight. With Windows 7’s support for touch-screen input many vendors will indisputably add such capabilities to their desktops, notebooks and tablets, but frankly it is hardly convenient to use such an input method on anything but a tablet PC. Makers like Toshiba can also add gesture-based input, but such a method cannot be used in public places for obvious reasons. Asustek has been touting voice control for its Eee PCs, but so far no progress can be observed.


Logitech MX Air mouse/pointer

What we expect to happen in 2010 is the availability of more personal computers with untraditional types of input. More systems will come with touch-screens, there will be computers that can be controlled using gestures, and some systems may even adopt motion-based controllers like the Logitech MX Air.

64-Bit Software Heading Everywhere

The first 64-bit microprocessors designed for consumers were launched in September 2003, but initially there was no Windows XP operating system with 64-bit mode support, and eventually it transpired that there was no software that could work in 64-bit mode natively. Moreover, few systems back then featured over 4GB of memory. Nowadays 4GB of RAM is normal, but there is still not a lot of 64-bit software. 2010 will change that.

In 2010 Microsoft Corp. will release its highly anticipated Office 2010 with 64-bit mode support, which will mark the official start of the 64-bit software era. Of course, Windows Vista and Windows 7 are available in 64-bit versions and Adobe Photoshop CS4 also has a 64-bit flavor, but after the Windows operating system, Office is one of the most popular software suites in the world. After Office hits the 64-bit milestone, others will simply follow. In the end, large data sets are very common today and 32 bits may not be enough in the very near future.
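The 4GB ceiling is not a marketing number but a direct consequence of pointer width: a 32-bit pointer can only distinguish 2^32 bytes. The small illustrative C program below (our own sketch, with an arbitrary 5GB request) shows what changes when the same code is built as a 32-bit or a 64-bit binary.

/* 32-bit vs. 64-bit addressing: how much memory can one process even ask for? */
#include <stdio.h>
#include <stdlib.h>
#include <stdint.h>

int main(void)
{
    printf("Pointer size: %zu bits\n", sizeof(void *) * 8);
    printf("Largest addressable object: %llu bytes\n",
           (unsigned long long)SIZE_MAX);

    /* An arbitrary 5GB request, purely for illustration. */
    unsigned long long five_gb = 5ULL * 1024 * 1024 * 1024;
    if (five_gb > SIZE_MAX) {
        /* 32-bit build: the request does not even fit into size_t. */
        printf("5GB exceeds this build's address space\n");
    } else {
        /* 64-bit build: the request is valid; success depends on free memory. */
        void *p = malloc((size_t)five_gb);
        printf("5GB allocation %s\n", p ? "succeeded" : "failed");
        free(p);
    }
    return 0;
}

Compiled as a 32-bit binary the request cannot even be expressed; compiled as 64-bit it is limited only by available memory, which is why suites like Office 2010, Photoshop and next-generation games are moving to 64-bit builds.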

Unfortunately, not a lot of video games – the most demanding applications available today – can address over 4GB of memory and use all four processing engines available in modern systems. We expect this to change in 2010 with the launch of next-generation titles, including those that rely on the DirectX 11 application programming interface (API).

All new processors on the market now support 64-bit capability, and even mainstream systems feature 4GB or more of memory, especially considering that modern graphics cards can carry 1GB or more onboard. As a result, the transition to 64 bits is inevitable, and in 2010 it will become a reality. The hardware base is here and the software is incoming.

Platformization Forever: Third-Party Chipsets Set to Die

For years the industry has been talking about platformization and the integration of computer components. In 2010 and 2011 platformization will reach its logical conclusion and third-party chipsets will vanish into oblivion.

Early in 2010 personal computers featuring Intel Atom “Pineview” and Intel Core i3/Core i5 “Arrandale” and “Clarkdale” processors are expected to go on sale. All of those chips feature memory controllers and graphics cores integrated into the microprocessor, which automatically eliminates any need for third-party chipsets with integrated graphics and memory controllers. As a result, companies like Nvidia Corp. or Via Technologies will only be able to offer discrete graphics solutions for next-generation desktops, notebooks and netbooks.


Intel Core i5 "Clarkdale" processor

Advanced Micro Devices is behind Intel Corp. in terms of fusing central processors with graphics processing units (GPUs), and its code-named Llano chip is only due in 2011, which basically leaves some room for third-party chipset designers. However, this year AMD will become even more competitive with its chipsets. Previously AMD’s 6-series core-logic sets were behind Nvidia’s DirectX 10-class integrated solutions in terms of performance, whereas AMD’s 7-series was more expensive. This year AMD plans to roll out 8-series chipsets, and the 7-series is likely to become more affordable compared to solutions from Nvidia. As a result, the share of AMD-based personal computers with Nvidia chipsets will drop dramatically. Considering that Nvidia cannot make chipsets for next-generation AMD Opteron processors for servers and workstations, this all means that the era of the AMD-Nvidia partnership is de facto over.

Some might think that with the lack of any third-party chipsets for popular processors from Intel and AMD there will be less choice for consumers. But it should be noted that AMD and Intel now compete not only on the market of microprocessors, but also on the market of platforms. Intel has already announced that its forthcoming Core i3/Core i5 chips will support video transcoding using the integrated graphics core, whereas AMD promises an even richer set of general-purpose computing capabilities on its integrated GPUs.