The Day After Tomorrow: Personal Computers in 2020

Personal computers of today, as well as the usage models of PCs, differ radically from the systems and user experience of ten years ago. But what will happen to personal computers in another ten years? Let us try to guess! UPDATE: Added comments from Jon Peddie of Jon Peddie Research.

by Anton Shilov
09/30/2010 | 09:45 PM

The personal computing industry is living through rather radical changes both in the usage models of computers and in the technologies that power new and forthcoming devices. What we are seeing today is the tip of the iceberg of a major new era of personal computing. In this article we will try to guess some of the ramifications of the new era for PC hardware.

 

The history of the personal computer can be divided into several eras depending on the dominant usage model:

Obviously, one can divide the evolution of the PC into more periods, but those three epochs show the biggest differences in terms of usage models for consumers and business strategies for companies.

In this editorial we will try to find out how the new usage models, technologies, business approaches and other factors will impact the major building blocks of personal computers. Our guesses are based on our knowledge of the industry and the mechanisms that change the trends of its development.

The Great Battle of Form-Factors

Nowadays there is a lot of talk about the battle between desktops, laptops, notebooks, netbooks, tablets, smartphones and so on. One could call it the great battle of PC form-factors. In reality, it more resembles the great alignment of form-factors, as none of them is going to become dominant and none is likely to disappear completely. Some are meant for creation, others for entertainment, still others for communication; there is no single size that fits all.

Many predicted the death of desktop computers years ago, but even in 2010 over one hundred million desktops will be sold worldwide. Notebooks are more convenient to carry around, but desktops offer larger screens, better keyboards, higher performance and other advantages. Those who use desktops for productivity applications or gaming will continue to do so, even though the share of such people will be relatively low. Those who use desktops only to check email or read the news will cease to buy such computers in the future. In fact, the mass-market desktop as we know it today has every chance of transforming into the all-in-one PC a decade from now thanks to the availability of powerful, yet low-cost and low-power PC hardware.

 
Desktop of the 21st century. Concept design by Kyle Cherry

Notebook computers exist in many different sizes depending on their usage models. A gamer wants a large screen and a powerful graphics chip inside, whereas a traveler prefers light weight, long battery life, maximum reliability and a decent level of performance to efficiently run productivity applications. Notebooks that are only used for media consumption or email checking will likely become extinct in the next couple of years, since netbooks and tablets naturally suit the usage models of their owners better. Professionals who have been using convertible notebooks (tablet PCs) will also most likely migrate to modern slates. In general, the share of classic laptops will shrink, but they will not disappear.


Lenovo ThinkPad X301

The netbook is a relatively new phenomenon on the market of PCs. Primarily such computers are aimed at two groups of people: those who do only basic things on their PCs and for whom any level of performance is good enough, and those who use these computers for Internet-based communications as well as entertainment. The latter category will eventually prefer tablets or even high-end smartphones, whereas the former will keep using netbooks.

Tablets have been around in niche markets for many years now in the form of convertible notebooks. They naturally did not become popular among mass consumers due to many hardware and software limitations. However, tablets like the Apple iPad or Dell Streak are completely new types of personal computers that do not resemble anything from the past. The main point of modern slates is to consume multimedia content, use social networks and communicate. Nobody will ever position tablets for productivity applications, since the lack of a hardware keyboard is a major obstacle. Nonetheless, we would expect speech-to-text technologies to evolve dramatically in the following years, hence it will be possible to dictate relatively long texts to slates. Moreover, thanks to the evolution of cloud computing technologies, it will be possible to play even the latest video games on a thin handheld device.


RIM Blackberry Playbook tablet

One thing that slate-type PCs will never be able to do efficiently is make phone calls. As a result, one will not be able to live without a smartphone. Moreover, those for whom consumption of content is not very important will prefer smartphones with decent screens to tablets. Smartphones of the future will gain a lot of performance and capabilities thanks to the rapid development of hardware as well as cloud computing. But given natural interface and screen size limitations, smartphones will rather control a dishwasher than assist in actual work.

The next generations of slates and smartphones, which will feature multi-core central processing units, high-definition screens and long battery life, will impact special-purpose devices, such as portable game consoles, MP3 players, dictaphones, pocket cameras, e-book readers and so on, but not general-purpose PCs. Still, a lot of tasks that are now done using desktops or notebooks will migrate to smartphones or tablets ten years from now. The main question is how much performance will be needed locally on a mobile device and which performance-hungry tasks can be "outsourced" to the cloud.

 
Asus Waveface conceptual wrist-watch computer

All in all, there will be no single device that becomes the symbol of the "all-day online computing" era; there will be a wide array of various connected devices instead. Virtually all the PC form-factors that exist today will continue to exist in 2020; moreover, new form-factors are likely to emerge and spread by that time: expect wrist watches with access to the Internet, wearable computers, smart consumer electronics and so on.

Central Processing Units: More Functions, Lower Power

Microprocessors as we know them today are likely to disappear in ten, perhaps a few more, years from now. Already in 2010 we have central processing units with integrated graphics and some other features; a decade from now we are going to have solutions that are far more integrated.

At present, central processing units are essentially doing the same things that they have been doing for decades. But ten years from now they are more than likely to transform rather dramatically. There are several paths for such a transformation:

Both approaches naturally have their pros and cons, but the general direction is rather clear at this point: microprocessors and graphics processors are set to become one chip. Of course, there will be standalone graphics cards as well as central processing units for specific tasks that will not be able to process graphics, yet will probably inherit certain stream processing capabilities to speed up certain operations.


Intel SCC chip

Apart from the trend towards merging microprocessors with graphics chips, an interesting thing is happening with the micro-architectures of those microprocessors. At present we see Intel (and AMD) taking three (two in the case of AMD) distinct routes with their CPU architectures:

Given the fact that Intel is more than likely trying to compensate for the lack of a successful graphics architecture suitable for computing with its MIC (many integrated core) micro-architecture, it is probable that the world's largest maker of chips will eventually try to wed the two designs somehow. Still, generic multi-core CPUs will continue to prevail, as software will hardly be able to benefit from MIC-like architectures.

"The software suppliers (ISVs) will continue to lag behind the hardware and disappoint us with their lack of engagement and exploitation of the hardware as they drive for the lowest common denominator. There will of course be exceptions, but for the most part ISVs have slowed down the industry," said Jon Peddie, the head of the Jon Peddie Research analyst firm.


AMD Orochi chip based on Bulldozer architecture

At present hardly anyone knows for sure what future CPUs will be like, and a lot of important trends that can be extrapolated to the year 2020 will be unleashed along with the release of Haswell and post-Haswell microprocessors in 2013 and beyond. In fact, the example of the AMD Bulldozer design - where the processor consists of modules and two integer units per module share one FPU - shows that companies are trying hard to pack more execution units into their chips and are eager to cut down some other things. In general, Bulldozer's "hybrid" core approach may be a glimpse of the longer-term future of CPUs.

Longer-term roadmaps from both AMD and Intel clearly show that the two leading designers of processors will continue to develop low-power architectures to compete against ARM-based designs. At present everything indicates that x86 and ARM will compete directly against each other in the same markets, but x86 will retain high-performance systems, whereas ARM is going to stay the king of the extreme low-power market.

Let us try to summarize our guesses:

"ARM and x86 continue to fight and co-exist. ARM is too popular and too many companies have too much invested in that architecture to change. Intel will be the challenger and continue an aggressive technology and marketing effort to gain market share," explained Mr. Peddie.

It is also highly likely that architectures like Power and SPARC will become extinct by 2020 due to economic and technology reasons.

"They cannot gain economy of scale and they cannot support the R&D needed to stay current and meaningful," said Jon Peddie.

Graphics Processing Units: Prosperity or Extinction?

Graphics processing units have been gaining importance ever since the first 3D games that could use their functionality emerged. But will discrete GPUs actually exist a decade from now, or will they share the fate of standalone audio cards and add-on floating point units (FPUs)?

With the release of hybrid processing units that combine x86 cores and graphics engines, the market of low-end and inexpensive mainstream graphics cards will inevitably shrink rather considerably, especially in the mobile segment, since both Intel Sandy Bridge and AMD Llano promise very high performance in video games and other GPU-accelerated applications. In order to keep the segment of graphics boards that cost less than $100 alive, companies like AMD and Nvidia will have to offer very advanced products with a major performance advantage over integrated graphics. Naturally, this is not a matter of technology, but rather a matter of economics.

Recently Nvidia outlined its plans through 2013 - 2014, and we do know that ATI, the graphics business unit of AMD, has plans for at least two more generations after the Radeon HD 6000. Nvidia has even outlined its design goal for the forthcoming Kepler and Maxwell architectures: not doubling gaming performance every year, but aggressively improving double-precision floating point performance per watt, a clear indication that the company sees general-purpose and high-performance computing on GPUs as something tremendously important.

There is a definite consensus between AMD and Nvidia that graphics chips are very suitable for highly parallelized computing. Both companies have HPC GPU deployments and it is clear that the market for GPU-accelerated high-performance computing will only grow. Consequently, both companies will continue to develop new graphics processors with HPC and eventually cloud computing in mind.

But the market of HPC should not be overestimated. According to Mercury Research, the total available market of discrete graphics processors (desktop and mobile) was approximately 34.9 million units per quarter, which translates to a TAM of over 100 million GPUs per year. The world's most powerful supercomputer today - the Jaguar - utilizes around 37,000 six-core AMD Opteron microprocessors. The K supercomputer, jointly developed by Fujitsu and the Japanese ministry of education, will feature 80,000 eight-core SPARC64 VIIIfx chips to deliver 10 PetaFLOPS of double-precision performance. It takes many months or even several years to assemble giant supercomputers. Since supercomputers are usually based on dual-socket blades and at most two compute boards can be installed into each blade, it is not hard to see that the TAM for HPC compute boards will remain in the hundreds of thousands, maybe millions, in the coming years, but not in the tens of millions.
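To put those figures side by side, here is a minimal back-of-envelope sketch of the arithmetic above; the assumption that every blade of a Jaguar-sized machine carries two accelerator boards is purely illustrative.

```python
# Back-of-envelope comparison of the consumer discrete GPU market with a
# plausible HPC accelerator market, using the figures quoted above.

quarterly_discrete_gpus = 34.9e6                  # Mercury Research, per quarter
annual_consumer_tam = quarterly_discrete_gpus * 4
print(f"Consumer discrete GPU TAM: ~{annual_consumer_tam / 1e6:.0f} million per year")

# Illustrative assumption: a Jaguar-sized machine (~37,000 CPU sockets,
# i.e. ~18,500 dual-socket blades) with two accelerator boards per blade.
blades = 37_000 // 2
accelerators_per_system = blades * 2
print(f"Accelerators for one Jaguar-class system: ~{accelerators_per_system:,}")

# Even a hundred such systems built over several years stay far below the
# consumer market's annual volume.
print(f"100 such systems: ~{100 * accelerators_per_system / 1e6:.1f} million boards")
```

Even under these generous assumptions, HPC accelerator volumes come out two orders of magnitude below the consumer discrete GPU market, which is the point the paragraph above makes.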

In fact, the world's two largest HPC GPU deployments are the Nebulae supercomputer, with 4640 Tesla C2050 cards, and Tianhe-1, which sports a total of 2560 dual-chip ATI Radeon HD 4870 X2 cards (5120 GPUs). Still, the market is growing fast, and at some point, maybe in 2020, the TAM for GPUs in the HPC space will be comparable to the TAM of graphics chips today. Of course, if Intel and AMD manage to release server processors with integrated MIC or GPU-based accelerators, the interest towards specialized accelerators will start to decline.


A server with two AMD FireStream boards inside

For PC gamers, the continuous development of GPUs for the HPC market means that GPUs will not become a niche product, but will continue to exist and evolve even in 2020. Another reason why graphics processors should continue gaining graphics performance is the inevitable increase of display resolutions and the inability of integrated graphics products to gain performance quickly enough to meet the demands of consumers on the leading edge of progress.

However, the market of graphics solutions a decade from now will be completely different from that of 2010. Inexpensive graphics chips will have to offer much higher levels of performance relative to integrated graphics than today and maybe even have certain exclusive features. Graphics cards of the day after tomorrow will not only come in the add-on card form-factor, as the further popularization of notebooks will drive demand towards external graphics cards. In the future it will be possible to plug an external graphics card into a laptop with a decent microprocessor and play the latest video games wirelessly on a large screen. Perhaps graphics rendering will move to the server (which still means that the server will need to have a GPU) and hence a premium graphics experience will be possible even on low-power handheld devices.

What can we expect from GPUs ten years down the road? If the evolution of graphics chips from 2010 to 2020 is as rapid as it was from 2000 to 2010, then this is a question that no one can answer. The main thing that we would expect is the consolidation of graphics and compute APIs in order to simplify the programming of different applications. However, Jon Peddie, whose firm is a leading analyst of the graphics market, does not believe this and claims that graphics-specific APIs will continue to exist in 2020.

"[Graphics APIs] will be necessary to support the unique architecture of a GPU," said Mr. Peddie.

Memory: A Standard a Day Keeps Strangers Away

The commodity memory market is probably among the most predictable things in the technology industry. Manufacturers of dynamic random access memory (DRAM) just love standardization, as it allows them to concentrate on efficient manufacturing and not on ensuring that their products actually work with certain controllers. Device makers love that too, since they have a lot of sources for DRAM. This is something that is not going to change even in 2020.

As we know, leading DRAM manufacturers will start commercial production of next-generation DDR4 memory in 2012. The actual mass transition to the new memory is projected to occur towards 2015, which means that by 2020 DDR5 will be ramping up. As we also know, the 2133MHz - 4266MHz effective clock-speeds of DDR4 force a change of memory sub-system topology to a point-to-point interconnect. As a result, DRAM manufacturers will need to increase the capacities of memory chips by using multi-layer stacking with through silicon via (TSV) technology. In the case of servers, the multi-layer DRAM IC approach alone will not be viable for high-end machines, so special switches will be installed onto mainboards to allow multiple memory modules to work on a single memory channel.

The point-to-point topology of DDR4 is rather questionable but, they say, inevitable. Given the fact that JEDEC, the organization that standardizes memory and other technologies, makes major changes pretty rarely, it is highly likely that point-to-point topology will be a part of DDR5 as well. Naturally, it is possible to expect 4.20GHz - 8.40GHz effective clock-speeds as well as generally smaller form-factors for both chips and modules.
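For a rough sense of what such clock-speeds mean in throughput terms, here is a small sketch; it assumes the standard 64-bit (8-byte) width of a single DDR memory channel.

```python
# Peak theoretical bandwidth of one memory channel at the effective
# transfer rates mentioned above, assuming the usual 64-bit channel width.

CHANNEL_WIDTH_BYTES = 8  # 64-bit DDR channel

def peak_bandwidth_gbs(effective_mt_per_s: float) -> float:
    """Peak bandwidth in GB/s for a given effective transfer rate (MT/s)."""
    return effective_mt_per_s * CHANNEL_WIDTH_BYTES / 1000

for rate in (2133, 4266, 8400):   # DDR4 low/high and a speculated DDR5-class rate
    print(f"{rate} MT/s -> ~{peak_bandwidth_gbs(rate):.1f} GB/s per channel")
```

In other words, a single channel would move from roughly 17 GB/s at the low end of DDR4 to somewhere around 67 GB/s at the speculated DDR5-class rate.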

What we do not expect to happen is the emergence of a proprietary standard that actually has a chance on the mass market. In the past decade we saw a number of attempts to push proprietary memory technologies onto the marketplace, but whether they involved completely different memory chips (XDR) or a simple modification of platforms (QBM), they have all failed.

There are a number of technologies that can potentially combine the advantages of flash memory and DRAM. We do expect some of them to find their place on the commercial market in the next ten years, but we hardly believe that they will actually replace DRAM and NAND flash as we know them today.

Storage Systems: HDD vs. SSD vs. Cloud

With the evolution of cloud computing, in ten years from now the question will not be whether we need local storage at all, but how much local storage we will need and how fast it should be. Ultimately, it is HDD vs. SSD vs. Cloud.

Local storage will play a much less important role in 2020 than it plays today. Nowadays we keep the absolute majority of our files on local hard drives, and this is where we install our applications. A decade from now the importance of cloud-based applications and storage will grow so considerably that this fact alone will redefine the market of PC storage much more than any evolution of technology.

Large hard drives today are used to store multimedia files (primarily video) by general consumers as well as large data files by professionals (e.g., engineers or designers). Multimedia – audio and video – tends towards ever higher quality and hence larger storage requirements. Files used by specialists require both storage space and security. Let us consider both usage models.

Ten years ago everyone was satisfied with 128Kb/s MP3s and DivX movies. Today that level of quality is unacceptable. Nowadays even 384Kb/s MP3s can be streamed from the Internet, but files encoded using FLAC or similar lossless audio formats cannot be; it will take several years before that becomes possible. The same applies to video. 720p HD video can barely be streamed today. Obviously, high-definition video (up to 50GB per movie) will be streamable directly from the Internet in 2020, but given the fact that multimedia is evolving rapidly and already today we face both stereo-3D and ultra high-definition video, it is more than possible that a decade from now video files will require several times more storage space. As a result, physical media and large HDDs will still be in use by consumers. Nonetheless, cloud storage will be needed for photos, videos and other types of data that one wants to share.
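As a rough illustration of why a 50GB movie is hard to stream, here is a minimal sketch of the arithmetic; the two-hour runtime is an assumed, illustrative figure.

```python
# Sustained bitrate needed to stream a Blu-ray-class movie in real time,
# using the 50GB figure quoted above and an assumed two-hour runtime.

movie_size_gb = 50
runtime_s = 2 * 60 * 60          # assumed runtime

sustained_mbps = movie_size_gb * 8 * 1000 / runtime_s   # GB -> megabits
print(f"Required sustained bitrate: ~{sustained_mbps:.0f} Mb/s")

# For comparison, the audio bitrate mentioned above is tiny:
print(f"384 Kb/s MP3 stream: {384 / 1000:.3f} Mb/s")
```

A sustained rate in the region of 55 Mb/s is far beyond what a typical 2010 broadband connection can guarantee, which is why physical media and local storage keep their role for now.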

On the other hand, it is not cast in stone that optical media will remain on the market. If it becomes extinct, then all the storage requirements will fall onto local storage, which will increase the demand for storage and drive the importance of low-cost-per-gigabyte hard disk drives among consumers. Potentially, this completely redefines the market of consumer-oriented storage technology. In fact, analyst Jon Peddie believes that optical media was flawed from the very beginning.

"Optical media was developed in the early 90s. It was clear then to many people, and certainly should have been to the giant companies developing the CD and DVD, that [Internet] bandwidth would increase and coverage would become ubiquitous within a decade or less. Also, Moore's law was well understood and the density and price of RAM could be easily forecasted. And HDDs were on the same exponential line of development. Knowing all that, why would a company invest billions of dollars in a technology that would follow the same obsolescence curve as 8-track, cassette, and video tape?" asks Jon Peddie.

Keeping work files locally allows a professional to ensure rapid access to them and to use them very efficiently. Manipulating photos that are 600MB in size over even a high-speed Internet connection is torture, to say the least. Naturally, the demands for storage will expand by 2020. However, keeping confidential corporate documents locally is not completely safe, and it makes a lot of sense to keep all the secret documents in the cloud. As a result, relatively small or moderately sized confidential files will migrate from local storage devices to the cloud. Moreover, most product information will be in the cloud. Without the need to keep documents, product information, emails and other business-related things locally, the demand for storage in business PCs will either remain similar to today's or grow insignificantly. Consequently, while workstations will require large local storage systems, most business PCs will not need large local storage devices.
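To put the 600MB example in perspective, here is a small sketch; the link speeds are assumed, illustrative figures rather than values from the article.

```python
# Time to move a single 600MB work file over links of various speeds.
# The link speeds below are illustrative assumptions.

file_size_mb = 600

for link_mbps in (10, 50, 100):                 # assumed link speeds
    seconds = file_size_mb * 8 / link_mbps
    print(f"{link_mbps:3d} Mb/s link: ~{seconds / 60:.1f} minutes per transfer")
```

Even on a fast link, every round trip of such a file costs the best part of a minute, which is why professionals will keep large working sets on local storage.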

To sum up:

 

Prices of solid-state drives have been decreasing at a pretty rapid pace in recent years, whereas demand for the benefits offered by SSDs has only been increasing. That said, we believe that SSDs will be used in about 50% of PCs shipped in 2020 (developing markets will stick to HDDs for many reasons). Perhaps those SSDs will not be used as primary storage devices for media, but they will be installed to boost performance. HDDs will continue to hold the vast majority of information, but they will have to coexist with flash-based storage. In fact, it is more than likely that hybrid storage devices featuring rotating media and a sufficient amount of flash memory will emerge in the coming years. But not everyone thinks that SSDs will grab a significant market share. According to Jon Peddie, HDDs will continue to prevail.

"[Solid-state storage will not be able to grab a significant (over 33%) market share]. Lifetime operation, cost, and negligible performance difference from 10K and 15K HDDs limit SSDs to special cases (high impact, EMI, maybe size)," said Jon Peddie, the head of the Jon Peddie Research analyst firm.

Monitors: Ultra-High Definition vs. Stereo-3D

Whether we like it or not, the monitor is the most important part of any computer. Whatever information the computer processes, it has to output onto the screen, and hence the evolution of monitors means the evolution of the user experience.

High definition is here, and while everyone continues to enjoy the high quality of movies, photographs and graphical user interfaces, HD can no longer amaze or be considered next-generation. By the year 2020 the world will likely enjoy something that is now known as ultra high definition (UHD), with a maximum resolution of 7680x4320 pixels (16:9). The technology will hardly reach the masses within ten years, as presently it is only a standard proposed by several broadcasters from Japan and Europe. Nonetheless, it is likely at least to emerge as the final standard that will again redefine the market of home video as well as entertainment in general.

Besides television, many professionals and gamers would like to have higher resolution displays to improve visual quality. Although 2560x1600 has been the highest consumer monitor resolution for many years now, there are ultra high-end displays for medical imaging and other applications that support 3200x2400 or 3840x2400 resolutions. There is a clear need for more screen real estate and higher clarity of graphics. As soon as leading manufacturers of LCD panels find a way to make such ultra high resolution panels more or less affordable, they will offer appropriate products to consumers.
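A quick sketch of the pixel counts involved shows why such a jump matters for GPUs and display interconnects; the resolutions are the ones quoted in this section.

```python
# Pixel counts for the resolutions discussed in this section,
# relative to today's 1080p HD baseline.

resolutions = {
    "1080p HD":         (1920, 1080),
    "High-end desktop": (2560, 1600),
    "Medical imaging":  (3840, 2400),
    "UHD proposal":     (7680, 4320),
}

baseline = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name:>16}: {pixels / 1e6:5.1f} Mpixels ({pixels / baseline:4.1f}x 1080p)")
```

The proposed UHD format carries roughly sixteen times as many pixels as 1080p, so every link in the chain, from GPU fill rate to cable bandwidth, has to scale accordingly.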

Stereoscopic 3D technology is being adopted very slowly these days. According to DisplaySearch, sell-through of "3D Vision-ready displays" as listed by Nvidia in the first two quarters of 2010 was less than 3500 units per quarter across all major U.S. PC outlets combined, a clear indicator that at present people do not want to pay for stereo-3D. Why should they? Wearing special glasses and sitting in a special position is definitely not comfortable, especially amid the lack of content. The so-called autostereoscopic 3D screens (which do not require glasses) could certainly make a huge difference, but present auto-S3D technologies are far from perfect and it is unclear when proper technologies will emerge. If manufacturers continue with the current approach that requires glasses, virtually all monitors and TVs in 2020 will support S3D, which will barely be used.

The type of panel to be used in 2020 still remains to be seen, given the rather uncertain prospects for OLED (organic light emitting diode) technology. In fact, given that large displays consume a lot of energy, it is more than likely that manufacturers will increase their efforts and will finally establish mass production of OLED-based products. As a result, by the end of the decade virtually all monitors and TVs will feature OLED screens. Perhaps an even more interesting prospect for OLED technology in the next decade is bendable displays, which can be used in loads of different applications.

In short, in 2020 displays are likely to get astonishing resolutions along with new types of panels. While stereo-3D with shutter glasses may become a feature found on the vast majority of consumer displays and TVs, it will hardly be the most important capability of future devices. Perhaps autostereoscopic 3D technology will change the whole industry.

Input Technologies: Traditions and Innovations

Input technologies are another crucial pillar of personal computing. Tactile feedback and/or intuitive input are very important for effective usage. Given the generally increasing number of devices, in 2020 there will be a lot more ways to control PCs than there are today.

When it comes to a traditional desktop or notebook, the keyboard and mouse/touchpad have proved to be efficient and comfortable. But given the fact that almost all devices nowadays feature web-cameras, it is more than logical to incorporate basic gesture recognition into future personal computers. In fact, special motion sensors like Microsoft Kinect (which even now costs considerably less than $100 to produce) would noticeably change the methods of interaction with consumer electronics and PCs. Moreover, facial recognition will also change the usage model considerably (imagine a computer/TV/etc. that knows the preferences and applications used personally by you). It is naturally not a good idea to wave to your PC or speak to your laptop in a public place, but in an appropriate environment or while gaming, motion-sensing input should not be underestimated.

Unfortunately, given the current turmoil on the market of non-traditional input, it is highly unlikely that there will be revolutionary changes on this market even ten years down the road.

"This is a nascent market that lacks any standards (not the least of which is a common vocabulary), is confused (and confusing) in its own terminology, and has limited applicability. Also, too many people confuse gesture with touch when discussing the UI," said analyst Jon Peddie.

Slate-type personal computers and smartphones will hardly benefit from motion-based input, but they will probably benefit from, for example, recognition of their environment. For instance, a mobile phone could automatically reduce the ring volume while at a business meeting. A notebook or a tablet could automatically find documents or fetch additional information from the Internet related to an ongoing conversation.

In fact, one of the most important qualities of personal computers of 2020 will not be the ability to offer different input technologies, but the ability to recognize actual needs based on environment and current activity. The best type of input is the absence of any special input from the end-user. Indeed, Intel is already working on context-aware computing, and probably ten years down the road we will see the fruits of that work.

Final Words

We are at the dawn of a new PC era, the era of all-day online computing. The new epoch will increase the number of Internet-connected devices by an order of magnitude, and the amount of information floating around the Web will be gargantuan. These huge amounts of information create a lot of opportunities for many industry players, and no company will miss them if they mean additional revenue sources. Naturally, we are going to see the rise of new Internet giants and the fall of certain present-day colossi.

Whether cloud services will redefine the software market and companies like Microsoft, Adobe or Apple will lose their clients, only time will tell. What is clear already is that Microsoft does not have enough platforms to power the emerging Internet-connected devices. As a result, by the end of the decade the role that Microsoft has been playing for thirty years now will be much smaller.


Asus Waveface conceptual foldable laptop

In the next 10 to 15 years the importance of security will be much higher than it is today. Security will no longer be a must only for the personal computer; large cloud data centers will have to implement ubiquitous security technologies to protect user data not only from the outside, but also from the inside, as no one should be able to keep track of people.

The hardware and the hardware business will continue their natural evolution driven more than ever by demands of the market. We should naturally expect further convergence between various devices and technologies, but we naturally cannot predict any revolutions that may be just around the corner...