Articles: CPU


Pages: [ 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 | 11 | 12 | 13 ]

Today, when most operating systems have become multitasking, adding a second processor increases system performance quite noticeably. Besides, support for Hyper-Threading technology in Intel processors has stimulated the arrival of new, higher-quality software that takes advantage of two parallel computational threads working simultaneously. So we can state that the market is already well prepared for the next significant step: multi-core processor architectures.

However, before we begin our story of the dual-core processors due out next year, I would like to say a few words about the actual benefits dual-core solutions bring to processor developers and to the market as a whole.

First of all, the introduction of dual-core technology is another efficient way to increase processor performance. Since a CPU's peak performance is its working clock frequency multiplied by the number of instructions it can process per clock cycle, a dual-core architecture should double this figure, because adding another core automatically doubles the number of execution units in the CPU. I have to specifically stress here that to achieve maximum performance, all the execution units available in both processor cores must be kept busy. However, that is primarily a concern for software developers, not CPU designers. It is therefore evident that dual-core processor architectures will deliver an actual performance increase only with proper support on the software side. Besides, the traditional way of increasing CPU performance by raising its working clock frequency has now run into pretty serious obstacles, caused by technological problems in the first place.
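The peak-throughput arithmetic above can be sketched as a small function. The clock frequency and instructions-per-clock figures below are purely illustrative, not taken from any specific CPU:

```python
def peak_throughput(clock_hz: float, instructions_per_clock: int, cores: int = 1) -> float:
    """Theoretical peak instruction throughput in instructions per second.

    This is only an upper bound: as the article notes, real gains depend
    on software keeping every execution unit in every core busy.
    """
    return clock_hz * instructions_per_clock * cores


# Hypothetical 3 GHz core that can retire 3 instructions per clock:
single = peak_throughput(3.0e9, 3)
dual = peak_throughput(3.0e9, 3, cores=2)

print(dual / single)  # -> 2.0: adding a core doubles the theoretical ceiling
```

The ratio is exactly 2.0 by construction; actual dual-core speedups fall short of it whenever the workload is not fully parallel.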

Secondly, dual-core architecture should also broaden overall processor functionality, which is a very important factor, especially for a company's marketing strategy. The introduction of dual-core architectures and the launch of Microsoft's next-generation Longhorn operating system should stimulate the development of new virtualization technologies. Both AMD and Intel believe these technologies will become a major distinguishing feature of next-generation computer systems, and are therefore very important. According to independent analysts, they will be actively used in PCs in 2006-2007 and will take their capabilities to a completely new level.

The various virtualization technologies currently known under codenames such as Intel Vanderpool, Intel Silvervale and AMD Pacifica are all based on the same principles: they are being developed to allow several virtual systems to be emulated within a single physical computer. In other words, these technologies should let users run more than one operating system on their PC, with each OS busy solving its own specific tasks. The difficult part of implementing this idea is that every operating system runs in "ring 0", the most privileged protection level, i.e. it interacts directly with the system hardware. However, the new virtualization technologies I am talking about should make it possible for several operating systems to share the hardware resources of a single physical machine without any performance limitations.
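These codenamed technologies eventually shipped as hardware extensions that a CPU advertises in its feature flags: Vanderpool as Intel VT-x (the "vmx" flag) and Pacifica as AMD-V (the "svm" flag), both visible on Linux in the flags line of /proc/cpuinfo. As a sketch, one could check such a flags line like this (the sample flag strings below are illustrative, not from any specific CPU):

```python
from typing import Optional


def virtualization_support(cpuinfo_flags: str) -> Optional[str]:
    """Return which hardware virtualization extension a cpuinfo-style
    flags line advertises: "vmx" (Intel VT-x, the shipped form of
    Vanderpool), "svm" (AMD-V, the shipped form of Pacifica), or None.
    """
    flags = set(cpuinfo_flags.split())
    if "vmx" in flags:
        return "vmx"
    if "svm" in flags:
        return "svm"
    return None


# Illustrative flag strings:
print(virtualization_support("fpu vme de pse tsc msr vmx sse2"))  # -> vmx
print(virtualization_support("fpu vme de pse tsc msr sse2"))      # -> None
```

On a real system, the flags string would come from reading /proc/cpuinfo; the parsing itself is the same.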

Vanderpool technology

As you remember, at last year's IDF Intel already demonstrated a PC with Vanderpool technology in action. Two virtual systems were running on that PC, each with its own operating system. The presenters changed OS settings and installed drivers on one virtual computer, while the other worked as an entertainment center playing back an animated movie. During the playback they rebooted the first virtual system without affecting the movie on the second one at all. Both tasks were performed simultaneously and didn't interfere with one another. This way, Vanderpool should support the so-called "division of labor" on upcoming computer systems designed according to Intel's digital home concept.


