

Intel Corp. will boost clock-speeds of its next-generation Atom "Cedarview" system-on-chip devices for desktops in a bid to increase their performance compared to existing offerings. With boosted frequencies, improved video playback and graphics performance, the new Intel Atom D2000-series will be more competitive against Advanced Micro Devices' latest Fusion chips for netbooks and nettops.

In the fourth quarter of 2011, Intel will release two new-generation Atom microprocessors code-named Cedarview-D: the D2700 (2.13GHz, two cores, Hyper-Threading, 1MB cache, 10W TDP) and the D2500 (1.86GHz, two cores, 1MB cache, 10W TDP), according to a source familiar with Intel's plans. The chips will bring a number of significant improvements and will support the 64-bit instruction set.

The Intel Cedarview system-on-chip with the new Atom core will feature a DirectX 10.1-capable graphics engine with an integrated high-definition video decoder (to enable Blu-ray disc playback on all Atom-based systems), support higher clock-speeds, sport an improved DDR3 memory controller and offer digital display interfaces. The Intel document also claims that the new chip will consume less power, perhaps because it will be made using a 32nm fabrication process.

The new Cedarview processor will continue to utilize the NM10 input/output controller, which should make it easier for manufacturers to transition to the new Cedar Trail platform.

Thanks to the integration of a high-definition video decoder into the new Atom SoC, all systems powered by the new chip will be able to play back Blu-ray video. Unfortunately, since the new Atom SoC carries an outdated DirectX 10.1-class graphics core, it will be unable to use it for general-purpose computing. As a result, even the forthcoming platform for ULCPCs from Intel will not be able to match AMD's Brazos in many respects.

Intel did not comment on the news story.

Tags: Intel, Cedarview, Cedar Trail, 32nm


Comments currently: 5
Discussion started: 05/10/11 08:50:55 AM
Latest comment: 05/11/11 02:54:45 PM


I'm having trouble figuring out why desktops need an Atom CPU. Is a 2+GHz dual-core Hyper-Threading Atom CPU that much cheaper to manufacture than an SU Core i3? Surely the two will converge at some point...
0 0 [Posted by: lh3nry  | Date: 05/10/11 08:50:56 AM]

of course, when talking about "Atom on the desktop", it really means Atom in a nettop.

yes, they are super cheap.

the SU series of chips (also called the "U" series) is MUCH more powerful than the atoms.

performance wise, the Atom cpus are not even in the same league as the U/SU cpus.
0 0 [Posted by: glen m  | Date: 05/10/11 10:33:06 AM]
...Intel will make sure it won't.

Current 45nm desktop Atoms measure between 66mm^2 (D4xx) and 87mm^2 (D5xx).
The Sandy Bridge 32nm i3 measures 131mm^2. So from one wafer, you will get about 98% (D4xx) or 51% (D5xx) more processors. The cost per wafer is probably slightly higher for 32nm. Also, yields are better the smaller the die is. While Intel (and others) charge more for higher frequencies, the design is what ultimately dictates the price.
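The die-count comparison above can be sketched in a few lines. This is a rough upper bound, assuming a standard 300mm wafer and ignoring rectangular die packing, edge losses and defects; the die areas are the ones quoted above:

```python
import math

# Assumption: a standard 300mm wafer with perfect area utilization
# (no packing loss, edge loss or defects) -- an upper bound, not a
# real yield model.
WAFER_DIAMETER_MM = 300
wafer_area = math.pi * (WAFER_DIAMETER_MM / 2) ** 2  # mm^2

die_areas = {
    "Atom D4xx": 66,   # mm^2, figures from the comment above
    "Atom D5xx": 87,
    "Core i3":   131,
}

# Naive dies-per-wafer estimate: wafer area divided by die area.
counts = {name: int(wafer_area // area) for name, area in die_areas.items()}
i3 = counts["Core i3"]
for name, n in counts.items():
    print(f"{name}: ~{n} dies/wafer ({100 * (n - i3) / i3:+.0f}% vs i3)")
```

The ratios, not the absolute counts, are the meaningful output here, since real packing and defect losses shrink every number by a similar factor.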

Besides, they consider that even the i3 is overkill for surfing and watching HD movies (since HD decoding tends to go mostly to the integrated GPU).

So they probably want to keep the same die, but moving to 32nm will offer a better GPU and CPU design, with probably the same cost.

PS: if it supports AES-NI, I will strongly consider it to replace my file-server-router.
2 0 [Posted by: mathew7  | Date: 05/10/11 11:51:34 PM]
I totally agree, add AES-NI and you have a perfect processor for home servers.

Unfortunately, I doubt this will happen soon. Currently, AES-NI is found in higher-priced chips only, and I think it will take AMD adding it to low-cost chips first.
My guess is that we will need to wait a year or so for that to happen.
0 0 [Posted by: tty56  | Date: 05/11/11 05:01:52 AM]
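On Linux, whether a CPU exposes AES-NI can be checked by looking for the "aes" flag in /proc/cpuinfo. A minimal sketch of that check (the sample flags line below is hypothetical):

```python
def has_aes_ni(cpuinfo_text: str) -> bool:
    """Return True if a 'flags' line in /proc/cpuinfo-style text lists 'aes'."""
    for line in cpuinfo_text.splitlines():
        # /proc/cpuinfo lines look like "key<tabs>: value value value"
        if line.split(":")[0].strip() == "flags":
            return "aes" in line.split(":", 1)[1].split()
    return False

# Hypothetical sample; on a real box, pass open("/proc/cpuinfo").read()
sample = "flags\t\t: fpu vme de pse aes sse2 ssse3"
print(has_aes_ni(sample))
```

The same flag can be checked from a shell with `grep aes /proc/cpuinfo`.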

I'm just wondering why these chips are said to have "Substantially Higher Frequencies". Compared to what? The 1.80GHz D525? The 1.83GHz N470? The 2.13GHz Z560?

(Granted, the only dual core chip of the above is the D525, but I don't consider 1.83GHz or even 2.13GHz "substantially higher" compared to 1.80GHz.)
0 0 [Posted by: ET3D  | Date: 05/11/11 02:54:45 PM]


