Dear forum members,
We are delighted to inform you that our forums are back online. Every single topic and post is now back in its place, and everything works just as before, only better. Welcome back!


Discussion on Article:
Every Watt Counts: AMD E-350 vs. Intel Core i3-2100T

Started by: jonup | Date 04/16/11 04:36:39 PM
Comments: 72 | Last Comment:  12/21/15 11:52:47 AM


You guys keep outdoing yourselves! Well done!
Of course we, the readers, can't get enough. How about some comparisons with the Sempron 145 and the Athlon 610e in the ZOTAC 880GITX-A-E (the AMD equivalent of the Zotac H67-ITX WiFi)? While the Sempron won't be able to match the Sandy Bridge, it is only $45. Meanwhile, the Athlon 610e is very similarly priced (~$140) and clocked compared to the SNB 2100T, and it is only slightly more power-hungry at 45W.
BTW is your Zotac board H67ITX-A-E or H67ITX-C-E? What is the difference between the two?
1 0 [Posted by: jonup  | Date: 04/16/11 04:36:39 PM]

BTW is your Zotac board H67ITX-A-E or H67ITX-C-E? What is the difference between the two?

As far as I know, these mainboards are identical, but the H67ITX-C-E is the newer version based on the B3 stepping of the chipset, while the H67ITX-A-E is the earlier, discontinued board.
0 0 [Posted by: Gavric  | Date: 04/16/11 06:12:44 PM]
1 - The TDP of the Athlon 610e doesn't even include the IGP and PCI-E lanes, unlike the i3-2100T's.

2 - How do you know that it is "slightly" more power-hungry? Have you seen any review pitting the Athlon 610e against the 2100T?

0 0 [Posted by: maroon1  | Date: 04/20/11 07:40:52 AM]
Yeah! This review is flawed in so many ways! There should at least be a price-to-performance chart! Bad review!
1 0 [Posted by: PFX  | Date: 04/26/11 07:12:29 AM]

Isn't it HD Graphics 2000, not 1000? There are only two variants as far as naming goes.
0 0 [Posted by: DavidC1  | Date: 04/16/11 06:16:47 PM]

It is HD Graphics 2000 with lowered frequencies. Some people have called it HD Graphics 1000 for convenience.
0 0 [Posted by: Gavric  | Date: 04/16/11 06:20:39 PM]
I've found that it is called HD Graphics 1000.
0 0 [Posted by: DavidC1  | Date: 04/17/11 10:59:01 AM]

Another example of XBit Labs calling them as they see them.

After all the pre-release hype about Fusion/Brazos, it was disappointing to find out that the CPU part of Zacate is so weak.

Zacate is not worth it for HTPC: It's not a good plan to build a new system that is barely at the lower limit of acceptable performance.

From this comparative review, I have to conclude that the only safe use for Zacate is in low-end 10.1 inch netbooks. The problem is that many manufacturers are using the E-350 in $500 to $600 12 to 14 inch laptops and are using the even weaker 1.0 GHz C-50 for their 10.1 inch netbooks.
0 0 [Posted by: BernardP  | Date: 04/16/11 07:36:09 PM]

This article is misleading with respect to power efficiency.

Readers first scan the bar plots, and only the really interested ones read the text.

You compare the i3-2100T to an E-350 on a highly power-inefficient Gigabyte mainboard. The MSI board you tested (yourself!) is far more power efficient.

Power consumption (W):

Platform            Idle   CPU burn   GPU burn   CPU+GPU burn
E-350 (MSI)          7.3     15.8       17.5        22.1
E-350 (Gigabyte)    12.8     23.9       27.7        31.2
i3 2100              9.7     33.6       22.8        38.9

In conclusion: the MSI + E-350 has the lowest power consumption of the three tested platforms in their tested configurations. (It actually also beats the D525 + Ion combo.)

Why didn't you include the MSI data as well? Although you mention that the MSI is more power efficient, you still conclude that "the Fusion power efficiency is just a marketing hype."

You can conclude that Gigabyte has launched a very inefficient E-350 board, that power-wise the MSI is a far better choice, and that the i3 2100 has a better performance-per-watt ratio. However, at idle the E-350 is still the better choice (given the tested configurations!).

And you title the article "Every Watt Counts"...
Quite embarrassing, to be honest.
2 0 [Posted by: roodkapje  | Date: 04/17/11 12:51:00 AM]
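[Editor's note: roodkapje's idle-versus-load point can be made concrete with a quick back-of-the-envelope sketch. This is a hypothetical model, not from the article, using the power figures quoted in the comment above: over a fixed time window, the slower E-350 works the whole time while the i3 finishes early and idles, so there is a break-even speedup below which the E-350 draws less total energy.]

```python
# Hypothetical sketch using the figures quoted in the comment above (watts):
E350_MSI_LOAD = 15.8   # E-350 on the MSI board, CPU burn
I3_LOAD = 33.6         # i3, CPU burn
I3_IDLE = 9.7          # i3, idle

def breakeven_speedup(slow_load_w, fast_load_w, fast_idle_w):
    """Speedup at which the faster chip's total energy over a fixed
    window equals the slower chip's.

    Solve t*slow_load == (t/s)*fast_load + (t - t/s)*fast_idle for s.
    """
    return (fast_load_w - fast_idle_w) / (slow_load_w - fast_idle_w)

s = breakeven_speedup(E350_MSI_LOAD, I3_LOAD, I3_IDLE)
print(f"i3 breaks even at a {s:.1f}x speedup")  # about 3.9x
```

By this rough model the i3 must finish a task roughly 3.9x faster than the MSI-based E-350 before it wins on total energy; against the less efficient Gigabyte board the break-even speedup drops to about 1.7x, which the review's CPU results easily exceed.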

Indeed! Omitting the MSI board's data is inexplicable to me. How do you justify it? Of course you could also raise doubts about the Zotac board's power efficiency, but those doubts should be cleared up by adding another board to the comparison, not by picking the worst of the bunch from the AMD side.
0 0 [Posted by: paolo.scanferla  | Date: 04/18/11 05:21:49 AM]
Xbitlabs used a Zotac motherboard for the i3-2100T, and it is not efficient in power consumption either.

1 0 [Posted by: maroon1  | Date: 04/20/11 07:09:34 AM]

Why are you using an 880W PSU for testing low-power equipment? Power consumption comparisons are absolutely impossible at 1% utilization! PSUs are typically very inefficient at low utilization, and 1% is ridiculously low. Why don't you use a <100W unit, or even less, for these reviews?
0 0 [Posted by: Superdude  | Date: 04/17/11 01:05:01 AM]

We measured the power consumption after the PSU, not before it. That is why PSU efficiency does not matter at all here.
0 0 [Posted by: Gavric  | Date: 04/17/11 11:04:53 AM]


Thank you so much for this article and your time-consuming labor. I read it early this morning with constantly growing interest, because I have recently bought a Zacate-Bobcat/HD6310 motherboard for a specialized application, and also because I have just recently finished the theoretical design of my new movie server, for which I have chosen the Core i3-2100T.

By the way, I chose the In-Win BQ656.AD80TBL chassis/PSU for my specialized application because of its small footprint, volume and noise characteristics. The PSU was rated at just 80 Watts. The In-Win AD80TBL PSU did not carry an 80PLUS certification, but after calling In-Win and talking over my requirements with them I decided to give their chassis/PSU a try. I am very happy with the result. I have never seen such a beautiful chassis design before, and their production detail was immaculate. Like you, I was not very impressed with the Fusion-Brazos-Zacate-Bobcat/HD6310's performance. I did need somewhat of an IGP, but only enough performance to run a VNC server. I had already spent about $400 ($100 of that was for the blessed MS GFC-00599 - what a ripoff) on the system and didn't want to dedicate a monitor, keyboard and mouse to the project. In my application the PC can never be allowed to sleep or hibernate, so the PC burns about 35 Watts continuously at the wall, according to my WattsUp? Pro. I thought that burn level put it right in the PSU's efficiency sweet spot.

I just want to comment on one thing that I thought was missing in your discussion of the results: the effect of semiconductor lithography and process on the overall performance and burn data. Of course there are a few other minor details that could come into play below 10 Watts of system burn, such as a power-sucking NEC USB 3 chip, and so on. But the fact is that you compared a PC built on AMD's 40nm TSMC bulk process, no less, to an Intel PC based on their year-old 32nm HKMG process. My guess is that if AMD had access to an Intel manufacturing line, the results would be very much more even. I'd even bet that the burn numbers might turn in AMD's favor. I won't make a guess on the performance shift, but I'm satisfied that AMD would benefit dramatically. I'm not defending AMD, but how the hell can they be expected to compete with Intel's always-ahead litho and processing? It seems to me that the famous IBM SOI processing can't even be implemented by all of the Arab oil money in the world. I feel sorry for AMD, having to get hammered while TSMC and GloFo flounder in their own feces. I guess that we may soon see a brief interval in time where AMD finally draws abreast of Intel in litho and process; then a few months later Intel will jump a whole step ahead again with their 22nm HKMG process, while GloFo struggles down to 28nm and TSMC does God only knows what. It's a sad situation, it seems to me. Maybe AMD should consider hooking up with Samsung. They seem to be cooking the SOI better than anyone else.

For my movie server application I have tentatively chosen the Core i3-2100T, for some of the same reasons you chose it for your article. In my case I wanted the maximum compute/watt and compute/$. AMD just can't compete from the litho hole they are in. This PC will also need only a minimal IGP, since it too will only run a VNC server. I will point out that there is a better choice for the ridiculously rich among us: the Core i5-2390T (CM8062301002115). It seems to be a slightly faster 35 Watt version of the Core i3-2100T, but with a 3.5GHz Turbo clock enabled. It is spec'ed to have an IGP identical to the Core i3-2100T's. The problem with choosing the 2390T is that it is OEM-priced at $335 US and carries a 30-day warranty. How 'bout them apples. I can't handle it. I sure wish Intel would release the 2390T as a retail product, though.

I will need a heftier PSU for the movie server PC because in addition to the CPU/IGP it will have to power one of those hotrod Areca hardware RAID 5 controllers which could push the PC burn up over 75 watts when everything is active. I have selected and already purchased an In-Win BM639.AD160TSL chassis/PSU for the server. It is small, but not as small as the BQ656.AD80TBL because it includes a 160 Watt PSU and a small chassis fan. Once again, the design and workmanship are something to behold. The AD160TSL PSU is not 80PLUS certified, but there is a 180 Watt unit that is.

What I would give for an 80PLUS Gold, fully modular 200 Watt PSU.

Ilya, thanks again for your very interesting article.

0 0 [Posted by: Orville  | Date: 04/17/11 05:39:11 AM]

One of the worst reviews on xbitlabs.
2 2 [Posted by: bereft  | Date: 04/17/11 07:33:25 AM]

Please explain
0 0 [Posted by: maroon1  | Date: 04/20/11 07:49:11 AM]
Probably referring to the fact that when you write an "every watt counts" review, you don't pick a motherboard as power-inefficient as Gigabyte's E-350 board when you have already reviewed the much better MSI board yourself and know the Gigabyte doesn't give a fair picture of the E-350 platform.
0 0 [Posted by: Kaotik  | Date: 05/05/11 02:43:25 PM]

Wow. Impressive. Intel's 35W TDP CPU has lower idle power consumption than AMD's 17W. Think about Intel's Mobile Sandy Bridge platform with 17W TDP. AMD's Brazos platform is way too over-hyped.
0 0 [Posted by: lukesky  | Date: 04/17/11 08:48:51 AM]

Read the text: with a better mainboard the Brazos was about 25% more efficient.

In fact, you can subtract 6 watts from every Brazos result, giving it a clear lead in EVERY power-usage test.
0 1 [Posted by: Countess  | Date: 04/18/11 03:06:39 PM]
So?

There are also more efficient motherboards than the Zotac, which was used with the Intel i3-2100T.

0 0 [Posted by: maroon1  | Date: 04/20/11 07:15:08 AM]
So why didn't they use the most power-efficient boards from both sides, if "every watt counts"?

Since AMD's claim that Zacate is on par with Pentium or Celeron has already turned out to be very untrue, a direct head-to-head with an i3 is quite ridiculous (which explains the huge price difference). Brazos competes with Intel's Atom, so to justify this duel an Atom/Ion combo should have been included in THIS review. The conclusion would then be: Intel Atom and AMD Zacate are slower than an Intel Core i3. Somehow this does not come as a surprise to me...
And as the new Sandy Bridge i3 is manufactured at 32nm, it had better be more efficient than Atom (45nm) and Zacate (40nm); otherwise Intel would have some trouble with their process... Efficiency is not caused by architecture alone; it is also due to the manufacturing process.
0 0 [Posted by: tmold  | Date: 04/20/11 01:15:30 PM]
@tmold You can't include the Atom in this review because it is not an APU, which disqualifies it from comparison with any Sandy Bridge or Zacate cores.
0 1 [Posted by: veli05  | Date: 05/09/11 08:47:05 AM]
@veli05 That is true. Unfortunately, at the moment there does not exist any combination of architecture (CPU, APU...) and manufacturing technology between Intel and AMD that would justify a fair comparison. So at least to show the whole comparable spectrum from Intel, the Atom should be included. You don't need a test to find out that the current i3 is way faster than a CPU based on the Bobcat architecture; Bobcat was designed to compete against Atom. The same goes the other way round: the Intel HD 3000 is slower than the GPU part of Zacate... And in terms of efficiency you simply cannot win with a 40nm design against a 32nm design. Next-gen Zacate at 32nm vs. the upcoming Cedar Trail Atom (hopefully also 32nm) - that would be a fair comparison. If you can put manufacturing aside, you have a better chance of finding out system efficiencies. The end user will not care about this in the end, but it would highlight the true design capabilities.
0 0 [Posted by: tmold  | Date: 06/07/11 03:28:59 AM]
@maroon1: so a board with the 2100T uses 100 watts in Prime95?

That kinda puts it out of the running for this test.
0 1 [Posted by: Countess  | Date: 04/21/11 03:19:10 PM]

OK - so it's an Atom or half a Celeron with great graphics, an 18W TDP, plus ~4W for the SB (southbridge).

Who would use it for the jobs you benchmarked it on?

People didn't mind the Atom; they minded it grinding to a halt on graphics.

For those 24/7 servers (HTPCs/proxies), grannies, thin clients using the cloud, or backends... it's nirvana.

Cheap as chips, ports galore, fanless (CPU & PSU), DX11 (OpenCL), USB 3...

I hear ~$150 US early-adopter prices for the CPU, discrete-level graphics and a good mobo. Not bad.

If you are such power users, why Sandy Bridge? Man up and get a discrete GPU, a proper CPU and a loud fan.

SB will still disappoint on graphics, same as Atom. Intel is years behind.

You admit that with another mobo, it wins on idle power. Game, set and match for these apps.

This could be a watershed. Intel is a wave ahead of AMD on Moore's law, but progress in x86 has created a solution looking for a problem.

AMD, however, is solving real problems. Graphics is almost the sole app that makes modern PCs grunt for most users.

Many would opt for a CPU downgrade (half the Celeron above) if it came with a GPU upgrade and was frugal and quiet.

Here's my call. AMD's next move (reliably rumored) is to shrink current discrete GPUs to 28nm (TSMC, methinks; they now use 40nm, same as Brazos). When the process is proven, Brazos goes to 28nm. Woo - big jump = kick-ass product - a 30% shrink. Could happen in a year.

By then, Brazos would have the next GPU up the line cut and pasted into the APU design, and maybe a 30% lower TDP - or better, if 28nm is a better process as well. That's ~5W off for the E-350 = 13W, or 6W for the single-core Brazos. Both include powerful graphics and most of the southbridge in the TDP.

If OpenCL works out, even the single-core Brazos could be a winner in some apps.

If you look at a die shot of Brazos, it's 2/3 GPU. Clearly this reflects AMD's priorities.

Both relative costs and the critical graphics potential are skimmed over in the review.

Yeah, I think we know costly SB wins on x86 synthetics, just as we know bananas differ from oranges, but the puny, frugal, cheap Brazos, in its first iteration, kicks Intel's ass in what people care about - graphics.

Are there any factory-fanless i3s? Just curious.

Just sayin' - if the overall logic of the story is so flawed, can you trust the rest of it?

Llano vs. i3, MAYBE (but still crap Intel graphics) - but BRAZOS? Chalk and cheese, price-wise and in most other ways.

The tests were biased anyway, and the only fair comparison is Atom vs. Brazos. Anecdotally, the user experience is way better with Brazos. An Atom with an Nvidia IGP subsystem may come close, but at extra cost and power draw.
0 1 [Posted by: msroadkill612  | Date: 04/17/11 10:51:42 AM]

The Intel bias is too much... Why didn't you pick a motherboard that doesn't chew 6 watts by itself at idle? You've already reviewed one on your own site!

Every watt counts... ummm... great title, if only your review had actually taken it to heart...

Drivel of a review... How far you have fallen!
0 2 [Posted by: sanity  | Date: 04/17/11 06:16:03 PM]

This looks like a commercial for INTEL. And look at the image quality on page 10 - no wonder the SB processor is so much faster.
0 0 [Posted by: bbo320  | Date: 04/18/11 01:37:08 AM]

In other news, a $900 intel extreme edition cpu trounces the hell out of my 69 cent pocket calculator chip. Idiots...
0 1 [Posted by: shadowmaster625  | Date: 04/18/11 05:47:41 AM]

This review made possible by Intel benchmarketing $$$.
Just because Intel's Atom is RIP. xD
0 1 [Posted by: wuttz  | Date: 04/18/11 06:20:26 AM]

Good review. It just shows how slow the CPU cores in Bobcat are. But if I take a look at the Bobcat die shot, the GPU part is massive. So no wonder.
Those Bobcat CPU cores alone would not eat even 1/3 of the whole Bobcat Fusion's power.

It's just not balanced. It has a great GPU (and that's just 80 SP cores vs. Intel's "super" EU cores) but a weak CPU.
0 0 [Posted by: Zool  | Date: 04/18/11 06:57:07 AM]

It's also half the die size of the i3, and therefore much cheaper (which has been completely ignored in this review), and it was designed to beat the Atom, which it does thoroughly.

And reviewing a platform on power usage alone, while using a board for the AMD side that you KNOW is far more energy-hungry than others you have tested before, is completely irresponsible and misleading.
1 1 [Posted by: Countess  | Date: 04/18/11 03:15:16 PM]
On the power usage: if the i3 opens a RAR archive in 100s and Bobcat does it in 427s, then the Intel CPU sits idle for the remaining 327s.
Intel i3: (100 × 33.6 + 327 × 9.7) ≈ 6,532 watt-seconds
AMD Bobcat: 427 × 23.9 ≈ 10,205 watt-seconds
If you count a little, the i3 is quite impressive.

Or rather, Bobcat's CPU performance is weak.
0 0 [Posted by: Zool  | Date: 04/19/11 05:50:03 AM]
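[Editor's note: Zool's arithmetic checks out once the unit is read as watt-seconds, i.e. joules. A minimal sketch of the same fixed-window energy model, using the Gigabyte-board figures quoted in this thread:]

```python
def energy_joules(task_seconds, load_watts, idle_watts, window_seconds):
    """Energy drawn over a fixed window: full load while working, idle after."""
    idle_seconds = window_seconds - task_seconds
    return task_seconds * load_watts + idle_seconds * idle_watts

WINDOW = 427  # seconds: the time the slower system needs for the RAR task

# Core i3-2100T: 100 s at 33.6 W, then 327 s idling at 9.7 W
i3 = energy_joules(100, 33.6, 9.7, WINDOW)
# E-350 (Gigabyte board): busy at 23.9 W for the whole window
e350 = energy_joules(427, 23.9, 12.8, WINDOW)

print(f"i3: {i3:.0f} J, E-350: {e350:.0f} J")  # i3: 6532 J, E-350: 10205 J
```

Note that this model uses the Gigabyte board's figures; with the MSI board's lower numbers (15.8 W load) the gap narrows considerably.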
Not really - it costs 3 times as much, and that's in a 40nm vs. 32nm scenario. A 28nm Bobcat isn't far off.
1 1 [Posted by: Countess  | Date: 04/21/11 03:22:40 PM]
The E-350's GPU core isn't much more powerful than Intel's. The Radeon HD 5450 performs around or slightly below the HD Graphics 2000 level in many games (though in others the HD 2000 loses by quite a bit). The E-350's GPU is clocked lower than the 5450 and doesn't have dedicated RAM. While the GPU in the i3-2100T is also clocked lower than the HD 2000, I think the gaming results would be a toss-up even if the E-350 had much faster CPU cores.
0 0 [Posted by: ET3D  | Date: 04/24/11 06:47:19 AM]

Without a price comparison, this review is not that useful. What would the Intel setup cost vs. the Brazos setup?

For an HTPC, I bet the Intel setup is overkill, but won't be sure until I see the price.

Sure, I can check for myself at Newegg or other sites, but shouldn't performance/dollar be part of any serious review?

EDIT: OK, I did the math at Newegg:

ZOTAC H67ITX-C-E + Intel Core i3-2100T: USD 295

Now if you're building an HTPC and choose the Intel route, you're probably an idiot. See how cost puts everything in a new light?
0 0 [Posted by: Aleve Sicofante  | Date: 04/18/11 09:44:02 AM]

That's why my recommendation of the Athlon 610e in the ZOTAC 880GITX-A-E makes a lot more sense as a comparison. This combo will set you back $270-275, should perform similarly to or better than the 2100T, and should stay within 10W of the Core i3 in just about any scenario.
0 0 [Posted by: jonup  | Date: 04/18/11 05:34:50 PM]

I'd really love to know what kind of WiFi card is on the zotac board. I'd also like to know how easy or difficult it is to replace aforementioned card.
0 0 [Posted by: hsew  | Date: 04/18/11 02:12:05 PM]

Where is the image quality comparison for the transcoding?

It's fine to be faster, but if the quality is unusable, that's a moot point. And we have already seen in the past that results can vary wildly on that point.
2 1 [Posted by: Countess  | Date: 04/18/11 03:34:17 PM]

So this is another stupid review with a tiny bit of Intel bias? No surprise. Even in your own reviews the results vary with the more efficient MSI Fusion board.

To the reviewer:
How much did Intel pay you, or pay the website? Because this is absolutely horrible.

Maybe Intel should open its own "intellabs" website. At least you would know what to expect.
1 2 [Posted by: Nintendork  | Date: 04/18/11 03:58:54 PM]

It is embarrassing to say such things about a company like AMD. Personally, I think we (end customers) need more and more from technology, and so we must not have this kind of bias against these companies...
By the way, imagine if AMD did not exist; we might still have our Pentiums/Celerons at three-digit clock speeds with a single core... Who knows?
AMD created competition for Intel, and that was good.
AMD is still more than good and very respectable.
1 1 [Posted by: Pouria  | Date: 04/19/11 09:59:25 AM]

For all those who said that xbitlabs should have used the MSI board instead of the Gigabyte because it is more efficient in power consumption: I want to inform you that the Zotac H67-ITX is not efficient compared to other motherboards either.

In other words, xbitlabs was also using an inefficient motherboard for their Intel configuration. Let us not ignore this.

0 0 [Posted by: maroon1  | Date: 04/20/11 07:28:56 AM]
