
Discussion

Discussion on Article:
Contemporary APUs: AMD Trinity vs. Intel Ivy Bridge

Started by: beenthere | Date 03/31/13 04:45:09 PM
Comments: 34 | Last Comment:  05/23/13 11:41:19 AM



1. 
3 7 [Posted by: beenthere  | Date: 03/31/13 04:45:09 PM]

 
AMD Trinity burns around 25 percent more power than Intel Ivy Bridge, so how can you call that a slightly lower power configuration? That holds even for the Celeron G1620 vs. A4 Trinity, in both single-thread and full CPU+GPU load.

It sounds like Ilya Gavrichenkov is lying in his article when he says "While 22nm could offer slightly lower power consumption".

If someone is going to buy AMD Trinity, they had better not buy discrete graphics, since that is the whole point of AMD Trinity: low cost with better video capability than Intel.

3 2 [Posted by: kailrusha  | Date: 03/31/13 05:16:50 PM]
 
What about the A10-5700? Why look at the least power-efficient models? Looking at the performance of the CPU and GPU, the A10-5700 is the sweet spot hands down, not the A10-5800K series. The A10-5700 mops the floor with the i3-3225 in games while drawing just 12W more (about 11% higher) -- 105W vs. 117W.

Since AMD is at least one full node behind, they are not going to have a product that's competitive on both the CPU and GPU side. That's just not how physics works. You can continue to expect Intel to have an APU with a faster CPU, and AMD to naturally offer a faster GPU solution but a weaker CPU. A sacrifice has to be made. Right now the GPUs inside AMD's APUs are really too weak to play modern games, which is why most people choose an Intel i3 over the A10 series. We'll see what happens once AMD brings out Steamroller cores and adds a lot more power to its APUs in the next 2-3 years.
3 1 [Posted by: BestJinjo  | Date: 04/01/13 10:46:06 PM]
 
If someone buys an i3-3225, he should disable the on-chip graphics and replace it with an ATI HD 7750, which burns only around 50 watts.

i3-3225 CPU only = 79 watts
ATI HD 7750 = 43 watts
ATI HD 7770 = 89 watts

http://www.guru3d.com/art...50_and_7770_review,7.html
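
Summing those quoted figures as a rough sanity check (these are the review's numbers, not new measurements, so treat the totals as ballpark):

```python
# Rough sum of the per-component figures quoted above
# (taken from the linked guru3d review, not my own measurements).
i3_3225_cpu_w = 79   # i3-3225, CPU load
hd7750_w = 43        # HD 7750 card alone
hd7770_w = 89        # HD 7770 card alone

print(f"i3-3225 + HD 7750: ~{i3_3225_cpu_w + hd7750_w} W")  # ~122 W
print(f"i3-3225 + HD 7770: ~{i3_3225_cpu_w + hd7770_w} W")  # ~168 W
```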
1 2 [Posted by: kailrusha  | Date: 04/02/13 02:59:26 AM]
 
That comparison makes no sense whatsoever, since the price of buying an i3-3220/3225 + HD 7750 is way more than the price of the A10-5700. Also, people buying a $130 APU are not going to be cross-shopping that setup against a $220-230 setup.

i3 3220 = $130
i3 3225 = $145
HD7750 = $90-100

i3 3220 + HD7750 = $220 minimum
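
Putting rough numbers on that cost gap (a quick tally using only the prices quoted above; street prices move around, so treat it as ballpark):

```python
# Quick tally of the street prices quoted above (prices fluctuate,
# so this is only a ballpark comparison, not a definitive cost analysis).
i3_3220 = 130   # USD
hd7750 = 90     # USD, low end of the $90-100 range
a10_5700 = 130  # USD

combo = i3_3220 + hd7750
premium = combo - a10_5700
print(f"i3-3220 + HD 7750: ${combo}")                                         # $220
print(f"Premium over A10-5700: ${premium} (~{premium / a10_5700:.0%} more)")  # $90, ~69% more
```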

You cannot start comparing an i3 + discrete GPU to an A10 and ignore the cost factor. No one in the world is cross-shopping a $220 CPU+GPU combo against a $130 A10-5700, because the A10 and i3 target budget systems where cost is a big factor.

If you can afford to purchase an i3 + a discrete GPU worth $100, then of course none of AMD's APUs makes sense. But that's not the market APUs serve at all.

Again, you seem to be confused about what target market the APUs serve. It is evident from this statement you made:

"Its sounds that Ilya Gavrichenkov is lying on his article, if you said "While 22nm could offer slightly lower power consumption""

He is not lying. The power consumption differential between the similarly priced i3-3220 and A10-5700 is only 12W, and yet the A10-5700 mops the floor with it in gaming and GPU-related tasks. Therefore, someone who is interested in a budget gaming APU is better off with an A10-5700.
3 0 [Posted by: BestJinjo  | Date: 04/04/13 06:26:26 AM]
 
I don't agree that we should expect AMD to continue having a slower CPU. Now that Jim Keller is back at AMD, I think we're going to see higher-IPC AMD x86 cores.
1 1 [Posted by: anubis44  | Date: 04/11/13 06:46:16 AM]
 
How do you count these 98% of the market?
Do you think gamers are 98%?
I still think 98% of the processes a user runs are CPU bound, not GPU bound.
Therefore an i3 is the winner in these cases.
And if you still want both high x86 performance and some GPU performance, there is still nothing better than buying an i3 or i5 and a (cheap?) graphics card.
1 0 [Posted by: Rollora  | Date: 04/01/13 11:29:57 AM]

2. 
I am very surprised by the 4K video result. My PC is a Q6600 + 7500LE. Yet, when I play MP4 1080p video encoded at about 12 Mbps, the CPU rarely shoots above 25%. I would say 99% of the time it stays at 17-25%. The 4K video used in your test uses the same codec and is only 34 Mbps. How could we see such a significant frame drop with faster CPUs + much faster GPUs?
1 1 [Posted by: jjpcat  | Date: 03/31/13 08:51:55 PM]

3. 
How did you manage to enable 4K DXVA hardware decoding on UVD3 GPUs?

Every discrete Radeon card with UVD3 that I have tried is unable to hardware-accelerate 4K. In fact, the latest Radeon drivers are limited to 1080p only.
2 0 [Posted by: NikosD  | Date: 04/01/13 07:59:58 AM]

4. 
I long for the Athlon days when AMD was kicking arse and taking names. Can they ever do it again?
5 2 [Posted by: ruel24  | Date: 04/01/13 08:08:08 AM]

5. 
Beautiful review.
It showed something I have been thinking for a while now: AMD and Intel are starting to become incomparable.
Like x86, PowerPC or ARM.
Because we use computers for longer than one year, I would like to see the same, but exactly the same, comparison between Llano and Sandy Bridge, to see how they have matured with the latest drivers.
Really excellent review.
2 2 [Posted by: kingpin  | Date: 04/01/13 08:08:39 AM]

 
I'd also like to see a new Llano and Sandy comparison. Does anyone know if any significant improvements were made to the drivers?
0 0 [Posted by: MHudon  | Date: 04/02/13 12:07:46 PM]

6. 
I have enabled FULL hardware acceleration and got QUICK SYNC working on the G2020!! (it will only work on an Ivy Bridge CPU)
You need the 15.26.1.64.2618 driver to enable Quick Sync in MediaEspresso.
I checked it and GPU-Z shows the GPU usage :-)
Same effect with Sony Vegas: choose the AUTOMATIC encode mode or GPU and you will see 50%~80% GPU load during render.
Please, XbitLabs, check it and write up the findings!

SEE SCREENSHOT!:
http://s17.postimg.org/5x23b6ty5/G2020.jpg
4 0 [Posted by: hwgeek  | Date: 04/01/13 08:54:17 AM]

7. 
Has anyone been lucky enough to test 4K video with a 4K monitor? I would really appreciate it if you could post your results if you have such experience.

I downloaded a 4K test clip (ftp://publicusr:readpubli...stfiles_short_Version.zip).

I am using an HP laptop with an A8-4500M and the display is 1600*900. While playing those 4K clips, the CPU stays at 11-17% most of the time. Not sure if the codec is playing any tricks, as it may know my display has a very low resolution. I don't know if the CPU utilization would be higher if this PC were connected to a 4K monitor.

Hi xbitlabs, could you describe in detail how you did that 4K video test? If your results hold up, they could have major implications for the whole PC industry in 2-3 years.

Thanks.
1 0 [Posted by: jjpcat  | Date: 04/01/13 10:08:42 AM]

8. 
Are we over the days of QuickSync not working if you have another GPU in your PC?

I must admit I didn't read every word, so you may have covered this in the article, but back in the Sandy Bridge days QuickSync only worked when you had a monitor attached to one of its outputs. Is this still the case?

I will be buying a Haswell setup when it's out but will also be using a dedicated GPU. Will the GPU on the CPU still function?

What would be better (mainly for quality but also speed): a new AMD card, nVidia, sticking with QuickSync, or disabling all GPU acceleration and purely using x86?
0 1 [Posted by: loadwick  | Date: 04/01/13 03:03:59 PM]

9. 
After reading reviews in December last year I put together an i3-3220 for one of my grandsons; there was already an AMD A6-3670 here, about a year old, and the idea was to stop two kids trying to play on one machine. Now the newer i3-3220 is hardly ever used, as the AMD machine appears to be the preferred unit for playing games on.
I even put an old HD 5570 video card into the Intel unit to try and make up for the woeful graphics, but this did not improve the gameplay at all. It cost me a newer HD 7770 card just to get it to match the graphics of an older AMD CPU.
It is still not the preferred unit for the kids to play on. The only thing it appears faster at is using a USB stick. Other than that, the Intel computer appears to be slower and less user friendly. This was even more apparent when, in my ignorance, I changed both units to Win 8. It was a complete turn-off for both the 12- and 14-year-old, who went back to playing on tablets; once I ran the recovery and went back to Win 7 they started to use the units again.
They both dislike the Intel toy even though they don't know what CPUs are in the machines, as they only visit during the weekends. I see no reason to change to Intel for my own use or for the 10 units in the business.
2 1 [Posted by: tedstoy  | Date: 04/01/13 03:54:54 PM]

 
I wrote something about this a while ago, and tell me if I'm wrong, but I feel that for small things like browsing files, music or pictures, my AMD machine feels sort of snappier. The AMD is an unlocked Phenom II 550 while my main computer is an i7-2600K. I'm not talking about video editing here, just normal, everyday operations. If you ask me, this really does count for a "light" user like my wife, for example.
2 0 [Posted by: MHudon  | Date: 04/02/13 12:21:05 PM]
 
I've had the same experience with AMD machines. For very light workloads they are very snappy and this is still the case for Trinity.
2 0 [Posted by: Milli  | Date: 04/03/13 05:01:24 PM]

10. 
Why do you always insist on testing Pentiums and Celerons with a Z-series chipset and 1866MHz RAM? My guess is that most buyers will combine these cheap CPUs with a B75- or H77-based mobo and be limited to 1333MHz, with quite a bit lower performance depending on the benchmark. To be fair, you should also test the i3 with 1600MHz, since this is the max for any buyer choosing a lower-spec mobo. Ideally, you should test 1333/1600/1866 so that we can all see the influence of memory speed.
1 0 [Posted by: Badcatch  | Date: 04/02/13 01:15:40 AM]

 
I would assume it was to keep the same RAM across the board, because 1866 is the upper limit an AMD APU can use without any overclocking. That way gaming reviews would be more accurate. As someone else said, for everyday usage AMD builds do feel snappier (web, movies, light gaming); for professional work, Intel all the way. As far as power usage goes for a home user, unless it's in a notebook I stopped caring. I have 3 PCs in my house, 3 TVs and an AC, and the highest my electric bill got last year was $87. Heck, the ice machine in my fridge used more power than all 3 of my PCs. For any friends who build a new PC, regardless of whether it's AMD or Intel, I put a link to the power options on the desktop and just tell them: when gaming, full performance mode; all other times, balanced.
But great article -- it shows the strengths and weaknesses of each platform very well.
1 0 [Posted by: Mendoza  | Date: 04/03/13 04:22:00 AM]

11. 
Great review! And thanks for pointing out a very important point again:

"when the system includes a discrete graphics core, the integrated GPU has no effect on 3D or heterogeneous computing performance, which means that the computing performance of x86 cores remains the main factor for choosing CPUs for classic discrete PC configurations"


Overall, it looks like A10-5700 and i3-3225 are both worthy APUs.
1 0 [Posted by: gamoniac  | Date: 04/05/13 08:36:33 PM]

12. 
I need to build a new PC. Which will perform better in games (RTS & RPG, no FPS)?
a. Celeron G1620 + Radeon 7850 1GB, or
b. i3-3220 + Radeon 7750 2GB

Thank you.
0 0 [Posted by: curious  | Date: 04/07/13 11:39:34 AM]

 
Take a G2020 + the better GPU.
0 1 [Posted by: hwgeek  | Date: 04/07/13 11:04:55 PM]
 
I think the Celeron would bottleneck any decent GPU, so if you're on a strict budget, the answer is b.
0 0 [Posted by: Hood  | Date: 04/14/13 09:50:12 PM]

13. 
Overclock, please. Thanks.
0 0 [Posted by: damric  | Date: 04/14/13 09:20:02 AM]

14. 
To me it doesn't make sense to go with AMD unless your budget is around $300 for the whole system but you still want to play games. As stated above, the best AMD APU is only equal to entry-level discrete graphics (~$50), so if you can stretch that budget even to $400, you can have the best of both worlds, computing and graphics, and still use less power than the AMD solution.
0 2 [Posted by: Hood  | Date: 04/14/13 09:59:10 PM]

15. 
Is it me, or do those idle power consumption numbers look a tad too high? People have been building HTPCs with G530s and G620s from the last generation and getting 20-25W idle measured at the wall. But here you are measuring after the PSU, using current-gen 22nm IVB Pentiums, and still getting 36+ watts? What gives? Your test config doesn't seem to use a graphics card either, so what exactly is consuming so much power?
0 0 [Posted by: Prazsky Krysarik  | Date: 04/20/13 07:45:41 PM]

 
Cheap power meters can sometimes give odd results. They are made to handle loads in the kilowatt range and can be unreliable with small loads, as the A-to-D converter can't discriminate small changes. The other consideration is power factor, which can screw up measurements of true power, especially if you're using a cheap PSU with poor PF correction. It's worth looking at what the power meter indicates for current and phase as well as power. I'm not a great user of these things, but one time I used a power meter while adding components to an A/V setup, and I swear the power consumption went down, according to the meter, when I added one more component. Switch-mode PSUs can do weird things.
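
To make the power-factor point concrete, here is a toy calculation with made-up numbers (not figures from this review) showing how big the gap between apparent and real power can be:

```python
# Toy illustration of the power-factor point above: a crude wall meter that
# multiplies RMS volts by RMS amps reports apparent power (VA), which can
# overstate the real draw of a PSU with poor PF correction.
# All numbers below are assumed examples, not measurements from this review.
v_rms = 230.0        # mains voltage, volts RMS
i_rms = 0.30         # measured current, amps RMS
power_factor = 0.60  # plausible figure for a cheap PSU at light load

apparent_power_va = v_rms * i_rms                # what a crude meter may show
real_power_w = apparent_power_va * power_factor  # what the PC actually consumes

print(f"Apparent power: {apparent_power_va:.0f} VA")  # ~69 VA
print(f"Real power:     {real_power_w:.0f} W")        # ~41 W
```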

And while I am here, a big thank you for an excellent review which answered the questions that were worrying me.
0 0 [Posted by: JCB  | Date: 05/11/13 11:04:14 AM]
