

Discussion on Article:
AMD Richland vs. Intel Haswell: Integrated Graphics Performance Review

Started by: Hutchinman | Date 09/15/13 10:28:42 PM
Comments: 44 | Last Comment:  06/29/16 04:01:31 PM



So we wouldn’t recommend having high expectations for the Dual Graphics technology. Right now it is just an experimental feature that allows AMD’s marketing department to draw pretty presentation slides with nice-looking charts. In reality it is not very useful and not very functional; in fact, it is just glitchy. And we can’t really expect it to improve. The Dual Graphics technology isn’t new, and if AMD hasn’t bothered to correct its bugs yet, it will hardly do so in the near future.

What the fuck are you talking about? AMD is currently working on the tearing and stuttering issues in every setup that includes their GPUs. They are fixing it right now. It will probably be last on the list, but they are fixing it.
1 2 [Posted by: Hutchinman  | Date: 09/15/13 10:28:42 PM]

Yes, there are strange things going on toward AMD on this site, and I'm starting to ignore Xbitlabs overall.

If I'm not mistaken, AMD already released a driver that fixes the microstuttering, and APU CrossFire has existed at AMD for at least 3 years, so I'm confused how it doesn't work properly. I see a 50%+ increase in Xbitlabs' tests, and that looks good to me.

And the last thing: how come that until a few months or a year ago AMD's CrossFire was superior to Nvidia's SLI, and now it's not even recommended? I remember when Nvidia's SLI gave a 20-30% performance boost and AMD's CrossFire 90-110%, yes, a 110% performance boost in CrossFire, back when Nvidia released the 9800GX2 dual-board, dual-GPU graphics card to try to compete with the Radeon HD X2.

Something is fishy here.

2 1 [Posted by: kingpin  | Date: 09/16/13 12:18:10 AM]
The problem is frame times and runt frames. The frame pacing in CrossFire is completely off, so what happens is that one GPU will draw a frame, and by the time the other GPU is busy writing another frame into the frame buffer, the first GPU will be done with its next frame and will overwrite the frame from the second GPU.
Because of this, apps like Fraps and Afterburner will count it as multiple frames, while really only a few lines on the screen change and the end user never sees the frame from the second GPU.
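To illustrate the point, here is a rough sketch (my own numbers and threshold, not FCAT's or Fraps' actual algorithm) of how a "runt" frame can inflate a reported frame rate:

```python
# Sketch: classify "runt" frames from per-frame display times,
# assuming a 60 Hz display (~16.7 ms per refresh). A frame shown
# for less than ~25% of one refresh interval contributes almost
# nothing visible; FCAT-style analysis counts these as runts.
REFRESH_MS = 1000 / 60
RUNT_THRESHOLD_MS = 0.25 * REFRESH_MS

def split_runts(frame_times_ms):
    """Split per-frame display times into (effective, runts)."""
    effective = [t for t in frame_times_ms if t >= RUNT_THRESHOLD_MS]
    runts = [t for t in frame_times_ms if t < RUNT_THRESHOLD_MS]
    return effective, runts

# Alternating long/short frame times, typical of unpaced AFR:
times = [30.0, 1.5, 29.0, 2.0, 31.0, 1.0]
effective, runts = split_runts(times)
fraps_fps = 1000 * len(times) / sum(times)          # counts every frame
perceived_fps = 1000 * len(effective) / sum(times)  # ignores runts
```

With those example times, half the frames are runts, so the counter-reported frame rate is double what the user actually perceives.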
Frame pacing is when they basically make sure that the timing between frames is right, so one GPU doesn't overwrite the other one's frame before the user sees it fully, letting the user see playback that actually matches the frame rate being reported.
Frame pacing works on some of the CrossFire solutions with Catalyst 13.8, but it's far from complete.
Nvidia has had frame pacing for a few years now, but no one paid attention to it, and when it was uncovered it caused a shitstorm. With Kepler, I think Nvidia even found a way to implement frame pacing at the hardware level.
AMD can fix this, but at the moment the frame rates may as well be lies when it comes to some CrossFire solutions (Dual Graphics, for instance).
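The basic idea of frame pacing can be sketched in a few lines (an illustrative simulation, not AMD's or Nvidia's actual driver logic): hold back frames that finish too early so presents land at roughly even intervals.

```python
# Sketch: pace frame presentation so consecutive presents are at
# least one target interval apart, instead of whenever a GPU
# happens to finish rendering. Timestamps are in milliseconds.
def pace(render_done_ms, target_interval_ms):
    """Given times when frames finished rendering, return paced
    presentation times spaced >= target_interval_ms apart."""
    presents = []
    next_ok = 0.0
    for done in render_done_ms:
        t = max(done, next_ok)  # hold the frame if it arrived too early
        presents.append(t)
        next_ok = t + target_interval_ms
    return presents

# Two GPUs in AFR finishing in bursts (the pattern that causes runts):
done = [0.0, 2.0, 33.0, 35.0, 66.0, 68.0]
paced = pace(done, 16.7)
# gaps between paced presents are now all >= 16.7 ms
```

The cost is a little added latency on the early frames; the benefit is that every frame actually spends enough time on screen to be seen.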
0 2 [Posted by: dinosore  | Date: 09/18/13 01:42:00 AM]
I confirm, at least three new drivers have been released to correct CrossFire troubles: 13.8, 13.8B, and 13.10.
0 2 [Posted by: MHudon  | Date: 09/18/13 07:44:06 AM]

I wonder when we will see 4K video content become widely available. My TV service provider (Rogers Canada) still only provides a 1080i signal.
It will be at least a few years before 4K TVs are affordable, then a few more years for them to become popular. That would be a few APU generations later; by that time, I am sure 4K playback will be a standard feature in all APUs.
0 1 [Posted by: gjcjan  | Date: 09/16/13 05:17:20 PM]

X-Bit Labs is notorious for posting inflammatory articles particularly directed at AMD. They do this to get their page hit count up which means increased ad revenues. This is a dirty little secret of web forums.

BTW, anyone with a clue knows that Richland's GPU section is far superior to Intel's Haswell and that AMD's Crossfire is superior to Nvidia. Well everyone except X-Bit Labs...
3 4 [Posted by: beenthere  | Date: 09/16/13 08:41:52 PM]

It's the same as them again and again posting inflammatory articles against Intel.

Can you give us a link to an AMD Richland CrossFire setup tested against Intel Haswell?
0 2 [Posted by: kailrusha  | Date: 09/17/13 05:33:56 AM]
SemiAccurate is against Nvidia. It is quite kind to Intel.
0 1 [Posted by: gjcjan  | Date: 09/17/13 04:48:27 PM]
There are some comments against Intel there; most of the time they prevent you from commenting.
0 1 [Posted by: kailrusha  | Date: 09/17/13 05:54:27 PM]
On rare occasions I see SemiAccurate being neutral on Intel. Many times it bashes Intel too.
1 1 [Posted by: trumpet-205  | Date: 09/20/13 10:24:27 PM]
AMD's Crossfire is superior to Nvidia

Sorry beeny, but you're way off about that part. Sadly, CrossFire is far from SLI in terms of stability/compatibility and therefore performance. If it doesn't work in real life, who cares about high frame rates in benchmarks?

You might have to be an "experienced" gamer to know that, but trust me, CF is often tougher to set up.

Everyone here knows how much I value my precious 7970s, but still, this is a fact.
3 2 [Posted by: MHudon  | Date: 09/18/13 07:58:31 AM]
BTW, anyone with a clue knows that Richland's GPU section is far superior to Intel's Haswell and that AMD's Crossfire is superior to Nvidia. Well everyone except X-Bit Labs.

Intel's graphics has far fewer execution units than AMD, which has a lot more of its so-called stream processors. Comparing the efficiency of the two designs, Intel rules on efficiency: Intel is able to deliver high graphics performance within its existing memory bandwidth, compared to AMD. AMD really needs a lot more memory bandwidth for its 384 stream processors to show their best. The truth is already in the given benchmarks, but it seems you can not see that.
2 2 [Posted by: tecknurd  | Date: 09/19/13 06:59:54 AM]
Agree. Do note that the testing done here makes use of DDR3-2133. Many people on a budget will opt for 1333 or 1600 instead, and that makes AMD's graphics edge diminish quite a bit.

But if you do spend extra on 2133, then the budget saving goes away. One way or the other, you lose an edge.
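For a rough back-of-the-envelope check of how much bandwidth the memory speed choice is worth (my own arithmetic, not from the article's test data):

```python
# Theoretical peak for dual-channel DDR3: each channel is 64 bits
# (8 bytes) wide and completes one transfer per MT/s of data rate.
def dual_channel_bw_gbs(mt_per_s):
    """Peak bandwidth in GB/s for two 64-bit DDR3 channels."""
    return mt_per_s * 8 * 2 / 1000

for speed in (1333, 1600, 2133):
    print(f"DDR3-{speed}: {dual_channel_bw_gbs(speed):.1f} GB/s")
# DDR3-1333: 21.3 GB/s
# DDR3-1600: 25.6 GB/s
# DDR3-2133: 34.1 GB/s
```

So dropping from DDR3-2133 to DDR3-1600 cuts the theoretical peak by about 25%, which is bandwidth an APU's integrated GPU shares with the CPU.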
1 1 [Posted by: trumpet-205  | Date: 09/20/13 10:21:54 PM]

Since I work on the multimedia team at AMD, I can tell you about the CPU utilization for 4K content shown in Xbit's charts: Media Player Classic Home Cinema isn't using hardware acceleration in this test. Please run the same clip using PowerDVD 12 or 13 and post the numbers from that. Make sure video acceleration is selected in PowerDVD's settings.
1 1 [Posted by: Hamza Saigol  | Date: 09/20/13 08:21:14 AM]

Can 4K even be accelerated using an AMD GPU at this point? I don't recall AMD ever advertising that it has the hardware capability to accelerate 4K.

Cyberlink stated that it can be accelerated using Intel Quick Sync as long as it is H.264 with a bitrate of less than 60 Mbps. It says nothing about Nvidia or AMD in its support list.
0 2 [Posted by: trumpet-205  | Date: 09/20/13 10:16:23 PM]

