by Anton Shilov
08/06/2009 | 03:37 PM
Advanced Micro Devices this week released a beta version of an OpenCL development platform that takes advantage of x86 central processing units (CPUs). The company already offers a beta version of its ATI Stream platform that relies on graphics processing units (GPUs). AMD hopes that the early release of an x86-based OpenCL development platform will encourage software makers to tailor their applications for AMD’s processors.
“By supporting multi-core CPUs and GPUs with our OpenCL environment, AMD gives developers easy access to both processing resources, so they can efficiently write cross-platform applications for heterogeneous architectures with a single programming interface. AMD is supporting OpenCL with our ATI Stream SDK as an enabler of wider GPGPU adoption among developers and users,” said Rick Bergman, senior vice president and general manager of AMD products group.
At present the ATI Stream SDK 2.0 beta only supports execution of OpenCL-based programs on x86 central processing units, whereas the ATI Stream SDK 1.4 beta only supports acceleration on graphics processing units. Therefore, software designers who wish to utilize central processing units for their OpenCL apps should use the 2.0 beta, whereas those who program using the Brook+ language and the compute abstraction layer (CAL) should continue using the 1.4 beta.
The final version of ATI Stream (which will be available at some point in the future) will support OpenCL for both CPUs and GPUs, as well as DirectX compute shaders for GPUs. The final version of ATI Stream will still require software developers to determine which of the available resources – central processor(s) or graphics processor(s) – to use for multi-threaded computing.
Even though OpenCL is generally considered an application programming interface (API) for many-core graphics processing units, there are numerous algorithms that map better to multi-core CPUs, AMD explained, which is why it makes sense to optimize multi-threaded applications for microprocessors as well.