Evidence Mounts About Single-Chip GeForce GTX Titan: First Picture and Another Benchmark Result

A New Benchmark Result Points to Performance-Breakthrough with GeForce GTX Titan

by Anton Shilov
02/07/2013 | 09:22 PM

As the anticipated launch of the new flagship graphics card from Nvidia Corp. approaches, more details about the product emerge. According to multiple media and non-media sources, Nvidia is indeed preparing a new single-chip graphics card based on the GK110 graphics processing unit. The novelty is believed to be extremely fast, yet precise information remains unavailable. The new product will be unique and extremely expensive.
The new graphics card from Nvidia will not belong to the GeForce GTX 6-series or 7-series and will simply be called GeForce GTX Titan, which underlines its uniqueness and potentially points to a limited-edition release. A Danish web-store recently added an “Asus GeForce GTX Titan 6GB GDDR5” product to its listings, but quickly removed the unannounced item. The graphics card is indeed branded “GeForce GTX Titan”, has 6GB of onboard memory and sports two DVI, one HDMI and one DisplayPort connectors. The board carries the “GTXTITAN-6GB5” model label. The graphics card was priced at DKK7276, which is around $1308.

The first blurry photo of what is claimed to be GeForce GTX Titan has been published by the Wccftech web-site. The board appears to be about as long as the GeForce GTX 580 and features a single huge graphics processing unit covered by a heat-spreader and marked “Nvidia GK110”. The card carries 24 memory chips (12 on each side of the PCB) for a total capacity of 6GB accessed across a 384-bit bus. The board has a rather unusual power subsystem: the PCB seems to be laid out for 9-phase voltage regulation for the GPU, but only six phases (the publisher of the photo claims eight) are actually populated. A two-phase voltage regulator for the memory chips is located near the MIO connectors for SLI multi-GPU configurations. The board will naturally support 4-way SLI and will require one 8-pin and one 6-pin PCIe power connector.
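The rumored chip count, capacity and bus width are mutually consistent if one assumes the 2Gb (256MB) GDDR5 packages common in early 2013; the per-chip density below is an assumption for illustration, not something stated in the source:

```python
# Sanity check of the rumored memory configuration.
# Assumption: 2Gb (gigabit) GDDR5 packages, typical for early-2013 boards.
chips = 12 * 2                         # 12 packages per PCB side, both sides populated
density_gbit = 2                       # gigabits per chip (assumed)
capacity_gb = chips * density_gbit / 8 # total capacity in gigabytes
bus_width = 12 * 32                    # each 32-bit channel shared by a front/back chip pair

print(capacity_gb)  # 6.0 (GB)
print(bus_width)    # 384 (bits)
```

Clamshell mode, where two chips share one 32-bit channel, is the standard way to double capacity without widening the bus, which is why 24 chips still yield a 384-bit interface.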

A new 3DMark 11 benchmark result of what is claimed to be GeForce GTX Titan has been published by Arab PC World; it shows the graphics card hitting an X7377 extreme score, even higher than last week’s leak. The web-site also revealed a GPU-Z screenshot, which recognized the GK110 graphics processing unit (with a rather strange “10DE-1100” PCIe device ID) and detected an odd PCI-E 3.0 x16 @ x8 1.1 mode. The screenshot had been altered: the number of stream processors and the clock speeds were hidden.

A full Kepler GK110 implementation includes 15 SMX units (with 192 stream processors per SMX) and six 64-bit memory controllers. Different products use different configurations of GK110: for example, the Nvidia Tesla K20 deploys 13 SMX units, whereas the Tesla K20X has 14. Nvidia GeForce GTX Titan is believed to be largely based on the Tesla K20X compute card and therefore features a GK110 chip with 2688 stream processors, 224 texture units and a 384-bit memory bus.
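The shader and texture-unit counts above follow directly from the SMX configuration; a quick check (the 16 texture units per Kepler SMX is a known architectural figure, not stated in the source):

```python
# Deriving the rumored GTX Titan unit counts from the GK110 SMX layout.
SP_PER_SMX = 192    # stream processors per SMX
TEX_PER_SMX = 16    # texture units per SMX on Kepler (assumed known figure)

full_gk110_sp = 15 * SP_PER_SMX  # a full 15-SMX GK110
titan_sp = 14 * SP_PER_SMX       # 14 SMX, as on Tesla K20X
titan_tex = 14 * TEX_PER_SMX

print(full_gk110_sp)  # 2880
print(titan_sp)       # 2688
print(titan_tex)      # 224
```

The match with the 2688/224 figures circulating for the card is what makes the 14-SMX, K20X-derived configuration plausible.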


Nvidia GK110 graphics processor

The Nvidia GK110 processor was designed with high-performance computing (HPC) in mind and therefore features numerous architectural enhancements designed to speed up highly-parallel computations. For example, GK110 supports a larger number of registers per thread as well as such technologies as Dynamic Parallelism, Hyper-Q, Grid Management Unit and GPUDirect. Hyper-Q enables multiple CPU cores to launch work on a single GPU simultaneously, dramatically increasing GPU utilization and significantly reducing CPU idle time. Dynamic Parallelism adds the capability for the GPU to generate new work for itself, synchronize on results, and control the scheduling of that work via dedicated, accelerated hardware paths, all without involving the CPU. GPUDirect enables GPUs within a single computer to directly exchange data without going through CPU/system memory. Theoretically, these technologies can speed up at least some graphics applications and should be able to boost the performance of physics-effects computing.

Nvidia did not comment on the news story.