

While Nvidia Corp. reveals some things about its next-generation products, it does not want to disclose the whole picture, and it clearly does not provide any precise timeframes. Still, at its annual investor day conference this week the company did disclose some of its expectations for its next-generation graphics and compute architecture, code-named Maxwell.

What we do know about the Maxwell family of chips so far from official sources is that they will integrate general-purpose Denver ARMv8-compatible cores in addition to graphics stream processors, and that they will support unified virtual memory technology with microprocessors from Intel or AMD, a rather big deal for many applications. It is also logical to expect higher horsepower in general, which will boost video games, the main driver for Nvidia's GeForce business.
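To make the unified virtual memory point concrete: the idea is that the CPU and GPU dereference the same pointer, so the programmer no longer shuttles buffers between separate host and device allocations. A minimal sketch of what that programming model looks like, using the managed-memory API (`cudaMallocManaged`) that Nvidia exposes in CUDA; this is an illustration of the concept, not Maxwell-specific code:

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Simple kernel: increment every element of the array in place.
__global__ void increment(int *data, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] += 1;
}

int main() {
    const int n = 1 << 20;
    int *data = nullptr;

    // One allocation visible to both CPU and GPU: no explicit
    // cudaMemcpy between separate host and device buffers.
    cudaMallocManaged(&data, n * sizeof(int));

    for (int i = 0; i < n; ++i) data[i] = i;       // CPU writes

    increment<<<(n + 255) / 256, 256>>>(data, n);   // GPU reads and writes
    cudaDeviceSynchronize();                        // wait before the CPU touches the data again

    printf("data[42] = %d\n", data[42]);            // CPU reads the GPU's result directly
    cudaFree(data);
    return 0;
}
```

Without unified memory, the same program needs two allocations (`malloc` plus `cudaMalloc`) and explicit copies in both directions; the "simplified programmability" Nvidia talks about below is largely about removing that bookkeeping.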

Officially, Nvidia expects three major things from the Maxwell architecture: improved graphics capabilities, simplified programmability (probably thanks to ARMv8 and unified memory enhancements) and better energy efficiency.

“Number one for Maxwell, that is likely something that we are doing that breaks new ground in visual capability, something that is even more beautiful. […] Number two, it is likely that Maxwell breaks new ground in programmability, ease of programmability, because we want to expand the general purpose nature of the processor without sacrificing its speedup relative to a microprocessor. […] The last thing, the energy efficiency of Maxwell, it is going to crush Kepler. […] We know exactly how to measure it now and we know what it means to be good,” said Jen-Hsun Huang, chief executive officer of Nvidia, during a Q&A session at the company’s investor day conference.

The first GeForce consumer-class products based on the Maxwell architecture are expected to emerge sometime in 2014. Later on, Nvidia will release Maxwell-powered Tesla compute accelerator cards as well as Quadro professional graphics cards. Eventually, the Maxwell architecture will also be used for mobile application processors in the Parker family of Tegra products.

Tags: Nvidia, Maxwell, Kepler, Geforce, Tesla, Quadro, 20nm


Comments currently: 27
Discussion started: 04/13/13 01:29:46 AM
Latest comment: 12/18/15 10:23:38 PM


In regards to Nvidia, I won't believe anything it spins until I see it in a working prototype. Mr Huang has a nose longer than Pinocchio's.
3 3 [Posted by: linuxlowdown  | Date: 04/13/13 01:29:46 AM]


That is correct, but only up to a certain performance level. You will never be able to put a high-end CPU and a high-end GPU on the same die; you simply cannot cool it.

CPUs can push over 300W when overclocked and graphics cards can push 400W, even 500W; you cannot make a normal cooler that can handle 700+ watts of heat.

Haswell and Kaveri will be great products and will let a lot of people go without a dedicated graphics card: video playback is no issue at all, and they will play most games at basic settings. But you will never be able to play an AAA title at high res with all the settings maxed out on any kind of APU, it just won't happen.
7 2 [Posted by: loadwick  | Date: 04/13/13 04:08:30 AM]
Yeah they just had a Q&A with amd on RockPaperShotgun about the future of the APU. They said that once we have reached max visual quality, discrete will go. That's an interesting statement seeing as that can be very far away, if not impossible. APUs are just too condensed to do the job of a big boy video card.
5 2 [Posted by: evernessince  | Date: 04/13/13 06:26:26 AM]
"You will never be able to put a high end CPU and a high end GPU on the same die, you simply can not cool it."
Wait a sec. Have you been keeping up with recent Sony news? You've got an 8-core CPU on die with a high end GPU. Impossible to cool? Well dang. Guess that's it for Sony then.

1 2 [Posted by: electrogonzo  | Date: 04/14/13 07:22:44 PM]
PS4: an 8-core mid-range CPU at 2GHz (35W) with a mid-range HD7850-class GPU (55W) on one chip!

No high-end (250W) HD7970 or (130W) i7 3770 at 3.8GHz.
0 0 [Posted by: vid_ghost  | Date: 04/16/13 10:36:53 PM]

2014 is a long way off!
3 1 [Posted by: loadwick  | Date: 04/13/13 03:52:53 AM]


Omg. Everywhere you go there are mindless AMD drones posting in countless numbers. Looks like AMD's done a great job of ruining the internet with their advocates. Rock on guys, bash away. But could you please tone it down a tad. I mean, seriously!
6 2 [Posted by: ocre  | Date: 04/14/13 08:13:01 AM]

So, silly Nvidia (at it again) is touting unified memory enhancements on its road maps as if it's a feature they created lol

For the unified memory enhancements, Nvidia should thank ARMv8 and the HSA Foundation/AMD. It's a feature all 2014 ARM chips, GPUs and APUs will have, including the consoles lol

Nothing Nvidia does surprises me.

They failed with Tegra and soon they will fail with HPC (Nvidia's bread & butter) when Intel and AMD catch up with OpenCL.

AMD killed PhysX and will kill CUDA also. Both are proprietary cr*p that nobody needs in the professional HPC market and games.
0 0 [Posted by: keysplayer  | Date: 04/16/13 11:01:32 AM]
- collapse thread

cool down amd fanboy.. all of them are great, they do their best.. not like you bashing others, try make a decent cpu/gpu then we can talk..
0 0 [Posted by: najmi93906533  | Date: 05/27/13 03:55:21 AM]
Since when does showing something on a roadmap mean that "they" invented it?
Can't see Nvidia saying that anywhere...
And right, no one besides Nvidia ever failed with their hardware... ;-)
Didn't know AMD killed PhysX. I'm still using it in the games that support it.
0 0 [Posted by: Frank Honest  | Date: 10/20/13 12:48:24 AM]

I'm tired of all the 'Intel this, AMD that, Nvidia there' blah blah blah. New technologies come and the whole industry is keen to adopt them. The fact that Nvidia makes stuff and AMD makes stuff and Intel makes stuff is good for us all, as it hopefully creates competition so we can buy cool stuff for less money than if there wasn't competition.
I like Intel, Sony, Microsoft, AMD, Nvidia and lots of others because they egg each other on to make better and better stuff. I am not loyal to any one of them; they have to earn my money for sure.

Just my two penneth!
0 0 [Posted by: Sascha Phelps  | Date: 06/29/13 03:16:32 AM]

