

Performance of integrated graphics solutions has been growing rapidly in recent years and is expected to leap forward this year as AMD and Intel Corp. release their next-gen products. At the same time, sales of personal computers have been declining in recent quarters. Both factors pose a threat not only to PC hardware makers in general, but to producers of discrete graphics chips in particular. However, Nvidia Corp. does not believe it will face any problems with standalone GPUs.

“The discrete GPU market has been growing for us 12% CAGR over the last four – five years. We see no reason why it is going to stop. The reason for that is because we are making the GPU more useful over time. Four years ago no one spoke of using GPUs for general-purpose computing. Four years ago no one spoke of using GPUs to accelerate digital content creation applications. Four years ago no one spoke about putting GPUs in servers. […] We have expanded the reach of GPUs into non-PC devices,” said Rob Csongor, vice president of investor relations at Nvidia, during the quarterly conference call with financial analysts.

Today, companies like Cisco, IBM, Dell and HP are all shipping servers with GPUs inside. Those servers are aimed at high-performance technical computing, cloud computing, Big Data and many other purposes. In addition, graphics processing units (GPUs) these days also accelerate many consumer and professional applications, which means that anyone who needs truly high performance will have to buy a discrete graphics card.

But there are other factors that will drive adoption of standalone graphics adapters. Despite the rise of smartphones and tablets, personal computers are back as a video game platform thanks to leading-edge titles like Crysis 3, Far Cry 3, Grid 2 and Metro: Last Light, as well as various free-to-play games.

“Meanwhile, PC is really one of the most important gaming platforms today; it is one of the most important gaming platforms because it is open. If you were developing free to play games […] the PC is really a terrific platform to them, there are many markets outside the United States, where the game console are just not as popular. For example China, South Korea, [in] many of the regions outside of the United States, particularly Asia, which happens to be the fastest growing markets for us, the PC is really the preferred gaming platform and we are seeing a lot of growth there,” said Mr. Csongor.

It is evident that tablets, thin clients and ultrabooks are gaining market acceptance. However, there are numerous use cases for which those devices are simply not even considered.

“So, there are a lot of reasons to be enthusiastic about the continued growth of GPUs. […] Tablets disrupt the PC. [They] disrupt the PC for casual PC use. You cannot really use a tablet to design a car yet, and it really does not make sense to use a phone to create a movie. So, there are a lot of productivity [applications where] the keyboard is important and large storage is important and a mouse is important and large display is important. So for a lot of us the PC continues to be very important and those are not being disrupted really,” stressed the vice president of Nvidia.

Tags: Nvidia, Geforce, Tesla, Quadro, Kepler, Maxwell, Business


Comments currently: 28
Discussion started: 05/10/13 02:17:14 AM
Latest comment: 12/18/15 10:19:29 PM


2 7 [Posted by: Tristan  | Date: 05/10/13 02:17:14 AM]

What do you mean by REAL Tristan? Nvidia also spends millions on designing ARM cores. I'm totally surprised that you would neglect to promote this (just for you Tristan - English 101: an example of well-communicated sarcasm). Nvidia doesn't have a bright future in graphics. Intel is allowing it to live for now because its own Iris graphics is so crap. But in the future, when Intel catches up a bit more, it will engineer a way so discrete cards cannot be added at all to its platform. Then Nvidia, without an x86 licence, is dead meat in high-end gaming. It will only be able to pair its graphics with ARM cores - and this will not do. Grandiose Mr Huang will pay dearly for not merging with AMD and agreeing to start off as just a board member in the newly combined company. OCZ is suffering a similar fate because its CEO did not compromise in wanting to be head of the newly formed company's SSD division when it merged with Seagate. Its shares went from $8.00 to $1.50 now. It can't survive the market now, and Seagate will pick up its IP at a later date for a pittance, after all the good engineers have left.
3 4 [Posted by: linuxlowdown  | Date: 05/10/13 03:42:05 AM]
Why would Intel do that? They would shoot themselves in the foot (if AMD is still around). I don't see Intel's CPUs and NVIDIA's GPUs as competing products but as complementary. What you say could make sense if Intel manages to develop an architecture that gives superb general-purpose performance and astonishing graphics too. They'll then need to develop a new generation of system bus that lets you add as many CPUs on a card as you need (instead of graphics cards). Practically, you have two main processors in a modern computer: a CPU and a GPU. One processor will never be able to do as much work as two. For that reason alone, even if you have a CPU whose GPU performance matches a CPU + GPU combination, you'll still need two CPUs. I don't see that happening in the next 5, even 10, years. Although I would appreciate it. But we are at least 20 years away from reaching 100% realistic graphics.
Unless everything evolves differently: cheaper consoles are released more often and with keyboard/mouse support, and tablets and smartphones feature some sort of better input (other than the touch screen).
Most people buy laptops anyway.
0 1 [Posted by: Zingam  | Date: 05/10/13 05:19:58 AM]
I don't think you get what I'm saying Zingam (or I don't get what you're saying). Summary: Intel will push out Nvidia graphics on its platform when its own graphics has adequately caught up to Nvidia, so they have all the $$$ for themselves. I see that happening in about a 5 year time frame.
1 3 [Posted by: linuxlowdown  | Date: 05/10/13 05:24:18 AM]
You're beating a dead cow. Intel will never manage to develop proper graphics of their own, not because they really couldn't, but because they've never been interested in that. The Larrabee project was only a spin-off for HPC, and not the other way around, as it is presented nowadays. In the future, graphics probably won't be so proprietarily stuck to one architecture, as OpenGL has really been the standard for more than 15 years, and Intel has other supercomputing projects up their sleeve, so they don't need DEDICATED GRAPHICS for x86-64 based machines.
0 1 [Posted by: OmegaHuman  | Date: 05/12/13 11:09:37 AM]
- This news is about graphics. Things like ARM and others do not count here.
- Intel is forced to maintain PCIe. Without it, they will lose 80% of their sales and face pressure from government market regulators. Did you know, they were forced to maintain PCIe for the future?
- Intel's process is very advanced and very costly. Their CPU/GPU must be small (less than 200 sq. mm) to be profitable. This is the primary reason they split Haswell graphics into two small cores (integrated and optionally external) instead of a single large one. Integrated graphics can reach levels comparable to discrete low end, and this will not change.
- Graphics is not only a few shaders placed near the CPU. NV has strong knowledge, experience, relationships, patents, a track record and clear vision. Intel is behind NV in these areas, and this will not change for a long time.

Hope this helps
2 2 [Posted by: Tristan  | Date: 05/10/13 06:36:07 AM]
Nope. Your concrete thinking didn't help. You're talking about the arrangements and technology of 2013. I'm trying to project long into the future based on global fundamentals (like all successful investors, such as Warren Buffett). Firstly, I've lived long enough to see governments and their associated bodies go back on past decisions at will, depending on the politics and the corruption of the day. No policy or ruling is necessarily infinitely binding or cast in stone. At the stroke of a pen, Intel could shut out Nvidia when the time is right. Secondly, business alignments and relationships are particular to the prevailing competition and economic conditions. It's not like a marriage relationship between husband and wife. It's fickle and in place for survival and profits. It's often considered "goodwill" and thus given an imaginary value on the balance sheet. Read the case about the collapse of Time Warner to see how that is just smoke and mirrors in business. Thirdly, the Haswell graphics technology that you describe will change in the future, and it shouldn't be assumed that it won't. AMD supporter though I am, I do expect Intel to catch up in 5 years. In short, Nvidia is a dead duck because it doesn't have its own x86 license to create its own platforms - it is dependent on Intel (the relationship was once codependent, but with every new Intel graphics core this is increasingly not the case). Don't lose your life savings through it.
1 1 [Posted by: linuxlowdown  | Date: 05/10/13 08:14:09 AM]
Myopic and short-sighted on too many levels. Completely wrong.
0 0 [Posted by: beck2448  | Date: 05/15/13 01:19:52 AM]
And you think "high end" gaming would prosper on ARM architecture? I would disagree on that.

Or that x86 is here to stay. Many original x86 licences are already long-gone history, so in the next ten years Nvidia could usefully employ x86 only by agreement with AMD on their x86-64. Supposing that company isn't gone in 10 years' time.
2 0 [Posted by: OmegaHuman  | Date: 05/12/13 11:05:34 AM]

The future of PC gaming is HSA based.

AMD now sits at the nexus of both gaming and HSA. Expect Kaveri to be a next-gen gaming gorilla. Expect its Excavator-based successor to be a gaming Godzilla ... AND to challenge Intel on the CPU front.

That means AMD APUs replacing Intel as the PC gamer's chip of choice, and AMD's HSA-based graphics boards being the natural choice to accompany that APU.
5 1 [Posted by: spigzone  | Date: 05/10/13 06:38:41 AM]

That's my prediction too. Every day it looks more likely.
2 1 [Posted by: linuxlowdown  | Date: 05/10/13 08:26:43 AM]

Nvidia is right that the GPU won't be dead anytime soon. GPUs are rapidly evolving into GPGPU accelerators rather than just 3D accelerators.
3 1 [Posted by: tks  | Date: 05/10/13 07:38:33 AM]

GPU won't be dead. But Nvidia will be. Its only chance of survival will be a merger with Intel. Intel is trying to force its hand by developing competing CPU/GPU combos and will no doubt endeavour in the future to cripple its sales for a cheaper company share price buyout, by any means. But the merger is still not a given, taking into account the stubbornness and grandiosity of Mr. Huang, who has demonstrated in the past that he can cut off his nose to spite his face. I believe this company is a medium-risk proposition. Its future prospects are precariously tied to the decisions of Intel's board.
1 1 [Posted by: linuxlowdown  | Date: 05/10/13 08:30:13 AM]

Actually, in a few years there will be very few people buying discrete graphics, as APUs will be replacing most CPU/GPU combos. APUs will offer all levels of performance, from entry level to mid-level and even very high-end performance equal to or better than 95% of all discrete consumer GPU cards.

That means the only people buying a discrete GPU card will be a very small percentage of consumers who are willing to dramatically overpay for the privilege. Eventually they will surrender, as GPU cards will cost many times more than an equally performing APU. That will be the end of Nvidia, and that end is not too far off.
3 1 [Posted by: beenthere  | Date: 05/10/13 12:02:56 PM]

Right. And you suppose there will be some new level of silicon physics that's valid for a CPU design, pardon me, an APU, while GPUs will be stuck with crappy old physics design. In fact, you wouldn't believe it, but it's actually easier to design a GPU than a CPU, and that is why everybody is pushing crappy APUs: they can brag about how the new CPU, now called an APU, has an improved TFLOPS level in every new generation cycle ... It's Moore's Law on steroids.

And as for GPUs: a 2.5x better GPU, the HD 7770, costs 25% less than a top-end APU. So from a budgetary point of view, it's only a little more expensive to buy a CPU cast-off like the "Athlon X4", based on the same Trinity design, plus a separate GPU, than the APU itself. And in many situations it could even consume less. APUs are great for things like the PlayStation, the Xbox and the OEM market. If the PC market is as dead as the polls have been trying to show for the last few years, then consoles are its successor, and APUs really are cutting down OEMs' costs. Too bad if this really becomes the truth, but as media centers, consoles and various fifth-rate gadgets come to market, fewer and fewer people spend on a real computer, since all that gadget crap is usually much cheaper as a one-time payment and comes ready out of the box. It's crisis economics supported by close-minded, uneducated people, and has nothing to do with the "APU is a great thing for the PC" theory.
0 2 [Posted by: OmegaHuman  | Date: 05/12/13 11:23:55 AM]
You bozos always post the same dumb crap about Nvidia. You have absolutely no idea how much more demanding the requirements of future graphics will be: at least 1000 times more powerful than today within the next 10 years. Intel won't come close. Nvidia is leading the way in supercomputing and the high end. The need for ever more powerful graphics processing is increasing EXPONENTIALLY!
0 0 [Posted by: beck2448  | Date: 05/15/13 01:23:35 AM]

HSA is the future
3 0 [Posted by: redeemer  | Date: 05/10/13 01:04:38 PM]

The future is graphics (GPU).
Everything we use in the future will need to appear graphically.
Especially more and more sophisticated 3D graphics that help us design and build new cities … in the future we will need more powerful GPUs.
0 0 [Posted by: JanGozi  | Date: 05/11/13 02:06:27 AM]

Don't forget that the joker card in Nvidia's hand, in the GPGPU world, is CUDA.
CUDA has been the big thing from the beginning until now, and will be for a few more years.
The bad news for CUDA is the growing acceptance of OpenCL by companies such as Adobe. This means that the GPGPU market is not a monopoly any more.
Those facts, considering the poor OpenCL performance of Nvidia's cards, will add up to a bigger headache for Nvidia's people.
1 1 [Posted by: Yorgos  | Date: 05/11/13 07:51:04 AM]

0 3 [Posted by: MelodyRamos41  | Date: 05/11/13 07:26:08 PM]

Microsoft would need to think about giving the OS away for free to prevent the spread of Android. The main culprit in the decline of the PC market is Microsoft.
1 0 [Posted by: Blackcode  | Date: 05/12/13 01:26:23 AM]

Yep, their business model is outdated now.
0 0 [Posted by: linuxlowdown  | Date: 05/12/13 05:09:52 AM]
I vote for Android x86-64 ... sink Microsoft.
0 0 [Posted by: OmegaHuman  | Date: 05/12/13 11:26:59 AM]

It would be interesting to see how much the scientific supercomputing market drives the development of future annual releases of video cards. With exaflop computing arriving in 2018, petaflop computing will be within the average consumer's reach. With many more individuals having that processing ability in a home computer, rather than it being reserved for those selected through approved science grants, what should the average scientist put that new processing power towards?
0 0 [Posted by: qubit  | Date: 05/12/13 05:10:26 PM]


