by Anton Shilov
11/03/2013 | 12:59 PM
No matter how good a product actually is, its price and secondary qualities may define its market success. Traditionally, the most important thing people cared about when buying a graphics card was its performance. Today, leading GPU developers bundle free games with their premium devices, which results in added value that greatly influences buying decisions.
Performance in different types of video games and “synthetic” benchmarks has always been the most important criterion for graphics card purchases, both for end users and for major PC makers. Scores in Futuremark’s 3DMark and PCMark have influenced the buying decisions of global PC manufacturers, such as Dell or Hewlett-Packard. Performance in certain trend-setting titles, such as Far Cry, Half-Life or Doom, has in many cases defined the market shares of ATI/AMD Radeon and Nvidia GeForce graphics cards. But with the launch of AMD’s Never Settle program, a lot of things have changed.
Pure performance costs. Premium and ultra-premium graphics cards are available for $599 - $999, the price of a notebook. Such graphics solutions – the AMD Radeon R9 290X or Nvidia GeForce GTX Titan – offer extraordinary speed in most games and post top results in various synthetic benchmark programs. More affordable graphics adapters in the $200 - $500 price range are absolutely fine for most popular games. Many even less expensive solutions offer decent performance in previous-generation and/or graphically undemanding games. To sum up, if you want to play Battlefield, Far Cry or Need for Speed, you need a decent up-to-date graphics adapter that costs well over $100.
While it is nearly impossible to significantly increase the performance of a graphics card that generally trails its competitor in the same price range, it is possible to lower the price and position it against a less potent rival. But a price cut usually increases the risk of losses at the company level, which is something top management tries to avoid.
But what can be done when it is impossible to significantly boost performance or cut the price, yet the graphics card still has to sell? It is possible to inflate the apparent performance numbers by optimizing drivers for particular applications, making the product look better than it actually is. That does not work in the long term, since graphics card reviewers love what they do and never tolerate cheating. Getting caught with one’s pants down harms a company’s image far more than it helps sales. But while it is not easy to diminish the importance and meaning of performance, it turns out to be possible to effectively deflate the price of a product by adding value.
Graphics cards bundled with free accessories [thus, added value] have existed for a decade now. But a cheap mouse pad, a bag made of shoddy textile or a CD/DVD/BD holder made of artificial leather (which does not even pretend to be leather) has never appealed to people willing to spend some $300 - $500 on a graphics card. What those people want is to play games comfortably. So, bring them games. For free.
Nowadays a high-end PC title can cost $70 in the U.S. and €70 in Europe (Battlefield 4 and Need for Speed: Rivals, as well as some other titles, are sold at that price), a result of considerably increased development costs as well as the addition of professional musicians, social psychologists, actors/narrators and many others to the process. Most AAA-class PC games still cost some $50/€50 at launch (plenty of game developers then release DLC [downloadable content] to push spending per game up to $100/€100), a price at which not everyone buys impulsively.
Thanks to the growing number of PC gamers in general, game developers can afford to set high initial price tags without fear of being rejected by avid customers. Even if some potential customers pirate games (and lock themselves out of DLC), there are plenty more who actually buy, even at launch prices. At the same time, nobody likes to leave money on the table, which is why game developers welcome volume purchases at discounts. Just what the doctor ordered!
Currently, both AMD and Nvidia Corp. bundle up to three AAA video games with the purchase of a high-end Radeon or GeForce GTX graphics card. Some titles cost $20, some sell for $50 to end users. AMD and Nvidia acquire the licenses for around $10 per game, yet the end user gets up to $150 of value with the purchase of a graphics card.
When you buy a graphics card that costs, say, $350, do you care about a 10% difference in performance, or do you care about a huge game package worth up to $150? To my mind, you care more about the software titles than about the technology that actually delivers them. Want to play Splinter Cell: Blacklist? Get yourself a decent GeForce GTX. Have never played Far Cry 3? Buy an AMD Radeon. Performance? They are more or less equal [assuming nobody repeats past cheating mistakes]. Exclusive features like AMD Eyefinity or Nvidia PhysX? They further erode the importance of actual in-game performance, but who cares about them anyway? Prices of graphics cards are similar. The games are different. Obviously, the latter drive the buying decisions.
While benchmark results will continue to be important for enthusiasts, market observers and analysts, it simply does not seem that they will directly influence purchase decisions going forward. The incredibly slow pace of innovation at the graphics chip hardware level in recent years only proves this.