Even though not all game developers have adopted shader model (SM) 3.0, which was introduced three years ago, and some claim that transitioning to DirectX 10's shader model 4.0 right now hardly makes sense, Nvidia's new software development kit (SDK) already features profiles for shader model 5.0, which is believed to be an improved version of SM 4.0.

Nvidia's new SDK 10, released just last week, apparently contains macro invocations that define the supported Cg profiles, including Fragment50, Vertex50 and Geometry50, which means the current SDK carries architecture micro-code profiles for pixel shaders 5.0, vertex shaders 5.0 and geometry shaders 5.0.
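
For context, an application can already ask the Cg runtime which micro-code profiles the installed driver supports, which is how newly added profiles would surface to developers. Below is a minimal sketch against the stock Cg/cgGL API; it assumes a Cg runtime new enough to expose the geometry domain (CG_GL_GEOMETRY), and it creates a throwaway GLUT window only because cgGLGetLatestProfile requires a current OpenGL context:

    /* Probe the best vertex, fragment and geometry profiles the driver
       supports. Build against the Cg toolkit and GLUT. */
    #include <stdio.h>
    #include <GL/glut.h>
    #include <Cg/cg.h>
    #include <Cg/cgGL.h>

    int main(int argc, char **argv)
    {
        /* cgGLGetLatestProfile needs a current OpenGL context. */
        glutInit(&argc, argv);
        glutCreateWindow("profile probe");

        CGprofile vp = cgGLGetLatestProfile(CG_GL_VERTEX);
        CGprofile fp = cgGLGetLatestProfile(CG_GL_FRAGMENT);
        CGprofile gp = cgGLGetLatestProfile(CG_GL_GEOMETRY); /* assumes Cg 1.5+ */

        printf("vertex:   %s\n", cgGetProfileString(vp));
        printf("fragment: %s\n", cgGetProfileString(fp));
        printf("geometry: %s\n", cgGetProfileString(gp));
        return 0;
    }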

While hardly anybody knows what shader model 5.0 actually is and how different it is from shader model 4.0, the inclusion of the architecture micro-code in a compiler indicates that Nvidia expects shader model 5.0-capable hardware to arrive soon enough that game developers should be able to compile their titles for it. Then again, it is only logical that Nvidia is working on next-generation graphics technologies and chips, such as the G81, G90 and so on.

Nvidia's new set of tools for software developers is primarily designed to take advantage of the company's latest hardware. The Developer Toolkit includes SDK 10, with code samples for the latest graphics processors; PerfKit 5, a set of tools for debugging and profiling GPU applications on Windows Vista and DirectX 10, with shader edit-and-continue, render state modification, customizable graphs and counters, and more; the ShaderPerf 2 suite, which provides detailed shader performance information and supports the new drivers; FX Composer 2, a development environment for cross-platform shader authoring; and some other tools.

Nvidia officials did not comment on the news story.

Discussion

Comments currently: 20
Discussion started: 03/12/07 04:24:04 PM
Latest comment: 05/23/11 05:48:08 AM


1. 
This is so ridiculous. Nvidia just wants to suck all the money out of consumers for a tiny improvement, like what happened between SM2 and SM3. Nvidia sucks this time around.
[Posted by:  | Date: 03/13/07 02:07:38 AM]

2. 
I don't understand all this ranting. The company is inventing something, and there is ranting. Would you all like the firm to ship the same products over and over, let the competition get ahead of them, and eventually go bankrupt? Is that it?

It's just some new thing in the pipeline; that's always the way with computers.
[Posted by:  | Date: 03/13/07 02:50:13 PM]

 
You are a twat who is still using integrated graphics that can play Solitaire at a blazing 2 FPS!

Get into the mind of a hardcore gamer and you might understand why they are ranting!

If you can't, this is the reason:

1. Hardcore gamers want the best!
2. They are willing to spend top cash for the best products that give them the best performance! Being 1 FPS slower is like hell to them.
3. These products are uber expensive! E.g., an 8800GTX costs around $600! Some even get two just for SLI: 2 x 8800GTX = $1200.
4. The 8800GTX supports DX10 and SM4.0.
5. Nvidia's G80 drivers for Vista still suck, and they don't support SLI yet!
6. Nvidia decides to ditch those gamers and goes off to make SM5.0.
7. Gamers get mad and start ranting.
8. Will those G80-series cards support DX10.1 and SM5.0 with a simple driver update, or will gamers need to spend another $600 just for the latest technology?

Get it!? If you still don't, then go back to your hut and continue sucking those hairy donkey balls of yours!
[Posted by:  | Date: 03/15/07 07:16:24 PM]
 
Bullcrap, and please stop using the terrible language of a 13-year-old kid.

These so-called hardcore gamers [first-person shooter players; yeah, there are more genres out there] used to turn down the graphics and play with ugly settings. So you're saying they don't do this anymore? I'm pretty sure most could live with last year's high end. Otherwise, $600 is not so über expensive if you only buy one card each year.
[Posted by:  | Date: 03/16/07 02:54:16 PM]
 
Btw, at $600 I could get a new card every month if I wanted to :P

But I'm trying to accumulate some good camera gear, which is much more expensive than computers.
[Posted by:  | Date: 03/16/07 02:57:36 PM]
 
A gamer who buys an expensive card is most likely not going to turn down the graphics settings. If that were the case, they could have just gone with a cheaper card. Also, high-end products come out every 6 months or so (if they are not delayed *cough* *R600* *cough*), and most gamers will go with the higher-end card, even if they just got themselves a high-end card 6 months ago.

Right now it's the 8800GTX, later on the 8900GTX, and possibly the 8950GX2.

Dual Intel Quad Core QX6700
8GB DDR2 1066
Asus Bearlake-X motherboard
3TB HD space
2 8950GX2 in quad SLI
30" Apple Cinema Display

Holy sweet mother of pearl! *____*

BTW: If you just decided to join the PC gaming world, get used to it, because this is how it works! If you can't deal with it, then go back to those shitty consoles of yours.
[Posted by:  | Date: 03/16/07 05:52:17 PM]

3. 
Actually, this is Cg 1.5, not SM5.0. The news story is wrong.

Remember Cg? It is (was?) HLSL-compatible, from back when the idea was that Direct3D/OpenGL would only have assembly shaders, to prevent new drivers from shipping different compiler optimizations that broke existing shaders. MS worked with Nvidia to create HLSL, which is basically Cg, for D3D shaders.

However, Cg is still useful for things like OpenGL fragment programs (assembly pixel shaders) and PlayStation 3 development, as the sketch below shows.
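
For instance, here is a minimal sketch with the stock Cg runtime (the one-line shader source is made up for illustration) that compiles Cg code down to ARB fragment-program assembly, no Direct3D needed:

    /* Compile a trivial Cg shader to an OpenGL assembly profile (arbfp1)
       and print the generated fragment-program text. */
    #include <stdio.h>
    #include <Cg/cg.h>

    static const char *src =
        "float4 main(float4 color : COLOR) : COLOR { return color * 0.5; }";

    int main(void)
    {
        CGcontext ctx = cgCreateContext();
        CGprogram prog = cgCreateProgram(ctx, CG_SOURCE, src,
                                         cgGetProfile("arbfp1"), "main", NULL);
        puts(cgGetProgramString(prog, CG_COMPILED_PROGRAM));
        cgDestroyContext(ctx);
        return 0;
    }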

At any rate, shader model 4 is here to stay for a long time. Although DirectX 10.1 is just around the corner...
[Posted by:  | Date: 03/23/07 08:23:27 PM]

