AMD, Nvidia Slap-Fight Over DirectX 11

L33tsauce_Marty

New member
Jun 26, 2008
1,198
0
0
hansari said:
Virgil said:
George144 said:
What happened to DX10? We don't seem to have got much use out of it so far.
DirectX 11 is more like DirectX 10.5....
I HAVE A QUESTION!

You always hear about different game engines like Havok and FMOD...but they all seem to revolve around DirectX in some way.

Is there nothing else besides DirectX? I mean, I know about OpenGL...but of all the different middleware everyone is trying to create, is there nothing to replace DirectX? (Would replacing DirectX be silly, like replacing numerals in mathematics?)
It has to do with Windows. DX is from Microsoft.
 

Epifols

New member
Aug 30, 2008
446
0
0
I think a lot of the stuff that has come out in the last couple of years is just a giant rat race. Ever since I switched to gaming PCs, it's cost me $5000, caused a massive amount of frustration, and given a really shitty payoff.

It seems that technology has been advancing at a grotesquely slow pace. Sure, they're constantly announcing new hardware, but it's all rubbish.
 

Radelaide

New member
May 15, 2008
2,503
0
0
Kwil said:
George144 said:
What happened to DX10? We don't seem to have got much use out of it so far.
Bioshock is shur purty with it though..

And Batman's pretty sweet lookin' with PhysX too.

Yeah, I'm bringin' out my epeen 'bout my puter.. I get to do it so rarely. :)
I'll whip mine out too if you want a contest xD
 

jamesworkshop

New member
Sep 3, 2008
2,683
0
0
PhysX isn't proprietary.
GPU-accelerated PhysX is.
Either way, it's strange for Nvidia, since the 300 series is going to support DX11 anyway; it's not like they aren't focusing on it themselves, and everybody uses DirectX in the GPU market.
 

aaron552

New member
Jun 11, 2008
193
0
0
Treblaine said:
The move from one DirectX API to a higher one is never to improve efficiency (framerate) but to improve the quality of graphics; that inevitably has a hardware cost, but hardware gets more powerful and cheaper all the time.
Not entirely correct. The DX10 graphics pipeline is A LOT more streamlined than the DX9 pipeline. The reason DX10 mode runs slower is that the devs didn't just do a direct port of their DX9 engines; they also added a whole heap of DX10-only effects.
 

Jou-LotD

New member
Jul 26, 2009
43
0
0
This report made me laugh. I currently have an AMD card, but only because I'm running an old rig and Nvidia gave up on AGP a long time ago. Anyway, this reminds me of when AMD brought out DX10.1 and claimed that Nvidia, by not updating to it, was going to be left behind. As many pointed out, DX10 wasn't used much, and I don't think there's a single game that uses 10.1 exclusively. Of course Nvidia will move to DX11 when they need to, and they decide when that is, because they're the market standard. AMD doesn't set the pace; Nvidia does.
 

Danny Ocean

Master Archivist
Jun 28, 2008
4,148
0
0
hansari said:
*Awesome Picture, haha*
From what I understand of it, DirectX is the Windows standard hardware interface. Which, to me, basically lets coders ignore the large variation in different coding languages and hardware used on a PC, and focus on making the software. I think it goes like this, excuse the crappy art:

Software
^runs on
High-Level Programming Language
^runs on
Low-Level Programming Language
^runs on
DirectX/OpenGL (Depending on OS.)
^runs on
Really Low-Level Programming Languages for each piece of hardware.
^runs on
Binary
^runs on
Electricity.

I think... It's probably wrong, I'll drag some software kooks in here to tell us what's what.

TechTerms.com said:
DirectX is a set of standard commands and functions that software developers can use when creating their programs. While any Windows-based software program can include DirectX commands, they are usually used in video games. For example, developers may use DirectX for controlling video playback, sound effects, and peripheral input (such as a keyboard, mouse, or joystick). By incorporating DirectX functions into a computer game, programmers can use predefined commands to manage the video and sound of their game, as well as user input. This makes it easier for programmers to develop video games and also helps the games look more uniform, since DirectX games use many of the same commands.

Technically, DirectX is known as an application programming interface (API), which consists of predefined functions and commands. In order to create programs that use DirectX, software developers must use the DirectX software development kit, available from Microsoft. However, most users need only the DirectX "End-User Runtime" installed on their computer in order to run DirectX-enabled software. The DirectX API is available for Windows software and Xbox video games.
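The layering Danny describes boils down to an abstraction layer: the game codes against one generic API, and each vendor supplies a driver that translates those calls into work for its own hardware. A toy Python sketch of that idea, with every class and vendor name hypothetical:

```python
# Toy sketch of why an API like DirectX/OpenGL exists: the game talks to
# one stable interface, and each vendor's driver maps generic calls onto
# its own hardware. All names here are hypothetical illustrations.

class Driver:
    """Vendor-supplied translation layer (the role a real driver plays)."""
    def draw_triangles(self, count: int) -> str:
        raise NotImplementedError

class VendorADriver(Driver):
    def draw_triangles(self, count: int) -> str:
        return f"VendorA: submitted {count} triangles to command queue"

class VendorBDriver(Driver):
    def draw_triangles(self, count: int) -> str:
        return f"VendorB: batched {count} triangles via ring buffer"

class GraphicsAPI:
    """Stand-in for DirectX/OpenGL: one interface over many drivers."""
    def __init__(self, driver: Driver):
        self.driver = driver

    def draw(self, count: int) -> str:
        # The game only ever calls this; it never sees vendor details.
        return self.driver.draw_triangles(count)

# The same "game" code works against either card:
for driver in (VendorADriver(), VendorBDriver()):
    api = GraphicsAPI(driver)
    print(api.draw(1000))
```

The point of the pattern is the final loop: identical game code runs on either driver, which is exactly what DirectX/OpenGL buy developers.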
 

Geoffrey42

New member
Aug 22, 2006
862
0
0
Malygris said:
He was even blunter in response to a question about why AMD is focusing on DirectX 11 when "most games are on DX9 console ports," saying, "If NV was able to produce a DirectX11 card today, they'd be delivering a much different narrative on DirectX 11 to the press. If NV really believes that DirectX 11 doesn't matter, then we challenge them to say that publicly, on the record."
Oh, publicly, like... [a href=http://www.xbitlabs.com/news/video/display/20090916140327_Nvidia_DirectX_11_Will_Not_Catalyze_Sales_of_Graphics_Cards.html]this?[/a] I love "who's the bigger man" marketing rhetoric.

UberMore said:
As long as nVidia create a card that supports DX11 when it comes out, I'll be happy, or a driver that allows me to keep my now archaic 8600GT 256MB.
"When it comes out" is precisely the issue for Nvidia. They're having issues with their new GPU yields, and that's causing a major delay for their DX11 parts. In this case, AMD/ATI is essentially accusing them of trying to cover their own issues through these DX11 things, because Nvidia isn't going to have cards out in time for the first set of games that support it. Whether that actually matters is the whole issue; I certainly haven't seen any reason to rush out and replace my card just yet...

Virgil said:
DirectX 11 is more like DirectX 10.5. The major features involve better multithreading support and an interface for programmable GPUs. The funny thing is that both of them will run just fine on DirectX 10 hardware - they just need to be added to the drivers. This could mean some major performance gains for everyone running even moderate video cards with modern processors, so it can't really happen soon enough.

The major feature that requires hardware support is tessellation, which will be very cool, but not until it's pretty widely available. Even then it's mostly eye candy.
While those features could be added to the drivers, if the card doesn't have all of the features required for DX11, it won't qualify. The same things that disqualify current Nvidia cards from being 10.1 certified will also prevent them from being "upgraded" to DX11, since DX11 contains all 10.1 requirements, plus the new ones.

Do you think games are going to support 10.3 implementations of drivers (This is a card which supports this DX11 feature, and that DX11 feature, but not those DX11 features, so pretty please turn on the appropriate in-game processing...)? If not, then while they can implement those features through drivers, I don't see how that's going to do anyone any good when it comes to a DX11-game. If you've got more info on this, I'd be very interested to see it (not sarcasm, I mean it, I was just reading up on DX11 due to the 5870 reviews this week), otherwise, I think individuals like the one I'm about to quote are SOL with regards to DX11, and they just don't realize it yet.

odubya23 said:
If the general gist is that the current high-end graphics adapters will support DirX 11, then ATI is making a bunch of hooplah about nothing; my GTX 285 SSC will just convert to that with a driver update.
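The all-or-nothing qualification Geoffrey42 is describing can be sketched as a simple subset check: a card only earns the DX11 label if it exposes every required feature, so picking up some DX11 features via drivers doesn't upgrade it. A toy Python sketch, with feature names as hypothetical stand-ins for the real spec requirements:

```python
# Sketch of all-or-nothing API certification: a card counts as "DX11"
# only if it exposes *every* required feature, and DX11's requirements
# are a superset of DX10.1's. Feature names are hypothetical stand-ins.

DX10_1_REQUIRED = {"shader_model_4_1", "cube_map_arrays"}
DX11_REQUIRED = DX10_1_REQUIRED | {"tessellation", "compute_shaders", "shader_model_5_0"}

def qualifies(card_features: set, required: set) -> bool:
    # Subset check: the card must have all of them, not just some.
    return required <= card_features

# Hypothetical cards for illustration:
partial_card = {"shader_model_4_1", "compute_shaders"}  # some DX11 bits via drivers
full_card = DX11_REQUIRED | {"extra_vendor_feature"}

print(qualifies(partial_card, DX10_1_REQUIRED))  # False: missing cube_map_arrays
print(qualifies(full_card, DX11_REQUIRED))       # True
```

Because the DX11 set contains the whole 10.1 set, anything that fails 10.1 certification necessarily fails DX11 too, which is the point being made about current Nvidia cards.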
 

Asehujiko

New member
Feb 25, 2008
2,119
0
0
IdealistCommi said:
Hell, I don't know of many games that only use DX10, if any. Most still use DX9, and I think it will stay that way for a while.
Stormrise is currently the only game that requires DX10 directly instead of a [if os=not vista then cancel setup] piece of marketing BS. Additionally, it is a very shitty game.
 

Virgil

#virgil { display:none; }
Legacy
Jun 13, 2002
1,507
0
41
Geoffrey42 said:
While those features could be added to the drivers, if the card doesn't have all of the features required for DX11, it won't qualify.
That's definitely correct, the cards can't (and won't) ever be considered DirectX 11 cards, but some of the features of the API can still be used on hardware that's not fully compliant. Take a look at this presentation [http://developer.amd.com/gpu_assets/Your%20Game%20Needs%20Direct3D%2011,%20So%20Get%20Started%20Now.pps] (Powerpoint) for some more details. It's an overview of how game developers can use the DirectX 11 API on DirectX 9 & 10 cards and still get the architectural benefits, even if all the hardware rendering features aren't there.

hansari said:
Is there nothing else besides DirectX?
Danny Ocean said:
From what I understand of it, DirectX is the Windows standard hardware interface. Which, to me, basically lets coders ignore the large variation in different coding languages and hardware used on a PC, and focus on making the software. I think it goes like this, excuse the crappy art: ...
You're close enough - DirectX (or OpenGL) is how programmers tell the video cards (and other hardware) what to do. The system drivers are then responsible for turning DirectX/OpenGL calls into actual work for their specific piece of hardware. The end result of this (in theory) is that a programmer can know DirectX/OpenGL and not have to know exactly how every video card works, since the system will take care of those details. Anyone that remembers how much work you had to go through to get sound working in DOS games can appreciate DirectX/OpenGL :p

As for replacing it, it wouldn't really make much sense. Any replacement would also have to be supported at the driver level, which means all the major hardware vendors would have to get on board. The only viable replacement would be OpenGL, and the main advantage it offers is more platform compatibility, but at the expense of being generally less developed.
 

Virgil

#virgil { display:none; }
Legacy
Jun 13, 2002
1,507
0
41
Asehujiko said:
Stormrise is currently the only game that requires DX10 directly instead of a [if os=not vista then cancel setup] piece of marketing BS. Additionally, it is a very shitty game.
Because of the relatively poor penetration of Vista, and because DirectX 10 requires Vista's new driver model, most games that support DirectX 10 also have a DirectX 9 renderer built in. That means the use of DirectX 10 is typically limited to optional eye candy, not anything that would use those features for gameplay.

Hopefully Windows 7 will help change that - it's a great OS, and it's kind of stupid that we keep getting better and better graphics hardware that few games can really take advantage of.
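The dual-renderer arrangement Virgil mentions amounts to a startup-time fallback: ship both render paths and pick the best one the platform can actually drive. A minimal Python sketch, assuming a simplified view where DX10 just needs Vista-or-later plus the WDDM driver model (all names hypothetical):

```python
# Sketch of the DX10-with-DX9-fallback pattern: the game ships both
# render paths and chooses at startup based on what the OS and driver
# can supply. Platform and renderer names are simplified stand-ins.

def pick_renderer(os_name: str, driver_model: str) -> str:
    # DX10 requires Vista's WDDM driver model; anything else
    # (e.g. XP with the older XPDM) falls back to the DX9 path.
    if os_name in ("Vista", "Windows7") and driver_model == "WDDM":
        return "dx10_renderer"
    return "dx9_renderer"

print(pick_renderer("Vista", "WDDM"))  # dx10_renderer
print(pick_renderer("XP", "XPDM"))     # dx9_renderer
```

Since the DX9 path has to carry the whole game on XP anyway, the DX10 path can only ever add optional extras, which is exactly why DX10 ended up as eye candy.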
 

Geoffrey42

New member
Aug 22, 2006
862
0
0
Virgil said:
That's definitely correct, the cards can't (and won't) ever be considered DirectX 11 cards, but some of the features of the API can still be used on hardware that's not fully compliant. Take a look at this presentation [http://developer.amd.com/gpu_assets/Your%20Game%20Needs%20Direct3D%2011,%20So%20Get%20Started%20Now.pps] (Powerpoint) for some more details. It's an overview of how game developers can use the DirectX 11 API on DirectX 9 & 10 cards and still get the architectural benefits, even if all the hardware rendering features aren't there.
Thank you for the presentation. That was interesting. So, if I have this right, DX11 is inherently supportive of "profiles" for older hardware. The dependencies then become the hardware people writing drivers that support what features they can (which seems likely), and that the software developers implement their code to support all features available to that hardware profile (which seems likely for new games, at least). Not automatic, but it also doesn't seem far-fetched that developers will put in the effort, since it should be beneficial for everyone involved.

Yet more reason for people with DX10/.1 cards not to care about DX11 cards, though.
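Those "profiles" (Direct3D 11 calls them feature levels) can be sketched as negotiating the highest level the hardware reports and then gating effects on it. A toy Python sketch; the level names mirror D3D11's feature levels in spirit, but the effect gating is made up for illustration:

```python
# Sketch of feature-level negotiation: ask for levels from newest to
# oldest, take the first one the hardware supports, then gate optional
# effects on that level. Effect names are hypothetical.

FEATURE_LEVELS = ["11_0", "10_1", "10_0", "9_3"]  # newest first

def negotiate(supported_by_hardware: set) -> str:
    for level in FEATURE_LEVELS:
        if level in supported_by_hardware:
            return level
    raise RuntimeError("no usable feature level")

def effects_for(level: str) -> list:
    # Lower rank = newer level; compare ranks, not strings.
    rank = FEATURE_LEVELS.index(level)
    effects = ["basic_rendering"]
    if rank <= FEATURE_LEVELS.index("10_0"):
        effects.append("geometry_shaders")
    if rank <= FEATURE_LEVELS.index("11_0"):
        effects.append("tessellation")
    return effects

level = negotiate({"10_0", "9_3"})  # a hypothetical DX10-class card
print(level, effects_for(level))    # 10_0 ['basic_rendering', 'geometry_shaders']
```

This is the "not automatic, but likely" part: the driver has to report the level honestly, and the game has to bother implementing each tier.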
 

thiosk

New member
Sep 18, 2008
5,410
0
0
It's adorable that ATI is trying to smear Nvidia over PhysX being an unpopular format while simultaneously suggesting how important it is to be running DX11 RIGHT NOW.

AMD is a bunch of silliness, and their best business plan is to get Intel sued on the basis of anti-american antitrust laws.
 

jamesworkshop

New member
Sep 3, 2008
2,683
0
0
thiosk said:
It's adorable that ATI is trying to smear Nvidia over PhysX being an unpopular format while simultaneously suggesting how important it is to be running DX11 RIGHT NOW.

AMD is a bunch of silliness, and their best business plan is to get Intel sued on the basis of anti-american antitrust laws.
DirectX 11 is only going to be important when Windows 7 gets released and more than about 10 games actually use it.
I don't know why PhysX seems to be unpopular; standard PhysX effects like ragdoll are handled by any CPU, including on the consoles, considering it's the default physics engine for Unreal Engine 3.
And the GPU acceleration is always an optional extra; plus, it seems strange to argue that a feature that makes games better graphically doesn't belong on the graphics card.
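The split jamesworkshop describes, baseline physics everywhere with GPU work as an optional extra, can be sketched like this in Python; the simulation step and effect names are hypothetical stand-ins:

```python
# Sketch of PhysX-style optional GPU acceleration: the core simulation
# (ragdoll, rigid bodies) always runs on the CPU, and the engine only
# adds the extra effects work when a capable GPU is present.
# All names and the trivial Euler step are hypothetical stand-ins.

def step_physics(bodies: list, gpu_available: bool):
    backend = "gpu" if gpu_available else "cpu"
    # Baseline simulation runs everywhere (PC, consoles): a toy
    # gravity integration step stands in for the real solver.
    for body in bodies:
        body["y"] += body["vy"] * 0.016  # ~60 fps timestep
    # Optional eye candy (cloth, debris particles) only when accelerated.
    extras = ["cloth", "particles"] if backend == "gpu" else []
    return backend, extras

bodies = [{"y": 0.0, "vy": -9.8}]
print(step_physics(bodies, gpu_available=False))  # ('cpu', [])
print(step_physics(bodies, gpu_available=True))   # ('gpu', ['cloth', 'particles'])
```

The gameplay-relevant simulation is identical on both paths, which is why the GPU layer can stay optional without splitting the player base.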
 

Geoffrey42

New member
Aug 22, 2006
862
0
0
thiosk said:
It's adorable that ATI is trying to smear Nvidia over PhysX being an unpopular format while simultaneously suggesting how important it is to be running DX11 RIGHT NOW.
PhysX being unpopular is entirely AGEIA/Nvidia's fault, for going from ridiculously proprietary, to somewhat less proprietary. If you want PhysX in more games, then make PhysX more widely implementable. At least with DX11, anyone who wants to make a card, or a game, that is compatible with it, can. AGEIA made their silly "this card is way better than a graphics card, this couldn't possibly be done without a dedicated card" attempt, and failed, and Nvidia picked them up, and said "we managed to implement their fancy stuff in CUDA, but it couldn't possibly be done otherwise". There's nothing so special about CUDA that ATI couldn't implement PhysX directly into their setups, but Nvidia is grasping tightly to what they perceive as a competitive advantage. All the while, shooting PhysX in the foot. I think the biggest problem is that Nvidia is trying to profit off PhysX in too many ways: as a licensor of the physics engine to game developers, and as a bullet-point on their graphics cards. It only makes sense for game devs if a broad enough portion of the market can take advantage of it, and it only makes sense for buyers of video cards if enough games take advantage of it. If they'd pick one (and I think the licensing end is probably more easily profitable), they'd be far better off.

AMD/ATI taking advantage of Nvidia's GT300 yield problems, and using it to differentiate themselves from Nvidia? That just makes sense to me. I don't think it matters all that much, but ATI has the single-chip performance crown for the moment, and they've got a broader feature set in terms of DirectX support. Wouldn't you be crowing about that, if you were in their shoes? I would also point out that unlike the 10-10.1 conversion, which Nvidia chose to stay out of (because it obviously wasn't worth bringing out a whole new chip to support), their current posturing on DX11 is entirely because they've got issues with their new cards. They're not choosing to remain with their current chip longer because DX11 isn't worth it. If anyone's the hypocrite, it's them.

EDIT: Disclaimer - Everything I just said was not sourced; this is just my understanding of the AGEIA/PhysX/Nvidia/CUDA situation based on articles I've read over the last few years. My interest in hardware minutiae fluctuates based on chronological proximity to a new computer purchase, so everything above, if it doesn't sound right to you, could probably use some fact-checking.
 

jamesworkshop

New member
Sep 3, 2008
2,683
0
0
Geoffrey42 said:
thiosk said:
It's adorable that ATI is trying to smear Nvidia over PhysX being an unpopular format while simultaneously suggesting how important it is to be running DX11 RIGHT NOW.
PhysX being unpopular is entirely AGEIA/Nvidia's fault, for going from ridiculously proprietary, to somewhat less proprietary. If you want PhysX in more games, then make PhysX more widely implementable. At least with DX11, anyone who wants to make a card, or a game, that is compatible with it, can. AGEIA made their silly "this card is way better than a graphics card, this couldn't possibly be done without a dedicated card" attempt, and failed, and Nvidia picked them up, and said "we managed to implement their fancy stuff in CUDA, but it couldn't possibly be done otherwise". There's nothing so special about CUDA that ATI couldn't implement PhysX directly into their setups, but Nvidia is grasping tightly to what they perceive as a competitive advantage. All the while, shooting PhysX in the foot. I think the biggest problem is that Nvidia is trying to profit off PhysX in too many ways: as a licensor of the physics engine to game developers, and as a bullet-point on their graphics cards. It only makes sense for game devs if a broad enough portion of the market can take advantage of it, and it only makes sense for buyers of video cards if enough games take advantage of it. If they'd pick one (and I think the licensing end is probably more easily profitable), they'd be far better off.

AMD/ATI taking advantage of Nvidia's GT300 yield problems, and using it to differentiate themselves from Nvidia? That just makes sense to me. I don't think it matters all that much, but ATI has the single-chip performance crown for the moment, and they've got a broader feature set in terms of DirectX support. Wouldn't you be crowing about that, if you were in their shoes? I would also point out that unlike the 10-10.1 conversion, which Nvidia chose to stay out of (because it obviously wasn't worth bringing out a whole new chip to support), their current posturing on DX11 is entirely because they've got issues with their new cards. They're not choosing to remain with their current chip longer because DX11 isn't worth it. If anyone's the hypocrite, it's them.

EDIT: Disclaimer - Everything I just said was not sourced; this is just my understanding of the AGEIA/PhysX/Nvidia/CUDA situation based on articles I've read over the last few years. My interest in hardware minutiae fluctuates based on chronological proximity to a new computer purchase, so everything above, if it doesn't sound right to you, could probably use some fact-checking.
Sounds quite on the money, but DX11 doesn't mean anything until Windows 7 gets released, since DX11 will be added to Vista with a Windows Update, so Nvidia still has until the end of next month to pull their finger out instead of talking rubbish.

PhysX is a tricky one, as the SDK supports software physics, and is thus available on PC, Wii, PS3, and 360 and is the default physics of the Unreal engine, but it also supports hardware-accelerated physics on their 3D cards.
PhysX won't last, but the experience will have been extremely valuable even if CUDA isn't the programming language used to write its replacement; plus, they must have made some money off it.

The 5870 is a nice card, but at £300 it's a little pricey compared to their own 4890 (half the price), considering the games for 2010 are:

Bioshock 2
StarCraft 2: Wings of Liberty
Diablo 3
Guild Wars 2
Assassin's Creed 2
Left 4 Dead 2

Basically nothing that's going to challenge today's £100 to £150 graphics cards; the 5000 series and Nvidia 300 series are dead in the water, IMO.

[H]ard OCP
Wolfenstein Gameplay Performance and IQ Article

State of the Game

Wolfenstein comes to us as yet another entry in a growing catalog of games which bear distinct signs of being console-focused titles. The lightweight graphics, memory, and processor requirements, the nonfunctional anti-aliasing, the clumsy menu system, and the mere handful of customizable graphics options all combine to show us that this chapter of B.J.'s World War II exploits was geared for the console and only adapted for the PC. While it can be argued that it makes sense from a business perspective for developers to focus on console development first, it does leave PC gamers wondering what exactly is next for us.

Further, it throws into sharp relief the dubious value of escalating the GPU arms race yet again. We are rapidly approaching the unveiling and launching of new silicon from both AMD and NVIDIA, and we are forced to wonder exactly what the point is. If a $200 video card will play so many new games (Wolfenstein, Call of Juarez: Bound in Blood, Ghostbusters, Demigod) so brilliantly on a 30" monitor with the highest resolution and graphical fidelity right now, and if this trend is going to continue (which it seems to be), why do we need to invest in new graphics cards with the frequency at which NVIDIA and AMD would have us believe we should?

Without a doubt, more demanding technology showcase games like Crysis and Arma II will continue to be developed. But does it really make sense in the current economy to invest in yet more graphics horsepower for such a small clutch of tech-heavy games among such a wealth of mediocrity?

http://www.hardocp.com/article/2009/09/01/wolfenstein_gameplay_performance_iq/10

Hell, I can't even see the need to upgrade my 9800GTX, and that's from April 2008, when it was released.
 

InsertEvilLaugh

New member
May 28, 2009
33
0
0
DX10 was a failed experiment, as was Vista. DX11 is a performance boost, not a hit like 10, and Windows 7 will be worth it.
 

Dommyboy

New member
Jul 20, 2008
2,439
0
0
Mmm, can't wait for these tech demo games! After all, it seems graphics are everything these days.
 

Audioave10

New member
Mar 24, 2010
509
0
0
Since they make games first for the console (that's where the money is), advanced tech is slow until the new consoles are released. That may be 3 years away.