Ultratwinkie said:
Jesus Christ, you move goalposts like a theist.
So suddenly we dropped power now? Well I guess I was right.
However, Nvidia made its views very clear:
http://www.extremetech.com/gaming/150892-nvidia-gave-amd-ps4-because-console-margins-are-terrible
http://www.maximumpc.com/article/news/nvidia_calls_ps4_%E2%80%9Clow_end%E2%80%9D123
or are you going to do the same shit and "speculate?" Like you have for the last fucking page?
I cited Nvidia. They explained that they did not have the resources to undertake the project because they would have had to pull them away from other projects. Too many irons in the fire doesn't mean they wouldn't have taken this one on had they been capable of juggling it too. An SoC is something Nvidia could have done, but they really haven't been going that route, so this would have been new ground for them. Not to mention that they'd have to dance around AMD's patents in one of the few areas where AMD is stronger.

Nvidia made $500 million off the 7th generation, and that was with only one console. That's pretty good for one chip. Let's say the profit goes down to a fifth of that. What do you think they typically make on any given chip? AMD won all three console companies. They stand to do VERY well off of this, a fact you dismiss because AMD was on shaky financial footing at the time of the deal. Even companies that are doing poorly aren't going to take huge losses just because. Companies would declare bankruptcy or even cash out and go out of business before taking a loss willy-nilly.
Nvidia wanted more money than what was being paid. They refused to make consoles.
Yep. So they weren't able to produce the product in a way that was deemed worth their time, in contrast to AMD, who was. Same argument as the first.
AMD was desperate, so they took the contract to band aid their money loss.
By "band aid" do you mean to make a profit? Again, AMD was already geared up to design more SoC solutions. If nothing else, this funded the very expensive R&D process they were already going to undertake. It's a business no-brainer for them, whereas Nvidia wasn't planning to go that route, so you were looking at a much lower margin on their end.
This isn't saying Nvidia did anything bad. This is just saying that the deal was much more attractive to AMD than it was to Nvidia.
Look at it this way. Nvidia would have had to incur unexpected costs designing an entirely new SoC solution that they weren't planning on making. It isn't an area they're particularly strong in yet, so it would have taken them even more money to get up to speed. Add that to what I'm sure is a lower profit margin on the chip and you could be looking at a much lower profit than most of their other projects would give them.
Then look at AMD, who was already going into this area of research. Even if they take a loss on this overall, it will still significantly reduce the cost of R&D on this chip, which enables them to start releasing SoC solutions for products that would benefit from them. So this new research gives them a nice edge in SoC projects going forward, whereas Nvidia's business model isn't as invested in SoC.
This deal was nothing but good for AMD, and for Nvidia it was nothing but a risk with a certain, high opportunity cost: pulling resources away from more lucrative projects. It isn't that Nvidia didn't want console business, and it isn't that AMD was desperate. It just made sense for both companies to take the routes they did. The only loss to Nvidia is that their cards won't be as specifically supported anymore, but they're still an incredibly common card manufacturer, so it's not like their cards won't be supported to the point of working. Just not optimized, in several cases.
It's a common stage of loss to belittle the value of the opportunity lost. Even for teams and companies we root for, it's a simple psychological defense mechanism to pretend the "other guys" didn't really get that good of a deal. Maybe you're doing that and maybe you're not, but this was a great deal for AMD even if it wasn't as good a deal for Nvidia. AMD would have been dumb not to take the deal, whereas Nvidia made a calculated decision.
I'm just not a fan of either company. You might as well be talking about chair companies to me. Had I found a comparable Nvidia card at a similar price in the same performance range, I would have been more likely to buy that.
On top of this, you seem to fail to understand hardware at all.
Physx is software, not hardware. The extent to which it is used, and the amount of processing it utilizes, is entirely up to the developers implementing it. Think of it like particle physics. People like to use Physx for steam rendering, and you can determine how detailed the steam is according to the available resources. This is why low-to-ultra PC settings matter: you are basically telling the game how detailed the physics, along with everything else, can be. That doesn't mean the lowest setting doesn't have any physics or detail. Just that it's far less in comparison.
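Here's a toy sketch of what I mean (hypothetical names and numbers, not actual Physx API calls): the same engine runs at every setting, and the quality slider just changes the workload it's handed.

// Hypothetical sketch: a graphics setting scaling a physics workload.
// None of these names or numbers come from Physx itself.
#include <cstdio>

enum class PhysicsDetail { Low, Medium, High, Ultra };

// Map the user-facing setting to a particle budget for one steam emitter.
int steamParticleBudget(PhysicsDetail d) {
    switch (d) {
        case PhysicsDetail::Low:    return 250;    // coarse puffs, cheap to simulate
        case PhysicsDetail::Medium: return 1000;
        case PhysicsDetail::High:   return 5000;
        case PhysicsDetail::Ultra:  return 20000;  // dense steam that reacts to movement
    }
    return 250;
}

int main() {
    // Low still has physics and detail; there's just far less of it.
    std::printf("Low: %d particles, Ultra: %d particles\n",
                steamParticleBudget(PhysicsDetail::Low),
                steamParticleBudget(PhysicsDetail::Ultra));
}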
Physx lived when PC got no support from consoles. Outdated consoles that would never handle it. For 7 years.
Are you now claiming that the consoles have never been able to "handle" it? Seven years is the lifespan of those consoles, and you seemed to indicate that they were somehow cutting edge at release.
Anyways, you are misunderstanding what physics engines are for. They aren't necessarily more resource-demanding; in fact, for certain processes they can be less resource-demanding than the custom-made physics code that development studios write for their own games. Take ragdoll physics: a game could lean on Physx or Havok for that if its own code is less efficient. True, these engines are usually meant to add effects on top of the vanilla game, but not always.
Nvidia benefits from people working with consoles that have Nvidia hardware because Physx was still an option there. It makes it easy for developers to use Physx to a lesser degree on the console and then turn it the heck up on the PC. Metro: Last Light used Physx for persistent particles (blow a tile off a column and the pieces stay visible on the ground) and for some steam/fog interaction (the steam is there in all versions, but with Physx the fog dissipates as characters move through it). These are minor things that require a more detailed engine coupled with the hardware to use it. However, if you're already developing for an Nvidia card it does you no harm to include this as a feature that can be toggled on or off, something like the sketch below. If you are developing only for AMD cards, it does require extra steps to enable it.
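A rough illustration of that toggle (all names here are made up, not any engine's real API): the base effect ships in every build, and the extra Physx layer only runs when the hardware supports it and the player switches it on.

// Hypothetical sketch of the on/off layering described above.
#include <cstdio>

struct VideoOptions {
    bool gpuPhysxSupported;   // detected from the installed card
    bool advancedPhysxOn;     // the player's menu toggle
};

void spawnBasicDebris()         { std::puts("base debris (all platforms)"); }
void spawnPersistentParticles() { std::puts("extra layer: tiles stay on the ground"); }

void blowTileOffColumn(const VideoOptions& opts) {
    spawnBasicDebris();  // every version of the game gets this
    if (opts.gpuPhysxSupported && opts.advancedPhysxOn)
        spawnPersistentParticles();  // turned "the heck up" on capable PCs
}

int main() {
    blowTileOffColumn({true, true});   // high-end PC
    blowTileOffColumn({false, false}); // console or unsupported card
}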
That is why Nvidia went ahead and contributed the drivers. Yes, this allows the use of Physx on the consoles, but it also makes it infinitely easier for developers to implement the utility if they want to. However, and this is a major issue at the moment, there's not that big of a difference between Havok's newly released physics engine and Physx:
<youtube=v1hoVCZZOd0>
That's real time on the ps4 hardware with a million objects.
If the new engine can already do physics this well with a million objects in such a varied environment, then it can do everything that Physx is currently being used for. There is a huge difference between Havok pre-2013 and Havok now. As stated, they released a new version last year that drastically improved performance.
<youtube=sS0Fqx_zxf8>
Cut to around 0:48 to skip the silly fluff and see Havok 2012 compared with Havok 2013. The performance difference is staggering for the same task.
There's a reason why the majority of AAA studios that use any middleware at all use Havok. It's a lot more user friendly, with a ton more interface tools. Until 2013, Physx was the better software. Now I'm not sure which is better at all. They may even have individual strengths and weaknesses for all I know, but one isn't necessarily better than the other.
Now that it's easier to port to PC, physx will have a much easier time. In fact, you even said that despite no support from consoles physx was still being used more than havok. Nice way to backtrack on your own damn statement. Regardless of who uses it, it's still popular on PC. In fact, the 2nd biggest community on steam is Russian.
Actually, my statement has been that only something like 3 meaningful games a year actually use any component of Physx. I presented Nvidia's own chart to discredit your statement that more games used Physx. Havok outscores Nvidia in every game tier except for the shit pile: the games that score less than 50 on Metacritic or don't have a score at all. How can you even pretend to tout these numbers as meaningful when the games scoring more than 50 on Metacritic are firmly in Havok's corner, 154 to 109? Sure, the shit pile has 81 for Physx to Havok's 27, but really? I'd consider those all to be detractors.
But you're basically saying that Batman: Arkham Origins using Physx is no different than, say, a game that wasn't even popular enough to be reviewed by critics, or one that scored below 50? There is a difference and you know it. As the quality of the game goes up, the number of games made with Physx drops drastically: from 81 in the shit pile to 8 in the Excellent tier. Havok follows more of a bell curve, with both shit and excellent in the 20s and the majority of titles in the middle. But still a MUCH higher weighted average than Physx. As a Physx fan, aren't you even a little embarrassed by that chart?
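Run the chart's own numbers (assuming the Excellent tier is counted inside the 50+ figures): Havok has 154 titles at 50+ out of 154 + 27 = 181 listed, so about 85% of its catalogue clears the bar. Physx has 109 at 50+ out of 109 + 81 = 190 listed, so about 57%. That's the weighted-average gap in one line.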
What's more, these are only games that support those engines, not games that necessarily use them. Most of the games that support Havok actually use it, while a sizeable chunk of the games that support Physx do so in name only, without using any of its modules. Weird, huh?
Only PC matters to nvidia. Physx is marketing to PC, which is why it's free for PC developers.
And yet, the Source engine went with Havok. So Havok had its hand in Portal, Half-Life, The Stanley Parable, and several other significant PC games. Do you have any more recent numbers (and perhaps any more recent quality comparisons) that would indicate a significant change of some kind?
If the 7th generation console circle jerk didn't kill physx, nothing will. It's a PC gamer feature, and wasn't meant for consoles beyond exceedingly basic things. If physx could survive, so could TressFX. Which AMD threw away, and it's the only thing they could use to contest physx's dominance on PC. The market they desperately need to stay afloat.
Hair physics. It is literally an engine solely devoted to how hair behaves. That is not a competing physics engine; that's ridiculous. It's like saying a radiator manufacturer is in direct competition with a car manufacturer. AMD itself is in competition with Nvidia, but it isn't anywhere close on the physics engine front. And AMD is not only competitive in the PC market, it actually has a larger market share than Nvidia.
AMD in consoles is meaningless; developers haven't coded for hardware since the early 90s. We've been over this. Device drivers allow software to run regardless of the specifics of the hardware. As long as they make the software itself run without memory leaks and other issues, it's fine. Consoles are literally meaningless to PCs. If anything, console hardware only makes PC gamers upgrade to much more powerful cards that make consoles look like a joke, which is Nvidia's domain.
The number of units sold means everything to a hardware manufacturer. Do you think Nvidia gives a crap about how many people use their drivers when it's their cards people actually pay for? According to that Forbes article, AMD is still outselling Nvidia in the PC market.
Hell, by your own logic the 7870 should be running sales like crazy. Except by steam's own stats it's less popular than the absolutely ancient GTX 210.
What logic are you talking about? Why should the 7870 be selling more, and why do you think I indicated that? The card in the ps4 is not the 7870; it's an SoC version of it.
As for the GTX 210 comment, you do realise that the GTX 650 Ti is also under it, right? The 7850 is the 12th most common card surveyed in the month of February, too. What are you trying to draw from this? There are 13 cards right there within 0.1% (that's a tenth of a percent, not 10%) of each other in overall market share, with the 650 Ti and 7870 both right beside that 210. So what's your point? These are single cards that each own almost a full percent of the entire Steam community's market share. Heck, Intel's 4000 and 3000 series are the two most common cards here. If anything, that should tell you that the Steam survey says almost nothing about card quality. A full 36.24% of the cards surveyed don't even make the list, because they each hold less than 0.50% of the market share. Even Intel Ironlake (Mobile) is at the 1.23% mark.
What part of anything I said would indicate that a 7870 would magically sell more than other cards? I'm just surprised that the 7970 is that much higher.
And it's hilarious how your own Forbes news post has a link to steam stats that say the exact fucking opposite of what it claimed. This is hilarious.
You do realise that they only referenced Steam to point out that Intel has the first and second spots among the most common cards, right? Those two Intel cards make up almost 9% of the Steam market by themselves. If you throw in the 4th most common card (HD 2000), you've got one brand already showing over 10%. The writer was just saying that they couldn't leave Intel out of the discussion, and that's why.
But what the article was actually talking about was market shares for the year. Actual shipment rates and not surveys. Real numbers. Not what people currently have in their machines but what people were buying:
http://jonpeddie.com/back-pages/comments/intel-and-nvidia-raised-shipments-levels-for-the-quarter/
Now, Nvidia is increasing in shipments and it looks like AMD is decreasing. But at the moment and last quarter and last year, AMD sold and shipped more units.
But I'm not sure what the point of this part of the discussion is. I think AMD still F-d up their business management. I think you're confusing my saying that AMD makes a legitimate product with my saying that it's somehow better than Nvidia. I think AMD has made significant business mistakes, and those will drown them if they don't make the appropriate changes. But their cards are fine.
Physx being on the ps4 is meaningless. Without CUDA it wouldn't work the way devs wanted it to, which is the higher end physics. In fact, Physx was licensed for the xbox 360 and ps3. When was the last time you saw physx on a console? In its full glory? Never. When was the last time you saw a 7th gen PC game have physx? Multiple times, from AAA games running a lot of what physx has to offer.
In its full glory? I haven't seen that yet, since Nvidia locked it to CPU processing. But Nvidia's statement was that it would be fully functional on the ps4. Time will tell if they're lying, but developers won't be fooled.
http://www.pcgamer.com/2013/03/09/is-tomb-raiders-performance-trouble-the-herald-of-things-to-come-for-nvidias-gamers/?ns_campaign=article-feed&ns_mchannel=ref&ns_source=steam&ns_linkname=0&ns_fee=0
Even Source isn't an ace in the hole when that engine is ancient and on its way out. Source doesn't even support the new stuff that havok supposedly uses.
Depends on what Source II is built with. Considering the drastic changes to Havok in 2013, it wouldn't be crazy for Valve to use it, but I also wouldn't be shocked if they did everything on their own this time around.
Physx wouldn't make its debut on consoles, not in its full form.
It's possible that they will relegate it to just the CPU again. We'll have to see if that has changed. Nvidia could stand to lose a lot of ground if games like Tomb Raider keep coming out developed in ways that benefit AMD. And I don't think having Physx available would benefit Nvidia much at all if it can't really be used.
And it's working. Physx wouldn't actually exist on consoles, but on PC it can actually be turned on. At best it's a trojan horse to market Nvidia's tech on AMD's only recent accomplishment.
Perhaps, but Physx has actually been used in ps3 games. It's the GPU-based version that so far hasn't been.
AMD however made a horrid decision to hand out TressFX to everyone else. They could have built on it and actually had something to compete with physx. They could have made extras to sell to the market they have been losing. Instead, they handed their greatest potential to their greatest enemy.
You know, you keep saying this and I keep not caring, because it's just hair physics. As far as I'm concerned, AMD is light years behind Nvidia in the physics engine department, and I'm actually impressed they managed to put anything competent together. The thing is, they don't have to be, with Havok and custom engines being all over the place. But because I don't care, I haven't really asked what you mean by "they gave it away," which your next sentence touches on.
It doesn't matter if it's just hair; it could be something they could actually market and build on. Tomb Raider used TressFX. They could have even put it on the consoles and AMD cards exclusively.
Wait, you think that because developers patched the issues with TressFX on Nvidia and Intel cards, this means AMD "gave it to them"? That's silly. What's important for AMD is that it is optimized for their cards. Or do you not remember the embarrassment that was Tomb Raider's performance on the Nvidia line?