The Order: 1886 Reveals Actual Gameplay Footage

Lightknight

Ultratwinkie said:
GOW selling less than the first GOW. That isn't an achievement. All you proved that the sales were front loaded and failed to reach penetration level sales.
What? "Sales were front loaded"? Every game sells the most copies in the first couple of years and then fewer and fewer as the years go on with a few spikes here or there. All of the GOW games did the same and GOW 3 is further ahead at this many years in than any of the others.

What do you think "penetration level sales" are? GOW 3 is the 19th best seller on the 360 for the entire generation. Out of roughly 3,500 games, it outsold about 3,481 of them. It is one of only 6 games released after 2010 to be in the top 20. Had it been released on the ps3 it would have been 15th on that list (ironically, just ahead of Red Dead Redemption there as well).

In what world is that not "penetration level sales", and what is the big difference between number 18, which is GOW 1, which you admit did splendidly, and number 19, which is GOW 3? They're even likely to switch places within the month. I get it if you don't want to admit that you're wrong, but the numbers couldn't be less in your favor. You can't look at a game that was one of the best sellers and somehow claim that it didn't sell well enough. That is axiomatically wrong.

And the cards I would have picked would have been 100x better. If they picked the 660 or even the 760 we would have much more range.
Why would we be better off with weaker cards? I think you're just arbitrarily favoring one brand over another without admitting that both brands are legitimate companies. Development studios design their own game engines anyway, and it's the game engine that determines the physics and relies on the power of the card to process it. PhysX is just a physics engine. Consoles, PCs, and iOS devices use Havok as their physics engine, and it's a legitimate competitor to PhysX.

But, and I can't drive this point home enough for you, the PS4 DOES have PhysX and APEX support. NVIDIA decided to give it to them anyway (because it would be dumb/silly for them not to).

http://www.ign.com/articles/2013/03/07/nvidia-announces-physx-support-for-playstation-4
http://www.vg247.com/2013/03/08/ps4-nvidia-pledges-physx-support/

So, with AMD cards offering better overclocking plus higher memory bandwidth and more video RAM, and PhysX support on top of that, I can't think of a better scenario for the consoles; it's the best of both worlds. Why would NVIDIA do that? Because it means more games end up using PhysX, which pays off for them on PC.

Hell, the 660 isn't actually high end. Its lower tier on the nvidia scale. Its even middle tier on AMD's scale. This card is old. The reason its in the high end card range is because nvidia consistently blows AMD away on the majority of cards. There is no contest. There was even evidence of that on the steam stats page that I referenced.
It doesn't matter. The 660 is a less powerful card than the HD 5870 and the ps4 uses a modified 5870.

What you've got to understand is that both consoles going AMD means that development is going to start supporting that card brand a lot more. NVidia had benefited from people relying on its CUDA cores, which is also part of what made NVidia cards so much more expensive, but with these major companies going with AMD, developers will start relying on OpenCL (which AMD backs), and that's great. You're way overblowing the benefits of PhysX, especially when Havok (used by games like The Last of Us, Uncharted, and Bioshock) exists and is excellent as is. Even with machines currently having PhysX, far fewer games use it than Havok.

The GTX 750 Ti is slightly slower than the 5870 too, by the way. You're literally arguing for a cheaper, shittier card just because you like the company more. But since Havok is CPU based and PhysX is video card based, it simply doesn't matter whether the video card has PhysX or not since Havok is there anyway, though I've already made this a moot point.

I'm more than a little upset that I didn't research the topic a little more before getting dragged into this or I could have sidestepped this part days ago.

And its hilarious how you miss that the skyrim specs have proven you wrong. The 7800 is the same as the ps3 and it runs it at the same level. No optimization ever made it better than the dedicated card. Which means your idea of console optimization is grade A bullshit. Again.
You do realize that there are other specs in a computer besides the GPU, right? Like RAM? Skyrim's minimum requirement is 2 GB; the PS3 has two pools of 256 MB.

I don't know about you, but even on low Skyrim found its way above 1GB on my system. While having a lot of extra RAM doesn't really improve performance, having too little is crushing. They have to make up for it elsewhere, and that gap is spanned by optimizations and special accommodations made specifically for consoles in a way that would never be done for PC owners.

and looked closely,
If you have to look closely in 1080p then who the hell cares? We're not looking for ghosts in the window at the back of a picture.

No matter how you spin it, these consoles are entry level. They don't want to splurge anymore, so they cut everything to the bone.
Do you remember what happened to the PS3 last generation? It came in at $600 and still lost money and the sales tanked because people weren't willing to pay for it. This is market driven.

I'll agree that it isn't as far ahead of the overall market as last gen's consoles were, but it's still well above the average gaming machine right now and a heck of a lot for the price point.
 

Lightknight

I take it this means you now agree that GOW 3 wasn't somehow a failure. Glad we could come to an agreement on something over the internet. It's pretty hard to keep claiming that the 19th best-selling game of the entire 360 console generation somehow underperformed. I am most pleased that I can stop defending a series I don't like. Hopefully that line of discussion will put into perspective that this being a cover-based shooter isn't an auto-fail.

Ultratwinkie said:
The 5870? You're saying the next gen card is a 5870? A card from 2009? The 760 beats that.
Clearly a typo. We were discussing the amd 7870 (the PS4's equivalent GPU now that it has finally been reverse engineered [http://www.extremetech.com/extreme/171375-reverse-engineered-ps4-apu-reveals-the-consoles-real-cpu-and-gpu-specs]). Sorry for the confusion but surely you'd have been able to figure that out from context.

Anyway, it turns out the ps4 is actually a modified 7870. So it should be even more powerful than the 7870, which itself is the 27th most powerful card on the consumer market at the moment (at least among those Passmark has tested).

The difference between the 7870 and the 760 Ti is negligible. It's 4,258 vs 4,387, so around 130 points of difference when both are over 4,000. Even then, the 760 Ti is OEM only. Quick, go find a link to be able to purchase it. As far as I can tell it still isn't even available despite having been made some time ago. From what I was told, it was supposed to just be a rebranded 670, which is a great card, but it certainly isn't performing like one.
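To put that gap in perspective, here's the quick arithmetic on the PassMark scores quoted above (just a throwaway Python check, nothing more):

# PassMark scores quoted above
hd_7870 = 4258
gtx_760_ti = 4387

delta = gtx_760_ti - hd_7870
print(delta)                      # 129 points
print(f"{delta / hd_7870:.1%}")   # about a 3% difference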

So please explain why Sony would spend 33% more on that card for less than a 3% increase when 7870s have all the advantages of AMD GPUs and can still support PhysX, which you were touting as Nvidia's big dynamite difference? This is all win and no loss.

Not only that, but all of these cards are in the top 30 performers at this time (4 months after release). You really don't have an argument here. The ps4 isn't cutting edge but it's darn nice for the price point. Do you recall what happened to the ps3 sales when they reached for that 33% more expensive card?

The reason you use 1-2GB of RAM is because of texture difference. Which is why I said look closely at the textures and you see the difference. RAM isn't power related, its memory related. More ram means more things to keep track of. Which as I already told you the PC has a lot of. Not counting much higher draw distance.
Here's where things get weird: the ps3 is a bit more proprietary than that. You are technically right here, but the ps3 also used asset categories to help deal with this kind of thing. The asset categories were generally responsible for tracking objects, while the RAM would have assisted with asset quality (aka textures). You may recall Skyrim having a significant problem on the ps3 for something like 6 months while Bethesda scrambled to fix the issue (again, this was the reason I built my pc). The actual issue was that the individual asset categories were getting too bloated and the ps3 was crashing because of it.

I've worked as a QA engineer on software for some time and the problem was known pretty early, so, since I had the game, I tested it. On release, no assets were resetting. You could go back into any dungeon you'd cleared after years of in-game sleeping and still see bodies and arrows and objects littering the ground. Most outdoor bodies/enemies would refresh but not dropped weapons. The flowers would also refresh. Nirnroots were stacking blooms each time you picked one, and enemies you had encountered but not engaged were still being tracked across the world. The large file sizes people were talking about weren't the problem, they were just indicative of the issue. Basically, if any one of those asset categories got too bloated, the system would crash.

Now, those categories aren't RAM, but they are what keeps track of the objects. More RAM would have been better/cheaper, but I guess they'd already spent so much on the card that more RAM would have broken their backs, with the initial price point already being $600 back then.
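If it helps, here's a purely hypothetical sketch of the failure mode I'm describing; the cap, names, and structure are all invented for illustration and have nothing to do with Bethesda's actual code:

from collections import defaultdict

CATEGORY_CAP = 10_000  # invented threshold, purely illustrative

class WorldState:
    def __init__(self):
        # each asset category keeps its own list of persistent objects
        self.tracked = defaultdict(list)

    def persist(self, category, obj):
        self.tracked[category].append(obj)
        if len(self.tracked[category]) > CATEGORY_CAP:
            # on the ps3 this showed up as a hard crash, not a graceful error
            raise MemoryError(f"asset category '{category}' bloated past its cap")

    def reset_area(self, category):
        # the release-day bug: cleanup like this never ran, so dropped arrows,
        # bodies, and stacked nirnroot blooms just accumulated forever
        self.tracked[category].clear()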

I wonder if the PS4's GDDR5 will impact textures at all. From what I've read it will make a difference, but it should be minor.

So Skyrim has reason it uses 2GB of ram, and that's because it wasn't cut down to play on consoles. Which you refuse to believe and think the PC and console version are the same thing. They aren't. And you refuse to look at anything else otherwise.
It wasn't cut down per se... The console versions were deeply flawed in ways the pc version wasn't (such as failing to reset dungeons). What actually happened is they patched the PC version in a way that made it better with the HD texture pack. But because they couldn't get asset bloating under control on the ps3 version (they didn't have this issue with the 360, fyi), they could not add anything. This is why Skyrim never got DLC on the ps3. Additionally, both consoles had actual texture glitches. Not even a resource problem but actual glitches in the code. It's easy, now that we're entering a new generation of x86 consoles, to forget that previous generations all had proprietary hardware and that porting wasn't anywhere near as simple as it will be going forward. No, the 360 and ps3 versions were different code bases (the 360 less so because it was a lot closer to a standard PC environment than any other console).

Now then, do you have examples comparing the 360 post texture patches to the pc? I am interested in seeing it.

Have you looked at the Bioshock Infinite comparisons I posted? They basically all looked the same, but the 360 was actually hazier. That's because Bioshock Infinite didn't have any asset bloating issues.

So perhaps Skyrim was a particularly poor choice for me to use, considering that its general handling of assets is why I built my machine in the first place. But the requirements should still have been for vanilla Skyrim and not post-patch/upgrade Skyrim.

Even the phsyx support won't go as far as it could if they chose higher end parts. They cut down the order into a corridor without physx, how do you think their limited hardware will handle physx? Without cutting it down to almost nothing? Truth is it won't.
That's silly. PhysX is just software. If they're given the physics engine then it'll function the same as it would on an Nvidia card. The only thing to worry about here is the power of the card, and the 7870 is no slouch. Again, it isn't a Titan or a 780 Ti, but it also isn't a 5870. There has to be some kind of balance. You can't make the console $1,000 and have people actually buy it. Sony already learned that lesson. Do you have an argument for why consoles have to be cutting edge? I don't think "because they were last generation" is a valid argument when both companies showed losses in the billions in the one generation they shot much higher than normal.

And I'm not sure if you're aware of this, but Sony and Microsoft didn't go with NVidia because they needed their GPU supplier to offer an SoC option: System on a Chip. That's also what took Intel out of consideration for the CPU or GPU.

Here's a Forbes article explaining that. [http://www.forbes.com/sites/patrickmoorhead/2013/06/26/the-real-reasons-microsoft-and-sony-chose-amd-for-consoles/] If you want an x86 SoC, you go AMD. As PC users we don't need an SoC, so it's not really a concern for us. It is for consoles.

and this isn't because the ps3 cost 600$. This was because they wanted the best technology has to offer.

In equivalent times, it would be like sony putting a 780 TI into the ps4. Nevermind the 770 did about 60-80% of the power for much less cost.
I don't have the numbers to compare the historical pricing and performance of those GPUs, so I can't agree or disagree with what you're saying. However, if you can figure out the price of the Nvidia 7800 at the time of purchase, then we can account for inflation or deflation in the video card market and figure out what the actual comparable video card would have been rather than wild guessing. I can't find the price history, so I have no idea what the NVidia 7800 cost back then.

Its not about the best, its about the versatile tech. Its like they bought a lambo, found it too expensive then threw the lambo in the garbage and went back to a scooter.
Right, because a video card that's almost exactly as powerful as the one you recommended and is the 27th most powerful card at the moment is a scooter.[/sarcasm] A scooter isn't at the top of the performance list for nearly any product.

This isn't black and white, there is a middle area they completely ignored. They could've bought a family sedan instead of the lambo then bought another family sedan when it got too old.
Ok, now you're sending mixed signals. First you seem to be saying, "Boo, they didn't release a Titan" and now you're like "they don't have to be the best". In what way does a 7870 not meet these requirements other than not having the name NVIDIA printed on the chip? Have you actually looked at the high performance list?

http://videocardbenchmark.net/high_end_gpus.html

Sorry, but you don't get to say what is a lambo and what isn't. The performance of the device speaks for itself. Look, if you gave up on GoW 3 because its sales were at the top of the list, you've got to give up on this too. It's like you have no regard for how things perform in relation to other things (let me know if this is the case and I'll try to adjust my position). Lists matter, especially a list made in order of performance while we're discussing performance. You keep listing Nvidia cards like they're the gold standard when they keep measuring lower in performance than the card being referenced. First your argument was that they don't have PhysX so the power difference doesn't matter, and now that the PS4 does have PhysX you're beginning to act like the cards themselves perform better. That's just not true according to actual non-biased testing. Raw numbers.

I would have been happy if they went to the R series of amd.The R series is one of the few areas that AMD can actually compete with nvidia on a power scale. Amd is known for cheap parts, not their power. If they got a 660,760, or 750 I would be happy. The highest I could ever see anything going would be an under volted 770. At absolute max. Instead they started penny pinching and cutting everything down to bare minimum. And the only people who are going to suffer are gamers, publishers, and devs.
So if they'd gone 660, 760, or 750 you'd have been happy? Then be happy, because the 660 is three steps under the 7870 according to Passmark (and the gap looks even bigger on gpuboss), and the 750 is that much weaker than the 660, which is already weaker than the 7870. [http://gpuboss.com/gpus/Radeon-HD-7870-vs-GeForce-GTX-660]

Only the 760 from your list is more powerful, and that's at a decent (but not impossible) jump in price. However, as stated above, Nvidia couldn't produce an SoC option. As far as AMD's R line goes, there's only the R9 270x that's a little bit better at a comparable price. I do know that the card is modified, so perhaps it does punch up that far somehow, but I don't think the jump would have been worth it. Almost every card above the 7870 is significantly more expensive. I think they managed to hit the sweet spot while maintaining their SoC requirement. Frankly, I don't think they could have done any better, especially not over a year ago when the prices could have been even more extreme.

how? low tech, low future proofing. They want this gen to last 10 years, and this tech won't be able to bring them there. They will scrape the barrel of power long before the end of the 10 year generation.
No such thing as future proofing. Never has been. But keep in mind that the 10 year generation doesn't mean that the next generation doesn't get released during it. That's what Sony and Microsoft are doing now with the ps3/360 continuing to be supported for the next few years (so yeah, they are hitting that 10-year target).

Will they scrape the bottom of the barrel sooner? I'd think so unless technology slows down like people keep claiming it's doing each year before we magically (seemingly) make another breakthrough that punts the ball further. But I'm not that worried about it because the advance in graphics is good enough to suit my needs. Honestly, the PS3/360 generation had nearly advanced enough to produce graphics that are simply good enough for me. The Last of Us and the Uncharted games were beautiful. It wasn't quite there but it was close. I feel like this generation is powerful enough to get there in a way that future consoles will just be polishing things up well enough. But then again, I'm not a graphiophile (and no fault to those who are).

But you can't expect the console makers, who shed billions of dollars over the last generation because they took too big a hit at the start of it, to do the same thing again. These aren't charities, and like it or not, they will control the gaming market and will bottleneck it when their limits are reached (same as the 7th generation did).
 

Lightknight

Ultratwinkie said:
Nvidia can do whatever you want them to do and they'll do it well, you just need to pay them.

Nvidida even admitted that they could do what AMD did, but they weren't willing to be paid at a loss.
Ok? I'm unsure how this impacts anything. AMD could do it at the needed price.

It also factors into physx. They are very protective of it, and they don't like the idea of them losing their biggest edge. Consoles will have to deal with a huge performance hit than an nvidia card would. They make sure physx always works without the hassle.
Good thing it's NVidia that's providing and designing the actual PhysX support then, unless you think they'll do a purposefully shitty job.

AMD cards can run physx, but the performance hit they get negate any benefit. Want to know why? Device drivers and CUDA cores.

https://en.wikipedia.org/wiki/CUDA
What do you think Nvidia agreeing to support Physx on the ps4 means? It means they'll make drivers that work with the PS4.

CUDA cores, which I mentioned earlier in our discussion, are simply why Nvidia cards typically outperform other video cards. They don't magically make the cards perform higher than the numbers they actually hit. For example, the 7870's performance will always be at that level relative to the other cards you mentioned, CUDA or not. It just comes at a cost. As long as the drivers are designed for the card they're implemented on, we shouldn't give two flying monkey shits what the name on the card is.

That being said, PhysX has been around since 2005 and very few games have used it (bear in mind, games that "support it" are not necessarily games that use it). It simply isn't that widely used. You're talking something like one or two major games a year, 4-5 games total per year that actually use it, and maybe 7 per year that "support it" but don't necessarily use it. What you have to realize is that a lot of AAA game studios (the kind that generally need particle physics) develop their own proprietary physics engines. When they do that, they don't use anything else.

Your overall knowledge of PhysX may be dated. I'd say 5 years ago it was the best, with Havok being clearly dated at that point. But Havok released a new physics engine just last year [http://www.techpowerup.com/181341/havok-launches-next-generation-physics-engine.html]. It's damn nice software; I don't know which one actually pulls ahead now, but Havok is also very widely used (as I stated, the Source engine uses a modified Havok engine for its physics).

Havok can be used with OpenCL whereas Physx can only be run on CUDA. That's more significant than you may think and is a non-trivial reason that most games go with Havok.

The 660 is actually more powerful depending on what version. the Ti is normally just an overclock and replaces the older card versions. They also can be just more efficiently designed versions, but people just call them overclocks even though its not always the case. Modern cards are all 660 TIs and that beats the 7870. Which is what a console would be built with if they choose it.
AMD cards are better at overclocking than Nvidia cards. Are we going to include overclocking capabilities here too?

Same reason the 760 TI is not off the table. OEMs are basically businesses that buy the hardware, and Sony and microsoft qualify as those.
As stated, the difference in power between the 7870 and the 760 Ti is almost nonexistent, but the difference in price isn't.

speaking of cards, the 7870 is under volted. So its actually less powerful. Its basically underclocked.
Doesn't matter; the performance marks are still the performance marks, and we have no idea how Sony has actually modified the card. This comment would be like telling me that some guy at the Olympics came in third but he was wearing pink. Um, ok, but that doesn't change the measurable performance.

GDDR5 is overkill beyond 2-4Gbs. GDDR 5 is awful for processors, and unless they want to use 2k and 4k textures anything beyond 4 is overkill. I doubt even blue ray could hold 4k texture games. We sit at 50GBs and we are far from 2k textures. Similarly, DDR3 is awful for graphics.
I'm not sure overkill (aka, too powerful) is a valid complaint. I do wonder why they went that way though. Maybe just a cheap (better be cheap) way to stand out in a lineup?

You're confusing a typical CPU with the console's combined GPU/CPU (APU). Also, you're ignoring the super high bandwidth. It basically makes up for the slower timings by eliminating the transmission latency, which they're doing thanks to the tighter connection between RAM and APU. As long as Sony got the GDDR5 relatively cheap, there's no bad side to this and a heck of a lot of good. This configuration would have net negatives in a traditional PC, though. But this is another example of hardware optimization.
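For a rough back-of-the-envelope feel for why bandwidth matters more than latency for graphics work, here's a toy Python calculation; the bandwidth and latency figures are ballpark numbers I'm assuming for illustration, not official specs:

# Assumed, illustrative figures -- not official PS4 or PC specs
GDDR5_BANDWIDTH = 176e9   # bytes/sec (the commonly quoted ~176 GB/s figure)
DDR3_BANDWIDTH  = 25e9    # bytes/sec (typical dual-channel DDR3 ballpark)
GDDR5_LATENCY   = 100e-9  # seconds per access, assumed
DDR3_LATENCY    = 50e-9   # seconds per access, assumed

def transfer_time(nbytes, bandwidth, latency):
    # time to move a block: fixed access latency plus streaming time
    return latency + nbytes / bandwidth

for nbytes in (64, 4 * 1024, 64 * 1024 * 1024):  # cache line, page, big texture
    g = transfer_time(nbytes, GDDR5_BANDWIDTH, GDDR5_LATENCY)
    d = transfer_time(nbytes, DDR3_BANDWIDTH, DDR3_LATENCY)
    print(f"{nbytes:>10} bytes: GDDR5 {g*1e6:9.3f} us vs DDR3 {d*1e6:9.3f} us")

Tiny accesses are dominated by the latency term (where DDR3 wins slightly), but anything texture-sized is dominated by the bandwidth term, which is where GDDR5 pulls way ahead; a tighter RAM-to-APU connection shrinks that latency penalty even further.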

The ps4 is limited by processor. The xbox is limited by graphics. Both are on entirely different pages. processors may not need to be powerful, but DDR 3 helps when you want to do more complex stuff like AI over a large area.
They both have the same or a very similar processor, I thought.

Still, today's processes really don't rely on the CPU the way they used to. This really isn't a liability for the next couple of years because the CPU should be the last resource to be fully tapped except, again, to serve as a glorified switchboard operator to send processes to RAM/HDD/GPU. But at the end of the generation when the barrel is getting scraped it will diminish the wiggle room they could have had. But most any other type of CPU would have pushed that price up and unnecessarily for the next 4 years (maybe less). We'll begin to really notice the bottom of that barrel by year 5 and then it'll drag on until the ps5 comes out (assuming that it actually turns a profit this time which is already looking good).

The biggest problem with your list is that you assume a list entitled "high end cards" are actually high end forever. The majority of the cards on there are completely ancient.
What? No. I just assume that they are currently high end and that's all that we can ask for.

When you take it all in, there aren't all that many cards. Past AMD, and Nvidia, there really isn't anyone else. AMD, and Nvidia put out only a few cards every so often. So of course its going to be "high" on a list of only a handful. The list wasn't that big to begin with.
Doesn't matter. The performance numbers are still comparable.

Want a site that actually changes with the course of time?

http://www.futuremark.com/
If you really like that site, then you should acknowledge that the highest performing card right now is the AMD 7990 and that the 7870 sits even higher on this list than the other site placed it. Also, the cards you said you'd be happier with are lower on the list.

and consoles already shed billions of dollars just on the research and development alone. Consoles are an expensive business, and the tech they want to use won't last.
Never said otherwise. But they have to pick a spot and run with it. That spot will then continue to be supported by developers as long as it's popular. Though this should make a PC owner a bit mad: it means most of our games are bottlenecked under the hood by the hardware in consoles, because developers aren't going to create a new engine just for pc.

They spent too much last generation for the best tech the world had to offer at the time, and now they cut everything down to the bargain bin because they didn't want to spend their money wisely.

if they went with reasonable tech back then, and went with the reasonable tech now, we wouldn't have run into this problem.
If you're ok with them going with reasonable tech, then what's your issue here? This is actually higher-end tech at the moment, so the fact that it's well priced too makes it the best of all worlds for the average consumer. All I care about is that it is a significant step in the right direction and that our PC games will finally start moving forward too.
 

Lightknight

Ultratwinkie said:
AMD agreed because Nvidia was eating them alive at the time and they were near bankruptcy.
If this would make AMD take a loss like it would Nvidia, why would they have agreed to it? AMD actually has a competitive advantage regarding SoC solutions. They've already done it before, whereas NVidia has practically no track record there, so the R&D cost would have been much steeper for Nvidia. Regardless of your feelings for Nvidia or against AMD, this should be clear. AMD was practically the only game in town once SoC solutions were required.

Even consoles couldn't help them, because they made less than 20% of their revenue. For a project that demanding. Which is why Nvidia said no because they asked too much and paid too little.
I'm not sure you're familiar with the hardware R&D cycle. There will always be large R&D costs up front, and developing these custom chips for both console makers would have been a big one. But now what? The chips have been designed, and now it's just a matter of making them smaller and cheaper to produce, and they've already sold around 10 million of them so far. This year should not only put AMD firmly in the black, but several sources think AMD could double by year end. They are wonderfully poised thanks to this. Not only that, but with their cards being in the consoles they're going to have an advantage with games made for PCs too. I can't think of a better choice for them to have made. Do you honestly believe that AMD made a poor choice? This is basically the first time in recent memory where everything appears to be coming up AMD (though Nvidia doesn't seem to be taking a hit, which likely means this is bad news for Intel).

And its nice how you say Nvidia will help AMD. Except they won't. They haven't helped AMD on the Witcher 3, and you even admitted physx is restricted to CUDA.
If Nvidia is including PhysX support on the PS4/XBO to encourage developers to use the tech so it will benefit their PC gamers, why would they make it shitty support? Precious few games use PhysX already; poorly performing PhysX drivers on the PS4/XBO would just be more nails in the coffin of a product that is actually good. But Havok's new software may actually be even better than PhysX's current offering. It's much newer and looks very slick.

Are you unaware that Havok exists and is Physx's competition? It's been much more heavily used than Physx. Also, why are you not responding to the fact that very few games even use Physx?

7990 is a flagship that is no longer sold. The power draw is also horrendous for the power it gives. Its a 400w minimum draw when 250 watt cards like the 270x and the 780 TI give most of the power.
You do realize that this is why AMD video cards are better at being overclocked, right? Nvidia places locks on how much power their cards can access and so they can only go so far. AMD doesn't lock it down.

Tell me, as a gamer, do you prefer more power output or do you care more about power consumption?

7870: 8090.
7850: 6630
The 7870 XT isn't whats in the ps4. Its an under volted 7870 which means its 7850-7870 in range, which has been said time and again.

http://www.tomshardware.com/answers/id-1689376/graphics-card-equivalent-ps4-xbox.html

So thats about 7,000 ish in performance.
Ok? You found a thread of people who aren't experts talking about something that they aren't sure about whereas I posted a link to people who actually took the damn thing apart and had the utilities to know what they were looking at.

Hmm... which to believe which to believe. Oh, here's another tomshardware thread not only agreeing with me but linking to the same link I gave you: http://www.tomshardware.com/answers/id-1993056/ps4-equivalent-gpu.html

The problem is that we still don't know some of the specifics of the card to really tell us how underpowered it is. But the general thinking is that it's slightly under the 7870 but much closer to it than the 7850. So you have your 660 equivalent. Either way, still not a low rung card, just not the best.

And on the topic of CPUs, GDDR 5 is specialized as graphics RAM. It wasn't meant for anything that processors do. DDR3 is processor memory, and allows you to do things like keeping track of a lot of AI and pathfinding, etc. This has nothing to do with APUs or power.
DDR memory is optimized for desktop use because it has ultra low latency. GDDR5 is generally considered optimized for data transfer. That is the difference between the two. They're both memory, and it's not like either couldn't do what the other does (such as keep track of lots of objects or load textures); it's just that they're not as efficient at it.

But, as I stated, Sony has taken advantage of a console's standardized hardware by almost eliminating transmission latency in their design. It won't be exactly as fast at tasks that would be better suited to DDR3, but it should be close enough to be entirely unnoticeable while still keeping all the advantages of GDDR5.

Take, for example, the fact that the XBO has all this trouble rendering games in 1080p while the PS4 doesn't blink twice at doing so; this memory setup is a big reason why. We should also see better load times and anisotropic filtering.

As long as they really did resolve the transmission latency, then there's really no downside to this. Now, on the reverse, you can't really get DDR3 to do what GDDR5 does to any realistic degree where mass data transfer is concerned.

If they kept 4Gbs of GDDR 5, and replaced the other 4 with DDR3, the range of the ps4 would increase. There is a reason gaming PCs use both, because they both do different things.
That would be interesting and they did consider doing that. However, they stuck with one type to make the development process simpler. This is Sony speaking from experience if you recall their divided 512MB RAM setup and the frustration it heaped on an already frustration-saturated pile of hard to code for hardware.

The card that sony went with is also a power hog, which clashes with a low power draw and a low power processor. So they won't use their "high end card" to the fullest thanks to their insistence on low power. Even the 760 and 770 draw less power than the 7870.
Dude, again, this isn't just a 7870. It's a modified SoC part. Do you have a full spec sheet or an understanding of how it has been modified? My claim is that it is around the equivalent of a 7870, not that it is literally a 7870.
 

Lightknight

Ultratwinkie said:
AMD wouldn't really have lost money. They use lower quality parts and had nothing to lose at that point. Nvidia would have because they had a hold on much more lucrative markets. Basically, nvidia had better things to do and the stuff they sold couldn't be sold at the discounted prices AMD was giving out.
Let me make this clear. You are still admitting here that AMD gave Sony what they asked for at the price they asked for it, and Nvidia couldn't or didn't do that, for whatever reason. Any other rationalization for Nvidia is bullshit unless you were at the negotiations or are an Nvidia employee. AMD built a new facility for this. Being busy didn't mean shit to them; they just created a new department and their existing teams continued on.

Why? AMD was losing the PC gaming market, and nvidia is storming the mobile market. When it came to processors, intel still dominated. They were running out of options, and it was better to sell a soon to be outdated jaguar than risk irrelevance.
Two things. I don't care how Nvidia is doing overall; they are a very strong company with a good product and a well liked name, and I've purchased many cards from them, but that's irrelevant to the discussion. And doing well doesn't mean you turn down huge profits and market share. They lost a bid, and you should understand that Nvidia is likely unhappy about that loss.

And yes, it may be a modified 7870 but that doesn't make it impressive:

http://www.gameranx.com/updates/id/20377/article/xbox-one-vs-ps4-amd-outs-console-like-gpus/
http://www.futuremark.com/hardware/gpu/AMD+Radeon+R7+260X/review

So AMD says its a R7 260-265.
If that article is correct, AMD says it is closest to the R7 265, a card that isn't out yet and isn't scored on the other sites we've been using. It's the XBO card that is the R7 260, and that one is hilariously underclocked, but I haven't brought the XBO up because it looks like they purposefully targeted shitty specs. Despite that picture, there's no score for the R7 265 anywhere else. Interestingly enough, Microsoft's plan to augment performance with cloud computing would make any complaint about them somewhat obsolete if you have an amazing internet connection. The PS4 may go that route too, but I would hate that.

But this is just an article where a person is saying that they think these cards are comparable. This isn't AMD saying so. Actually, as near as I can tell this is the only source making this claim. The people who actually took the card apart and had the expertise to know what they were looking at pegged it higher. But then again, we don't have reliable performance results for the 265, so I don't know if that's better or worse, and I'm not trusting that tomshardware link when I can't find it on their site.

Speaking of graphics cards, the RAM isn't responsible for high resolutions. Its the pixel rate. The RAM is just there for memory, and memory for different things.

The stuff you would use DDR3 for doesn't need to be fast. It just needs to exist.
I'm not sure what part of what I said you are disagreeing with. Did I accidentally say high resolutions? For some reason I mentally associate highly detailed textures with high resolution; something about the detail makes me interchange the terms incorrectly. Perhaps I'm thinking of high-res textures?

For example, lets say we have a game where you have a lot of dynamic actions that are recorded. You killed a king, and now the cops are looking for you.

You don't need high bandwith for the game to remember you killed someone, you just need it to be there on hand. Its cheaper go DDR 3 too. You don't need "high numbers" for this stuff.
Right, but you can't easily use DDR3 to do what GDDR5 does. So if you can get either for the same price AND avoid the latency issue that GDDR5 runs into, then it's the best of both worlds.

Its only there for marketing. Not for anything actually useful. Its just there to show high numbers for the customer to feel smart.
Oh, yeah, I'm sure. It is brilliant to have RAM that seems better. However, it's more than that. They took advantage of standardized hardware to get rid of the latency issue that makes GDDR5 bad at performing DDR3 tasks. This move will actually extend the console's lifespan by a little bit of time by giving the GPU access to more video RAM. Honestly, if computers could do this too then I think they would. It would be a lot more unified to have this setup but that latency is a necessity for us to have the ability to pick and choose part manufacturers.

GDDR5 is fast because graphics need to be fast, and there is only so much that you need before it becomes useless. The stuff that requires GDDR 5 in 8 gbs is well beyond the ps4's ability and is going into 6K textures at that point.
Don't know how developers may make use of it. It'll be interesting to see.

Coding for two kinds of ram isn't hard. If the bedroom coder can make games for PC, then a big publisher can make a game for a console. If its such and issue, they can just make device drivers for it.
No, it's not hard. That's the standard process. But one would be easier.

The reason the 512 limit was bad was because they used cell architecture. It was a tiny limit on an insane architecture. This is x86 and PC gaming don't have that issue. There is no reason to limit ram to one kind when you are emulating PC.
There were several things wrong with it. Not just that. But that is what made it particularly insane to program for.

and lastly, what does a wildly inefficient card have to do with overclocking? Its a 400 watt minimum at factory default. It isn't overclocked, its just a jokey flagship like the titan was. AMD cards are known for their heat output. Overclocking does have a limit, and that limit is dictated by heat.
I'm talking about AMD cards vs Nvidia cards. One of the few known advantages (look it up) that AMD has over Nvidia is the ability to overclock the cards. Not that this impacts the PS4 at all, but it does impact how they view power consumption.

FYI, I've had NVidia cards burn out too.

When it comes to physx, it doesn't matter. Its a PC feature. developers don't bother because they normally aren't building benchmark games. The 7th gen didn't use physx at all, and PC gamers still had it. Havok's new engine is still not fully used in the next gen as far as I know, and the only next gen demonstration I seen is them dropping tiny blue balls everywhere. Kinda like how knack was "so intensive."
Havok's engine is pretty damn new. It's just under a year old, so we certainly shouldn't be seeing the kind of games that need it until the development cycles that use it conclude and release games. Knack was supposed to be a particle physics demonstration. Shame the game apparently sucked, but I haven't played it so I can't speak about it.

Which is a far cry from what physx can do. The list of 500 games that use havok only has a handful that even touches what physx does normally. Most off the games only use basic physics and very few use anything more than that.
Most games that use PhysX only do it for very minor things too. We've been over this. Very few games support PhysX at all and even fewer use it. There are a lot of games that use Havok, and most of them at least use something.

Oh wait, they do that all the time. Just because they say "we'll support you" doesn't mean there isn't a catch. As you even admitted without CUDA cores physx is useless.
Ok, then PhysX will die because developers won't have a reason to develop for it. There are already fewer than 7 titles a year that support it and around 4 or fewer a year that actually use any part of it. That number will just get smaller if the entire x86 console market isn't properly supported.

The intention of Nvidia's offer of support was to prevent that from happening. If they do a shit job then they're ruining their name and business. We'll see though.

Taking hostage of cool stuff is one of the biggest reasons nvidia is the biggest graphics card manufacturer. Beyond PC gaming and nvidia consoles, they really don't care. If any console devs use it, it will be way more limited.
Taking cool stuff hostage doesn't get you business when you suck on standardized hardware that sits in around 15 million homes right now (and that's counting the WiiU which, haha, we probably shouldn't; without it, it's still around 10 million). People are going to develop games that they can easily port from x86 consoles to pc. That's the benefit of everything going that route. So if you hold your tech hostage from entire platforms, you will get burned. Nvidia is doing something smart by making sure they don't hurt their PC PhysX business. It isn't dumb.

And the fact AMD is passing out TressFX to nvidia, their "ace in the hole," means there is literally no reason to side with AMD beyond basic optimization. Their cards are as basic as they come.
TressFX is just hair physics. Who gives a damn?

Nvidia built your ps3, and yet that doesn't mean AMD shriveled up and died when no one optimized for the hardware.
What? That's exactly what was happening to them.
 

Lightknight

Ultratwinkie said:
AMD was desperate, and they were taken advantage of on that basis. Nvidia had way more leverage, and didn't want to waste their time on bargain bin consoles. These guys made the ps3, if they were to make a console they want to go all in. They wouldn't stop at the bargain bin.
AMD is making a profit off of every console being sold; it's only Sony that isn't making a profit on the console. So what does it matter to a chip producer if the final product is cheap, as long as they still get paid for their chip? Your reasoning here is deeply flawed. It makes more sense to completely separate the card they're making from what it's being used for. A banana salesman doesn't give a crap about what type of smoothie his bananas are going to end up in as long as he gets paid for the bananas.

But it's also a little more complex in another way that makes the deal more important. AMD has already sold something like 15 million of these chips because they're in every console. This means that their chip is the one developers have in mind when they create their games. Whether you want to admit it or not, that's a problem for Nvidia. AMD is going to take some market share and perform a lot better over the next several years, though I anticipate that most of the market share they pick up will come from Intel rather than Nvidia.

So I strongly maintain that Nvidia simply lost the bid that they probably wanted.

If the consoles are not going all in, they really don't care. Its not a loss to them because they control most of the market. Its beneath them to waste time on something that isn't another ps3.
Again, you're making bullshit (aka dishonest) statements when you claim to know Nvidia's mindset.

On AMD's side, all they could actually afford to sell was the mobile hardware. They were losing everything because they couldn't figure out what made nvidia popular.
Again, you're making wildly unfounded claims. AMD was already planning on making SoC GPU/CPU combos, and this deal brought in revenue for something they were already going to build, because SoC is great (required) for mobile devices. As such, it is far easier for them to take smaller margins and simply let Sony, Microsoft, and Nintendo fund the research they were already going to perform. Even if they lost money on the deal they would still be better off, because it covered part of the cost of research they had already committed to. Nvidia, on the other hand, isn't an SoC manufacturer, and this would have been a major shift for them and an additional cost.

AMD was simply the right company at the right time.

It wasn't just the power, but the extra support. AMD doesn't do that. It doesn't have the extras aimed at gamers. Its these extras that made nvidia the fan favorite of PC gamers.
Nvidia does have nice extras on PCs. But neither company has ever had the same extras on consoles. All console developers want from them is their hardware and little else. I think you're operating under the misconception that I don't like Nvidia. I do. I'm just not going to blind myself to the fact that AMD is a legitimate competitor.

That's why physx exists, its meant as marketing as a nvidia exclusive feature. The only thing they would do is make sure physx is limited on consoles to hold it over the heads of console gamers to get a better machine to experience the full effect.
Again, very few games use physx and the difference is usually minor smoke effects that other physics engines can also do.

physx didn't shrivel up and die during the 7th gen, where it took work to port to PC,
Why is this relevant when we now have a bunch of x86 environments? In the 7th generation and prior, you were going to have to do a lot of work anyway if you were porting across multiple platforms, so the PC was treated as if it were another proprietary console. Now all the big players use the same x86, so the work is going to be minor. This actually detracts from your point. If you can just optimize for the console hardware and easily port it over, why would you spend a ton of money supporting PhysX as an engine? You wouldn't, unless it benefited the console version too. That's why Nvidia is providing support and why it would be dumb if it didn't work on the consoles.

it won't now. The loist that does list physx games is incomplete on what game uses what.
Yes, both lists are incomplete. So why don't you provide data to back your comments? Most sites that discuss whether to go AMD or Nvidia bring up the point that precious few good games actually use it. The only data I really see directly comparing the two is something from Physxinfo.com in 2009 [http://physxinfo.com/articles/?page_id=154]. What's weird is that they actually paint Havok in a damn good light. The total titles comparison was Physx 200 to Havok 181. However, they actually admit that Havok has a huge quality of game advantage over Physx:

http://physxinfo.com/articles/wp-content/uploads/2009/12/titles_rating_graph.jpg

Again, that chart was from a physx site for some reason. What it indicates is that in 2009 the games we cared about used Havok. The Third Rate or Specific rating is for any game with a metacritic score lower than 50 or no rating at all. That is the only area physx was beating havok in at that time, despite its bigger total game count.

Here's the link just in case it disappears again: http://physxinfo.com/articles/wp-content/uploads/2009/12/titles_rating_graph.jpg

The amount of physx games that use more than basic physics is much more than the games that fully use havok.
What a weird comment. "Games that partially use one piece of software outnumber games that fully use another" is a nonsense comparison. The reverse is also true: games that partially use Havok far outnumber games that fully use PhysX. That's obvious, because every game is going to have its own engine with a significant amount of physics already built in. Havok and PhysX are both meant to augment those engines, not replace them. So they are almost always only partially used rather than fully used, and counting the games that partially use one is basically counting all of them.

Remember that Valve's Source engine actually uses Havok physics. That's already a huge head start for them, like it was for PhysX when the Unreal 3 engine used it.

But if it's anything like the graph above, Physx has padded their numbers with a lot of shitty games. I'd need a new graph to see otherwise. Of the games we may care about, there's only a handful that support it at all or even use a single feature.

Even when havok makes it more accessible, developers don't care.
Even Physx disagrees with you. Steam too (as a PC gamer that should mean something).

Physx is a PC feature to market to PC gamers. Console gamers are irrelevant because they wouldn't buy the cards. You seem to forget that physx was doing just fine for 7 years without console support.
It isn't for the gamers, it's to entice developers to create software that uses PhysX. They aren't going to do that if the only place an Nvidia card might show up is some PCs and nowhere else. Do you know what the PC market looks like to developers? 20.7% of PCs have an AMD graphics card, 16.3% have Nvidia, and the lion's share of 62.9% goes to Intel [http://www.forbes.com/sites/jasonevangelho/2014/02/19/pc-gpu-market-bounces-back-with-nvidia-up-and-amd-down/]. That's a Forbes article from last month, not a distant 2009 article like the one I found from the PhysX site admitting that lower quality development studios pad their numbers.

Intel's numbers are certainly padded by those embedded graphics, but they still make up a huge market segment of laptop users. The important fact is that more PCs have AMD cards than have Nvidia cards (and therefore PhysX). This is why PhysX is so rarely used by larger companies.

You may say "its just hair physics" but it was the only thing AMD actually put out there in terms of extras. Without exclusivity, you might as well go nvidia.
Nobody cares. Extras don't mean shit if games don't utilize them. By your logic Nvidia just gave PhysX and APEX away for free. Both of these things exist just to get people to use their stuff. The logic is that TressFX will work better on an AMD card than on an Nvidia card, but if every card has it then developers are more likely to utilize it. It's exactly the same reasoning Nvidia had for supporting the PS4.

You can't have it both ways. Either this is a good business practice or a bad one. Which is it? Did Nvidia make a mistake or did both companies make wise choices?

So next time, don't call bullshit when Nvidia came out and admitted the price they were offered wasn't worth their time.
Opportunity cost is different than saying they can't make a profit at it. What Nvidia actually said was that if they did a console they had to look at what other piece of business they would have to put on hold. Nvidia said they simply didn't have the resources to do everything they were doing and do this. Their words. How you interpreted it as the consoles being somehow beneath them is beyond me.

AMD did have the available resources to do this and as I stated, were already planning on SoC solutions. This was just at the right time for them and the wrong time for Nvidia.
 

Lightknight

Ultratwinkie said:
Jesus Christ, you move goalposts like a theist.

So suddenly we dropped power now? Well I guess I was right.

However, Nvidia made its views very clear:
http://www.extremetech.com/gaming/150892-nvidia-gave-amd-ps4-because-console-margins-are-terrible
http://www.maximumpc.com/article/news/nvidia_calls_ps4_%E2%80%9Clow_end%E2%80%9D123

or are you going to do the same shit and "speculate?" Like you have for the last fucking page?
I cited Nvidia. They explained that they did not have the resources to undertake the project because they would have to take resources away from other projects. Too many irons in the fire doesn't mean they wouldn't have taken this one on had they been capable of juggling it too. SoC is something Nvidia could have done, but they really haven't been going that route, so this would have been breaking new ground for them. Not to mention that they'd have had to dance around AMD's patents in one of the few areas AMD is stronger in. Nvidia made $500 million off the 7th generation and that was with only one console. That's pretty good for one chip. Let's say the profit goes down to 1/5th of that. What do you think they typically make on any given chip? AMD won all three console companies. They stand to do VERY well off of this, a fact you dismiss because AMD was on shaky financial footing at the time of the deal. Even companies that are doing poorly aren't going to take huge losses just because. Companies would declare bankruptcy or even cash out and go out of business before willy-nilly taking a loss.

Nvidia wanted more money than what was being paid. They refused to make consoles.
Yep. So they weren't able to produce the product in a way that was deemed worth their time. This is in contrast to AMD who was. Same argument as the first.

AMD was desperate, so they took the contract to band aid their money loss.
By "band aid" do you mean to make a profit? Again, AMD was already geared to design more SoC solutions. If nothing else, this funded the very expensive R&D process that they were already going to undertake. It's a business no brainer for them whereas Nvidia wasn't planning to do that so you were looking at a much lower margin for them.

This isn't saying NVidia did anything bad. This is just saying that the deal was much more attractive to AMD than it was to NVidia.

Look at it this way. Nvidia would have to incur unexpected costs designing an entirely new SoC solution that they weren't planning on making. It isn't an area they're particularly strong in yet so it would have taken them even more money to get up to speed on it. Add that to what I'm sure is a lower profit margin for the card and you could be looking at a much lower profit than most of their other projects would give them.

Then look at AMD, who was already going into this area of research. Even if they took a loss on this overall, it would still significantly reduce the cost of R&D on this chip, which lets them start releasing SoC solutions for products that would benefit from them. So this new research gives them a nice edge in SoC projects going forward, whereas Nvidia's business model isn't as interested in SoC.

This deal was literally nothing but good for AMD and nothing but a risk for Nvidia at the certain high opportunity cost of losing resources for more lucrative projects. It isn't that Nvidia didn't want console business and it wasn't that AMD was desperate. It just made sense for both companies to take the route they did. The only loss to Nvidia is that their cards won't be specifically supported as much anymore but they're still an incredibly common card manufacturer so it's not like they won't be supported to the point of working. Just not optimized in several cases.

It's a common stage of loss to belittle the value of the opportunity lost. Even for teams and companies we root for, it's a simple psychological defense mechanism to pretend that the "other guys" didn't really get that good of a deal. Maybe you're doing that and maybe you're not, but this was a great deal for AMD even if it wasn't as good a deal for Nvidia. AMD would have been dumb not to take the deal, whereas Nvidia made a calculated decision.

I'm just not a fan of either company. You might as well be talking about chair companies to me. Had I found a comparable Nvidia card for a similar price in the same performance range, I would have been more likely to buy that.

On top of this, you seem to fail to understand hardware at all.
PhysX is software, not hardware. The extent to which it is used and the amount of processing it consumes is entirely up to the developers implementing it. Think of particle effects: people like to use PhysX for steam rendering, and you can decide how detailed the steam is according to the available resources. This is why low-to-ultra pc settings matter. You are basically telling the game how detailed the physics engine, along with other settings, is allowed to be. That doesn't mean the lowest setting doesn't have any physics or detail, just that it's far less detailed in comparison.
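As a toy illustration of what I mean (this is not the actual PhysX API, just an invented sketch of how a detail setting scales the work the physics system is asked to do):

# Hypothetical quality presets -- invented numbers, not from any real game
QUALITY_PRESETS = {
    "low":    {"steam_particles": 500,    "update_hz": 30},
    "medium": {"steam_particles": 2_000,  "update_hz": 30},
    "high":   {"steam_particles": 10_000, "update_hz": 60},
    "ultra":  {"steam_particles": 50_000, "update_hz": 60},
}

def spawn_steam_emitter(preset):
    # same effect at every setting, just simulated at different levels of detail
    cfg = QUALITY_PRESETS[preset]
    return {"particles": cfg["steam_particles"], "tick_rate": cfg["update_hz"]}

print(spawn_steam_emitter("low"))    # what a weaker machine would run
print(spawn_steam_emitter("ultra"))  # what a high-end PC would run

The console simply sits at whatever rung of that ladder its hardware can sustain; the engine is the same either way.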

Physx lived when PC got no support from consoles. Outdated consoles that would never handle it. For 7 years.
Are you now claiming that the consoles have never been able to "handle" it? 7 years is the lifespan of the consoles. You seemed to indicate that the consoles were somehow cutting edge at launch.

Anyways, you are misunderstanding what physics engines are for. They aren't necessarily more resource-demanding; in fact, they can even be less resource-demanding than the various custom-made physics engines that development studios write for their games, at least for certain processes. Ragdoll physics, for example: a game could lean on Physx or Havok to handle that if its own code is less efficient. True, they are usually meant to add effects on top of the vanilla game, but not always.
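
If it helps, here's a rough, hypothetical C++ sketch of what "leaning on middleware" looks like structurally; the interface and class names are invented for illustration and don't correspond to any real Havok or Physx types. The game only talks to an abstract backend, so whether the ragdoll solver is in-house or licensed is an implementation detail.

```cpp
// Hypothetical sketch: a game delegating ragdoll simulation to whichever
// physics backend (custom or licensed middleware) is plugged in behind
// an interface. None of these names are real middleware APIs.
#include <iostream>
#include <memory>

struct RagdollState { float limpness = 1.0f; };

class IPhysicsBackend {
public:
    virtual ~IPhysicsBackend() = default;
    virtual void simulateRagdoll(RagdollState& r, float dt) = 0;
};

// Stand-in for licensed middleware; the real work would happen inside the library.
class MiddlewareBackend : public IPhysicsBackend {
public:
    void simulateRagdoll(RagdollState& r, float dt) override {
        r.limpness += dt;  // placeholder for the middleware's solver
    }
};

int main() {
    std::unique_ptr<IPhysicsBackend> physics = std::make_unique<MiddlewareBackend>();
    RagdollState body;
    physics->simulateRagdoll(body, 1.0f / 60.0f);  // one 60 fps frame step
    std::cout << "limpness: " << body.limpness << "\n";
}
```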

Nvidia benefits from people working with consoles that have Nvidia hardware because Physx was still an option there. It makes it easy for developers to make use of Physx to a lesser degree on the console and then turn it the heck up on the PC. Metro: Last Light used Physx for persistent particles (blow a tile off a column and the pieces stay visible on the ground) and for some steam/fog interaction (the steam is there in all versions, but with Physx the fog dissipates as characters move through it). These are minor things that require a more detailed engine coupled with the hardware to use it. However, if you're already developing for an Nvidia card it does you no harm to include this feature as something that can be toggled on or off. If you are developing only for AMD cards, it does require extra steps to enable it.
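
Something like the sketch below is roughly what that toggle amounts to in practice. Every name here is made up, including the detection function: the only point is that the extra GPU-accelerated effects are additive and get switched on only when the hardware and the user's settings allow it.

```cpp
// Hypothetical sketch of the "toggle it on or off" idea: extra effects
// (persistent debris, interactive fog) enabled only when available.
#include <iostream>

struct EffectsConfig {
    bool persistentDebris = false;
    bool interactiveFog   = false;
};

// Assumption: the engine exposes some way to ask whether GPU physics
// acceleration exists. Stubbed here for the sketch.
bool gpuPhysicsAvailable() { return true; }

EffectsConfig pickEffects(bool userWantsExtras) {
    EffectsConfig cfg;
    if (userWantsExtras && gpuPhysicsAvailable()) {
        cfg.persistentDebris = true;  // tiles stay on the ground after being shot off
        cfg.interactiveFog   = true;  // fog parts as characters walk through it
    }
    return cfg;  // otherwise the game ships the same scene without the extras
}

int main() {
    EffectsConfig cfg = pickEffects(true);
    std::cout << "debris: " << cfg.persistentDebris
              << ", fog: " << cfg.interactiveFog << "\n";
}
```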

That is why Nvidia went ahead and contributed the drivers. Yes, this allows the use of Physx on the consoles, but it also makes it infinitely easier for developers to implement the utility if they want to. However, and this is a major issue at the moment, there's not that big of a difference between Havok's newly released physics engine and Physx.

<youtube=v1hoVCZZOd0>

That's real time on the ps4 hardware with a million objects.

If the new engine can already do physics this well with a million objects in such a varied environment, then it can do everything that Physx is currently being used for. There is a huge difference between Havok pre-2013 and Havok now. As stated, they released a new version last year that drastically improved performance.

<youtube=sS0Fqx_zxf8>

Cut to around 0:48 to skip the silly fluff and see Havok 2012 compared with Havok 2013. The performance difference is staggering for the same task.

There's a reason why the majority of AAA studios that use anything at all use Havok. It's a lot more user-friendly, with a ton more tools and a better interface. Until 2013, Physx was the better software. Now I'm not sure which is better at all. They may even have individual strengths and weaknesses for all I know, but one isn't necessarily better than the other.

Now that its easier to port to PC, physx will have a much easier time. In fact, you even said that despite no support from consoles physx was still being used more than havok. Nice way to backtrack on your own damn statement. Regardless of who uses it, its still popular on PC. In fact, the 2nd biggest community on steam is Russian.
Actually, my statement has been that they only create something like 3 meaningful games a year that actually use any component of Physx. I presented Nvidia's own chart to discredit the meaningfulness of your statement that more games used Physx. Havok outscores Physx in every game category except for the shit pile: the games that score less than 50 on Metacritic or don't even have a score at all. How can you even pretend to tout these numbers as meaningful when the games scoring more than 50 on Metacritic are firmly in Havok's corner at 154 to Physx's 109? Sure, the shit pile has 81 for Physx to Havok's 27, but really? I'd consider those all to be detractors.

But you're basically saying that Batman: Arkham Origins using Physx is no different from, say, a game that wasn't even popular enough to get reviewed by games critics, or one that scored below 50? There is a difference and you know it. As the quality of the game goes up, the number of games made with Physx drops drastically: from 81 in the shit pile to 8 in the Excellent category. Havok seems to follow a bell curve, with both shit and excellent in the 20s and the majority of the titles in the middle, but still a MUCH higher weighted average than Physx. As a Physx fan, aren't you even a little embarrassed by that chart?

What's more is that these games are only games that support those engines. They aren't games that necessarily use those engines. Most of the games that support Havok actually use it, while a sizeable chunk of the games that support Physx do so in name only, without using any of its modules. Weird, huh?

Only PC matters to nvidia. Physx is maarketing to PC, which is why its free for PC developers.
And yet, the Source Engine went with Havok. So Havok had its hand in Portal, Half-Life, Stanley Parable and several other significant PC games. Do you have any more recent numbers (and perhaps any more recent quality comparisons) that would indicate a significant change of some kind?

If the 7th generation console circle jerk didn't kill physx, nothing will. Its a PC gamer feature, and wasn't meant for consoles beyond exceedingly basic things. If physx could survive so could TressFX. Which AMD threw away, and the only thing they could use to contest physx's dominance on PC. The market they desperately need to stay afloat.
Hair physics. It is literally an engine solely devoted to how hair behaves. That is not a competing engine. That's ridiculous. Like saying that a radiator manufacturer is in direct competition with a car manufacturer. AMD itself is in competition with NVidia but it really isn't anywhere close with a physics engine. AMD is not only competitive in the PC market, but it actually has a larger market share than NVidia.

AMD in consoles are meaningless, developers haven't coded for hardware since the early 90s. We've been over this. Device drivers allow software to run regardless of the specifics of the hardware. As long as they make the software itself run without memory leaks and other issues, its fine. Consoles are literally meaningless to PCs. If anything, console hardware only makes PC gamers upgrade to much more powerful cards that make consoles look like a joke, which is nvidia's domain.
The amount of hardware sold means everything to the hardware manufacturer. Do you think Nvidia gives a crap about how many people use their drivers when it's their cards people actually pay for? According to that Forbes article, AMD is still outselling Nvidia in the PC market.

Hell, by your own logic the 7870 should be running sales like crazy. Except by steam's own stats its less popular than the absolutely ancient GTX 210.
What logic are you talking about? Why should the 7870 be selling more for some reason, and why do you think I indicated that? The card in the PS4 is not the 7870. It's an SoC version of it.

As for the GTX 210 comment, you do realise that the GTX 650 Ti is also under it, right? The 7850 is the 12th most common card surveyed in the month of February too. What are you trying to draw from this? There are 13 cards right there within 0.1% (note the leading zero so nobody reads that as 10%; it's only a tenth of a percent) of each other in overall market share, with the 650 Ti and 7870 both right beside that 210. So what's your point? These are single cards that each own almost a full percent of the entire market share of the Steam community. Heck, Intel's 4000 and 3000 series are the two most common cards here. If anything, that should tell you that the Steam survey means almost nothing regarding card quality. A full 36.24% of the cards surveyed aren't even broken out on the list; they each make up less than 0.50% of the market share. Even Intel Ironlake (Mobile) is at the 1.23% mark.

What part of anything I said would indicate that a 7870 would magically sell more than other cards? I'm just surprised that the 7970 is that much higher.

And its hilarious how your own Forbes news post has a link to steam stats that say the exact fucking opposite of what it claimed. This is hilarious.
You do realise that they only referenced Steam to point out that Intel has the first and second spots on the most common cards, right? Those two Intel chips make up almost 9% of the Steam market by themselves. If you throw in the 4th most common one (HD 2000) you've got one brand already showing over 10%. The person was just saying that they couldn't leave Intel out of the discussion, and that's why.

But what the article was actually talking about was market shares for the year. Actual shipment rates and not surveys. Real numbers. Not what people currently have in their machines but what people were buying:

http://jonpeddie.com/back-pages/comments/intel-and-nvidia-raised-shipments-levels-for-the-quarter/

Now, Nvidia is increasing in shipments and it looks like AMD is decreasing. But at the moment and last quarter and last year, AMD sold and shipped more units.

But I'm not sure what the point of this part of the discussion is. I think AMD still F-d up their business management. I think you're confusing me saying that AMD makes a legitimate product with me saying that it's somehow better than Nvidia. I think AMD has made significant business mistakes, and that will drown them if they don't make the appropriate changes. But their cards are fine.

Physx being on the ps4 is meaningless. Without CUDA it wouldn't work the way devs wanted it to, which is the higher end physics. In fact, Physx was licensed for the xbox 260 and ps3. When the last time you saw physx on a console? In its full glory? None. When was the last time you saw a 7th gen PC game have physx? Multiple times from AAA games and running a lot of what physx has to offer.
In its full glory? Haven't seen it yet, since Nvidia locked it to CPU processing on consoles. But Nvidia's statement was that it would be fully functional on the PS4. Time will tell if they're lying, but developers won't be fooled.

http://www.pcgamer.com/2013/03/09/is-tomb-raiders-performance-trouble-the-herald-of-things-to-come-for-nvidias-gamers/?ns_campaign=article-feed&ns_mchannel=ref&ns_source=steam&ns_linkname=0&ns_fee=0

Even Source isn't an ace in the hole when that engine is ancient and on its way out. Source doesn't even support the new stuff that havok supposedly uses.
Depends on what Source II will use to build its engine. Considering the drastic changes to Havok in 2013 it wouldn't be crazy for them to use it, but I also wouldn't be shocked if Valve did everything on their own this time around.

Physx wouldn't make its debut on consoles, not in its full form.
It's possible that they will relegate it to just the CPU again. We'll have to see if that has changed. Nvidia could stand to lose a lot of ground if games like Tomb Raider continue to come out developed in ways that favor AMD. I don't think it would benefit them much at all to have Physx available if it can't actually be used.


And its working. Physx wouldn't actually exist on consoles and on PC it can actually be turned on. At best its a trojan horse to market Nvidia's tech on AMD's only recent accomplishment.
Perhaps, but Physx has actually been used in ps3 games. It's the GPU-based version that so far hasn't been.

AMD however made a horrid decision to hand out TressFX to everyone else. They could have built onto it and actually have something to compete with physx. They could make extras to market to the market they have been losing. Instead, they hand their greatest potential to their greatest enemy.
You know, you keep saying this and I keep not caring because it's just hair physics. As far as I'm concerned, AMD is light years behind Nvidia in the physics engine department and I'm actually impressed they managed to put anything competent together. The thing is, they don't have to be with Havok and custom engines being all over the place. But, because I don't care I haven't really asked what you mean by they gave it away, which your next sentence touches on.

It doesn't matter if its just hair, it could be something they could actually market and build on. Tomb Raider used Tress FX. They could even put it on the consoles and AMD cards exclusively.
Wait, you think that because developers patched the issues with TressFX on Nvidia and Intel cards, this means AMD "gave it to them"? That's silly. What's important for AMD is that it is optimized for their cards. Or do you not remember the embarrassment that was Tomb Raider's performance on the Nvidia line?
 

Evonisia

Your sinner, in secret
Jun 24, 2013
3,257
0
0
luvd1 said:
Oo, pretty. And did I see reaction animation when a bullets bounces off the chest high wall your hiding behind? That's a step forward right?
Gears of War has it, your characters will flinch or duck their head when bullets fly within a certain range.
 

Lightknight

Mugwamp Supreme
Nov 26, 2008
4,860
0
0
Ultratwinkie said:
1. 500 million over 7 years makes about 71 million per year. Garbage for hardware manufacturers. Do you have any idea how expensive hardware development is? Nvidia made this much and they came out and said it was garbage returns. Amd is flaling behind Nvidia by your own words. This is chump change. The money won't change anything.
$71 million in net profit. That wasn't revenue.

https://ycharts.com/companies/NVDA/net_income

On just one card for one console, Nvidia averaged roughly $71 million a year in net profit. In 2010 Nvidia had a $99 million net loss, followed by a year where they made $256 million. That's nothing to sneeze at, but $71 million would still be more than 1/4th of that number.

$71 million a year is not a trivial number. Even with them making over $500 million annually now, it would have been a significant portion of their income. That is never less than 1/10th of their net income over the entire span of the console. If you think that's garbage then you're kidding yourself. What other card do you think came close to that return on investment?

The development of that card likely led to improvements for their other cards across the board. You're talking about one card that gave them huge profits and probably nontrivial advancements. It was even based on the 7800, so it didn't necessarily take that long to develop.

What's more is that you're being dishonest by dividing it by 7 years. Are you going to divide the same number by 17 years ten years from now? You divide it by the amount of time spent developing it, which I don't think we even know. So it's misleading at best, since the profit now on this very cheap card is likely a small fraction of what it was in the first quarter.

http://techreport.com/news/25527/thanks-to-consoles-amd-posts-first-profit-in-over-a-year

See that? AMD got 48 million in profits. Nvidia got 71. So the profit margins nvidia hated just got smaller by around 50%. That 48 million was suspected to have 22 come from selling assets like buildings. so about 26 million profit.

So that's 36% of the profit Nvidia got every year on one console alone. Counting all the consoles of this gen for AMD's paltry income. Pathetic.
Two things:

1. That is one quarter's net income, not annual, which is what your erroneous $71 million figure was based on. If they made $48 million net each quarter, that would be around $200 million in one year. $71 million split across quarters would be $17.75 million each. But averages aren't the actual numbers. We'd have to see what Nvidia's net profit was when the PS3 was released. I'll tell you this much, though: the PS3 really failed to hit its desired numbers, whereas the PS4 is exceeding anticipated numbers. The PS4 alone is exceeding the PS3's sales, and AMD has the XBO and the Wii U to boost those numbers, even though I'd consider both of those consoles to be doing poorly. The Wii U is practically circling the drain.

2. That is the entire company's performance, not card-specific. So saying that they made $48 million in Q3 isn't saying that the card made them $48 million. However, this is the first profit they've made in over a year, which seems to indicate that it's the consoles that brought them back up for air. The profit on the console chips could easily be over $100 million and be making up for the huge losses they had been posting. However, I don't know whether the losses being posted were due to R&D costs or not. Their actual difference in sales between Q2 and Q3 in the "Graphics and Visual Solutions" segment was over $350 million, but that's revenue.

We do know that AMD is making $60-100 per console sold.

2. Source is over 10 years old. Physx didn't come around until 2007. Do you even hear yourself? Its like saying "if guns are so great, why didn't the Romans use them?"
What? I'm just explaining that Havok is frequently used in games. I didn't say that Source picked Havok because Havok is necessarily better. It's curious that you keep assuming my numbers are painting Nvidia in a negative light when I'm just portraying them realistically.

3. Exclusives sell cards. AMD has none. Even a crappy exclusive is better than none. And They made it so it could work very well on Nvidia to the point AMD is worthless. Nvidia held physx hostage, so AMD cards can't use it unless you mod it on. Which makes the game run much worse than with a Nvidia card.

If they are so behind Nvidia, how can you say Sony got a good deal? You just admitted Sony took a big steaming dump in a box and sold it for $399. Because it won't have the next gen stuff you personally wanted? From a company that lives in the stone age? By your own words?
No, card performance compared to price sells cards. People who get a weaker card that's more expensive just because it has physx on it are being dumb. All physx offers is some minor differences in physics. You get interaction with steam and persistent debris. That's it and that's now something that Havok easily offers in real time.

4. Havok demos that only show little blue Viagra pills in a very small area and with nothing actually happening and a promo with a very rough still photo.

Did you even see the physx demos? It does way more than that little demo did, and in actual games right now.
Maybe if I say it again it will help: 1 million persistent objects dropped and rendered in real time. Consider that Physx was most commonly used so that shattered glass doesn't disappear; this is the equivalent of 1 million pieces of debris. Real time. Real time means game time: you shoot a window and this happens right away. Most tech demos are rendered over hours or days. It's the real-time aspect and the number of objects that are impressive, not that they're blue objects. You can trivialize them all you want, but it is impressive.
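
To put a number on what "real time" demands, here's a back-of-the-envelope sketch (generic figures, not taken from the Havok demo): at a console-style 30 fps, a whole frame is about 33 ms, and a million objects leaves only tens of nanoseconds of physics work per object.

```cpp
// Rough illustration of the real-time constraint: the entire physics step
// for every object has to fit inside one frame's time budget.
// Numbers below are generic, not measurements from the demo.
#include <cstdio>

int main() {
    const double fps = 30.0;                    // console-typical target
    const double frameBudgetMs = 1000.0 / fps;  // ~33.3 ms for everything
    const long   objects = 1'000'000;

    // Even if physics gets half the frame, each object averages ~16.7 nanoseconds.
    const double physicsShareMs = frameBudgetMs * 0.5;
    const double nsPerObject = (physicsShareMs * 1e6) / objects;

    std::printf("Frame budget: %.1f ms, per-object physics budget: ~%.1f ns\n",
                frameBudgetMs, nsPerObject);
}
```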

If you want something more than a million objects, each able to interact with each other, then I guess I can give you a less complex example that you may find more appealing.

Here's Havok's engine rendering tire friction, vehicle weight, and several other vectors for an eight wheeler through mud, a river and other terrain:

<youtube=oPdNJ0e5A24>

The once-significant gulf between Physx and Havok has been drastically reduced. Physx could still maintain an edge, but the difference has dropped enough that it may not matter. Whatever game engines are paired with Havok or Physx, they'll have the ability for objects to persist, water simulation (keep in mind, Physx's recent water simulation is impressive as hell but not real time by any stretch of the imagination, while Havok's new ocean sims are real time), and steam/smoke physics, though Havok already had those as well. Object collision is at an all-time best and things will only get better going forward.

So I'm sorry, but Physx just isn't the beast it used to be now that the differences will be a lot more subtle. The actual game engine using these two will matter far more in addition (of course) to the hardware.

So maybe you have Nvidia examples that show more impressive rendering but my guess is you're just being fooled with more detailed object textures which would otherwise be supplied by the game engine and not the physics engine.

5. You cited supplier market share. That is like saying "xbox one sold more than Ps4" because it sold millions more to retailers.
Maybe if it were a new launch or something. But the numbers even out after launch because retailers won't resupply if their stock isn't moving. AMD has a fairly stable reorder process benefitting them. They are currently selling in larger numbers in the market, and there's really no single card driving that. I would say, though, that Steam is inevitably going to be made up of gamers. We do care more about our video cards, and I'd posit we're more likely to go Nvidia for cutting-edge tech. My entire point this thread is merely that a card whose performance is higher than another card's is better than that other card. You will have individual games that perform better on a specific card, but we've clearly seen that go either way, and with the console race won by AMD it's going to slant more and more their way no matter how much you like Nvidia cards. By and large, games which aren't built specifically for one card or another will perform better on the cards with the best performance. That you would ardently believe that slapping the Nvidia brand on a weaker card somehow sidesteps that, because of a physics engine that has legitimate competition, is misguided.

Physx, for all its qualities, is rarely supported by games, especially games anyone cares about. A trickle of games each year that can be counted on your hands. Havok is regularly supported and used in major games all the time.
 

Lightknight

Mugwamp Supreme
Nov 26, 2008
4,860
0
0
Evonisia said:
luvd1 said:
Oo, pretty. And did I see reaction animation when a bullets bounces off the chest high wall your hiding behind? That's a step forward right?
Gears of War has it, your characters will flinch or duck their head when bullets fly within a certain range.
Right, this has been around for some time. Uncharted has it too and I think Last of Us but I didn't get into too many gun fights there.

Ultratwinkie said:
I divided it by the years the 7th gen was around. Which is 7 years.
Oh, I get what you did. I'm saying it's wrong. If anything, you divide by the life of the card, not by calendar years. This one card made them $500 million in net profit as of that article's date and further encouraged developers porting console games to PC to use their software. What kind of return do they see on other cards? Do they really typically get $500 million back on every card they make? More? Less?

1. AMD is leaving desktop cards. By 2016 they will be mostly gone. There will be no "optimization" because that word is a BUZZ WORD. No different from "cloud processing" o the xbone.
I've seen where AMD is trying to reduce its reliance on PCs but so is every PC company, ever. Even Nvidia is trying to branch out and successfully so. AMD's SoC solutions lend themselves quite well to mobile gaming.

Do you have a citation stating that AMD announced they will be leaving the market entirely? Or do you just have one of the many links stating their desire to be less reliant on PC sales?

Here, an actual article on the 2016 strategy [http://www.forbes.com/sites/sharifsakr/2013/04/23/amd-to-reduce-reliance-on-pc-market-by-2016-sell-new-arm-based-chips/].

What they hope to do is expand their non-traditional PC revenue sources to be as much as half of their revenue. This doesn't have anything to do with selling less on PC. Just more everywhere else. So I'm calling shenanigans on your misappropriation of this statement.

2. Oh boy, 1 million tiny and low poly and res triangles in a low poly and res world. It may be amazing to you but I can see the limitations they had to do. Even the mud demo isn't that all impressive. If you want impressive, see physx's fire breath demo or the water demo.

Unless they use it for rain physics, its not that impressive.
Rain is not object physics in most cases (particularly not persistent objects), and you certainly wouldn't have a million raindrops all at once. You don't get it. 1 million objects rendered simultaneously in real time is a huge step forward. Think about what Physx was popular for. It could persist tens of objects that were broken glass or fragments off of something that was shot; maybe it could even do hundreds before they disappeared. That was last-gen tech. This is a million objects interacting with each other. This would almost never need to be rendered in a game; the impressiveness is in the number. I'm sorry if you don't think it's impressive. But you're also belittling Physx's claim to fame in the same breath. We've gotten to a point where rendering physics for persistent objects far exceeds anything we'd typically see on screen in real time.

Whether you agree or not, this is a significant step in physics. If it can handle a million objects dropping on varying surfaces and bouncing off each other in real time, then it can certainly render a few fragments of broken glass and other debris.

3. Again with the "AMD is beast" crap. A modern 660, which is the 660 TI, beats the 7870. 760 even more so. 770 beats the 7870 like a step child.
*sigh* AMD is not beast. But an AMD card that performs at X is capable of X. If it's capable of X but an Nvidia card is only capable of X-n, then the AMD card is more capable. I have not now, nor have I ever, cared about "extras". All I and most consumers have ever cared about is card performance and price. Why would I buy a different card for the rare occasions where a game ever uses it?

4. More moving goalposts. First it was support, now its "games everyone cares about." Stop moving the damn goalposts. Physx has more games period. The fact its younger than havok and has more is staggering. Its a PC feature for PC devs, don't expect it on console. Which is where havok reigns and only in its BASIC FORM.
Laughable. By "moving goalposts" do you really mean "making valid counterpoints"? You're going to defend the fact, stated by Nvidia's own chart, that the only area where Physx exceeds Havok is the shit pile of games under 50 on Metacritic or with no score at all? I don't consider a card-based physics engine that supports shitty games to be any better than a non-card-based physics engine that supports great games. Do you honestly see no difference in quality over quantity? Do you really believe that the quality of the games referenced has no bearing on the discussion? That was a link directly from Nvidia and I still don't know why they posted it. Havok beat them in every category except the less-than-50-Metacritic one. Why would I buy a card for games I don't want to play? Why would you? I understand buying Nvidia for quality, but not for a barely touched physics engine that works great on paper yet doesn't see the light of day in the vast majority of games that don't suck.

And what's this silly "basic form" nonsense? Developers use Physx and Havok for specific physics calculations; they don't use them to supply all the physics. At least, I don't think they do, since the Physx support is usually just particle physics and not much else, and the support for Havok is a vague "physics", which could mean it's used more fully or could really just be vague. So either both are used solely to augment physics, or Havok is the one used more thoroughly here.