Treblaine said:
    Nutcase said:
        Treblaine said:
            With that I can say with confidence that the CPU is not such a limit on frame-rate. In fact Carmack explicitly stated that it was the rasteriser (i.e. the GPU; he uses the older term because he has been in the business for so long) that was the problem.
        A guy whose day job is engine technology doesn't call GPUs "rasterizers".
    You say that, but how about you actually read the article this thread is based on? There is a quote attributed to him in which he actually uses the word:
    John Carmack admitted: "The rasteriser is just a little bit slower - no two ways about that..."

What I said is that your interpretation is wrong. No one working in engines would mix up a rasteriser and a GPU. What he's referring to could be the relevant part of the GPU, the relevant part of their own engine, or, more likely, the combination of the two. He certainly feels there is something significant left to improve in the code, since he expects to coax double the current speed out of it.
There is no "most advanced CPU", but a range of chips from good to bad for every specific purpose. A Cell is good at working with large masses of vector data, and bad at compiling software. A Core i7 is good at compiling software, but loses to Cell at working on vector data.You also said: "Bottom line, when some calculations need to be done, it doesn't matter where they are done. It matters even less in a high-bandwidth architecture like the CBE."
You also said: "Bottom line, when some calculations need to be done, it doesn't matter where they are done. It matters even less in a high-bandwidth architecture like the CBE."

Well, I think my previous post, where the link showed Crysis being rendered on the most advanced CPUs and still being outperformed by a factor of 15 on the cheapest GPUs, shows that it DOES matter where. In fact it is borne out in this precise scenario: RAGE is under-performing on PS3 by 30-40 fps, running at as low as 33% of the speed of the 360 or PC versions.
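For what it's worth, those two figures line up if you assume a 60 fps target on 360/PC against 20-30 fps on PS3 - those baseline numbers are my assumption, not something stated above. A quick sketch of the arithmetic:

```cpp
#include <cstdio>
#include <initializer_list>

int main() {
    const double target_fps = 60.0;          // assumed 360/PC frame-rate target
    for (double ps3_fps : {20.0, 30.0}) {    // assumed PS3 frame-rate range
        std::printf("PS3 at %.0f fps: %.0f fps behind, %.0f%% of target speed "
                    "(%.1f ms vs %.1f ms per frame)\n",
                    ps3_fps,
                    target_fps - ps3_fps,          // 40 or 30 fps behind
                    100.0 * ps3_fps / target_fps,  // 33% or 50% of target
                    1000.0 / ps3_fps,              // 50.0 or 33.3 ms per frame
                    1000.0 / target_fps);          // vs 16.7 ms per frame
    }
}
```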
Whether, say, physics, transform and lighting are done on the chip labeled "CPU" or the chip labeled "GPU" is irrelevant as long as the overall capacity of the system is enough to get the job done. A Cell paired with a weaker GPU can match a weaker CPU paired with a stronger GPU, because the Cell doesn't need to offload as much work onto the GPU in the first place.
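As a toy model of that argument (all numbers invented): if the CPU and GPU run in parallel, a frame costs roughly whatever the slower of the two takes, so what matters is whether the pair can absorb the total work, not which chip a given job lands on.

```cpp
#include <algorithm>
#include <cstdio>

struct Machine {
    double cpu_ms;  // CPU work queued per frame, in milliseconds
    double gpu_ms;  // GPU work queued per frame, in milliseconds
};

// With the CPU and GPU working in parallel, the frame costs roughly the
// slower of the two (ignoring sync stalls and dependencies for simplicity).
double frame_ms(const Machine& m) {
    return std::max(m.cpu_ms, m.gpu_ms);
}

int main() {
    // Hypothetical machine A: strong vector CPU, weaker GPU. 6 ms of
    // lighting/skinning work has been pulled off the GPU onto the CPU.
    Machine a{10.0 + 6.0, 22.0 - 6.0};

    // Hypothetical machine B: weaker CPU, stronger GPU. The same 6 ms of
    // work stays on the GPU, which is fast enough to absorb it.
    Machine b{10.0, 16.0};

    // Both configurations land on the same 16 ms frame (~60 fps).
    std::printf("A: %.1f ms/frame, B: %.1f ms/frame\n", frame_ms(a), frame_ms(b));
}
```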
In the same arbitrary fashion as with the processors, you hold a PC/360-optimized engine up as a universal standard and the direction of "advancement". It makes just as much sense to say KZ2 is the standard, and that a 360-optimized engine makes COMPROMISES like using bad lighting.

And it is not id Software's fault that the PS3 can't handle an advanced engine. Your proposal of using a different TYPE of engine is unrealistic, since most PS3 exclusives use fundamentally the same type of engine, with graphical COMPROMISES like the low-res textures in KZ2, while pushing the PS3's strengths like certain lighting effects.