Crono1973 said:
Pearwood said:
Always thought it was weird how people blame the PS3 for a problem that's exclusive to one game. One notoriously buggy game that the developers have admitted was badly optimised and have pretty much fixed now at least in my experience. I just don't think Skyrim is the most intense game on the console and yet it's the only one with this problem.
E - just realised I was trying to be fair in a fanboy post, I think I'm doing it wrong. *ahem* LOLZ PS3 RULES 360 SUCKS. All is as it should be now.
Yeah, it's kinda crazy how people thought it was the job of the PS3 to make Bethesda's game work better.
Your post does not tell the whole story.
360
CPU: IBM PowerPC "Xenon", Tri-Core, 3.2GHz
RAM: 512MB GDDR3
GPU: ATi Xenos (basically a modified X1800-series part with unified shaders) at 500MHz, with 10MB eDRAM.
PS3
IBM PowerPC "Cell Broadband", 3.2Ghz
RAM: 256MB XDR DRAM System Memory
GPU: NVIDIA RSX (essentially a lightly modified GeForce 7800) @ 550MHz
VRAM: 256MB GDDR3
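
The RAM lines are the part a raw spec sheet doesn't really show: the 360's 512MB is one unified pool the game can carve up however it likes, while the PS3's 512MB total is hard-split into 256MB of XDR system memory and 256MB of GDDR3 for the RSX. A rough back-of-the-envelope sketch of why that matters for a game whose world state keeps growing (the OS-reserved figures are my own ballpark guesses and the function names are made up, so treat it as an illustration, not real numbers):

def fits_on_360(game_mb, gfx_mb, os_reserved_mb=32):
    # Unified pool: game data and graphics data share one 512MB budget,
    # so an imbalance on one side can borrow from the other.
    return game_mb + gfx_mb <= 512 - os_reserved_mb

def fits_on_ps3(game_mb, gfx_mb, os_reserved_mb=48):
    # Split pools: each side has to fit within its own 256MB, full stop.
    return game_mb <= 256 - os_reserved_mb and gfx_mb <= 256

# A hypothetical late-game workload: bloated world state, modest graphics.
print(fits_on_360(300, 180))  # True  - the unified pool absorbs the imbalance
print(fits_on_ps3(300, 180))  # False - the 256MB system side is the hard wall
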
Wii
IBM "Hollywood" (PowerPC G4 based, basically higher clocked gamecube CPU) @ 729Mhz
RAM: 64MB GDDR3 main system memory
GPU: ATi "Hollywood" (Based on Gamecube's "Flipper GPU", higher clocked)
VRAM: 24MB 1T-SRAM
For Giggles
Gamecube
CPU: IBM PowerPC "Gekko" (PowerPC 750CXe core) @ 486MHz
RAM: 24MB 1T-SRAM Main System Memory
GPU: ATi "Flipper" 162mhz
VRAM: 3MB 1T-SRAM
Nintendo 64
CPU: NEC VR4300 (MIPS R4300i) clocked at 93.75MHz (could not use DMA to access main memory)
RAM: 4MB unified RDRAM (expandable to 8MB with the Expansion Pak) clocked at 500MHz on a narrow 9-bit bus, with horrible access latency
RCP/GPU: 62.5MHz multi-function chip (handles video, audio, I/O, and contains the memory controller), also hindered by a 4KB texture cache.
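
To put that 4KB texture cache in perspective, some quick arithmetic (using only the 4KB figure above; it ignores palettized formats and mipmapping, which roughly halve the usable space again):

TMEM_BYTES = 4096                   # the RCP's entire texture cache
BYTES_PER_TEXEL = 2                 # a 16-bit texel format
texels = TMEM_BYTES // BYTES_PER_TEXEL
print(texels)                       # 2048 texels, i.e. about a 64x32 tile
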