Dead Rising 3 Gets Massive 13GB Update

Hairless Mammoth

New member
Jan 23, 2013
1,595
0
0
Sanunes said:
From what I have read elsewhere, the game only takes up an additional 2-3 gigs after the download, so the real question is why did they need to overwrite 10-11 gigs of content. Since I haven't heard of a patch this big yet for the XB1, I really wonder what Capcom did wrong with their game design.
I was about to ask that. Steam is always updating things like Left 4 Dead or Team Fortress with 4+ GB patches, and my Steam folder isn't growing by that much. At least those are multiplayer-centric games that get tweaked all the time. These PCs in console clothing may have surpassed actual PCs in a negative aspect that was once a PC-only thing. If they have to send you most of the original game's worth of data to overwrite what's installed, I'd hate to think how rushed games will be in the future if this sets a precedent.
 

Coach Morrison

New member
Jun 8, 2009
182
0
0
Hairless Mammoth said:
Sanunes said:
From what I have read elsewhere, the game only takes up an additional 2-3 gigs after the download, so the real question is why did they need to overwrite 10-11 gigs of content. Since I haven't heard of a patch this big yet for the XB1, I really wonder what Capcom did wrong with their game design.
I was about to ask that. Steam is always updating things like Left 4 Dead or Team Fortress with 4+ GB patches, and my Steam folder isn't growing by that much. At least those are multiplayer-centric games that get tweaked all the time. These PCs in console clothing may have surpassed actual PCs in a negative aspect that was once a PC-only thing. If they have to send you most of the original game's worth of data to overwrite what's installed, I'd hate to think how rushed games will be in the future if this sets a precedent.
If I remember right from the TF2 forums, people would sometimes complain during updates that their client was downloading a much larger patch than expected. I think it was a bug where it would re-download most of the files for whatever reason.

OT: So it's for DLC? Why not have people download it when they buy the stuff, instead of taking up space?
 

antipunt

New member
Jan 3, 2009
3,035
0
0
Nathan Josephs said:
You know, not everyone has fibre yet. A 13 GB mandatory patch is pretty ridiculous.
Not to mention that 13 GB is enough for an entire new game >_> ....
 

Strazdas

Robots will replace your job
May 28, 2011
8,407
0
0
Sanunes said:
Creator002 said:
StubbornGiant said:
How much space are you gonna have left after the updates just for games xD
IGN did an experiment with the Xbox One's hard drive. They couldn't install every game currently out; it fills up well before you get anywhere near using the full 500 GB.

OT: What did they do? Remake the whole game? I don't think all my programming projects, including games, even come close to 13 GB combined, and this is just a patch!
From what I have read elsewhere, the game only takes up an additional 2-3 gigs after the download, so the real question is why did they need to overwrite 10-11 gigs of content. Since I haven't heard of a patch this big yet for the XB1, I really wonder what Capcom did wrong with their game design.
The answer to that is bad file storage. They probably went the easy way and stored the data in large archives, like a textures.dat holding ALL the textures in the game in one file. That one file might actually contain 10,000 files, but the hard drive treats it as a single file, so things are less likely to get fragmented. However, they probably encoded the archive in such a way that you can't replace part of it without reprocessing the whole archive (part of why modding some games is hard), so they figured it's easier to just redownload the whole archive even if only part of it changes, hence the huge download. We see this a lot with MMOs and their archived packages: an expansion comes out, and here's a 5 GB download that won't actually increase the size of the game.
Basically this is a good idea when you plan to never update the game, since it usually causes less trouble for people who don't know how hard drives work and let them get fragmented. (NTFS actually checks whether it can fit the whole file without fragmenting when writing, which causes less fragmentation as the drive gets used; FAT32 doesn't, but really no one uses FAT anymore.)
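To show what I mean, here's a rough Python sketch of a made-up pack format (purely illustrative, nothing like whatever Capcom actually uses): the blobs sit back to back behind a header, so changing the size of one asset shifts everything after it, and the lazy fix is to rewrite or redownload the whole archive.

    import struct

    # Toy pack layout: [count][name_len, data_len, name] * count, then all blobs
    # back to back. Hypothetical format, just to show why swapping one asset
    # usually means rewriting the whole archive.

    def write_pack(path, assets):
        """assets: dict of name -> bytes. Writes the entire archive in one go."""
        with open(path, "wb") as f:
            f.write(struct.pack("<I", len(assets)))
            for name, data in assets.items():
                encoded = name.encode("utf-8")
                f.write(struct.pack("<II", len(encoded), len(data)))
                f.write(encoded)
            for data in assets.values():
                f.write(data)

    def read_pack(path):
        assets = {}
        with open(path, "rb") as f:
            count = struct.unpack("<I", f.read(4))[0]
            entries = []
            for _ in range(count):
                name_len, data_len = struct.unpack("<II", f.read(8))
                entries.append((f.read(name_len).decode("utf-8"), data_len))
            for name, data_len in entries:
                assets[name] = f.read(data_len)
        return assets

    def replace_asset(path, name, new_data):
        # Changing one asset's size shifts every blob after it, so the naive
        # approach is to read everything back and rewrite the whole file.
        assets = read_pack(path)
        assets[name] = new_data
        write_pack(path, assets)

Patching one texture through replace_asset rewrites the entire pack, which on the user's end is basically the same thing as shipping the whole archive again.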
 

ntfwc

New member
Oct 28, 2013
14
0
0
Strazdas said:
Sanunes said:
The answer to that is bad file storage. They probably went the easy way and stored the data in large archives, like a textures.dat holding ALL the textures in the game in one file. That one file might actually contain 10,000 files, but the hard drive treats it as a single file, so things are less likely to get fragmented. However, they probably encoded the archive in such a way that you can't replace part of it without reprocessing the whole archive (part of why modding some games is hard), so they figured it's easier to just redownload the whole archive even if only part of it changes, hence the huge download. We see this a lot with MMOs and their archived packages: an expansion comes out, and here's a 5 GB download that won't actually increase the size of the game.
Basically this is a good idea when you plan to never update the game, since it usually causes less trouble for people who don't know how hard drives work and let them get fragmented. (NTFS actually checks whether it can fit the whole file without fragmenting when writing, which causes less fragmentation as the drive gets used; FAT32 doesn't, but really no one uses FAT anymore.)
I think really big files are, in general, more likely to get fragmented. However, I suppose that if you had an archive file that you wrote directly over, without resizing it at all, future updates shouldn't make fragmentation worse. But then you are implementing a kind of file system in the file itself, which is a bit strange. It would be a development effort with few gains and added restrictions that I doubt studios would subject themselves to.
FAT32 is a very dated file system, but it does seem to get used in basically all USB flash drives. It doesn't support files over 4GB (as indicated by the 32 in the name). It is possible to manually split larger files up before copying them over, but that's not really something you want to do with any frequency.
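If anyone is curious, the splitting itself is trivial; here's a throwaway Python sketch (purely illustrative) that writes numbered 2 GiB parts you would rejoin later with cat or copy /b:

    def split_file(path, part_size=2 * 1024**3, buf_size=16 * 1024**2):
        # Write numbered parts of at most part_size bytes (2 GiB here, comfortably
        # under FAT32's 4 GiB limit). buf_size should divide part_size evenly so
        # every part except the last comes out exactly part_size bytes.
        part = 0
        with open(path, "rb") as src:
            chunk = src.read(buf_size)
            while chunk:
                with open("{0}.part{1:03d}".format(path, part), "wb") as dst:
                    written = 0
                    while chunk and written < part_size:
                        dst.write(chunk)
                        written += len(chunk)
                        chunk = src.read(buf_size)
                part += 1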

OT:

A 13 GB update really doesn't seem like something that should be common. Proper patches should only include the data that actually changes. So unless they changed a significant amount of the assets for the game, this really shouldn't be so big. Even if they are using large archives, patching systems can work on a binary level (most effective if the patch is applied on an uncompressed level, of course).
Granted, it could turn out to be a common occurrence if Microsoft only provides them one method to patch, and it's an inefficient one.
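A toy example of the idea (nothing like real tools such as bsdiff or xdelta, which do proper byte-level matching; this just compares fixed-size blocks): ship only the blocks that changed, and rebuild the file on the console from the old copy plus those blocks.

    BLOCK = 64 * 1024  # fixed 64 KiB blocks for comparison

    def make_delta(old_path, new_path):
        # Record only the blocks of the new file that differ from the old one.
        changed = {}
        new_size = 0
        with open(old_path, "rb") as old, open(new_path, "rb") as new:
            index = 0
            while True:
                old_block = old.read(BLOCK)
                new_block = new.read(BLOCK)
                if not new_block:
                    break
                new_size += len(new_block)
                if old_block != new_block:
                    changed[index] = new_block
                index += 1
        return new_size, changed

    def apply_delta(old_path, out_path, new_size, changed):
        # Rebuild the new file from the old copy plus the changed blocks.
        with open(old_path, "rb") as old, open(out_path, "wb") as out:
            index = 0
            written = 0
            while written < new_size:
                old_block = old.read(BLOCK)
                block = changed.get(index, old_block)[: new_size - written]
                out.write(block)
                written += len(block)
                index += 1

In principle, something like this means the download tracks how much data actually changed rather than the size of whatever archive the changed data happens to live in.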
 

MrBaskerville

New member
Mar 15, 2011
871
0
0
Will you be able to do other things while it downloads? And can you turn off the machine during a download and continue it later?
 

Strazdas

Robots will replace your job
May 28, 2011
8,407
0
0
ntfwc said:
I think really big files are, in general, more likely to get fragmented.
One big file is more likely to get fragmented than one small file. However, one big file versus 10,000 small files is less fragmented overall, as those 10,000 files would be written all over the drive. And since NTFS looks for a "large enough" space to fit the file when possible, at worst that big file will end up in, say, 5 fragments, instead of 10,000 small files scattered in fragments across the drive. That makes reading easier, and since you need to read hundreds of those files every time you're loading something, it makes working with the data faster.

But then you are implementing a kind of file system in the file itself, which is a bit strange. It would be a development effort with few gains and added restrictions that I doubt studios would subject themselves to.
I do not know how hard or easy it is to develop for that, only that most games I know use it, and that games that didn't and are in constant development have also started moving towards packages like these (for example, World of Tanks). It is obviously popular with game developers.

FAT32 is a very dated file system, but it does seem to get used in basically all USB flash drives. It doesn't support files over 4GB (as indicated by the 32 in the name). It is possible to manually split larger files up before copying them over, but that's not really something you want to do with any frequency.
I have a flash drive with FAT32 as well, but that is also getting outdated. All the flash drives I see now are going full NTFS, so FAT32 is pretty much obsolete, with the exception of older devices like my MP3 player, which still uses a 2 GB FAT32 card.

Even if they are using large archives, patching systems can work on a binary level (most effective if the patch is applied on an uncompressed level, of course)
I agree that this is how it SHOULD work. However, sometimes archives are built in a way where you would need to "rebuild" them to make this work (for example, if you mod San Andreas and forget to rebuild the data file, the game will be full of bugs). And having a "mysterious rebuild" run on the user's machine that takes a long time with large archives (around 2 minutes for the ~2 GB files I've done it with; no idea how long 13 GB would take) may anger users, especially considering that nowadays some people could download the whole thing faster than that. I do agree that updating the archives is what should be done, but does the average user even know what that is?
 

ntfwc

New member
Oct 28, 2013
14
0
0
Strazdas said:
ntfwc said:
But then you are implementing a kind of file system in the file itself, which is a bit strange. It would be a development effort with few gains and added restrictions that I doubt studios would subject themselves to.
I do not know how hard or easy it is to develop for that, only that most games I know use it, and that games that didn't and are in constant development have also started moving towards packages like these (for example, World of Tanks). It is obviously popular with game developers.
I don't doubt that packaging resources is popular. I was referring to designing an archive format where you could update the archive file in place without changing its size, in order to avoid possible fragmentation from reallocation. Since individual objects would likely change size in updates, you would have to handle allocation and deallocation within the archive yourself, so you would be implementing a type of file system. It would be an interesting challenge.
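Something like this, maybe (a toy sketch of the bookkeeping, entirely hypothetical; real engines presumably do something cleverer): the archive keeps a table of extents and a free list, and replacing an asset carves a new extent out of a hole and frees the old one, without ever moving or resizing the archive file itself.

    class PackAllocator:
        # Toy "file system inside a file": asset name -> (offset, length) extents
        # inside a fixed-size archive region, plus a first-fit free list.

        def __init__(self, total_size):
            self.assets = {}               # name -> (offset, length)
            self.free = [(0, total_size)]  # (offset, length) holes, kept sorted

        def _allocate(self, length):
            for i, (off, size) in enumerate(self.free):
                if size >= length:
                    leftover = (off + length, size - length)
                    self.free[i:i + 1] = [leftover] if leftover[1] else []
                    return off
            raise IOError("archive region full; would have to grow the file")

        def _release(self, offset, length):
            self.free.append((offset, length))
            self.free.sort()
            merged = []
            for off, size in self.free:  # coalesce adjacent holes
                if merged and merged[-1][0] + merged[-1][1] == off:
                    merged[-1] = (merged[-1][0], merged[-1][1] + size)
                else:
                    merged.append((off, size))
            self.free = merged

        def replace(self, name, new_length):
            # Place the new version first, then free the old extent, so a crash
            # mid-write never clobbers the only good copy. The caller would then
            # write new_length bytes at the returned offset.
            old = self.assets.get(name)
            offset = self._allocate(new_length)
            self.assets[name] = (offset, new_length)
            if old:
                self._release(*old)
            return offset

The obvious catch is exactly what you said: you have now written a small allocator, complete with its own fragmentation inside the archive to worry about.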
 

StubbornGiant

New member
Apr 30, 2011
19
0
0
Zachary Amaranth said:
Chaosritter said:
Aren't most games in the 40 GB range on the Xbone? So isn't it closer to a third of a game?
If that's the case then it's worse than I thought, lol. I sense portable Xbone hard drive extensions in the near future...
 

Something Amyss

Aswyng and Amyss
Dec 3, 2008
24,759
0
0
StubbornGiant said:
If that's the case then it's worse than I thought, lol. I sense portable Xbone hard drive extensions in the near future...
I think that was the plan in the first place, but it's still kind of stupid (of them). Especially when you're trying to make this the center of a living room experience.