NLS said:
Why are people surprised? 2,147,483,647 is the highest you can go with a signed integer.

arc1991 said:
I'm no expert in the binary department... but why will computer clocks and systems have major problems? It's just a number, 19/01/2038...

He is thinking of the UNIX Millennium. Unix systems used a signed 32-bit int to store the number of seconds since 1 January 1970 (the "Unix epoch"), which is how UNIX systems and their derivatives like BSD, Mac OS X and GNU/Linux tell time internally (for arbitrary reasons that made sense to stoned computer researchers in 1970).
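The two numbers in the thread are the same thing. A quick Python sketch showing that 2^31 − 1 seconds after the Unix epoch lands exactly on 19 January 2038:

```python
from datetime import datetime, timedelta, timezone

INT32_MAX = 2**31 - 1  # 2,147,483,647 — the largest signed 32-bit value

# The Unix epoch: midnight UTC, 1 January 1970.
epoch = datetime(1970, 1, 1, tzinfo=timezone.utc)

# The last second a signed 32-bit time_t can represent.
rollover = epoch + timedelta(seconds=INT32_MAX)
print(rollover)  # 2038-01-19 03:14:07+00:00
```

One tick later, a 32-bit time_t wraps to a negative value and the clock reads December 1901.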
Hell, they could almost try to make that a selling point for the next-gen version of GTA V: "Now with support for a maximum of $9,223,372,036,854,775,807!", since that's how high you can go with a signed 64-bit integer.
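For illustration, here's a small Python sketch of both caps. Python's own ints don't overflow, so the 32-bit wraparound has to be simulated by hand (the `wrap_int32` helper is made up for this demo):

```python
def wrap_int32(n: int) -> int:
    # Simulate two's-complement wraparound of a signed 32-bit integer.
    return (n + 2**31) % 2**32 - 2**31

# One dollar past the GTA V money cap wraps to a huge negative balance.
print(wrap_int32(2_147_483_647 + 1))  # -2147483648

# The signed 64-bit cap quoted above:
print(2**63 - 1)  # 9223372036854775807
```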
People will always say "oh, why didn't they think of that?", but did you know that a lot of computer clocks and systems will have major problems when the date hits 19 January 2038, because of the exact same problem?
Brain. Hurts.
It's not that big an issue, as all major Unix implementations and derivatives have already moved (or are about to move) to a 64-bit int. And if you are still running legacy systems that require a 32-bit time value in 2038, then you have bigger problems than that.
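As a rough back-of-the-envelope check on why 64 bits settles it: counted in seconds, a signed 64-bit value lasts on the order of 292 billion years, roughly twenty times the current age of the universe:

```python
SECONDS_PER_YEAR = 365.25 * 24 * 3600  # Julian year; close enough here

# How long a signed 64-bit seconds counter lasts before overflowing.
years_until_overflow = (2**63 - 1) / SECONDS_PER_YEAR
print(f"{years_until_overflow:.2e} years")  # roughly 2.9e+11 years
```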
It doesn't affect Windows systems, although Windows has had its share of time issues and integer overflows in the past due to sloppy coding and bad design.