You're right that it's a quirk of our human representation, but you've got it the wrong way around.

tahrey said:
> As for the .9999 thing... I don't see the relevance either. It's just a quirk of human, digital representation of the universe's analogue nature. It's a sampling error, if you like. One that gets ever smaller as you refine your digital representation to a more accurate level with more digits (same as sampling analogue data with more bits; something that would be 254.99609375 in a 16-bit system normalised to a 0-255 scale (or 65279 without normalising) quickly becomes 255 dead when you cut out some of the bits, especially when reducing to 8-bit). Similarly, the universe holds 0.999 recurring to an infinite number of places; it trends to 1.0, but never reaches it. The limit here is one of our own perception, and of our number system. There could be room for a million and one 9's after the 0 when you spread your measurement out to encompass, at Planck-length resolution, the width of the entire universe, but if we only represent that with 999,999 nines, it becomes 1.0... the number under consideration hasn't changed, it's just that our representation of it is inaccurate.
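For what it's worth, the bit-depth example in the quoted post checks out; here's a two-line sketch in plain Python (the divide-by-256 normalisation and round-to-nearest rescaling are my reading of what the quote describes, not something it states explicitly):

```python
v16 = 65279  # the 16-bit sample from the quoted example

# Normalising to a 0-255 scale by dividing by 256 reproduces the
# quoted value exactly.
print(v16 / 256)  # 254.99609375

# Reducing to 8 bits with round-to-nearest: add half a step (128)
# before discarding the low byte, and the value lands on 255 dead.
print((v16 + 128) >> 8)  # 255
```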
It's not that the universe "knows" that 0.999... and 1 are different but we, as humans, misrepresent them as the same. It's that the universe "knows" that 0.999... and 1 are exactly the same number but we, as humans, have accidentally created a decimal system that can represent the same number in two different ways.
0.999... is just another way of writing 1.
1.999... is just another way of writing 2.
2.4572999... is just another way of writing 2.4573.
They are rather silly ways of writing those numbers, but they weren't invented to be useful; in fact, they weren't really invented at all. Those ways of writing those numbers exist merely as a by-product of our number system.
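To see the "trends to 1.0 but never reaches it" intuition and the actual limit side by side, here's a minimal sketch in plain Python using exact rational arithmetic (the standard `fractions` module), so no floating-point rounding muddies the water:

```python
from fractions import Fraction

# Partial sums of 0.9 + 0.09 + 0.009 + ...: every FINITE string of 9s
# really does fall short of 1, by exactly 1/10**n.
for n in (1, 5, 10):
    partial = sum(Fraction(9, 10**k) for k in range(1, n + 1))
    print(n, partial, 1 - partial)  # the gap is 1/10**n

# But "0.999..." names the limit of that process, not any partial sum,
# and the geometric series formula gives the limit exactly:
#   (9/10) / (1 - 1/10) = 1
limit = Fraction(9, 10) / (1 - Fraction(1, 10))
print(limit)  # prints 1: the limit is exactly the integer 1
```

The point of using `Fraction` rather than floats is that the gap `1/10**n` is computed exactly: the partial sums are genuinely less than 1, while the limit is genuinely equal to 1, which is all the original claim amounts to.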