the antithesis said:
I don't know what the fps of most HD televisions is nor if that's even a factor.
This gets goofy-bananas fairly quickly. And this only refers to North American video standards.
Take an image. Slice it up into 525 lines. Number the lines 1 through 525. Scan the odd-numbered lines out in 1/60th of a second. Then scan out the even-numbered lines in the next 1/60th of a second. You now have one frame of video, which took 1/30th of a second to display. Hence, the video field rate is 60 FPS, and the video frame rate is 30 FPS. This alternating display of odd then even then odd then even lines is called interlacing.
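If it helps to see the idea as code, here's a tiny Python sketch of that split (purely illustrative; the line counts and rates are just the ones from the paragraph above, not a real scan chain):

```
# Split a 525-line frame into an odd field and an even field,
# scanned one after the other, the way interlaced NTSC-era video does.

TOTAL_LINES = 525
FIELD_RATE = 60.0                                # fields per second (pre-color)

lines = range(1, TOTAL_LINES + 1)
odd_field = [n for n in lines if n % 2 == 1]     # scanned in the first 1/60 s
even_field = [n for n in lines if n % 2 == 0]    # scanned in the next 1/60 s

frame_rate = FIELD_RATE / 2                      # two fields make one frame
print(len(odd_field), len(even_field))           # 263 262
print(f"field rate: {FIELD_RATE} Hz, frame rate: {frame_rate} Hz")  # 60.0 / 30.0
```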
Now, introduce color television. It turns out you need a little extra space on every video line to synchronize the color circuitry (colorburst), but you can't speed up the horizontal sweep to compensate. Result: You're now scanning fields at 59.94 per second and frames at 29.97 per second. So that's where those weird numbers come from.
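The commonly cited arithmetic is that the color standard slowed the original 60/30 rates by a factor of 1000/1001 (the full engineering rationale is longer than the loose summary above, but the numbers themselves are easy to check):

```
# Where the "weird" color NTSC numbers come from, numerically:
# the original 60/30 rates scaled by 1000/1001.

field_rate = 60 * 1000 / 1001     # ~59.94 fields per second
frame_rate = 30 * 1000 / 1001     # ~29.97 frames per second
print(round(field_rate, 4), round(frame_rate, 4))   # 59.9401 29.97
```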
Meanwhile, computer displays were originally repurposed television sets, so they swept out imagery at 60 fields/30 frames per second. And it was blurry and jittery. Someone said, "Say, can't we display all 525 lines in one sweep instead of two? The machines are fast enough now..." And thus were born progressive displays, where all the lines of an image are swept out at once, rather than being broken up over a series of fields.
Then the computer guys said, "You know, 640 * 480 really isn't enough pixels. Can we get more lines? And more pixels per line?" And thus was born the multisync monitor, which would try to adapt to whatever horizontal and vertical sweep rates the computer was generating. Soon there was 800 * 600, 1024 * 768, 1152 * 864, 1280 * 1024, and beyond. And you could display them at 60 frames per second, 72 frames per second, and eventually 240 frames per second.
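To give a feel for what a multisync monitor had to keep up with, here's a rough throughput sketch for a few of those modes (blanking intervals ignored, so the real numbers are somewhat higher; the modes listed are just examples):

```
# Approximate pixel throughput for a few resolution/refresh combinations,
# ignoring blanking, to show how the demands climb.

modes = [(640, 480, 60), (800, 600, 72), (1024, 768, 72), (1280, 1024, 60)]
for w, h, hz in modes:
    print(f"{w} * {h} @ {hz} Hz ~ {w * h * hz / 1e6:.1f} Mpixels/s")
```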
Then someone saw how much nicer the computer displays were looking compared to their broadcast television counterparts and said, "Can we get some of that?" And thus were born the first "hi-def" video standards. But they had to be quasi-compatible with all the very expensive equipment the studios had already paid for. Hence, "Standard Def" is the old interlaced color video standard of 59.94 fields/29.97 frames per second, called "480i" (the 'i' means "interlaced"). 480p is the same number of pixels as 480i, but they're swept out over a single 1/59.94th second frame, rather than two 1/59.94th second fields.
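Put as timing (again a back-of-the-envelope sketch, ignoring blanking and other real-world details), the difference between the two looks like this:

```
# How long it takes to get a complete 480-line picture in each case:
# 480i needs two fields; 480p delivers it in a single sweep.

rate = 59.94                      # sweeps (fields or frames) per second

i_field = 1 / rate                # one 480i field
i_frame = 2 / rate                # two fields = one complete 480i picture
p_frame = 1 / rate                # one complete 480p frame

print(f"480i: field {i_field * 1000:.2f} ms, full picture {i_frame * 1000:.2f} ms")
print(f"480p: full picture {p_frame * 1000:.2f} ms")
```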
But the new hi-def sets were being designed by the same groups of people as had been making computer monitors all that time, and they said, "There's no reason this multisync tech can't work in a TV. If we're getting a signal that's 59.94 fields per second interlaced, we'll sync to that. If we're getting a signal that's 720p at exactly 60.00 frames per second, we'll sync to that, too. Hell, plug in your old computer; we'll display whatever that thing's kicking out. 1024 * 768 @ 72 FPS? No problem..."
So the "standard" digital video format today is, "Whatever the content creator exported it as," since all the new displays just adapt. If you're going to broadcast over the air, then you are constrained by the FCC to a limited, well-defined set of formats, which are 480i, 480p, 720p, and 1080i. (I don't know if there's an FCC-sanctioned 720i or 1080p.) OTOH, if you're just sharing H.264 files over the Internet, then it can be whatever you think the recipient's video player can handle.
And now all that is stuck in your brain, too.