Poll: Does video quality affect your opinion of a movie?

Bad Jim

New member
Nov 1, 2010
1,763
0
0
Stabinbac said:
Pixel count at this point is far less important than dynamic range. That needs better cameras, better compression techniques, larger file sizes, and higher quality TVs. Sick of shitty, blocky shadows.
Very true. Although I think professional cameras and compression techniques are already up to the task.

They also need higher framerates. I have a 144Hz monitor, and it's lovely to have games running at 100+fps, but no video media goes that high. It's especially annoying because 3DTVs can theoretically handle it already.

Maybe, if sites like YouTube would ever listen to reason, we could get them to limit bitrate but let us choose how to use that bitrate. There's no reason we should be allowed to watch 1080p videos but not a 480p video at 60fps, or in 48-bit colour.
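To put rough numbers on that (a back-of-the-envelope sketch using uncompressed rates, and assuming an 854-wide 16:9 frame for 480p; compression changes the absolute figures but not the proportions), those two options cost about the same:

```python
# Uncompressed video data rate in megabits per second.
def raw_mbps(width, height, fps, bits_per_pixel):
    return width * height * fps * bits_per_pixel / 1e6

print(raw_mbps(1920, 1080, 24, 24))  # 1080p/24fps/24-bit: ~1194 Mbps
print(raw_mbps(854, 480, 60, 48))    # 480p/60fps/48-bit:  ~1181 Mbps
```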
 

Twinrehz

New member
May 19, 2014
361
0
0
Country
Norway
Zachary Amaranth said:
Of course, early adopters don't necessarily push things forward. History is littered with failed formats and ideas. The various permutations of CD, for example, never took over because there just isn't a market. People liked CDs and they didn't see the point in spending extra for DVDA, HDCD, SACD, etc. It's worse now, with more people buying non-physical media.

And I suspect it'll be worse for 4K and 8K for that reason, too. Not only are consumers likely to think that BD (or maybe even DVD) is "good enough," I don't think consumer markets are going to want to invest that much in it. I know that the early adopter and cinemaphile will, but will enough of the market care? Will broadcast companies want to broadcast in 8K at a significant cost with few potential viewers? Will enough movies be moved?

I'm not saying "no" but I am kind of leaning that way. I'm sure others disagree, though.
I think that's exactly the case. The know-it-alls of the audiophile world have clamored since the dawn of the CD that it's just plain bad. The thing is, the average guy on the street won't notice the difference, because he hasn't been introduced to the world of extremely expensive equipment where there might be any reason to clamor for higher quality. Even MP3 (and more recently, MP4) is widely used and considered "good enough", even though it's technically inferior even to CD. Personally I wouldn't have minded more widespread use of at least DVDA, since most modern equipment comes with a DVD player, but I can't say for certain that I would notice the difference.

I have ONE DVD that boasts DVDA, Jean Michel Jarre's album Aero, but I have yet to test it, and first I have to retrieve it from my parents' house. I also have a CD with HDCD, but I can't be bothered to actually BUY a CD player that supports it, for two reasons: 1) I have ONE CD in that format, and 2) I mainly just copy my CDs over to my computer and use that for playback, so that a) I don't have to swap discs whenever I want to listen to something, and b) I can conveniently use the PC even for disc playback if I feel like it.

I find I can go on about this, so I shall. Only recently have I found myself in possession of equipment that might actually let me hear the difference between MP3, CD and DVDA, and only because I went for it specifically. Had I been less "aware" of it (or cared less, as I assume most people do), I would probably not have bought a HiFi system, but rather settled for some home entertainment system with a built-in amp and tiny speakers, coupled with a thundering subwoofer to add the bassline. The thing is, what I bought is considered cheap in the world of HiFi, but for the same money I could have bought the aforementioned entertainment system.

I suppose the price tag is there to separate the wheat from the chaff, but it almost seems like deliberate alienation of those who don't care enough to purchase anything solid that will last them. (I'm deeply sorry if this seems offensive to anyone; it was not meant that way. I'm merely trying to differentiate between consumer and enthusiast.) I also bought a DAC because someone advised me to, since I use my PC for playback, and the headphone out on computers (as on most devices) is simply not built to drive speakers. I've seen it with my father's insistence on hooking his TV up to the amp via a cable plugged into the TV's headphone jack: the volume needs to be turned way up to hear any normal conversation, and at low volumes you get a weird buzzing sound with low frequencies.

The same becomes apparent in the world of video. Most people have "seen" the improvement of Blu-ray over DVD, and Blu-ray has become widely available enough to present a viable reason to purchase a player that supports it. But people with a screen smaller than 60" are unlikely to notice the difference with 4K, while 60" and above is considered extravagantly large by most people, not to mention prohibitively expensive and difficult to position. The room for impressing the common man is growing small. Enthusiasts will find room in their hearts and homes for this, but for anyone with a normal income, a normal-sized house and pretty normal interests, it's more of a gimmick whose only distinguishing feature is being more expensive.

Some might argue that you can hear/see the difference, and most people might be able to, but there comes a time when people get tired of switching out what they have because it's outdated, and when the difference is so small you might as well not bother. I was willing to shell out full price for the Game of Thrones Blu-ray box sets when they came on the market; I can't say I'm willing to shell out for them again just for the benefit of 4K resolution, which I'd have to purchase a new player and a new TV to actually enjoy.

As far as I know, there aren't even TV channels in Europe airing in 1080p, because there isn't enough room in the signal spectrum used by satellites to accommodate that bandwidth, and yet they're already trying to introduce 4K to people. MAYBE the 3D channels offered on some networks; I seem to recall 3D channels needing 1080p resolution, though I could be remembering that wrong. It still doesn't change the fact that bandwidth requirements increase dramatically with higher resolutions, unless they use compression that renders the whole thing moot.

kasperbbs said:
1080p is enough for me. Perhaps I would change my opinion if I saw what 2K or above looks like, but I couldn't afford a new TV right now anyway, so I'll probably switch when 4K becomes the norm and prices won't be as ridiculous as they are now, or when my TV dies and I'll have no choice in the matter.
Just wanted to point out that 1080p already is 2K, more or less. The "1080" in 1080p counts vertical pixels, while the "K" names count horizontal ones: 1080p is 1920 pixels wide (roughly 2K), 4K is 3840 pixels wide (twice that), and 8K doubles the width again to 7680. Why they called it 4K instead of 2160p, I don't know; I suppose 4K is easier to say.

You might want to read that two or three times to get the gist of it. In my head it seems so simple, but I get why it could seem confusing.
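Spelled out (using the consumer UHD figures; the DCI cinema variants are slightly wider):

```python
# Pixel counts for the common consumer (UHD) resolutions.
formats = {
    "1080p (~2K)": (1920, 1080),
    "4K UHD":      (3840, 2160),
    "8K UHD":      (7680, 4320),
}
for name, (w, h) in formats.items():
    print(f"{name}: {w} x {h} = {w * h:,} pixels")

# Each step doubles BOTH dimensions, so 4K has FOUR times the
# pixels of 1080p -- the "K" number only counts the width.
```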
 

Stabinbac

New member
Nov 25, 2010
51
0
0
Bad Jim said:
They also need higher framerates. I have a 144Hz monitor, and it's lovely to have games running at 100+fps, but no video media goes that high. It's especially annoying because 3DTVs can theoretically handle it already.
That's a whole can of worms in itself. Higher frame rates force shorter shutter speeds. That leads to higher sensitivity, which leads to more noise. You make darker scenes more difficult. You also reduce the artistic use of the motion blur you get with slower shutter speeds.

Games can give you nice simulated blur at 1000fps. Movies can't.
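To put numbers on the shutter problem (a sketch assuming the common 180-degree shutter convention, where the shutter is open for half of each frame; that's a stylistic norm, not a hard limit):

```python
import math

# Longest possible exposure per frame at a given frame rate,
# assuming a 180-degree shutter (open half the frame interval).
def max_shutter_seconds(fps, shutter_angle=180):
    return (shutter_angle / 360) / fps

t24 = max_shutter_seconds(24)    # 1/48 s
t100 = max_shutter_seconds(100)  # 1/200 s

# Light lost per frame, in photographic stops:
print(math.log2(t24 / t100))     # ~2.06 stops less light at 100fps
```

Two stops is the jump from, say, ISO 800 to ISO 3200, which is where the extra noise comes from.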

Higher resolution also has potential for increased dynamic range. Smaller pixels may be noisier by themselves, but the collection of them as a whole tends to average out more accurately, and contains more detail.
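A toy illustration of that averaging effect (idealized independent noise; real sensors are messier):

```python
import numpy as np

rng = np.random.default_rng(0)

# A flat grey patch at 4K with simulated per-pixel sensor noise.
patch_4k = 0.5 + rng.normal(0, 0.1, size=(2160, 3840))

# View it at 1080p by averaging each 2x2 block of pixels.
patch_1080p = patch_4k.reshape(1080, 2, 1920, 2).mean(axis=(1, 3))

print(patch_4k.std())     # ~0.100: noise per 4K pixel
print(patch_1080p.std())  # ~0.050: averaging 4 pixels halves the noise
```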
 

Poetic Nova

Pulvis Et Umbra Sumus
Jan 24, 2012
1,974
0
0
As long as it doesn't look like someone smeared Vaseline all over it, I can live with lower video quality.
 

Bad Jim

New member
Nov 1, 2010
1,763
0
0
Stabinbac said:
Higher frame rates force shorter shutter speeds. That leads to higher sensitivity, which leads to more noise. You make darker scenes more difficult. You also reduce the artistic use of the motion blur you get with slower shutter speeds.

Games can give you nice simulated blur at 1000fps. Movies can't.
Artistic use of motion blur is still possible. Just shoot at low framerates. A 100fps format inherently supports 50fps or 25fps; you just repeat frames.
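A toy sketch of what I mean (hypothetical frame labels; the idea is just duplication):

```python
# Pack 25fps footage into a 100fps stream by showing each source
# frame four times; the motion cadence stays exactly 25fps.
def repeat_frames(frames, source_fps=25, target_fps=100):
    assert target_fps % source_fps == 0
    n = target_fps // source_fps
    return [f for frame in frames for f in [frame] * n]

print(repeat_frames(["A", "B", "C"]))
# ['A', 'A', 'A', 'A', 'B', 'B', 'B', 'B', 'C', 'C', 'C', 'C']
```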
 

Something Amyss

Aswyng and Amyss
Dec 3, 2008
24,759
0
0
Twinrehz said:
I think that's exactly the case. The know-it-alls of the audiophile world have clamored since the dawn of the CD that it's just plain bad. The thing is, the average guy on the street won't notice the difference, because he hasn't been introduced to the world of extremely expensive equipment where there might be any reason to clamor for higher quality. Even MP3 (and more recently, MP4) is widely used and considered "good enough", even though it's technically inferior even to CD. Personally I wouldn't have minded more widespread use of at least DVDA, since most modern equipment comes with a DVD player, but I can't say for certain that I would notice the difference.
I only own a handful of "next gen" audio discs, and I no longer have a player that plays them. In part simply because they all seemed like a marginal improvement over CD for a lot of money. I'm not saying I wouldn't like better audio, but at that price I guess I just wasn't a big enough audiophile.

As far as MP3 goes, I use MP3s primarily in already loud situations (out in public), and as such it's an easy compromise since I'm not going to hear all the nuance anyway. I won't go really low quality, but there's no need to go lossless over MP3. One thing that burns me is that there are a bunch of CDs that were obviously compressed below 128kbps at some point during production.

The same becomes apparent in the world of video. Most people have "seen" the improvement of Blu-ray over DVD, and Blu-ray has become widely available enough to present a viable reason to purchase a player that supports it. But people with a screen smaller than 60" are unlikely to notice the difference with 4K, while 60" and above is considered extravagantly large by most people, not to mention prohibitively expensive and difficult to position. The room for impressing the common man is growing small. Enthusiasts will find room in their hearts and homes for this, but for anyone with a normal income, a normal-sized house and pretty normal interests, it's more of a gimmick whose only distinguishing feature is being more expensive.
Yeah. I admit, I'd love to have a super large display for movies and gaming, but I don't have the room to justify it, and probably wouldn't be able to justify the cost. I don't even have good speakers anymore, because there's no place in my apartment to justify them. People find it weird, because I used to have the best gear, but I just don't care enough to pay the premium anymore.

There's also a huge gap between the electronics we're used to and quality merch. And I think growing accustomed to cheap CD/DVD/BD/whatever players has kind of soured us on a real investment. Though in my case, not necessarily so much. I paid a fair amount for my AV receiver and it's been a solid piece that's survived many replacements of my friends' CD/DVD/BD/stereo/HTS gear that came out of Wal-Mart and whatnot.

It's not always true that you get what you pay for, but it's true most people aren't willing to pay these days. And I don't entirely blame them.

Some might argue that you can hear/see the difference, and most people might be able to, but there comes a time when people get tired of switching out what they have because it's outdated, and when the difference is so small you might as well not bother. I was willing to shell out full price for the Game of Thrones Blu-ray box sets when they came on the market; I can't say I'm willing to shell out for them again just for the benefit of 4K resolution, which I'd have to purchase a new player and a new TV to actually enjoy.
And that's exactly why, really. My GoT videos came with both DVD and BD and you can see a marked difference, but I'm not sure my TV would be large enough for me to notice a 4K "upgrade." So buying new equipment has become an issue of diminishing returns. On a similar note, that's sort of how I feel about gaming consoles these days. The leaps and bounds used to be phenomenal, and now? Eh.

As far as I know, there aren't even TV channels in Europe airing in 1080p, because there isn't enough room in the signal spectrum used by satellites to accommodate that bandwidth, and yet they're already trying to introduce 4K to people. MAYBE the 3D channels offered on some networks; I seem to recall 3D channels needing 1080p resolution, though I could be remembering that wrong. It still doesn't change the fact that bandwidth requirements increase dramatically with higher resolutions, unless they use compression that renders the whole thing moot.
I wouldn't be surprised, though I've never looked into it. I admit, I don't even care about broadcast here, because the price kept going up on cable without offering me more stuff I actually wanted to watch. I'm mostly concerned with bandwidth as far as internet goes, what with bandwidth caps and fees and all that crap that American ISPs pull.
 

Twinrehz

New member
May 19, 2014
361
0
0
Country
Norway
Zachary Amaranth said:
There's also a huge gap between the electronics we're used to and quality merch. And I think growing accustomed to cheap CD/DVD/BD/whatever players has kind of soured us on a real investment. Though in my case, not necessarily so much. I paid a fair amount for my AV receiver and it's been a solid piece that's survived many replacements of my friends' CD/DVD/BD/stereo/HTS gear that came out of Wal-Mart and whatnot.

It's not always true that you get what you pay for, but it's true most people aren't willing to pay these days. And I don't entirely blame them.
While it's not necessarily bad at first, it's not built to last, which is common for a lot of things these days. I'm of a different mindset, though: I'd like the stuff I buy to actually last a while, because while I'm down with buying expensive stuff, I'm not down with replacing it every 2-3 years because of poor construction and cheap components. Such gear also tends to look like crap, often because it's glossy plastic that looks like absolute shit after a few months once the dust has settled. And to add insult to injury, because of the poor construction it tends to warp and bend after a while, making it look even more like garbage.

I've seen my fair share of shitty audio equipment. I once purchased some JBL computer speakers because I thought JBL was a well-renowned, good brand. They lasted less than a year before the sound started to go static. In the end I wound up with some 15+ year old computer speakers that my brother had left behind, which had already lasted that long and were still good to go. There was some slight imbalance between the left and right channels, but nothing a little fiddling in the sound options couldn't fix. So they lasted until they were replaced, and would probably still work if I had any use for them and hadn't thrown them away. I think those speakers were even unbranded, but they were quite heavy, which I've learnt is often a good sign in audio equipment.

I also inherited an ancient Akai amp from my grandmother after she passed away. It was purchased in 1979, and started to go wrong somewhere around 2007-2010. The problem could probably have been fixed; someone suggested I try replacing the capacitors, as it sounded like they had dried up. Alas, I'm not much for soldering, so it was given to a school for the students studying electronics to fiddle around with. Also it was old, and I like new stuff.

And that's exactly why, really. My GoT videos came with both DVD and BD and you can see a marked difference, but I'm not sure my TV would be large enough for me to notice a 4K "upgrade." So buying new equipment has become an issue of diminishing returns. On a similar note, that's sort of how I feel about gaming consoles these days. The leaps and bounds used to be phenomenal, and now? Eh.
The most interesting leap in my eyes was from the 5th to the 6th generation (PS1/N64 to PS2/GC/Xbox). The difference was HUGE. Of course, the current generation has crapped itself inside out because the systems aren't powerful enough. The increasing demands of graphics are a bigger drawback for consoles than ever before, and they're just not up to it. Developers are already sacrificing 1080p and 60fps to make games run on the systems, while computers have had a head start and are running circles around them. Of course, not every game needs mind-blowing graphics and a high framerate; some of my favourite games are indie games, offering something far more interesting than flashy visuals. There's a reason I think Burnout: Paradise looks all right: it doesn't NEED to be better. It CAN be better, but there's no need for it. And although PC Gamer wants me to believe that Next Car Game is so realistic it's indistinguishable from reality, it really isn't. Real cars aren't THAT glossy, especially not when they've been driven for a while.

I wouldn't be surprised, though I've never looked into it. I admit, I don't even care about broadcast here, because the price kept going up on cable without offering me more stuff I actually wanted to watch. I'm mostly concerned with bandwidth as far as internet goes, what with bandwidth caps and fees and all that crap that American ISPs pull.
I barely watch TV anymore. I live in a dorm, so cable is included in the rent for the TV in the common room, but I have no interest in it whatsoever. If I want to watch something, I use other methods, because the TV channels send garbage 95% of the time, and what might be interesting isn't aired at a time that suits me anyway. Not to mention that a lot of the stuff I find interesting isn't being aired at all. If I had to wait for some stupid network to pull their thumb out of their butt and start airing something worth watching, I would probably never see an interesting show again in my entire life, because the networks in this country are absolute bullshit, UNLESS I pay for the premium channels, which I'm not going to do anyway, because thanks, got internet.

The only time I watch TV is when I'm bored and don't know what to do, so I sit down and watch Top Gear or Storage Wars. It's dumb entertainment, but at least I can tolerate it. I won't go into how much junk is being aired on TV all the time, because then I'd spend the rest of the afternoon writing. Also, it'd probably be boring to read; it would be a long, tedious, never-ending rant on everything that's wrong with broadcasting.

Bad Jim said:
They also need higher framerates. I have a 144Hz monitor, and it's lovely to have games running at 100+fps, but no video media goes that high. It's especially annoying because 3DTVs can theoretically handle it already.
This has been discussed at length in several threads on this forum; a lot of people don't like higher frame rates in movies. It makes them seem almost unnatural, like you're watching live action on a stage. See The Hobbit in 48fps for that particular can of worms. I myself kinda liked it, but there are a lot of people who just don't.

Maybe, if sites like YouTube would ever listen to reason, we could get them to limit bitrate but let us choose how to use that bitrate. There's no reason we should be allowed to watch 1080p videos but not a 480p video at 60fps, or in 48-bit colour.
I agree with this; I wouldn't mind if YouTube allowed videos to run at an optional 60fps, if the uploader wanted it.
 

Tortilla the Hun

Decidedly on the Fence
May 7, 2011
2,244
0
0
I'm not much of a quality queen, but I must say that hi-def looks fantastic. I watch a lot of older movies, so of course they don't need the same level of video quality as movies today, but it's great to see them digitally remastered.
 

Bad Jim

New member
Nov 1, 2010
1,763
0
0
Twinrehz said:
This has been discussed at length in several threads on this forum; a lot of people don't like higher frame rates in movies. It makes them seem almost unnatural, like you're watching live action on a stage. See The Hobbit in 48fps for that particular can of worms. I myself kinda liked it, but there are a lot of people who just don't.
I'd say The Hobbit was a bad choice of movie to introduce a higher framerate. It looks unnatural because it is not natural. It is faking things all the time; for instance, it pretends that Ian McKellen (Gandalf) is much taller than Richard Armitage (Thorin), while Richard is actually taller. A high frame rate makes the trickery more obvious, and the fact that even basic conversations require special effects means you are almost always looking at something that's not quite right. Most movies have far fewer special effects, and will look natural when they aren't using any, regardless of frame rate.

Another, more important issue is that low framerates cause motion blur, which tends to remove the benefit of high resolution. I suspect that if you put 1080p @ 48fps and 4K @ 24fps side by side, the 4K would only look better when the image was nearly static, and it would still require twice the data.
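The "twice the data" part is just pixel arithmetic (raw pixel rates; real encoders complicate this, but the ratio is the point):

```python
# Raw pixels per second for the two options, before compression.
px_1080p48 = 1920 * 1080 * 48  #  99,532,800 pixels/s
px_4k24    = 3840 * 2160 * 24  # 199,065,600 pixels/s

print(px_4k24 / px_1080p48)    # 2.0 -- exactly twice the raw data
```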
 

Twinrehz

New member
May 19, 2014
361
0
0
Country
Norge
Bad Jim said:
I'd say The Hobbit was a bad choice of movie to introduce a higher framerate. It looks unnatural because it is not natural. It is faking things all the time; for instance, it pretends that Ian McKellen (Gandalf) is much taller than Richard Armitage (Thorin), while Richard is actually taller. A high frame rate makes the trickery more obvious, and the fact that even basic conversations require special effects means you are almost always looking at something that's not quite right. Most movies have far fewer special effects, and will look natural when they aren't using any, regardless of frame rate.

Another, more important issue is that low framerates cause motion blur, which tends to remove the benefit of high resolution. I suspect that if you put 1080p @ 48fps and 4K @ 24fps side by side, the 4K would only look better when the image was nearly static, and it would still require twice the data.
I'm probably remembering the phrasing wrong, so the word "unnatural" may not have been used.

I recall something about higher framerates in movies (in general) supposedly looking off, like it's live action you're seeing on a stage, and something about it not being possible to distance yourself from the movie enough to break free from reality, something that 24fps supposedly allows. Personally I've never been that bothered by motion blur. I liked the way The Hobbit looked in 48fps, and I want more movies to be like that. It doesn't seem to be the general opinion, though.