Sometimes I wonder if high bitrates are less desirable

Don’t get me wrong - if you take RAW 8K footage from an ARRI Alexa and then compress it to make it commercially usable for public consumption, the better the codec and the higher the bitrate, the better the final result. But that is a state-of-the-art, extreme high-end professional camera producing pristine, extreme-fidelity digital footage with virtually zero flaws and artefacts.

But I’ve been experimenting at the opposite end of the spectrum, with a view to finalising my settings for a large personal project: turning my large collection of not-so-wonderful commercial DVD PAL 576i sport content into something that is watchable in this day and age, when Blu-ray is the new normal and Ultra HD 4K is at the enthusiast end of town.

What I find on occasion is that because this old footage contains so many analogue flaws that no AI or processing on the planet can simply make disappear altogether (at least not without far worse side effects), choosing higher bitrates can actually be counterproductive. I find this most often with footage from particular Australian broadcasters from the 70s and 80s, who seem to have artificially sharpened their broadcast pictures so as to appear more detailed than their competitors. The “culprits” here in Australia (back in the old analogue days) were Channels 7 and 10, who loved really sharp sports broadcasts, compared to Channels 2 and 9, who did not seem to overly process their material. But that sharpening could (and often did) produce ugly effects that are now baked into the source tapes forever. It only got worse if they made a second-generation tape from the first one - then you got artefacts on top of artefacts.

Of course, back in the day when we knew no better, we probably marvelled at “how sharp” those pictures were, but they were over-processed in order to stand out. Perhaps the visual equivalent of the music “loudness wars”.

So when it comes to bitrates: the higher the bitrate (assuming you are using high bitrates to begin with), the more of what is there you capture - good AND bad - with higher fidelity. If that includes an ugly flaw baked into the original analogue tape, such as overly sharpened, aliased edges, a higher bitrate can actually “magnify” the flaw, whereas a slightly lower bitrate can smooth it over a bit.

In my testing so far, I am finding that some of my commercial DVD sports footage from the mid 1980s comes out better in the end (upscaled to Full HD in Topaz) using H.265 at 16 megabits per second rather than anything higher. Yes, the higher bitrates ARE more faithful, but the 16-megabit setting seems overall a slightly better compromise: the outright losses are at least equally, if not more than, offset by the smoothing over of some of the more overt flaws.
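For anyone wanting to run a similar A/B comparison outside of Topaz, here is a rough sketch using ffmpeg (assuming a build with libx265; the filenames, and the maxrate/bufsize figures, are just placeholders to experiment from, not my actual settings):

```shell
# Constrained ~16 Mbit/s H.265 encode of the upscaled master.
# A single pass is enough for a quick subjective comparison.
ffmpeg -i upscaled_master.mov \
       -c:v libx265 -b:v 16M -maxrate 20M -bufsize 32M \
       -c:a copy \
       output_16mbit.mkv

# A higher-bitrate version of the same master, to compare side by side.
ffmpeg -i upscaled_master.mov \
       -c:v libx265 -b:v 40M -maxrate 50M -bufsize 80M \
       -c:a copy \
       output_40mbit.mkv
```

Play the two results back to back on the over-sharpened material and judge which artefacts bother you more.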

I also see this effect in some old black-and-white films from the late 40s that I have. Again, a slightly lower output bitrate can sometimes be more pleasant to watch even though it is technically less accurate. It can slightly smooth over things such as excess film grain and harsh edge contrast that are difficult to mitigate through other processing (without causing worse side effects than the problems you are trying to get rid of).

I have a DVD set of Black Sheep Squadron (a TV series from the 1970s), and some of the episodes on that set are so massively over-sharpened that a “smoother” codec like H.264 is actually the better subjective choice in my opinion!

Instead of applying this at the final output stage, what if you transcoded the source to a lower bitrate before TVAI? The logic is that, in a similar way, it would get rid of the artifacts TVAI doesn’t handle well and replace them with compression artifacts that TVAI is good at correcting.

I might not have worded that well. The background of this idea is similar to your story. I have some family videos that were recorded on VHS and then digitized to DVD. When I run them through TVAI, most of the models don’t change much about the image, but they do make the file several times bigger. I ended up turning up the final compression to make the files smaller - and I felt like I wasn’t losing anything meaningful. If anything, it actually made the images look better, because it lost more of those inaccurate fine details that TVAI was adding. I decided that was a pointless circle, though: the reason for using TVAI on them was to make them look better, and I ended up making them look different, but not even subjectively better.

Now that I’ve said all that, I should admit that I have not actually tried my idea of lowering the quality before TVAI, so it may end up being more of the same.
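If anyone else wants to try the pre-degrade idea, a minimal ffmpeg sketch might look like this (assuming libx264; the 1500k bitrate and the filenames are purely hypothetical starting points, not tested values):

```shell
# Deliberately re-encode the DVD rip at a lower bitrate so some of the
# baked-in analogue flaws get replaced by ordinary compression
# artifacts, which the TVAI models are better at cleaning up.
ffmpeg -i dvd_rip.mkv \
       -c:v libx264 -b:v 1500k \
       -c:a copy \
       pre_degraded.mkv

# Then feed pre_degraded.mkv into TVAI and compare the result against
# running TVAI on the original rip.
```

Whether the trade works at all would need exactly the kind of side-by-side testing described above.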
