I increase my movies to 100 fps using Apollo. Sensational results.
But I’m having trouble controlling file size. I’ve edited my videoencoders.json to use a CQ value of 21 (between the medium and high Topaz presets).
I’m using H.265 Main 8-bit in Topaz on an H.264 source.
I also edited my AQ strength to 8 (Topaz has it at 15; the range is 0–15). Before I did this, the swings in file size were even larger, and it didn’t seem to affect video quality. Is this a bad idea or is it ok?
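For what it’s worth, the two edits being discussed would sit in an encoder entry roughly like this. This is only a sketch: the structure and field names below are assumptions, not the actual videoencoders.json schema, so match them against the real entries in your install (and keep a backup) before changing anything. The `-cq` and `-aq-strength` flags themselves are real NVENC encoder options in ffmpeg.

```json
{
  "note": "hypothetical sketch only; real videoencoders.json entries are structured differently",
  "encoder": "hevc_nvenc",
  "options": ["-cq", "21", "-aq-strength", "8"]
}
```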
Right now, if I take a 6 GB H.264 movie, it will end up anywhere from 8.5 GB to nearly 30 GB in H.265 depending on the movie. So strange!
I even had one 11 GB file go to 42 GB just using Apollo.
What’s causing this? After using Topaz, if I put the file back into HandBrake and use the SAME CQ (21), the 42 GB file dropped down to 26 GB!!
I like using FastFlix for post-processing to AV1. It does a very good job compressing H.264 videos smaller than what you would get from H.265.
One thing I like about Topaz is that it doesn’t inject a ton of extra keyframes when increasing the frame rate. When I was converting 4K movies to AV1, some uploaders on torrents used some program for their upscales that caused FastFlix output to blow up in file size. I’ve never had that issue with any video processed through Topaz.
Increasing the frame rate will increase the file size. Tweaking some of the settings in the JSON can help if you’re comfortable doing that, but it’s generally not recommended for most users, as it can cause other issues.
The most common workflow is to export it and then take it to HandBrake or another app to compress it back down if needed.
Is it not safe to simply change the CQ to 21 and the AQ strength to 8? Will that cause any issues for me? I’m just using Apollo and I don’t want to screw anything up, because I’m going to convert my entire movie collection to 100 fps with it.
I know increasing the frame rate will increase the size; it’s just that the amount is unpredictable for each movie. It’s fine, I guess I’ll just use HandBrake for the ones that get too big. The others I can leave alone. Just trying to save time.
But am I ok with my edits to the JSON file? I found those two settings work best for me overall.
It’s quite simple: at the same encoder quality settings, doubling the fps roughly doubles the file size, and so on. Adding grain increases file size further; it varies depending on codec, encoding settings, and grain structure, but I would estimate an increase of around +50% when adding grain.
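As a back-of-the-envelope check of the rule of thumb above (size scales roughly linearly with fps at the same quality, plus ~50% for added grain), here is a tiny sketch. The function name and the linear model are my own simplification, not how any encoder actually budgets bits:

```python
# Rough size estimate: same encoder quality -> size scales ~linearly
# with fps, and added grain costs roughly +50% (both are estimates).

def estimate_size_gb(source_gb: float, source_fps: float,
                     target_fps: float, adds_grain: bool = False) -> float:
    size = source_gb * (target_fps / source_fps)
    if adds_grain:
        size *= 1.5  # the ~+50% grain estimate from the post above
    return round(size, 1)

print(estimate_size_gb(6.0, 25, 100))                   # 4x fps -> 24.0
print(estimate_size_gb(6.0, 25, 100, adds_grain=True))  # -> 36.0
```

The swings reported in this thread (8.5 GB to 30 GB from the same 6 GB source) show how content-dependent the real number is, so treat this as a rough guide, not a prediction.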
Personally, I would not convert films to 100 fps. I recommend doubling the frame rate to 50/60 fps, using exactly a 2x interpolation factor and not something odd. And when deinterlacing with a bob flag brings a benefit (depends on the footage), I don’t do an additional fps increase via interpolation.
The problem is keeping grain: H.265 is worse here, and H.264 is a little better but still bad. AV1 is the best in terms of quality-to-file-size ratio while retaining all or most of the grain.
There are different AV1 encoders out there. Software (CPU) encoding is always better than GPU. By far the best AV1 results come from the “AOM AV1” (AOMedia Video 1) encoder, implemented in FastFlix, for example. Beware: the encoding time is a nightmare, but the output quality relative to the small file size is outstanding! This kind of codec is the future.
I don’t like grain. The slight denoising aspect is actually great, and it doesn’t even turn people plastic like some heavier denoising settings do. I don’t have to run any Iris or denoising filters on it; Apollo motion interpolation denoises slightly on its own anyway. And I’m not compressing movies hard enough to really get rid of all the grain anyway.
You’re absolutely 100 percent right about even 2x interpolation when using motion interpolation. Odd multipliers give all kinds of stutters/jitters.
I use Shutter Encoder to convert a 23.976 fps movie to 25 fps. This simply speeds it up without adding frames, and the speed difference is not noticeable. Taking it to 30 fps would be horrible and noticeable, but 25 is fine. I then convert 25 fps to 100 fps with Topaz Apollo, which is a clean, even 4x interpolation.
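The arithmetic behind that workflow, sketched below (plain math, nothing tool-specific; the variable names are mine):

```python
# 23.976 fps -> 25 fps by speeding playback up (no frames added),
# then a clean 4x interpolation from 25 fps to 100 fps.

source_fps = 23.976
sped_up_fps = 25.0

speedup = sped_up_fps / source_fps   # playback speed factor
runtime = 120 / speedup              # what a 120-minute film becomes

print(f"speed-up: {speedup:.4f}x ({(speedup - 1) * 100:.2f}% faster)")
print(f"a 2-hour film runs {runtime:.1f} minutes after the speed-up")
print(f"interpolation factor to 100 fps: {100 / sped_up_fps:.0f}x")
```

100 / 25 divides cleanly, which is the whole point of the detour through 25 fps.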
100 fps is way better for action scenes than 50 fps, even though 50/60 fps is still good. 50 fps gets rid of all the stutter/jitters during camera pans, but I get this unbelievable next level of clarity at 100 fps during intense action scenes like Spider-Man. It’s really next level. Mind blowing. Also, Apollo takes just as long to convert a movie to 50 fps as it does to 100 fps, so it’s a no-brainer for me.
I then set my TV to 100 Hz with the NVIDIA Control Panel in custom resolutions. You have to frame-match the screen or it’s not smooth.
It shouldn’t mess anything up in most cases. The quality you desire and the look are subjective to what you want to watch and how you like to view your videos. That’s for each user to decide: some want no noise at all, while others prefer some grain and noise to be retained.
For me it depends on the film, movie, or piece of content I’m watching. If the original is a film that was shot on film at 24 fps, then I’m not changing it or cleaning up the grain, as that was how the original filmmaker and director of photography wanted it, and it can be part of the story. Newer content is different, though, as some of it is just badly shot or had a LUT dropped on it that makes it hard to view without changing screen settings.
You can edit your JSON all you wish; my recommendation would be to duplicate it first and keep the original stored elsewhere in case something does break, so you can restore it and not be completely lost.
Yes, more fps is better for motion, but TVs’ frame interpolation gets better and better. I don’t want to blow up my files on the NAS with 100 fps or 120 fps (120 fps is what you’d use when the source was NTSC, the odd-numbers thing we talked about). To lower the file size you’d have to lower the encoding quality, and I don’t want that, so I go with the original fps, a bob deinterlace, or a frame-interpolation fps duplication. But if it suits you, everything is fine.
Yeah, I have two 24 TB HDDs and a 16 TB on top of that, and obviously I can only do 2 movies a day.
The only thing that worries me about 100 fps is that I have to be able to switch the TV to 100 Hz. I can easily do that with the NVIDIA Control Panel right now, but if that ever breaks, or a new TV doesn’t support it, I don’t want to have all these movies I can’t watch smoothly. 100 Hz isn’t a standard in the U.S. like 120 Hz is.
I’m wondering: will a 50 Hz TV signal still be smooth with 100 fps content? Obviously it won’t look like 100 fps, but will it be as good as native 50 fps? I’m assuming the TV will just show every other frame, evenly, and it will be fine.
I’m assuming since they divide evenly it should be fine, right?
A 100 fps movie at 120 Hz looks horrible at times due to judder/stutter. You need the frames to divide evenly.
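The divide-evenly rule from this thread can be sketched as a small check. This is my own simplified model (real TVs add their own processing), but it matches the cases discussed here:

```python
def cadence(fps: int, refresh_hz: int) -> str:
    """Describe how a source frame rate maps onto a display refresh rate."""
    if refresh_hz % fps == 0:
        # each frame is held for the same whole number of refresh cycles
        return f"each frame shown {refresh_hz // fps}x -> smooth"
    if fps % refresh_hz == 0:
        # the display can drop frames uniformly instead
        return f"1 of every {fps // refresh_hz} frames kept -> looks like {refresh_hz} fps"
    return "no even mapping -> uneven frame timing (judder)"

for fps, hz in [(100, 100), (100, 120), (100, 50), (24, 120), (24, 60)]:
    print(f"{fps:>3} fps on {hz:>3} Hz: {cadence(fps, hz)}")
```

By this model, 100 fps on 50 Hz degrades cleanly to 50 fps, while 100 fps on 120 Hz has no even mapping at all, which is exactly the judder described above.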
Yes, that’s a good question that no one has been able to answer for me yet. Even a TV expert with a YouTube channel struggles to explain jittering, input Hz, and fps. Since the fps doesn’t have to match the refresh rate, what actually happens here, and what’s best, is my question.
Maybe that’s the case: if you have 50 Hz but the video is 100 fps, it runs smoothly because it divides evenly by two. But as you already wrote, the specifications certainly play a role too. 120 Hz is what TVs should be capable of.
There are certainly various factors that lead to jittering and generally jerky motion. OLED pixels have extremely short response times and don’t glow after being switched off, yet I never had any motion issues on my 15-year-old slow LCD Panasonic TV.
The fact that movies are still shot at 24 fps blows my mind when TVs are 60 Hz and 24 doesn’t divide evenly into it. Complete idiocy by the movie industry. All this 3:2 pulldown crap and contrivances when it’s totally unnecessary. Movies should just go to 60 fps, but even going to 30 fps would help greatly.
At 120 Hz, the amount of jitter at 100 fps is horrible because the fps and the Hz of the TV don’t align and don’t divide evenly. The panel refreshes 120 times a second while trying to display 100 fps. A mess.
I tried 100fps on a 50hz setting on my tv and it looks WAY better than 120hz, but definitely not perfect.
Setting the tv to 100hz for 100fps is perfect. Hopefully nothing happens to prevent that in the future.
Also, even in slower content that extra fps of interpolating in topaz apollo to 100fps instead of 50fps really helps the motion. It’s basically a “cushion” or a “reserve” if the motion interpolation algorithm makes any small mistakes. Just something I noticed as well.
I don’t think 100 fps on a 50 Hz TV would be smooth; it might be at times, but ultimately it’s going to be dropping a frame for each refresh. I haven’t personally tested this, though; I might try it in the future.
The film industry is stubborn and will not change just because TVs and screens have gotten better. Some are purists and won’t change for any reason; others will adjust depending on the project and camera used, but they’ll go with 30 fps for standard broadcast, possibly 60 fps in some cases. Outside of that, it’s unlikely a production will export its deliverables at anything else.
Ok, thanks. Yeah, obviously frame-matching the fps to the TV is the ideal. I’ve always got Custom Resolution Utility if the NVIDIA Control Panel ever stops working in the future. If a TV can do 120 Hz or higher, then obviously it should be able to do 100 Hz. Right now I have no issue setting my TV to 100 Hz.
Yeah, the film industry refuses to change at all. Stuck in the past. You might as well buy a CRT TV to watch their content. It’s kind of ridiculous at this point. They’re still using anamorphic, non-square pixel aspect ratios on some of their 1080p deliverables to streaming services, and filming at 23.976 fps (a former technical limitation). And some are still delivering 23.976 content at 29.97 fps via 3:2 pulldown. It’s just nuts all around.
I used Topaz Apollo on 1080p Top Gun: Maverick and F1 (the Brad Pitt movie), and at 100 fps those movies look completely insane and awesome. And that’s just interpolation, not native fps, which would look even better. There’s a reason sports are broadcast at 60 fps instead of 24 fps.
ChatGPT says: if the source video is 24 fps, the best refresh rate for a TV is usually 24 Hz or a multiple like 48 Hz / 120 Hz, rather than 60 Hz; 50 Hz is also not optimal for 24 fps sources.
24 fps → 24 Hz (Ideal Match)
Each frame is displayed exactly once per refresh cycle.
No frame duplication patterns are needed.
Motion looks smooth and cinematic, exactly as intended.
Frame 1 → refresh 1
Frame 2 → refresh 2
Frame 3 → refresh 3
No uneven timing, no motion judder.
24 fps → 60 Hz (Common but Imperfect)
Because 24 doesn’t divide evenly into 60, TVs use a technique called 3:2 pulldown.
Frame A → 3 refreshes
Frame B → 2 refreshes
Frame C → 3 refreshes
Frame D → 2 refreshes
This creates:
Uneven frame timing. Subtle stutter/judder in panning shots. You’ll often notice this during slow camera pans.
Even Better: 24 fps → 120Hz
Many modern TVs run at 120Hz
24 fits perfectly:
120 / 24 = 5
Each frame is shown 5 times evenly.
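The cadences in the quote above can be reproduced with a few lines. The function is a sketch of ideal pulldown, ignoring whatever extra processing a real TV does:

```python
def pulldown_pattern(fps: int = 24, refresh_hz: int = 60, n_frames: int = 4):
    """How many refresh cycles each of the first n_frames is held for."""
    base = refresh_hz // fps             # every frame gets at least this many
    extra = refresh_hz - base * fps      # refreshes left over per second
    if extra == 0:                       # even division, e.g. 24 fps at 120 Hz
        return [base] * n_frames
    step = fps // extra                  # one extra refresh every `step` frames
    return [base + (1 if i % step == 0 else 0) for i in range(n_frames)]

print(pulldown_pattern(24, 60))    # the 3:2 cadence -> [3, 2, 3, 2]
print(pulldown_pattern(24, 120))   # divides evenly  -> [5, 5, 5, 5]
```

The 24-fps-at-60-Hz case gives exactly the A→3, B→2, C→3, D→2 pattern quoted above, and the alternating hold times are the uneven frame timing you notice in slow pans.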
There’s nothing smooth about 24 fps at any refresh rate. It’s horrible looking. So ChatGPT has no idea what it’s talking about; it’s just pulling stuff off cinema-purist Reddit posts. lol
Being at a refresh rate that doesn’t divide evenly by the source just means uneven frame timing, so it makes things even worse, although 24 fps is so bad to begin with anyway.
The vast majority of people only have 60 Hz displays. So it’s more stupidity from the film industry to shoot at 24 fps instead of 30 fps, which at least would divide evenly.
Basically, ideally you want the refresh rate to match the source fps. So if you’ve interpolated something to 50 fps, you want 50 Hz. A rate that doesn’t divide evenly, like 60 Hz, will be terrible, while an even multiple like 100 Hz will be better, though obviously still not as good as an exact match.
I have a 120hz tv. You get a ton of stutter/judder at 24fps. Just turn on motion smoothing on your tv and compare. 24 fps is not enough frames to deliver smooth motion. You can’t even do a simple pan. If it was smooth, then interpolation wouldn’t improve smoothness.
If 24fps was smooth, you could simply game and frame lock at 24fps and set your tv to 120hz. Let me know how that looks. LOL There’s a reason 60fps gaming looks way smoother than 24fps.
If 24fps was smooth, then sports would be filmed at 24fps. It would save a ton of bandwidth. It would look horrible.
It’s simply because cinema purists are demanding the judder/stutter for 24fps. Simple as that. It’s not smooth and they like it not smooth. It’s “cinematic” aka A ton of stutter.
But do you feed your TV with 120Hz when you play a 24fps movie?
So 24 fps is a common TV standard; for 3840 × 2160 (4K UHD), all supported frame rates are:
23.976fps (most common for films), 24fps, 25fps, 29.97fps, 30fps, 50fps, 59.94fps and 60fps