Shot/scene backgrounds can be expanded somewhat that way with real footage from nearby frames if there is camera or zoom movement (BorisFX Mocha Pro, for example). There tend to still be some missing bits, but there would be much less work for the AI to do because it would have more than the original 4:3 frames to work with: the totality of the background across the entire shot.
Trouble is that those pesky directors tend to cut every 4 or 5 seconds in some genres, and not a lot longer in most others, so your AI might have to work on most or all of hundreds or thousands of short clips and try to find consistency between clips that often isn't there. Heck, AI is already inconsistent at duplication even when everything 'real' is identical. It would do the invented bits of the same backgrounds differently in different clips, and we'd notice.
But it should be possible to fill the missing parts of an expanded (4:3 to 16:9, etc.) static background in, say, a 5-second clip, probably at lowish resolution, I guess. Making it look the same in other shots, or handling moving objects including people, I suspect won't happen, not in the foreseeable future anyway.
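For anyone curious what that looks like in practice, here is a rough sketch of the idea in Python with OpenCV: estimate how the camera moved between frames and paste each frame onto a wider canvas, so background that drifts in and out of the 4:3 frame ends up filling the 16:9 sides. The filename, padding, and matching settings are placeholders, and a real tool like Mocha Pro does this with proper planar tracking rather than this naive homography chain.

```python
# Rough sketch: accumulate a wider background plate from a panning 4:3 shot.
# Assumes a short clip with a mostly static background; "shot.mp4" and the
# canvas padding are placeholders.
import cv2
import numpy as np

cap = cv2.VideoCapture("shot.mp4")
ok, first = cap.read()
h, w = first.shape[:2]
pad = w // 4                                     # extra width on each side
canvas = np.zeros((h, w + 2 * pad, 3), np.uint8)
filled = np.zeros(canvas.shape[:2], bool)

H_total = np.array([[1, 0, pad], [0, 1, 0], [0, 0, 1]], np.float64)  # frame -> canvas
orb = cv2.ORB_create(2000)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

def paste(frame, H):
    """Warp a frame onto the canvas, writing only pixels not yet filled."""
    global filled
    size = (canvas.shape[1], canvas.shape[0])
    warped = cv2.warpPerspective(frame, H, size)
    mask = cv2.warpPerspective(np.ones((h, w), np.uint8), H, size) > 0
    new = mask & ~filled
    canvas[new] = warped[new]
    filled |= mask

paste(first, H_total)
prev = first
while True:
    ok, frame = cap.read()
    if not ok:
        break
    k1, d1 = orb.detectAndCompute(cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY), None)
    k2, d2 = orb.detectAndCompute(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY), None)
    if d1 is None or d2 is None:
        break
    matches = matcher.match(d2, d1)              # current frame vs previous frame
    if len(matches) < 10:
        break
    src = np.float32([k2[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([k1[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)  # current -> previous
    if H is None:
        break
    H_total = H_total @ H                        # chain current -> ... -> canvas
    paste(frame, H_total)
    prev = frame

cv2.imwrite("background_plate.png", canvas)
```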
I don't think you're likely to see much effort toward that goal, since original aspect ratio and black bars became the SOP over stretched widescreen way back in the early days of DVDs.
Maybe it'll come one day, maybe in 10 or 20 years. But to do it right, you'd need a dedicated model built for the 4:3 Star Trek series so that it can analyze every sequence in every episode. That way it could better determine the fill areas for 16:9 based, as you say, on camera movements and the neighboring images, and keep the whole thing coherent.
I have been struggling with this problem for a long time. I explained all my latest findings in the last post of the thread I opened some time ago about it:
That's right!
That's what happens when Aion's memory requirements are so steep that I had to run it on the CPU.
The first run did not produce a file for a reason I don't know, so after updating my OS and the program I am checking every 12 hours that the output file is still growing.
P.S. I wish partial files were playable. I'm pretty sure mpv and VLC have the functionality, but it doesn't work here.
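In the meantime, a throwaway script can do the every-12-hours check by polling the file size instead (the path and interval are placeholders):

```python
# Rough sketch: poll the output file and report whether it is still growing.
# "output.mkv" and the one-hour interval are placeholders.
import os
import time

path = "output.mkv"
last = -1
while True:
    size = os.path.getsize(path) if os.path.exists(path) else 0
    status = "growing" if size > last else "NOT growing"
    print(f"{time.strftime('%Y-%m-%d %H:%M:%S')}  {size / 1e9:.2f} GB  {status}")
    last = size
    time.sleep(3600)
```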
Even when I don't use mark-in and mark-out, Aion fails in the same way on my RTX 3070 8 GB.
I'm trying to take a 720p 30 fps file to 1080p 60 fps with Proteus and Aion, so nothing particularly demanding, I wouldn't think.
Given that Aion's QUALITY is VRAM-dependent according to Topaz, I probably won't get great results with it anyway.
I just wanted to experiment with it, I guess, because Apollo still leaves something to be desired: while it's supposedly AI-powered, I would say its quality is merely on par with the non-AI Adobe Optical Flow interpolation that was introduced five years ago. Well, maybe it's slightly better than Optical Flow, but not by a lot.
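For a free reference point of exactly that non-AI, motion-compensated kind, ffmpeg's minterpolate filter can generate a baseline to compare Apollo against. A minimal sketch, assuming ffmpeg is installed and with placeholder filenames:

```python
# Sketch: produce a non-AI, motion-compensated 60 fps baseline with ffmpeg's
# minterpolate filter for comparison against Apollo. Filenames are placeholders.
import subprocess

subprocess.run([
    "ffmpeg", "-i", "input_30fps.mp4",
    "-vf", "minterpolate=fps=60:mi_mode=mci",   # mci = motion-compensated interpolation
    "-c:a", "copy",
    "baseline_60fps.mp4",
], check=True)
```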
Here are a few clips I did as a test. You can try the supplied original clip to see how Adobe Optical Flow compares. If they're just using Nvidia Optical Flow, then all of the interpolation models in TVAI are easily better. (You might have some trouble, since it seems Adobe is loath to admit that Matroska is a competent digital video container worth supporting.)
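If the MKVs won't import on the Adobe side, a container-only remux into MP4 usually gets around it, assuming the streams inside are codecs Adobe handles; nothing is re-encoded, and the filename is a placeholder:

```python
# Sketch: losslessly remux a Matroska file into MP4 so Adobe apps will open it.
# Only the container changes; the filename is a placeholder.
import subprocess

subprocess.run(["ffmpeg", "-i", "clip.mkv", "-c", "copy", "clip.mp4"], check=True)
```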
I guess my opinion is based mostly on the anecdotal evidence that I still frequently see obvious artifacting with Apollo at 2x framerate when someone waves an arm across the frame, the same way I do with Optical Flow, even if it's a little less obvious with Apollo. In both cases the artifacted footage is still unusable and has to be edited around, or that segment of the video has to be replaced with non-interpolated footage.
I've heard that said too, and I just cannot fathom why. I paid an arm and a leg to upgrade from a 4:2:2 to a 4:4:4 recorder for my camera, and now I have to buy a Mac too?
Could we get an official statement on this, @tony.topazlabs?
It feels like I'm missing something when I have to sacrifice half of my color resolution in such advanced software.
I was considering purchasing an upgrade, but it won't make sense until this issue gets resolved.
Could I please get an update/confirmation that this is being looked into?
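For what it's worth, you can at least verify what an export actually contains with ffprobe: yuv420p means chroma at a quarter of the luma resolution, yuv422p at half, yuv444p full. A quick sketch with a placeholder filename:

```python
# Sketch: report the pixel format (chroma subsampling) and bit depth of an export.
# "export.mov" is a placeholder; requires ffprobe on the PATH.
import subprocess

out = subprocess.run(
    ["ffprobe", "-v", "error", "-select_streams", "v:0",
     "-show_entries", "stream=pix_fmt,bits_per_raw_sample",
     "-of", "default=noprint_wrappers=1", "export.mov"],
    capture_output=True, text=True, check=True,
)
print(out.stdout)   # e.g. "pix_fmt=yuv422p10le" = 4:2:2 chroma, 10-bit
```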
9 days? But that's madness. You're going to burn up your CPU running your PC 24/7 for 9 days, especially for rendering. It's not good for the machine. And hello, energy consumption.
Hmmm, it's only bad for the machine when the cooling is inadequate.
Normally a PC should be able to run at maximum load, especially when overclocking.
You only burn up your hardware if it's not well cooled... and by the way, the CPU will throttle its own speed when it gets hot.
A cooling system that isn't designed for sustained full load shouldn't be used in a PC for long periods in the first place...
Worry not, my PC is well cooled and power limited. All consumer CPUs are legally required to be capable of sustained 24/7 operation, at least within their advertised clock speeds and temperatures. I can't overclock anymore anyway, since recent updates overwhelmed the power controller.
I am not able to change the input framerate of DPX footage. It always defaults to 25 fps, and when I click Edit and choose 23.976, nothing happens. The output file is also at 25 fps, which obviously messes up the speed of the footage, since it was recorded at 23.976. Is there a solution to this problem?
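A possible stopgap outside the app, assuming the DPX frames are a numbered image sequence (the %07d pattern and filenames below are placeholders), is to let ffmpeg stamp the correct rate while reading the sequence and feed TVAI an intermediate that already reports 23.976:

```python
# Sketch: wrap a DPX sequence at 23.976 fps into a ProRes intermediate so the
# input rate is already correct before it reaches TVAI. Pattern/names are placeholders.
import subprocess

subprocess.run([
    "ffmpeg",
    "-framerate", "24000/1001",               # 23.976, set at the demuxer
    "-i", "scan_%07d.dpx",
    "-c:v", "prores_ks", "-profile:v", "3",   # ProRes 422 HQ
    "-pix_fmt", "yuv422p10le",
    "intermediate.mov",
], check=True)
```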