Topaz Video 1.2.0

I’m still only getting H.264 exports when I have ProRes 422 HQ selected with Starlight Sharp on a fully loaded MacBook Pro M3 Max. Why is this?

Any news on RDNA 4 GPUs being able to use Starlight Mini?

It’s getting ridiculous how long it has taken to actually implement this feature for AMD cards. We are talking a good nine months or so, and still counting…

Isn’t Starlight Sharp cloud-only on Mac? That might explain the H.264 output, which I think is the only available option for cloud exports. But I may be mistaken.

Andy

1 Like

Understood. Right now the generative models used in Astra are too large for local processing, so they have to be run via the cloud.

Nyx High Fidelity is the denoise tool that is available locally within the app. Are you seeing issues where you cannot apply it?

Have you written in to the support team for assistance? Users who had trouble downloading the model were seeing this in 7.1.5, but after rolling back to 7.1.0 they were able to download it for use. Feel free to write in to help@topazlabs.com with the logs and we can investigate.

That may be correct; I need to double-check and confirm. The oddest thing, though: I was working with a 30-second spot and somehow (I think it was a cloud export) it only took 2 minutes. I have no idea how that happened.

Hi, when will there be a new update or beta? There hasn’t been anything since the January version.

5 Likes

OK, interesting. Yes, I can’t find it; maybe I have to reinstall the program then.

The devs are working on one now that we are hoping to release soon. They have been trying to address many bugs and issues in this next patch to clear some of the backlog that has been negatively affecting users.

4 Likes

Try a reinstall, and if that does not work, send a screenshot and the logs to help@topazlabs.com and we can dig deeper to see what is going on.

Has anyone had an issue with some of their outputs having partially corrupted frames like below?

Some of my outputs have red blotches (left example vs. the right, which is a few frames apart) at random points throughout, for a second or two, on both my Nvidia rig and my AMD rig. The same models didn’t have this issue with 5.3.6 on either card, but since version 1.0.4 (I rarely used any version other than 5.3.6 until now, so I’m unsure exactly which version it started at) and now 1.2.0 (same models but with new grain), I occasionally get these corrupted frames. Reprocessing the same file either produces a clean output or puts the corruption in a different place.

Bonus points if anyone has a workflow to replace certain frames in one clip with frames from another without re-encoding, which would help avoid redoing the entire TVAI export.

That’s impossible, but I have my script set up to use TIFF images as the intermediate format. With that, I sometimes modify frames before running the next AI pass or the final encoding.
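For anyone curious, here is a minimal sketch of that kind of TIFF-intermediate pipeline. It only builds the ffmpeg command lines rather than running them, and the file names, frame rate, and `prores_ks` profile choice are my own assumptions, not the poster’s actual script:

```python
# Sketch of a TIFF-intermediate workflow (hypothetical paths; assumes
# ffmpeg is on PATH). Frames are dumped as numbered TIFFs so individual
# frames can be edited or swapped before the final encode.
from pathlib import Path

def extract_cmd(src: str, out_dir: str) -> list[str]:
    """ffmpeg command to dump every frame of `src` as a numbered TIFF."""
    return ["ffmpeg", "-i", src, str(Path(out_dir) / "frame_%06d.tiff")]

def encode_cmd(frames_dir: str, fps: float, dst: str) -> list[str]:
    """ffmpeg command to rebuild a video from the (possibly edited) TIFFs."""
    return ["ffmpeg", "-framerate", str(fps),
            "-i", str(Path(frames_dir) / "frame_%06d.tiff"),
            "-c:v", "prores_ks", "-profile:v", "3",  # profile 3 = ProRes 422 HQ
            dst]

print(" ".join(extract_cmd("export.mov", "frames")))
print(" ".join(encode_cmd("frames", 23.976, "final.mov")))
```

In practice you would pass these lists to `subprocess.run` between editing passes.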

Oh sweet, yes, TIFF images may do the trick for future exports, thanks.

Regarding swapping out specific frames or portions of a video without re-encoding the entire file being impossible… I’d be surprised if that were the case.

Using chunked encoding as a rough example: you can encode portions of a clip separately, often with different CRF values (as Av1an does), and then mux the encoded parts back together into a single video file at the end.

Taking that process into account, it should be feasible to split the exported file into clean parts and corrupt parts, note the frame range(s) of the corrupt parts, re-export those specific range(s) from source in TVAI, and then mux the clean parts from the original export with the replacement parts.
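A rough sketch of that splitting idea, assuming you already know the corrupt frame ranges (all file names and numbers below are hypothetical). It plans the clean/corrupt spans and writes an ffmpeg concat-demuxer list that interleaves the clean pieces of the original export with the re-exported replacements:

```python
# Hypothetical plan for the split / re-export / mux workflow described above.
def plan_segments(total_frames, corrupt_ranges):
    """Return [(start, end, is_corrupt)] spans covering 0..total_frames-1."""
    spans, cursor = [], 0
    for start, end in sorted(corrupt_ranges):
        if cursor < start:
            spans.append((cursor, start - 1, False))   # clean span before the bad one
        spans.append((start, end, True))               # corrupt span to re-export
        cursor = end + 1
    if cursor < total_frames:
        spans.append((cursor, total_frames - 1, False))
    return spans

def concat_list(spans):
    """Concat-demuxer entries: clean pieces vs. re-exported replacements."""
    lines = []
    for i, (start, end, corrupt) in enumerate(spans):
        name = f"redo_{start}_{end}.mov" if corrupt else f"clean_{i}.mov"
        lines.append(f"file '{name}'")
    return "\n".join(lines)

spans = plan_segments(720, [(100, 130), (400, 415)])
print(concat_list(spans))
# Final mux without re-encoding, e.g.:
#   ffmpeg -f concat -safe 0 -i list.txt -c copy rejoined.mov
```

The catch, as noted below, is that lossless stream-copy cuts only work cleanly on keyframe boundaries.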

I suppose you might be able to get that to work if you cut on keyframes. I use ffmpeg commands for all my video editing; it would probably be a lot easier with a more conventional video editor.
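Since a stream-copy cut can only start cleanly on a keyframe, a small helper that snaps a corrupt range’s in-point back to the previous keyframe might look like this (the keyframe list here is assumed; in practice you would pull it from ffprobe’s `-show_frames` output):

```python
# Snap a cut's in-point to the nearest keyframe at or before it, since
# stream copy (-c copy) cannot begin mid-GOP. Keyframe numbers are assumed.
import bisect

def snap_range(start, end, keyframes):
    """Widen [start, end] so the cut begins on a keyframe at or before
    `start`. The out-point is left as-is for simplicity."""
    i = bisect.bisect_right(keyframes, start) - 1
    return (keyframes[max(i, 0)], end)

keyframes = [0, 48, 96, 144, 192]        # assumed GOP size of 48 frames
print(snap_range(100, 130, keyframes))   # widens the in-point to frame 96
```

The widened range means a few extra clean frames get re-exported along with the corrupt ones, which is harmless.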

Not sure how many of you guys also own Topaz Photo, but there have been very interesting developments with the latest version that may have implications for Topaz Video.

From the most recent Topaz Photo update:

“This release brings a major milestone for Wonder 2, enabling this new model to render locally. We first introduced Wonder 2 as a cloud-only model, but starting today you can run it entirely on your own device for greater privacy and offline workflows. To make this possible, we’re introducing NeuroServer, a new technology we’ve built to enable larger, more powerful models like Wonder 2 to run locally with Topaz Photo. NeuroServer opens the door for us to bring even more capable models to your desktop in the future.”

“With the introduction of NeuroStream and NeuroServer, we’ve built the infrastructure to bring Wonder 2 to your local machine. These technologies allow larger, more demanding models to run directly on your device — and Wonder 2 is the first to take advantage of them.”

A user asked:
“NeuroStream sounds great. Could that be something for Topaz Video as well?”

Lingyu (Topaz) responded:
“This technology should find its way into our other applications in the near future!”

I don’t fully understand the Topaz Video tech. If Topaz released more powerful video models that until now could not run on local machines, could that lead to speed and/or quality improvements for Starlight Mini, or a model even better than Mini? To me, this new technology breakthrough sounds extremely exciting.

1 Like

That is pretty exciting. Maybe I’m reading too much into it, but I can see this being a reason for them to move away from ffmpeg. It still doesn’t make sense not to make it CLI-first and GUI-second, though.

How can we take advantage of these new features if they are locked behind an Nvidia GPU? That’s the first question I would ask.