MODEL_INFERENCE_ERROR on Starlight Sharp in the Cloud

Hi, I’m using Topaz Video AI 1.0.1 and trying to use Starlight Sharp. My computer is probably not powerful enough to render it locally, so I’m sending the job to the cloud, but every time I do I get a “MODEL_INFERENCE_ERROR” message. It happens on a number of different videos I’ve tried. Most videos are H.264 at 720p resolution, one was around 570p, and all are fairly short (10–40 seconds). The regular Starlight model works just fine. The full message is:

0199eed9-03c2-72e0-92d5-de243344567d {"estimates":{"cost":[21,24],"time":[3463,3504]},"message":"Processing failed - MODEL_INFERENCE_ERROR","progress":0,"status":"failed"}

Can you email the support team at help@topazlabs.com with the logs from the app so we can have the devs take a look?

Generally that error stems from the model not being able to read part of the file. Running the source video through Handbrake can clean up stray metadata tags or corrupted frames that cause issues; once it’s re-encoded, try sending it to the cloud again.
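If you’d rather use ffmpeg than Handbrake, a cleanup pass along these lines should have the same effect. This is a sketch, not an official Topaz workflow, and `input.mp4`/`cleaned.mp4` are placeholder names:

```shell
# Strip container metadata and re-encode the video stream to H.264.
# -map_metadata -1  drops global metadata tags from the container
# -c:v libx264      rewrites every frame, which usually clears up
#                   decode glitches from damaged source frames
# -crf 18           near-lossless quality, so the upscaler isn't fed artifacts
# -pix_fmt yuv420p  widely compatible pixel format
# -c:a copy         passes the audio through unchanged
ffmpeg -i input.mp4 -map_metadata -1 -c:v libx264 -crf 18 -pix_fmt yuv420p -c:a copy cleaned.mp4
```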

Your cloud credits should have been refunded automatically for the processes that failed with that error.

Same here, even after converting the input file with ffmpeg.

This topic was automatically closed 60 days after the last reply. New replies are no longer allowed.