Multiple GPUs in Video AI

I have two RTX 3090s, but Video AI only uses one GPU. It doesn't even max out that GPU either; it only uses around 20 percent.

Is this by design?

I have only one RTX 3090 in my system.
And yes, the RTX 30x0 series is not fully supported yet.
They are working on it.
Well, at least I hope they still are.

The recommendation/workaround from the Topaz developers is:

Run multiple instances of VEAI in parallel.

I do so, and I can easily run 3 instances rendering 3 different videos at the same time, which saves me 30-50% of the overall rendering time.

With 3 instances you can use up to 95% of the RTX 3090's GPU power, but even with 3 instances only up to about 10 GB of VRAM is used.

On your system I expect you can run at least 2 instances on every GPU, making it 4 to 6 in total, if the rest of your workstation (CPU and RAM) supports it, as you “should” be able to set every instance to use either GPU 1 or GPU 2.
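If you prefer to script that setup, here is a minimal sketch. It is not official Topaz tooling: the install path is a guess for a default Windows install, and pinning an instance via CUDA_VISIBLE_DEVICES is an assumption that VEAI may ignore, in which case you select the GPU in each instance's preferences instead.

```python
# Launch several VEAI instances in parallel, hinting each one toward a GPU.
# VEAI_EXE is an assumed default install path; adjust it for your system.
import os
import subprocess

VEAI_EXE = r"C:\Program Files\Topaz Labs LLC\Topaz Video Enhance AI\Topaz Video Enhance AI.exe"
GPU_IDS = [0, 1]          # two RTX 3090s
INSTANCES_PER_GPU = 2     # 4 instances total

processes = []
for gpu in GPU_IDS:
    for _ in range(INSTANCES_PER_GPU):
        env = os.environ.copy()
        env["CUDA_VISIBLE_DEVICES"] = str(gpu)  # restrict this instance to one GPU (may be ignored)
        processes.append(subprocess.Popen([VEAI_EXE], env=env))

print(f"Started {len(processes)} instances across {len(GPU_IDS)} GPUs")
```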

Well, there are small “issues” with the rendering queue files: one VEAI instance's save.json file overwrites the other instances' save.json files.

So you can lose already prepared queues, for example during a system crash.
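A simple workaround is to snapshot the queue file periodically, so an overwrite or a crash does not wipe a prepared queue. This is just a sketch under my assumptions: the save.json location below is a guess, so point QUEUE_FILE at wherever your installation actually keeps it.

```python
# Periodically back up VEAI's queue file with a timestamped copy.
import shutil
import time
from pathlib import Path

QUEUE_FILE = Path.home() / "AppData/Roaming/Topaz Labs LLC/Video Enhance AI/save.json"  # assumed location
BACKUP_DIR = Path.home() / "veai_queue_backups"
BACKUP_DIR.mkdir(exist_ok=True)

while True:
    if QUEUE_FILE.exists() and QUEUE_FILE.stat().st_size > 0:  # skip zeroed-out files
        stamp = time.strftime("%Y%m%d-%H%M%S")
        shutil.copy2(QUEUE_FILE, BACKUP_DIR / f"save-{stamp}.json")
    time.sleep(300)  # every 5 minutes
```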

And in the 2.0.0 GUI I see small “cosmetic” side effects, like a “stop rendering” button showing while there is no rendering process running in “this” instance. It's safe to just close it and start the process at any time.

Sometimes the AI model mysteriously changes from one model to another, and the compression settings switch back to the previous settings until you restart the GUI. Sometimes the resolution also switches and is not what you wanted to render. Always check the filename to be written to make sure it is what you expect.

Ahh I see. Didn’t know it wasn’t supported yet. Thanks for letting me know.

I am using two RTX 3090s. To max them out, start Video Enhance multiple times, each with a separate file to enhance, or each with a part of the same file. This works great, and I have no trouble at all maximizing GPU usage (look at the GPU usage and power stats in GPU-Z, not the crappy Task Manager GPU stats).
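If you want to check the same numbers from a script instead of GPU-Z, here is a minimal sketch using NVML. It assumes the nvidia-ml-py package (imported as pynvml) is installed; it only reads utilization and power, nothing VEAI-specific.

```python
# Print core/memory-controller utilization and power draw for each NVIDIA GPU.
# Requires: pip install nvidia-ml-py
import pynvml

pynvml.nvmlInit()
for i in range(pynvml.nvmlDeviceGetCount()):
    handle = pynvml.nvmlDeviceGetHandleByIndex(i)
    util = pynvml.nvmlDeviceGetUtilizationRates(handle)      # percent busy
    power_w = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000  # reported in milliwatts
    print(f"GPU {i}: {util.gpu}% core, {util.memory}% memory controller, {power_w:.0f} W")
pynvml.nvmlShutdown()
```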


I am doing the same, as recommended by the Topaz devs, with just one RTX 3090, but I have problems, since the queue file of instance one (save.json) is always overwritten by instance two or three.

I use the queue files heavily, so that is really disturbing, especially in the case of a system crash, after which the queue file is emptied out to zeros…