I have a Dell Precision 7760 with an i9-11950H, an NVIDIA RTX A4000 Laptop GPU, and a GeForce RTX 3060 Ti installed in a Sonnet 750 Breakaway Box (eGPU) attached via USB-C cable.
I have been steadily doing upscales of ~45-minute TV shows (Star Trek: The Next Generation episodes, if you must know) from 1080p to 4K. Using all GPUs, processing time for one episode is normally somewhere between 3 and 3.5 hours. I upgraded to the latest Video AI (3.5.4) and had been having no trouble (I believe it was set to "Auto" rather than "All GPUs," but I can't be certain). Upon starting the third upscale of the day, Video AI showed a processing time of over 7 hours for the same process (upscale from 1080p to 4K via the Gaia model).
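For scale, here's my back-of-the-envelope throughput math (assuming a ~24 fps source, which I haven't verified; the numbers are just illustrative):

```python
# Rough throughput check; the ~24 fps source rate is an assumption
EPISODE_MIN = 45
FPS = 24  # assumed source frame rate

frames = EPISODE_MIN * 60 * FPS   # ~64,800 frames per episode
normal = frames / (3 * 3600)      # 3-hour run  -> ~6.0 frames processed per second
slow = frames / (7 * 3600)        # 7-hour run  -> ~2.6 frames processed per second
print(f"normal: {normal:.1f} fps, slow: {slow:.1f} fps")
```

In other words, effective throughput dropped to well under half of normal.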
Any idea why this is happening? Video AI shows all three GPUs in Preferences: the Intel integrated GPU, the A4000, and the 3060 Ti. I unplugged the Sonnet box (both removed power and disconnected the USB-C cable), rebooted the machine, uninstalled Video AI, and then reinstalled it but rolled back to 3.5.2 (I've been having trouble with 3.5.4 on other machines, which give the dreaded "error" when "All GPUs" is selected). That exhibited the same issue.
In desperation, I rolled back to 3.5.1, and gloriouski, I'm back to about 3 hours of processing time (and I can choose "All GPUs" without getting an error)! I think I'll stay on 3.5.1 for the time being (on this machine I'm only doing 1080p-to-4K upscales, so I don't need the Iris V2 model).
What's strange is that this started suddenly. I should note that, because of the outboard GPU, Video AI does not complete processing, so I have to close out Video AI and restart it for each upscaling session. Something evidently changed between the start of my second upscaling session and the start of my third, but I don't know what it would've been.
2023-10-16-17-57-44-Main.tzlog (67.5 KB)
2023-10-16-18-27-56-Main.tzlog (68.3 KB)
2023-10-16-19-59-10-Main.tzlog (195.2 KB)
2023-10-16-20-10-50-Main.tzlog (19.4 KB)
2023-10-16-20-11-54-Main.tzlog (68.4 KB)