I also have two NVIDIA GeForce GTX 1070 Ti cards.
When I use one card, I get 0.2 fps when up-converting from 1080p to 4K using the Gaia model. When I switch to All GPUs, Max Mem%, and 4 processes, I get 0.4 fps, but I also get an Unknown Error during the preview process. When I click the X to see the error, the dialog is blank.
Two things: the i9-9900K might be able to handle two at a time, but probably not more than that. I have a GTX 1060, and those speeds sound about right for a 1070 doing HD to UHD. TVAI is really heavy on the CPU, though Gaia uses the GPU more than the other models.
From that command, it’s converting to TIFF and NVENC H.264. That’s going to fill your C: drive really fast, but since it’s only doing a preview, you should have enough space for that.
Does it work if you output to PNG or TIFF instead of H.264?
Also, it’s got the device set to 2. Do you have the integrated Intel graphics enabled?
Yeah, mine doesn’t show the iGPU either, though I probably made a point of disabling it in the BIOS.
I might be wrong, but I think device=0 is the main GPU, so device=1 should be another one, like your second 1070. I only think that because when I remote into my other computer, the command with device=0 fails. (I think that’s because remoting changes device 0 to the emulated remote graphics adapter.)
Sorry, that’s my only lead. I could be totally wrong, and maybe device is only used to tell it whether to use the GPU or the CPU.
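One way to check which index maps to which physical card is `nvidia-smi -L`, which lists each NVIDIA GPU with its driver-side index. Whether TVAI’s device flag actually follows this same ordering is my assumption, not something confirmed anywhere, but it’s a quick sanity check:

```shell
# Sketch: list NVIDIA GPUs and their indices. On a dual-1070 Ti box
# you'd expect two "GPU 0:" / "GPU 1:" lines; I'm assuming (not
# certain) that TVAI's device=N refers to these same indices.
if command -v nvidia-smi >/dev/null 2>&1; then
  nvidia-smi -L
else
  echo "nvidia-smi not found"
fi
```

If only two GPUs show up, a device=2 in the command would point at nothing, which could explain a blank error dialog.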
The DirectX requirement is a bit of a downside of TL software. At the moment the Radeon Instinct is comparatively cheap (€3,500) and has a lot of FP16 power (180 teraflops), but it lacks DirectX hardware and has no monitor output. Because of that, you can’t use it for TL software.