It’s using the CPU to scale to 8K? Excuse me? I have it set to use the GPU, with the maximum GPU memory. Why is it not working as it should?
Check the other reporting types in the GPU section. You should find it loading the GPU for video / image processing, I think.
Now I’m interpolating a video to 60fps, and it’s barely using the GPU while the CPU is at 3-8% usage.
Look under graphics_1, at least that’s what it is on my GTX 1060. It uses about 97% doing DVD to 1080p.
That graph doesn’t exist on my RTX 3080 Ti. It shows up under 3D instead. And yes, it never uses much of it.
I have the same problem with my 3080 GPU. VEAI is running with no background applications, and the GPU is only at 25% in the Windows 10 Task Manager…
My system is really not slow in other respects: an Intel Core i9-10900K CPU, 32 GB RAM, and NVMe disks (RAID 0) with read/write speeds of 5 GB/s…
And still VEAI is running like a snail…
Does anyone have any thoughts on improving the speed?
For info: I’m converting old videos from 528x240 to 1280x720 MP4.
Thanks for all your comments in advance and have a nice day!
It should use CUDA to exploit the full potential of the GPU, but it doesn’t look like it does. Other video post-processing AIs use CUDA (RIFE, ESRGAN, NCNN, etc.), and with those the GPU goes crazy with coil whine, and that’s how it should be: using 100% of its power.
I can’t for the life of me get this to use my 3090 - it only uses my i9-9900K, at 80% or so…
Other video programs use my GPU just fine… Premiere Pro / After Effects / and so on…
What am I doing wrong?
You’re not doing anything wrong; the program is…
Wow. That’s too bad. I figured a program this expensive would actually use the GPU instead of the CPU.
Weird thing is, they actually let you select your GPU in preferences, acting like it’s going to use it.
The program is too expensive for what it offers: it does not interpolate effectively, and its scaling models do not take 100% advantage of the GPU… it is a very poorly optimized program. It seems the developers do not want to implement open-source technologies like RIFE, NCNN, FLAVR, or ESRGAN for interpolation or scaling, which work much better than some VEAI models and AIs.
It is sad to have paid so much for a program that is not the best in its field (AI video post-processing); it could be, but the devs apparently don’t want it to be.
Has anyone figured this out yet?
I have a 2080 Ti and I was told by support it was because I was using Game Ready Drivers. I have since switched to Studio Drivers as suggested and it still uses my CPU over the GPU.
There is nothing wrong with TVAI at all.
It’s a skill issue of people using Task Manager incorrectly.
My CPU usage on an i9-9900K maxes out on all cores, but my GPU sits at 7 to 8 percent load with only 2 to 3 GB of VRAM out of 11 GB in use, even though it is set to use up to 90% of the GPU’s resources.
I’m not sure if the CUDA cores are working away, as I didn’t switch to that graph. But I know that when the program first came out, my GPU would be working at 80% to 90% load while the CPU was only doing 20% to 30%. I’ll have to check the CUDA core graph tomorrow when processing.
If you want to see what your GPU load really is, don’t waste your time with Task Manager.
Use GPU-Z, which shows the real core load, or HWiNFO64.
Task Manager shows the 3D engine of the GPU. TVAI is not using the 3D engine to encode; it uses the tensor cores, which are a different part of the GPU. That is why the load is not reflected in Task Manager, and why you falsely believe it is not using the GPU. The exception is the GTX series, which has no tensor cores; those cards fall back to the 3D engine, so the load does show up in Task Manager. Use GPU-Z to monitor your power draw etc. and you will see it’s working hard.
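If you’d rather log the load than watch GPU-Z, nvidia-smi reports the same engine-wide numbers on the command line. Here is a minimal sketch (it assumes an NVIDIA card with nvidia-smi on the PATH; the helper names are my own, not anything from TVAI):

```python
# Sketch: read real GPU load via nvidia-smi instead of Task Manager's
# default "3D" graph, which misses tensor-core work.
import subprocess

def parse_sample(line: str) -> list[float]:
    """Parse one nvidia-smi CSV line like '95 %, 448.20 W, 7430 MiB'
    into plain numbers, dropping the unit suffixes."""
    return [float(field.strip().split()[0]) for field in line.split(",")]

def poll_gpu() -> None:
    """Query utilization, power draw, and VRAM use once and print them."""
    out = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=utilization.gpu,power.draw,memory.used",
         "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    util, watts, vram = parse_sample(out)
    print(f"GPU load {util:.0f}%  power {watts:.0f} W  VRAM {vram:.0f} MiB")

# On a machine with an NVIDIA GPU you would run:
#   poll_gpu()
```

Note that, like GPU-Z, this reports overall utilization and power draw, so it reflects tensor-core work that the Task Manager 3D graph hides.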
As noted, you can’t rely on Windows Task Manager. It also depends on the model being used: some lean about evenly on the CPU and GPU, while others are much heavier on the GPU. I am currently running a 1080p-to-4K upscale with Proteus.
This is my CPU load (I9-12900KF):
Not too much utilization, but look at the peak temperature and peak wattage… I wouldn’t want to see my CPU work much harder.
But here is my GPU Load:
Pay attention to the power use… peaking at 448 watts. Utilization peaks at 95% and averages around 70%.