Brand-new user — picked up v2.6.1 in the Black Friday sale.
Using VEAI on 30-year-old interlaced Betacam SP footage.
Used Dione Interlaced DV with impressive results.
There are several de-interlacing models. Is it best practice to simply try each one?
Tried “Proteus Fine Tune”, but it didn’t de-interlace. Is this normal, or is this user error?
Should I use a two-step process? First de-interlace, then further processing with another model?
Yes, you would need to run two passes.
Dione is the dedicated de-interlacing model; no other model provides this feature.
I’m using Dione to de-interlace and upscale a 6-minute, 720-line DV video to 200%. VEAI is currently rendering at about 0.3 sec per frame, so according to the info at the bottom of the screen the 6-minute video will take about an hour and a half.
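For what it’s worth, that ~1.5-hour estimate is consistent with Dione’s field-to-frame doubling (each interlaced frame yields two progressive output frames). A rough back-of-envelope check, assuming NTSC DV at 29.97 fps — this is just arithmetic, not VEAI’s actual internal math:

```python
# Rough ETA check for the render described above.
# Assumptions: NTSC DV at 29.97 fps; Dione outputs one frame per
# field, doubling the frame count; ~0.3 s per output frame.
fps_in = 29.97                         # NTSC DV frame rate
duration_s = 6 * 60                    # 6-minute clip
frames_out = duration_s * fps_in * 2   # field-to-frame doubling
eta_min = frames_out * 0.3 / 60        # seconds -> minutes
print(round(eta_min))                  # ~108 minutes, i.e. about 1.5 hours
```

So the on-screen estimate lines up once you account for the doubled output frame count.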
Resource Monitor and Task Manager (Windows 10) both show about 10–15% CPU utilization and similar GPU utilization. The GPU is an RTX 3060, RAM is 64 GB, and the CPU is a Ryzen 9 3900X running at about 4 GHz.
Is this normal? Why such low resource utilization?