- Mac Support. HQ/HQ-CG: OV iGPU/CPU support; LQ: TF-CPU AVX2+MKL support
- Audio Support. Variable FPS not supported. Wait for LibAV
- Better speed
- 100% Denoise/Deblock preset
- Root path file crash fixed
First feedback on new release after testing some footage:
Great work this update!
Most remarkable is that face recognition now works much better. Faces mostly keep their lines and structure; in the last version, faces often got destroyed. A very good improvement.
Over the next few days I have to test this new version on other footage that I already used with the old version.
Render speed has also improved remarkably.
You're on the right track.
I just bought this, and every time I try it, it falls over. I have a MacBook Pro (Retina, 15-inch, Mid 2015), 2.2 GHz Intel Core i7, running macOS High Sierra. Any tips most welcome.
Please go to the main website and raise a support request.
Running VE AI on a 4.2 GHz quad-core Retina iMac with a Radeon Pro 580 and 64 GB RAM, plus an external GPU with an AMD Radeon VII 16 GB.
I see no option to use my eGPU card, or the internal card. There’s always a check next to “use CPU”, regardless of how many times I click it and restart the app.
So far, upscaling a ~200MB clip is reporting to take ~ 400:50 min.
I assume that’s 400 hrs and 50 minutes because it’s already been running for about 24 hrs and it’s barely into the clip.
For this to be useful in any way, it’s got to be orders of magnitude faster. I was considering a purchase, but on a practical level it’s basically useless at this level of performance.
And I don’t see an edit button for my post - that’s 4.2GHz of course.
I think the Mac one is CPU-intensive. They could use OpenCL on Mac. I have DaVinci Resolve Studio for Windows and Mac, and Affinity Photo on my iMac is able to use all GPUs on the iMac.
Have just installed the Mac version on my iMac 27"-5K: 4.2 GHz quad-core Intel Core i7, 48 GB 2400 MHz DDR4, Radeon Pro 580 8 GB, macOS 10.15.3.
The software works, but as it is now it is not usable in production mode; it takes way too long, even to fix a 2-minute video shot on an iPhone. I guess that will improve in coming versions.
I also have a vintage 2009 Mac Pro upgraded to 5,1: 2 × 4-core CPUs at 3.33 GHz, 64 GB RAM. The present graphics card has only 1 GB of RAM and does not support Metal, so I am running High Sierra. I will need a better card, but I am not sure that 'modern' cards will work in my older setup. I have been looking on eBay and found one I can afford: an NVIDIA GeForce GTX 980 with 4 GB GDDR5. My question is: will this card work nicely with your software? Any suggestions on other card types are welcome.
I’m afraid not - only 10xx series and above are supported.
Arrrgh… thought I had found a solution. Would you know if these 10xx-series cards will also fit my Mac Pro, physically and spec-wise? I have been away from the PC world for more than 10 years.
Or a link where I can ask?
I’m afraid I don’t as I’m a Windows user. No doubt a Mac user will be along and can answer your questions.
I tried the Mac version. The only way I could get a difference between my original footage and the enhanced one was to make a version of my footage downsized by a factor of 2 (half the resolution); then I could see a big difference (less blur). I got this tip from Imo, who wrote to me on this forum that my footage's detail density did not match its pixel density and suggested converting my video file to a smaller pixel size. It worked. Strange that Topaz doesn't give this tip! Did anyone here ever get a good result without doing this (on the Mac version)?
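For anyone who wants to try the same trick before feeding the clip to VE AI, here is a rough sketch of the halving step with ffmpeg. This is just my assumption of how to do it, not anything Topaz documents; it assumes ffmpeg is installed, and the file names are placeholders.

```shell
# Placeholder command: scale=iw/2:ih/2 halves both the input width (iw)
# and height (ih); -c:a copy leaves the audio stream untouched.
halve_cmd='ffmpeg -i original.mp4 -vf scale=iw/2:ih/2 -c:a copy half_res.mp4'
# Printed for inspection rather than executed, since the input file is a placeholder:
echo "$halve_cmd"
```

Note that some codecs require even frame dimensions, so on odd-sized sources you may need `scale=trunc(iw/4)*2:trunc(ih/4)*2` instead.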
Eureka! I have found a GTX 1080 8 GB card that fits my Mac Pro. Now I am eagerly waiting for it to arrive from Germany.
Interesting, and I just wonder if you could then use this downscaled file to make an upscale that is still good!
I downscaled, then used the first 100% preset to decrease the blur. That works! Then I used the 4K preset to increase the size. The result took off some more blur, but all the characters' faces were strange, with artifacts that made them look like Madame Tussauds subjects. The same kind of effect you get with some Photoshop filters. I should mention that my footage was a black-and-white movie from the forties.
There is no doubt that this is the best video upscaling software for consumer use. However, it takes over 3 hours to render a few minutes of video. I have looked at Task Manager to see how it uses the CPU/GPU: it only uses around 25% of the GPU and around 40% of the CPU. I have an Intel i7-9700 (overclocked) and an NVIDIA 1660 Ti graphics card, so I would have to spend a serious amount of money to get a faster render speed. This means that, on a practical level, only those with very powerful and very expensive systems can take advantage of this program. I wanted to use it for upscaling 1080p 100 fps video from my Sony DSLR, but it would take all day to render just a short clip on my system. So: great software, but it's way, way too slow for most people to use in a realistic time frame.
Looking forward to the day (hopefully soon) when Video Enhance AI supports the native AMD graphics processors in Mac computers.
I’ve got the same issue on my iMac Pro under High Sierra. It crashes when I launch the process. However, it is supposed to run even under macOS 10.12.
Hi, thanks for that first update! Sounds great to me. Already loved the beta and initial release, your work is absolutely amazing!
Could you maybe add a “check for updates” function or a little update notification box in the future? It would be much appreciated.
Keep it up, you rock!
Been using this all weekend on a 2017 iMac 3.8 w/64GB. As others have said, it’s dog slow. But I have been getting great results. Existing 1080p to 4K is superb. But my main use is taking old commercials from my early days in advertising (the 90s) and scaling them to 1080. I’ve been taking the original files, using Neat Video noise reduction on them, and then scaling. Usually works very well. Here are a couple samples:
1080p to 4K: vimeo dot com/16711099
SD to HD (and converted to 16:9): vimeo dot com/256475816
Can’t post direct Vimeo links on here, apparently.