Will updating from 2.6.4 to 3.3.5 decrease render times?

I render whole movies, and that takes time. I got a faster CPU, and Topaz 3.x seems to be faster with Proteus.
But now there is Iris, which does a much better job on some files.
36 h with Proteus or 5.5 days with Iris… (M2 Mac) for one 90-min movie.

I chose the better quality. Topaz keeps getting better and faster, but new models with much bigger benefits cost time.
Decide whether you want a speed-up with the old settings or better quality with the new ones.

Correct, upload the original 480p clip. That way I can run an upscale test and tell you how long it took me to process it.

We are not looking into upgrading your CPU, as that would be diminishing returns in the case of TVAI.
We are looking into upgrading your graphics card. Since I have the same rig as you apart from the graphics card, once I test your clip and give you the time it took me to upscale it, you will have an idea of what performance boost to expect from upgrading the graphics card only…

Ah yes, that’s what I thought you meant, just wanted to make sure. Ok, attached.

I’ve been running some experiments with various settings to compare render time and output quality.
Here are the settings for the first two attempts:

#1 Artemis High Quality, crop to fill frame off, grain off, CRF 15, 7.11 sec/frame
started 8:40 AM 7-23-23, ended 9:15 AM; 40 MB; 35 mins

#2 Artemis High Quality, crop to fill frame off, grain off, CRF 10, 8.20 sec/frame
started 9:24 AM 7-23-23, ended 10:03 AM; 78.5 MB; 39 mins

My first attempt about 4 days ago, with grain, took 6 hours and produced a 116 MB file. I don’t see any difference between the files rendered with grain and without.
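
If it helps to sanity-check these times, here’s a rough Python sketch of the arithmetic; the clip length, frame rate, and the estimated_render_minutes helper are my own illustrative assumptions, not values or tooling from Topaz:

```python
# Rough estimate of total render time from the "sec/frame" figure Topaz
# reports. Clip length and frame rate below are assumptions for illustration.

def estimated_render_minutes(clip_seconds: float, fps: float, sec_per_frame: float) -> float:
    """Total frames times seconds per frame, converted to minutes."""
    total_frames = clip_seconds * fps
    return total_frames * sec_per_frame / 60.0

if __name__ == "__main__":
    # A ~12-second NTSC DVD clip at ~29.97 fps, rendering at 7.11 sec/frame.
    print(f"{estimated_render_minutes(12, 29.97, 7.11):.0f} min")        # ~43 min
    # The same per-frame rate scaled up to a 50-minute episode.
    print(f"{estimated_render_minutes(50 * 60, 29.97, 7.11):.0f} min")   # roughly 7.4 days
```

The estimate lands a bit above the 35 minutes I actually logged, so treat it as a ballpark that depends on the real frame rate and clip length.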

I tried to use the ‘interlaced’ models, but the ‘keep audio’ option was greyed out.

When you run your test, can you list the various settings/options that you used?

Hmmm…I don’t see any way to attach a file…do I need to post it to Google Drive, then post the link here? OK, found the ‘upload’ button, but it says ‘not allowed’.

Better to just place it on Google Drive. Simpler that way.

OK, here’s the Google Drive link…The Invaders S2E8 Dark Outpost.mkv - Google Drive

In the meantime, I’ve experimented with various combinations of settings…Artemis Low, Medium, and High Quality; grain on and off; CRF from 1-15; noise; high compression…For my 12-sec, 9.32 MB DVD clip they all rendered at about 7.14 sec/frame, taking about 35 mins, with file sizes ranging from 40 MB to 254 MB…and THEY ALL LOOK THE SAME. The 40 MB file looks as good as the 254 MB one.

I’m currently running Gaia High Quality, CRF 10, no grain…but it’s running at 51.01 sec/frame, with over 3 hours to go. The split-screen progress preview doesn’t look very promising.

I ran the tests with all the models, and it upscaled very fast for me.

With VEAI 2.6.4 it took me 1m 6s to upscale it to 1080p using Artemis HQ - Auto (I did not test the other models with VEAI; those I tested with TVAI v3.3.5 - see below), which is twice the time it took me with TVAI v3.3.5. I encoded it as H.264 CRF 21 in VEAI; in TVAI it was encoded using H.265, which is supposedly slower than H.264. Even then, TVAI v3.3.5 beat VEAI v2.6.4 by more than 100% in speed.


Here are the times it took for each of the AI models in TVAI v3.3.5.

Here is one I upscaled to 4K with TVAI v3.3.5

These are the settings I used for the upscale with TVAI v3.3.5.

You can download the videos I upscaled and see which one you like the most:
https://drive.google.com/drive/folders/1NtiWj51Bg6M6yt1nqiuN2yDI9VcqdW_G?usp=sharing

My Specs:
Processor: Intel(R) Core™ i7-4930K CPU @ 3.40GHz (Ivy Bridge-E)
RAM: 32.0 GB DDR3 SDRAM Quad Channel 1600 MT/s 9-9-9 (CL9 / CAS# 9.0)
OS: Windows 10 Enterprise Edition (64-bit) 22H2
GPU: NVIDIA GTX 1080 Ti (11 GB VRAM)

As for the GPU: in gaming, the performance of the GTX 1080 Ti is on par with the RTX 3060.
In TVAI (v3.x) I think the 3060 would perform better, since it has Tensor Cores and RT Cores and mine doesn’t (the 1080 Ti only has CUDA cores), and TVAI uses those RT/Tensor Cores to its benefit. The CUDA core count is the same for both the 1080 Ti and the RTX 3060 (3584 CUDA cores). The 3060 also comes with GDDR6 while the 1080 Ti has GDDR5X. So my bet is that you would get better results with an RTX 3060 than I am getting with my GTX 1080 Ti.

As you can see, the only major differences between your computer and mine are the GPU and the Topaz version (v3.3.5).

If you want to check the difference between the outputs, I suggest using the Video-Compare tool.

Thanks for all your hard work, much appreciated!

Wow, VEAI 2.6.4 sure was fast, and made a small file. I’m going to run a test on my system using CRF 21 and see how long it takes. When you hover your mouse over the ‘Constant Rate Factor’ question mark, it says ‘lower than 17 produces better quality’. This is why I was using a CRF of 15, or 10. But it looks like 21 is just as good?

Yes, our systems are almost identical except for the GPUs, but I wonder if my Windows 10 Home edition, version 21H2, makes a difference compared to your Enterprise 22H2? I’m not sure what kind of RAM I have, but yours is probably better?

So, you would recommend I upgrade to an RTX 3060? Will my system support it? What’s the difference between the RTX 3060 and the RTX 3060 Ti? Which one should I consider?

I’ve tried using the ‘video compare tool’…is this the one that splits the screen into four panes with four different ‘models’, or the split-screen slider that lets you see the original video and the upscaled video?

In H.264 and H.265, CRF ranges from 0 to 51 (like the QP). 23 is a good default for x264, and 28 is the default for x265. 18 (or 24 for x265) should be visually transparent; anything lower will probably just waste file size. A change of ±6 results in about half or twice the file size.
I never use sub-20 CRF; usually I am between 23-26.
You can read more about it in this Post.
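
If it helps to see that rule of thumb as numbers, here is a small Python sketch; the 100 MB baseline at CRF 23 and the estimated_size_mb helper are made-up reference points for illustration, not measured encoder behaviour:

```python
# Rule of thumb: each +6 in CRF roughly halves the file size, each -6
# roughly doubles it. The 100 MB figure at CRF 23 is an arbitrary baseline.

def estimated_size_mb(crf: float, ref_crf: float = 23, ref_size_mb: float = 100.0) -> float:
    """Scale the reference size by 2 ** ((ref_crf - crf) / 6)."""
    return ref_size_mb * 2 ** ((ref_crf - crf) / 6)

if __name__ == "__main__":
    for crf in (15, 17, 21, 23, 26, 28):
        print(f"CRF {crf}: ~{estimated_size_mb(crf):.0f} MB")
    # CRF 15 comes out around 2.5x the CRF 23 size while looking much the
    # same past the transparency point, which is why very low CRF values
    # mostly just cost disk space.
```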

I can assure you the OS version would have zero impact. Enterprise just means it can be managed remotely by your IT staff, plus some other features mainly for organizations.
RAM is also insignificant at these levels, since we are on the same generation. Anyway, most of the work goes through the GPU VRAM, assuming you have a newer generation of GPU (1080 or above).

I would try and aim for a 3080 if possible. If not, I assume the Ti might be better, but that is beyond my knowledge as I own neither of them. I believe the 30 series would fit your system, but better double-check.

It’s this one.

Thanks for the fast reply, and that CRF link…loads of info. I’m going to research those RTX 3060 and RTX 3060 Ti GPUs, prices, and whether they will work in my system. My motherboard is an MSI B75A-G43 (MS-7758) 2.0…I think this is from when I built my PC in 2013…time to upgrade!


IMO it’s time for a new computer.
I have an RTX 3050 and I’m happy; for me, it’s fast at a low price. Today I would buy an RTX 4060 (Ti?), and that is what I would recommend to you if you are considering buying a 3060 Ti.

@Kadet @tkmops The 4060 performs about the same as the 3060, because of the cut-down memory bus, fewer GPU cores, and the x8 PCIe link (PCIe lanes), unlike the 3060, which is x16. This is especially significant for older computers with PCIe Gen 3, and our mate has PCIe Gen 3…
The RTX 3060 Ti outperformed the 4060 Ti in many titles and was on par in others. In almost all cases the 3060 Ti outperformed the 4060, and the 3060 outperformed the 4060 in some titles as well.

All the critics on YouTube, like Linus Tech Tips, Gamers Nexus, Hardware Unboxed, JayzTwoCents, etc., are adamant: “stay away from this card”.


I agree, but I would not recommend the RTX 4060. Better to get a higher 40xx card, for the reasons given! :slight_smile:


The problem is that he is on a tight budget, so a GPU replacement would give him the best boost money can buy with his limited budget. That is why we were only talking about GPUs.

A 2060 Super would be a better bet, as there are plenty of new ones around, way cheaper, and with the same performance as a 4060 Ti.

Thanks for all the responses, everyone. Still trying to find out which cards my system will support. Since my motherboard is so old (from 2013), I’m probably limited in which cards will work. I’ll post back when I get more info.

Yeah, sorta tight…food inflation is kickin’ my A$$. I think I can afford a max of $500 for upgrades (but my motherboard needs to support whatever new GPU I get). In a few years, hopefully inflation will have ‘cooled’, and I can afford a whole new rig with Win 11 ($4K-5K).

I think on an old mainboard like this, even a great graphics card might not live up to your expectations. :eyes:

Yeah, there’s that also. It may not be worth the $300+ for a new GPU (if my motherboard even supports it)…there may be bottleneck issues. I’m not sure what my ‘bus’ speed is, but that would limit everything.

I know this issue from a good friend who got an AMD RX 6700 XT for his AMD Bulldozer-architecture machine from 2015. The GPU can only deliver a tiny fraction of its full potential on that machine! :pensive:

It will. Mine is as old as his, and the comparatively old GPU I have does wonders.
I have the same CPU, MB, and RAM that he has, and with my 1080 Ti I process my video pretty OK. 100 times better than what he is getting now.