Is GPU bit-width important in rendering?

I plan to buy a new graphics card soon, and I noticed that the new 4070 Ti's memory bus width is quite narrow. If I use this card, will rendering be slower than on a 3090 or 3090 Ti?

Not a technical answer, but if you look at the user benchmarks for the 4070 Ti you'll see comparisons between rigs similar to what you might be scoping out for yourself…

4070Ti w/ i5-12600K

  • 4X Slowmo — Apollo: 19.72 fps, Chronos: 16.47 fps, Chronos Fast: 23.24 fps

3090 w/ AMD Ryzen 9 5900X
(The CPU is about 30% slower according to a quick search on UserBenchmark.)

  • 4X Slowmo — Apollo: 16.48 fps, Chronos: 15.93 fps, Chronos Fast: 23.25 fps

3080s seem to perform similarly.

A quick perusal of the entire thread shows:

  • You have to have a 3xxx+ series to have skin in the game
  • The only breakaways happen with 4090 cards and the newest CPUs

Video AI v3.1.X - User Benchmarking Results - Topaz Video AI / User Benchmarks - Topaz Community
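The benchmarks above suggest bus width alone doesn't decide rendering speed. If you want to see what the narrower bus actually costs in raw memory bandwidth, here's a quick back-of-the-envelope calculation — the bus widths and per-pin data rates are the commonly published specs for these cards, so treat the results as approximate peak figures, not measured throughput:

```python
def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    """Peak memory bandwidth in GB/s: (bus width in bytes) * per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

# Published specs: 4070 Ti = 192-bit bus, 21 Gbps GDDR6X;
#                  3090    = 384-bit bus, 19.5 Gbps GDDR6X.
cards = {
    "RTX 4070 Ti": (192, 21.0),
    "RTX 3090":    (384, 19.5),
}
for name, (bus, rate) in cards.items():
    print(f"{name}: {bandwidth_gb_s(bus, rate):.0f} GB/s")
# 4070 Ti ~504 GB/s vs 3090 ~936 GB/s
```

Despite having roughly half the raw bandwidth, the 4070 Ti keeps pace in the benchmarks above — presumably the much larger L2 cache and newer architecture compensate, which is why bit-width by itself is a poor predictor.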

At this point you have probably bought something, but here’s my experience anyway:
I bought a 3080 Ti the day it was released for about $1,400 (because I felt like I had overpaid for a 3090 a few months earlier at around $2,000).
Just looking at the 1920x1080 Artemis number:
3080 Ti = 22 fps
On my server computer, I put in an AMD RX 6900 XT for $600:
6900 XT = 12 fps (when it worked — it would often produce black patches, so the effective number was much lower because of re-runs).
Found a used 4070 Ti for about $870 to replace the 6900 XT.
4070 Ti = 24 fps.

So at the prices I paid, the 4070 Ti gives the most speed per dollar. But if you can get a 3080 Ti for $600-$800, you'll get a lot more fps per dollar than I did.
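To make that price/performance comparison concrete, here's the arithmetic using the prices and 1080p Artemis numbers above (note the 6900 XT's effective rate was lower in practice because of the black-patch re-runs, so its figure here is optimistic):

```python
# fps and price actually paid, from the post above
cards = {
    "3080 Ti":    (22, 1400),
    "RX 6900 XT": (12, 600),
    "4070 Ti":    (24, 870),
}
for name, (fps, price) in cards.items():
    print(f"{name}: {fps / price * 1000:.1f} fps per $1000")
# 3080 Ti ~15.7, RX 6900 XT ~20.0, 4070 Ti ~27.6 fps per $1000
```

At a used price of $600-$800, the 3080 Ti's 22 fps works out to roughly 27-37 fps per $1,000 — which is the point about getting more fps per dollar than I did.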