RTX 3080 VRAM issues after installing latest driver - VEAI

Same was happening on my 3070.
Using v2.1.0 (instead of v2.1.1) made it last roughly three times as long before it errored on me, but obviously it still happened. I am going to try the Studio driver; hopefully it doesn't affect performance for games or anything, otherwise I might just find an old driver that doesn't have this issue and downgrade to that.
Thanks for the posts.

I was having this GPU conversion failure on my 3070 on the latest NVIDIA Game Ready drivers.

I downloaded and installed the latest NVIDIA Studio driver, v462.31, and re-ran this 1080p Artemis v12 conversion on the 3070 - and it worked.

Looks like the latest Game Ready drivers are broken. Hope NVIDIA finds the problem before it makes its way into the Studio drivers in a future update!



The Game Ready drivers are only for gaming.

Yeah, I learned that myself. I always defaulted to the Game Ready drivers because they were pretty up-to-date and got the most support, or so I thought. I now default to the Studio drivers because you can still game with them while also getting the benefits for your content creation programs!


Seems to be that way alright. I think people are scared to install Studio drivers because they (like me) were not exactly sure what they are, or whether they are completely different and don't work for normal "gaming" too. But it seems they are just a previous stable build of the drivers, tested more thoroughly and given longer for bugs to surface. In other words, they are not barely-tested builds pushed out quickly to fix bugs found in some new game, but more stable NVIDIA driver builds.

Yeah, I remember reading that somewhere before, most likely a Reddit post. I'm glad I made the switch!

I did not know this was unclear.

I only have Pro GPUs (since 2014) and do gaming with them as well, have never had a downside from them.

Yes, it is very unclear. NVIDIA has been muddying the waters for years now about what exactly the differences are between the "gaming" drivers and the "professional" drivers.

Making it out that the pro drivers were completely different and render things "better" than the gaming drivers… extra floating-point precision, better anti-aliasing… all sorts. Maybe marketing bullshit to try to sell the same GPU for four times the price, maybe not; hard to tell. Point is, they have definitely confused everyone with their pro-line cards and drivers.

Don't they even intentionally cripple certain operations in the gaming GPUs/drivers to artificially make the professional cards look faster at those operations?

Quadro cards have so many advantages compared to GeForce cards.

  • Much lower power consumption, very good for multiple cards or even a single card
  • Stable core clocks, and they can even boost higher for longer
  • Low temperatures + fan noise
  • Small form factor
  • More CUDA + Tensor cores
  • Specific drivers that stabilize workloads without any power or thermal issues
  • Optimized for intensive workloads
  • Massive amounts of VRAM
  • NVLink support for up to 2 or 4 cards

Unfortunately, the price is high, but it's worth it, trust me. I've used enough GeForce cards to know that Quadro cards are better for computing and professional workloads like this. People can go ahead and yell at me about the price and performance per dollar. But it is what it is.


Well, right now the Quadros are lagging big time. The current Quadro RTX 8000 is still using Turing, which is a disadvantage. They are workhorses, but are known to be slower while causing fewer problems. I have no complaints with my 3090 and the Studio drivers. I can still game with the few titles that I play, but the card is used more for production. Stick with the Studio drivers and you can't go wrong.

The RTX A6000 is the latest model. And it's faster than the RTX 3090 at 300 W.

Sorry buddy, but you're wrong: https://gpu.userbenchmark.com/Compare/Nvidia-RTX-3090-vs-Nvidia-Quadro-RTX-A6000/4081vsm1300600

I see what you're saying about it being faster. There are some areas where it's a tad faster, but it's right there, and you are right about the power consumption.

Nobody trusts this site, man. Never use this site for benchmarks. Never.

It's always faster. I've researched the RTX A6000 for 2 months. Except for gaming, it's faster than the RTX 3090 in every case.

Well, that site leans toward gaming, and the Quadros fall on their faces there, so the 3090 will win on that "nobody trusts that site" place. Bold statement. The fact is that if you have a multi-use machine, the GeForce cards are a good blend. But if you can afford a Quadro, you for sure don't need to be using Topaz; there are much better pro options out there.

I’m not sure if you know what you’re saying. This conversation is a hard pass for me.

I have two RTX 5000s and I know of no pro solution that's better.

They are no magic machines, just GPUs.

And you can also game with them; I've been using a P5000 for gaming since 2019.

And UserBenchmark… you could just as well ask your neighbor's cat about GPU performance; that would make about as much sense.

NASA is using Gigapixel too.


Trying this again on Game Ready Drivers 466.27 that came out 2 days ago.

So far we’re 1000 frames in without an error. Usually it’d fail at <300.
(this is 4K output)
I also couldn’t get Gaia to run on the known bad drivers which I am now able to run with the latest GRD.
I’ll edit this if it does end up failing but if not then you could probably assume the latest drivers are working again.

And generally, while you may not notice a difference between Game Ready and Studio drivers, sometimes you will. I only returned to GRD because I had some minor issues with Assassin's Creed Valhalla.

Edit: Scratch that. Right as I hit submit on this post it just failed.

Hello, it asks me to set less VRAM when I click to start the process, and I set it lower, but I still get the same error message. So I downgraded the app to 1.3.8 and the problem is fixed. Even then, 720x480 >> 1440x1080 at 0.70 fps is slow for me with 16 GB RAM, a 6 GB GTX 1660 Ti, and an i5-10300H, given I had similar results with a GTX 1050 3 GB, 8 GB RAM, and an i5-9300H. Also, it must be something in the new version of the program, because the old version of Gigapixel easily takes 4500 MB of GPU memory with no problem, so why not Video Enhance?

Also, the new version doesn't work at all with Dione TV deinterlacing. The output in the preview is identical to the source. I don't know what they broke, but I want them to fix it!!!
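For anyone else chasing these VRAM errors, it can help to watch actual GPU memory use while a conversion runs, rather than guessing from the app's error dialogs. Here's a minimal sketch that polls `nvidia-smi` from Python (assumptions: `nvidia-smi` is on your PATH; the `parse_mem_csv` helper is my own, not part of any Topaz or NVIDIA tool):

```python
import subprocess

def parse_mem_csv(line: str) -> tuple[int, int]:
    """Parse one 'memory.used, memory.total' CSV row, e.g. '4500 MiB, 6144 MiB'."""
    used, total = (int(field.strip().split()[0]) for field in line.split(","))
    return used, total

def query_vram() -> tuple[int, int]:
    # Ask the driver for used/total VRAM; csv,noheader drops the column titles.
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout.splitlines()[0]
    return parse_mem_csv(out)

if __name__ == "__main__":
    used, total = query_vram()
    print(f"VRAM: {used} / {total} MiB ({100 * used / total:.0f}% used)")
```

Run it in a loop (or with `watch`) during a conversion; if usage climbs to the card's limit right before the failure, it's a leak or over-allocation in the app/driver combination rather than your VRAM slider setting.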