I have conducted several dozen tests, and the quality of images generated locally using the RTX 5080 graphics card (Redefine - Creativity 1) is consistently more distorted than what I obtain when sending the photo to Cloud Render… Is this a deliberate tactic by the company to encourage credit purchases, or are we using outdated AI models locally?
The local Redefine model is definitely a smaller model designed to run on consumer GPUs with lower VRAM requirements.
Only the devs can say how much larger the server model is, and how difficult it would be to run locally. They may already make it available to enterprise customers.
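For a sense of scale, here is a rough back-of-envelope sketch in Python (the parameter counts are purely hypothetical, since the real model sizes aren’t public) of how much VRAM a model’s weights alone would need:

```python
# Hypothetical parameter counts, chosen only to illustrate the scaling;
# the actual sizes of Topaz's local and server models are not public.
def weights_vram_gb(params_billions: float, bytes_per_param: int = 2) -> float:
    """VRAM (GB) needed just to hold the weights at fp16,
    ignoring activations, tiling buffers, and framework overhead."""
    return params_billions * 1e9 * bytes_per_param / 1024**3

for label, params in [("smaller local-style model", 1.5),
                      ("larger server-style model", 12.0)]:
    print(f"{label}: ~{weights_vram_gb(params):.1f} GB of VRAM for weights alone")
```

Roughly 3 GB versus 22 GB in that made-up example: the first fits comfortably on a 16 GB card like the 5080, the second does not, even before activations are counted.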
That’s true, but still, cloud rendering is (at least nearly) always better than local rendering.
So different models are likely being used. And then, of course, the question is whether those cloud models simply cannot be run locally, or whether Topaz just doesn’t want to offer them…
Topaz have assured us that there is no “conspiracy” to make cloud renders better than those done locally in order to persuade people to shell out more money. I’m inclined to believe them.
I have been doing local renderings of my old low-res digital JPGs from more than 20 years ago. Local renderings take up to 40 minutes on my M4 Mac. I have done a few in the cloud to compare, and in this example at least, there is one area that was rendered better locally, namely the statue of the Virgin on the cross. The cloud rendering produced more detail on one of the distant hills, but not on the adjacent volcano, where the result was about the same.
For me, the local render is better overall.
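If anyone wants to put a number on the difference instead of just eyeballing it, here is a minimal sketch (the file names are placeholders; it assumes both exports are the same resolution and that Pillow, NumPy and scikit-image are installed). Bear in mind that PSNR/SSIM only say how much the two renders differ, not which one looks better:

```python
import numpy as np
from PIL import Image
from skimage.metrics import peak_signal_noise_ratio, structural_similarity

# Placeholder file names: substitute your own exported results.
local = np.asarray(Image.open("render_local.png").convert("RGB"))
cloud = np.asarray(Image.open("render_cloud.png").convert("RGB"))

# PSNR: higher means the two renders agree more closely pixel-for-pixel.
print("PSNR:", peak_signal_noise_ratio(local, cloud))

# SSIM (needs scikit-image >= 0.19 for channel_axis): values near 1.0
# mean the structure (edges, textures) of the two renders is similar.
print("SSIM:", structural_similarity(local, cloud, channel_axis=-1))
```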
We always do our utmost to make technologies available locally. However, one factor is that vastly more powerful servers unlock great experiences that cannot be achieved locally.