Gigapixel AI 4.1.2 - Preview does not match output

I’m on deadline and in desperate need of a 4.5x enlargement of a heavily compressed, low-res JPG original.
What I see in the preview is exactly what I need, but none of the outputs I’m getting match what’s in the preview window. I’ve gone through all the options and every variation I can think of to make this work, and not a single one outputs like the preview.

Here’s a screenshot showing the preview window in Giga vs the output TIF opened in Ps… it’s a HUGE difference!


There are 2 things that you can try:

  • Switch to CPU processing to see if that makes a difference.
  • Try the enlargement in 2 stages, e.g. 2.25x and 2x (see the sketch below for the arithmetic)
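
If it helps to see how the staged sizes work out, here is a minimal sketch using Pillow’s plain bicubic resize — not Gigapixel’s AI models, just the arithmetic — showing that 2.25x followed by 2x lands on the same 4.5x final size. The file names are hypothetical:

```python
# Staged-resize arithmetic sketch with Pillow's plain bicubic resampling.
# This is NOT Gigapixel's AI upscaler; it only shows that 2.25x then 2x
# reaches the same ~4.5x final pixel dimensions as a single 4.5x pass.
from PIL import Image

img = Image.open("original.jpg")  # hypothetical input file
w, h = img.size

stage1 = img.resize((round(w * 2.25), round(h * 2.25)), Image.BICUBIC)
stage2 = stage1.resize((stage1.width * 2, stage1.height * 2), Image.BICUBIC)

stage2.save("staged_4.5x.tif")  # 2.25 * 2 == 4.5, so ~4.5x of the original
```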

There is also a beta available in the cloud (online) where you could test whether there is a difference; the thread is at:

Thanks Don, I did actually try the Cloud version earlier this afternoon and the results were basically the same, except that version converted my sRGB to aRGB and added about 20% more saturation to the image. There’s also no NONE setting for either Noise Suppression or Remove Blur, neither of which I want in this case.
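
In case anyone wants to verify that kind of color shift on their own files, here is a rough Pillow sketch (file names are hypothetical, and the saturation figure is just a crude HSV-channel mean, not a calibrated measurement) that prints each file’s embedded ICC profile and average saturation:

```python
# Rough check for a profile swap and saturation jump between two files.
# Assumes Pillow; file names below are hypothetical stand-ins.
import io
from PIL import Image, ImageCms, ImageStat

def describe(path):
    img = Image.open(path)
    icc = img.info.get("icc_profile")
    name = "none embedded"
    if icc:
        name = ImageCms.getProfileDescription(
            ImageCms.ImageCmsProfile(io.BytesIO(icc)))
    # Mean of the HSV saturation channel as a crude "how saturated" number.
    sat = ImageStat.Stat(
        img.convert("RGB").convert("HSV").getchannel("S")).mean[0]
    print(f"{path}: profile={name.strip()!r}, mean saturation={sat:.1f}/255")

describe("original.jpg")      # expect sRGB (or untagged)
describe("cloud_output.tif")  # reportedly aRGB and noticeably more saturated
```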
With the staged scaling, I had tried that before, and while the preview was a little more accurate, the end result was nearly identical to the results I didn’t want. The 4x+ enlargement looks remarkably good in the preview, and that’s the output I need for this image.
Looks like CPU rendering is the winner! The output is much closer to the preview and the results are 100% better than GPU rendering. Is this working as intended? Shouldn’t the output be identical for both renders? Why isn’t there a warning in the prefs about GPU not matching the preview and having worse quality?
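
For anyone who wants to put a number on how far each render is from the preview, here is a quick sketch assuming you’ve saved a screenshot crop of the preview and matching same-size crops of the GPU and CPU outputs (all file names hypothetical):

```python
# Quantify "does the CPU render match the preview better?" with a simple
# mean absolute pixel difference. Crops must share the same pixel size.
import numpy as np
from PIL import Image

def mean_abs_diff(path_a, path_b):
    a = np.asarray(Image.open(path_a).convert("RGB"), dtype=np.float32)
    b = np.asarray(Image.open(path_b).convert("RGB"), dtype=np.float32)
    assert a.shape == b.shape, "crops must be the same pixel size"
    return float(np.abs(a - b).mean())  # 0 = identical, 255 = maximally different

print("GPU vs preview:", mean_abs_diff("preview_crop.png", "gpu_crop.png"))
print("CPU vs preview:", mean_abs_diff("preview_crop.png", "cpu_crop.png"))
```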
Thank you for your time and advice.

The issue with GPUs is that there are NVIDIA, Intel, and AMD cards; NVIDIA also splits its drivers into Creator and Game Ready versions, and integrated GPUs use system RAM, so some people do have rendering issues, especially with older and less powerful GPUs. The processing is also completely different, as GPUs don’t have to manage I/O tasks — they only execute parallel computational tasks …

There is no exact science here, but rendering is always a lot faster with a GPU, and in some cases the CPU can do a better job because it is slower and system-interrupt driven.

I note that you have used “use maximum AI models”. On the NVIDIA discussion thread (or it may have been on another AIG thread?) I recall some posters finding that Gigapixel gives better output if you set “use maximum AI models” to “No”.

Might be worth a try if you have not done so?

Odd, because when I use the Cloud AI version, I actually get less saturation. But worse, the colors are all off. I have attached a 6X conversion (low noise reduction, low blur reduction); notice how the blue sky becomes purple, and the foliage is pretty bland.

I was going to say the same thing. I use Max Quality = No and get the best results at much faster speed. (AMD Radeon GPU)

Things are looking worse than ever in v4.4.5 with the output being even further away from the preview than in 4.1.2.