Topaz Gigapixel v1.1.0 - v1.1.1

I can’t say whether 151 megapixels would have been enough to resolve the text in reality, and I can’t say whether 50 MP would have been enough either.
There is no 151 MP camera with autofocus that could work that fast.
Everything has limits, mostly physical ones.

This is one of those pictures I don’t usually show because it’s work for clients.


EOS R3 - 24 MP.




Wonder 2 - 151 MP

4 Likes

I just jumped into this thread, but is it true that downscaling the input to Wonder 2 before upscaling back gives better results?

I tried the same image twice: once downscaled to 50% of original size then 2x’d by Wonder, and again at 100% size with no down- or upscaling - and the downscaled process resulted in better output.

Am I going crazy?

1 Like

Yes.

Maybe :rofl:

It’s about density.

When an image is downsized, the remaining pixels become denser / sharper.

The model is able to work better with that.

And it’s a 4x-native model, not 2x.

So my downscaled 3766px images (from 6K) become 15000px images.
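As a rough sketch of that arithmetic (the 50% pre-downscale and 4x native factor come from the posts above; the function name and the 6000 px example are mine, purely for illustration):

```python
def downscale_then_upscale(long_edge_px: int,
                           pre_downscale: float = 0.5,
                           model_factor: int = 4) -> int:
    """Final long-edge size after pre-downscaling and a native-factor upscale."""
    dense = round(long_edge_px * pre_downscale)  # denser / sharper model input
    return dense * model_factor                  # size after the 4x model pass

# e.g. a 6000 px frame halved to 3000 px, then 4x -> 12000 px
print(downscale_then_upscale(6000))
```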


There are still some problems / bugs.

1 Like

So I made a little Windows utility for pre-processing images before uploading to Wonder:

For small images, you would use ‘Densify’ which uses this logic:

If longestEdge < 900 Then
    scale = 1.35
ElseIf longestEdge < 1400 Then
    scale = 1.25
End If

The utility uses FANT bilinear scaling and anti-aliasing to prep images, trying to hit the ‘sweet spot’ for the Wonder v2 model.

For larger images, (1400px and up) you would want to use ‘Downscale’ and set the target pixel size (example is 960px for a 1920px image).

You can process a single image with the single-image button, or an entire folder with the bulk button.

Example:

Original image → 780x1024 → run through ‘Densify’ in utility (scales to 1.25x) → feed into Wonder v2 at 2x with grain settings 3/3/1 → run output through utility again, downscale to 1920px → run output through high fidelity model @ 1x with grain settings 8/8/1 → done
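Tracing the sizes through that chain, using the ‘Densify’ thresholds quoted above (a sketch only; the 1.0 fallback for large images is my assumption, since the utility switches to ‘Downscale’ at 1400px and up):

```python
def densify_scale(longest_edge: int) -> float:
    # thresholds from the 'Densify' logic quoted earlier in the thread
    if longest_edge < 900:
        return 1.35
    if longest_edge < 1400:
        return 1.25
    return 1.0  # assumption: no densify at 1400px and up ('Downscale' instead)

w, h = 780, 1024                    # original image
s = densify_scale(max(w, h))        # 1.25 for a 1024 px longest edge
w, h = round(w * s), round(h * s)   # -> 975 x 1280 after Densify
w, h = w * 2, h * 2                 # -> 1950 x 2560 after Wonder v2 at 2x
target = 1920                       # downscale before the high-fidelity 1x pass
f = target / max(w, h)
w, h = round(w * f), round(h * f)   # -> 1462 x 1920 after rounding
print(w, h)
```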

Results are good so far, but I don’t have any shareable images yet.

Is anyone interested in the utility? Windows only. I have only tested with .jpg so far, don’t know if it would work with RAW formats.

3 Likes

Question for the Topaz Team:

What is the training timeframe for this model?

https://models.topazlabs.com/v1/irwn_dit-v1-fp16-128x128-rev4-ox.tz2

https://models.topazlabs.com/v1/irwn_dit-v1-fp16-128x128-rev4.onnx.data

It feels like a step up from Wonder v1—sharper images and fewer artifacts.

Also, I really hope Wonder v2 will be available for local processing. As a “skeptic,” I prefer not to upload my private gallery to the cloud. I firmly believe that data is only truly private when it stays on my local machine. :slight_smile:

2 Likes

Today I am getting the Super Special cloud rendering treatment!

Then Photoshop says this (we’ve seen this before):

Nope, at this point I can see the cloud Recover v3 is good:

…whereas the local one isn’t:

Original image:

There’s also that cropping bug in the cloud processing here (in Gigapixel v1.1.1), hence I used [Windows]+[Shift]+[S] combo to capture a part of screen.

Another example:

Recover v3 strong from your cloud (with crop bug, unfortunately):

Local (I used a screenshot to show the whole program):

When you fix the local Recover v3, it’ll be a GREAT tool!

I also encountered a bug with failed preview (and export) of 1.08x upscale (1536 pix at longer side) with Redefine with 0 Creativity. I had to use 2x upscale to export. That’s the image I tried:

  1. While Recover V2 is currently the only model that features a built-in downscaling option, any image can be downscaled with any model using decimal values less than 1 in the scale factor parameter.


    In this example, a scale factor of 0.5x results in a 50% reduction. Downscaling with different models will produce different results, so experiment!

  2. If an image is greater than 1000px on both sides, doing a 1x pass with pre-downscaling applied in Recover V2, then importing that result for processing with Wonder V2 after can create great results on old film and scanned prints.

  3. The sweet spot with Wonder V2 is 4x.
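The scale-factor arithmetic from point 1, as a minimal sketch (the function name and example sizes are mine, not Gigapixel’s):

```python
def scaled_size(width: int, height: int, scale_factor: float) -> tuple[int, int]:
    """Values below 1 downscale: 0.5 gives a 50% reduction, as in point 1."""
    return round(width * scale_factor), round(height * scale_factor)

print(scaled_size(4000, 3000, 0.5))  # 50% reduction
print(scaled_size(1200, 800, 4.0))   # the 4x Wonder V2 sweet spot
```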

:raising_hands:

3 Likes

Nice tip, thanks!

1 Like

A post was split to a new topic: Cloud Renders to be more clear in file naming

I haven’t tried downscaling with other models yet.

But I definitely need to do that.

In my experience, when other models had modified the images beforehand, the pixels merged together and the effective resolution decreased, which led to poor enlargement results.

If you make pre-downscaling available for all models, please include different interpolation algorithms such as Bilinear or Lanczos.
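For anyone curious how those two algorithms actually differ, here is a minimal sketch of their filter kernels (plain Python, purely illustrative; Gigapixel’s internals are unknown to me). The negative lobe in the Lanczos kernel is what helps preserve edge sharpness when downscaling.

```python
import math

def bilinear_kernel(x: float) -> float:
    """Triangle (tent) kernel used by bilinear interpolation."""
    return max(0.0, 1.0 - abs(x))

def lanczos_kernel(x: float, a: int = 3) -> float:
    """Lanczos windowed sinc; wider support (+-a pixels) keeps more detail."""
    if x == 0.0:
        return 1.0
    if abs(x) >= a:
        return 0.0
    px = math.pi * x
    return a * math.sin(px) * math.sin(px / a) / (px * px)

# Weights each filter gives to a neighboring pixel at distance d:
for d in (0.0, 0.5, 1.0, 1.5, 2.5):
    print(f"d={d}: bilinear={bilinear_kernel(d):+.3f} lanczos={lanczos_kernel(d):+.3f}")
```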

Yes, I think that would be a welcome improvement.

I’m still busy with quality control for Wonder 2.

I wonder how it works; it’s so crazy.

1 Like

So Wonder v2 is actually not “Wonder” at all - it’s “Bloom Precision” according to the Topaz servers.

Perhaps it should be renamed in Gigapixel to what the servers actually call it?

Batch processing is still an issue for me. I sent 31 images for processing in the cloud using Redefine. ETA was about 20 minutes. This morning, more than 8 hours later, 6 had still not been processed and it was all stuck. Since then processing continues to be intermittent.

What type of images, and what scale option did you select? If the output is above 256 MP it will fail.

These were png files and I was upscaling them approx 2x to a height of 2200px.

Seems like this should work. Is it random images that are failing, or always the same ones? If you send one that failed in a second batch, will it work?

I sent the ones that had not processed overnight again, some individually, some in small batches. This worked. So it’s not specific images that cause the problem as far as I can tell.

It is still happening, even with a single file (a JPG this time).