Gigapixel v8.4.2

Yes, the error occurs in this case as well. I also tried downloading the installation file from here and running it separately; the same error occurs.

Upd. I also checked the installers of other versions. The 8.3.3 installer runs fine, but starting with 8.3.4 the AVX error occurs.

Upd 2: The problem was resolved after installing the latest Windows updates.

1 Like

My System Specifications

  • GPU: NVIDIA RTX 5080 16GB (OC Edition)
  • RAM: 96GB DDR5
  • OS: Windows 11 (Fully Updated)
  • Topaz Gigapixel Version: 8.4.2
  • All drivers (GPU, chipset, system) are up-to-date

When using the Redefine Creative (Beta) mode to upscale a 2048×2048px image to 4096×4096px, the process takes approximately 2–3 minutes per image, which is unexpectedly slow given the system’s specifications. During processing, GPU utilization appears low, suggesting the hardware resources are not being fully engaged.

With a high-end setup like this, a single upscale task of this size should ideally complete within 15–30 seconds, or at least utilize GPU capacity more effectively. The current processing time is a bottleneck, especially for batch operations.

Please investigate potential performance limitations in the Redefine Creative (Beta) pipeline. Optimization for modern high-performance GPUs such as the RTX 5080 would significantly improve usability for professional workflows that rely on speed and batch processing.

If this level of performance is considered normal for a high-end GPU like mine, I can’t imagine how long the process must take for users with mid-range or older graphics cards.

Redefine Creative takes 11 min on my 2070 Super (8GB GDDR6, 460.8 GB/s bandwidth).

I have my AI Processor set to Auto and it uses the GPU at 100% power and 7.2 GB of GPU memory (1.1 GB to run Windows and GPAI and 6.1 GB more for processing the image).

The CPU is at 3% and of the 32 GB of DDR4 RAM it uses 2.4 GB to run GPAI, and an additional 3.6 GB when processing the image.

Any of the legacy Upscaling models take just 2 seconds.

Gigapixel / Standard / Face Recovery / Gen. 2 / Realistic removes the earring in the image, which is not very Realistic. Significantly reducing the Strength suppresses this, but it also weakens the correction of other defects too much. I rarely use Face Recovery, so I have little experience with it. The Gen. 1 version (almost) does not have this defect.

Question (not related to your Recover Face observation/comment): Did you Sharpen All vs just your subject?

I ask because it looks like there are a lot of ‘worms’ (my term…) in the After background that implies the lack of details there got sharpened. If you like it, that’s certainly cool. Did you experiment with just sharpening the subject at all?

I keep wishing we could pull Sharpen controls to the left (beyond a zero point) in order to blur where desired.

This image was sharpened overall, not just the subject (person). If you mean the very slanted (almost perpendicular) white lines, those are blurred raindrops (it drizzled). The background is just forest, hills and sky. The photo is not important, just something from a trip in bad weather, so I didn’t play with it in any way.

I sometimes use sharpening just the subject, for example a bird, where the background is not interesting or distracting and the bird is not well focused. Blurring the surroundings or background would sometimes be useful for me, including the ability to select only a part of the image to blur.

I’m not sure if this answers your question.

2 Likes

You answered the question!

I didn’t know if you had tried to keep the background softer and it still sharpened overall, since it looked like the sky areas got sharpened too.

Thx!

If you want to examine the worms… :laughing: I’m attaching the original and the edited version with the sharpening:


1 Like

For those who may have wondered what became of all of those high-Creativity Gigapixel Redefine renders I was doing months ago: You can grab a sample pack with 78 full-res renders representing various subjects and also see organized PDF thumbnails of all of the final 6600 images.

5 Likes

Thank you!

It sounds like this is a bigger set than the earlier one I grabbed.

I had fun using some of the 1st set as texture overlays that I mostly used with blend modes or lowered opacity to blend composite elements.

Thx!

Yeah, because sometimes the sharpening causes certain soft elements to look squiggly, I started calling them worms! :upside_down_face:

Glad you like them! This sample set is more varied (ie, not just textures) and represents each subfolder of the new configuration.

1 Like

Hello!

Our team would like to inquire further.
Please share the following details in a message addressed to help@topazlabs.com using the subject line “Gigapixel GPU Utilization”.

Attach your application logs:

  • Launch Gigapixel and go to Help > Open log folder
  • Look for the Logs folder, right-click, and compress it
  • After compressing, attach that .zip file to your reply
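The compress step can also be scripted if right-click > Compress is unavailable. A minimal Python sketch (the demo folder below is a stand-in; point `zip_logs` at the real folder that Help > Open log folder shows you):

```python
import os
import shutil
import tempfile

def zip_logs(log_dir: str, out_base: str) -> str:
    """Zip log_dir into out_base.zip and return the archive's path."""
    return shutil.make_archive(out_base, "zip", log_dir)

# Demo with a throwaway folder standing in for the real Logs directory
tmp = tempfile.mkdtemp()
with open(os.path.join(tmp, "gigapixel.log"), "w") as f:
    f.write("sample log line\n")

archive = zip_logs(tmp, os.path.join(tmp, "Logs"))
print(os.path.basename(archive))  # → Logs.zip
```

The resulting `Logs.zip` is what you attach to the reply.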

Attach your system profile:

  • Press Windows + R on your keyboard to open the Run dialog
  • Type dxdiag and press Enter
  • Wait for the progress bar in the DirectX Diagnostic Tool to complete
  • Click Save All Information at the bottom
  • Choose a location to save DxDiag.txt and click Save
  • Attach the file to your reply
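As an aside, dxdiag also has a documented non-interactive mode (`dxdiag /t DxDiag.txt`) that writes the report without the GUI. If you want to quickly confirm which GPU the report recorded, a small sketch that pulls the `Card name:` entries out of a DxDiag.txt dump (the sample text below is illustrative, not a real report):

```python
def card_names(dxdiag_text: str) -> list[str]:
    """Extract the 'Card name:' entries from DxDiag.txt contents."""
    names = []
    for line in dxdiag_text.splitlines():
        line = line.strip()
        if line.startswith("Card name:"):
            names.append(line.split(":", 1)[1].strip())
    return names

# Illustrative fragment of a DxDiag 'Display Devices' section
sample = """
Display Devices
---------------
          Card name: NVIDIA GeForce RTX 5080
"""
print(card_names(sample))  # → ['NVIDIA GeForce RTX 5080']
```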

By sending these details to help@topazlabs.com, someone on the team can follow up and investigate.

Thanks :folded_hands:

At least 18 minutes for a single image with my AMD RX 6800 XT.

18 min is a bit surprising, in that the AMD RX 6800 XT has quite a bit more raw power than my Nvidia 2070 Super. I suppose the AI processing is just not optimized for AMD hardware.

The 2-3 minutes experienced on a 5080 seems just about proportional to the difference in power with my old 2070 Super.

AI chokes on the AMD. Can’t run the fancier AI models locally for anything that will have an end size over 1K on the longest edge.

These last replies in the conversation tickle my interest a bit…
I’m bored and want to take my RX 9070 XT for a spin. :grin:
@ferdictn1 What settings did you use? (apart from dimensions and module)

Creativity? Texture?

Creativity Low Texture 3 was 11 min with the tif texture (attached) I used.
Realistic None was 8 min on the same image.
Auto was maybe 10 seconds faster than GPU only.
It refused to try with CPU only.

Test Image.zip (2.6 MB)

1 Like

Remember I said I was bored?

I picked a random photo from a couple of months ago, cropped out 2048 and opened it in Gigapixel. Here are my findings with the RX 9070 XT:

Redefine Creative 2048 → 4096 (texture settings 1–5 gave identical times at every creativity level):

  • low creativity - 1min43s
  • medium creativity - 2min1s
  • high creativity - 2min15s
  • max creativity - 2min36s

The seconds should be read as approximate (± 1s).

It surprised me that the texture slider didn’t have an impact on the time it took to finish a job.
I also noted that GPU utilization was good, although it didn’t use much of the memory (maybe that varies more with the size of the picture you’re processing than with the settings used; what do I know?).
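Converting those timings to seconds (my conversion, and texture is excluded since it made no difference), the relative cost of each creativity level works out like this:

```python
# Seconds per job from the RX 9070 XT runs above
times_s = {"low": 103, "medium": 121, "high": 135, "max": 156}

base = times_s["low"]
for level, t in times_s.items():
    print(f"{level:>6}: {t:3d}s  ({t / base:.2f}x low)")
# max creativity costs about 1.51x the low-creativity time
```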

Cropped original is this - nothing special really, but I am keeping a different crop of it as my phone background for the moment:

I’m sharing the results here for anyone interested:

@ferdictn1 I hope the people at Support find something in your case to shave some time off your processing, but for the moment I have to say I’m not displeased with having gone for AMD instead of Nvidia, at least not when comparing results. I was actually looking at the 5080 but was put off quite a bit by the pricing and decided to roll the dice with the RX 9070 XT instead.

1 Like

With your image on my RX 9070 XT, processing times were exactly-ish (±1s) the same as with my photo using “Redefine Creative”. I tried Low 3 and Medium 3; both came out after about 1min43s and 2min1s respectively.

With “Realistic None” your image came out in 1min19s

Br.