[Redefine] Cases

If I try to do anything with the new Redefine model in Gigapixel 8, the application just crashes back to the desktop. Windows 11 with an RTX 4090 and 64 GB of RAM, so well within spec. Any ideas, anyone?

Gary

I’m on Windows 11 with the most recent update, and Gigapixel 8 crashes shortly after I open an image. I get an “initializing” message, and a few seconds later it drops back to the desktop. Gigapixel 7.4 runs fine.

I am having the same type of problem. I have tried running the Redefine model multiple times, both as a plug-in to PS 26.0.0 on a Windows 11 platform and as a standalone app, but it does not complete processing. The progress bar advances almost to the end and the spinning processing icon keeps spinning, but nothing happens; after 10 minutes I gave up. I tried processing both a TIFF image and a JPG in the standalone version, but neither would complete.

Under macOS Sequoia, the Redefine model processes a preview, but when saving back to Lightroom Classic, processing continues indefinitely and never completes. The computer fan runs until I cancel the process, so I can’t see or evaluate the result and have to exit without saving.


Hi,
Gigapixel 8 is a mind-blowing update, but when I run it locally on my RTX 4090, white dots (pixels) very often appear in the image, which I then have to remove in Photoshop. It happens with the “old” models as well as the new generative AI models. I hope you can fix this issue in future updates.

Kind regards, Andreas

It looks to me like the new Redefine model works best at 4-6x the original resolution. The problem I’m running into is that when output images pass a certain resolution (somewhere around 50 MP), the program will not process the image when exporting. Looking at Task Manager, the process starts and the GPU spikes for a short while, less than a minute, then drops to 0% and stays there, and nothing else happens.

Perhaps with large files it quickly runs out of memory to work with and stalls?

It seems their code may be good enough to not crash, but not nice enough to warn you why it stopped.

Possibly, but checking Task Manager, my system still has 60% of its VRAM and RAM available, so I don’t think that is the problem.
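
Task Manager samples fairly coarsely, so a brief allocation spike during the export could be missed. If anyone wants to double-check, here is a rough sketch (not a Topaz tool, just a generic Python loop around nvidia-smi, assuming a single GPU) that logs VRAM once a second while the export runs:

```python
# Rough VRAM logger: polls nvidia-smi once per second so brief allocation
# spikes during a Gigapixel export show up. Needs the NVIDIA driver's
# nvidia-smi on PATH; assumes a single GPU; stop with Ctrl+C.
import subprocess
import time

while True:
    out = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout.strip().splitlines()[0]
    used_mib, total_mib = (int(v) for v in out.split(","))
    print(f"{time.strftime('%H:%M:%S')}  {used_mib} / {total_mib} MiB")
    time.sleep(1)
```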

Workaround: Preview entire image first, then export.

Hello, I am trying the beta redefine model.
When I execute it, the output is a black image (even the export is a black image).
Config:

  • Intel i7 14700K
  • Intel Arc A770 16GB

Hello,

When I click “Preview entire image” using the Redefine model, it calculates the remaining time, goes to “finishing up,” and then gets stuck there for quite a while, regardless of the image.

I’m running an M3 Max 16″ MacBook Pro with 36 GB of RAM.


I just tried Recover. Initially it estimated 7 minutes, but soon the estimate grew to over an hour for a roughly 1.3-megapixel source image. It also works exactly once per program run, and only works again after a force-quit.

When I change the creativity level to the next setting, it does not generate a new image but reuses the old one. The same thing happens when I try to compare images rendered at different levels.

I placed a PNG image into V8 and selected Redefine with no enlargement, but when it finished, I had lost my transparent background and ended up with a white one. Topaz Labs have rushed this version out knowing they have problems.

I don’t think Topaz’s generative AI models can support transparent backgrounds.

Other generative AI models also have this limitation, and users have to add the transparency back to the generated files using extensions or other tools like Photoshop.
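
For what it’s worth, if the source PNG still has its alpha channel, one way to get the transparency back (just a sketch using Pillow, not a Topaz feature; the file names are placeholders) is to scale the original alpha up to the output size and attach it to the upscaled image:

```python
# Re-apply the original PNG's transparency to an upscaled/Redefined output.
# Assumes the output has the same framing and aspect ratio as the source.
from PIL import Image

source = Image.open("original.png").convert("RGBA")
upscaled = Image.open("redefine_output.png").convert("RGBA")

# Scale the source alpha channel to the output size and attach it.
alpha = source.getchannel("A").resize(upscaled.size, Image.LANCZOS)
upscaled.putalpha(alpha)
upscaled.save("redefine_output_transparent.png")
```

Hard edges survive this fine; soft or newly generated edges around the subject may still need manual cleanup in Photoshop.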

The full image is 4032x3024, a 10-bit HEIF from an iPhone 15, and I’m attempting a 2.5x enlargement. A small preview (about 1/90th of the full image) takes 25 minutes to render; repeating that roughly 90 times, I estimate 33 hours for a full render.
Gigapixel is using 16 GB of RAM (plus 5 GB of swap at ~400 MB/sec); I assume it’s using the 16-core Neural Engine, as the CPU and GPU don’t look too taxed.
Managing expectations with accurate estimated times would be very helpful.
MacBook Air M2, 16 GB RAM
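
Just to show the back-of-the-envelope math (my numbers are approximate, the 1/90th tile fraction especially):

```python
# Rough full-render estimate from a single preview tile (approximate inputs).
preview_minutes = 25   # observed render time for one preview tile
tiles = 90             # roughly how many preview-sized tiles cover the image
print(f"~{preview_minutes * tiles / 60:.0f} hours")  # ~38 hours, same ballpark as my ~33-hour guess
```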

Hi,

I processed one image that worked fine. Other images came out with a strange-looking background, and then I started getting images with white backgrounds. A guy at Topaz is looking into the problem, and he did not mention that you cannot use transparent backgrounds.

Cheers

Stephen

In my experience the regular upscaling models in Gigapixel do work with transparent backgrounds. Only the generative models (Redefine and Recover) are likely to have this limitation, but maybe they can work around the problem.

14-hour test
As a test I reduced my image by half to 2016x1512 (a quarter of the original megapixels), then ran Redefine at 2x, creativity 4, texture 3, for an output size of 4032x3024 (my original image size).
The process took about 14 hours and seemed mostly memory-bound: Gigapixel was using 15 GB of RAM consistently, with the CPU and GPU running at 25-30%, plugged in on standard power mode. The system was usable during that time for browsing the web, but YouTube videos would hard-freeze (black video screen) about every 5 minutes and the page would need to be reloaded to continue playing.
In a way this is pretty amazing, as I’m using a MacBook Air M2 (4 efficiency + 4 performance cores) with 16 GB of memory. About 5 GB of memory swap was in constant use at about 300 MB/sec, so the fast SSD helped a bit. This laptop has no fan and was just slightly warm.
Cloud processing would have cost 6 credits ($1.50 each when buying by the credit) and taken 4 minutes.

We are investigating slower-than-expected processing speeds with our Beta Redefine model, and I apologize for the slow times. We hope to continue to improve the speed and performance going forward, but unfortunately this can be expected when interacting with Beta models that are so resource-intensive.

We provided free cloud credits to give users access to the model while we continue to improve it.