The Preview facility in Gigapixel 8's restore function allows small areas to be previewed to see how they might look. If I create a small selection in Photoshop as a new layer via a selection tool and then try to restore that layer, the software does not detect the “empty” pixels but restores the entire image. Surely the ability to select small areas for restoration using this method should simply mimic the preview facility.
If it worked as it should, users could restore the parts of an image that are important rather than huge areas of sky or other landscape elements where restored detail is not an issue.
I have noticed that the image brought back into Photoshop is cropped and in a different position as a result. Is this me, or the Gigapixel/Photoshop software? It doesn't matter whether the layer is rasterised or not, the same thing happens.
** Fixed ** That was me, I had to match the resolution of the image to the DPI set in Gigapixel!
My woes continue. Whilst from a distance the restoration creates more definition, the process introduces too many artifacts back into the image, very pixelated and too noticeable to use! Not sure whether this is down to the model or not.
If you’re still asking about the Generative AI Models (Recover & Redefine), then I’m afraid these models were only designed for low-resolution digital images, not analogue images.
Topaz recommend using the Basic Models for larger images.
I have put Topaz’s own information about these models below, and I’ll also post the link to the original document at the bottom.
Recover
This model brings the best fidelity in upscaling for extremely low resolution images.
Use images that are 1000px or smaller to get the most optimized result.