Alright,
I haven’t seen any example use of the iOS Gigapixel app, so I’m going to share my experience, along with a quick comparison with the desktop version (I tried both the Mac and PC versions and saw no difference). This isn’t exhaustive, just my first impressions.
Hopefully the attached PNG makes sense (from left to right: A – original low-res image → B – desktop results → C/D/E – app results).
I took some video and images on my iPhone at a VERY dark, small concert. Needless to say, the “best” usable images were still frames from the video. After exporting a still frame, the resolution and quality were too low (see Column A).
Since I was traveling, I downloaded the Gigapixel app for iOS and uploaded the image using the “Pro” Everything option, which processes in the cloud. Column E was the result, and I was floored. It’s not perfect, but for cropping in and making some Lightroom adjustments, it was more than I expected.
So here is the conundrum/confusion; I’m hoping to get some comments from Topaz.
Back at my desktop Mac running Gigapixel 8.2.2, I tried a variety of models, settings, etc., and the best I could get was similar to Column B. (If I use creative settings that are too strong, it goes sideways fast.)
Just to verify, I sent the original image back to the cloud from the iPhone app and tried a couple of different options in the app settings (which aren’t very adjustable); the results are Columns C and D.
Questions/Observations:
It seems that cloud processing through the app uses “better” models for the generative upscaling? Also, there is no control over the size of the final output, so that plus the generic settings leaves little room for adjustment.
How can I get similar-quality generative enhancement and upscaling in the desktop app? Why is there such a huge discrepancy? It seems like the desktop app should be able to get much closer to the iOS app result.
Thanks,
Norbert