The disk space needed for models is getting crazy. I installed Photo and noticed it copied about 13 GB of models from an existing Photo AI install, then downloaded about another 30 GB.
So if I keep Photo AI installed that’s 13GB of dupes and wasted space. I imagine Gigapixel will be the same.
When I compare Photo AI and Gigapixel AI model directories I see about 6.5GB of identical files. I expect that duplication will be greater with the Studio applications.
We need a storage scheme which avoids duplication of identical models.
When I create an image backup of my system drive (which includes 100+ installed applications), Topaz models account for fully one third of the size of that image, and it looks like with the Studio apps it will be one half. As well as de-duplication, we should be able to choose where to store models, to allow more selective backup of this huge amount of relatively easily recovered model data.
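For anyone curious how much overlap their installs have, here is a minimal Python sketch that totals the bytes in one model directory whose exact content also exists in another. The paths you pass in are up to you; nothing here assumes Topaz's actual directory layout:

```python
import hashlib
import os
import sys

def hash_file(path, chunk=1 << 20):
    """SHA-256 of a file, read in 1 MiB chunks so multi-GB models fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

def index_dir(root):
    """Map content hash -> one file path for every file under root.
    (Duplicates *within* root collapse to one entry; fine for this estimate.)"""
    out = {}
    for dirpath, _, names in os.walk(root):
        for name in names:
            p = os.path.join(dirpath, name)
            out[hash_file(p)] = p
    return out

def duplicated_bytes(dir_a, dir_b):
    """Total size of files in dir_b whose exact content also exists in dir_a."""
    seen = index_dir(dir_a)
    total = 0
    for h, p in index_dir(dir_b).items():
        if h in seen:
            total += os.path.getsize(p)
    return total

if __name__ == "__main__" and len(sys.argv) >= 3:
    print(f"{duplicated_bytes(sys.argv[1], sys.argv[2]) / 1e9:.2f} GB duplicated")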
I have practically the same experience with Photo. The data volume is/was very high.
In addition, I currently have no normal Internet connection: I am out in nature, trying to take some photos, and my only connection is via mobile phone (slow, and also capped by the provider’s data allowance). I (carelessly) tried to install Photo, hoping for a nice improvement to my images with the latest version. It took well over an hour, a huge number of models were still being downloaded, and in the end it did not install anyway, because my 15 GB limit was simply exhausted and I was left completely without Internet. If I had suspected there was so much data, I would have postponed the installation for a week, until I had a normal connection without data restrictions.
So I would like to ask a small question: would it be possible to state the maximum download size for each individual software package (Photo, Gigapixel, Video)? If I had known the total number of gigabytes, I would have postponed the installation and avoided the inconvenience. Simply add a number next to the download link indicating the maximum possible download size, nothing more.
When I update Topaz, it asks me to go through a complete installation flow including agreeing to terms and conditions, entering my password, etc. (I’m a Mac user). Frankly this is tedious and feels really outdated. I can’t think of another app I use that still requires so much interaction from me during updates. It used to be the norm, but now 99% of apps simply show an unobtrusive indication that an update will be installed on relaunch; those that still opt for letting users approve an update do so without a bunch of clicks, agreements, and passwords. Can Topaz modernize the update process too? It feels like this aspect of the app is far behind the advances made in other aspects.
Thanks so much to everyone for sharing your feedback!
We understand the concerns around model storage size, duplication across apps, and limited internet connections during installation. These are very real pain points, especially for users working in the field or managing backups across multiple Topaz apps.
We’ve noted all of your points and will be sharing this thread with the development team for their review. Your suggestions around de-duplication, user-defined model storage locations, download size estimates, and more flexible install options (such as on-demand model downloads) are all important considerations for improving the overall experience.
Thanks for taking the time to outline these so clearly and we appreciate you helping us make this application even better in the future!
Thanks so much for your feedback. Our team really appreciates you taking the time to share your experience with the update process.
You’re absolutely right that modern Mac apps have moved toward a more seamless and unobtrusive update flow, and we agree that reducing friction here would go a long way in improving the overall user experience.
We’ve shared your insight with the development team for further review as we continue to improve the installation and update process across our apps. Thanks again for helping us make things better!
Is there a version of any Topaz software that runs on a platform that wouldn’t support symbolic links? That way an install process that finds NO models could create the models directory, and any second installation (or, I suppose, update process) could check the size/version/etc before creating a shortcut to the models for its own use. This would confine the required changes to the installation routine.
As I understand it, symbolic links are also called “symlinks” in the Windows world. Google reassuringly tells me it’s the same on the Mac. And it seems the command to make one on macOS is the same as it used to be when all my machines ran Linux.
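To illustrate, here is a rough Python sketch of the installer-side idea. The paths and function name are hypothetical, and this is not Topaz's actual installer logic: the first install creates a shared store, and later installs just get a symlink instead of another full copy of the models:

```python
import os

def link_shared_models(app_models_dir, shared_models_dir):
    """Point this app's model directory at a shared store via a symlink.
    Hypothetical sketch: paths and policy are assumptions, not Topaz's."""
    if not os.path.isdir(shared_models_dir):
        # First Topaz app on this machine: create the shared store
        # and let this install populate it during model download.
        os.makedirs(shared_models_dir)
    if not os.path.lexists(app_models_dir):
        # Later installs get a symlink instead of a multi-GB copy.
        # Note: on Windows, creating symlinks needs admin rights
        # or Developer Mode enabled.
        os.symlink(shared_models_dir, app_models_dir,
                   target_is_directory=True)
    return os.path.realpath(app_models_dir)
```

An update would still need to check model versions before reusing the store, as suggested above, but that check lives entirely in the installer.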
At this time, Topaz Photo (and all previous versions of the application) does not support this behavior. Each installation currently maintains its own separate model files without shared linking between apps or versions.
We understand the value of a more efficient storage approach and have noted similar feedback from other users. While this type of support isn’t currently available, your insight is appreciated and will be shared with the team as we continue to explore ways to optimize model storage in the future.
Gigapixel and Photo together need 70 GB of space on my hard drive: an almost unheard-of amount, and mainly for the models. This makes me question the value of an application that is only marginally better at what it does than competitors who typically need less than 10% of that space for AI models. I am going on 20 years as a Topaz suite customer, having licensed at one time or another nearly every app you ever sold. I struggle to imagine a scenario where this bloat ultimately benefits users in the long run.
From a normal viewing distance that would be true; up close, however, it’s a whole different ball game. Pixel-peeping, Topaz renderings are far superior to anyone else’s, and apart from certain denoising situations Topaz is way ahead of its competitors in sharpening, upscaling, and general image enhancement.
In which case, they have developed the best AI models available, and if those models have to be large for me to achieve the results I require, then so be it.
That’s probably why Adobe turned to and chose Topaz, and not its competitors, when implementing third-party AI models within their products.
Yes, with the high cost and 50 GB of disk space used (more than I consume in a few days of photography), I am wondering if I am in for the long run. I’ve just run the latest update and, out of interest, looked at my disk space. Topaz is better than the other stuff, but one does wonder.
Edit: After I posted this it occurred to me that there might be a lot of old stuff as my installation was old and updated over and again.
Deleting the app files (the 50 GB) and doing a fresh install got back a lot of space. Painlessly. Mind you, I do have gigabit internet, so it didn’t take too long.
Since Gigapixel relies on generative AI, it is natural for its model files to be larger than those used in traditional upscaling tools. In general, larger models have more parameters and are trained on more data, which directly contributes to superior results.
If you’re familiar with the generative AI landscape, you’ll appreciate that Topaz’s models are actually quite compact by modern standards. The largest FP32 model is just 5.29 GB, while the FP16 versions come in at around 2.64 GB.
Keep in mind that other apps might not keep their models locally on the computer, which forces them to use the cloud constantly to process images. With our apps, a lot can be done locally, without using the cloud.
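The FP32 and FP16 figures above are consistent with a simple rule of thumb: halving the bytes stored per weight halves the model size. A quick back-of-envelope check, assuming file size is dominated by the weights themselves:

```python
# Back-of-envelope: model file size ~ parameter count * bytes per parameter.
fp32_gb = 5.29                  # largest FP32 model size quoted above
params = fp32_gb * 1e9 / 4      # FP32 stores 4 bytes per weight
fp16_gb = params * 2 / 1e9      # FP16 stores 2 bytes per weight
print(f"~{params / 1e9:.2f} billion parameters, ~{fp16_gb:.2f} GB in FP16")
```

That lands right around the ~2.64 GB quoted for the FP16 version, i.e. the half-size variant is the same network stored at half precision rather than a smaller network.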
Adding my vote for this. Right now Topaz Photo AI alone takes up over 60 GB of space on my SSD. When I investigate the directory, I seem to be seeing a lot of duplicate model files, as well as what are likely old model files that are no longer used. It would be really nice to have a proper model manager: some of the features that need models, like Dust and Scratch removal, I don’t use, so if I could simply not download those until I actually need them, it would be appreciated.
If we HAVE to have separate sets of models for each Topaz product (and I now only have Photo, Photo AI, and Gigapixel (both)), then we need to eliminate, or at least make manageable, the apparent re-download of every set on every update. My internet connection only runs at a nominal 6 Mb/s. It takes well over 24 hours to install an update. Heaven forbid that Microsoft issue a patch requiring a Windows restart while the Topaz update is running.
Or maybe it would be feasible to allow installs to pause and restart from the last (completed) download…
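Resumable downloads are standard HTTP: the client sends a `Range` header for the bytes it already has, and a server that supports it replies `206 Partial Content` with just the remainder. A rough Python sketch of the idea; the URL is hypothetical, and whether Topaz's download servers actually honor `Range` is an assumption:

```python
import os
import urllib.request

def resume_request(url, dest):
    """Build a GET request that resumes from however many bytes of
    `dest` already exist on disk, via an HTTP Range header."""
    req = urllib.request.Request(url)
    have = os.path.getsize(dest) if os.path.exists(dest) else 0
    if have:
        req.add_header("Range", f"bytes={have}-")
    return req, have

def resume_download(url, dest, chunk=1 << 20):
    """Append the remaining bytes of `url` to `dest` (sketch only)."""
    req, have = resume_request(url, dest)
    with urllib.request.urlopen(req) as resp, open(dest, "ab") as out:
        # 206 means the server accepted the Range; a plain 200 means
        # it is sending the whole file again, so start over.
        if have and resp.status != 206:
            out.truncate(0)
        while block := resp.read(chunk):
            out.write(block)
```

An installer that downloaded each model this way could survive an interrupted connection or a forced Windows restart without repeating completed files.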
Isn’t it possible to make the installation of models depend on what the client is processing? There is no point in everyone acquiring a gigantic volume of models without ever needing them for work. Separately, you can quite safely make different versions of the product, addressed to different users, respectively, a smaller volume of models. Just thinking.
Having had a couple of failures with the update to Topaz Photo 1.2, and a refusal to run 1.3.something, I downloaded the .msi installer to run a “repair”. The hope was that this would save me from downloading zillions of gigabytes of model files. No dice. So before I run the .msi to install 1.3.3, the checksum is as follows:
SHA256 hash of C:\Users\mike\Downloads\TopazPhoto-1.3.3.msi:
85acc0a9b72098c2c8a308c0249e448d913339519ca18d7a47787c9bf657e25d
Anyone have an idea of whether this is correct? I would suggest that Topaz publish this (or another (as long as it is identifiable!)) hash for all the products we download. It’s a comparatively low cost safety enhancement for the user community, and could save hours of download time when there is a kink in the download.
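For anyone wanting to check a download themselves once a hash is published, here is a small Python sketch; the `published_hex` argument is whatever value the vendor posts, and nothing below confirms the specific hash quoted above:

```python
import hashlib

def sha256_of(path, chunk=1 << 20):
    """SHA-256 of a (possibly multi-GB) file, computed in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

def verify(path, published_hex):
    """Case-insensitive comparison against a vendor-published hash."""
    return sha256_of(path) == published_hex.strip().lower()
```

On Windows, `certutil -hashfile <file> SHA256` or PowerShell's `Get-FileHash` produce the same digest without any scripting.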