Yes, I know I can open multiple images, but the problem is that A.I. Gigapixel slows to a crawl once the number exceeds a few hundred.
Try loading 15,000 images… it’s too much for A.I. Gigapixel and it simply chokes.
I’m not sure why, but it seems to be updating some internal list each time a new image is loaded (when you load multiple images in one go), and it appears to walk the full list for every single image, which makes loading slower and slower; if each new image triggers a scan of everything loaded so far, the total work grows quadratically. After it has ingested a few hundred images it becomes so slow that loading 15,000 images would take days or weeks to complete. (I’m not talking about the resize processing time, which is actually quite fast.)
In Topaz Studio this loading problem doesn’t exist, because its batch-processing mode doesn’t need to load all the images at once; it simply processes them one after another.
A.I. Gigapixel will only be limited by your hardware. We’ve had reports of users successfully running up to 10,000 images at once, though, admittedly, they have some wicked powerful hardware.
Actual processing and interface rendering are handled a bit differently, and I suspect the limit you’re hitting on interface loading has to do with system RAM size. Do you think you could provide a DXDiag report? Here’s how:
From the Windows desktop, press the Windows key and the R key to open the Run window.
In the Open: field of the Run window, type “dxdiag”, then click OK or press Enter to open the DirectX Diagnostic Tool (if prompted to check whether your drivers are digitally signed, click Yes).
In the DirectX Diagnostic Tool window, click the Save All Information button.
In the Save As window, the DirectX information will be saved as a text (.TXT) file. Designate where the file will be saved (such as the Desktop or other easily accessible location) and then click on the Save button.
After you’ve saved the file, please attach it to your reply.
Thanks so much for your patience and I will await your reply.
I did some tests and found that A.I. Gigapixel (at least on my system) gets progressively slower at loading as the number of images increases. Loading 10 images in one go was fast; 100 in one go took longer than ten batches of 10; 1,000 took longer than ten batches of 100; and above 1,000 it just took too long. So I settled on a sweet spot of 1,000 images per batch and repeated that 16 times until all the images were processed.
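In script form, the batch-of-1,000 workflow amounts to slicing the file list into fixed-size chunks. A minimal sketch; the file names and batch size here are illustrative, not anything A.I. Gigapixel itself provides:

```python
# Split a large list of image paths into fixed-size batches so each
# batch stays under the point where loading bogs down.
def chunk(paths, batch_size=1000):
    """Yield successive batches of at most batch_size paths."""
    for start in range(0, len(paths), batch_size):
        yield paths[start:start + batch_size]

images = [f"frame_{i:06d}.png" for i in range(16000)]
batches = list(chunk(images))  # 16 batches of 1,000 images each
```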
Thanks for sending the system report to me, Jacob. I see you have a pretty powerful system, and I think your comment here describes why the performance might degrade as the image list gets larger. I’m going to use your example to describe this issue to the developers.
If we could run Topaz Studio and plugins like A.I. Gigapixel as command-line commands, these problems would go away (many different ones). Most modern tools (including Photoshop and other filter plugins) have a command-line version with options. So imagine topazStudio.exe -i input.jpg -s <adjustment script>, which would start Topaz Studio, load the input file, and run an already-made adjustment script/macro. Then on Windows 10 (and similarly on Mac) you could use a .bat script to run anything you want, as many times as you want: for all the files in a folder, save them to another folder, running as many topazStudio.exe commands in between as you like, including commands to other programs such as Photoshop actions. This would let folks like us, who do extensive batching and smart scripting, do more and do it faster. Better to be developing this than making cheap batching tools in Topaz Studio.
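To make the proposal concrete, here is a rough sketch of the driver logic such a CLI would enable. Note that topazStudio.exe, its -i/-s flags, and the script name are the hypothetical interface imagined above, not a real program; the code only assembles the command lines.

```python
from pathlib import Path

# Hypothetical: topazStudio.exe and the -i/-s flags are the interface
# imagined in the post above, not an existing CLI.
def build_command(image, script="enhance.tsp", exe="topazStudio.exe"):
    """Assemble the command line for one input image."""
    return [exe, "-i", str(image), "-s", script]

def batch_commands(folder, pattern="*.jpg"):
    """One command per matching file; each could be run via subprocess.run()."""
    return [build_command(p) for p in sorted(Path(folder).glob(pattern))]
```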
Personally I don’t care how it’s done, as long as I can get the software to process a folder full of images without needing my intervention during the process.
I’m not sure editing .bat files is for everyone, though. It’s easy enough once you know how, but it can seem daunting at first.
But yeah, it would certainly make the problem I have go away.
I have experimented more, to see if my Windows system is the cause of the issue (since I understand other people have no problems loading thousands of images in one go), but so far I have not found anything that makes any difference.
I’ve tried turning ‘background processing’ and thumbnail display on and off, but neither makes any difference.
I can load 1000 images, without getting a prolonged ‘hanging’ of the UI.
But if I load 2000 it just hangs.
I wonder if Windows does something in the background that causes things to get bogged down. But since I don’t know what to look for, I can’t really make any progress at the moment.
I also tried it on some videos, extracting the frames with VirtualDub. With ffmpeg you can quickly merge the extracted frames back into a video. With this technique you miss information from adjacent frames, but the tools I used to test that approach (which promise excellent results) were not that good. A single-image upscale with A.I. Gigapixel gave better results, imho.
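For reference, the extract-and-merge round trip can be expressed as two ffmpeg command lines. The file names and frame rate below are illustrative, and this assumes ffmpeg is on the PATH; the sketch only builds the argv lists.

```python
# Build the two ffmpeg invocations for the round trip: dump frames,
# upscale them externally, then reassemble. Paths and fps are illustrative.
def extract_cmd(video, out_pattern="frames/%06d.png"):
    """Dump every frame of `video` as numbered PNGs."""
    return ["ffmpeg", "-i", video, out_pattern]

def merge_cmd(in_pattern="upscaled/%06d.png", fps=24, out="merged.mp4"):
    """Reassemble numbered frames into a video at the original frame rate."""
    return ["ffmpeg", "-framerate", str(fps), "-i", in_pattern,
            "-c:v", "libx264", "-pix_fmt", "yuv420p", out]

# Pass either list to subprocess.run(...) to actually execute it.
```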
I’m running into the same problem here. I can batch a few hundred images, but at some point, my computer will become completely unresponsive. My machine is relatively high-spec (64GB RAM, i7-5960x, Quadro K4200). If I watch the batch processing, RAM usage will climb higher and higher until all system RAM is used. RAM is not freed until I close the application. It seems like a leak, or that memory is not being freed after each image.
I’ll also throw in a vote for a CLI. That would allow me to distribute processing to my render farm and actually use AI Gigapixel in production.
I submitted this to Topaz Studio tech support, but thought I would put it out there also, since Gigapixel is absolutely amazing!
Recently, I was trying to upscale a 720p video by exporting it as an image sequence in Premiere (24 frames per second), and then upscaling each image to 4K UHD using Gigapixel.
The video was batched to 37,700 images.
Unfortunately, when trying to import 37,000 images into Gigapixel, memory usage skyrockets (I think to the total size of the images), and then the app just crashes.
Fix the importing so that files are loaded into memory only one at a time (or one per available GPU), or imported without a preview image (a simple solution, I’m sure).
Allow a stand-alone back-end server process, like Adobe Media Encoder, that simply listens to a folder and renders accordingly.
Allow multi-GPU rendering (I never actually got to this step, because Nvidia no longer allows consumer GPUs on Windows Server, so maybe this DOES work, but I don’t know). Renting a cloud-based Windows server with 4x-8x Nvidia 1080 GPUs is very doable nowadays.
Perhaps do the video upscaling in-app (Topaz mentioned this will be in R&D eventually; I’m sure it won’t be easy).
Use Tensor Cores on Nvidia GPUs (and the equivalent AI-oriented hardware on AMD). This would drastically speed up render time once fully implemented, since these units are built specifically for AI workloads.
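The watch-folder suggestion can be sketched as a simple polling loop. Here `process` stands in for whatever actually upscales a file (a real version would invoke the tool via subprocess), and the directory layout is illustrative:

```python
import shutil
import time
from pathlib import Path

def watch_folder(in_dir, out_dir, process, poll_seconds=5.0, max_polls=None):
    """Poll in_dir, run process() on each file found, then move it to out_dir.

    `process` is a stand-in for the actual upscaler; max_polls=None polls forever.
    """
    in_dir, out_dir = Path(in_dir), Path(out_dir)
    out_dir.mkdir(parents=True, exist_ok=True)
    polls = 0
    while max_polls is None or polls < max_polls:
        for f in sorted(in_dir.iterdir()):
            if f.is_file():
                process(f)                                  # upscale this file
                shutil.move(str(f), str(out_dir / f.name))  # mark it as done
        polls += 1
        time.sleep(poll_seconds)
```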
Obviously the process of rendering out 37,000 images, importing them into Gigapixel, upscaling each one, and then re-importing into Premiere is slow and cumbersome…
But the quality of Upscaling in Gigapixel is absolutely incredible… about 2-3x better than the competition (Waifu, Red Giant 4K, Alienskin Blowup, Photoshop, and After Effects).
In that case a command-line .exe would be even easier: you could wrap it in your own scripts with as many threads as you like, and if one instance crashes or exits with a non-zero exit code, you retry it with a new instance. The config file could be generated with the GUI app on a single image.
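The retry-on-non-zero-exit idea, sketched. The `run` callable stands in for launching one instance of the (hypothetical) CLI and returning its exit code, e.g. subprocess.run(cmd).returncode:

```python
# Retry a command until it exits 0, as suggested above. `run` is a
# stand-in callable returning an exit code; a crash maps to non-zero.
def run_with_retries(run, max_attempts=3):
    """Call run() until it returns 0; give up after max_attempts tries.

    Returns the number of attempts used, or raises RuntimeError."""
    for attempt in range(1, max_attempts + 1):
        if run() == 0:   # exit code 0 means success
            return attempt
        # non-zero exit: start a fresh instance and try again
    raise RuntimeError(f"still failing after {max_attempts} attempts")
```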
Just thought I’d chime in here. I had no idea I wasn’t the only one using AI for upscaling movies. I just did this for about 160,000 frames (it only manages 2,000 at a time), and although the results were pretty good, I’m not sure they were worth the month or so it took, on and off, compared to Adobe After Effects’ new upscale algorithm. Perhaps my de-blurring was too strong, but of course I’m not going to do the whole lot again! I’d love to see this technology available for movie input!