Yeah, I stopped using the QEMU passthrough setup because I was literally sacrificing half my system, which isn’t optimal. If I’m away from the PC I’d rather use the whole thing for VEAI or whatever, and if I’m using it…well, it’s a powerful enough system that I didn’t notice much unless I wanted to use the second GPU for something.
I would have thought AMD would be much easier on Linux. AMD’s drivers are open source; Nvidia’s likely never will be. AMD also tends to work better for gaming through the Wine/Proton translation layers, presumably for the same reason.
I built a Windows box pretty much just to run VEAI. I will say that while I’d prefer a native Linux build, VEAI running in Wine would probably work very well if it used a GPU API on Windows that Wine could properly translate. I tested it: the GUI and everything worked fine in Wine, it would run on the CPU, and it detected the GPUs. IIRC the Wine debug output was pointing at ONNX, and DirectML was never implemented in VKD3D.
This is great news. I hope it all comes to fruition.
I imagine Linux support would be an issue for the team. Perhaps it could be suggested that there be no official support, but that we Linux users form community support and help each other, as in many other Linux-based projects?
I run Arch (BTW) with a 3700X CPU, 32 GB RAM, and an RTX 3090 GPU.
A few additional points. It was mentioned that GPU detection on Linux might be difficult, but that is not something I have experienced: Topaz has always correctly identified my (Nvidia) GPU, granted through WINE. The problem, as far as I can tell, is the Windows-specific DirectML, which is not implemented in WINE. Worth noting that CPU (Intel) works fine, as I think Intel’s framework is cross-platform. I think it behooves Topaz to move toward more open, cross-platform frameworks. In my opinion this kills several birds with one stone, as it removes the need to juggle a different version for each operating system. Also, offering Linux versions as AppImages or Flatpaks removes the need to juggle releases on a per-distribution basis (Debian, Arch, etc.). Ideally a dedicated Linux version would be fantastic, but just getting GPU support working when running the Windows version in WINE would be the next best thing.
It is more complicated than anticipated.
At first we will only be able to support Nvidia GPUs using cuDNN/TensorRT. Then we will look into adding support for AMD GPUs with ROCm. Intel CPU and iGPU support will also be there.
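To illustrate the backend split above (cuDNN/TensorRT for Nvidia, ROCm for AMD, CPU as fallback): here is a minimal sketch of how one might probe which runtimes are present on a Linux box, using only the Python standard library. The shared-library names are an assumption (common Linux SONAMEs); this is not how VEAI itself detects anything.

```python
from ctypes.util import find_library

# Map each acceleration backend to the shared library whose presence
# signals it. These SONAMEs are an assumption -- distros may differ.
BACKEND_LIBS = {
    "cuda": "cuda",         # libcuda.so     -> Nvidia driver API
    "cudnn": "cudnn",       # libcudnn.so    -> cuDNN
    "tensorrt": "nvinfer",  # libnvinfer.so  -> TensorRT
    "rocm": "amdhip64",     # libamdhip64.so -> ROCm/HIP runtime
}

def probe_backends():
    """Return {backend: library path or None} for each known GPU runtime."""
    return {name: find_library(lib) for name, lib in BACKEND_LIBS.items()}

if __name__ == "__main__":
    for backend, path in probe_backends().items():
        print(f"{backend:9s} {'found: ' + path if path else 'not found'}")
```

On a machine with only the Nvidia stack installed, the ROCm entry would simply come back `None`, which is the kind of check an installer could use to pick a backend.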
Linux support would be great; thanks for working on it. I can’t be alone in wanting to offload this work to a Linux machine rather than a Windows machine. Rendering, sharpening, and machine learning are sent to the Linux boxes because that’s all they do.
It would be a bonus to be able to use the desktop client as well, since everything in my workflow is Linux-based.
I’ll certainly throw my hat into the ring as a potential Linux tester. I’m currently running VEAI in a KVM VM, because I’m not willing to sacrifice a high-end box to Windows.
(As an aside, if Linux isn’t supported, it would be really nice to at least support the product in virtual environments with GPU passthrough or vGPU. Playing the “we don’t support you because you’re using a VM” card for every minor problem really is pretty darn unfriendly.)
As I read through this post, I see a lot of discussion about desktop market share, but I don’t think of VEAI as a desktop product (or at least not purely a desktop product). Upscaling a video can take multiple days; that’s definitely a server-type workload, which I would like to run on a proper server.
In the very early stages, VEAI (Gigapixel for Video) only ran on the CPU and Nvidia GPUs. This time, for Linux, would it be possible to release the VEAI ffmpeg filter for Nvidia GPUs first?
Heck, I’d be happy with just Gigapixel working on Linux. VEAI can take so long that I have a dedicated PC for it, but with Gigapixel I often just want to upscale a few pics I found online, and having to switch to a different PC just for that is a pain.
Some people claimed it works in WINE using the CPU, but I never managed to get it to. I’d be fine with it not using the GPU, if it worked at all.
I would guess that it is working without the GPU being involved, though, correct? I’ve seen no way to make the GPU work with any of the products up to now. VEAI reports using the GPU, but it’s clearly running only on the CPU, judging by the reported processing times.
Actually, it appears to be using the GPU, but being only a GTX 1650 it’s not exactly fast. It won’t work at all if Face Recovery is turned on, so there must be another library involved that I need to deal with or something.
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 510.68.02 Driver Version: 510.68.02 CUDA Version: 11.6 |
|-------------------------------+----------------------+----------------------+
| GPU Name Persistence-M| Bus-Id Disp.A | Volatile Uncorr. ECC |
| Fan Temp Perf Pwr:Usage/Cap| Memory-Usage | GPU-Util Compute M. |
| | | MIG M. |
|===============================+======================+======================|
| 0 NVIDIA GeForce ... Off | 00000000:01:00.0 On | N/A |
| 23% 45C P0 74W / 75W | 2540MiB / 4096MiB | 94% Default |
| | | N/A |
+-------------------------------+----------------------+----------------------+
+-----------------------------------------------------------------------------+
| Processes: |
| GPU GI CI PID Type Process name GPU Memory |
| ID ID Usage |
|=============================================================================|
| 0 N/A N/A 2499 G /usr/libexec/Xorg 1213MiB |
| 0 N/A N/A 2873 G /usr/bin/kwin_x11 1MiB |
| 0 N/A N/A 2897 G /usr/bin/plasmashell 81MiB |
| 0 N/A N/A 4020 G ...alexatkin/firefox/firefox 9MiB |
| 0 N/A N/A 9408 G ...AI\Topaz Gigapixel AI.exe 676MiB |
+-----------------------------------------------------------------------------+