There’s a “try it free” button on the website. If it works for someone, they can buy it as is. Then, for the next 12 months after purchase, they can choose between keeping the version they tried, liked, and paid for, or upgrading to one of the new versions if it works for them.
Guys, I think I’ve found the perfect combo to enhance the Star Trek DS9 series. First pass, I used the latest Artemis medium model for debuzzing. It’s v13, if I’m not mistaken. I left the episode at its original resolution (SD). Then, once the initial processing was done, I used my preset below for enhancement and scaling.
Generally speaking, if I apply my custom preset directly, I often get noise artifacts in bright areas such as illuminated surfaces, as in this screenshot. But after a first pass with Artemis, I don’t have that problem.
But it was all a blur in the beginning.
It would be nice if the template could be improved to avoid this kind of thing happening to illegible text, like on the far right.
Upgrades no (after end of support/upgrade), but bug fixes definitely yes; established software packages still fix bugs for a while after the next major release. Vegas Pro, Boris FX, and PaintShop Pro come immediately to mind. So you’ll get something like v5.0 replacing, say, 4.x, then the 5.x series plus an occasional 4.x.x for a while, until significant bugs are fixed.
That’s what I meant, bug fixes but as you say, not feature upgrades.
I was going to warn you that Artemis will likely cause many undesired artifacts the moment an outdoor scene comes up, especially if there are trees in the background. Keep a lookout for scenes like that.
I have to hand it to Topaz. The revised Iris LQ and Proteus v4 are a real step up. When I initially got TVAI it was Artemis all the way for me, but nowadays Artemis doesn’t really compare in any way to those two.
Just too bad I have to suffer the v4 UI to use them (hate to fan the v4 UX flames, but I couldn’t resist sharing my relative appreciation for the v3 UX work in the hope of a much-wanted reversal on the UI front).
Didn’t you get it? At the moment there is no true AI out there! Maybe there is, but you certainly won’t get your hands on it. All of these promoted “AI” features and functions are just machine learning: algorithms and processes that take input data and a target output, and our hardware does everything it can to reach that output target. The better the algorithm, the better the result. And the more sophisticated the algorithm, the better the result gets, at the cost of even more hardware power (in our case, mostly the GPU).
Or am I the only one with this view of the actual state of technology?
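To illustrate the point above, that “machine learning” is just optimization toward a target output, here’s a minimal toy sketch in plain Python (a hypothetical example of my own, nothing to do with TVAI’s actual models): gradient descent adjusting a single weight until the model’s output matches the target data.

```python
# Toy illustration: "machine learning" as plain optimization.
# Given input data and target outputs, iteratively adjust a parameter
# to reduce the error -- no intelligence, just arithmetic repeated many times.

def train(xs, ys, lr=0.01, steps=1000):
    w = 0.0  # single learnable weight; the "model" is y = w * x
    for _ in range(steps):
        # gradient of the mean squared error with respect to w
        grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)
        w -= lr * grad  # step toward lower error
    return w

# Data generated by the "true" rule y = 2x; the loop recovers w close to 2.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 4.0, 6.0, 8.0]
print(round(train(xs, ys), 3))  # prints 2.0
```

Scale the same loop up to millions of weights and video frames instead of four numbers, and that’s what the GPU is grinding through.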
OK, I’ll check at a later date. But here I’m not upscaling with Artemis. Only denoising. I’m using Artemis MQ v13 to make sure I get the best for denoising.
It still might happen—I don’t remember. I used to use Artemis for that too, then switched to something else for a while. Now I’ve been using Nyx Fast. The main reason I stopped using Artemis for denoising was that it made the final image look more like an oil painting. From your shared examples, I would say that’s not an issue for you.
Yeah, it really depends on the source. For me, it’s a DVD source downloaded from the Net. I had to deinterlace all the videos with HandBrake. And here, denoising with Artemis works very well. I haven’t tested Nyx; I’ll see about season 5. Right now, I’ve run all the episodes of season 4 through Artemis for the first phase (denoising). But it’s true that on my copy, it works really well.
Just wondering if you’ve tried Enhance Speech for dialogue in the Premiere Pro Beta on a sample? I have the Beta installed alongside the release version, and I usually fool around with Enhance Speech just to see how I can improve the dialogue or the sound in general. BTW, if you’re open to experimenting, you can open a Premiere Pro file in the Beta and experiment, and still open the same file in the release version afterward, just without the Beta features enabled.
Hello, I’m not using the Beta version of Premiere Pro, but the feature in question will come to the final version at some point. In general, I install Beta versions on a virtual machine rather than on my physical machine, to avoid instabilities. But otherwise I often watch videos of the new features added in the betas.