Using Topaz AI for Legal Video Work

Our company does a lot of video production work for attorneys and law firms. Over the years we have been asked countless times to sharpen or clean up surveillance video for various reasons. We bought the Ultimate Bundle and have used it for several surveillance video projects over the past few months. It replaces roughly 10 hours of work we would otherwise spend in Adobe products and really simplifies the process.

In my opinion, using Topaz AI doesn’t change the actual video; it enhances it. Would you agree with that statement? Topaz AI can detect and enhance things in the video that we can’t see with the naked eye.

For instance: we enhanced a video in which someone tripped over something. The surveillance video was poor quality, but when we ran it through Topaz AI it was cleaned up so you could clearly see what was tripped over. The actual video isn’t being changed; we’re not putting something in the video that wasn’t there, we are simply cleaning up the video to make it clearer.

My question is: by using Topaz AI to clean up the video, is what’s recorded actually being changed, or simply enhanced? I’m looking for a definition of what Topaz AI does to the video to clean it up.

There are a number of posts in this forum from users who have found that TVAI created unwanted faces and/or patterns out of things like hair, grass and gravel. For most of us, it’s an annoyance that we deal with by adjusting settings, but in a legal dispute someone could probably argue that the software has created things that weren’t there.


Depending on the model used, the application will, to a greater or lesser extent, hallucinate details into the footage that were not originally in the source material. You can mostly avoid this by using Proteus with conservative settings.

As an opposing attorney I would probably love to hear that Topaz was used to enhance video. That would be called a “slam dunk” for the defense. :laughing:


I think your best bet is to use an editor with a sharpen function, which will just enhance edges. Then provide both the original and the sharpened copy for inspection and let whoever you’re showing them to decide for themselves.
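To make the distinction concrete: a traditional sharpen is pure arithmetic on the pixels that are already there, so nothing new can appear. A minimal sketch of a classic unsharp mask (this is a generic illustration using NumPy, not anything Topaz itself does; the function name and box-blur choice are mine, and a real editor would typically use a Gaussian blur):

```python
import numpy as np

def unsharp_mask(frame, radius=1, amount=1.0):
    """Classic unsharp mask: sharpened = original + amount * (original - blurred).

    Deterministic: every output pixel is a fixed arithmetic combination of
    nearby input pixels, so the filter cannot invent content that was not
    in the frame -- it can only exaggerate contrast at existing edges.
    """
    # Simple box blur as the low-pass filter (edge-padded so the
    # output has the same shape as the input).
    pad = np.pad(frame, radius, mode="edge")
    k = 2 * radius + 1
    blurred = sum(
        pad[dy:dy + frame.shape[0], dx:dx + frame.shape[1]]
        for dy in range(k) for dx in range(k)
    ) / (k * k)
    # Add back the scaled "detail" (original minus blurred), clamped to 8-bit range.
    return np.clip(frame + amount * (frame - blurred), 0, 255)
```

Flat regions pass through unchanged and only existing edges gain contrast, which is exactly why this kind of filter is easier to defend than a generative model: the whole transform can be stated in one line of arithmetic.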

We strongly do not recommend the use of our Video AI technology for any forensic or legal application.


Though I think some of the models would be okay to use, mainly Gaia. Others, by design, do not display the same information in the output as was in the input.
For example, if there is some text that is unreadable in the original, but readable in the TVAI output—every word of it comes from the AI training data and NOT from the original video.
Iris models are the worst offenders, in that they are made to enhance faces. But again, those faces are just the AI calculating what would fit best, based on the training data.


As someone who is blown away by Topaz AI upscaling: if I were a defendant in a criminal or civil case and I knew any AI enhancement was being used on footage allegedly incriminating me, I would have absolutely no difficulty convincing a jury that you cannot rely on it for “accuracy” 100% of the time. I would simply show them “before” and “after” examples where it is not so much a miracle, where it should instead have simply “given up” trying to enhance a particular section of the frame, or even a few seconds of footage here and there, and left it “as is”.

One example I have that could even be analogous to the sort of low-resolution, high-noise, high-grain footage you might come across in surveillance is some PAL-encoded footage I have of a motor race from 1987. Topaz AI still amazes me - constantly - with how accurate it can be even in what seem to be dire circumstances (poor input quality), but there is also a clear point where it sometimes gets it completely wrong: a car in a very distant shot, for example, becomes a mere “blob of moving noise”, actually much worse than the original.

I have also seen textures appear that I know were never there to begin with. On clothing for example and even on faces.

That said, the product is as stunning as it is amazing simply because these issues are very few and far between, are generally benign, and are by a country mile and a half the lesser of the two evils - those evils being doing nothing with that poor-quality footage, or using Topaz knowing there are occasionally going to be anything from glitches to inaccuracies to outright fabrication of what is in the picture.

It is for that reason I wouldn’t go near it with a barge pole for any legal-related work. And that is no indictment of the product, which to me reaches minor-miracle status. It is just that the stakes in a legal proceeding are obviously incredibly high, and in the name of fairness I think any footage shown in court cases should be “as is”; it is then up to a jury to make up their own minds as to how much weight it carries and therefore how incriminating it may be.

I have concrete examples of Proteus making a complete balls-up. It typically happens most when the footage depicts an object that is very unclear, but which, as a human, I know exactly what it is, either through context or because I saw it a handful of frames earlier. Proteus, however, simply cannot reach a conclusion - in those cases I wish it would leave that particular object alone - and when it does act, it makes it much worse than the original.


I have some old PAL motorsport footage which, although released commercially to commercial standards on a DVD, was likely made from an analogue backup of an early-generation Betacam cassette. One of the cars has cigarette sponsorship on the bonnet, and so long as the car is within a certain distance of the camera, and the camera’s height relative to the bonnet is over a certain threshold, it does a fantastic job. It even subtly blurs out the branding once it becomes far too indistinct. But in some cases, if the distance from the camera is long enough and the angle (relative height of camera to bonnet line) is low enough, the cigarette brand literally becomes a bar-code look-alike!

I have an example of TVAI giving a refrigerator a skin texture (there was some small printing on the surface which the AI deemed to be a tattoo and thus got confused…)


P.S. my favourite AI fail:

Looking at the image and the distribution of those black blotches on the fur, you can tell where this misdetection comes from - and where the AI was coded (China), a panda might be seen more often than such a cat…
But it’s still extremely funny.
