It just depends. A short enough clip with enough GPU horsepower thrown at it can look essentially indistinguishable from reality. There are certain telltale signs that something was generated with AI, but there's no guarantee any of them will be present in a given clip, especially a short one. I've been using local image-generation AI (nothing goes over the Internet; it all runs on hardware in my own home) to generate images that you could not, by any means, tell apart from real photos.
Photoshop released an edit-detection feature, but that feature itself uses AI. AI-based detection can catch certain kinds of edits and fakes, but there is no guarantee it can pick them all out. So yes, you can have a fake photo or video clip that nobody can actually prove is fake.

I think the age of AI is making it clear why the old biblical standard is, "In the mouth of two or three witnesses, every word will be established." You need flesh-and-blood witnesses to establish what actually happened, regardless of any funny business. Their testimony may include things like "this video footage is real, I recorded it on my camera," but it's the testimony that establishes it, not the bare footage itself. Two or three witnesses are still required; in other words, lone-gun video evidence is not sufficient to prove a serious accusation when there is reasonable doubt that it could have been faked. Images, video, etc. should be treated as an unreliable witness unless their provenance can be reliably established. I expect the courts will end up ruling on this at some point.
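To make the "provenance" point concrete: one building block here is cryptographic hashing. If a witness publishes a fingerprint of the footage at recording time, anyone can later check that the file hasn't been altered since. This is just a minimal sketch of that idea in Python; the filenames and byte contents are made up for the demo, and real provenance systems (like Content Credentials) layer digital signatures and metadata on top of this.

```python
# Sketch: a witness records a SHA-256 fingerprint of footage when it is
# captured; any later edit, however small, changes the fingerprint.
# "clip.bin" and its contents are illustrative stand-ins, not a real format.
import hashlib

def fingerprint(path: str) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Write a stand-in "clip" and record its fingerprint at "recording time".
with open("clip.bin", "wb") as f:
    f.write(b"original footage bytes")
published = fingerprint("clip.bin")

# Tamper with the file, then re-check it against the published fingerprint.
with open("clip.bin", "ab") as f:
    f.write(b" tampered")

print(fingerprint("clip.bin") == published)  # prints False: the edit is detectable
```

Note the limitation, which is exactly the point of the post: the hash only proves the file matches what the witness published. It can't tell you whether the original recording was genuine; that still rests on the witness's testimony.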