How to Spot a Deep Fake Face-Swapped Video
Recently, Reddit has been making news again with a subreddit in which people use a machine learning tool called “Deep Fake” to automatically replace one person’s face with another in a video. Obviously, since this is the internet, people are using it for two things: fake celebrity porn and inserting Nicolas Cage into random movies.
While swapping someone’s face in a photograph has always been relatively easy, swapping someone’s face in a video used to be time-consuming and difficult. Up until now, it’s mainly just been done by VFX studios for big-budget Hollywood movies, where an actor’s face is swapped onto their stunt double. But now, with Deep Fake, anyone with a computer can do it quickly and automatically.
Before going any further, you need to know what a Deep Fake looks like. Check out the SFW video below, which is a compilation of different celebrity face swaps, mainly involving Nic Cage.
The Deep Fake software works using machine learning. It’s first trained with a target face. Distorted images of the target are run through the algorithm and it learns how to correct them to resemble the unaltered target face. When the algorithm is then fed images of a different person, it assumes they’re distorted images of the target, and attempts to correct them. To get video, the Deep Fake software operates on every frame individually.
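To make that description a little more concrete, here is a minimal, hypothetical sketch of the same idea in PyTorch. This is not the actual Deep Fake code (the real tool is considerably more elaborate and also handles face detection, alignment, and blending); the folder names, the 64×64 crop size, and the simple noise-based distort() stand-in for the real warping step are all assumptions made purely for illustration.

```python
# Minimal sketch of the "correct distorted faces back toward the target" idea.
# Assumes PyTorch/torchvision/Pillow and two hypothetical folders of aligned face crops.
import glob
import torch
import torch.nn as nn
from PIL import Image
from torchvision import transforms

to_tensor = transforms.ToTensor()  # PIL image -> CHW float tensor in [0, 1]

def load_faces(folder):
    """Load aligned 64x64 RGB face crops from a folder into one tensor."""
    imgs = [to_tensor(Image.open(p).convert("RGB").resize((64, 64)))
            for p in glob.glob(f"{folder}/*.jpg")]
    return torch.stack(imgs)

def distort(batch):
    """Cheap stand-in for the real warping/distortion step: add noise."""
    return (batch + 0.1 * torch.randn_like(batch)).clamp(0, 1)

class FaceAutoencoder(nn.Module):
    """Tiny autoencoder that learns to reconstruct clean target faces."""
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),   # 64 -> 32
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),  # 32 -> 16
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),     # 16 -> 32
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),   # 32 -> 64
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

# Training: feed distorted crops of the target, learn to output the clean target face.
target_faces = load_faces("target_faces")          # hypothetical folder of ~500 crops
model = FaceAutoencoder()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.L1Loss()

for epoch in range(100):
    recon = model(distort(target_faces))
    loss = loss_fn(recon, target_faces)
    opt.zero_grad()
    loss.backward()
    opt.step()

# "Swapping": feed in face crops from someone else's video, frame by frame.
# The network treats them as distorted target faces and corrects them toward the target.
other_frames = load_faces("other_person_frames")   # hypothetical folder of video frames
with torch.no_grad():
    swapped = model(other_frames)
```

In a real pipeline, each corrected face would then be blended back into its original video frame, one frame at a time, which is where many of the telltale glitches described below tend to show up.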
The reason that Deep Fakes have largely just involved actors is that there is a lot of footage of them available from different angles, which makes training more effective (Nicolas Cage has 91 acting credits on IMDb). However, given the number of photos and videos people post online, and that you really only need about 500 images to train the algorithm, there’s no reason ordinary people can’t be targeted too, although probably with a little less success.
How to Spot a Deep Fake
Right now, Deep Fakes are pretty easy to spot but it will get harder as the technology gets better. Here are some of the giveaways.
Weird Looking Faces. In a lot of Deep Fakes, the faces just look weird. The features don’t line up perfectly and everything just appears a bit waxy like in the image below. If everything else looks normal, but the face appears weird, it’s probably a Deep Fake.
Flickering. A common feature of bad Deep Fake videos is the face appearing to flicker and the original features occasionally popping into view. It’s normally more obvious at the edges of the face or when something passes in front of it. If weird flickering happens, you’re looking at a Deep Fake.
Different Bodies. Deep Fakes are only face swaps. Most people try and get a good body match, but it’s not always possible. If the person seems to be noticeably heavier, lighter, taller, shorter, or has tattoos they don’t have in real life (or doesn’t have tattoos they do have in real life) there’s a good chance it’s fake. You can see a really obvious example below, where Patrick Stewart’s face has been swapped with J.K. Simmons in a scene from the movie Whiplash. Simmons is significantly smaller than Stewart, so it just looks odd.
Short Clips. Right now, even when the Deep Fake software works perfectly and creates an almost indistinguishable face swap, it can only really do it for a short amount of time. Before too long, one of the problems above will start happening. That’s why most Deep Fake clips that people share are only a couple of seconds long, the rest of the footage is unusable. If you’re shown a very short clip of a celebrity doing something, and there’s no good reason it’s so short, it’s a clue that it’s a Deep Fake.
No Sound or Bad Lip Syncing. The Deep Fake software only adjusts facial features; it doesn’t magically make one person sound like another. If there’s no sound with the clip, and there’s no reason for there not to be sound, it’s another clue you’re looking at a Deep Fake. Similarly, even if there is sound, if the spoken words don’t match up correctly with the moving lips (or the lips look strange while the person talks, like in the clip below), you might have a Deep Fake.
Unbelievable Clips. This one kind of goes without saying but, if you’re shown a truly unbelievable clip, there’s a good chance you shouldn’t actually believe it. Nicolas Cage has never starred as Loki in a Marvel movie. That’d be cool, though.
Dubious Sources. Like with fake photos, where the video supposedly comes from is often a big clue as to its authenticity. If the New York Times is running a story on it, it’s far more likely to be true than something you discover in a random corner of Reddit.
For the time being, Deep Fakes are more of a horrifying curiosity than a major problem. The results are easy to spot, and while it’s impossible to condone what’s being done, no one is yet trying to pass off Deep Fakes as genuine videos.
As the technology gets better, however, they’re likely to be a much bigger issue. For example, convincing fake footage of Kim Jong Un declaring war on the USA could cause a major panic.
Source: https://www.howtogeek.com/341469/how-to-spot-a-deep-fake-face-swapped-video/