How to Spot a "Deep Fake" Face-Swapped Video
Recently, Reddit has been making news again with a subreddit in which people use a machine learning tool called "Deep Fake" to automatically replace one person's face with another in a video. Obviously, since this is the internet, people are using it for two things: fake celebrity porn and inserting Nicolas Cage into random movies.
While swapping someone’s face in a photograph has always been relatively easy, swapping someone’s face in a video used to be time-consuming and difficult. Until now, it’s mainly been done by VFX studios for big-budget Hollywood movies, where an actor’s face is swapped onto their stunt double’s. But now, with Deep Fake, anyone with a computer can do it quickly and automatically.
Before going any further, you need to know what a Deep Fake looks like. Check out the SFW video below, which is a compilation of different celebrity face swaps, mainly involving Nic Cage.
The Deep Fake software works using machine learning. It’s first trained with a target face. Distorted images of the target are run through the algorithm and it learns how to correct them to resemble the unaltered target face. When the algorithm is then fed images of a different person, it assumes they’re distorted images of the target, and attempts to correct them. To get video, the Deep Fake software operates on every frame individually.
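The training loop described above can be sketched roughly as follows. This is a minimal toy illustration, not the actual Deep Fake code: random arrays stand in for aligned face crops, additive noise stands in for the distortion step, and a tiny one-hidden-layer autoencoder stands in for the real convolutional network. The point is only to show the shape of the idea: the network is trained to map distorted target faces back to clean target faces, then fed a different person's face at inference time.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for face data: 500 "faces" as flattened
# 8x8 grayscale images (64-dimensional vectors).
target_faces = rng.random((500, 64))

def distort(images, rng, noise=0.3):
    # Crude stand-in for the warping/degradation applied during training.
    return np.clip(images + rng.normal(0, noise, images.shape), 0, 1)

# One-hidden-layer autoencoder: encode to 32 dims, decode back to 64.
W1 = rng.normal(0, 0.1, (64, 32)); b1 = np.zeros(32)
W2 = rng.normal(0, 0.1, (32, 64)); b2 = np.zeros(64)
lr = 0.01

for step in range(500):
    x = distort(target_faces, rng)      # distorted target faces in...
    h = np.tanh(x @ W1 + b1)            # encode
    y = h @ W2 + b2                     # decode: the "corrected" face
    err = y - target_faces              # ...compared against clean targets
    # Backpropagate the mean reconstruction error.
    gW2 = h.T @ err / len(x); gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1 - h**2)
    gW1 = x.T @ dh / len(x); gb1 = dh.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

# At inference time a *different* person's face goes in; the network
# treats it as a distorted target and "corrects" it toward the target's
# appearance. For video, this runs on every frame independently.
other_face = rng.random((1, 64))
swapped = np.tanh(other_face @ W1 + b1) @ W2 + b2
print(swapped.shape)
```

Because each frame is corrected independently, nothing enforces consistency between consecutive frames, which is exactly what produces the flickering artifacts described below.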
The reason that Deep Fakes have largely just involved actors is that there is a lot of footage of them available from different angles, which makes training more effective (Nicolas Cage has 91 acting credits on IMDb). However, given the number of photos and videos people post online, and that you really only need about 500 images to train the algorithm, there’s no reason ordinary people can’t be targeted too, although probably with a little less success.
How to Spot a Deep Fake
Right now, Deep Fakes are pretty easy to spot, but it will get harder as the technology gets better. Here are some of the giveaways.
Weird-Looking Faces. In a lot of Deep Fakes, the faces just look weird. The features don’t line up perfectly, and everything appears a bit waxy, like in the image below. If everything else looks normal but the face appears weird, it’s probably a Deep Fake.
Flickering. A common feature of bad Deep Fake videos is the face appearing to flicker and the original features occasionally popping into view. It’s normally more obvious at the edges of the face or when something passes in front of it. If weird flickering happens, you’re looking at a Deep Fake.
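The flicker cue can even be measured rather than just eyeballed. The toy sketch below is a hypothetical illustration, not a tool the article describes: synthetic arrays stand in for the face region of decoded video frames (a real check would need a video decoder and a face detector), and a simple frame-to-frame difference flags the moment the original face pops back into view.

```python
import numpy as np

rng = np.random.default_rng(1)

# Ten synthetic 32x32 "face region" frames: mostly stable, with one
# simulated flicker where the original face briefly reappears.
frames = [np.full((32, 32), 0.5) + rng.normal(0, 0.01, (32, 32))
          for _ in range(10)]
frames[6] = np.full((32, 32), 0.9)  # simulated flicker frame

# Mean absolute difference between each pair of consecutive frames.
diffs = [np.abs(b - a).mean() for a, b in zip(frames, frames[1:])]

# Flag changes far above the typical inter-frame change.
threshold = np.median(diffs) * 5
flicker_at = [i + 1 for i, d in enumerate(diffs) if d > threshold]
print(flicker_at)  # → [6, 7]: the transitions into and out of the flicker
```

The spikes bracket the flicker: one jump into the anomalous frame and one jump back out, which is the same on/off popping a viewer notices at the edges of a badly swapped face.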
Different Bodies. Deep Fakes are only face swaps. Most people try to get a good body match, but it’s not always possible. If the person seems noticeably heavier, lighter, taller, or shorter, or has tattoos they don’t have in real life (or doesn’t have tattoos they do have in real life), there’s a good chance it’s fake. You can see a really obvious example below, where Patrick Stewart’s face has been swapped with J.K. Simmons’ in a scene from the movie Whiplash. Simmons is significantly smaller than Stewart, so it just looks odd.
Short Clips. Right now, even when the Deep Fake software works perfectly and creates an almost indistinguishable face swap, it can only really do it for a short amount of time. Before too long, one of the problems above will start happening. That’s why most Deep Fake clips that people share are only a couple of seconds long; the rest of the footage is unusable. If you’re shown a very short clip of a celebrity doing something, and there’s no good reason it’s so short, that’s a clue it’s a Deep Fake.
No Sound or Bad Lip Syncing. The Deep Fake software only adjusts facial features; it doesn’t magically make one person sound like another. If there’s no sound with the clip, and there’s no reason for there not to be sound, it’s another clue you’re looking at a Deep Fake. Similarly, even if there is sound, if the spoken words don’t match up correctly with the moving lips (or the lips look strange while the person talks, like in the clip below), you might have a Deep Fake.
Unbelievable Clips. This one kind of goes without saying, but if you’re shown a truly unbelievable clip, there’s a good chance you shouldn’t actually believe it. Nicolas Cage has never starred as Loki in a Marvel movie. That’d be cool, though.
Dubious Sources. As with fake photos, where the video supposedly comes from is often a big clue to its authenticity. If the New York Times is running a story on it, it’s far more likely to be true than something you discover in a random corner of Reddit.
For the time being, Deep Fakes are more of a horrifying curiosity than a major problem. The results are easy to spot, and while it’s impossible to condone what’s being done, no one is yet trying to pass off Deep Fakes as genuine videos.
As the technology gets better, however, they’re likely to be a much bigger issue. For example, convincing fake footage of Kim Jong Un declaring war on the USA could cause a major panic.
Source: https://www.howtogeek.com/341469/how-to-spot-a-deep-fake-face-swapped-video/