It’s fake.
Just how seriously deepfake videos and audio can affect national politics came to light last week.
An audio clip of what sounded like President Ferdinand Marcos Jr. calling on the Armed Forces of the Philippines (AFP) to prepare to attack a foreign country was revealed to be a fake.
The Presidential Communications Office (PCO) had warned against an audio deepfake that made it appear that the President had ordered the AFP “to act against a particular foreign country.”
Over the weekend, the Philippine National Police (PNP) announced that it had information on the source of the technology used to manipulate an earlier recording of the President.
PNP spokesperson P/Col Jean Fajardo told local media that an investigation was ongoing.
Fajardo said: “We have already identified the possible source of this deepfake audio but as to the extent of his involvement here, that’s still subject to investigation.”
Some reports pointed to a foreigner as the possible source but the PNP refused to confirm or deny this.
The PCO – which handles the administration’s communications – said in a statement that the disturbing message came to their attention after noting “a video content posted on a popular video streaming platform circulating online that has manipulated audio designed to sound like President Marcos.”
A by-product of artificial intelligence (AI), deepfakes are found all over social media, some created for entertainment, others with more sinister motives.
The quality is so good that there’s a growing number of YouTube posts of AI-created music that sound exactly like the Beatles, even though two of the band’s original members died many years ago. The deepfake audio eerily mimics John Lennon and George Harrison as if the songs had just been recorded.
Deepfake videos have also become commonplace in Hollywood blockbusters. A few years ago, the late actor Paul Walker was “resurrected” in one of The Fast and the Furious movies using AI.
A well-known newscaster told this reporter that he was fully aware of the fake videos using his likeness but added that he could do little because it would have cost a small fortune to sue the responsible parties.
In the latest case involving the Philippine President, the PCO said: “The audio deepfake attempts to make it appear as if the President has directed the AFP to act against a particular foreign country. No such directive exists nor has been made.”
A version of the same audio has Marcos ordering the armed forces “and special task groups” to take the necessary measures should China “attack” the Philippines.
Fajardo also said: “Whether this is intentional or not, those people behind this deepfake audio will be held accountable.”
She said the PNP’s anti-cybercrime group is coordinating with the Department of Information and Communications Technology in its investigation.
The Department of Justice has also directed the National Bureau of Investigation to find the person or persons behind the deepfake audio and to file the necessary legal actions against them.
The PCO reminded the public to be mindful of, and responsible for, the content they share on their respective platforms.
The fake audio comes at a time of heightened tensions between the Philippines and China due to the territorial disputes over the West Philippine Sea, which Beijing insists is part of the South China Sea.