Deepfake video targets Pakistan ex-first lady Bushra Bibi

A deepfake video of Pakistan's former first lady Bushra Bibi, in which she appears to talk about a kiss with a man, has attracted tens of thousands of views online. Bibi's face was superimposed onto footage of a different woman speaking, taken from a video posted in November 2022.

"This video is from when the mother of youthiyas Bushra Bibi was a flexible dancer," read an Urdu-language Facebook post that shared the video on April 26, 2024.

"Youthiya" is a slur against supporters of Bibi's husband Imran Khan, Pakistan's ex-prime minister and former international cricketer, while a "dancer" is sometimes used to describe a woman considered to have poor morals.

The pair were both convicted of corruption and of breaking Islamic marriage laws in the lead-up to February's general election, which saw military-backed parties come to power.

Khan, who was ousted from power in a no-confidence vote in April 2022, says the cases were designed to keep him from contesting the election.

The video appears to show a man interviewing Bibi about a kiss she supposedly gave him in public -- an act that would be considered inappropriate in conservative, Muslim-majority Pakistan.

Bibi appears to respond that anyone who "has a problem" with the kiss can phone her.

Screenshot of the false Facebook post, captured on May 10, 2024

Bibi's portrayal in the clip is at odds with the religiously devout image that she and Khan have presented since their marriage in January 2018, months before Khan's election as prime minister.

They met when he approached her for spiritual guidance, and she wears a face-covering hijab when appearing in public.

The video racked up more than 127,000 views and was shared in similar posts here and here on Facebook and also on X, formerly Twitter.

But the video is a deepfake -- a different woman's face was replaced with Bibi's likeness.

'Faith healer' interview

A graphic that reads "MN point" appears at the bottom of the video at the 11-second mark.

A keyword search on Facebook found the original video posted on a page called "MN point" on November 23, 2022 (archived link).

The video shows a woman identified as Rabia peerni (Rabia the faith healer) -- not Bushra Bibi.

Screenshot comparison between the video in the false post (left) and the MN point video from November 2022 (right)

The video's Urdu-language caption reads: "Rabia peerni told the truth about what media people have been doing. After the kissing video, the interviewer asks Rabia for forgiveness".

In the video, the interviewer refers to an earlier clip in which he kissed Rabia peerni on the cheek.

That video was posted on MN point's YouTube channel on November 7, 2022 (archived link).

Rabia tells viewers they can contact her directly if they take issue with the video and proceeds to give her phone number.

Manipulated video

Meanwhile, the video appearing to show Bushra Bibi featured telltale signs of manipulation, including the misalignment of her speech and mouth movements.

Deepfake experts also said a glitch at the start of the video suggested it was altered.

"If you look closely in the first few seconds of the video, Bushra Bibi's face is turned slightly towards the interviewer," Sara Oscar, senior lecturer in visual communication at the University of Technology Sydney told AFP.

"In these moments, Bibi's face glitches back to the original person."

T J Thomson, a visual communications lecturer at Australia's RMIT University, also picked up on the momentary glitch.

"This all happens within a millisecond so viewers might miss this when watching the video at normal speed but slowing the video down helps make the inconsistencies more apparent," he told AFP.

AFP has published a guide to spotting AI-generated content here.