David Do is, on the surface, unassuming and respectable. He owns a house just outside Toronto with his partner, drives a Tesla and is paid $121,000 a year as a hospital pharmacist.
But he leads a double life as a key person behind the world’s most notorious website for non-consensual, AI-generated porn of real people: MrDeepFakes.com. He has never been fully identified until now.
MrDeepFakes was the most popular site globally for deepfake porn, and hosted tens of thousands of non-consensual and sometimes violent deepfake videos and images of celebrities, politicians, social media influencers and private citizens, including Canadians.
This week, after CBC News’s visual investigations unit — in collaboration with open-source investigative outlet Bellingcat and Danish publications Politiken and Tjekdet — contacted Do about his role in the site’s operations, MrDeepFakes shut down for good.
A message posted to the site’s homepage states that a “critical service provider has terminated service permanently,” adding that the site “will not be relaunching.”
The site had more than 650,000 users, some of whom charged hundreds of dollars to create custom videos. And the content — which ranges from graphic strangulation scenes involving an AI fake of actor Scarlett Johansson to group sex with actor Natalie Portman to masturbation videos of musician Michael Bublé — has gotten more than two billion views since the site’s inception in 2018.
But not everyone on the site was a Hollywood actor or famous singer.
“[It’s] quite violating,” said Sarah Z., a Vancouver-based YouTuber who CBC News found was the subject of several deepfake porn images and videos on the website. “For anybody who does think that these images are harmless, just please consider that they’re really not. These are real people … who often suffer reputational and psychological damage.”
Creators on the site also took requests from people asking for deepfake porn of their partners and wives.
Sharing non-consensual deepfake porn is illegal in several countries, including South Korea, Australia and the U.K. It is also illegal in several U.S. states, and while there is no federal law yet, the House of Representatives passed a bipartisan bill banning it in April.
However, it’s not a crime in Canada. Prime Minister Mark Carney pledged to pass a law criminalizing it during his federal election campaign, saying, “We will make producing and distributing non-consensual sexual deepfakes a criminal offence.”
Requests for comment ignored
David Do took care to hide his association with MrDeepFakes; his name never appeared anywhere on the site. But Do’s role can be pieced together using data from the web, public records and forensic analysis of the site.
According to that research, Do had a central role in running the MrDeepFakes site and has a long online history discussing the details of operating a popular adult website that makes a profit.
CBC News sent multiple emails to Do over a period of several weeks seeking a response. According to mail trackers attached to those emails, he opened them several times but never responded.
When a CBC News reporter hand-delivered a letter to Do at Markham Stouffville Hospital, where he works as an in-patient pharmacist, on April 11, he said, “I don’t know anything about that.”
“What do you mean?” the reporter asked.
“I’m at work right now,” Do said, and insisted on going back to work.
After this interaction, Do’s Facebook profile and the social media profiles of family members were taken down.
Two weeks later, a review posted on Do’s Airbnb account indicated he was in Portugal.
Then, on May 4, MrDeepFakes.com went offline — apparently for good.
“A critical service provider has terminated service permanently. Data loss has made it impossible to continue operation,” says a shutdown notice on the website’s main page. “We will not be relaunching. Any website claiming this is fake. This domain will eventually expire and we are not responsible for future use. This message will be removed around one week.”
On May 5, a CBC News reporter approached Do in an attempt to interview him about his role in the website. Do told the reporter he didn’t want to be recorded and that he was busy, before driving away in his vehicle.
Hidden in plain sight
MrDeepFakes was shrouded in secrecy. Members were advised to remain anonymous behind aliases, and video creators were paid in cryptocurrency. The site’s hosting providers, according to a previous report by Bellingcat, moved around the world. But Do slipped up, leaving clues hidden in plain sight.
It started with a username.
As early as 2018, a user with the handle DPFKS was an administrator of the MrDeepFakes forum, where people could pay to have custom deepfakes made of celebrities and private individuals, even spouses.
In a 2018 post on the forum site Voat — a site DPFKS said they used in posts on the MrDeepFakes forum — an account with the same username claimed to “own and run” MrDeepFakes.com.
“I just got home from my day job,” they said in another post on Voat, “now back to this!”
DPFKS also said that “MrDeepFakes.com was formerly dpfks.com” in a 2018 post on the MrDeepFakes forum. The same unique Google Analytics tag is linked to both URLs, indicating they are managed by the same owner.
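This kind of overlap can be checked programmatically. Below is a minimal sketch of the analytics-tag comparison technique, assuming archived copies of the two pages are available; the archive URLs in the example are placeholders, not the actual snapshots examined in the investigation.

```python
# Minimal sketch: compare Google Analytics property IDs embedded in two pages' HTML.
# The archive URLs below are placeholder assumptions, not the snapshots used by investigators.
import re
import urllib.request

# Matches Universal Analytics (UA-XXXXXX-X) and GA4 (G-XXXXXXXXXX) property IDs
GA_ID = re.compile(r"\b(UA-\d{4,10}-\d{1,4}|G-[A-Z0-9]{6,12})\b")

def ga_ids(url: str) -> set:
    """Fetch a page and return the set of Google Analytics IDs found in its HTML."""
    with urllib.request.urlopen(url) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    return set(GA_ID.findall(html))

ids_a = ga_ids("https://web.archive.org/web/2018/http://site-a.example/")
ids_b = ga_ids("https://web.archive.org/web/2018/http://site-b.example/")

# A shared, non-empty set of IDs suggests both pages report to the same
# Google Analytics account, i.e. they are likely managed by the same owner.
print("Shared analytics IDs:", ids_a & ids_b)
```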
DPFKS did more than run the website; they created more than 150 deepfake porn videos. They even posted a dataset of more than 6,000 photos of U.S. Rep. Alexandria Ocasio-Cortez so other users could create non-consensual deepfake porn.
Running that username through a searchable database that includes information pulled from the open web and previous data breaches reveals additional clues: usernames, IP addresses, date of birth, physical home addresses — and several personal email addresses featuring combinations of the name Duy David Do going back more than 10 years.
Another email appeared as well: DPFKScom@gmail.com, which was the contact email for MrDeepFakes until 2020. That address also appeared in the website’s source code at the time.
This is where Do’s careful veil of secrecy fell apart: many of these email addresses had registered accounts on different websites using the same unique 11-character password, which was exposed in multiple data breaches.
Do’s personal emails were also linked to a Yelp account for a user named David D. who lives in the Greater Toronto Area, and an Airbnb account that contained a photo of Do. A profile on document-hosting site Issuu under the username “dpfkscom” that links to the MrDeepFakes website also lists its location as Canada.
That Airbnb photo matched the profile of an Ontario pharmacist named David Do who works at Markham Stouffville Hospital and Uxbridge Hospital in the Oak Valley Health network.
A 2020 Instagram post from Oak Valley Health features a picture of Do, quoting him saying that his role as a pharmacist “moves beyond dispensing medications,” and that he’s part of a team that is “often involved in medication management … and ensuring safe practices.”
Oak Valley Health, Do’s employer, told CBC News it had initiated an internal investigation into the matter “in consultation with legal counsel.”
“We are unable to make further comment, but want to make clear that Oak Valley Health unequivocally condemns the creation or distribution of any form of violent or non-consensual sexual imagery.”
The Ontario College of Pharmacists’ code of ethics states that no member should engage in “any form of harassment,” including “displaying or circulating offensive images or materials.”
The Ontario College of Pharmacists told CBC News the “allegations you have shared with us are extremely serious” and that it was “taking immediate steps to look into this matter further and determine the necessary actions we need to take to protect the public.”
Do has taken steps to maintain his own privacy. Property records show that he and his partner own a home in the Toronto area. The property is blurred on Google Maps, a privacy feature that is available upon request.
Private citizens, violent deepfaked videos
The faces on MrDeepFakes are, in many cases, household names: Taylor Swift, actor Emma Watson, former prime minister Justin Trudeau, U.S. politician Alexandria Ocasio-Cortez, Ivanka Trump and climate activist Greta Thunberg, just to name a few.
Some of the videos are violent in nature. “Scarlett Johansson gets strangled to death by creepy stalker” is the title of one video; another, called “Rape me Merry Christmas,” features Taylor Swift.
The list of victims includes Canadian-American wrestler Gail Kim, who was inducted into the TNA Wrestling Hall of Fame in 2016 and has made recent appearances on the reality-TV shows The Amazing Race Canada and The Traitors Canada.
Kim hadn’t seen the videos of her on MrDeepFakes, because “it’s scary to think about.”
“A lot of young people commit suicide because they’re shamed for things that are absolutely not them,” Kim told CBC News in an interview. “I wish there was some kind of way to control these things.”
Not all of the people featured on the site were celebrities. The site’s rules stated that only social media influencers with more than 120,000 Instagram followers were acceptable to deepfake, and that non-celebrities could not be deepfaked without consent.
Yet CBC News found deepfake porn of a woman from Los Angeles who has just over 29,000 Instagram followers.
“Every time it is being used on some really big-name celebrity like Taylor Swift, it emboldens people to use it on much smaller, much more niche, more private individuals like me,” said the YouTuber Sarah Z.
Users also post about getting videos of their partners or wives. One person wrote that they want “an expert to deepfake my partner” and to “please [direct message] me requirements and cost.”
Rapid evolution in deepfake porn
Deepfake porn technology has made significant advances since its emergence in 2017, when a Reddit user named “deepfakes” began creating explicit videos based on real people.
“In 2017, these [videos] were pretty glitchy. You could see a lot of glitchiness particularly around the mouth, around the eyes,” said Suzie Dunn, a law professor at Dalhousie University in Halifax, N.S.
In 2025, she said, the technology has progressed to the point where “someone who’s highly skilled can make an almost indiscernible sexual deepfake of another person.”
Creating a high-quality deepfake requires top-shelf computer hardware, time, effort and money for electricity. According to a 2025 preprint study by researchers at Stanford University and UC San Diego, discussion around assembling large datasets of victims’ faces — often thousands of images — accounts for one-fifth of all forum threads on MrDeepFakes.
These high-quality deepfakes can cost $400 or more to purchase, according to posts seen by CBC News.
Making a digital fake doesn’t necessarily require training a bespoke AI model. There are now countless “nudify” apps and websites that can do face swaps in seconds. They’re getting easier to use — and many of them are free.
According to a report by cybersecurity firm Security Hero, there has been a 550 per cent increase in the number of deepfakes from 2019 to 2023.
In 2023, the firm found more than 95,000 deepfake videos online, 99 per cent of which were deepfake porn, primarily of women.
In 2025, MrDeepFakes hosted more than 70,000 deepfake porn videos.
“They see the women in these images as digital objects,” said Dunn. “In a really serious way. It really discourages people from going into politics, even being a celebrity.”
Profiting from deepfake porn
Even early in the site’s existence, it was evident David Do was grappling with the growth of MrDeepFakes and the money it brought in through advertising.
In 2018, an account linked to Do’s MrDeepFakes email — DPFKScom@gmail.com — with the username “Aznrico” posted a message on a web-hosting forum asking for help “identifying bottlenecks and reasons for slowness” for an adult site “that gets around 15-20k visitors per day.”
In March 2025, according to web data platform Semrush, MrDeepFakes got more than 18 million visits.
A cryptocurrency trading account for Aznrico later changed its username to “duydaviddo.”
An account with the username Aznrico also posted on an auto forum in 2009: “my car is the 06 lancer ralliart,” a reference to the Mitsubishi Lancer Ralliart. Public records obtained by CBC News show that a 2006 Mitsubishi Lancer Ralliart is registered to Do’s father, and the car appears in Google Maps imagery of Do’s parents’ house from 2009 onwards.
In 2020, another account linked to Do — it originally had the username “dj01039,” an abbreviation of davidjames01039@gmail.com, an email linked to a PayPal donation button that appeared on MrDeepFakes in 2019 — posted that he was looking for “business solutions” for the adult site where he was the “webmaster,” and that it made up to $7,000 per month.
An account on another forum with the username “dj01039” was registered with the DPFKScom@gmail.com address, records show.
But DPFKS’s influence on the website went beyond hosting and IT. According to their account on the MrDeepFakes forum, DPFKS posted 161 deepfake porn videos.
On the site’s forum, one user asked for a deepfake of Korean pop star Yeri. The user Paperbags — formerly DPFKS — posted that they had “already made 2 of her. I am moving onto other requests.”
In a May 2018 exchange, a user requested a Sandra Bullock video. Paperbags responded, “Okay i’ll see what I can do. Maybe in a couple weeks.” Four weeks later, he posted, “currently working on Sandra Bullock” and shared a link to a video titled “Sandra Bullock Nude Ass Pounding.”
Evolving legal landscape
Despite the popularity of deepfake porn and widely available tools to make it, laws in Canada and internationally are just starting to catch up.
Creating and sharing non-consensual deepfake AI porn of adults is not a criminal offence in Canada, but laws around child pornography have recently been applied by courts to encompass AI deepfakes, said Moira Aikenhead, a lecturer at the University of British Columbia’s Allard School of Law.
A recently passed law in British Columbia makes it easier for adult victims to pursue civil recourse such as takedowns and damages. Similar laws exist in Prince Edward Island, New Brunswick and Saskatchewan.
“Only the federal government can pass criminal legislation,” said Aikenhead, and so “this move would have to come from Parliament.”
In the U.S., no criminal law exists at the federal level, but the House of Representatives overwhelmingly passed the Take It Down Act, a bipartisan bill criminalizing sexually explicit deepfakes, in April. A patchwork of laws exists at the state level.
In the U.K. and Australia, sharing non-consensual explicit deepfakes was made a criminal offence in 2023 and 2024, respectively.
But victims of deepfake porn in Canada are still waiting for recourse beyond civil litigation.
“There’s only so much that I as an individual can do,” said Sarah Z. “Any change will have to be legislative and systemic.”
Do you have something that you think needs investigating? You can send tips to eric.szeto@cbc.ca.