Deepfaking Prompts Concerns as TV Doctors Fall Prey to Health Fraud
Michael Mosley is among the TV doctors who have been ‘deepfaked’ on social media to promote health fraud, an investigation has found.
According to the British Medical Journal, Mr Mosley and other doctors, including Hilary Jones and Rangan Chatterjee, are being used to promote products that claim to cure high blood pressure and diabetes, and to sell hemp gummies.
‘Deepfaking’ is the use of artificial intelligence to map the digital likeness of someone’s face and place it onto a body that is not their own, in order to create videos.
In a report published on Wednesday, the BMJ revealed that a video circulating on Facebook showed Dr. Hilary Jones appearing to promote a drug that would cure high blood pressure during an appearance on the Lorraine programme.
The BMJ said more such videos had been discovered, but it did not specify how many the investigation had found.
Henry Ajder, an expert on deepfake technology, told the publication: “Over the past year, we have seen a huge growth in this form of deepfake fraud, particularly on YouTube and X. Many are selling fraudulent cryptocurrencies, investment schemes or medical products, with varying degrees of sophistication.”
Dr. Jones employs a social media specialist to scour the web for videos that misrepresent his views and to have them taken down.
“Even if you do that, they’ll pop back up the next day under a different name,” he said.
According to Mr Ajder, deepfakes can be difficult to recognise because the technology has improved.
He added: “It’s difficult to quantify how effective this new form of deepfake fraud is, but the growing number of videos now circulating suggests that bad actors are having some success.”
John Cormack, a retired doctor from Essex who worked for The BMJ, said: “The bottom line is that it’s much cheaper to spend your money on making videos than on doing research and coming up with new products and bringing them to market in the conventional way.”
A spokesperson for Meta, the company that owns both Facebook and Instagram and which hosted many of the videos Cormack found, told The BMJ: “We will examine the examples provided by The BMJ.
“We don’t allow content that intentionally misleads or attempts to deceive others, and we’re continually working to improve detection and enforcement. We encourage anyone who sees content that may violate our policies to report it so we can investigate and take action.”