Viral AI body swaps hit 14M views, deepening deepfake risks

Viral full‑body AI swaps using Kling AI’s motion control surpass 14M views — experts warn these harder‑to‑detect deepfakes could fuel impersonation, non‑consensual imagery and disinformation.

  • Viral full-body AI swaps replaced a creator’s face and body with those of actors, reaching over 14 million views on X.
  • Creators used Kling AI’s 2.6 Motion Control to produce seamless whole-body character swaps of actors like Millie Bobby Brown, David Harbour, and Finn Wolfhard.
  • Researchers say full-body deepfakes remove many visual cues used to detect earlier face-only manipulations.
  • Experts warn the tools could fuel impersonation scams, non-consensual imagery, political disinformation, and corporate fraud.
  • Technologists urge investment in intrinsic detection methods, platform review, and clear liability and disclosure rules.

A viral clip this week showed Brazilian creator Eder Xavier swapping his face and body with actors from Stranger Things using Kling AI’s 2.6 Motion Control, and versions posted to social platforms have exceeded 14 million views on X. The post drew attention from industry figures, including a16z partner Justine Moore, who shared the video online. The clips demonstrate a shift from face-only deepfakes to full-body synthetic media.


The viral examples coincide with broader model development, including newer systems such as Veo 3.1, Nano Banana, FaceFusion, and OpenAI’s Sora 2, which expand access to high-quality generated video. Researchers warn these tools can quickly move beyond demos into widespread misuse. Emmanuelle Saliba of GetReal Security said, “The floodgates are open. It’s never been easier to steal an individual’s digital likeness—their voice, their face—and now, bring it to life with a single image.”

Academic researchers emphasize the technical escalation. Yu Chen, a professor at Binghamton University, said, “Full-body character swapping represents a significant escalation in synthetic media capabilities.” He noted these systems must handle pose estimation, skeletal tracking, clothing transfer, and natural motion across the whole body. Chen added that older detection methods relied on boundary inconsistencies and temporal artifacts that full-body swaps can hide, as illustrated in the sketch below.
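To make Chen’s point concrete, the following minimal Python sketch shows the kind of frame-to-frame signal that simple, older-style checks relied on: it flags frames with unusually large pixel changes between consecutive frames. The video path "clip.mp4" and the 3-sigma threshold are hypothetical assumptions, OpenCV and NumPy are assumed to be installed, and real deepfake detectors use far more sophisticated models than this illustration.

```python
# Minimal sketch (not a real detector): flag abrupt frame-to-frame changes,
# the kind of temporal inconsistency older face-swap checks looked for.
import cv2
import numpy as np

def frame_difference_scores(path: str) -> list[float]:
    """Return the mean absolute pixel difference between consecutive frames."""
    cap = cv2.VideoCapture(path)
    scores = []
    ok, prev = cap.read()
    while ok:
        ok, frame = cap.read()
        if not ok:
            break
        diff = cv2.absdiff(frame, prev)      # per-pixel change vs. previous frame
        scores.append(float(np.mean(diff)))  # one score per frame transition
        prev = frame
    cap.release()
    return scores

if __name__ == "__main__":
    scores = frame_difference_scores("clip.mp4")  # hypothetical local file
    if scores:
        # Heuristic: spikes well above the average change can indicate splices
        # or per-frame edits; full-body swaps tend not to leave such spikes.
        threshold = np.mean(scores) + 3 * np.std(scores)
        flagged = [i for i, s in enumerate(scores) if s > threshold]
        print(f"Frame transitions with abrupt changes: {flagged}")
```

As Chen notes, whole-body synthesis can keep such frame-level statistics smooth, which is why researchers are pushing for intrinsic detection methods rather than artifact hunting alone.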

Security experts caution about harms beyond fraud, including non-consensual explicit content, political disinformation, and corporate deception. Observers on social media expressed alarm; one commentator tweeted, “We’re not ready.” Researchers recommend that platforms build automated detection, human review capacity, and escalation paths while policymakers consider liability and disclosure rules.


