Viral AI Body Swaps Hit 14M Views, Deepening Deepfake Risks

Viral full‑body AI swaps using Kling AI’s motion control surpass 14M views — experts warn these harder‑to‑detect deepfakes could fuel impersonation, non‑consensual imagery and disinformation.

  • Viral full-body AI swaps replaced a creator’s face and body with actors, reaching over 14 million views on X.
  • Creators used Kling AI’s 2.6 Motion Control to produce seamless whole-body character swaps of actors like Millie Bobby Brown, David Harbour, and Finn Wolfhard.
  • Researchers say full-body deepfakes remove many visual cues used to detect earlier face-only manipulations.
  • Experts warn the tools could fuel impersonation scams, non-consensual imagery, political disinformation, and corporate fraud.
  • Technologists urge investment in intrinsic detection methods, platform review, and clear liability and disclosure rules.

A viral clip this week showed Brazilian creator Eder Xavier swapping his face and body with actors from Stranger Things using Kling AI’s 2.6 Motion Control, and the versions posted to social platforms have exceeded 14 million views on X. The post drew attention from industry figures, including a16z partner Justine Moore, who shared the video online. The clips demonstrate a shift from face-only deepfakes to full-body synthetic media.


The viral examples coincide with broader model development, including newer systems such as Veo 3.1, Nano Banana, FaceFusion, and OpenAI’s Sora 2, which expand access to high-quality generated video. Researchers warn these tools can quickly move beyond demos into widespread misuse. Emmanuelle Saliba of GetReal Security said “The floodgates are open. It’s never been easier to steal an individual’s digital likeness—their voice, their face—and now, bring it to life with a single image.”

Academic researchers emphasize technical escalation. Yu Chen, a professor at Binghamton University, said “Full-body character swapping represents a significant escalation in synthetic media capabilities.” He noted these systems must handle pose estimation, skeletal tracking, clothing transfer, and natural motion across the whole body. Chen added that older detection methods relied on boundary inconsistencies and temporal artifacts that full-body swaps can hide.

Security experts caution about harms beyond fraud, including non-consensual explicit content, political disinformation, and corporate deception. Observers on social media expressed alarm — one commentator tweeted “We’re not ready.” Researchers recommend platforms build automated detection, human review capacity, and escalation paths while policymakers consider liability and disclosure rules.

