US, UK Employees Risk Data Leaks Using Chinese GenAI Tools, Study Finds

  • Employee use of Chinese generative AI tools in the US and UK is widespread and often unsanctioned.
  • Researchers found that over 1,000 of 14,000 monitored employees accessed China-hosted AI platforms in a month, with 535 incidents of sensitive data being uploaded.
  • Source code, legal documents, and personal data were among the types of sensitive information shared.
  • Permissive data policies of Chinese GenAI services raise concerns about confidentiality and compliance.
  • Some companies are adopting monitoring and policy enforcement tools to manage risks related to AI use.

A new study by Harmonic Security shows that employees in the United States and United Kingdom are using Chinese generative Artificial Intelligence (GenAI) platforms at significant levels, often without formal approval or oversight from company security teams.


Over a 30-day period, Harmonic Security analyzed the behavior of 14,000 employees across several organizations. Nearly 8% accessed China-based GenAI platforms, such as DeepSeek, Kimi Moonshot, Baidu Chat, Qwen (by Alibaba), and Manus. The study identified 535 separate incidents where users uploaded confidential or sensitive data, including business records, engineering documentation, and personal information.

Of the total 17 megabytes of company content uploaded, about one-third was related to software source code or technical documents, with the remainder involving confidential financial reports, merger documents, legal contracts, and customer details. Harmonic Security noted that DeepSeek was involved in 85% of all reported incidents.

Many of these AI platforms, the study notes, operate under unclear or permissive data policies; in some cases, their terms of service permit user-uploaded data to be reused for further AI training. Employee use of these tools therefore poses confidentiality and regulatory-compliance risks, especially for firms handling sensitive customer or proprietary information.

To address these concerns, Harmonic Security has launched policy enforcement technology that provides real-time monitoring of employee AI use and detection of unsanctioned data uploads. Companies can restrict access to certain apps by location, limit the types of information uploaded, and prompt users with warnings or information about company policy.
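To illustrate the general idea, the kind of policy check described above can be sketched as a simple rule engine that blocks or warns on uploads to listed GenAI domains. This is a minimal, hypothetical sketch: the domain list, content patterns, and block/warn/allow actions are assumptions for illustration, not Harmonic Security's actual (proprietary) implementation.

```python
import re

# Hypothetical examples of restricted GenAI upload destinations.
RESTRICTED_DOMAINS = {"chat.deepseek.com", "kimi.moonshot.cn"}

# Illustrative patterns for sensitive content categories named in the study.
SENSITIVE_PATTERNS = {
    "source_code": re.compile(r"\b(def |class |#include|import )"),
    "financial": re.compile(r"\b(revenue|merger|acquisition)\b", re.IGNORECASE),
}

def evaluate_upload(domain: str, content: str) -> str:
    """Return 'block', 'warn', or 'allow' for an attempted upload."""
    if domain not in RESTRICTED_DOMAINS:
        return "allow"
    for _label, pattern in SENSITIVE_PATTERNS.items():
        if pattern.search(content):
            # Sensitive data headed to a restricted app: stop the upload.
            return "block"
    # Restricted app but no sensitive match: remind the user of policy.
    return "warn"
```

A real deployment would sit inline with browser or network traffic and use far richer classifiers, but the decision flow (destination check, content classification, then block/warn/allow) follows the same shape.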


The research indicates that awareness alone is not preventing risky use of external GenAI tools. Harmonic Security reports that about one in twelve employees works with Chinese GenAI platforms, often without knowing about data residency or potential risk exposure.

Further information about Harmonic Security’s efforts to protect sensitive data and enforce company AI use policies is available at harmonic.security.
