According to new data from Netskope, employees are still sharing confidential company information with chatbots such as ChatGPT and AI writing tools, despite the obvious risk of leaks or breaches.
The study, which covers 1.7 million users across 70 international organisations, found an average of 158 incidents of source code being posted to ChatGPT per 10,000 users each month, making source code the most significant organisational exposure, ahead of other kinds of sensitive material.
Regulated data (18 incidents per 10,000 users per month) and intellectual property (4 incidents per 10,000 users per month) are posted to ChatGPT far less often, but the figures suggest that many engineers are simply unaware of the harm a source code leak can cause.
Netskope also highlighted the surge in interest in artificial intelligence, along with ongoing data exposures that could create weak points for enterprises.
According to the data, use of GenAI apps has grown 22.5% over the last two months, with large organisations of more than 10,000 users running an average of five AI apps daily.
ChatGPT takes the lead, with eight times as many daily active users as any other GenAI app.
ChatGPT (84%), Grammarly (9.9%) and Bard (4.5%) make up the top three generative AI apps used by businesses worldwide, with Bard growing at a strong 7.1% per week compared with ChatGPT’s 1.6%.
Although many will argue that uploading source code or other sensitive information can be prevented, Ray Canzanese, director of threat research at Netskope, says such uploads are ‘inevitable.’
“Organizations should focus on evolving their workforce awareness and data policies to meet the needs of employees using AI products productively,” said James Robinson, the company’s Deputy Chief Information Security Officer.
The company advises admins and IT teams to deploy modern data loss prevention (DLP) solutions, provide regular user training, and block access to unnecessary apps that pose a disproportionate risk.
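To illustrate the kind of check a DLP control might apply, here is a minimal Python sketch that scans a prompt for source code fragments and common secret formats before it is forwarded to a GenAI app. All function names and patterns are illustrative assumptions, not Netskope's product or any specific vendor's API; real DLP tools use far richer detection, such as ML classifiers and exact-match fingerprinting.

```python
import re

# Illustrative patterns only (assumption, not a vendor ruleset): a few cheap
# signals that a prompt may contain source code or credentials.
SUSPECT_PATTERNS = {
    "source_code": re.compile(r"\b(def |class |#include\s*<|public\s+static\s+void|import\s+\w+)"),
    "private_key": re.compile(r"-----BEGIN (RSA |EC )?PRIVATE KEY-----"),
    "aws_access_key": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}


def scan_prompt(text: str) -> list[str]:
    """Return the names of any sensitive-data categories matched in the text."""
    return [name for name, pattern in SUSPECT_PATTERNS.items() if pattern.search(text)]


def submit_to_genai(text: str) -> None:
    """Block prompts that appear to contain sensitive material; otherwise allow them."""
    findings = scan_prompt(text)
    if findings:
        # In practice this might log to a SIEM, coach the user, or require approval
        # rather than hard-blocking the request.
        raise PermissionError(f"Prompt blocked by DLP policy: matched {findings}")
    print("Prompt allowed; forwarding to the GenAI app...")


if __name__ == "__main__":
    try:
        submit_to_genai("def rotate_keys(secret):\n    return secret[::-1]")
    except PermissionError as err:
        print(err)
```

Running the example blocks the prompt because it matches the hypothetical source code pattern, which mirrors the coaching-or-blocking workflow the advice above describes.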