Sunday, April 14, 2024

Google does not trust Bard and asks its employees not to share sensitive information with the chatbot


Google employees have been informed by the company about the risks of using chatbots in the workplace.

Bard is Google’s AI-based chatbot, aimed at competing with ChatGPT

Google does not seem to be in favor of its employees using chatbots in the workplace, including its own service, Bard. At least that is what sources close to the company told Reuters, who say Google has warned its employees about the risks of these tools.

Above all, Google wants its staff to avoid sharing confidential information with chatbots such as Bard or ChatGPT, in order to reduce the risk of leaks related to its projects or technologies under development.

Google is concerned that its employees share sensitive information with Bard

Despite having advertised Bard as a safe, privacy-respecting tool, Google would prefer that its employees limit their use of the chatbot at work, especially when it comes to sharing confidential information or sensitive data.

In the past, some chatbots have been shown to "absorb" the data obtained during their conversations with users and use it in training, later repeating that data in entirely unrelated conversations, thereby increasing the risk of sensitive-information leaks. On top of that, in some cases content shared with chatbots is reviewed by humans, which poses yet another privacy risk.

Similarly, Google is not entirely convinced of the effectiveness of these tools for certain tasks, such as programming. For this reason, employees have reportedly been advised to avoid using code generated by chatbots, as it may contain errors.

Apple and Samsung also limited the use of chatbots to their employees

Google is the latest to "suggest" that users avoid Bard and other chatbots during their work activities, or at least refrain from sharing confidential information with them. However, it is not the only one.

Samsung was one of the first to prohibit its employees from using ChatGPT, and decided to start developing its own tool for internal use by its staff.

Shortly after, Apple took similar steps, restricting its employees' use of ChatGPT and other similar AI-based tools, also out of fear of information leaks.

Google's case is even more striking, since the company does not even seem sure that its own tool can keep secret the data obtained during conversations with users.


