Google warns its staff about chatbots including Bard: Report

According to the report, Alphabet has advised its employees not to enter confidential material into AI chatbots, the people said and the company confirmed, citing a long-standing policy on safeguarding information.

Chatbots, among them Bard and ChatGPT, are human-sounding programs that use so-called generative artificial intelligence to hold conversations with users and answer myriad prompts. The report states that human reviewers can read the chats, and researchers have found that similar AI models can reproduce data absorbed during training, creating a leak risk.

Alphabet has also cautioned its engineers to avoid direct use of computer code that chatbots can generate, some of the people told Reuters.

When Reuters asked for comment, the company said Bard may make unwanted code suggestions, but it still helps programmers. Google also said that it aims to be transparent about the limitations of its technology.

The concerns show how Google wishes to avoid business harm from software it launched in competition with ChatGPT.

At stake in Google’s race against ChatGPT’s backers, OpenAI and Microsoft Corp., are billions of dollars of investment and still untold advertising and cloud revenue from new AI programs.

Google’s caution also reflects what is becoming a security standard for corporations: warning personnel about using publicly available chat programs.

A growing number of businesses around the world have set up guardrails on AI chatbots, among them Samsung, Amazon.com and Deutsche Bank, the companies told Reuters. Apple, which did not return a request for comment, reportedly has as well.

According to a survey of nearly 12,000 respondents, including from top US-based companies, as of January nearly 43 percent of professionals were using ChatGPT or other AI tools, often without disclosing it to their bosses.

Google told Reuters it has had detailed talks with Ireland’s Data Protection Commission and is addressing regulators’ questions, following a report from Politico on Tuesday that the company was postponing Bard’s EU launch this week. Further details regarding the privacy impact of the chatbot were pending.

Concern about sensitive information

Such technology could draft emails, documents, even software, promising to speed up tasks. However, this content may also contain misinformation, sensitive data, or copyrighted passages from the “Harry Potter” novels.

According to a Google privacy notice updated on June 1st: “Do not include confidential or sensitive information in your Bard conversations.”

Some companies have developed software to address such concerns. For example, Cloudflare, which defends websites against cyberattacks and offers other cloud services, is marketing the ability for businesses to tag and restrict certain data from flowing externally.
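As a purely illustrative sketch of the kind of outbound filtering such tools perform (not Cloudflare’s actual product, and with made-up patterns), a company could scan prompts for strings that look confidential and block them before they ever reach a public chatbot:

```python
import re

# Hypothetical patterns a company might treat as confidential; real
# data-loss-prevention products use far richer classifiers than regexes.
BLOCKED_PATTERNS = {
    "api_key": re.compile(r"\b(?:sk|key)-[A-Za-z0-9]{16,}\b"),
    "internal_host": re.compile(r"\b[\w.-]+\.corp\.example\.com\b"),
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def scan_prompt(prompt: str) -> list[str]:
    """Return the names of any confidential patterns found in a prompt."""
    return [name for name, pat in BLOCKED_PATTERNS.items() if pat.search(prompt)]

def gate_prompt(prompt: str) -> str:
    """Refuse to forward a prompt that matches a blocked pattern."""
    hits = scan_prompt(prompt)
    if hits:
        raise ValueError(f"prompt blocked, matched: {', '.join(hits)}")
    return prompt  # deemed safe to send to the external chatbot
```

For example, `scan_prompt("deploy to build01.corp.example.com")` flags the internal hostname, while a harmless question passes through `gate_prompt` unchanged.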

Google and Microsoft are also offering conversational tools to business customers that will come with higher price tags but refrain from absorbing data into public AI models. The default setting in Bard and ChatGPT is to save users’ conversation history, which users can choose to delete.

Yusuf Mehdi, Microsoft’s consumer chief marketing officer, said it is “understandable” that companies would not want their employees to use public chatbots for work.

“Companies are taking a methodically conservative approach,” Mehdi said, explaining how Microsoft’s free Bing chatbot compares with its enterprise software. “There, our policies are much more strict.”

Microsoft declined to comment on whether it has a broad ban on employees entering confidential information into public AI programs, although a separate executive there told Reuters that he personally restricted their use.

Cloudflare CEO Matthew Prince said typing confidential matters into chatbots was “like turning a bunch of PhD students loose on all your private records.”

(with inputs from Reuters)


Updated: June 16, 2023, 07:23 AM IST