Apple Bans ChatGPT Access for Employees Amid Data Leak Concerns: Report

By default, ChatGPT stores users’ chats, which are later used to train AI models or inspected by the company to ensure compliance with its policies.

Apple joins a growing list of companies that have asked their employees to stop using generative AI-based chatbots, including OpenAI’s ChatGPT.

Following Samsung, Apple joins a growing list of companies that have asked their employees to stop using generative AI-based chatbots, including OpenAI’s ChatGPT, for internal company matters, in order to avoid leaks of confidential information.

According to a report in The Wall Street Journal, Apple has told employees that generative AI cannot be used for work purposes. The company has also barred the use of other AI-based tools such as GitHub Copilot, which is owned by Microsoft and helps users automate code writing.

By default, ChatGPT stores users’ chats, which are later used to train AI models or inspected by the company to ensure compliance with its policies. OpenAI has recently enabled users to disable chat history, although it still retains new conversations for 30 days before deleting them.

Like other companies, Apple is likely concerned that employees using generative AI for daily tasks such as writing code could accidentally leak internal company information to the outside world, given the advanced capabilities of chatbots such as ChatGPT. This risk is the most probable reason for the ban.

In related news, earlier this month Samsung reportedly banned the use of generative AI, including ChatGPT, on office devices such as computers, tablets and phones. The ban also applies to personal devices connected to internal company networks, Bloomberg noted. In Samsung’s case, internal data had reportedly already been leaked through ChatGPT.