Fearing leaks, Apple restricts ChatGPT use among employees

An AI-generated cartoon depiction of a chatbot being crossed out. (credit: Benj Edwards / Stable Diffusion)

According to internal sources and company documents reviewed by The Wall Street Journal, Apple has restricted its employees’ use of ChatGPT and AI coding tools such as GitHub Copilot over concerns that confidential data could leak to outside parties. Meanwhile, Apple is also reportedly developing similar AI technology of its own.

ChatGPT is a conversational large language model (LLM) developed by Microsoft-backed OpenAI that is capable of tasks ranging from answering questions and writing essays to assisting with programming chores. Currently, the ChatGPT AI model only runs on OpenAI or Microsoft servers and is accessible through the Internet.

Apple’s decision to limit the use of external AI tools stems from the way these services are built: user prompts are sent back to the developers’ servers for cloud processing and may also be retained to help improve future AI models, which means confidential data pasted into them could be exposed.
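To make that data flow concrete, here is a minimal sketch of a request to OpenAI’s public chat completions HTTP endpoint, not any Apple-internal or Copilot-specific code. The prompt text and the environment variable holding the API key are placeholders; the point is simply that whatever an employee types into the prompt leaves the local machine and is processed on OpenAI’s servers.

```python
import os
import requests

# Illustrative sketch only: a single request to OpenAI's hosted
# chat completions API. Any text placed in the prompt below --
# including proprietary source code or internal documents --
# is transmitted over the Internet to OpenAI's servers.
API_URL = "https://api.openai.com/v1/chat/completions"
API_KEY = os.environ["OPENAI_API_KEY"]  # placeholder credential

payload = {
    "model": "gpt-3.5-turbo",
    "messages": [
        # This content leaves the device for remote processing.
        {"role": "user", "content": "Explain what this (hypothetical) internal code snippet does..."},
    ],
}

response = requests.post(
    API_URL,
    headers={"Authorization": f"Bearer {API_KEY}"},
    json=payload,
    timeout=30,
)
print(response.json()["choices"][0]["message"]["content"])
```

Because the model runs entirely on remote servers, there is no local-only mode for tools like ChatGPT, which is what drives policies like Apple’s.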
