Apple Allegedly Restricted the Use of ChatGPT Among Its Employees to Avoid a Samsung-like Massive Leak

The Company is Also Developing its Own Language Models

ChatGPT is on everyone's lips, and with good reason, as it represented a huge leap in the way we interact with artificial intelligence and the way we use it. That said, not everyone wants to jump on its bandwagon, and the proof of that is in Apple.

According to The Wall Street Journal, Apple has restricted the use of ChatGPT among its employees. That's not all! The company also prevents them from using other artificial intelligence tools, such as GitHub's Copilot, which helps developers write code more quickly and easily.

But why on earth is Apple doing this? The answer is simple: it wants to avoid making the same mistake as Samsung. Recall that in April, it was reported that Samsung employees had shared confidential company information with ChatGPT, including source code and a recording of a private meeting. That information ended up in the hands of OpenAI.

That's not all: the report also states that Apple is developing its own language models similar to the one behind ChatGPT. The intention is for employees to use internal tools once they are available, both to avoid leaks and to have a tool tailored to their needs.

It should be noted that it is unknown exactly how Apple is restricting the use of ChatGPT among its employees. That is, it is unclear whether the tool is completely prohibited or whether limits have simply been put in place to prevent the leakage of sensitive information.

What do you think of this news? Do you think Apple is doing the right thing? Let us know in the comments.
