It’s therefore little surprise to witness Musk’s fury upon hearing Apple’s announcement yesterday that it is “integrating ChatGPT access into experiences within iOS 18, iPadOS 18, and macOS Sequoia, allowing users to access its expertise—as well as its image- and document-understanding capabilities—without needing to jump between tools.”
Musk aired his concerns about the partnership on his social media platform, X, formerly known as Twitter.
“If Apple integrates OpenAI at the OS level, then Apple devices will be banned at my companies,” he wrote. “That is an unacceptable security violation.
“And visitors will have to check their Apple devices at the door, where they will be stored in a Faraday cage.”
A Faraday cage is an enclosure that blocks electromagnetic fields, preventing devices inside it from sending or receiving signals.
‘New standard for AI privacy’
Apple outlined a raft of privacy measures related to AI at this week’s Worldwide Developers Conference in California.
While CEO Tim Cook said the company’s new AI strategy and features, named ‘Apple Intelligence,’ are the “next big step” for his company, the organization also laid out what it believes is a “new frontier” for privacy in artificial intelligence.
The cloud intelligence system, named Private Cloud Compute (PCC), was created to “make sure that personal user data sent to PCC isn’t accessible to anyone other than the user—not even to Apple.”
Apple’s ten-page, nearly 4,000-word announcement of PCC also promises “more to come” from the company on user privacy and forms only part of its AI security strategy.
Alongside the reveal of the PCC system—which Apple said independent experts can inspect to verify privacy—the company, with an approximately $3 trillion market cap, specifically addressed concerns relating to OpenAI.
It said: “Privacy protections are built in for users who access ChatGPT—their IP addresses are obscured, and OpenAI won’t store requests. ChatGPT’s data-use policies apply for users who choose to connect their account.”
An announcement by OpenAI mirrored the pledges: “Privacy protections are built in when accessing ChatGPT within Siri and Writing Tools—requests are not stored by OpenAI, and users’ IP addresses are obscured.”
Musk is seemingly unconvinced, particularly because Apple is turning to an outside provider instead of building its own AI.
He wrote on X: “It’s patently absurd that Apple isn’t smart enough to make their own AI, yet is somehow capable of ensuring that OpenAI will protect your security and privacy. Apple has no clue what’s actually going on once they hand your data over to OpenAI. They’re selling you down the river.”
Apple and OpenAI did not immediately respond to Fortune’s request for comment.
On a roll with his criticism of the partnership, Musk went on to post a meme depicting Apple and OpenAI sharing data gathered from iPhones, before adding: “Here’s the problem with ‘agreeing’ to share your data: nobody actually reads the terms and conditions.”
Grok phone
Musk fans are used to the billionaire teasing them about what he might be working on next—and for years, they have theorized about whether a Musk entity will release a smartphone.
The partnership between Apple—which the SpaceX owner has found himself in a bunfight with before—and OpenAI has reignited these rumors, which were stoked by Musk himself.
One user on X said they wanted a Grok phone, to which Musk replied: “If Apple actually integrates woke nanny AI spyware into their OS, we might need to do that!”
When another user suggested X might link up with an Android phone maker for a product, Musk added: “It is not out of the question.”