Apple dares tech wizards to breach its AI cloud in $1m challenge
Tech giant Apple is dangling a million dollars in front of any hacker, sorry, security researcher, who can breach the top-of-the-range defenses of ‘Private Cloud Compute,’ the server system built for its Apple Intelligence.
Apple Intelligence, according to the company, is a built-in artificial intelligence system coming to the iPhone 16, 15 Pro and 15 Pro Max next week.
The company announced Thursday that it is inviting ‘anyone with interest and a technical curiosity’ to perform ‘their own independent verification of our claims’, and that whoever manages to breach the product’s integrity stands to earn up to a million dollars.
The ‘Private Cloud Compute’ comprises the servers that receive and process Apple Intelligence requests when an AI task is too complex for on-device processing. The system, according to Apple, features end-to-end encryption and deletes a user’s request as soon as the task is fulfilled.
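Apple has not published the code behind that pipeline, but the shape of the guarantee can be sketched in a few lines of Swift. The sketch below is purely illustrative: the StatelessAIServer type, the single shared key, and the echo ‘task’ are hypothetical stand-ins, not Apple’s design.

```swift
import Foundation
import CryptoKit

// A toy, self-contained sketch of the ‘process, then forget’ pattern Apple
// describes: the request arrives encrypted, is decrypted only for the
// duration of the task, and no copy of it outlives the response. This is
// NOT Apple's implementation; the type, key handling, and task are
// hypothetical stand-ins.
struct StatelessAIServer {
    // PCC negotiates keys per request; a single symmetric key is used
    // here purely to keep the example runnable on its own.
    let key: SymmetricKey

    func handle(encryptedRequest: Data) throws -> Data {
        // Decrypt the user's request.
        let box = try ChaChaPoly.SealedBox(combined: encryptedRequest)
        let plaintext = try ChaChaPoly.open(box, using: key)

        // Stand-in for the actual AI task.
        let reply = Data("echo: ".utf8) + plaintext

        // Encrypt the reply. When this function returns, `plaintext` and
        // `reply` go out of scope; nothing is logged or written to disk,
        // mirroring the deletion guarantee described above.
        return try ChaChaPoly.seal(reply, using: key).combined
    }
}

let key = SymmetricKey(size: .bits256)
let server = StatelessAIServer(key: key)
let request = try! ChaChaPoly.seal(Data("summarize my notes".utf8), using: key).combined
let response = try! server.handle(encryptedRequest: request)
let opened = try! ChaChaPoly.open(try! ChaChaPoly.SealedBox(combined: response), using: key)
print(String(decoding: opened, as: UTF8.self)) // "echo: summarize my notes"
```

The property being mimicked is that decrypted request data exists only for the duration of a single call, with nothing stored afterwards, which is roughly the guarantee Apple claims PCC enforces end to end.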
There are different payouts for different discoveries, but the full $1 million goes to anyone who can run code on the system undetected and access its most sensitive parts.
The AI-powered product was unveiled in September, revealing new features for sorting messages, generative writing and creating unique emojis.
But only those with the high-end iPhone 15 smartphones and the new iPhone 16 will have access to the highly-anticipated platform.
Apple CEO Tim Cook said it marks ‘a new chapter in Apple innovation’.
Largely, Apple Intelligence is focused on so-called ‘generative’ AI models, which allow users to create text or images from prompts. And Apple says it wants to make sure the system is secure.
‘To further encourage your research in Private Cloud Compute, we’re expanding Apple Security Bounty to include rewards for vulnerabilities that demonstrate a compromise of the fundamental security and privacy guarantees of PCC,’ the company shared in the announcement.
Apple had previously only allowed third-party auditors into its Private Cloud Compute, but now anyone can take a stab at it.
Those up to the challenge also have access to a security guide and a virtual research environment, which lets people analyze Private Cloud Compute inside the macOS Sequoia 15.1 developer preview.
That means participants will need a Mac with an M-series chip and at least 16 GB of RAM.
Apple will pay $100,000 to anyone who can execute ‘unattested’ code, meaning code that has not been verified as trusted by the company; a simplified sketch of what that verification involves follows the payout tiers below.
Anyone who can remotely gain access to a user’s request data or other sensitive information about the user outside the ‘trust boundary’ (the point where program data or execution changes its level of ‘trust’) could win up to $250,000.
Demonstrating the same kind of data access from a privileged position on the network, rather than remotely, pays $150,000.
A smaller $50,000 reward is in place for showing that the system can accidentally or unexpectedly expose data, for instance through a deployment or configuration issue.
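The $100,000 tier turns on ‘attestation’: the idea that the servers should only run software whose cryptographic fingerprint has been signed by a trusted key, so researchers can check what code is actually handling their data. As a rough illustration only, here is a hedged Swift sketch of such a gate; the Attestation type, the mayExecute function, and the key handling are simplified hypotheticals, not Apple’s actual protocol.

```swift
import Foundation
import CryptoKit

// A simplified, hypothetical sketch of what ‘attested’ code means: the
// server only runs a binary whose hash was signed by a trusted release
// key. Names and flow are illustrative, not Apple's design.
struct Attestation {
    let binaryHash: SHA256Digest   // published measurement of the code
    let signature: Data            // release key's signature over that hash
}

func mayExecute(binary: Data,
                attestation: Attestation,
                releaseKey: Curve25519.Signing.PublicKey) -> Bool {
    // Re-measure the binary we were actually handed...
    let measured = SHA256.hash(data: binary)
    // ...then require that it matches the attested hash and that the hash
    // carries a valid signature from the trusted key. ‘Unattested’ code is
    // any binary that fails this gate yet executes anyway.
    return measured == attestation.binaryHash
        && releaseKey.isValidSignature(attestation.signature, for: Data(measured))
}

// Usage: a properly signed binary passes the gate.
let releaseKey = Curve25519.Signing.PrivateKey()
let binary = Data("trusted model runner".utf8)
let hash = SHA256.hash(data: binary)
let attestation = Attestation(binaryHash: hash,
                              signature: try! releaseKey.signature(for: Data(hash)))
print(mayExecute(binary: binary, attestation: attestation,
                 releaseKey: releaseKey.publicKey)) // true
```

Flip a single byte of the binary and mayExecute returns false; finding a way to run code despite a failing check of this kind is what the $100,000 tier rewards.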
‘We believe Private Cloud Compute is the most advanced security architecture ever deployed for cloud AI compute at scale, and we look forward to working with the research community to build trust in the system and make it even more secure and private over time,’ said Apple.
But there will be a waitlist once it launches due to potentially high demand.
The launch version of Apple Intelligence will come with Writing Tools for proofreading and rewriting, Smart Replies that quickly respond to messages, Notification Summaries, Clean Up in Photos, an initial redesign of Siri, and more.
But certain features – including Genmoji, Image Playground, ChatGPT integration, and Visual Intelligence – are set to arrive as part of iOS 18.2, which should enter beta in the next month or so.
Others will come even later. The fully revamped Siri, for example, won’t be available until sometime in 2025.
Additionally, next week’s release is only compatible with US English, which means users outside of the US will have to wait until December to use Apple Intelligence in their languages.