Apple is offering $1 million to anyone who can beat a new AI security challenge.
You read that right - if you consider yourself something of a cybergeek then you might be in for a chance to walk away with a million dollars.
But first, you’ll need to crack Apple’s new AI system.
There has been a buzz around Apple Intelligence ever since it was first announced earlier this year.
It’s finally being released today (October 28) with the launch of iOS 18.1.
iPhone fans can’t wait to try out the update for themselves after the tech firm told users that it would open a world of ‘new possibilities’ for their phone.
In fact, CEO Tim Cook claimed that it ‘raises the bar for what an iPhone can do’.
According to Apple, the AI system is ‘designed to protect your privacy at every step’.
So while it is aware of your personal information, it doesn’t collect it.
And the firm is so confident about its security measures and its Private Cloud Compute (PCC) technology that it’s offering $1 million to anyone who can crack it.
Even the tech firm itself can’t access the data sent to PCC, which is built with custom Apple silicon and a ‘hardened operating system designed for privacy’, thanks to end-to-end encryption.
Apple describes the tech as the 'most advanced security architecture ever deployed for cloud AI compute at scale', a comment which is sure to sting for its competitors.
PCC was built around a core set of requirements, including stateless computation of personal user data, enforceable guarantees, no privileged runtime access and, most importantly, non-targetability and verifiable transparency.
To put these latter two to the test, Apple is encouraging people to try and hack into the system.
Last week, Apple called on ‘all security researchers - or anyone with interest and a technical curiosity’ to conduct their ‘own independent verification’ of the company's big claims about the capabilities of PCC.
It said in an announcement: “To further encourage your research in Private Cloud Compute, we're expanding Apple Security Bounty to include rewards for vulnerabilities that demonstrate a compromise of the fundamental security and privacy guarantees of PCC.”
Third-party auditors have already had a go at it, but now, anyone can attempt to break into the system.
And for your valiant efforts, you can earn a range of rewards - but one lucky bugger could get their hands on a whopping $1 million if they manage to run code on the system.
All they have to do is avoid detection and access sensitive info. Simple, right?
The idea is that by letting the tech-savvy try to break in, researchers can ‘learn more about PCC and perform their own independent verification of our claims’, as Apple put it.
So, what are you waiting for?