During Black Hat 2019, Apple announced that they were opening their bug bounty program to the public. They also promised to provide select researchers with special devices that would simplify research into their platforms. I would be grossly understating the facts if I said this was long overdue: research into Apple platforms has always required researchers to stockpile 0days, or to wait indefinitely for others to release details about the bugs they had reported. Had Apple really made good on its promises, it could have empowered individual researchers to spend their time working instead of wasting it just trying to get a foot in the door. But if their lawsuit against Corellium the very next day was any indication of what was to come, things weren’t as great as they seemed.

It wasn’t until mid-2020 that scant details began to emerge about these research devices, and no thanks to Apple. The first sign was an additional kernel image in the iOS 14 betas that shipped with function symbol names intact, something Apple moved fiendishly fast to strip as soon as people talked about it on Twitter. While there is an argument to be made that redacting symbol names somehow protects their intellectual property, it apparently hadn’t been an issue until the previous year. If anything, it’s another example of the hostile behavior towards the research community that has become the norm at Apple.

Soon after, Apple opened applications for these devices to “researchers with a proven ability to find system security vulnerabilities.” Setting aside the blatant barrier to entry for newcomers, the qualifying criterion is remarkably nebulous. Several prominent researchers who go far beyond it have had their applications denied by Apple, often without comment. And even those deemed worthy must agree to some absurd terms.

“If you use the SRD to find, test, validate, verify, or confirm a vulnerability, you must promptly report it to Apple.”

Ignoring the misplaced idealism of expecting that a bad-faith actor wouldn’t simply disregard this restriction, merely having one of these devices opens you to the possibility of tainting your future work. There is no way to prove whether you used one of these devices for a given piece of research, and I would certainly bet on Apple’s lawyers over anyone a researcher could hire to defend themselves. The situation is so hopeless that almost all of the best researchers I know didn’t even bother applying to this program. Even Google’s Project Zero chose not to apply for one, as the terms would conflict with their disclosure policy.

That researchers have chosen to fend for themselves rather than join this official program should be deeply alarming to Apple. The most appalling takeaway is that things didn’t have to be this way. There is, quite simply, no reason for Apple to be the arbiter of who may research their platforms, or on what terms. There are a number of ways Apple could make a retail iPhone accessible to researchers and curious customers alike. Doing so is just a matter of choice: Apple has been doing it internally for years.

Having read the usual rhetoric about adding such a feature to retail devices, I feel obliged to point out that this is possible without compromising their security. After this “demotion” operation, the device could wipe its encryption keys, rendering any existing user data unrecoverable, and print big scary warnings on every boot. This is hardly a novel idea: bootloader unlocking, which follows the same wipe-and-warn pattern, has been commonplace on Android devices for almost a decade.
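For concreteness, here is a minimal sketch of what such a boot flow could look like. It is purely illustrative and not Apple’s code; every name in it is a made-up stand-in, and in a real device the one-way demotion flag and the wipe state would live in fuses or secure storage, not in variables.

```c
/*
 * Illustrative sketch only — hypothetical names, not Apple's boot code.
 * It shows the flow proposed above: a one-way "demoted" state that
 * wipes keys once, then warns on every boot before relaxing anything.
 */
#include <stdbool.h>
#include <stdio.h>

static bool demoted = true;      /* stand-in for a fused, one-way bit */
static bool keys_wiped = false;  /* stand-in for persistent boot state */

static void wipe_encryption_keys(void)
{
    /* Destroying the data-protection keys makes any data from a
     * previous owner permanently unreadable, so demotion cannot be
     * abused to extract it. */
    keys_wiped = true;
}

static void boot(void)
{
    if (!demoted) {
        /* Retail path: full secure boot, nothing changes. */
        return;
    }
    if (!keys_wiped)
        wipe_encryption_keys();  /* once, on the first demoted boot */

    /* Every subsequent boot: warn loudly, then relax restrictions. */
    printf("WARNING: this device has been demoted and is insecure.\n");
    /* ...enable debug interfaces, unsigned code, and so on. */
}

int main(void)
{
    boot();
    return 0;
}
```

The property that matters is the ordering: the keys are destroyed before any restriction is relaxed, so demotion can never be used to read a previous owner’s data.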

To put it in no uncertain terms, this is the only way for Apple to truly level the playing field for anyone who wants to research the security of their devices. Until then, their security remains, first and foremost, security by obscurity.