
Researchers who find major flaws in Apple products could now get a major payday from the company. (Gabrielle Lurie/AFP/Getty Images)
Apple will finally start paying cash rewards to researchers who find security problems in its products, the company announced at the Black Hat cybersecurity conference in Las Vegas on Thursday. But the news left some experts wondering why Apple was so late to the party.
So-called bug bounty programs are standard among most tech giants. They can help consumers by encouraging independent researchers to help companies fix security flaws. When a researcher finds a legitimate problem, rewards can run as high as six figures.
But Apple historically took a different approach, offering a tip line where people could report security problems but handing out no reward beyond putting the researcher's name on a thank-you page on its website.
"It was kind of an insult," said Matthew Green, a computer science professor at Johns Hopkins University who has alerted Apple about security problems in its products and didn't know why it took them so long to launch a bounty program. "Maybe after San Bernardino they felt there was some pressure to acknowledge that it wasn't enough to give people a pat on the back, even if they can't pay enough to compete with the free market," he said.
Earlier this year, Apple faced criticism over its lack of a bug bounty program when the FBI paid an unknown entity more than $1 million for help breaking into an iPhone used by one of the San Bernardino, Calif., shooters. Without a bug bounty program, some argued, the only way researchers could make money from finding bugs in Apple products was by selling them off to the highest bidder — in this case, the FBI.
But Apple now has a chance "to compete for the type of exploits available and control how they'll be used," said Jeff Pollard, principal analyst focused on IT security at Forrester Research.
Apple's new rewards program will launch in September, but it won't be open to just anyone. It will be invite-only, at least in the beginning. Researchers who approach the company with a major problem will also be invited to join.
The program will reward researchers only for a few categories of problems. The biggest payouts will be for bugs affecting the software built into components that help Apple's devices start up securely. Researchers who find one of those could get up to $200,000.
That makes Apple's bug bounty program one of the most lucrative out there. Google's similar program, which launched in 2010, raised its largest bounty, for breaking the security of its Chromebooks, to $100,000 in March. Microsoft's largest per-bug payout is also $100,000.
Apple's big rewards may persuade researchers to turn to the company when they find a problem, rather than sell the information on the open market — where vulnerabilities might be used for criminal or state-sponsored hacks.
But even with Apple's hefty new payouts, other buyers may still pay a higher premium for ways to hack into the company's products.
Last year, Zerodium, a company that deals in buying zero-day vulnerabilities from researchers and selling them to large corporations and government agencies, said it paid $1 million to researchers who were able to remotely hack into the latest version of Apple's mobile operating system, iOS.
The company doesn't offer the same massive payouts for problems in other products. "No software other than iOS really deserves such a high bug bounty," Zerodium founder Chaouki Bekrar told The Washington Post last year, citing Apple's strong security built into its products.
Apple's longstanding reputation for tight security may be one reason the company held out against an official bug bounty program for so long, even as others, from the automotive industry and airlines to the Department of Defense, started to get in on the game.
But the rise of Apple Pay may be another factor, along with San Bernardino, that pushed the company to be more aggressive about getting help from the larger security community, JHU's Green said.
"They really need to know that system is secure" because there's actual money at risk, Green said. "I think a lot of researchers would rather hand vulnerabilities over to Apple than to someone who is going to use it to steal people's payment information."