iPhone Encryption: 5 Ways It's Changed Over Time
Apple's battle with the FBI has put iPhone encryption in the spotlight. However, some might be surprised that the company's encryption efforts have evolved slowly and are not that different from those of other smartphone makers. Here's a look at the 5 phases of the process so far.
![](https://eu-images.contentstack.com/v3/assets/blt69509c9116440be8/blta095b88e936b84d8/64cb38e2b03b1f2ba1071471/Slide1_Intro_iPhone_ymgerman_iStock_000074797465_Medium.png?width=700&auto=webp&quality=80&disable=upscale)
Despite the recent spotlight on Apple's iPhone encryption technology because of its fight with the FBI over access to data stored on a locked iPhone 5c used by one of the shooters in the San Bernardino terrorist attacks, the iconic computer and device maker's encryption efforts are not so different from those of other smartphone makers and software companies over the years, according to one security expert.
"They are keeping up with the industry, but are not a pioneer," said Dan Schiappa, senior vice president and general manager of theĀ Sophos Enduser Security Group, in an interview with InformationWeek. "But one thing that they have that the others don't is a proprietary messaging system that encrypts messages from end-to-end, so they have a leg up in encryption there compared to other smartphone makers."
Apple's inability to access encrypted information stored on the San Bernardino shooter's iPhone has prompted the FBI to seek the unprecedented use of the All Writs Act of 1789 to expand its authority and force Apple to build a backdoor into its iOS software. Apple is refusing to comply with this request, noting that once a backdoor is created, it can also be used by malicious attackers should they get their hands on the code.
[See Encryption Debate: 8 Things CIOs Should Know.]
Nonetheless, a recent Pew Research study found that 51% of Americans say they believe Apple should unlock the iPhone used by the San Bernardino shooter. Microsoft cofounder Bill Gates has made statements that have been seen by some as being in line with that sentiment. However, other tech titans have backed Apple's position.
Interestingly, Apple's public key encryption method is not all that different from those that other companies are using. The iPhone was introduced in 2007, at a time when Microsoft was already using encryption in the 2005 release of Windows Vista, Schiappa said. Android smartphones, which later emerged commercially in 2008, also rely on device encryption.
Schiappa pointed to the importance of embedding the encryption key in the chipset, but security expert and hacker Jonathan Zdziarski said that encryption is all about the type of iOS used.
In particular, iOS 8, which was launched in 2014, was a significant development in Apple's encryption efforts because it linked the encryption keys to a user's passcode, or pin, according to Zdziarski. He speculated that Apple will likely further strengthen its encryption, possibly by imposing longer delays between unlock attempts.
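The idea of tying encryption keys to a passcode can be sketched with a standard key-derivation function. This is an illustrative approximation, not Apple's actual algorithm: Apple's derivation is proprietary and entangles the passcode with a hardware-bound device UID, and the `device_uid` value below is a hypothetical stand-in. The iteration count shows how derivation cost can impose a fixed delay on every unlock attempt.

```python
import hashlib

def derive_key(passcode: str, device_uid: bytes, iterations: int = 100_000) -> bytes:
    # PBKDF2 stretches a short passcode plus a per-device secret into a
    # 256-bit key; the iteration count makes each guess deliberately slow.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), device_uid, iterations)

uid = b"example-device-unique-id"        # hypothetical stand-in for the hardware UID
key_a = derive_key("1234", uid)
key_b = derive_key("1235", uid)
assert key_a != key_b                    # a different passcode yields a different key
assert key_a == derive_key("1234", uid)  # derivation is deterministic
```

Because the key depends on the passcode, data encrypted this way cannot be decrypted by anyone, including the vendor, without the passcode itself.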
Schiappa also noted that there is definitely a need for encryption, which is designed to ensure the bad guys don't get access to a user's information, such as credit card account information, bank account data, or even pictures stored on the device. But once a backdoor access is created for law enforcement officials, it becomes a backdoor for everyone -- including hackers and cyber-criminals.
Here is a look at the history of encryption on the iPhone and the views of Apple CEO Tim Cook and late cofounder Steve Jobs on the issue of encryption and privacy. Let us know where you stand in the debate in the comments section.
When the iPhone was first introduced in 2007, its mobile version of OS X offered only a four-digit passcode and no encryption, according to Schiappa. Apple continued to ship the iPhone without encryption in subsequent versions, through the iPhone 3G.
Following the iPhone launch in 2007, some security firms expressed concern that the iPhone would begin to attract the attention of malicious attackers as more users engaged in online shopping via smartphone.
The iPhone 3GS shipped with a limited form of default encryption whose main purpose was to enable a remote wipe of a user's data, not to protect individual files, photos, or other information from unauthorized access, Schiappa said. "The encryption was more of an administrative measure where the key could be revoked, [...] block access to the phone, and turn it into a brick."
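The "revoke the key, brick the phone" idea described above can be sketched as follows. This is a toy illustration, not Apple's implementation: the stream cipher is built from SHA-256 purely for demonstration, but it shows why destroying a single key is enough to render all of the data unreadable without overwriting any of it.

```python
import hashlib
import secrets

def keystream(key: bytes, length: int) -> bytes:
    # Generate a pseudorandom byte stream from the key (toy construction).
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def xor_cipher(key: bytes, data: bytes) -> bytes:
    # XOR with the keystream; the same call encrypts and decrypts.
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

device_key = secrets.token_bytes(32)
ciphertext = xor_cipher(device_key, b"contacts, mail, photos")
assert xor_cipher(device_key, ciphertext) == b"contacts, mail, photos"
device_key = None  # "remote wipe": revoke the key, and the ciphertext is inert
```

A remote wipe built this way is nearly instantaneous, because only the small key needs to be erased, not gigabytes of storage.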
Apple, however, did make its crypto library available to third-party app developers so they could encrypt data in their apps. But this feature was not made available by default.
One iPhone developer and hacker was able to crack the so-called enterprise-friendly encrypted iPhone 3GS within two minutes using readily available freeware, according to Wired.
When the iPhone 4s launched in 2011, it did not come with significantly improved encryption features. But when Apple introduced iOS 8 in 2014, it made the new mobile operating system backward compatible with iPhone versions going back to the iPhone 4s.
"The model of iPhone has nothing to do with what can and can't be encrypted on it. It's the version of the iOS," said Jonathan Zdziarski, a security expert and hacker. "As of iOS 8, all data is encrypted using keys that are tied to the user's passcode, whereas iOS 7 and below used keys that were not tied to the passcode, so that Apple could decrypt much of the information (like email attachments and photos)."
With the launch of the iPhone 5s in 2013, Apple added its A7 chip and Touch ID fingerprint sensor. The fingerprint reader would release the pin, which then authenticated the phone and unlocked the device for use. The iPhone 5s was introduced with iOS 7.
"The iPhone 5s started to encrypt, but it was only limited information," Schiappa said. "Once a user logged in, then only select things were encrypted." The encrypted information would include contacts, notes, and calendar items, for example, but did not include email attachments or photos.
The iPhone 5c, which was used by one of the shooters in the San Bernardino terrorist attack and is at the center of Apple's fight against the FBI, does not have Touch ID and uses the A6 chip instead of the A7. Despite those differences, the 5c uses the same type of encryption as the 5s because they both run the same iOS, according to Zdziarski. Also, instead of Touch ID, iPhone 5c users manually enter their pin to access their phone.
Apple's A-series processors allowed the encryption key to be stored in the device's chip, making it harder to extract the encryption key. Storing the key in the hardware has now become a common practice, Schiappa said.
Apple uses public key encryption for its devices, in which one key encrypts information and a second key decrypts it. Each iPhone also ships with a unique identifier, and the user creates a pin that they enter manually, or release via Touch ID, to authenticate the phone. Once the phone has been authenticated, it can be used.
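The two-key property the article describes can be shown with the textbook RSA construction. The primes here are tiny, chosen for readability only; real deployments use keys hundreds of digits long, and this sketch is not Apple's key scheme.

```python
# Textbook toy of public-key encryption: one key encrypts,
# a mathematically related second key decrypts.
p, q = 61, 53
n = p * q                  # modulus shared by both keys
phi = (p - 1) * (q - 1)
e = 17                     # public exponent (the encrypting key)
d = pow(e, -1, phi)        # private exponent (the decrypting key)

message = 42
ciphertext = pow(message, e, n)          # anyone with (e, n) can encrypt
assert ciphertext != message
assert pow(ciphertext, d, n) == message  # only the holder of d can decrypt
```

Knowing the encrypting key alone does not reveal the decrypting key, which is what lets a device accept encrypted data from anyone while only its owner can read it.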
Last year, Apple unveiled its iPhone 6s. That device also featured Touch ID, along with an A9 chip and iOS 9 that were architected together for advanced security features. With the iPhone 6s, everything on the phone -- including photos and email attachments -- could be encrypted automatically, and accessed once Touch ID released the pin to authenticate the device.
Law enforcement agencies pressuring Apple for backdoor access to users' iPhone data would clearly be interested in that additional treasure trove of photos and email attachments as well.
As smartphones store ever more of users' personal, financial, and location information, the need to guard such data grows accordingly, Apple CEO Tim Cook wrote in a letter to customers.
"All that information needs to be protected from hackers and criminals who want to access it, steal it, and use it without our knowledge or permission," Cook said. "Compromising the security of our personal information can ultimately put our personal safety at risk. That is why encryption has become so important to all of us."
Should the FBI ultimately prevail in its demands that Apple create a backdoor to the iPhone OS, Cook outlined how the government could potentially invade users' privacy.
"It would have the power to reach into anyone's device to capture their data. The government could extend this breach of privacy and demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone's microphone or camera without your knowledge," Cook warned.
While Apple cofounder Steve Wozniak told CNBC that he believes the late Steve Jobs would have fought for users' privacy in the current Apple encryption fight with the US government, it's not so clear how strong of a conviction Jobs would have had on privacy and encryption, based on some of his past statements.
Addressing privacy issues relating to location tracking on Apple devices at the AllThingsD conference in 2010, Jobs stated: "Privacy means people know what they're signing up for -- in plain English, and repeatedly." As a result, the Apple location tracking feature would only be activated if users opted in.
However, Jobs also said there may potentially be times when it would be necessary to invade a user's privacy without giving them a choice. In 2008, Jobs acknowledged that Apple would use a "kill switch" to remotely disable a user's app that they had purchased in the App Store, if the company discovered that the app was in fact a malicious program.
"Hopefully, we never have to pull that lever, but we would be irresponsible not to have a lever like that to pull," Jobs told the Wall Street Journal in an interview. That comment raised privacy concerns among some in the tech community, who said they felt uncomfortable with the notion that Apple could control what applications would be used on their phone.