It’s amazing that one little iPhone 5C has become the crux of a firestorm over personal data privacy vs. national security, with implications that could affect all of us, and indeed the world at large, for decades to come.
It wouldn’t be a surprise to someday see Don Wildman, host of the History Channel’s “Mysteries at the Museum,” at the Computer History Museum doing a segment about the phone that once belonged to the late Syed Farook of Redlands, Calif. Farook and Tashfeen Malik shot and killed 14 people and wounded 21 others in San Bernardino, Calif., on Dec. 2, 2015, before they themselves were killed by law enforcement personnel.
Because of its potential value to the Federal Bureau of Investigation in finding out more about terrorist networks, Farook’s iPhone stands to become a historical artifact for the digital age, worthy of a place alongside the first portable calculator, the Altair computer and even the Pong video game.
Symbol of Data Privacy vs. National Security
That iPhone has also become a symbol of data privacy vs. national security, two interests that pull in opposite directions. When the FBI obtained a lower-court order to see the contents of the iPhone, and Apple refused to create a backdoor to harvest the data, citing its data privacy policies and the trust its customers have invested in it for years to safeguard their personal data, we moved inescapably into a make-or-break moment pitting national security against personal liberty.
The question now, of course, is this: Should Apple extract this personal information, even though the owner is deceased, and hand it over to the authorities? The company has until Feb. 26 to comply with the court order to open the iPhone, extract the data and turn it over to the FBI. Apple CEO Tim Cook will have none of this, and the company is preparing a legal defense.
The next questions: What legal and moral precedents will this set, no matter how the case is eventually settled? How do we keep our private information private, and when the common good is at stake, under what circumstances should private information be released to the authorities?
Since this conflict came to light Feb. 16, when Cook published an open letter to customers about the court order, new angles involving security on the device have cropped up. For example, a county official (still unnamed as of Feb. 22) remotely reset the phone’s iCloud account password in an effort to open it. That action prevented the phone from backing up its information to iCloud, leaving the original data on the device in a tenuous state.
Nonetheless, this all boils down to one central issue: Does the government have the right to circumvent encryption to obtain personal information stored on a device or in a cloud in order to gather evidence in a criminal case, or for any other reason?
Legal Decision Will Become Landmark
Precedents set here will affect criminal investigations and court decisions for years to come.
eWEEK put these questions to a group of respected IT security and data protection professionals:
1) Do you stand behind Apple’s/Google’s position on this, or the FBI’s? Why?
2) What solution to this data privacy problem would you put forth if you were a mediator in this dispute?
3) Is it possible to isolate that iPhone and just have Apple get the data from it and send it to the FBI so that no backdoors are involved?
Their answers follow.
Security Pros Offer Opinions, Solutions for FBI vs. Apple
Here are their perspectives. It’s important to note that all these thoughts are personal opinions that do not necessarily reflect the opinions and policies of their companies.
Jeremiah Grossman, founder, WhiteHat Security:
A1: Yes, we stand with Apple. And Twitter. And Microsoft. And Facebook. And Mozilla. And Box. Because the security and privacy rights of the people must never be given up, otherwise what exactly are we defending?
A2: I’d imagine that what Apple, and the rest of Silicon Valley, is doing right now is redesigning their products and services to make it mathematically impossible to comply with the similar government orders that will eventually come. To counter, the government is likely drafting new legislation that would require backdoor access to technology products. Who will eventually win that fight is hard to predict.
A3: I had the same question, but the answer to that remains unclear.
Gunter Ollmann, chief security officer, Vectra Networks:
A1: (I stand behind) Apple, given that it has chosen to tie this request to the bigger political debate over weakening encryption standards; the repercussions, should Apple lose, could extend across the entire security industry. If Apple loses this appeal over the FBI’s request to exploit a vulnerability, now positioned as a backdoor, a precedent may be set in the entire backdoor debate.
A2: The debate is actually largely moot. If vendors were required to install backdoors or include recoverable keys in the encryption they use, there is a near-endless number of applications and software add-ons that users can install to render those backdoors irrelevant.
A3: I believe so. However, Apple likely fears that by complying with this request—to create a custom patch for a vulnerable phone—it will open the door to subsequent law enforcement requests to provide support in investigations of similarly vulnerable [old] iPhones. This would appear not to scale well and could be financially demanding.
Jeff Schilling, CSO, Armor:
A1: I stand behind the FBI’s position. A court has decided that the evidence on the phone is critical to an open investigation and has ordered Apple to comply. To be clear, the court order is to provide a vector to open this one phone, not create a backdoor to use on all Apple phones without the consent of the owners of all iPhones. I believe Apple is trying to confuse the public and pivot this into a privacy issue. I support Apple’s right to appeal, but if the court decides they should comply, I would expect them to comply.
A2: Conditions could be set such that the “workaround” that Apple creates to allow the FBI to crack this phone is not shared with the government. While Director [James] Comey has advocated for tech companies to provide law enforcement a backdoor into their devices, he is not getting that cooperation. This leaves the FBI in the situation they are in now; they must go to court to get an order to compel the tech companies to give them access as their investigations and warrants require. In this situation, the FBI just wants the evidence on this one phone.
A3: Yes, according to open-source reports, the FBI just wants to have Apple alter the software on the phone to set the conditions so that the FBI can brute-force the password.
Morey Haber, vice president of technology, BeyondTrust:
A1: Based on public-record information, Apple has assisted the FBI 80 times [or more] since 2008 in accessing devices that were under investigation. This request is no different from any other, except that accessing the phone is far more complex due to local encryption and a feature that erases all data after 10 unsuccessful PIN [personal identification number] attempts. Apple has decided to take a stand over a custom piece of firmware that could be used to remove key security features and allow brute-force access to the phone. The custom firmware is not a threat to individuals or anyone else unless it is leaked into the wild and a hacker finds a way to distribute it to cell phones everywhere. Even then, Apple could simply stop signing that version, as it does with all previous versions, and the point would be moot.
I believe that in one-off cases like this, and the possible threat of future terrorism, Apple should help the FBI. If the method the court order suggests is “too risky” for unknown reasons, then mediation can find another way. After all, iOS 9 can be jailbroken, and the modifications needed for the phone to access the data can probably occur with over-the-counter hacks currently available. I firmly believe the security community can do this without the PIN.
A2: Apple has been slow to adopt new and commonly accepted features since the inception of the iPhone and iPad. If you remember, it took years for the company to admit the phone needed copy and paste. Why can’t Apple add a commonly accepted feature to the lock screen like everyone else: “Forgot my PIN”? It provides hints on OS X, sensitive websites have “forgot my password,” and in the most extreme cases, the PIN is sent via USPS. I understand the basic mechanics of the Secure Enclave: why the PIN needs to be entered the first time the phone is booted, and so on. There is no reason Apple cannot add this feature, especially if it trusts all our passwords to an iCloud keychain protected by a one-time password. The same technique could be used for an iPhone. By the way, if you have young children, you may have experienced them changing the PIN and not remembering it. That just leads to a bricked device.
A3: Yes, I believe Apple could retrieve the data in a number of ways, which include disassembly of the device and copying the contents from the chips directly. There are probably a few approaches they could use based on their design and development standards. If someone says no, I would suggest to look deeper. After all, the court order has identified a method that Apple rejected. There is probably another way and a lot of intellectual property is being revealed based on this alone. The other techniques may just not be in Apple’s interest and they need to take a stand.
As for the FBI accepting the information [as obtained by Apple itself], I am not an attorney. I do not know the legal ramifications of a third party providing this data. However, if the FBI was involved in every step and informed of the procedure, I don’t see why this would be an issue. This leads to the premise above: They [Apple] may not want the FBI to know or have decided to take a stance for marketing, political or other reasons. Remember, they have been cooperative in the past. So why now? This is a one-off backdoor. Nothing more. Apple can create plenty of mitigating controls to ensure safety of the code. My only fear is the insider threat. The creators would know how to make the backdoor that currently does not exist, and it would be a commodity to the FBI, NSA or anyone else. Right now, the claim is no one has that knowledge and the FBI would have to accept another method to get the data.
J.J. Thompson, founder and CEO, Rook Security:
A1: No. In this request, the FBI is turning to Apple for help with something that is feasible only by working with the vendor. The solution is intended to be specific to this one phone, and it will not set a dangerous precedent because the situation is very specific.
A2: Apple can sit with the FBI and the U.S. attorney on the case to craft documentation that specifically outlines what can and cannot be done to modify the Secure Enclave firmware written and signed by Apple, so that this one device is modified for this specific purpose only. The U.S. attorney and the FBI can resubmit the warrant to improve the specificity of the request so that Apple’s and the public’s concerns are addressed.
A3: In theory, yes. iOS adds a 256-bit, device-unique secret key called a UID to the mix, and stores that key in hardware where it is hard to extract from the phone. Apple claims that it does not record these keys and cannot access them. The fusing of the UID and GID is conducted within the Secure Enclave. Only the device knows the UID, and the UID (according to Apple) can’t be removed from the Secure Enclave, which means cracking attempts have to be run on the device itself. Apple could provide custom firmware that attempts to crack the keys on the device.
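The reason cracking must happen on the device can be illustrated with a toy sketch. This is not Apple’s actual implementation; the real key entanglement happens in silicon inside the Secure Enclave, and the function name and parameters below are purely illustrative. The idea is that the encryption key is derived from both the passcode and a hardware UID, so guesses cannot even be verified off the device:

```python
import hashlib

def derive_passcode_key(passcode: bytes, device_uid: bytes) -> bytes:
    """Toy model of passcode-key entanglement: the derived key depends on
    both the passcode and a hardware UID that never leaves the device."""
    return hashlib.pbkdf2_hmac("sha256", passcode, device_uid, 100_000)

# The same passcode yields a different key on every device, so copying the
# phone's flash storage to a fast off-device cracking rig is useless.
k1 = derive_passcode_key(b"1234", b"uid-of-one-device---")
k2 = derive_passcode_key(b"1234", b"uid-of-another-phone")
print(k1 != k2)  # True
```

Because the UID is unextractable, every guess has to run on the phone itself, which is why the FBI needs Apple-signed firmware rather than just a copy of the data.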
Matt Peterson, president & CEO, eFileCabinet:
A1: Apple has built their iPhone business on the foundation of keeping their users’ data footprint private. Their business model and the trust of their customers are in jeopardy if they open this Pandora’s box. As the CEO of a cloud-based data management company, I am particularly sensitive to the privacy and anonymity of our customers and the data they want to keep private. That’s what they pay us to do, and that’s what they pay Apple to do as well.
A2: If law enforcement wants to come and search my house or a computer in my house, they have to get a search warrant that is signed by a judge. I think this should be no different in the case of wanting to extract data from a phone. There has to be case-by-case oversight.
A3: While they wouldn’t publicly state this for obvious reasons, I’m sure Apple already has the capability to retrieve data on a case-by-case basis. But just because you can, doesn’t mean you should.
Aziz Gilani, partner at Mercury Fund:
A1: I am pretty strongly opposed to the government’s attempts to get Apple to crack its encryption on iPhones. I think that demanding a backdoor to encryption: a) was opposed by our founding fathers; b) will inevitably be exploited by criminals; c) provides a new tool for repression by authoritarian governments; and d) will create another reason for international customers to abandon U.S. technology companies.
As John Ashcroft pointed out in 1998 during a previous debate on encryption, our founding fathers actually had access to some very strong encryption tools; Thomas Jefferson developed a virtually unbreakable cipher himself. Despite knowing the power of encryption, they never demanded that law enforcement have access to keys or tools to crack codes. Ashcroft nailed the argument against backdoors when he said:
“The FBI has argued that a system of mandatory access would make it easier for law enforcement to do its job. Of course it would, but it would also make things easier on law enforcement if we simply repealed the fourth amendment.”
A2: As we learned from the Snowden and Manning leaks and the OPM [U.S. Office of Personnel Management] breach, the government is terrible at securing data. The private sector is no better, as we learn almost daily from various data breaches and hacks. Do we seriously think that a universal backdoor to the data on any device would go undiscovered? If you do, I also have a bridge I’d like to sell you.
A3: Once Apple implements a backdoor to the iPhone, foreign governments will also demand access to encrypted information on their seized iPhones. Even if you completely trust the U.S. government, how do you feel about the Chinese, Russian, Iranian, or Syrian governments having the power to access encrypted data from their citizens?
As we learned during the Snowden revelations, foreign companies don’t trust our government and switched away from American technology companies when their cooperation with the NSA was exposed. Microsoft, Cisco, IBM, Intel and Hewlett-Packard were all impacted and lost key accounts.
Ron Heinz, managing partner, Signal Peak Ventures:
A1: While this is indeed a polarizing issue, I believe that both the technology industry and the intelligence community can and must drive to a satisfactory middle ground. Most technology companies today build products that safeguard end users’ data but also allow law enforcement access to data based on bona fide legal orders. I am a strong advocate of this middle-ground approach and believe it strikes the right balance: providing government access while respecting both user privacy and the integrity of the source code. Think of this as the equivalent of an electronic search warrant, very analogous to a physical warrant, and let’s move forward.
Kris Lahiri, co-founder and CSO at Egnyte:
A1: As chief security officer at Egnyte, I place a heavy emphasis on both security and privacy. With regard to the issue between the FBI and Apple, we are in firm agreement with Apple’s stance in denying the FBI’s request to create a backdoor into the device. Should Apple choose to comply, it would not only set a terrible precedent for legal matters moving forward; it would undermine all of the advancements tech has made in protecting customers’ data privacy and security. Any kind of backdoor, or bypass, of Apple’s security protocol would create additional vulnerabilities that could be exploited by hackers.
A2: The simple solution here would be for the FBI to find an alternative way to obtain the information necessary for justice to be served, one that would still protect the Fourth Amendment rights of the parties involved. The currently proposed solution, giving the FBI unlimited attempts to unlock the phone, is not a viable option, as the FBI would be able to break into the device in less than 30 minutes, violating any right to privacy. Any mandate to bypass the encryption technology in place would be a major setback technologically, legally and ethically.
A3: While I cannot speculate on Apple’s abilities, I would assume they do have the ability to extract the data and hand it over to the FBI, without creating a backdoor for the FBI to use as they please. However, at this point, Apple would be sending a terrible message that they do not care about the privacy of their customers. Moving forward, that would be a nightmare situation for their business: broken trust with their customer base, stain on their brand, potential decline in sales, etc. In my opinion, the best thing Apple can do is to continue to stay transparent with the public and stand tall in their efforts to protect customers’ right to data privacy.
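The “less than 30 minutes” figure cited above is easy to sanity-check with back-of-the-envelope arithmetic. This minimal sketch assumes a four-digit PIN and the roughly 80 milliseconds per attempt that Apple says its key derivation is calibrated to; both are assumptions, since the passcode actually used on the phone is unknown:

```python
# Worst-case time to brute-force a 4-digit PIN once the retry limit
# and inter-attempt delays are removed by custom firmware.
ATTEMPT_TIME_S = 0.08  # assumed ~80 ms per try, per Apple's iOS Security guide
pin_space = 10 ** 4    # 0000 through 9999
worst_case_min = pin_space * ATTEMPT_TIME_S / 60
print(f"Worst case: {worst_case_min:.1f} minutes")  # Worst case: 13.3 minutes
```

A six-digit PIN pushes the worst case past 22 hours, and a long alphanumeric passphrase into years, which is why the dispute centers on removing the 10-try erase and retry delays rather than on breaking the cryptography itself.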
Ridgely Evers, founder and CEO, Trustpipe:
A1: (I am) unambiguously behind Apple, because they are standing up for their customers as well as their principles. Both Apple and Google have pledged to their customers that user data belongs to the user and is sacrosanct. And they’ve gone to considerable lengths to make sure that, as vendors, they no longer have the ability to violate that promise.
It is unreasonable for the government to attempt to retroactively change that commitment.
A2: It is also unwise, I believe; it is not in the country’s best interests for the government to be able to co-opt hardware or software producers. Doing so violates the trust between vendor and customer and significantly impedes vendors’ ability to do business internationally.
————————————————————-
We’re merely in the early stages of decisions, discussions, discovery and dissemination of information in this case, one that is destined for law libraries, civics classes and, very likely, a museum.