Quote of the day—Tim Cook

The U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create. They have asked us to build a backdoor to the iPhone.

Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession.

The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control.

Tim Cook
February 16, 2016
A Message to Our Customers
[Such a concession to the government would fail The Jews In The Attic Test. No further discussion is required.—Joe]


24 thoughts on “Quote of the day—Tim Cook”

  1. The letter isn’t very clear about why this is so bad. Some of the press articles explain it better. (Tim Cook really should have done a much better job of explaining his reasoning!)
    There are a couple of considerations. One is that a hack like the one asked for, supposedly “just for this phone”, could be modified to work on other phones. Or a different phone could be modified to look like this one.
    The other consideration is that giving in once lets the camel’s nose into the tent. Then what happens next is that every government will tell Apple “we want this phone opened too, and you can’t refuse us now because you’ve already done this once before, so clearly it’s easy for you to do it again”. In other words, let this happen once, and the next thing is that every tinpot dictator will use this, too. Or at least every tinpot dictator with enough economic power — say, China. Do you want China to have the keys to your iPhone? I didn’t think so.

    The real mistake Apple made is that this protection mechanism is in software, so they are in a situation of “we could do this but we don’t want to”. They should have put it into hardware, because then the answer would be “we can’t do that, it’s cast in silicon”.

    Time to find out what the analogous Android story is. Most likely it’s worse, because Android is open source, so the feds can simply build their own version and install it. That assumes there isn’t some ironclad mechanism to prevent installation of unauthorized versions. It will be interesting to see what Silent Circle (maker of the Blackphone) says about this issue.

    • The answer from Silent Circle was quite distressing: (1) see our privacy policy, and (2) we’re not planning to comment on the Apple controversy.
      The privacy policy, of course, talks about their services, not their OS or hardware. And the fact that a company that markets itself the way Silent Circle does is refusing to support Apple makes me doubt that I should do business with them, something I had been considering for some time now.

  2. This is showing my ignorance of the guts of electronics, but I don’t get why .gov needs any help. The saying from back when I was younger is that nothing is secure if someone else has physical access to it. If you have the phone, you should be able to spoof its memory into giving up whatever is on it….

    • The likely answer is laziness. Unless you’re talking about storage that’s embedded within the processor so it’s not accessible without the help of the software, you’re clearly correct.
      Even if the conversion from PIN to crypto key is handled by a TCG chip or something like it that makes use of a secret value inside, the auto-erase machinery is apparently in software. So they could clearly pull off the chips that matter — the flash memory chips where the data lives, and a key management chip if there is one — then build their own PIN-breaking machine. But that requires real work, and it’s easier to try to force private companies to do the work and bear the costs.

    • The claim that physical access defeats all security has some caveats. Physical access allows one to change permissions on files, install key loggers, or do a lot of other security-defeating things. But it doesn’t always guarantee access to the secured data. Simple example: I could give you a sheet of paper with an encrypted message that is impossible to decrypt by any means without the key (a one-time pad).

      They aren’t using a one-time pad, but if memory is encrypted with a key supplied by the user, then they would still probably need to obtain the key through means other than the user. For a small enough ciphertext sample, a near-brute-force attack on the key (impractical) may be the best you can do.

      In this case, almost certainly, the key does exist on the device; the user just supplies a PIN, which has a much smaller range than the key and is used to gate access to it. They want to do a brute-force search for the PIN, but the OS erases the device after 10 failed attempts.

      I don’t know that they did this, but if I were the engineer doing the encryption, I would put the key in a small section of non-volatile memory in the processor itself. The processor can then read the key without it ever appearing on an external bus. With proper microcode, I think the key could then be made inaccessible without the actual PIN or extremely small probes on the naked chip. If the microcode also erased the key after N failed attempts, the device would be, for all practical purposes, uncrackable.

      • But wouldn’t the weak point there still be the PIN? So you can’t get to the key itself, but wouldn’t the PIN have to be externally accessible so that it can take input from the screen and be changed? Get the phone to cough up the PIN and that gets you the key. Again, my knowledge stops at using the things, so maybe it’s my own ignorance….

        • No. The key and PIN (or just a hash of the PIN) would reside inside the CPU. There would be code, in the CPU, that accepts a user-supplied PIN for validation. It compares this (or a hash of it) to the stored version in the small non-volatile memory in the CPU, and either returns the key (on a match) or a “failed to match” error code. (A minimal sketch of this flow appears after this thread.)

          Again, this is how I would do it if I had complete control over the hardware and software. I don’t know how they actually implemented it. It may be something far better or it may be something similar to what you envision.

          • The problem is that if the key is unlocked by a PIN, and that unlock process is done in software, the software can be changed to allow the unlock attempts to be done at very high speed and without limit. That’s what Apple is dealing with here.
            What you describe would be the right answer provided the unlock machinery, and the protections blocking fast exhaustive search, are in ROM or gates — not in any form of updatable program memory.
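
          Here is a minimal sketch of the flow described above, purely for illustration: nobody in this thread knows Apple’s actual implementation, and every name below is invented. It keeps the master key and a salted PIN hash in a stand-in for on-die non-volatile memory, releases the key only on a correct PIN, and wipes it after ten consecutive failures.

          ```python
          # Hypothetical simulation of the PIN-gated key release described above.
          # NOT Apple's design; all names are invented for illustration.
          import hashlib
          import hmac
          import secrets

          MAX_ATTEMPTS = 10  # wipe the key after this many consecutive failures

          class OnDieKeyStore:
              """Stand-in for non-volatile storage inside the CPU package."""

              def __init__(self, pin: str):
                  self.salt = secrets.token_bytes(16)
                  self.pin_hash = hashlib.pbkdf2_hmac(
                      "sha256", pin.encode(), self.salt, 100_000)
                  self.master_key = secrets.token_bytes(32)  # never leaves the die
                  self.failures = 0

              def unlock(self, candidate: str):
                  """Return the key on a PIN match; wipe it after MAX_ATTEMPTS misses."""
                  if self.master_key is None:
                      return None  # already erased
                  guess = hashlib.pbkdf2_hmac(
                      "sha256", candidate.encode(), self.salt, 100_000)
                  if hmac.compare_digest(guess, self.pin_hash):  # constant-time compare
                      self.failures = 0
                      return self.master_key
                  self.failures += 1
                  if self.failures >= MAX_ATTEMPTS:
                      self.master_key = None  # the irreversible erase
                  return None
          ```

          The thread’s point in miniature: if `unlock` lives in updatable software, a custom OS image can simply skip the failure counter; burn it into ROM or gates and the only attack left is the PIN space itself.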

  3. Apple should make a counter-offer:
    They will unlock THIS particular phone (assuming they can) with measures in place to preserve chain of evidence. No more is required.

    • That is, I think, the most reasonable compromise they can make.

      “Send the phone here. Send a few FBI agents along to make sure the evidence is preserved, if you like. We will unlock/copy/clone this one phone for you. We will not create a tool that can be used on any/all iOS devices, but if this one phone can be unlocked and its memory preserved, we will help with that.”

      I’m also curious about the actual language in the judge’s order. Did he/she order Apple to assist in the investigation and unlock the phone (with the FBI demanding the keys to the backdoor)? Or did he/she really order Apple to create a software tool to bypass security on all their devices? One of those is reasonable. The other is overkill, overreaching, and should be appealed and challenged. I’d also like to see what probable cause the FBI presented to believe the phone actually has evidence. Last I checked, “there might be evidence there” isn’t good enough, although the death of the suspects certainly factors in.

      • I did some reading last night. It seems what the FBI wants to do is to brute-force the security. But there’s an obstacle, which is the fact that the device is erased after 10 consecutive bad PIN entries. It is that obstacle they want to have removed.
        The order is at https://assets.documentcloud.org/documents/2714005/SB-Shooter-Order-Compelling-Apple-Asst-iPhone.pdf — just 3 pages.
        One interesting thing is that it requires Apple to provide “reasonable technical assistance”, which is rather different from the “cooperate or else” order that the media reporting implies.
        It clearly requires Apple to build a custom iOS that omits the auto-erase feature, and adds a way to send PINs to the device at high speed. It doesn’t in any way propose what you suggest, i.e., to have Apple do the work rather than the FBI. This demonstrates that Charles Krauthammer doesn’t know what he’s talking about when he suggested just that last night.
        Even with a brute-forcing API, the actual break might take a while (a rough estimate follows this thread). So if Apple were to do the whole job, it would clearly take them longer. But it’s a possibility.
        The order talks about a custom iOS that runs only on that phone. But most likely such a device lock could be defeated; if nothing else, you could simply substitute whatever chip has the serial number on it. Or you could remove the serial number check (though having signed software makes that harder; you’d have to do it with the bits on the device, not the software in the download file).
        In any event, if Apple were to do what you mentioned, that wouldn’t help with the camel’s nose problem: they would still have done it once, and from that time onward every government that wants to snoop on any phone would just come to Apple and say “you’ve done it before, do it again”.
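
        For scale, here is a back-of-envelope estimate of that brute-force time. It assumes the roughly 80 ms per passcode attempt that Apple’s security documentation has cited for on-device key derivation; treat that figure as approximate, since it varies by hardware.

        ```python
        # Rough worst-case time to try every PIN, at an assumed ~80 ms per attempt.
        SECONDS_PER_TRY = 0.080

        for digits in (4, 6):
            worst_case = 10 ** digits * SECONDS_PER_TRY  # seconds to exhaust the PIN space
            print(f"{digits}-digit PIN: up to {worst_case / 60:.0f} minutes "
                  f"({worst_case / 3600:.1f} hours)")

        # 4-digit PIN: up to 13 minutes (0.2 hours)
        # 6-digit PIN: up to 1333 minutes (22.2 hours)
        ```

        In other words, once the auto-erase and inter-attempt delays are out of the way, a 4-digit PIN falls in minutes; it is the 10-try erase, not the cipher, that makes the search impractical.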

  4. Pingback: Substitute the word 'gun' and read it... - Gunmart Blog

  5. The FBI wants to investigate. We already know who did it, and they are dead, so there will be no trial. Therefore, any investigation would have to be about unidentified co-conspirators, financing, and the like. Everybody in the world knows generally where to look. One would hope that the FBI would have a great deal more specificity on this point. So the tradeoff would be any new information that may or may not be on the phone vs. the risks of creating the tool. Given that the government has been trying to outlaw or subvert encryption for a while now, one has to assume that this is a major agenda here.

    • [A]ny investigation would have to be about unidentified co-conspirators, financing, and the like. Everybody in the world knows generally where to look. One would hope that the FBI would have a great deal more specificity on this point.

      I’ve said this elsewhere, and someone with better insight into cellular technology may be able to correct me, but here it is anyway:

      The phone was apparently owned by San Bernardino County (SBC), not the “suspects” (do we really have to call them “suspects”?). SBC is reportedly cooperating with the FBI investigation.

      If SBC owned the phone, then they also owned the data plan for the phone. Thus, as the data plan owner, they should be able to request that the carrier (be it Verizon, AT&T, Sprint, or whoever) provide the FBI with data usage patterns, contact lists (which are often backed up over the network), internet activity, frequently visited (or bookmarked) sites, etc. From that, they can learn where the “suspects’” logins were used, and go after those website owners for more information. If the carrier or website owners don’t feel inclined to supply this information, the FBI can always subpoena it.

      The point is, there’s a whole other trail of information to be had — or at least attempted — that doesn’t require demanding Apple code a backdoor for their OS. Have they tried following these breadcrumbs?

      Or is this a witch-hunt to get Apple — the only major smartphone supplier without government-accessible security holes, and with a locked-down, proprietary OS that keeps the government from creating its own — to cave and compromise their OS?

      Or both; they’re not mutually exclusive.

  6. Bravo to Tim Cook for standing with Apple customers and (if you’ll excuse the expression) “speaking truth to power”. I hope he continues to stand his ground.

  7. Seems to me that if Apple _can_ do it, then the iPhone at _best_ has security through obscurity. Security through obscurity can be broken if you’re willing to spend enough money and time (some of the money being spent on _other_ iPhones that you are willing to brick).

    • I don’t think that’s really accurate. What we have is the obvious fact that all security systems other than a one-time pad are vulnerable to exhaustive search (“brute force attack”). The only question is how long the search takes. Modern ciphers have keys long enough that exhaustive search is not practical (and they have been studied long and hard to give confidence in the absence of shortcuts). The same goes for passwords or PINs: you can brute-force those too; again, the question is how long it takes (see the rough numbers after this comment).
      Apple added the additional security mechanism of erase after a number of failed attempts, which certainly is an effective way to make exhaustive search difficult. A similar but less drastic way is to slow down attempts after a few failures; login passwords are sometimes handled that way.
      But as I pointed out, the feds, if they wanted to, could reverse engineer the whole thing and build an exhaustive search engine that goes around the erase machinery entirely. What we have here is an attempt to get that built the cheap way, by making Apple pay for it rather than the Feds.
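
      The rough numbers, using a deliberately generous hypothetical attacker to make the contrast stark:

      ```python
      # Toy comparison of search spaces, assuming an (absurdly fast) attacker
      # testing one billion candidates per second. The point: the PIN, not the
      # cipher key, is the tractable target.
      TRIES_PER_SECOND = 1e9
      SECONDS_PER_YEAR = 3.15e7

      pin_space = 10 ** 4   # 4-digit PIN
      key_space = 2 ** 128  # modern cipher key (e.g., AES-128)

      print(f"4-digit PIN: {pin_space / TRIES_PER_SECOND:.5f} seconds")
      print(f"128-bit key: {key_space / TRIES_PER_SECOND / SECONDS_PER_YEAR:.2e} years")
      # 4-digit PIN: 0.00001 seconds
      # 128-bit key: 1.08e+22 years
      ```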

  8. FYI. The NYPD says that they have 170 Apple iPhones they want cracked too. (Reported by NBC Nightly News Thursday night.)

    I don’t think that this will be the only phone, not by a long shot.

    Someone else asked whether the PIN was written down in their apartment. I suppose we will never know, since they let everybody and all the media wander through the apartment days after the attack.

    • The other problem is that it isn’t just US authorities that want to break into people’s phones. It’s also the French, and UK, and Chinese, and Russian authorities.

  9. Turns out that the fibbies HAD the password and inadvertently changed it to something they don’t know. Score another one for those who tell us they will keep us safe.

  10. Pingback: Apple stands on principle
