US government asks Apple to help it brute-force iOS security

Apple often touts the security and privacy benefits of its iOS platform. The company takes a variety of measures to ensure that users' data is protected, like device encryption that's on by default and a trusted chain of execution to ensure the integrity of Apple software running on a phone or tablet. Problem is, suspected criminals use iOS devices, too, and law enforcement agencies in the United States have long desired an easy way around those privacy protections.

Apple has steadfastly resisted building that sort of skeleton key or backdoor into iOS, because it's generally understood that building backdoors into cryptosystems weakens them. Even so, that hasn't stopped the US government from continuing to demand that kind of access to Apple devices.

That conflict is now boiling over. A court order issued yesterday as part of the investigation into the 2015 San Bernardino terror attack compels Apple to provide the United States Federal Bureau of Investigation with a "signed iPhone Software File, recovery bundle, or other Software Image File" that can be loaded onto an iPhone 5C seized as part of the investigation. The court says that software should perform the following actions:

(1) it will bypass or disable the auto-erase function whether or not it has been enabled; (2) it will enable the FBI to submit passcodes to the SUBJECT DEVICE for testing electronically via the physical device port, Bluetooth, Wi-Fi, or other protocol available on the SUBJECT and (3) it will ensure that when the FBI submits passcodes to the SUBJECT DEVICE, software running on the device will not purposefully introduce any additional delay between passcode attempts beyond what is incurred by Apple hardware.
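Stripped of legalese, the three conditions amount to removing every obstacle to a textbook brute-force loop. A minimal sketch of what such a tool would enable, where `ToyDevice` and its `unlock` method are purely hypothetical stand-ins for a phone with the auto-erase and delay protections already disabled:

```python
import itertools

class ToyDevice:
    """Hypothetical stand-in for the seized phone, modeled WITHOUT the
    protections the order wants disabled: no auto-erase, no added delay."""
    def __init__(self, secret: str):
        self._secret = secret

    def unlock(self, code: str) -> bool:
        """Stand-in for submitting one passcode via the physical port."""
        return code == self._secret

def brute_force(device: ToyDevice, length: int = 4):
    """With auto-erase and escalating delays gone, the attempt rate is
    limited only by hardware: just walk the whole numeric keyspace."""
    for digits in itertools.product("0123456789", repeat=length):
        code = "".join(digits)
        if device.unlock(code):
            return code
    return None
```

Against a 4-digit passcode with no artificial delays, a loop like this finishes almost instantly; the auto-erase and delay features are the only things standing in its way.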

As the basis for its order, the government cites the All Writs Act, which says in part that "the Supreme Court and all courts established by Act of Congress may issue all writs necessary or appropriate in aid of their respective jurisdictions and agreeable to the usages and principles of law."

Apple CEO Tim Cook publicly responded to this order in a fiery statement published this morning. He says the company "has no sympathy for terrorists," and that it's cooperated with the FBI's investigation so far. Cook says that the company has turned over relevant data that's in Apple's possession, and that it's made Apple engineers available to the FBI to advise the agency on its options for the investigation.

Now, though, he says "the U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create. They have asked us to build a backdoor to the iPhone." While the government has requested that any solution that Apple makes run only on the iPhone in question in this case, Cook says that assurance makes no sense. If the company creates a method of back-dooring this particular iPhone, he argues, that same method could be used to unlock any number of iPhones. Worse, he suggests, is the potential that malicious actors would seek to exploit the same vulnerability once it's revealed as a useful attack vector in the first place.

Cook further argues that the FBI is creating a dangerous precedent by doing an end-run around legislative action from the United States Congress and instead seeking what he calls an "unprecedented use" of the All Writs Act. He fears that if this legal justification is considered allowable, it would create "chilling" precedents that would be used to justify surveillance acts like intercepting users' messages, granting government access to users' personal data, or using a phone's microphone or camera, all without the user's knowledge.

Apple says that it plans to oppose the order. According to the court, the company has five business days to respond if it believes that the conditions of the order are "unreasonably burdensome," so we'd expect a detailed response soon.

Comments closed
    • derFunkenstein
    • 4 years ago

    I think I’m coming around on this. A law professor at George Washington University put it in pretty plain terms:

    [url]https://www.washingtonpost.com/news/volokh-conspiracy/wp/2016/02/18/preliminary-thoughts-on-the-apple-iphone-order-in-the-san-bernardino-case-part-1/[/url]

      • VincentHanna
      • 4 years ago

      Yup.

      All the people, on both sides, claiming that this is “clearly” anything (legal, illegal, whatever) are wrong. This case [b][i]has to[/i][/b] go to court. There are no two ways about it. The constitutional "questions" are ignorant red herrings. This is [b][i]NOT[/i][/b] a constitutional issue, but Apple has a legitimate complaint that should be considered.

      The primary concerns of the case, as I see it, are: 1) whether Apple is being compelled to perform a function that the government should be responsible for performing under the All Writs Act; 2) whether there is an "alternate" method available to the government that would invalidate the invocation of the All Writs Act (and whether the FBI screwing up the most reasonable alternate method matters); and 3) whether re-engineering the 5C's OS is a "reasonable" request under the definition provided for in the All Writs Act. None of these questions is decided at this point.

    • chuckula
    • 4 years ago

    Yeah, argument over: even Ars Technica says that the FBI isn’t backdooring everybody’s iPhone: [url]http://arstechnica.com/apple/2016/02/encryption-isnt-at-stake-the-fbi-knows-apple-already-has-the-desired-key/[/url]

    To those of you who think you've outsmarted all the PhDs with cryptography experience with your magical "but if Apple signs one piece of firmware then ALL DEVICES EVERYWHERE ARE NOW BACKDOORED FOREVER!" argument, please listen to your oracles over at Ars:

    [quote]Assuming that this can be done (and done robustly), [b][i]it means that even if the custom firmware were given to nation-states or even published on the Internet, it would not serve as a general-purpose way of performing brute-force PIN attacks. It would be useless on any device other than the San Bernardino device.[/i][/b] To make such leakage less likely, the court order does allow for the possibility that the custom firmware might be used only at an Apple location, with the FBI having remote access to the passcode recovery system.[/quote]

    If the pro-Snowden, pro-Apple cheerleader squad can't even defend Apple, then you know it's over. While I'm sure I'll be downthumbed for having technical competency, the argument that Apple signing a particular firmware means that all governments everywhere can now force that firmware onto all phones is not only flat wrong, it also shows that the people making it think Apple has ALREADY PROVIDED the magical "backdoor" the governments want. Here's why:

    1. Did Apple release an old software update that had bugs sometime in the past? Sure it did.

    2. Did Apple *sign the old buggy software using its key*? Sure it did.

    3. QED, every government everywhere can now force the old software with the bugs... you know, those magical "backdoors"... onto all iPhones everywhere. After all, those old software updates had the "magical signature" from Apple!

    4. Except that point 3 isn't true (even though Apple's own fanboys are saying it's true), because somebody at Apple actually knows what a [url=https://en.wikipedia.org/wiki/Replay_attack]replay attack[/url] is and put in mechanisms to prevent those attacks.

      • blastdoor
      • 4 years ago

      [quote]it means that even if the custom firmware were given to nation-states or even published on the Internet, it would not serve as a general-purpose way of performing brute-force PIN attacks. It would be useless on any device other than the San Bernardino device.[/quote]

      This might be true in a narrow, literal sense. If the firmware binary is placed on the Internet, it won't be useful on any other phone, provided Apple is indeed able to make a firmware specific to that iPhone (using a unique identifier). Let's assume they can. And if the source code for the firmware were posted, that [b]in and of itself[/b] would still not be enough to attack every phone, because you would need Apple's digital signature. However:

      1. The source code for such firmware could contain information not previously released or understood about how iPhone security works. That would mean iPhone security has been weakened.

      2. This is as much about law as it is about technology, and it really is possible to set dangerous precedents with law. In this case, the precedent would be that the government can compel a company to create new software to defeat its own security measures. That creates an incentive for companies to build only security measures they know they can defeat. And since companies like to minimize costs, they will have an incentive to create security measures that are not very costly to defeat. That sure sounds like weaker security to me. And who gets hurt by it? Mostly ordinary people, because competent criminals and terrorists will use other means of securing their data and communication.

      I increasingly believe that this is not a black-and-white situation. It is not "obvious" that Apple could comply with this court order without any negative consequences. It is also not obvious that there is no way Apple could ever help the government gain access to data on a device that would be helpful in an important investigation.

      If there is a way to thread this needle, I think it will require a very carefully crafted new law... but I'm not optimistic that we can get such a thing in the current environment. So for now, I continue to side with Apple's position.

        • chuckula
        • 4 years ago

        Blah Blah Blah Blah Blah.

        Guess what: If Apple complies with this order AND THEN THE FBI DROPS A NUCLEAR BOMB ON SAN FRANCISCO then something bad will happen.

        Obviously, Apple MUST not comply with the law to prevent San Francisco from being obliterated in a nuclear attack!

          • blastdoor
          • 4 years ago

          Oh wow, now I see everything. You’re so right — it really is black and white. All answers are easy answers, you know what they are, and only fools and tools disagree with you.

            • chuckula
            • 4 years ago

            I’m sorry: You lost all rights to whine like an idiot after the following occurred:
            1. You used your clearly incomplete technical “knowledge” to come to a technical conclusion about the situation that was dead-wrong as I pointed out — and as your Apple pals over at Ars have now confirmed.

            2. You started acting like you & Timmy can rewrite the Constitution to completely change what the fourth amendment means and then — to top it off — claim that you are the beacon of “freedom” for outlawing all access to the encrypted devices that terrorists use to murder people while simultaneously saying that the rights of people who have never committed any crimes should be arbitrarily rescinded because you think guns are scary.

            Sorry sunshine, I’ve had it with your two-bit hypocrisy and frankly, Tim Cook was pretty competent at organizing Chinese slave laborers into getting iPhone parts made on time, but he clearly ain’t a technical or legal whiz.

            • blastdoor
            • 4 years ago

            1. What technical conclusion did I come to? Could you quote it?

            2. When did I say anything about the 4th amendment? Could you quote that?

            edit:

            Wait — what? Terrorists use phones to kill people? How does that work exactly? Up until now, I thought they primarily relied on guns and bombs.

            One thing that you are actually basically right about (it might be the closest you’ve ever come to replying to something I’ve actually said, rather than your imagination) — I’m very anti-gun. I want the 2nd amendment repealed. You can quote me on that.

            • blastdoor
            • 4 years ago

            BTW, shouldn’t you have said that I’m entering a world of pain?

            [url]https://www.youtube.com/watch?v=t_TdCs9GA4w[/url]

            Every time you start ranting, that's the scene I think of.

      • VincentHanna
      • 4 years ago

      That is a whole lot of text to hang on a sentence that begins

      “[b]Assuming[/b] that this can be done.”

      Such competency. Much technical.

    • ZGradt
    • 4 years ago

    Meh. If you think your data is safe behind a 4 digit passkey, you’re dreaming. I’m sure the NSA eggheads are laughing at everyone while this farce plays out. I’m resigned to the fact that whatever encryption I use is at most a nuisance to the powers that be, and only a real deterrent against the laziest of criminals.

      • VincentHanna
      • 4 years ago

      If the entire device is properly encrypted (all new iOS/Android devices are) and the passkey (which is linked to the encryption key) is deleted after 10 failed attempts, it’s pretty secure. It’s as secure as the encryption you are using.

      Too many people (even in technology circles) act like the NSA’s hacker monkeys have magic powers or something. Not even the NSA can brute-force a 128-bit or 256-bit encryption scheme using modern hardware; exhausting the keyspace would take something on the order of a few billion years on average.

      Honestly, do you think the FBI would really care/pursue this writ if they could just break AES instead?
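The "billions of years" claim is, if anything, an understatement, and the arithmetic is easy to check. A quick back-of-the-envelope in Python, where the trillion-guesses-per-second rate is an arbitrary, deliberately generous assumption:

```python
# Even at an absurdly generous trillion (1e12) key trials per second,
# a 128-bit keyspace outlasts the age of the universe many times over.
# (A 256-bit keyspace is another factor of 2**128 beyond this.)
keyspace = 2 ** 128
guesses_per_second = 10 ** 12
seconds_per_year = 60 * 60 * 24 * 365

years_to_exhaust = keyspace / (guesses_per_second * seconds_per_year)
print(f"{years_to_exhaust:.2e} years")  # roughly 1.08e+19 years
```

That is about ten billion billion years to sweep the full keyspace, which is why the FBI is asking for help with the passcode rather than attacking the encryption itself.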

        • Master Kenobi
        • 4 years ago

        Yes because the cracking/hacking methods utilized by CIA/NSA/etc are not admissible in a court of law without producing the “how” they accomplished the task, which lets everyone know how it was done. I don’t see any of the other 3 letter agencies being willing to compromise any such techniques if they have them. The FBI needs this to be strictly legal and admissible in court, hence why they are trying to compel Apple to assist.

          • VincentHanna
          • 4 years ago

          I’m sorry, what I heard was “Yes, I believe in magic NSA hacker monkeys because it suits my paranoia better”

          • BIF
          • 4 years ago

          You’re not paying attention.

          The very nature of “brute force” is to try all possible combinations, and to try them as fast as possible.

          This is not possible if the device is locked (or wiped) after 10 tries, or even 100 tries. Even a 4-digit PIN, with 10,000 possibilities, is unlikely to be cracked in 100 attempts, which covers only 1% of the possibilities; 10 tries covers just 0.1%. Better guess well!

          There are a lot of password-locking strategies. Some lock after only 3 bad guesses, others after 5 or 10. Some make you wait a period of time after a certain number of bad guesses (I have heard that newer iOS devices do this, but don’t know for sure).

          Whether or not they make you wait, I know of very few that will allow you to keep trying endlessly. Except my Masterlock on my locker at the health club.
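The 0.1% figure above checks out. Assuming a uniformly random 4-digit PIN, each distinct guess has a 1-in-10,000 shot, so the success probability for an attempt budget is just a ratio:

```python
from fractions import Fraction

pin_space = 10 ** 4      # PINs 0000 through 9999
allowed_attempts = 10    # device wipes (or locks) after 10 failures

# With a uniformly random PIN, k distinct guesses succeed with
# probability k / 10000.
p_success = Fraction(allowed_attempts, pin_space)
print(p_success, "=", float(p_success))  # 1/1000 = 0.001
```

Of course, real PINs are not uniformly random (1234, birthdays, and repeated digits are overrepresented), so a smart attacker ordering guesses by popularity does somewhat better than this bound suggests.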

            • VincentHanna
            • 4 years ago

            It’s interesting to me that Apple went with such a hard-line approach, honestly. Not because I think it’s “unconstitutional” or some nonsense like that, but because I think it’s impractical. Unless I’ve missed some aspect of their implementation, it seems like anyone who has a young kid and an iPhone is simply asking for trouble, given that 10 attempts will, in essence, factory-wipe the phone… Plus, that seems like a lot of power to give to anyone who happens to find a phone under a park bench, bullies on a playground, etc.

            To me, it’s more surprising that iPhone users haven’t been rioting in the streets over this feature. I guess that’s because most people don’t enable it in the first place.

    • Captain Ned
    • 4 years ago

    New factoid which I hadn’t previously seen: it seems the dead jihadi’s phone is owned by his employer, the San Bernardino County Public Health Department, and they have consented to the search.

    Any legal basis for Apple to deny the effort just evaporated in my mind.

    [url]http://arstechnica.com/apple/2016/02/encryption-isnt-at-stake-the-fbi-knows-apple-already-has-the-desired-key/[/url]

      • willmore
      • 4 years ago

      Not in mine. Whose phone it was never had any bearing on demanding the support of a third party.

      Want fun facts? Then realize this demand was made in the case of the guy who sold them the guns they used. It’s not a ‘terrorism’ case at all.

      How about this, why don’t they demand that the San Bernardino County Public Health Department provide the unlock code or, better yet, demand that they use the remote unlocking ability they provisioned the phone for. They did what any responsible employer does with devices they hand to employees, right?

        • VincentHanna
        • 4 years ago

        [quote]Want fun facts, then realize this demand was made in the case of the guy who sold them the guns they used. It's not a 'terrorism' case at all.[/quote]

        So? They have consent, and they have a warrant. What type of case it is, or who is being prosecuted, is irrelevant.

        [quote]How about this, why don't they demand that the San Bernardino County Public Health Department provide the unlock code or, better yet, demand that they use the remote unlocking ability they provisioned the phone for.[/quote]

        This, I agree with. If this option exists, then the FBI clearly has a responsibility to pursue it, since the All Writs Act specifically applies only in cases where no other options are available. In fact, if this option is available, I would think it would supersede the present request even if the San Bernardino County Health Department hadn't given consent.

      • VincentHanna
      • 4 years ago

      Well then you don’t understand the basis of their challenge, since whether it is a court-ordered search or an employer-authorized search is irrelevant.

      The basis of their challenge is related to the government compelling them to produce a method to invalidate their own security, because they (the government) lack the ability to break Apple’s encryption themselves.

      The equivalent would be forcing a safe manufacturer to disable the (insert contents-destroying security feature here) so that the feds can buzz-saw in. Even assuming that it is [i]possible[/i], doing so would invalidate the security feature, which is a key aspect of the thing they are making.

      Apple still has a STRONG legal basis to maintain that it is unlawful to mandate that they make an inferior product (or make a product inferior post hoc) just because the FBI doesn't have the right tools to execute the search. Whether or not the government is in possession of a warrant, it is also the government's responsibility to EXECUTE the warrant, not just obtain it.

      • BIF
      • 4 years ago

      This could be fixed in future versions of iDevices.

      Provide a feature that allows for an owner and a user. In some cases those will be the same person. In other cases, such as a device owned by a company and distributed to its employees, they will be different.

      Many companies make it clear to their employees that any information stored on a company device is not necessarily private. It’s the same as using a company email system for personal use.

      This feature would also allow parents to be registered as the owner, with full access to a device they give to their children.

      By providing for separate owner and user entities, Apple then removes itself from cases such as this.

      But in the case of a personal device for whom the owner is the same as the user, then the security needs to be inviolate. If it’s not, then nothing else matters and there’s no reason for people to stay with that technology.

    • adisor19
    • 4 years ago

    I don’t think y’all understand the implications here.

    The FBI is asking Apple to use its private key to sign a new special firmware that removes the anti-brute-force protections from the code. This way, the FBI will be able to brute-force the passcode and gain entry to the data on the device.

    Since this firmware will be signed with Apple’s private key, it’s reasonable to think it could then be flashed onto other iPhones with some competent hacking. In other words, once the backdoor is created, the cat is out of the bag and cannot be put back in.

    Tim’s point is that they would be betraying their customers’ privacy and trust, and I completely agree with him. I hope the courts will see how dangerous this precedent is and reverse the current order.

    Adi

      • blastdoor
      • 4 years ago

      [quote]Since this firmware will be signed with Apple's Private Key, it's reasonable that it could then be flashed on other iPhones with some competent hacking etc. In other words, once the backdoor is created, the cat is out of the bag and can not be put back in.[/quote]

      There are many key points, but this is one. *Is* it reasonable to believe that once created, this cat would be out of the bag? That's a human question as much as it is a technical question (really, it's more of a human question than a technical one). I think we all make a mistake if we assume it's "obvious" how slippery or non-slippery the slope is.

      It *might* be possible to find a way to balance concerns about data security for law-abiding entities with law enforcement's concern that it be able to access data from law-breaking entities. But I don't think the FBI gives a crap about balancing competing, legitimate concerns. I think Cook is more sensitive to the FBI's perspective than they are to his, but ultimately he is also not the guy who should make the final judgment.

      Ideally we would get a new law. But given the degree of political dysfunction in the US (which is almost entirely the fault of the right), we won't get a new law anytime soon. In the meantime, we have to choose among suboptimal options. My inclination is to side with Cook on this. But the more I think about it, the more likely I think it is that he will ultimately lose. So I guess I hope I'm wrong about everything.

      • willmore
      • 4 years ago

      One would think that an Apple fanboy would know more about how this actually works. When an iDevice requests an OS update, the update given to it is signed only for that device. That update isn’t good for any other device.

      I still think the FBI is in the wrong here.
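The per-device signing willmore describes can be sketched with textbook RSA and a toy 12-bit modulus; real code-signing keys are thousands of bits, and `toy_digest` below is a stand-in for a real cryptographic hash like SHA-256. The point is that the signature covers the device's unique ID, so the same signed blob fails verification on any other device:

```python
# Toy RSA keypair: n = 61 * 53, with e * d ≡ 1 (mod lcm(60, 52)).
# Purely illustrative; never use key sizes like this for real.
n, e, d = 3233, 17, 2753

def toy_digest(firmware: bytes, device_id: str) -> int:
    """Stand-in for a real cryptographic hash, kept small enough for
    the toy modulus to sign."""
    return (sum(firmware) * 131 + sum(device_id.encode())) % n

def sign_update(firmware: bytes, device_id: str) -> int:
    """'Personalize' an update: the signature covers the firmware AND
    the target device's unique ID (think ECID)."""
    return pow(toy_digest(firmware, device_id), d, n)  # private exponent

def device_accepts(firmware: bytes, device_id: str, sig: int) -> bool:
    """Each device verifies against its OWN ID with the public exponent,
    so a blob personalized for one device fails on every other."""
    return pow(sig, e, n) == toy_digest(firmware, device_id)

fw = b"custom unlock firmware"
sig = sign_update(fw, "ECID:12345")
print(device_accepts(fw, "ECID:12345", sig))  # True
print(device_accepts(fw, "ECID:99999", sig))  # False
```

This also shows what the order would and wouldn't hand over: the FBI would get a signed blob, not the signing key, which is why the debate centers on the precedent and on the blob leaking rather than on the key itself.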

        • adisor19
        • 4 years ago

        Hence the “with some competent hacking” part in there.

        But now that I think about it, this isn’t just about that; it’s about the government being able to force any tech company to crack open users’ devices, and that’s a slippery slope. It’s a big story now, but if Apple fails to defend its position and is forced to crack this phone, it sets a precedent for any such future request.

        Adi

    • the
    • 4 years ago

    So ummm… why can’t the FBI have the NAND chip desoldered and a duplicate made via a low-level flash dump? Each duplicate would be encrypted as well, but this prevents data loss if too many repeated tries are made and the wipe-on-failure option is enabled (which has yet to be determined). Worst case, they’d have to put a BGA socket onto an iPhone motherboard (technically feasible but difficult) so they could swap the NAND chip after so many failed attempts.

    Also, as far as automated code entry to brute-force a working code goes, couldn’t the FBI put another piece of hardware between the iPhone’s logic board and the touchscreen cable to mimic presses? Sure, this wouldn’t be as fast as a raw script running over a Lightning cable, but it wouldn’t require a human to manually enter every combination.
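The desolder-and-duplicate idea can be modeled in a few lines. `ToyPhone` and the restore cadence below are invented for illustration, but the logic is the NAND-mirroring approach security researchers later demonstrated on a real iPhone 5c: snapshot the flash, burn nine guesses, and re-flash before the tenth failure can trigger the wipe.

```python
import copy
import itertools

class ToyPhone:
    """Toy model of a locked phone: 4-digit PIN, wipes after 10
    consecutive failed attempts. All state lives in 'NAND'."""
    def __init__(self, pin: str):
        self.nand = {"pin": pin, "failures": 0, "wiped": False}

    def try_pin(self, guess: str) -> bool:
        if self.nand["wiped"]:
            return False
        if guess == self.nand["pin"]:
            return True
        self.nand["failures"] += 1
        if self.nand["failures"] >= 10:
            self.nand["wiped"] = True
        return False

def mirror_attack(phone: ToyPhone):
    """Dump the NAND once, then restore it every 9 failed guesses so the
    failure counter never reaches 10, covering all 10,000 codes."""
    snapshot = copy.deepcopy(phone.nand)
    for i, digits in enumerate(itertools.product("0123456789", repeat=4)):
        if i % 9 == 0:
            phone.nand = copy.deepcopy(snapshot)  # re-flash the chip
        code = "".join(digits)
        if phone.try_pin(code):
            return code
    return None
```

In the real world each re-flash costs desoldering time rather than a dict copy, so the attack is slow and fiddly, but it sidesteps the wipe entirely, which is exactly the commenter's point.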

      • blastdoor
      • 4 years ago

      Cheaper to force Apple to do stuff than do stuff themselves.

        • the
        • 4 years ago

        Cheaper/easier may be true, but that doesn’t necessarily translate into [i]should[/i]. Especially in light of an alternative that would not compromise security for the rest of us.

          • blastdoor
          • 4 years ago

          I agree

      • nafhan
      • 4 years ago

      Grabbing the NAND is what they would do if they cared about the data… They probably don’t need the data, though. It’s unlikely that it’s important or will reveal anything new.

      They care about forcing Apple into a position where it provides a firmware that lets the FBI easily get data off other iPhones where it has a more tenuous legal position for requesting the data. I think Cook is making that point in his statement.

    • lmc5b
    • 4 years ago

    I think everyone fails to realize the point: even if the ability to unlock iPhones doesn’t leave the FBI or Apple, from now on they are gonna get Apple to unlock every iPhone they want. While they claim this is an isolated case, it always starts out that way, and in a few years it will be done over a parking ticket.

    • trackerben
    • 4 years ago

    The FBI is asking a US firm to provide localized physical means to access files which may provide clues on further planned attacks on American citizens and property: the kind of ready exploitation kit that Microsoft and other willing US firms have been providing to US law enforcement and the military for decades.

    As I’ve said elsewhere, the US firm claims it has a contrary new need derived from commercial obligations, which it and its allies believe overrides the precedents of a critical US anti-terror investigation. This is the issue here: which party has the superior need and can justify it not just lawfully but also morally. In the end it resolves in potential costs to either corporate resources or else personal lives.

    Which need one honors depends on which of the following goals one would willingly uphold in this case: an industrial coalition’s objective to secure its stakeholders’ secrets and loyalty, or a public agency’s duty to protect its nation’s people and property.

    • blastdoor
    • 4 years ago

    The San Bernardino attackers used guns.

    Buying guns is generally legal and easy in the United States. The FBI could not have stopped these people from buying the guns that were used to do the killing.

    The FBI cannot stop others from buying guns in the future.

    arguing about encryption kind of misses the elephant in the room

      • lmc5b
      • 4 years ago

      Before going for the same comment that always gets thrown around, you should realize that these attacks only happen in “gun free zones,” where regular people DON’T HAVE A GUN to defend themselves, and those areas are targeted exactly because of that fact.

        • Tirk
        • 4 years ago

        That would be interesting IF true, but it’s not.

        For one, San Bernardino is not in any way a “gun free zone”; any one of those people at the party could have legally had a gun if they wanted to.

        Second, these crimes happen where the perpetrators happen to live; the likelihood of someone traveling hundreds of miles to another state to commit a shooting, when they could commit it in their home state, is far lower. Although the gun fanatics occupying Oregon public lands might become the exception, if we realize just how idiotic it is to allow a bunch of white men carrying guns to freely travel to a different state and take over any land they want…… oh wait.

        Third, have you ever looked at gun death statistics? I find it humorous that so many people think what you said is true when there is ABSOLUTELY NO DATA to support it. There is, however, data that consistently shows increased gun deaths per person with increased gun ownership. I am personally fine with limited gun ownership, but I DO NOT support it by making up facts.

      • DrCR
      • 4 years ago

      The San Bernardino attackers broke gun laws. Solution: Pass more gun laws.

        • blastdoor
        • 4 years ago

        Gun laws in the U.S. are ineffective because they are weak half measures that are then poorly enforced. But they are weak half measures because of the second amendment.

        The real solution is to repeal the second amendment and outlaw the sale of guns to, or ownership by, anyone other than the police or military. After the passage of enough time, it might become possible to even outlaw guns for the police (as is the case in the UK).

        It would be infeasible to actively go after guns already in private hands, but it can still be made illegal to own them, illegal to carry them, illegal to buy/sell them, and illegal to buy ammunition for them.

        I realize of course that these things will not happen for at least another 30 years. So in the meantime, we’ll continue to argue over second order crap.

          • Anovoca
          • 4 years ago

          Making gun laws will stop gun violence about as well as anti-murder laws have stopped us from killing each other.

            • blastdoor
            • 4 years ago

            So… do you seriously think that if murder were legal there wouldn’t be more murders?

            • Anovoca
            • 4 years ago

            Laws are nothing more than writs. Writs are pieces of paper. Pieces of paper do not stop bullets or knives, nor do they keep your home safe from burglars; they cannot stop a car from being started by a drunk person, or keep your wife from sleeping with the neighbor. They are simply pieces of paper.

            Morality is the only thing that prevents these things. Laws cannot teach people morality; they can only set a precedent of accountability.

            • blastdoor
            • 4 years ago

            I repeat: do you seriously think that if murder were legal there wouldn’t be more murders?

            edit: I find it revealing that my question cannot be answered with anything other than down votes. I think it’s because the answer is obvious. Of course there would be more murders if murder were legal! Your argument is absurd and you can’t defend it.

            Guess what — money is just pieces of paper, too. If that has no meaning to you, I’d be happy to take all of yours off your hands.

      • trackerben
      • 4 years ago

      The US is a unique case. Guns are readily smuggled in for use in massive terror attacks even in heavily controlled regions like Western Europe. Regulating US gun ownership will only catch the low-hanging safety fruit. Once that fruit is exhausted, the real difficulties begin, as they discovered in Paris.

        • blastdoor
        • 4 years ago

        I see no reason not to catch low hanging fruit.

          • trackerben
          • 4 years ago

          That’s true, but there’s a follow-on complication: community-focused gun ownership offers some alternatives to inadequate policing. You may have gotten the low-hanging fruit, but you’re also letting the tree flourish with new branching problems. Once you’ve eliminated guns in the hands of responsive citizens, you’ve eliminated an entire class of law & order options appropriate to lightly governed regions like wildernesses, unsecured borders, and restive neighborhoods.

          For most communities, the police serve more as a deterrent than as mitigation against stranger crime. That’s why anyone of great importance or wealth who can afford or rate it usually gets bodyguards. When rogues are less deterred, because they’ve either armed up relative to their victim class or are exploiting service failures during riots or natural disasters, you’ve got a class of opportunistic attackers slipping past thin deterrent layers, against which the community has developed no robust point defenses.

          Ironically, Apple’s argument for a deep and pervasive point-security model applies better here than in the case for after-the-fact investigation, because it involves securing the most important thing of all.

            • blastdoor
            • 4 years ago

[quote<] For once you've eliminated guns in the hands of responsive citizens, you've eliminated an entire class of law & order options appropriate to lightly governed regions like wildernesses, unsecured borders, and restive neighborhoods.[/quote<]

I think that's silly romanticism for the American west. It has no relevance today. Nobody in the US *needs* a gun, outside of police and professional security. For the vast majority of gun owners, the gun is a symbol of power, an illusion of control, or a glorified toy. It just makes small men feel big.

But almost everybody has a genuine need to keep their financial and other sensitive data out of the hands of organized crime. I place much more value on having my data secure than on being able to own a gun.

            • dextrous
            • 4 years ago

            As long as there are bad people in the world, I want to be able to legally defend myself and my family via the best means possible. Put another way – I want the most effective arms available to give me the best chance of survival if it came down to it.

            Some people are scared of guns. Some people think guns can somehow do evil things on their own. In reality, a gun is a tool. A tool in the hands of somebody with bad intentions results in bad things. Even if you could prevent anybody in the world from owning a single gun, bad people would still do bad things – only then we’d start talking about outlawing knives, pitchforks, etc. The logical conclusion of the thinking “people kill people with guns, so we need to ban guns” is “people kill people with their bare hands, so we need to cut off everybody’s hands at birth.”

Instead, we need to allow law-abiding citizens an effective way of defending themselves from bad people. Btw – the 2nd amendment was really about people being able to defend themselves against a repressive government.

            • trackerben
            • 4 years ago

It would be a little silly in heavily patrolled areas and/or with culturally homogeneous and peaceful populations, e.g. Japan. The US doesn’t enjoy that best case; it’s a big country with varied situations, and I think gun ownership is best regulated differently according to circumstances, by lower-level authorities who know their local situation. Some peaceful cities can get away with restricting usage in public; others, where there’s little confidence in community policing, may have to relax carry and engagement rules a bit.

            I sure agree privacy between parties must be preserved, and only pierced if lawfully warranted in exceptional cases. I think this case shows Apple is closest to delivering a broadly accepted platform for information privacy.

            • Krogoth
            • 4 years ago

            Sorry, but I never will place my complete trust and safety in hands of any government. I will also never relinquish my rights that allow me to defend myself.

Weapons are a necessity for civilization, whether guns, blades, missiles, etc. They help keep honest, civilized people civilized and honest.

You are hopelessly naive and/or foolish to think otherwise.

            • VincentHanna
            • 4 years ago

            So the wild west was more “civilized and honest” than today?

        • Krogoth
        • 4 years ago

        Gun Control is simply ineffective at deterring violent crimes.

Criminals will be able to acquire guns from underground sources while law-abiding citizens continue to suffer through the bureaucratic nightmare involved with Gun Control measures. The only crowd that benefits from Gun Control is authoritarian governments.

          • trackerben
          • 4 years ago

Gangs and narco-state militias also benefit a lot. Civil society can quickly get more brittle and over-reactive than many think. During the L.A. riots, entire cities locked down, and armed volunteer militias quietly formed in many neighborhoods.

          After Koln, many gun dealers in Germany and Austria ran out of basic inventory almost immediately, nationalist rallies have grown like wildfire, and politics have turned toxic to a degree not seen since the 1940s. Or since the 2000s, if the US is included. The resort to effective means of self-protection is returning to Europe in waves of fear due to government failure. [url<]https://www.youtube.com/watch?v=UKAQX74yRyc[/url<]

      • Anovoca
      • 4 years ago

      [quote<] arguing about encryption kind of misses the elephant in the room [/quote<] The FBI is terrible at prevention?

        • Tirk
        • 4 years ago

        I’m actually surprised there are not more terrorist attacks in the U.S. considering how much we interfere militarily across the globe.

I don’t expect an institution to be perfect at stopping all crime; the FBI seems to be doing a good job considering the global climate created by U.S. actions abroad.

      • NovusBogus
      • 4 years ago

      Clearly civilians with guns killing a few thousand people a year is a problem that can only be solved by restricting their use to governments that kill a few thousand people every day. They’re the responsible ones, after all.

        • Krogoth
        • 4 years ago

People will kill each other regardless of whether you outlaw weapons in any shape or form.

Prohibitions just give more power to authoritarian and criminal types.

    • oldog
    • 4 years ago

    Remember when MS said IE couldn’t be removed from Windows?

    • UnfriendlyFire
    • 4 years ago

    And how long would it take for a rogue FBI or NSA agent to sell the backdoor info on the blackmarket for loads of Bitcoins?

    Not too long ago, a Department of Energy employee attempted to sell nuclear info to what he thought were terrorists or hostile nations, and instead was caught by undercover agents. He also offered to help hackers break into the Department of Energy’s servers, and provided an email list of over 50 employees for the hackers to phish.

From what I’ve heard, he’s only getting 10 years in prison for attempted high treason, assisting with hacking the US government, and selling nuclear info to terrorists or countries such as North Korea or Iran.

    • blastdoor
    • 4 years ago

    Thinking about this from the terrorist point of view…

    Would I really trust Tim Cook that Apple’s products are secure? I mean, he’s a rich gay capitalist who runs the most profitable company in the world, and that company is headquartered in The Great Satan. I’m sure terrorists have a range of motivations and beliefs, but I’m guessing 99% of them aren’t inclined to trust a person or company that fits that description.

    So if I’m a competent terrorist, I think I’d rely on encryption techniques that can’t be broken just because the FBI issues a subpoena.

    So…. if forcing Apple to do this doesn’t hurt competent terrorists, what’s the upside here? I guess we could slow down incompetent terrorists, but incompetent terrorists are likely to make plenty of other mistakes. Do we really want to screw everyone else for such a limited potential benefit?

      • davidbowser
      • 4 years ago

      I generally don’t give terrorists as much credit as you do, simply because I think they start out as normal folks and normal folks can’t balance their checkbook or evidently pick a presidential candidate.

      Your end statement is my thought exactly. This smacks of someone at the FBI on a power-trip with a hard-on for Apple. The phone companies and internet service providers in the US already give up anything and everything at the drop of a hat. Almost everything that exists on an iPhone also exists on a cloud service somewhere. At the very least, the FBI would be able to get every call and website visited by the person in question without ever involving Apple.

        • blastdoor
        • 4 years ago

        Most would-be terrorists probably don’t fall into the “competent” category, so I’m not sure that we really disagree.

It’s just that the competent ones are the ones who really make a big impact (like the 9-11 guys). Those are the ones who are hardest and most important to stop, and least likely to be stopped in ways that depend on US corporations complying with FBI subpoenas.

        The incompetent ones make mistakes left and right, and so can be stopped without compromising everyone else’s security.

Sure — there’s probably a non-zero number of incompetent terrorists who would only be stopped by compromising everyone else’s security. But the cost is compromising everyone else’s security.

      • tipoo
      • 4 years ago

      That’s actually a fair point, lol. Homosexual rich capitalist running one of the most profitable companies on earth based out of the great satan, that sounds like the stuff of terrorist nightmare fuel.

      • VincentHanna
      • 4 years ago

      You assume 99% of terrorists are Muslim. That’s racist.

        • blastdoor
        • 4 years ago

        To quote me:

        [quote<]I'm sure terrorists have a range of motivations and belief[/quote<]

        • Zizy
        • 4 years ago

        Christians would be offended by gay, Russians/Chinese wouldn’t like USA, anarchists don’t like capitalist part…
        The only ones who could possibly side with Tim Cook on all these points are wall street bankers, politicians and similar. Which might be the worst terrorists of them all, but are on the right side of law (the law-making one).

          • blastdoor
          • 4 years ago

          Bingo!

      • hansmuff
      • 4 years ago

The NSA knows very well what they’re asking for, and it has little to do with that one incident. This case, by the way, is about device decryption, which doesn’t help when (as you rightfully point out) terrorists don’t trust Apple to begin with and instead use an Android fork with encrypted SMS, and well, there you go. The government will then ask for more powers, but at least they got device decryption done.

      Piece by piece…

        • HERETIC
        • 4 years ago

There’s more to this than “Is it worth giving up our freedom so we can catch some terrorists?” There’s also the question of how much we trust our govt and law enforcement agencies…

The FBI wants Apple to do this for FREE, a concept that Apple does not understand. If you can’t mark it up 400%, “WE DO NOT UNDERSTAND.”
If there were a profit to be made on this, Apple would comply faster than you or I could blink.

    • NeelyCam
    • 4 years ago

    So much backdoor discussion for a family site…

      • Tirk
      • 4 years ago

      Yes I think it means exactly what you think it means 😉

    • TheJack
    • 4 years ago

A: How are you?
B: Ask NSA.

      • ronch
      • 4 years ago

      Why go to the NSA when you’ll know everything you’ll want to know (and then some) about someone on their Facebook account? I swear, people post the most stupid things on Facebook.

    • ronch
    • 4 years ago

    Well, if those criminals had a phone from the likes of Xiaomi, Oneplus, Lenovo or (wait for it…) DOOGIE I’m sure they would’ve jumped right in and aided the FBI right away, which would get them a lot of publicity.

    A guy walked by a store and saw Doogie phones on display. There’s a particular model that caught his eye. So he went inside the store and asked the personnel, “How much is that Doogie in the window?”

      • derFunkenstein
      • 4 years ago

      You mean the one with the waggly security hole?

      • Voldenuit
      • 4 years ago

      [quote<]A guy walked by a store and saw Doogie phones on display. There's a particular model that caught his eye. So he went inside the store and asked the personnel, "How much is that Doogie in the window?"[/quote<] Such phone. Much wow.

    • xeridea
    • 4 years ago

    Why doesn’t the FBI just call the NSA, they have plenty of experience with backdoors…

      • nanoflower
      • 4 years ago

      Even if the NSA does have the ability to crack an iPhone I doubt they want that information to become public knowledge.

        • NovusBogus
        • 4 years ago

        This, pretty much. If the spooks do it they’ll never be able to use it in a public court case.

      • ronch
      • 4 years ago

      The NSA wouldn’t let them have any data they have. The FBI will have to find a backdoor to the NSA.

        • the
        • 4 years ago

        That isn’t necessarily true. There are rare occasions where the NSA will share data with the FBI.

In this particular case though, it would be in the NSA’s best interest not to share, in hopes that the court forces Apple to modify the firmware. The modified firmware would then eventually find its way into the hands of the NSA.

          • ronch
          • 4 years ago

          It…. was a joke.

    • chuckula
    • 4 years ago

    I have a problem with network-accessible backdoors that allow anyone to waltz in and grab data.

    I have much less of a problem with a specific physical hack on a phone that obviously can’t be done casually or on a mass scale and is obviously done under court order after there’s been more than ample due process.

    Tim Cook is arguing about the former but the court order is more about the latter.

    On a more practical note, what they really need is a proper side-channel attack to extract the key so that they can run offline decryption attacks to crack the passphrase.

      • cobalt
      • 4 years ago

      Not sure I agree with “obviously can’t be done casually or on a mass scale” — I’m not sure how obvious it is that once the firmware is created it won’t be abused or find its way into the wild, and I’d hate to rule out mass attacks based on it.

As I saw others point out elsewhere, given how much info the FBI already has in this specific case, including from Apple, it’s unlikely they need to force Apple to do this for any reason related to the case; they simply want to set a precedent about what they can compel third parties to do (for free?).

        • chuckula
        • 4 years ago

[quote<]Not sure I agree with "obviously can't be done casually or on a mass scale" -- I'm not sure how obvious it is that once the firmware is created it won't be abused or find its way into the wild, and I'd hate to rule out mass attacks based on it. [/quote<]

That's trivially simple to avoid in numerous ways. First, Apple could make it an unsigned firmware that literally can't work via the normal update mechanisms because it lacks a cryptographic signature.

Second -- and even more effectively -- Apple can make a special forensics firmware that literally lacks 99.9% of the functionality of the regular IOS except for a simple interface to enable access to the security chip. The phone would effectively be bricked at that point: no network access, no magical NSA backdoors, and no way to fool some poor innocent end user into thinking that the phone is operating normally when it's not.

As for poor Apple having to pay for all this, consider it a small restitution fee for the $1 Billion they got in shady iPad sales to the LA county school district. It beats sitting in jail next to the school officials who were convicted for corruption.
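The signature-gating idea in that first point can be sketched as a toy model. This is only an illustration: HMAC stands in for the public-key, per-device-personalized signing Apple actually uses, and every name here (the key, the image strings) is made up for the example.

```python
import hashlib
import hmac

# Illustrative vendor signing key; in reality this would be an asymmetric
# private key held only by the vendor, with devices holding the public half.
VENDOR_KEY = b"vendor-signing-key (illustrative)"

def sign(image: bytes) -> bytes:
    """Produce a signature over a firmware image (HMAC as a stand-in)."""
    return hmac.new(VENDOR_KEY, image, hashlib.sha256).digest()

def boot_accepts(image: bytes, signature: bytes) -> bool:
    """The boot chain loads an image only if its signature verifies."""
    return hmac.compare_digest(sign(image), signature)

official = b"forensic-build"
assert boot_accepts(official, sign(official))            # signed build loads
assert not boot_accepts(b"leaked-copy", b"\x00" * 32)    # unsigned build refused
```

The point being made in the comment is exactly this gate: an image that never gets a valid signature is inert on any normal device, no matter how widely it leaks.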

          • davidbowser
          • 4 years ago

          You seem to be focusing too much on Apple and not on the precedent.

What prevents another government (like the EVIL EMPIRE OF CANADA) from asking for the same thing for the prosecution of an American citizen? Once it exists, any government in any country that Apple operates in could demand this. In those cases, nobody really cares if they brick the phone, as long as they get the data. This is brute force, not clandestine operations.

          What prevents Apple from having to really fight that battle in another country is that Apple (and other US companies) have the backing of the US Govt to say NO to any foreign agency when it comes to this stuff. The REAL intelligence agencies (not the FBI) do NOT want any such decryption capability to exist in any public way, and the FBI has made this very public.

            • chuckula
            • 4 years ago

As I mentioned below, Apple has the right to refuse to do business in repressive countries. If they aren’t making big money from those countries, they are also under no obligation to comply with repressive requests from those countries.

The fact that Apple in real life has no problem doing business with just about every nasty country you can think of [as long as it pads the bottom line] while Tim Cook parades around acting like St. Snowden of Freedom is left as an exercise for the reader.

            • trackerben
            • 4 years ago

            Apple is one of my favorite computing suppliers, but you’re spot on about their Snowden-like “morally superior” posturing.

            • blastdoor
            • 4 years ago

            Agreed. It’s not about Apple.

            But Chuckie always is arguing as much with the voices in his head as he is with any person in the real world.

          • cobalt
          • 4 years ago

          Just to be clear: I’m not a fan of Apple and I don’t feel sorry for them. I am very worried that this is a dangerous precedent, though. (I don’t think you were accusing me personally of feeling sorry for them, just wanted to point out I have no dog in this fight except the precedent.)

          Also, I’m certain you’re right that they would take every feasible technical measure to keep this from being exploitable, but (a) I think they’re still putting information about how to do it in the wild and I’m wary of underestimating the creativity of malware authors, and (b) the firmware as described would still be exploitable as-is for the FBI and phone thieves alike.

          • SecretSquirrel
          • 4 years ago

          You need to read the ruling. It specifically states that the functionality Apple is ordered to create cannot in any way modify the OS image or data in flash. It is to run in RAM only. It is to bypass any mechanisms that the OS has to prevent such an application from being loaded or running. It is to disable two key security features: increasing delay between PIN attempts and auto-wipe after 10 failures.

The government has stated that it is OK for it to (must?) run on only this phone. The problem here is that, because of what they have to disable, that functionality must exist in the application being loaded. Therefore an application that can run on any phone can just as easily be created. It would not be a huge step to include the brute-force PIN cracker in the program. If Apple can do it, so can someone else. Once someone else can, all bets are off.
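To make concrete why disabling the escalating delays and the 10-try auto-wipe matters so much, here is a back-of-the-envelope sketch of exhaustive passcode search when only the hardware key-derivation delay remains. The ~80 ms per attempt is an assumption about the per-guess cost on the device, not a figure from the order.

```python
# Assumed per-guess cost of the on-device key derivation (the only delay
# the court order leaves in place). This number is an illustration.
PER_ATTEMPT_S = 0.08

def worst_case_hours(digits: int) -> float:
    """Hours to try every numeric passcode of the given length."""
    return (10 ** digits) * PER_ATTEMPT_S / 3600

for n in (4, 6, 8):
    print(f"{n}-digit passcode: {worst_case_hours(n):.1f} hours worst case")
```

Under these assumptions a 4-digit passcode falls in well under an hour, which is why the delay and wipe features exist in the first place: without them the passcode, not the encryption, is the weakest link.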

      • derFunkenstein
      • 4 years ago

      If a physical backdoor is just plugging in a lightning cable and running an app, that app will make it into the wild.

        • chuckula
        • 4 years ago

        At that point maybe we should ban the use of PCs since I can hack pretty much any PC with physical access, and that includes imaging encrypted drives for offline password cracking.

        Some of this stuff is getting silly. There’s a difference between mass surveillance of all communications and a highly specific search of a specific device.

          • derFunkenstein
          • 4 years ago

          Security is a binary state. If there’s a backdoor available—limited or otherwise—the device is insecure.

          I’m not going to dream that what’s on my phone is important to anybody other than me, but that doesn’t mean I support this request, either.

            • chuckula
            • 4 years ago

            As I mentioned below, THIS IS NOT A BACK DOOR.
The courts aren’t asking Apple to magic up some button that decrypts every iPhone on the face of the earth. The courts *are* asking Apple to remove the specific barriers to brute-force cracking this particular phone. Oh, and the courts are damn well within their powers to do so, it’s not unconstitutional, it’s not violating anybody’s rights, and it sure as hell isn’t some magical “backdoor” that gives the FBI remote access to everybody’s phone.

            • derFunkenstein
            • 4 years ago

            I know what they’re asking for, but I disagree that a method to disable a security feature (wipe the phone once the max attempts have been reached) isn’t a backdoor. It allows for unlimited brute-force attempts. Neither of us are lawyers, so we’re not inclined to change our minds as to whether the courts are “within their powers” based on the mere statement that we said so. I don’t think it’s a bad thing to have a public discussion, either.

            • blastdoor
            • 4 years ago

[quote<]Neither of us are lawyers[/quote<]

You touch on a crucial point. This isn't just about computer technology. The human side of this is even more complex.

If Apple ultimately loses this fight, I hope it at least happens with a new law being passed that sets clear bounds on what can and can't be done. This shouldn't be done under the authority of a 200 year old law.

            • Tirk
            • 4 years ago

            Scalia is rolling in his grave to hear you say that. Original-ism at its finest, or worst?

            Ah to adapt laws to the times, such a lost art.

            • willmore
            • 4 years ago

[quote<]Original-ism as a fetish[/quote<] FTFY

            • cphite
            • 4 years ago

            They’re asking Apple to develop a version of the OS that effectively disables the barriers to brute force cracking.

            Those barriers exist so that in the case where someone steals your phone, they cannot simply brute force their way past the encryption and get to your data.

            So say Apple complies with this request… once this version of the OS gets out into the wild – and it absolutely, unequivocally will – that means that someone who steals your phone can now use brute force to bypass the encryption; thus rendering encryption pointless.

            It also sets up a dangerous precedent where the government (and not just the US government) can demand that companies (not just Apple) provide this sort of access on demand. And every time that happens, that’s another piece of code that is just waiting (for not very long) to fall into the hands of criminals.

And bear in mind, they’re actually asking Apple to create – for free – something that doesn’t exist, and which will – by its very creation – harm Apple as a company. Pardon my French, but fuck that.

            So yes, this absolutely will end up violating the rights of everyone who desires reasonable privacy for their own data. It will end up worsening security for online activity – including perfectly legal activity.

            Terrorism sucks, no question; and it’s horrific when it happens. But in the grand scheme of things, you’re more likely to be killed by lightning than by terrorism. I for one am tired of seeing freedom after freedom stripped away under the pretext of “defending” us from something that is, frankly, less of a real threat than an average drive to the grocery store. I for one would rather take my chances with freedom, than live in a police state that doesn’t actually make me any safer anyway.

            • VincentHanna
            • 4 years ago

            If that is your definition of “security” then “security” is a theoretical construct.

            Nothing has security. Nothing ever will have security.
            If that’s the case, then there is no question here. That’s the problem with binaries. Once the binary is triggered, then there is no gradation, by definition. Seems like a silly definition to me.

            • derFunkenstein
            • 4 years ago

            I don’t see how that’s an excuse to make the situation worse.

            • VincentHanna
            • 4 years ago

            There is no such thing as worse. It’s a binary. By definition, it cannot get “worse.”

            • derFunkenstein
            • 4 years ago

            It’s a binary, but there is the separate issue of how much work has to be done to fix it.

            • VincentHanna
            • 4 years ago

            So it’s not a binary, now it’s exponential.

      • blastdoor
      • 4 years ago

      Right… because *obviously* once this Apple-signed firmware is created, it *obviously* won’t be placed onto a computer that has access to the Internet. And *obviously* every single person who comes in contact with it both at Apple and the government won’t leak it. Because *obviously* there is no incentive for anyone in organized crime or working for the Chinese military to do whatever it takes to gain access to such a thing, once it’s created.

      Obviously.

      And *obviously* it was totally fine for me to hand over all of my personal information to the FBI as part of a background check. *obviously* the OPM wouldn’t be hacked. Nothing to worry about there. We can trust the FBI.

        • chuckula
        • 4 years ago

        See, it’s d-baggery like this, especially when defending a multi-billion dollar company that gleefully spies on its own users to make a buck, that really puts people off.

        Using your line of “logic” we obviously shouldn’t ever put people in prison for murder because through some vaguely defined slippery slope everyone will end up in prison for life.

        Oh, and in my previous post I pointed out a few simple and completely effective ways to prevent the so-called slippery slope from even being a theoretical possibility, but you clearly don’t have the technical skill to actually address those points so manic Tim Cook parroting it is.

          • blastdoor
          • 4 years ago

          You made no specific technical suggestion — you just described the desirable features of an approach. That’s like saying you solved the problem of interstellar space travel by suggesting that somebody develop a space ship that can fly faster than the speed of light.

          If you’ve got a design for warp drive, let’s see it.

            • chuckula
            • 4 years ago

Yeah, so you regularly call Apple the second coming, including the A9X beating Skylake, IOS running the world, blah blah blah, but you think that making a stripped-down firmware is beyond the technical capability of what was until recently the world’s most valuable corporation.

            OK, you need step-by-step instructions for the “geniuses” at Apple?
            Here they are:
            1. Open Xcode.
            2. Remove practically everything so that this firmware is clearly incapable of being used as a “back door”.
            3. Compile in the simple features that allow for the simple brute-force attacks requested by the FBI. It isn’t even a real backdoor. Hell, your own damn software allows brute forcing via bugs that crop up, so this firmware is ACTUALLY MORE SECURE than active versions of IOS!

            This is beyond stupid, I think the terrorists were more willing to compromise than you are.

            • blastdoor
            • 4 years ago

            I’ve never said any of the things you just attributed to me. Even in this very thread, I haven’t said what you’re attributing to me regarding the development of a stripped down firmware being beyond Apple’s technical capability.

            I will repeat what I actually have said before — you don’t argue with the people in front of you. You argue with the voices in your head.

      • Pitabred
      • 4 years ago

      So you’re in favor of backdoors in your device that can be accessed by the government? And you’re sure they’d never use that in any kind of irresponsible way, nor would the secrets to it ever get out and be used by nefarious actors other than the government? Or not even just the US government… Apple is multinational corporation, subject to the laws of multiple jurisdictions.

      The price of freedom is a little bit of risk. I’m ok with that.

      FYI, what’s happening is that Apple has a security chip that has a set UUID, and mixes that with the password. If the security authorization fails too many times, it forgets that UUID and makes a new one, and poof, it’s all inaccessible. There’s not much of a side-channel other than using an electron microscope on the delidded security chip, which is still nearly impossible. It’s not like they can watch the dude put his password in or anything.
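A minimal sketch of the key "mixing" described above, assuming PBKDF2 as the mixing function; Apple's actual construction differs, and the UID here is just a random stand-in for the fused, unreadable hardware ID.

```python
import hashlib
import os

# Stand-in for the per-device secret fused into the security chip.
# On real hardware this value can't be read out, only used.
DEVICE_UID = os.urandom(32)

def derive_key(passcode: str, uid: bytes = DEVICE_UID) -> bytes:
    """Mix the passcode with the device secret (PBKDF2 as an illustration)."""
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), uid, 100_000)

key = derive_key("1234")
assert derive_key("1234") == key                    # same device, same key
assert derive_key("1234", os.urandom(32)) != key    # different "device", different key
```

This is why losing the UID (the "forgets that UUID" step) is equivalent to wiping the phone, and why guesses have to run on the device itself: without the UID, an attacker has nothing to brute-force against offline.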

        • chuckula
        • 4 years ago

        1. Did I ever say there should be a backdoor installed in all devices? Nope.

        Did you actually think critically about what I wrote instead of going OMG SNOWDEN!?!?! Nope.

        2. Apple is subject to the laws of various jurisdictions. If you think they are so entitled to take a moral stand that’s great: They should stop trying to make profits from jurisdictions that are repressive. They are under no obligation to cooperate with repressive regimes where they don’t do business. Apple, however, likes all the profits they get from those countries while pretending to be all high & mighty.

          • nanoflower
          • 4 years ago

Well, to be fair, their stance clearly isn’t based on a moral issue; that’s just for public consumption. From a business perspective this would hurt them: once they do this (assuming they can), there will be more requests from other law enforcement agencies, and aside from whatever ongoing costs are associated with it, the main issue is that it takes away an advertised advantage the iPhone has over competitors, since their security will no longer be unbreakable.

            • TheJack
            • 4 years ago

A conspiracy theory here: Apple will be declared the winner of this battle with the US government, and all will think, oh, Apple is really secure. And the NSA will be laughing their guts out.

          • Deanjo
          • 4 years ago

[quote<]2. Apple is subject to the laws of various jurisdictions. If you think they are so entitled to take a moral stand that's great: They should stop trying to make profits from jurisdictions that are repressive. They are under no obligation to cooperate with repressive regimes where they don't do business. Apple, however, likes all the profits they get from those countries while pretending to be all high & mighty.[/quote<]

Don't forget that in communication there is more than one party, who may reside outside the jurisdiction of the party seeking that information. If I, say, texted you, does that give my country the right to extract information from your phone?

            • chuckula
            • 4 years ago

[quote<]If I say texted you, does that give my country the right to extract information from your phone?[/quote<]

No country in history has ever had a "right". Countries have "powers". So the correct question is: Does some other country have the power to get something from your phone? I dunno, where do you live?

Once again, this conversation is about the rule of law vs. the rule of corporations. Love them or hate them, the FBI is a government agency that has a legal mandate to investigate crimes. The FBI is accountable to lots & lots of people. Apple is accountable only to its shareholders, and even that's dubious sometimes.

The U.S. Constitution has absolutely no prohibition on searching the contents of a specific phone after plenty of due process has been made to ensure that this isn't an unreasonable search or seizure. Apple has decided that it can make the law instead of the people of the U.S. If this were any other company, the socialists like Blastdoor would be wetting themselves in rage, but all of the sudden it's OK in this specific case.

The rest of this is abstract bullcrap about magical NSA backdoors and how somehow searching one particular electronic device is the same thing as spying on every single communication in the entire world. It's complete nonsense to cover for the fact that Apple feels it is above the law.

            • blastdoor
            • 4 years ago

            Ha ha ha… now I’m a socialist! Awesome.

            I agree that there should be a rule of law, not a rule of corporations.

            [quote<]The U.S. Constitution has absolutely no prohibition on searching the contents of a specific phone[/quote<] Yeah... in fact the US Constitution is pretty silent on phones entirely. I guess that means when it comes to phones, the government can do whatever it wants. Same goes for automobiles, computers, airplanes, lawn mowers, and indoor plumbing.

            If it is possible for the government to gain access to people's computers, phones, etc. without simultaneously making it easier for organized crime and foreign intelligence services to do the same thing, then I'm all for it. It's just not clear to me that it's possible.

            Also -- I totally concede that I don't have the technical qualifications to determine whether it's possible. But that's not because I'm stupid -- it's because this is a complex problem. If you think it's an easy problem and that you've figured out some easy, obvious answer... well, I don't agree.

            • VincentHanna
            • 4 years ago

            Feel The Bern!

            • Deanjo
            • 4 years ago

            [quote<]Once again, this conversation is about the rule of law vs. the rule of corporations[/quote<] That's where you are wrong. It is also about the rights of the individual. If it were not, then your constitution (and similar documents around the world delineating the rights of government vs. the individual) would be worth spit. [url<]https://www.eff.org/deeplinks/2016/02/eff-support-apple-encryption-battle[/url<]

            • chuckula
            • 4 years ago

            [quote<]That's where you are wrong. It is also about the rights of the individual.[/quote<] No, this has nothing to do with the rights of the individual whatsoever. This is a specific case against a specific dead terrorist where there is ample cause for a search warrant that has been duly granted by the courts after plenty of judicial review.

            Guess what: a federal judge has already reviewed the rights of the accused in detail and has made a finding, based on the laws of the U.S. and in keeping with the Constitution, that searching this phone is reasonable. You got a problem with that? Take it up with the judge. You know who sure as hell isn't a judge? Tim Cook. An unaccountable CEO of a multibillion-dollar corporation sure as hell doesn't make legal decisions.

            Try reading the U.S. Constitution sometime: the 4th Amendment outlaws unreasonable searches and seizures, and there's absolutely nothing "unreasonable" going on here. The rest of it is complete B.S. posturing by a multi-billion-dollar corporation that thinks it should make the law instead of the voters in the U.S.

            • Deanjo
            • 4 years ago

            It has EVERYTHING to do with the individual. It is the individual that is being investigated, and the individual that may be harmed. The 4th Amendment makes no provision for compelling an unwilling third party (which Apple is) to accommodate the search and seizure of property it does not own. The Fourth Amendment applies to the one being investigated.

            • chuckula
            • 4 years ago

            “the 4th amendment makes no provision as to warranting a unwilling third party (which apple is) to accommodate that search and seizure. ”

            Of course it doesn’t. The court, however, most certainly has the authority to make such an order.

            • Deanjo
            • 4 years ago

            The court has the authority to make any order. Whether it is morally right, or even actually legal, is subject to interpretation. That is why there is an appeal process and higher courts.

            • dragontamer5788
            • 4 years ago

            You do realize that the “individual” in this case is the San Bernardino shooters, right? They’re already dead; the FBI wants to know if they were in contact with ISIS and whatnot. [b<]We all know that the information is on their iPhone.[/b<] The FBI has a warrant signed by a third-party judge giving them the authority to investigate the San Bernardino shooters and their possessions. All Fourth Amendment issues have been settled as far as I can see.

            • Deanjo
            • 4 years ago

            Yes, I do realize who those individuals are, but the others they may have been communicating with may or may not have had anything to do with it, or known their intentions. One thing is certain, however: any person on that phone would then be on numerous watch lists and under investigation for the rest of their lives, regardless of whether they had anything to do with it. Hell, even an errant message or misdialed number will be put under scrutiny.

            If you misdialed that number, would you like to forever be on a watch list for possible association?

            • dragontamer5788
            • 4 years ago

            Wow, the conspiracy is strong within you. You don’t seem to have any idea how these investigations proceed.

            If mommy calls them and says “Come home for cookies”, such a message would not implicate that number or put people onto a watchlist. That’s the benefit of actually opening up the phone and looking at the messages directly.

            At the moment, the FBI likely only has metadata (ie: the called numbers that have gone to / from the phone). [b<]So they're already way past the point of your hypothetical case.[/b<]

            • Deanjo
            • 4 years ago

            It’s not “conspiracy” mentality at all; it’s reality. Until you can prove that absolutely nobody in all of history has been wrongly investigated, it is just real life. Is “Come home for cookies” a code phrase, or is it just what it says? To establish context, further intrusive investigation is required.

            Is the drunk who keeps misdialing my cell phone number looking for a cab an associate, or a genuine wrong number?

            I could go on and on. Not to mention that even when someone is ruled out, the findings are archived forever for further scrutiny by whoever wishes to look later on.

            • dragontamer5788
            • 4 years ago

            Then maybe you should do something about metadata, instead of complaining about this issue.

            Your arguments are irrelevant with respect to the data that the FBI will find on the phone. [b<]The FBI already knows the numbers that tried to call, or have come from, this phone.[/b<] Do you understand this concept? So stop making arguments assuming the FBI doesn't know that particular information. The FBI is looking for other bits of information as part of this investigation.

            • Deanjo
            • 4 years ago

            I don’t agree with collecting metadata on the innocent either; neither do privacy groups or anyone sane.

            Back to my drunk repeatedly calling me in the middle of the night thinking I was the cab company. If that guy went off the hinges and started shooting people, guess what: investigators would see my number being called multiple times by the same guy at roughly the same time every week. Guess who gets investigated for no reason other than having a number close to a taxi company’s?

            • VincentHanna
            • 4 years ago

            And this “investigation” hurts you how?

            • VincentHanna
            • 4 years ago

            You are right. It’s not a conspiracy, but it [b<]is[/b<] a paranoid fantasy.

            • VincentHanna
            • 4 years ago

            There is this new invention… it’s called caller ID. Perhaps you should look into it. The FBI has the complete call and SMS history going back at least 24 months, and was able to get it in about five minutes with a simple phone call.

            • VincentHanna
            • 4 years ago

            If it is about the individual, his rights HAVE BEEN upheld. Case closed.

            • VincentHanna
            • 4 years ago

            Nobody is making a 4th Amendment challenge here, so your point is invalid. The 4th Amendment allows the government to search a person’s possessions; it does not necessarily allow the government to compel safe manufacturers to make their safes however “crackable” the federal government thinks is an appropriate level of crack-ability.

            That is not a constitutional issue.

            • derFunkenstein
            • 4 years ago

            It sets a precedent. Next it’ll be used against a live terrorist. Shortly after, it’ll be used against someone the government merely thinks is a terrorist. Eventually, hell, why not just exploit it on everyone, because you never know. But no, that’d never happen. The NSA and FBI are far too trustworthy and upstanding.

            • VincentHanna
            • 4 years ago

            [quote<]Next it'll be used against a live terrorist.[/quote<] So? Nobody is arguing that it shouldn't be. [quote<]And shortly later, it'll be used against someone who the government thinks is a terrorist. [/quote<] As long as due process is being upheld... P.S. Isn't that the current situation? There is no conclusive evidence that these guys have any ties to ISIS, correct? Could have skipped the slippery slope fallacy and just started here. [quote<]Eventually, hell, why not just exploit it on everyone because you never know.[/quote<] Because you need physical possession of the phone? Because that [b<]IS[/b<] unconstitutional?

            • Pitabred
            • 4 years ago

            You don’t even know the Constitution.

            The US government can search that phone all it wants; nobody’s prohibiting anything. It can do whatever it likes with it. It just can’t compel Apple to help, or even to make it easier. The government is having a fit because it’s not easy to get in. In this specific instance (which you seem to keep referring to), it is effectively impossible to get the data from the phone. There is just no way to do it, period, end of story, full stop. There will never be any access to the data on this particular phone.

            So after being told this, the US government pursued this order, which would effectively compel Apple to make its devices less secure. The “bullcrap” is that there’s no way to search one particular device without creating a backdoor for ALL of them. That is the whole problem. Apple has complied with everything asked of it up until the point of having to make all of its devices inherently insecure.

            I am against the government setting a precedent of weakening security because it wants easier access. If that means a bit more risk to me personally, I’ll take it, because it lowers my risk across the board from all actors, including my own government as well as foreign governments.

            For someone who treats “socialism” like it’s a dirty word, you seem to sure be gung-ho about giving the government sweeping powers. Or is it only the military and police arm of the government that you’re a fan of having sweeping powers, in which case it seems like you’d be in favor of a militaristic dictatorship?

          • Pitabred
          • 4 years ago

          1. Yes, you did. “a specific physical hack on a phone that obviously can’t be done casually or on a mass scale” is called a backdoor. The point of a backdoor is to get in when you can’t get in through the front door, i.e., the normal user password. Don’t get upset at me because you don’t know what words mean.

          As for critical thinking, I’m reasonably certain that I have a better grasp of the situation here, given your ample mistakes in comprehension and explanation. I told you why your ignorant suggestion of using a side-channel attack was impossible. Saying “nope” like you actually have any kind of intellectual platform to stand on is… silly.

          2. That’s nice. They’re making the moral stand that they are not going to build in backdoors for anyone, anywhere. That doesn’t sound unreasonable to me. Because there’s no way to guarantee that only the good ‘ol US government is the only one with the key. If it exists, it can be exploited with enough resources.

            • chuckula
            • 4 years ago

            Actually, it’s you who is wrong, and it clearly shows you don’t understand what is going on here. At no point did the FBI ask Apple to magic up a way in other than via the normal password… the FBI is [i<]trying[/i<] to go through the front door using standard password-cracking techniques. Apple, however, has made this difficult by programming the security chip to nuke the cryptographic key after a few incorrect attempts.

            The U.S. court system most certainly has the legal authority to ask Apple to modify this particular phone to make going through the front door via normal brute-force methods easier. It's not a back door. It never was a back door. It's a specific modification to one phone that makes going through the FRONT DOOR easier, and it's 100% within Apple's power to do.
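            [Editor's note] The retry policy being argued over here can be sketched in a few lines. This is an illustrative model only, not Apple's actual firmware; the delay schedule and ten-attempt wipe threshold are assumptions based on publicly documented iOS passcode behavior:

```python
# Illustrative model of the iOS passcode-retry policy at issue.
# The real logic runs inside Apple's signed firmware; the delay
# schedule and wipe threshold below are assumed, not extracted.

ESCALATING_DELAYS = {5: 60, 6: 300, 7: 900, 8: 900, 9: 3600}  # seconds
WIPE_THRESHOLD = 10  # attempts, when the auto-erase option is enabled

def attempt_passcode(guess, correct, failures, auto_erase=True):
    """Try one passcode. Returns (unlocked, failures, delay_s, wiped)."""
    if guess == correct:
        return True, 0, 0, False          # success resets the counter
    failures += 1
    if auto_erase and failures >= WIPE_THRESHOLD:
        return False, failures, 0, True   # key material destroyed
    return False, failures, ESCALATING_DELAYS.get(failures, 0), False
```

            In these terms, the court order asks for firmware in which `auto_erase` is effectively forced off and the delay table is emptied, leaving only the hardware's intrinsic per-attempt cost between guesses.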

            • Pitabred
            • 4 years ago

            The thing you’re seemingly too dense to understand is that it is impossible. They may as well ask a goose to lay a golden egg. They’ll have just as much success. Your assertion that it’s 100% within Apple’s power is based on your technological ignorance and incompetence, and I’m not going to go back over exactly why when I did so previously. Go look up how Apple’s security processor works (here, I’ll do it for you: [url<]http://blog.cryptographyengineering.com/2014/10/why-cant-apple-decrypt-your-iphone.html[/url<] )

            • chuckula
            • 4 years ago

            Idiot.
            Did the FBI ask Apple to decrypt the phone? No.

            Did I ever say that Apple could magically decrypt the phone? No.

            Did I make a technically accurate statement about the operation of Apple’s security chip and how it can be modified to let the FBI perform a brute force process? Yes.

            If you had actually read the PDF that you linked would you have realized that everything I said was right? Yes.

            • Beahmont
            • 4 years ago

            The A6 does not keep the trial count and auto-erase count in hardware; they’re in firmware. Apple already possesses firmware for an A6 that allows infinite tries to guess the password: it was called iOS 6. Firmware like iOS 6’s can do everything the Feds are requesting. That the Feds are also requesting irrelevant things is beside the point.

            The password is secured in hardware in the A6, and the password hash must be calculated in hardware on the A6. The A6’s password-hashing hardware has a minimum physical time of approximately 5 seconds between consecutive tries. The judge’s order merely removes the software delays.

            The judge’s order is absolutely irrelevant for A7 processors and beyond, as the trial count and auto-erase process are handled in hardware.

            This is by definition not a back door. This is removing the mine field in front of the front door.
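            [Editor's note] Under the figures cited above, the worst-case brute-force time is simple arithmetic. A back-of-the-envelope sketch; the per-attempt costs are assumptions (the ~5 s hardware delay this comment cites, and the ~80 ms key-derivation time Apple has cited elsewhere):

```python
def worst_case_hours(digits, seconds_per_attempt):
    """Worst-case time to exhaust every numeric passcode of a given length."""
    return (10 ** digits) * seconds_per_attempt / 3600

# 4-digit passcode at ~5 s/attempt   -> roughly 14 hours
# 4-digit passcode at ~80 ms/attempt -> roughly 13 minutes
# 6-digit passcode at ~80 ms/attempt -> roughly 22 hours
```

            Either way, once the software delays and the auto-erase are out of the picture, a short numeric passcode falls quickly, which is exactly why the order is valuable to the FBI.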

            • VincentHanna
            • 4 years ago

            Good. The feds should flash iOS6. Wonder why they are wasting the court’s time with this…

            • trackerben
            • 4 years ago

            Tim Cooked up a story about broad backdoors that *might* propagate, when all Apple was being asked to do was apply a strictly bounded configuration to enable more timely “front door” access, at its own premises and under its full control, which Apple could then destroy at its discretion after the suspect’s files had been extracted.

            [url<]http://www.bbc.com/news/technology-35601035[/url<] [i<]...Control the process, but not know how it's done. This is an interesting line, as it suggests the FBI is willing to allow Apple to work on the phone at its own HQ, and in a way that doesn't risk the encryption software being released into the world.[/i<] [url<]http://blog.trailofbits.com/2016/02/17/apple-can-comply-with-the-fbi-court-order/[/url<]

      • Tirk
      • 4 years ago

      I can’t believe it, but I agree with Chuckula 100%.

      Tim Cook is merely posturing for the fans. It’s hilarious that Apple shouts privacy for personal data concerns while happily collecting PERSONAL DATA from its user base. So everyone is fine with private companies hoarding your personal data and selling it for profit, but the government shouldn’t even be able to access one specific phone in a criminal investigation? Excuse me for not wearing my tin foil hat.

        • blastdoor
        • 4 years ago

        I think Cook is taking on far too big a risk for this to be mere posturing for the fans. He is refusing to cooperate with an investigation into a terrorist attack on the US. If you can’t see how incredibly risky that is for Apple, then you haven’t been paying attention to world events since September of 2001.

        It might turn out that Cook has arrived at the wrong conclusion regarding what is best for Apple, its customers, and society at large. But I don’t think this is a marketing ploy. I think he believes what he’s saying.
