Cynthia Dill, a Maine state senator, opined in the Portland Press Herald that:

Instead of – or in addition to – ordering Apple to build a technological back door to its phones, why aren’t we ordering weapons makers to put technology in their products that can be used by law enforcement to protect us from gun violence?

Let me see if I can shorten that sentence for State Senator Dill:

If we can get our hands on the technology to violate Apple users’ 4th Amendment rights, why don’t we go all the way and develop the technology to strip people of their 2nd Amendment rights as well?

Wait, I can do better:

If we can put our left boot on your throats, we can put our right boot on your throats too.

I think that just about sums it up.  I fall squarely in the camp that believes that Apple should, in no uncertain terms, deny the FBI’s request to build a law enforcement backdoor into Apple’s encryption.  In Katz v. United States (1967), the Supreme Court extended 4th Amendment protections to all areas in which a person has a “reasonable expectation of privacy.”  The opinion in Katz established a two-part test for what constitutes a reasonable vs. unreasonable search.

  1. Does a person have a subjective expectation of privacy at the time of the search?
  2. Is that subjective expectation reasonable – i.e., would society as a whole recognize that moment as being private?

The Katz example is a public phone vs. a phone booth.  If you are on a public phone and can be overheard, you don’t have a reasonable expectation of privacy.  If you are in a phone booth that you can shut the door to, you do have a reasonable expectation of privacy, since you took steps to avoid being overheard.  Since companies like Apple advertise the security of their phones and data encryption systems as features, and people and companies often select which phones and carriers to use because of their data encryption and security features, it would be quite reasonable to assume that there is an expectation of privacy in a locked, encrypted phone.

This is not about hacking the San Bernardino shooter’s phone.  This is about giving the Federal Government the ability to hack everybody’s phone, while the Federal Government tells us to “trust them,” that they won’t abuse that ability.  Apple, in the past, has helped law enforcement hack individual computers/phones/etc.  A backdoor into the system for law enforcement is a means of lowering the bar of difficulty for law enforcement to access data.  Perhaps lowering the bar enough that the Fed could claim that, with this technology, all encrypted data is “in plain view.”  Call me a paranoid cynic, but somehow I get the impression that if the Fed had the ability to backdoor their way through Apple’s encryption, by next week they’d have a Bluetooth wand that they could wave over your phone and download all of the info off of it with one swipe.  It’s not like the NSA spent six years listening to people’s calls or the IRS was used to harass groups based on their political leanings or anything like that.

So, an attempt by the government to obtain technology to enhance its ability to violate the Constitution has given State Senator Dill justification for wanting MORE technology capable of violating the Constitution.

Of course, she retreats to her corner of “it’s for the children”:

“Why is it a 6-year-old child can pick up an iPhone and be prevented from accessing its contents because of a passcode, but that same child can pick up a gun and shoot his 3-year-old brother in the face and kill him by accident?”

Her justification is a bald-faced lie.  She wants the ability to render people’s legally obtained guns inoperable at a distance.  Rendering them inoperable is effectively no different than confiscation.  A gun that doesn’t work is as useless as no gun at all.  The aftermath of Hurricane Katrina established that mass gun confiscation was a 2nd Amendment violation.  So a remote deactivation of guns, whether it happens to one person or many, would likewise be the same violation.

The two phrases I don’t see are “warrant” and “court order,” suggesting that she feels the government should have the power to deactivate anybody’s guns at any time.

“Hey, there is a riot going on downtown, let’s just shut off all the civilian guns in the city… wait, how about the county… better yet, the whole state.  Who cares if some people are defending their businesses, homes, and lives with their guns?  It’s for the children.”

The demand for smart guns that the government could remotely disable is a constitutional double whammy.  When your gun is deactivated without a warrant or court order, your 4th Amendment rights are violated, and because it was your gun that was rendered useless, your 2nd Amendment rights are violated as well.  But considering just how much alacrity she has for violating one Amendment, I’m sure violating another simultaneously is just frosting on the cake.

I mean if you are going to advocate for totalitarian behavior, go big or go home, right?

All this op-ed proves to me is that 1) there is no limit to the restrictions some people want to place on our rights, and 2) I will never own a smart gun as long as I live.


By J. Kb

15 thoughts on “Statists gonna state”
  1. It’s actually not this cut and dry though. At least in regards to Apple and this phone. This wasn’t private property but belonged to the county the shooter worked for. That makes it public property, not private. As for the technical aspects, that’s where it gets even murkier.

  2. Ms. Dill is likely still feeling pissy over the passage of Constitutional carry in Maine last year. It never ceases to amaze me that these types of folks can say such things with a straight face. And then be surprised when people don’t take them seriously.

  3. It’s not about the shooter’s phone. It’s about your phone and my phone and the government being able to bypass the encryption at will and possibly without a warrant.

  4. As FORMYNDER has said, it’s not as cut and dry. He’s already touched on the fact that this was not the shooter’s phone, but his employer’s phone, and that employer has consented. A couple more pertinent aspects on this particular phone issue came via The Volokh Conspiracy legal blog. These pieces make good points about Apple’s position and their behavior in other areas related to ownership/security/searches:

    https://www.washingtonpost.com/news/volokh-conspiracy/wp/2016/02/20/has-apple-made-iphones-illegal-in-the-financial-industry/

    https://www.washingtonpost.com/news/volokh-conspiracy/wp/2016/02/20/or-is-apple-happy-to-build-a-backdoor-as-long-as-it-makes-money-from-it/

  5. Let me make my position crystal clear since you seem to be confused.

    Apple should crack Farooq’s phone. It is a single device owned by the city, which consented. Even if the city didn’t consent, if the judge issued a warrant, that’s good enough.

    I am adamantly against a LEO backdoor. That is a whole different issue and an FBI overstep.

    The government shouldn’t have the ability to bypass private encryption for millions of people just because of one terrorist. I just don’t trust them not to intercept and read every encrypted txt and email sent.

    Farooq’s phone and the FBI backdoor are two totally separate issues.

    “We need to search Farooq’s house.”

    Ok.

    “We also need a front door key for every house in San Bernardino.”

    Um… no.

  6. Thanks for clarifying. I wasn’t confused; you hadn’t stated that you thought Apple should crack *this* phone. From what I’ve read, the FBI is not asking for Apple to create a tool to be given to the FBI, for the FBI (or any law enforcement/government entity) to possess and use at will, which is what everyone is calling “a law enforcement back door.”

    I do agree with Apple that once a tool is created, it’s likely that it will get out. However, (again, from what I’ve read) the proposed tool in question is not a key to the back door, but a tool that would turn off the auto-erase to allow the FBI to continue trying tens of thousands of keys without the phone wiping itself once a given number of wrong keys are tried. Semantically, this could be seen as creating a “back door,” I concede.

  7. Given the length and number of characters a password can be, giving the government an unlimited number of hack attempts to break it is a backdoor. I’ve had software engineers tell me that a decent computer could figure out a password in minutes, if not seconds, with unlimited attempts. Maybe a universal key is a bad metaphor; it’s more like a lock pick kit. Perhaps a better metaphor is: if the government can slim-jim its way into your home, you had no expectation of privacy.

  8. That is probably true for a system that does not require fingers to press numbers or force a wait after a certain number of incorrect tries. However, Apple’s security guide says, “It would take more than 5 ½ years to try all combinations of a six-character alphanumeric passcode with lowercase letters and numbers.”

    From the link below:

    “After five wrong guesses, you’re forced to wait a minute. After nine wrong guesses, you have to wait an hour.”

    “When you enter a passcode into your iPhone, the processor has to make a calculation to check if your code is correct. But Apple has made the math so complicated that it takes about 80 milliseconds — roughly 1/12 of a second — for the phone to crunch the numbers.”

    “Even with Apple’s assistance in bypassing the lockouts — even if you can instantly input different passcodes without penalty — Apple is saying that it would still take the phone about 1/12 of a second to process each attempt.”

    “If the shooters picked a six-letter passcode that only uses numbers or lowercase letters, there are over 2.1 billion possibilities. At about 12 tries a second, that’s about five and a half years to go through them all”

    https://www.washingtonpost.com/news/wonk/wp/2016/02/17/how-long-it-takes-to-crack-an-iphone/?wpisrc=nl_draw2

    I’m not trying to quibble here. They could get it right on the first try or within the first 100 tries, but it could take a lot longer. (The worst-case arithmetic is sketched after the comments.)

  9. Missing the main point of JKB’s article: they want to destroy the Constitution – one amendment at a time.

  10. I think people are missing that the FBI would not do the right thing with the information anyway, given their track record of bungling things. One more thing: another reason they may want the information off that phone is to make sure it can be destroyed or controlled for higher reasons and cover-ups.

  11. FYI: The county, by its own regulations/rules, should have installed “backdoor” software, something it actually did not do. So the bureaucracy fell asleep at the wheel and now requires an exemption?

  12. It’s not about what’s on the phone; it’s a work phone belonging to the county, and it’s hard to imagine what useful information he would have on it rather than on his personal phone. Further, at this point, months later, any information is largely useless.

    This is about setting a precedent. And they’ve found a case with good facts to seek that precedent – after all, he killed all those people; who could object?

    The precedent isn’t even fully set, and it’s already being applied elsewhere. NYTimes, 2/23/16, “Apple Faces U.S. Demand to Unlock 9 More iPhones.”

    It’s not about the information on the phone, folks. It’s about backdooring encryption. And make no mistake about it, once they have that backdoor, they will use it indiscriminately. And all the while they will claim it is for your own good.

  13. “Why is it a 6-year-old child can pick up an iPhone and be prevented from accessing its contents because of a passcode, but that same child can pick up a gun and shoot his 3-year-old brother in the face and kill him by accident?”

    You mean the passcode that you think Apple should be forced to break?

    Also, she probably doesn’t get the subtle yet important difference that locking an iPhone with a passcode is optional. Nobody is forcing you to lock your phone, unlike “smart guns.”

  14. Has Ms. Dill, Legisweasel (insert apology to weasels here), not heard of, oh heck, what do they call those things… Oh, right! GUN SAFES? Or, heck, keeping the non-gun-safe firearm on your person?

    Worked very successfully for The Plaintiff and me when our kids were young/before they left the nest.
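
A minimal sketch of the passcode arithmetic discussed in comments 7 and 8, assuming (as the quoted Washington Post piece does) a six-character passcode drawn only from lowercase letters and digits and roughly 80 milliseconds of processing per guess; the figures are illustrative, not measurements of any particular device:

```python
# Rough sanity check of the brute-force arithmetic quoted in the comments above.
# Assumptions (not measured here): a 6-character passcode drawn from lowercase
# letters and digits, and ~80 ms of key-derivation work per guess.

charset_size = 26 + 10                 # lowercase letters + digits = 36
length = 6
combinations = charset_size ** length  # 36**6 = 2,176,782,336 possible passcodes

seconds_per_guess = 0.080              # ~80 ms per attempt, i.e. ~12.5 guesses/second
worst_case_seconds = combinations * seconds_per_guess
worst_case_years = worst_case_seconds / (60 * 60 * 24 * 365.25)

print(f"{combinations:,} possible passcodes")
print(f"worst case: {worst_case_years:.1f} years at ~12.5 guesses per second")
```

The worst case works out to roughly five and a half years, matching the figure quoted above; on average, a brute-force search would hit the correct passcode in about half that time.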

Comments are closed.