PRIVACY LAWS SHOULD NOT APPLY TO SAN BERNARDINO KILLERS
By Matthew Patton
By refusing to comply with a court-issued order to assist the FBI, Apple has catapulted themselves (and the general public, by extension) onto a very slippery slope.
We have to ask ourselves: at what point does the right to privacy exceed or supersede national security concerns? And given how the government has flown the star-spangled banner of “national security” to push sketchy agendas in the past (the Japanese-American internment camps of World War II, for example), I am not suggesting that tech companies write a blank check for the government to sign.
This is different. This is an attempt to get into the phone of someone who, along with his wife, killed 14 people last December in San Bernardino, California. The government has made it clear that they want access to that phone, and that phone only.
That phone may (or may not) contain co-conspirators' contact information and details of how the plans were laid out. Officials say that information could lead to the prevention of further crimes.
However, Apple CEO Tim Cook is adamant that the privacy of all customers who huddle under the Apple umbrella should be protected. His letter to customers (on the Apple website) explains why.
It smacks of hypocrisy and self-degradation that couldn’t be more disingenuous.
For instance, in the Feb. 16 letter he states that all "information needs to be protected from hackers and criminals who want to access it, steal it, and use it without our knowledge or permission."
OK, on the surface this is a noble goal. But it raises the question: why is Apple gathering all that information to begin with? Why is it okay for them to hoard this information—and use it at their discretion to customize ads and applications according to a consumer's personal info—but not okay for the government to have that same access? If handing over that information contributed to their profits, would this even be a conversation?
Another of his claims is incredibly misleading. He argues that creating a "key" for one iPhone would, in effect, be creating a key for every iPhone, and that something like that in the wrong hands would be disastrous—that, essentially, the government is asking Apple to hack its own systems.
Woe is me
Technology is constantly changing, ever in flux. Today’s impregnable security algorithm is tomorrow’s wet paper bag. Designers, programmers and engineers are constantly pushing the envelope, pushing the limits and pushing themselves to see what they’ll come up with. Every product is analyzed to death, and in some successful cases, reverse-engineered to improve upon the original concept.
For Tim Cook to act like this would be a huge burden is insincere. It wasn't that long ago that iPhone users were all in a tizzy because their iCloud accounts were hacked in August 2014, resulting in a slew of unsavory celebrity photos being strewn across the Web. One could say that event was the impetus for Apple to reinforce its security measures to the degree that it has. And that's fine.
But here’s the thing: for every improvement that companies like Apple make, hackers and the like will continue to make similar strides just to prove that they can beat the system. It’s a never-ending game, an endless waltz.
It's only a matter of time before someone figures out how to beat this latest version of Apple's security, and then Apple's team of programmers and engineers will be back to the grind to create a better product.
Tim Cook and Apple miss the mark with their reasons for refusing to adhere to a court order. They are potentially shielding activities that need to be brought to light. There’s no getting around that.