
OPINIONS

The heart of darkness: The encryption rift between tech companies and law enforcement

Last Friday at Stanford, President Obama declared the cyberworld a modern-day “Wild, Wild West,” a place where hackers are desperadoes and laptops are their weapons of choice. Cybercrime is on the rise and, although tech companies have taken serious steps to quell this threat when it affects them directly, they have been less enthusiastic when it doesn’t. Intel’s security division has estimated the annual global cost from cybercrime at more than $400 billion. This puts tech and other companies on the defensive against hackers. Meanwhile, law enforcement plays offense by investigating and prosecuting cybercrime. But they are cautious allies, not teammates, and they don’t always follow the same playbook.

Technology is developing quickly, and more secure ways of protecting our private data are being released every day. But these protections can put information beyond the reach of law enforcement, causing it to “go dark.” Although technology companies have served critical roles in assisting the efforts of law enforcement (often designing their products with built-in channels that allow law enforcement to access crucial data), this is no longer the norm.

Apple CEO Tim Cook recently talked about developing systems (such as iOS 8) that “don’t store the details of [your] transactions,” making data inaccessible even to the companies that make the devices. This means the data on these devices, some of which may be or may become critical to law enforcement investigations, are irretrievable.

Apple washes its hands of how its devices are used, advertising on its information page that it cannot respond to government warrants for data extraction from iOS 8 devices. Good for Apple; not necessarily for us. Now, companies reply the way Tim Cook did, saying, “If law enforcement wants something, they should go to the user and get it. It’s not for me to do that.” But Cook’s advice is empty: once you go to the user, covert investigation becomes impossible. Criminals, especially the most dangerous and sophisticated ones, are often the relevant users and are generally disinclined to cooperate.

It’s not surprising that tech companies from Apple to WhatsApp have raced to see who can develop the best data security. The publicity surrounding WikiLeaks and Edward Snowden’s whistleblowing, among other incidents, has fueled public mistrust of government surveillance, which has extended to law enforcement agencies. Consumers want more robust privacy protections, and tech companies have responded. In some cases this initiative may spring from genuine concern for user privacy, but much of it seems to be driven by the desire to grab market share from competitors.

Free competition is the cornerstone of our economy, but we rightly regulate socially costly practices. The government must address negative externalities, since industries have little incentive to do so. Regulations can reduce a company’s profits and competitive edge, but they also protect public goods like the environment and national security. It is not unreasonable to impose limits on a company’s ability to create systems so secure that they put our safety at risk by preventing law enforcement from ever accessing those protected data.

Criminals gravitate toward secrecy; thus, encrypted systems have become a salt lick for those who mean us harm. Robert Hannigan, the Director of GCHQ (Britain’s intelligence and security organization), has described US tech companies as “the command-and-control networks of choice for terrorists and criminals … [even if they] are in denial about [this].” His statements are not unfounded. Criminal organizations from ISIS to Silk Road have increasingly relied on encrypted technology systems to propagate their messages, capitalizing on the impermeable protection afforded by this extreme privacy.

We have dealt with a similar problem before. In 1994, Congress passed the Communications Assistance for Law Enforcement Act (CALEA), requiring that all digital telephone networks be wiretap-enabled. This proved to be profoundly helpful, even critical, to law enforcement investigations of some of the most serious crimes and criminal organizations in the past twenty years. However, this law is now outdated; we need new regulations to govern new technology.

Twenty years ago, the U.S. Congress set a privacy standard that protected innocent citizens while allowing law enforcement officials to do their job, as long as they obeyed the law and obtained warrants for their investigations. The Fourth Amendment to the U.S. Constitution protects the privacy of citizens, but it has never completely shielded them from law enforcement. Even our homes, long considered to be our most private domain, can be searched if a judge grants a warrant. Saying we want to protect the content of our smartphones above that of our homes may demonstrate an important shift in privacy values. But has the will of the people really changed? Or have people not yet understood the link between inaccessible data and the flourishing of criminal organizations?

People are rightly concerned about the data that technology makes it possible to collect and preserve. In some situations, backdoors intended for law enforcement use may become a source of vulnerability that allows criminals to access data, while “going dark” can protect information from criminals and law enforcement officials alike. However, we cannot rely on tech companies to make these tradeoffs appropriately, since they may have little understanding of national security concerns and little incentive to interfere with criminal organizations that don’t threaten them or their consumers. Likewise, policymakers may not fully understand the technological constraints involved in safeguarding data from criminals while making it available to law enforcement. That’s why communication and cooperation between government officials, security experts, and tech companies is essential to managing these complex situations.

The United States tech industry has made great strides in protecting our private data, and there have been instances of cooperation between tech companies and law enforcement. But a dangerous rift is forming between what technology enables and what law enforcement needs. Thus, we must reconcile our protective instincts towards our personal data with our protective instincts towards ourselves and our country. Law enforcement and tech companies like Apple need to work together. Otherwise, we may trade one insecurity for another.

Contact Claire Zabel at czabel@stanford.edu and Joseph (Joey) Zabel at joezabel@stanford.edu.

  • Stanford Alum

    “Criminals gravitate toward secrecy, thus encrypted systems have become a salt lick for those who mean us harm”

    We have to stop this innuendo that only criminals have something to hide. That’s the mindset of a police state. There are many reasons why I might not want my data to be technically accessible by the government beyond being in violation of criminal laws. The most obvious is holding unpopular political views. Once stored, they are accessible to future political actors; even if current ones are sympathetic to my views, future ones might be less friendly. Brendan Eich’s demotion as CEO of Mozilla should be a wake-up call to those who believe that non-criminals should open all their personal secrets to the view of government.
    So kudos to the high-tech industry for standing up to the bullies who run the US government.

  • The Survival Wire

    As our government no longer represents our will, literally wants to tax us to death, and can arrest and/or imprison us for an unknowable number of victimless infractions, privacy becomes more paramount every day. Screw the criminals; I am much more concerned with the government: the IRS attacking me for my politics, the EPA deciding my backyard is a wetland, the police confiscating my cash just because they can, or the police landing in my house after making a mistake on the address of a no-knock raid.
    Claire & Joey, you are a bunch of idiots. Please go and try to fix ISIS.

  • Guest

    Well done highlighting a very real issue in US gov. These nut-job anarcho-libertarian comments are laughable.

  • Jimperialist Pig

    I’m not clear how anything the government did caused the private sector media campaign leading to Eich’s resignation. This does articulate a valid point, though: the privacy invasions we should be most worried about aren’t coming from the guys who need a warrant to hold anything they find against you; they’re coming from private entities.

  • Student

    That’s not accurate though. The US government has a history of harassing journalists that are critical of its programs and spying on activists.

  • Eunice

    The fundamental question raised in this article (how do you balance security and privacy while respecting both?) is critical to a just and safe society. The authors bring up both sides fairly and highlight an important issue. We should not let loose anything that can’t be reasonably controlled or coordinated, whether it’s technology (through appropriate legal limitations) or a government that violates its limits (through the ballot box). While a history of secret government spying exists, and is a cause for suspicion, none of that spying went through the correct legal channels: the courts. To say the government no longer represents anyone’s will is just a failure to engage with a complex issue. And by the way, if you needed the police, I’m sure you too would expect them to come help you, your political views aside, and not say “oh well, screw the criminals” and let the citizens fend for themselves. It seems like you have a bit of a contradiction in your views when you oppose measures that help foil the secret communications of terrorists but then say “go try and fix ISIS.” Calling people “idiots” is not an argument but a failure to argue the merits.

  • Stanford Alum

    What the authors, and people who have made similar arguments in the past, fail to grasp about the whole “privacy vs. security balance” narrative is that, with encryption, it is a false narrative.

    The computer security community does not need the government’s permission to develop spying-proof crypto systems. Case in point: TrueCrypt. Until its developers decided to shut down the project last year (rumor has it, pushed by the NSA or a similar agency), it was the most widely used on-the-fly disk encryption system. So secure that the FBI, with the methods it can use in the open, is unable to break it. The project was shut down, and a new fork was started by other developers, hosted in countries like Switzerland that are friendlier to user privacy.
    When it comes to encryption of personal data, the question is not one of “balance”; encryption will happen whether the government likes it or not. The question the government should be addressing is not how to force companies to create backdoors into crypto systems (which it cannot do as a practical matter) but how it can develop new investigative methods to prosecute criminals without assuming that the old rules of government accessing people’s personal info on demand still apply.