Note: this is post #4 of a multi-part series.
Thus far, we have seen how proposals to reform “Section 230 Immunity” or to regulate the Internet advertising market are either contradictory or likely to backfire (see here re: § 230 and here re: ads). What about the problem of black hat hackers and data breaches? Why can’t we use the law to protect, or at least enhance, user privacy and the security of user data against external hacking threats? Alas, here is another area where our lofty legal rhetoric and growing calls for regulation fail to match up to our technological reality. If anything, as I shall explain below, it is the government that represents the greatest threat to our data privacy!
As Lemley notes in Part 1 of his paper (p. 318), “we want to stop [black hat] hacks and data breaches.” By way of example, in July of 2019 Facebook agreed to pay a $5 billion fine to the Federal Trade Commission for allowing private user data to be exposed (see here, for instance), and numerous states have enacted laws imposing legal liability on companies with insecure data- and information-protection systems. (On this note, see the sources in footnote 63 of Lemley’s paper.) But here’s the rub: when tech companies respond to our demand for privacy by providing strong end-to-end encryption platforms like WhatsApp or secure devices like the Apple iPhone, it is the government — specifically, law enforcement agencies — that ends up objecting to and thwarting these technological privacy protections!
Once again, Professor Lemley is worth quoting in full (p. 319, footnotes omitted, emphasis added): “Law enforcement, it turns out, wants to make sure it has a back door into our phones and our text messages, and if there isn’t one it has even tried to force tech companies to build it. This is a battle that has been continuing for a quarter century, since the government tried to build a backdoor into digital phone technology in the 1990s. The issue then was secret communications supporting terrorism, while more recently it tends to be child sexual abuse or, even more recently, white supremacy. But the claim is the same: People will use encryption to hide the bad things they are doing, so law enforcement must have the power to break encryption.”
I will sum up by restating the irony of this perverse state of affairs: (a) hackers are a threat, and people want to protect their communications and their devices; (b) private tech companies build end-to-end encrypted systems to thwart hackers and meet this legitimate demand for privacy; (c) government agencies, however, insist on a “back door” to preserve their snooping powers, and that back door makes it easier for hackers to access our data. But we can’t have it both ways. As Professor Lemley correctly points out toward the end of Part 1 of his paper, when we use law to require encrypted platforms and devices to have “backdoors” for law enforcement agencies, we are also inevitably making those same systems more vulnerable to black hat hackers!
I will proceed to Part 2 of Lemley’s two-part paper next week, beginning on Monday, Oct. 4.