Supreme Court rulings show how technology and the law evolve
I would like to step back in this article and take in a bigger picture than usual, moving across time as well as into the legal sphere. Technology isn’t just hardware and software; it’s a fundamental component of every aspect of our lives and our society.
When U.S. Attorney General William Barr recently admitted that adding backdoors for law enforcement makes encryption less effective, it was, surprisingly, news. For a long time the U.S. government had publicly denied this, insisting that state surveillance and access could coexist happily with top-level security. Now, as renowned security expert Bruce Schneier wrote, we can have a real debate about this trade-off:
“With this change, we can finally have a sensible policy conversation. Yes, adding a backdoor increases our collective security because it allows law enforcement to eavesdrop on the bad guys. But adding that backdoor also decreases our collective security because the bad guys can eavesdrop on everyone. This is exactly the policy debate we should be having – not the fake one about whether or not we can have both security and surveillance.”
Whether you are for unbreakable encryption in public hands or not, in these matters it is essential to admit that we cannot have everything or make everyone happy all the time.
As our technology evolves, our laws must adapt to keep up. Entirely new laws, even entirely new regulatory branches, are required to address new inventions as they become commonplace. Millions of automobiles could not be regulated the same way as the horse-drawn carriage. Pharmaceuticals, airplanes, banks, telephones, food production, and weapons are regulated for the public interest – some would say too much and others too little.
This push and pull of private and public interest tends to create a functional equilibrium, although reaching it can take decades and the process never truly ends. Occasionally there are severe corrections, such as when the U.S. government broke up Rockefeller’s Standard Oil in 1911. Regulation often follows a crisis that captures public attention, which isn’t a bad way to go about it, although it would be nice to get ahead of such crises and avoid them. On the other hand, pre-emptive regulation can lead to overregulation, which can hamper innovation and investment.
Inventions aren’t created with social impact in mind, let alone legal repercussions. Profit is the usual driver, and the long-term effects are ignored and unknowable. A successful drug may turn out to have side-effects that go undetected for years and then be banned. Facebook and YouTube may contribute to political radicalization, but it’s hardly what the social network or video-sharing service were created for. It’s up to researchers, journalists, law enforcement, and eventually, politicians, to look into these effects and whether they are significant enough to spur regulatory action.
While our laws adapt constantly, what about our rights? Do they evolve along with new inventions? Isn’t the definition of a right something that is intrinsic, unchanging, eternal? Well, not exactly, for better and worse. Foundational documents in the annals of citizen rights, such as the English Magna Carta in 1215 and the American Bill of Rights in 1791, limit the power of government – kings and presidents and parliaments – over the people. That is, it’s not what the government has to do for you, but what the government cannot do to you.
For example, the Fourth Amendment to the U.S. Constitution protects a citizen against “searches and seizures” by the government without probable cause – usually requiring a warrant issued by a judge. (As every TV crime show watcher has learned, even if evidence of a crime is found, it might be discarded in court if it is obtained without such a warrant.)
But does the Fourth Amendment’s right to security in your “person, houses, papers, and effects” apply to your phone calls? To your web-surfing data, your purchasing preferences, or even your fingerprints and genomic code? Are these things also your property? Or do they belong to the owner of the server that stores them, or to the company whose algorithms extract them and process them?
Owning digital data is already complex. If you read the fine print, you might find that you don’t truly own the software, apps, digital music, or e-books you bought and paid for, at least not in the way that you own a paper book. You instead bought a license to download it, or display it, a license that can be revoked by the publisher or distributor. A song or book you enjoy today might disappear from your devices tomorrow with a Thanos-like snap of Universal’s or Amazon’s corporate fingers.
You might hope that data you create would be different, but often it’s worse. An app you install to apply amusing filters to your photos might claim to own all rights to those creations, and even to use your photos in advertising for the company’s profit (not yours, of course). When people complained about their images being used without permission, companies simply added language to their apps’ terms of service granting that permission, and, since nobody reads those endless pages of legalese before clicking “OK,” users often have little defense in court.
These may seem like gray areas to people like me who didn’t grow up with online photo sharing and social media (or digital cameras, or the internet…) but to this generation, it’s obvious that your data should remain yours unless you sell it or otherwise give express consent. My tween daughter knows her data is sent and stored all over the world, but assumes it’s still hers in every meaningful way, or should be. This is a healthy view – the issue is making a legal framework that supports it.
In 1928, the U.S. Supreme Court ruled on a landmark case, Olmstead v. U.S., and showed how the law can fall behind technology on rights. Writing for a 5-4 majority, Chief Justice (and former president) William Taft argued that wiretapping the defendant’s telephone without a warrant didn’t violate the Fourth Amendment, because the amendment had previously been applied only to physical searches.
But while it didn’t help Olmstead (a bootlegger and alcohol smuggler during Prohibition, speaking of outdated laws), Justice Brandeis’ dissent became far more influential than Taft’s majority opinion. He argued that listening to calls was no better than opening a sealed letter without a warrant – even though the telephone hadn’t yet been invented when the Fourth Amendment was penned by James Madison. As Brandeis wrote, “Subtler and more far-reaching means of invading privacy have become available to the government. Discovery and invention have made it possible for the government, by means far more effective than stretching upon the rack, to obtain disclosure in court of what is whispered in the closet.”
This seems obvious today, and the battle goes on with electronic government surveillance ostensibly for counterterrorism, video cameras in public places, and the myriad forms of tracking with and without consent on social media.
In hindsight, the learned Taft sounds a little ridiculous stating that a wiretapping subject didn’t deserve privacy protection because the phone lines “[reach] to the whole world from the defendant’s house or office,” and so he was essentially broadcasting freely to the public. Today, when we are connected to the entire planet at all times, our rights must expand to match the scope of our technology, not be frozen in time.
Being an “originalist” regarding the U.S. Constitution risks turning today’s Supreme Court justices into modern-day Tafts. A recent example produced another 5-4 decision, but this time against the government. The FBI obtained cell phone location records without a probable-cause warrant and used them to charge Timothy Carpenter. The question for the Court was whether tracking Carpenter’s location and movements through his cell phone records violated his Fourth Amendment rights.
To cite one summary, in 2018 the Court narrowly decided that the Fourth Amendment protects the “reasonable expectation of privacy,” not only property. “Expectations of privacy in this age of digital data do not fit neatly into existing precedents, but tracking persons’ movements and location through extensive cell-site records is far more intrusive than the precedents might have anticipated.”
That’s the technology/rights problem in a nutshell – legal precedents cannot keep up with technology, and even inalienable rights must adapt to survive.
Four justices dissented, mostly on the grounds that cell phone records are the same as any other business records the government can obtain – that is, not the property of the person whose records they are. Justice Clarence Thomas, in an echo of Taft 90 years earlier, also asserted that since the records weren’t searched on Carpenter’s property, it wasn’t his information. (Following that logic, the only way your data would be protected is if you ran your own personal server – and we all remember how much trouble that can cause!)
Perhaps we shouldn’t be too hard on the absolutists and constructionists who push back against the tide of technology reshaping the legal landscape. They may sound like dinosaurs, and are almost inevitably on the wrong side of history, but finding balance takes time, with chaos and division along the way. Argument is essential, and rushing into the future can be as fraught as trying to hold it back. A successful system brings together private industry, government agencies, and consumer groups and NGOs like EPIC. Transparency, accountability, and healthy public debate are our best allies in navigating the collisions between technology and regulation.
By the way, even the hallowed First Amendment, principally protecting the freedom of speech, has seen countless legal infringements, such as regulations for obscenity, libel, and false advertising. Technology has also played a part, necessitating regulation first of physical mail sent to your home and then spam sent to your inbox. After all, one’s liberty should not come at the cost of another’s pursuit of happiness.