Bruce Schneier on the Future of Privacy

Last Friday I travelled to London to see a talk by security visionary and cryptographer Bruce Schneier. The event was a fund-raiser for the Open Rights Group, and was chaired by its Executive Director, Jim Killock. His was not a demanding role. The capacity crowd of disciples, many of whom were also ORG supporters, needed no introduction to Schneier or his work. Personally, I’m an admirer of his thinking, and have been known to quote him on this blog.

The title of the talk was “The Future of Privacy” and Schneier’s treatment of his topic was comprehensive. He started by listing some technologies and practices that can threaten our privacy: overt surveillance systems; mobile phones, RFID tags and the like that produce personal information as a byproduct; automatic identification technologies such as ANPR (automatic number-plate recognition); and unique identifiers in gadgets such as digital cameras or colour laser-printers.

Schneier reminded us of his famous saying that, just as greenhouse gases are the pollution of the industrial age, data is the pollution of the information age. Data is generated when we transact business, swipe our loyalty cards, use a travel card or drive through an automatic toll-booth. We give it away when we socialise by email, instant messenger and Facebook. Sometimes other people release data about us – possibly without our consent. As the cost of processing and storing all this information falls to zero, even data of marginal value becomes worth keeping. In fact it’s often cheaper to keep everything than to decide what should be deleted! Data that was ephemeral 20 years ago is now stored.

In the information society most data about us isn’t controlled by us. In the US, laws protect the data that is under our control, but less and less of it now is. Our Gmail, phone records, medical records, financial transactions and photos of us on Facebook are all controlled by someone else. EU law is substantially better in this area, but it could still be improved.

Such a wealth of data enables new forms of surveillance. For example, surveillance can now occur backwards in time. This was done in London after the 7/7 bombings – the people responsible, and the route they took on the day, were identified after the fact from surveillance-system footage. Pervasive data collection also enables wholesale surveillance – not “follow that car” but “follow every car.”

What will be the privacy impact of our society’s continuing technological advancement?

Schneier believes a step change is coming. We live in a unique time: cameras are everywhere AND we can see them; identity checks happen all the time AND we know they’re happening. However, technology is a great disrupter of equilibriums, and Moore’s law is a friend of intrusive tools. Soon face-recognition software will obviate the need to carry ID – when you walk into your workplace they’ll already know who you are and whether you’re supposed to be there.

New invasive technologies will emerge and become pervasive: digital video surveillance with automatic face recognition; networked cameras that can track people through a city automatically; better tracking of our personal devices through their radio signatures or RFID tags; better quality images from cameras. Our era will herald the death of ephemeral conversation. Soon everything we say and do will be on the record. We could try to reject these technologies, but once general adoption occurs, opting out starts to look suspicious. In some cases the authorities have already argued that, “They left their mobile phone at home, which shows they didn’t want anyone to know where they were going.”

What can we do about these threats to our privacy?

Schneier doesn’t believe we can engineer our way back to a more private world. Privacy-enhancing technologies already exist, and they could go a long way towards restoring the balance if they gained widespread adoption. However, people are seduced by convenience, so they tend to make bad privacy trade-offs. We’re on Facebook because our friends are, and while we’re chatting to them we’re focused on the conversation, not on how much data we’re releasing or to whom.

A lot can be done by paying attention to the default settings of software and systems. Most of us won’t change these, so if they are secure from the outset any loss of privacy will be minimised. However, companies like Facebook make more money the more public we make our data, so they have no incentive to set privacy-enhancing defaults.

We need to press for legislation that protects privacy: comprehensive laws regulating what can be done with personal information about us, and more protection of our privacy from the police. However, the law finds it difficult to keep up with the pace of technological change.

We also need to start talking about the value of privacy. We want it as a social good. Individual privacy protects us from those in power and it’s also a fundamental human need. Privacy is a part of dignity.

Schneier rejects the security-versus-privacy notion as a false dichotomy. Only identity-based security reduces privacy, and its effectiveness is limited. Physical security measures such as locks and burglar alarms don’t reduce privacy. Nor does knowing that you might have to fight back if terrorists hijack your flight. We don’t need to know who’s sat next to us on an aeroplane – we just need to know whether they’re planning to blow it up! However, checking intent is difficult, so we check identity instead and pretend that’s the same thing.

Privacy and openness have different effects on Governments and citizens. Government secrecy increases its power, whereas transparency and openness reduce it. Conversely, forced openness in people increases the imbalance in power between them and the state, yet forced openness in Government reduces the gap. The balance we need to strike is between liberty and control, not privacy and security. Real security comes from having both liberty and privacy.

The above notwithstanding, sometimes we are forced to trade between security and privacy, for example when we give the police the power to search our homes. In such cases we can maintain the balance of power through audit and oversight. Search warrants are a security measure that restricts police searches to only those cases where a magistrate – an impartial advocate for the suspect – can be convinced there are reasonable grounds for suspicion.

Schneier concluded that the death of privacy is over-stated. Left unregulated and unconstrained, technology tends to tip the balance of our society against individual privacy; however, it doesn’t make the balancing act go away. Society can choose to deliberately reset the balance with legislation.

We may ultimately have to wait for a new generation of digitally-savvy lawmakers to take office before the future of privacy can be guaranteed in the information age.