The undercover war on your internet secrets: How online surveillance cracked our trust in the web

A black-shrouded figure appears on the screen, looming over the rapt audience, talking about surveillance. But this is no Big Brother figure seeking obedience – rather the opposite.

Perhaps even his nemesis.

NSA contractor-turned-whistleblower Edward Snowden is explaining how his former employer and other intelligence agencies have worked to undermine privacy on the internet and beyond.

“We’re seeing systemic attacks on the fabrics of our systems, the fabric of our communications… by undermining the security of our communications, they enable surveillance,” he warns.

He is speaking at the conference via a video link from Russia, where he has taken refuge after leaking the documents detailing some of the NSA’s surveillance projects. The room behind him is in darkness, giving away nothing about his exact location.

“Surveillance is not possible when our movements and communications are safe and protected – a satellite cannot see you when you are inside your home – but an unprotected computer with an open webcam can,” he adds.

Over the last two years a steady stream of documents leaked by Snowden has laid bare how intelligence agencies in the US and the UK have waged a secret war against privacy on the internet – how they have worked to undermine the technologies used by billions of people every day to protect everything from mundane messages – or webcam chats – to their most secret thoughts.

One of the most significant technologies being targeted by the intelligence services is encryption.

Online, encryption surrounds us, binds us, identifies us. It protects things like our credit card transactions and medical records, encoding them so that – unless you have the key – the data appears to be meaningless nonsense.

Encryption is one of the elemental forces of the web, even though it goes unnoticed and unremarked by the billions of people that use it every day.

Edward Snowden speaking at the CeBIT tech show
 Image: Deutsche Messe, Hannover

But that doesn’t mean that the growth in the use of encryption isn’t controversial.

For some, strong encryption is the cornerstone of security and privacy in any digital communication, whether that’s your selfies or the messages of campaigners against an autocratic regime.

Others, mostly police and intelligence agencies, have become increasingly worried that the absolute secrecy that encryption provides could make it easier for criminals and terrorists to use the internet to plot without fear of discovery.

As such, the outcome of this war over privacy will have huge implications for the future of the web itself.

The code wars

Codes have been used to protect data in transit for thousands of years, and have long been a key tool in warfare: the Caesar cipher was named after Julius Caesar, who used it to protect his military secrets from prying eyes.

These ciphers were extremely basic, of course: the Caesar cipher turned a message into code simply by replacing each letter with the one three places further along the alphabet, so that ‘a’ became ‘d’.
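The scheme is simple enough to sketch in a few lines of Python (a hypothetical `caesar` helper, shown purely for illustration):

```python
def caesar(text, shift):
    """Shift each letter by `shift` places, wrapping around the alphabet."""
    result = []
    for ch in text:
        if ch.isalpha():
            base = ord('a') if ch.islower() else ord('A')
            result.append(chr((ord(ch) - base + shift) % 26 + base))
        else:
            result.append(ch)  # leave spaces and punctuation untouched
    return ''.join(result)

# With a shift of three, 'a' becomes 'd', exactly as described above
print(caesar("attack at dawn", 3))   # dwwdfn dw gdzq
print(caesar("dwwdfn dw gdzq", -3))  # attack at dawn
```

Breaking it is equally trivial: with only 25 possible shifts, an attacker simply tries them all – one reason modern ciphers rely on large secret keys rather than secret methods.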

Ciphers became more sophisticated, and harder to break, over the centuries, but it was the Second World War that demonstrated the real importance of encryption – and cracking it. The work done at Bletchley Park to crack German codes including Enigma had a famous impact on the course of the war.

As a result, once the war was over, encryption technology was put on the US Munitions List alongside tanks and guns as an ‘auxiliary military technology’, which put restrictions on its export.

“The real fundamental problem is the internet and the protocol it’s all based on was never intended to be secure.” Alan Woodward, Surrey University

In practice, these government controls didn’t make much difference to ordinary people as there were few uses for code-making – that is, encryption – outside the military.

But all that changed with the arrival of the personal computer. It became an even bigger issue as the huge economic potential of the web became apparent.

“The internet and the protocol it’s all based on was never intended to be secure, so if we are going to rely on the internet as part of our critical national [and] international infrastructure, which we do, you’ve got to be able to secure it, and the only way to do that is to layer encryption over the top,” explains Professor Alan Woodward, a computer security expert at the University of Surrey.

Few would be willing to use online shopping if their credit card details, address, and what they were buying were being sent across the internet for anyone to see.

Encryption provides privacy by encoding data into what appears to be meaningless junk, and it also creates trust by allowing us to prove who we are online – another essential element of doing business over the internet.

“A lot of cryptography isn’t just about keeping things secret, a lot of it is about proving identity,” says Bill Buchanan, professor of computing at Edinburgh Napier University. “There’s a lot of naïveté about cryptography as to thinking it’s just about keeping something safe on your disk.”
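Buchanan’s point about proving identity can be sketched with a keyed hash from Python’s standard library – a minimal sketch only, with a made-up shared secret; real web identity relies on public-key certificates and digital signatures rather than this scheme:

```python
import hashlib
import hmac

SECRET = b"shared-secret-key"  # hypothetical key the two parties agreed in advance

def sign(message: bytes) -> str:
    """Produce a tag that only someone holding the secret can compute."""
    return hmac.new(SECRET, message, hashlib.sha256).hexdigest()

def verify(message: bytes, tag: str) -> bool:
    """Check the tag in constant time to confirm the sender knew the secret."""
    return hmac.compare_digest(sign(message), tag)

tag = sign(b"pay alice 100")
print(verify(b"pay alice 100", tag))    # True: message is authentic
print(verify(b"pay mallory 100", tag))  # False: any tampering breaks the tag
```

Nothing here is kept secret in transit – the point is not confidentiality but proof of who sent the message and that it was not altered.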

But the rise of the internet suddenly meant that access to cryptography became an issue of privacy and economics as well as one of national security, immediately sparking the clash that came to be known as ‘the crypto wars’.

Governments fought to control the use of encryption while privacy advocates insisted its use was essential not just for individual freedom, but also to protect the commercial development of the nascent internet.

What followed was a series of skirmishes as the US government and others made increasingly desperate – and unsuccessful – efforts to reassert control over encryption technologies. One example came in the mid-90s, when the NSA designed the Clipper chip, a way to give the agency access to the communications on any device on which the chip was installed.

Another attempt at government control during this period came with the introduction of key escrow. Under the scheme, the US government would agree to license encryption providers, if they gave the state access to the keys used to decode communications.

On top of this were rules which only allowed products that used weak and easily-cracked encryption to be exported from the US.

Banksy’s surveillance-themed street art in Cheltenham, near GCHQ’s HQ
 Image credit: Stephen Clarke / Shutterstock.com

Remarkably there was an unwelcome reminder of those days of watered-down encryption with the appearance of the recent FREAK flaw in SSL. The vulnerability could be used to force web browsers to default to the weaker “export-strength” encryption, which can be easily broken.

Few experts even knew that the option to use the weaker encryption still existed in the browsers commonly used today – a good example of the dangerous and unexpected consequences of attempts to control privacy technologies, long after the political decisions behind them had been reversed and forgotten.

But by the early 2000s, it appeared that the privacy advocates had effectively won the crypto wars. The Clipper chip was abandoned, strong encryption software exports were allowed, key escrow failed, and governments realised it was all but impossible for them to control the use of encryption. It was understood that if they tried, the damage they would do to the internet economy would be too great.

Individual freedoms, and simple economics, had overwhelmed national security. In 2005, one campaigning group even cheerfully announced “The crypto wars are finally over and we won!”

They were wrong.

We now know that the crypto wars were never over. While privacy campaigners celebrated their victory, intelligence agencies were already at work on breaking and undermining encryption. The second stage of the crypto wars – the spies’ secret war – had begun.

Antique names, modern surveillance

Naming their most confidential, controversial, and expensive projects after civil war battles was probably a dark inside joke that the spies of the NSA and GCHQ never expected to see made public.

But Bullrun and Edgehill – the first battles from the American and English civil wars respectively – were the names given by the US and British intelligence services to their attacks on the encryption systems that underpin the communications of billions of people.

The documents provided by Snowden detail at least some of this secret war. It’s where those civil war-inspired codenames were revealed, just one part of a multi-billion dollar assault on the use of encryption which has been gradually revealed over the last two years.

According to a top secret briefing paper published by The Guardian newspaper, the aim of ‘Project Bullrun‘ (the first Battle of Bull Run ended in victory for the Confederates) was explicitly to “defeat the encryption used in specific network communication technologies.”

Another Snowden document published by The New York Times detailed some of the methods the NSA was using with the aim of “defeating network security and privacy.” The project involved multiple sources and methods (“all of which are extremely sensitive and fragile”), including “computer network exploitation” (a polite way of saying hacking into a network), collaboration with other intelligence agencies, investment in high-performance computers, and the development of advanced mathematical techniques.

Bullrun claimed to be able to circumvent the encryption used in SSL, HTTPS, SSH, encrypted chat, VPNs and encrypted VoIP – many of the most widely used privacy and security technologies deployed today.

The UK’s intelligence agency GCHQ also has a related encryption-cracking effort, called Edgehill (the Battle of Edgehill was an early victory for King Charles I of England) which focused on attacking encrypted traffic certified by three major internet companies, finding flaws in virtual private networks, and identifying digital certificates that it might be able to crack.

GCHQ’s headquarters, known as ‘The Doughnut’, in Cheltenham.
 Image: Ministry of Defence

A 2013 NSA budget request – revealed in another of the Snowden documents – shows that the NSA’s plans included creating backdoors into commercial encryption systems and influencing the standards and specifications used as the foundations of privacy technologies with the intention of making their access easier.

The document states: “Resources in this project are used to… insert vulnerabilities into commercial encryption systems, IT systems, networks and endpoint communications devices used by targets.”

The list goes on: another cryptography budget request published by The Intercept states: “This project enables the defeat of strong commercial data security systems; develops capabilities to exploit emerging information systems and technologies that are employed or may be employed by SIGINT targets; develops analytic algorithms, processes, and procedures to exploit emerging information systems technologies; and develops initial recognition, exploitation, and prototype solutions against new technology targets.”

And last year the US National Institute of Standards and Technology was forced to remove a cryptographic algorithm from its list of random number generators – after allegations that the NSA had deliberately weakened it to make it easier to crack.

It’s not just the NSA and GCHQ that have been tinkering with encryption either: the CIA has also been revealed to have waged a campaign against the encryption used to secure iPhones and iPads with the intention of being able to use the devices to spy on their targets.

But possibly the most audacious attack by the NSA and GCHQ on the privacy and security of communications was a heist aimed at grabbing encryption keys from SIM maker Gemalto, according to a story published by The Intercept.

The attack is striking in that Gemalto was not the final target: the move was likely aimed at gathering information on users of mobile phones with Gemalto technology onboard in Afghanistan, Yemen, India, Serbia, Iran, Iceland, Somalia, Pakistan, and Tajikistan. Gaining access to the keys would have given spies access to calls made on those phones that would otherwise be scrambled. Targeting a company simply because it made technology used by others was, until then, unheard of.

Gemalto carried out an investigation into the hacking attacks in 2010 and 2011, and found there had been no mass leak of encryption keys. “We are conscious that the most eminent state agencies, especially when they work together, have resources and legal support that go far beyond that of typical hackers and criminal organizations. And, we are concerned that they could be involved in such indiscriminate operations against private companies with no grounds for suspicion,” it said.

GCHQ’s response was the standard one: “All of GCHQ’s work is carried out in accordance with a strict legal and policy framework which ensures that our activities are authorised, necessary and proportionate, and that there is rigorous oversight, including from the Secretary of State, the Interception and Intelligence Services Commissioners and the Parliamentary Intelligence and Security Committee.

“All our operational processes rigorously support this position. In addition, the United Kingdom’s interception regime is entirely compatible with the European Convention on Human Rights.”

It’s worth noting that only a tiny fraction of the Snowden documents have so far been made public. It may well be that these are just a small proportion of the incidents that make up a far larger secret war.

The encryption backlash

Of course, it’s often argued that all of this activity is simply the NSA and GCHQ doing their job: they break codes and have done for decades, to make sure that criminals, terrorists, and others cannot plot in secret. If this means exploiting weaknesses in software in order to eavesdrop on those who are plotting crime, then so be it.

As GCHQ told a government enquiry set up after the Snowden revelations: “Our goal is to be able to read or find the communications of intelligence targets.”

From that perspective, they’re doing nothing more than the code-breakers of Bletchley Park did back in WWII – cracking codes in secret to fight the country’s enemies.

But many argue that the analogy doesn’t hold: Bletchley worked on cracking codes used by, and only by, the Nazis. What the NSA and GCHQ have been doing is breaking the codes used by everyone, good and bad, both outside of the US and inside it. By doing so, they risk undermining the security of all communications and transactions.

Those weaknesses and backdoors built in by the NSA and its colleagues elsewhere can be used by hackers and hostile states as easily as they can be by our own intelligence agencies. Access for them to spy on the few automatically means insecurity for the rest of us.

As Snowden told the recent CeBIT conference in Germany: “When we talk about security and surveillance, there is no golden key that allows only good guys to read the communications of only terrorists.”

Some privacy advocates also argue that no government should ever have such a capability to trawl through the lives of individuals. “It produces an inescapable prison. We can’t let this happen. We have to, as a matter of civic hygiene, prevent it from happening,” Phil Zimmermann, the creator of the PGP encryption software, said recently.

And if the Snowden revelations themselves were an embarrassment for the intelligence agencies, the consequences for their intelligence gathering capabilities have been far worse.

One document revealed that the NSA had been systematically scooping up unencrypted traffic travelling between the distributed datacentres of internet companies, giving it access to vast amounts of customers’ email, video chats, browsing history, and more.

In response, big internet companies such as Yahoo and Google rapidly started encrypting this traffic to shut out the watchers. As one cryptography expert, Matthew Green from Johns Hopkins University, noted at the time: “Good job NSA. You turned Yahoo into an encryption powerhouse.”

Encrypting data links between datacentres was only the beginning. As the revelations continued to tumble out, more companies decided it was time to increase the privacy of their services, which meant even more encryption.

“If those of us in positions of responsibility fail to do everything in our power to protect the right of privacy we risk something far more valuable than money. We risk our way of life.” Tim Cook, Apple CEO

“Encryption has only really become a big issue again because Snowden showed the world how insecure the infrastructure was and how it was being abused by intelligence agencies and so companies started reacting,” said Gus Hosein, the executive director of campaigning group Privacy International.

Perhaps surprisingly, given the decade-long assault on encryption, it seems the fundamentals remain strong, so long as it has been well implemented. As Snowden said: “Encryption works. Properly implemented, strong crypto systems are one of the few things that you can rely on,” before adding the caveat: “Unfortunately, endpoint security is so terrifically weak that NSA can frequently find ways around it.”

Consumer applications are jumping on the encryption bandwagon. In November 2014, the popular WhatsApp messaging service also switched on end-to-end encryption for hundreds of millions of users who post billions of messages each day.

Apple’s iOS 8 operating system now encrypts iMessage conversations and Facetime video chats end-to-end and notes: “Apple has no way to decrypt iMessage and FaceTime data when it’s in transit between devices. So unlike other companies’ messaging services, Apple doesn’t scan your communications, and we wouldn’t be able to comply with a wiretap order even if we wanted to.”

Using end-to-end encryption like this effectively locks out law enforcement which would previously have been able to access communications at the datacenter with a warrant.

Speaking at a cybersecurity summit hosted by the White House at Stanford University, Apple CEO Tim Cook made his position clear, saying that providing privacy was a moral stance: “History has shown us that sacrificing our right to privacy can have dire consequences. We still live in a world where all people are not treated equally. Too many people do not feel free to practice the religion or express their opinion or love who they choose, a world in which that information can make the difference between life and death.”

Apple CEO Tim Cook at the White House cyber security summit
 James Martin/CNET

“If those of us in positions of responsibility fail to do everything in our power to protect the right of privacy we risk something far more valuable than money. We risk our way of life,” said Cook.

Apple isn’t alone in this. The Electronic Frontier Foundation lists a variety of applications that to a greater or lesser extent now encrypt communications in transit or end-to-end.

The backlash had begun to gather pace.

*******

“Post-Snowden, the companies are now making their devices technically inaccessible even to themselves.” David Omand, former GCHQ director

This unexpected shift towards greater privacy caught the intelligence services and law enforcement off guard. They suddenly found that easy sources of data had gone dark. Senior officials on both sides of the Atlantic began to warn that criminals and terrorists would be able to slip through their fingers. As GCHQ’s new director Robert Hannigan said:

“Techniques for encrypting messages or making them anonymous which were once the preserve of the most sophisticated criminals or nation states now come as standard. These are supplemented by freely available programs and apps adding extra layers of security, many of them proudly advertising that they are ‘Snowden approved’.”

He wasn’t alone in voicing such fears. Late last year, one of his predecessors Sir David Omand gave a similar warning to a government privacy and security inquiry.

“Law enforcement faces increasing difficulty in accessing heavily encrypted material that may be found on their suspects’ mobile phones or computers… Post-Snowden, the companies are now making their devices technically inaccessible even to themselves, so warrants are rendered moot,” said Omand.

And it’s not only the intelligence agencies that are warning about the risk that encryption poses. Early this year, British prime minister David Cameron unexpectedly upped the stakes by getting involved too: “In our country, do we want to allow a means of communication between people, which even in extremes, with a signed warrant from the home secretary personally, that we cannot read?” The speech that contained these remarks was widely interpreted as an attack on the use of strong encryption, and either a veiled call for a return to the failed 1990s policy of key escrow, or possibly even a move to forbid the use of end-to-end encryption in the UK.

Days later, another leaked document revealed that the EU’s counter-terrorism coordinator Gilles de Kerchove wanted internet companies to share their encryption keys, warning that de-centralised (end-to-end) encryption was making lawful interception “technically difficult or even impossible“.

“It’s their fault that life is going to get terribly difficult for them, because they were caught trying to steal from the cookie jar, or just breaking the cookie jar wide open by smashing it on the floor,” countered Privacy International’s Hosein.

And few experts think that encryption is going to be banned anytime soon, no matter what the politicians might say.

“It’s not that people want terrorists to be able to operate with impunity. It’s the practical implications of some of what’s been said,” the University of Surrey’s Woodward said. “The trouble is that everybody relies on encryption on the internet. So if you were to ban it, you would make it almost impossible to do any business online.”

“It’s their fault that life is going to get terribly difficult for them because they were caught trying to steal from the cookie jar.” Gus Hosein, Privacy International

As Woodward points out, the technology has moved on since this was debated in the 1990s and 2000s. For example, thanks to something called perfect forward secrecy, new encryption keys are issued for every session, so something like key escrow would no longer work.
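The mechanism behind forward secrecy can be sketched with an ephemeral Diffie-Hellman exchange – here a toy Python version with a deliberately tiny, insecure group, purely for illustration (real deployments use 2048-bit-plus groups or elliptic curves):

```python
import secrets

# Toy Diffie-Hellman group: a small prime p and generator g.
# These parameters are far too small to be secure; illustration only.
p = 4294967291  # the largest prime below 2**32
g = 5

def ephemeral_keypair():
    """Generate a fresh private/public pair, used for one session only."""
    priv = secrets.randbelow(p - 2) + 1
    return priv, pow(g, priv, p)

# Each new session runs this exchange from scratch, so seizing a key
# later reveals nothing about the keys that protected past sessions.
alice_priv, alice_pub = ephemeral_keypair()
bob_priv, bob_pub = ephemeral_keypair()

alice_shared = pow(bob_pub, alice_priv, p)
bob_shared = pow(alice_pub, bob_priv, p)
print(alice_shared == bob_shared)  # True: both derive the same session key
```

Because the private values are thrown away after each session, there is no long-lived key an escrow scheme could usefully hold.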

“It would send us back to the dark ages of the internet. The protocols we created in the past really didn’t have security in mind. They’re still based on someone typing at a terminal,” said Buchanan from Edinburgh Napier University.

Another unexpected consequence of the revelations about Western intelligence agencies’ behaviour is that, unsurprisingly, other nations have also demanded access to encryption keys. That’s the problem with putting backdoors into secure systems: once one nation, law enforcement agency, or legal system has them – officially or unofficially – then everybody wants one.

For example, a new anti-terrorism law proposed in China, which could be adopted in 2015, would require technology firms that want to do business in the country to turn over their encryption keys and communications records to the government. President Obama has complained about the proposed legislation, demonstrating neatly that one country’s dangerous backdoor security vulnerability is another country’s essential tool for fighting terrorism.

Unscrambling the future of encryption

As the more subtle attempts at undermining security become impossible, spies will have to find alternative routes to access their targets. It is likely significant that earlier this year the UK government published the legal framework under which GCHQ and other British spies can hack into, plant bugging devices on, or even steal and replace computers, servers, routers, laptops, and mobile phones, either to obtain information or to conduct surveillance.

The guidelines create a legal framework for such behaviour under UK law, and even discuss potential intelligence-gathering activities involving hacking attempts against people who are not themselves intelligence targets, noting: “Where it is proposed to conduct equipment interference activity specifically against individuals who are not intelligence targets in their own right, interference with the equipment of such individuals should not be considered as collateral intrusion but rather as intended intrusion.”

This gives some credence to Snowden’s recent claim that intelligence agencies are targeting IT staff because they have access to systems and databases.

It’s also worth noting that, despite the anguished howls from law enforcement, spy agencies and others still have plenty of data left to sift.

Firstly, encryption is really, really hard to get right: as projects like Bullrun and others have proved, the intelligence agencies and law enforcement still have plenty of ways around it. There are legal tools, for example: the UK has legislation in place which makes it an offence to not hand over encryption keys when requested by law enforcement, punishable by up to five years in prison.

And while many tech companies may well encrypt customers’ data when it is on the move – for example, between datacentres – they cannot secure it entirely using end-to-end encryption, simply because they need to look at it themselves at some point, for example to sell advertising against the subject matter of an email. The advertising-driven business models of Silicon Valley rule out the pervasive use of strong end-to-end encryption, and that means intelligence agencies and police can continue to gain access to vast amounts of information.

Police and intelligence agencies still have plenty of other data sources – the metadata on communications, for example, including who you have called, when, and for how long – as well as CCTV footage and more.

“Law enforcement agencies have access to more data now than they have had in the history of time. Pre-Facebook, how hard would it be for any law enforcement agency on the planet to find out all your known associates, they’d have to question dozens of people to find out who it is you know. They are able to get access to vast amounts of information just by asking. They complain that they’re not getting enough information but they’ve had more than they’ve ever had before,” said Privacy International’s Hosein.

Edinburgh Napier University’s Buchanan echoes the sentiment: “There are now so many ways that investigators can actually investigate someone who is suspected of committing a crime there isn’t really a problem. This isn’t going to shut the door.” Good old-fashioned policing and follow-the-money are still the most effective ways of catching the bad guys.

And widespread usage of strong encryption is not the worst scenario for the spies: harder-to-crack and harder-to-detect technologies are already either in existence or in development. One such technology is steganography – hiding communications within digital images – and it’s incredibly hard to spot. Equally, quantum encryption could do away with the inherent weaknesses of the public key infrastructure systems used today and make any interception of messages detectable.
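The principle behind image steganography can be sketched in a few lines of Python – hiding message bits in the least significant bit of each byte of cover data. This is a minimal sketch with hypothetical `hide` and `reveal` helpers; a real tool would operate on actual image pixel data:

```python
def hide(cover: bytes, message: bytes) -> bytes:
    """Embed message bits into the least significant bit of each cover byte."""
    bits = [(byte >> i) & 1 for byte in message for i in range(7, -1, -1)]
    assert len(bits) <= len(cover), "cover too small for message"
    out = bytearray(cover)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & 0xFE) | bit  # overwrite only the lowest bit
    return bytes(out)

def reveal(stego: bytes, length: int) -> bytes:
    """Read back `length` hidden bytes from the cover's low bits."""
    bits = [b & 1 for b in stego[:length * 8]]
    return bytes(
        sum(bit << (7 - i) for i, bit in enumerate(bits[n:n + 8]))
        for n in range(0, len(bits), 8)
    )

cover = bytes(range(256))         # stand-in for image pixel data
stego = hide(cover, b"hi")
print(reveal(stego, 2))           # b'hi'
```

Each cover byte changes by at most one, so the carrier looks statistically almost untouched – which is exactly what makes this kind of channel so hard to spot.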

Still, even the experts don’t really know how the future of encryption is going to play out: there is apparently no way of accommodating both the desire of the intelligence agencies to access the data they want and the safe and secure working of the web as we know it. They are mutually exclusive, and mutually antagonistic. Like the best encryption, the problem of making national security and privacy work together seems uncrackable.

“Many of us agree with the sentiment – I am one of them – that from a security perspective you don’t want people who would do you harm being able to talk in secret. But at the same time if your answer to that is to ban encryption, that is a very bad way; the technology is not good or evil, it is the people using it,” said the University of Surrey’s Woodward.

“If we can’t secure these things then people will die.” Gus Hosein, Privacy International

Technology is unlikely to offer a way out of this impasse. As the power of supercomputers (or, more likely, giant cloud arrays) continues to grow, it’s easy enough to increase the size of the key – from 512, to 1024, to 2048 bits and onwards.
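The arithmetic behind that arms race is stark: each extra bit doubles the number of possible keys, so a naive brute-force search scales exponentially with key length (real attacks on RSA keys work by factoring, which is faster than brute force but still climbs steeply with size):

```python
# A naive brute-force view: an n-bit key means 2**n candidate keys to try.
for bits in (512, 1024, 2048):
    digits = len(str(2 ** bits))
    print(f"{bits}-bit keyspace: 2**{bits}, a number {digits} digits long")
```

Doubling the key length doesn’t double the attacker’s work – it squares the size of the search space, which is why defenders can stay ahead simply by lengthening keys.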

Even if quantum computers, long touted as a way of cracking all encryption almost immediately, become widespread (and “we’ve been talking about viable quantum computers since the 80s and they’re always 10 years away,” Woodward notes) the reality is that, although they would undermine encryption in one way, they will also boost it through quantum key distribution.

But the stakes may continue to rise, at least from a certain point of view.

“The security of our common computing infrastructure is even more important now than it was back then. Back in the 1990s, the reason we won was because every economy wanted to be the best marketplace for ecommerce on the planet so they knew they could not put constraints on security technology if they wanted to enable all that ecommerce,” said Privacy International’s Hosein.

And soon those issues of privacy and security will become as concrete as the buildings we live in. With the advent of smart grids, the internet of things and smart cities we will be using the web to monitor and control real world systems. “If we can’t secure these things then people will die,” he warns. This also raises another issue: as our houses and even clothes are filled with sensors, what sort of privacy is appropriate? Is it right that we should be snooped on through our smart TV or networked baby monitor? Can we draw a line anywhere?

When President Obama was asked about the issue of encryption his response was nuanced. While he said he supported strong encryption he also noted: “The first time an attack takes place and it turns out that we had a lead and we couldn’t follow up on it, the public is going to demand answers, and so this is a public conversation that we should end up having.”

It’s entirely possible to argue that we don’t need another public debate about encryption: that we had one back in the 1990s where privacy overwhelmed national interest – it’s just that the intelligence services didn’t like the answer.

But there are plenty of good reasons why we do need to go over the arguments about encryption again.

“This is a public conversation that we should end up having.” President Barack Obama

Back in the 1990s and 2000s, encryption was a complicated, minority interest. Now it is becoming easy and mainstream, not just for authenticating transactions but for encrypting data and communications.

Back then, it was also mostly a US debate, because that was where most strong encryption was developed. But that’s no longer the case: encryption is made everywhere now and is effectively a commodity, which means no one country can dictate global policy anymore.

Consider this: the right to privacy has long been considered a qualified rather than an absolute right – one that can be infringed, for example, on the grounds of public safety, or to prevent a crime, or in the interests of national security. Few would agree that criminals or terrorists have the right to plot in secret.

What the widespread use of strong, well-implemented encryption does is promote privacy to an absolute right. If you have encrypted a hard drive or a smartphone correctly, it cannot be unscrambled (or at least not for a few hundred thousand years).

At a keystroke, it makes absolute privacy a reality, rewriting one of the fundamental rules by which societies have been organised. No wonder the intelligence services have been scrambling to tackle our deliberately scrambled communications.

And our fear of crime – terrorism in particular – has created another issue. We have demanded that the intelligence services and law enforcement try to reduce the risk of attack, and have accepted that they will gradually chip away at privacy in order to do that.

However, what we haven’t managed as a society is to decide what is an acceptable level of risk that such terrible acts might occur. Without that understanding of what constitutes an acceptable level of risk, any reduction in our privacy or civil liberties – whether breaking encryption or mass surveillance – can be made to seem palatable.

The point is often made that cars kill people and yet we still drive. We need to have a better discussion about what level of safety we as a society require, and what the impact on our privacy will be as a result.

As the University of Surrey’s Woodward notes: “Some of these things one might have to accept; unfortunately there might not be any easy way around it without the horrible unintended consequences. You make your enemies less safe but you also make your friends less safe by [attacking] encryption – and that is not a sensible thing to do.”

“Working at the White House, we don’t get easy problems; easy problems get solved someplace else.” White House cybersecurity coordinator Michael Daniel

And while the US can no longer dictate policy on encryption, it could be the one to take a lead which others can follow.

White House cybersecurity coordinator Michael Daniel recently argued that, as governments and societies are still wrestling with the issue of encryption, the US should come up with the policies and processes and “the philosophical underpinnings of what we want to do as a society with this so we can make the argument for that around the planet… to say, this is how free societies should come at this.”

But he doesn’t underestimate the scale of the problem, either. Speaking at an event organised by the Information Technology and Innovation Foundation, he said: “Working at the White House, we don’t get easy problems, easy problems get solved someplace else, they don’t come to us. This is one of the hardest problems I know about, certainly that’s anywhere close to my job. And I think it’s clearly not one that’s going to be resolved easily, simply or quickly.”

Which brings us back to those civil war codenames, Bullrun and Edgehill, which may serve as an inadvertent, gloomy prophecy about the future effectiveness of the intelligence agencies, unless we have a better discussion about how security and privacy can work together online.

If not, it’s worth remembering that the Cavaliers and the Confederates both won the first battles of the English and American civil wars – and both went on to lose those bloody and divisive conflicts. Perhaps, after a few early victories in the new crypto war, the intelligence agencies may face a similar defeat, outpaced by encryption in the long term.

It may be that in a few decades, the spies look back at the tribulations of the first and second crypto wars with something approaching nostalgia.
