E2E encryption in danger (again) – sign the petition

You should sign the petition on the official UK government site.

The UK Government is at it again: trying to require technical changes to products and protocols that will severely impact (read “destroy”) the security of users. This time, it’s the Online Safety Bill and, like pretty much all similar attempts, it requires the implementation of backdoors into things like messaging services. Lots of people have stood up and made the point that this is counter-productive and dangerous – here are a couple of links:

This isn’t the first time I’ve written about this (The Backdoor Fallacy: explaining it slowly for governments and Helping our governments – differently, for a start), and I fear that it won’t be the last. The problem is that none of these technical approaches work: none of them can work. Privacy and backdoors (and this is a backdoor, make no mistake about it) are fundamentally incompatible. And everyone with an ounce (or gram, I suppose) of technical expertise agrees: we know (and we can prove) that what’s being suggested won’t and can’t work.
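To make the incompatibility concrete, here’s a minimal sketch of why a backdoor isn’t a narrowly-targeted “lawful access” feature but a break in the model itself. It’s written in Python and assumes the PyNaCl library; the names alice, bob and escrow are purely illustrative, not any real service’s protocol. In end-to-end encryption, only the two endpoints ever hold the private keys; a backdoor means encrypting to an extra key that neither endpoint controls.

```python
# Minimal sketch (assumes PyNaCl: pip install pynacl).  "alice", "bob"
# and "escrow" are illustrative names, not any real messaging protocol.
from nacl.public import PrivateKey, Box

# End-to-end: the private keys exist only on the two endpoints.
alice = PrivateKey.generate()
bob = PrivateKey.generate()

message = b"entirely lawful, entirely private"
ciphertext = Box(alice, bob.public_key).encrypt(message)

# Bob can decrypt; the service carrying the ciphertext cannot.
assert Box(bob, alice.public_key).decrypt(ciphertext) == message

# A "backdoor" means also encrypting to a key that neither endpoint
# controls.  Whoever holds that key -- or steals it, or compels access
# to it -- can now read every message, not just court-approved ones.
escrow = PrivateKey.generate()
backdoored_copy = Box(alice, escrow.public_key).encrypt(message)
assert Box(escrow, alice.public_key).decrypt(backdoored_copy) == message
```

The last two lines are the whole problem: once that extra key exists, the security of every user depends on it never leaking, never being abused and never falling into the hands of a future government you trust less than today’s.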

We gain enormous benefits from technology, and with those benefits come risks and downsides which malicious actors exploit. The problem is that you can’t have one without the other. If you try to fix (and this approach won’t fix – it might reduce, but not fix) the problem that malicious actors and criminals use online messaging services, you open up a huge number of opportunities for other actors, including malicious governments (now or in the future), to do very, very bad things, whilst significantly reducing the benefits to private individuals, businesses, human rights organisations, charities and the rest. This is not a zero-sum game: what is lost far outweighs what is gained.

What can you do? You can read up on the problem, you can add your voice to the technical discussions and, if you’re a British citizen or resident, you should sign the petition on the official UK government site. It needs 10,000 signatures, so please get signing!

Should I back up to iCloud?

Don’t walk into this with your eyes closed.

This is a fairly easy one to answer. If your answer to either of the following questions is “yes”, then you probably shouldn’t.

1. Do you have any sensitive data that you would be embarrassed to be seen by any agent of the US Government?
2. Are you a non-US citizen?

This may seem somewhat inflammatory, so let’s look into what’s going on here.

It was widely reported last week that Apple has decided not to implement end-to-end encryption for back-ups from devices to Apple’s iCloud.  Apparently, the decision was made after Apple came under pressure from the FBI, who are concerned that their ability to access data from suspects will be reduced.  This article is not intended to make any judgments about either Apple or any law enforcement agencies, but I had a request from a friend (you know who you are!) for my thoughts on this.

The main problem is that Apple (I understand – I’m not an Apple device user) do a great job of integrating all the services that they offer, and of making them easy to use across all of your Apple products.  iCloud is one of these services, and it’s very easy to use.  Apple users have got used to this simplicity, and are likely to use the service by default.  I understand this, but there’s a classic three-way tug of war for pretty much all applications or services, and it goes like this: you get to pick two of the following three properties of a system, application or service – but only two.

  1. security
  2. ease of use
  3. cost

Apple make things easy to use, and quite often pretty secure, and you pay for this.  In this case, though, the specific cost (in inconvenience, legal fees, political pressure, etc.) of making iCloud more secure seems to have outweighed the security benefit, and led Apple to decide not to enable end-to-end encryption.
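To make concrete what end-to-end encryption of backups would actually mean, here’s a minimal sketch – not Apple’s design, just an illustration in Python assuming the third-party cryptography package, with a hypothetical upload_to_cloud() standing in for the storage service.  If the key is generated and kept only on your device, whoever stores the ciphertext learns nothing useful; if the provider holds the keys, as is the case for standard iCloud backups described above, then anyone who can compel the provider can read your data.

```python
# Minimal sketch of client-side ("end-to-end") backup encryption, assuming
# the third-party "cryptography" package (pip install cryptography).
# upload_to_cloud() is a hypothetical stand-in for whichever service
# actually stores the data; it is not an Apple or iCloud API.
from cryptography.fernet import Fernet


def upload_to_cloud(blob: bytes) -> None:
    """Hypothetical placeholder: the provider only ever receives `blob`."""
    pass


# The key is generated on the device and never leaves it.
key = Fernet.generate_key()
backup = b"contacts, photos, messages..."

ciphertext = Fernet(key).encrypt(backup)
upload_to_cloud(ciphertext)  # the provider stores ciphertext it cannot read

# Only the holder of the key can recover the backup.
assert Fernet(key).decrypt(ciphertext) == backup
```

Much of the “cost” and the loss of ease of use comes from key management: if the key lives only on your devices and you lose it, nobody – including the provider – can recover the backup, which is exactly the property law enforcement objects to.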

So, it won’t be as secure as you might like.  Do you care?  Well, if you have anything you’d be embarrassed for US government agents to know about – and beyond embarrassed, if you have anything which isn’t necessarily entirely, shall we say, legal – then you should know that it’s not going to be difficult for US government agents such as the FBI to access it.  This is all very well, but there’s a catch for those who aren’t in such a position.

The catch is that the protections offered for the privacy of individuals, though fairly robust within the US, are aimed almost exclusively at US citizens.  I am in no sense a lawyer, but as a non-US citizen, I would have zero confidence that it would be particularly difficult for any US government agent to access any information that I had stored on any iCloud account that I held.  Think about that for a moment.  The US has different standards from some other countries on, for instance, drug use, alcohol use, sexual practices and a variety of other issues.  Even if those things are legal in your country, who’s to say that they won’t be used, now or in the future, to decide whether you should be granted a visa to the US, or even allowed entry at all?  There’s already talk of immigration officials checking your social media for questionable material – extending this to unencrypted data held in the US is far from out of the question.

So this is another of those issues where you need to make a considered decision.  But you do need to make a decision: don’t walk into this with your eyes closed, because once Apple holds the data, there’s realistically no keeping it out of the US government’s reach.