
Pokémon Go and Location Privacy

There is one species of Pokémon that even the most dedicated Pokémon Go players are unlikely ever to catch, and that of course makes it all the more desirable.

Privachu like to be left alone to go about their lives. They are not unfriendly and can be quite gregarious. They are also not as rare as one might think given how difficult they are to get hold of. What makes Privachu different from all other Pokémon is that they choose when and how to reveal themselves, rather than just broadcasting their location to anyone who might want to find them. And of course they will only reveal themselves to others they trust not to pass the information on to people they do not want to be found by.

OK, they don’t exist really, I’ve just made them up (though if anyone from Niantic wants to create Privachu, I am willing to be reasonable on the royalties – do get in touch).

Pokémon Go, the augmented-reality, location-based mobile game, is currently taking the world by storm, but it has also been the source of significant concern about the amount of personal data collected by the app, and how this may be shared. This is especially important because the game is played largely by children.

Much of the early privacy concern focussed on the fact that users appeared to be required to give Niantic, the company behind the game, full access to their Google account (one of the main ways of registering in the game), which would include all their contacts and any documents stored in Google Docs.

However, it was fairly quickly revealed that this was actually the result of a configuration error, which was rapidly corrected, and that Niantic did not use, or try to access, any of the extra information it did not need to verify the identity of the player. Nevertheless, even this short-lived issue may have affected millions of people, and should provide a salutary lesson in putting privacy thinking at the heart of the user experience design process.

The long-term privacy issues with Pokémon Go, however, clearly centre on location. Of course, location-based digital services have been around for at least as long as the smartphone itself. Aside from the obvious ubiquity of connectivity, location-driven services are the smartphone's killer app, the one that in many ways makes all the investment worthwhile.

What is perhaps different about Pokémon Go is that it is not simply collecting location data – it is actively incentivising large numbers of people to visit particular locations where Pokémon can be caught.

Yes, there are big questions around the privacy implications of sharing (selling) location information with third parties, and those questions are already giving rise to investigations, notably in the USA and Germany.

What I think is more interesting is – how are decisions made about where to place PokéStops, and what Pokémon are to be found there? There is a huge potential here for a kind of targeted manipulation, the encouragement of particular audiences and profiles to visit specific locations. Niantic would be crazy if they didn’t see the potential in selling this capability, and I would be very surprised if on some level they are not already either doing it or thinking about doing it. There will be a powerful profit motive for it. Want to drive more visitors to your location? Pay for a particular Pokémon to make an appearance, or your competitor will.

Then of course there are also the unintended applications of the data. There have already been stories of crimes, even a murder, linked to the location data elements of the game. How long before the first major hack is uncovered?

Pokémon Go is going to be an interesting privacy story for quite some time, I think. Not simply because of its huge popularity, though in no small part because of it: the use of location data is only going to grow over the coming years, and the issues are only going to get more complex. The popularity of Pokémon Go, and the huge amount of data it generates, will almost certainly make it a pioneering proving ground for both the problems and, hopefully, the solutions.

Meanwhile, if you’d like to know where to find Privachu, you will have to wait for them to reach out, when they have learnt to trust you.

Consent under the General Data Protection Regulation

The consent of the individual for use of their information has long been a cornerstone of privacy and data protection law around the world.  It is widely seen as one of the simplest and most transparent ways to ensure fair and legal processing.  Yet in many ways consent has come under increasing attack in terms of its suitability to achieve this in a balanced way.  In a digital world, with ever more personal data being collected and analysed, on ever smaller screens, or in the case of many Internet of Things (IoT) devices no screen at all, the utility, validity and viability of consent-based data processing is regularly questioned, even if the alternatives seem paternalistic or sneaky.

With this in mind it only seems right to delve into the consent provisions laid out in the General Data Protection Regulation (GDPR) and see what we find.  I’m not going to promise a complete analysis of every aspect of the regulation that touches on, or is touched by, the issue of consent, but I hope to cover the most salient, practical points of concern.

The Definition

Article 4 of the GDPR provides the core definition of consent as:

any freely given, specific, informed and unambiguous indication of his or her wishes by which the data subject, either by a statement or by a clear affirmative action, signifies agreement to personal data relating to them being processed;

Although the final text only requires consent to be explicit for certain types of data processing, the definition here sets quite a high bar for all forms of consent.

Affirmative Action

Notably, we have this idea of “a clear affirmative action”, and Recital 32 spells out both what is and what is not valid:

This could include ticking a box when visiting an Internet website, choosing technical settings for information society services or by any other statement or conduct which clearly indicates in this context the data subject’s acceptance of the proposed processing of their personal data.

Silence, pre-ticked boxes or inactivity should therefore not constitute consent.

This last element in particular seems to destroy the notion of ‘implied consent’, where simply using a service, particularly a digital one, can be taken as an indication of agreement.

So the subject must take an action, and that action will have to be a clear indication of consent.  This would appear to rule out any other actions a user might make on their device that could easily be misinterpreted, a subject I may return to at a later date.

Freely Given

There is a particularly high bar for determining whether or not consent is freely given and this may create the greatest difficulties for certain types of digital services.

There must be a “genuine and free choice”, which is particularly emphasised in Article 7(4):

When assessing whether consent is freely given, utmost account shall be taken of the fact whether, among others, the performance of a contract, including the provision of a service, is made conditional on the consent to the processing of data that is not necessary for the performance of this contract.

Many so-called ‘free’ web services rely on monetisation through behavioural advertising, which itself means profiling of visitors.  If access to those services is made conditional on allowing profiling – then there can be no valid consent for the profiling activity.

One of the recent trends we have seen is publishers preventing visitors using ad-blockers from viewing content.  This strategy may have to be re-thought, particularly as Recital 42 makes clear: “consent should not be regarded as freely given if the data subject… is unable to refuse or withdraw consent without detriment.”

Article 7(3) also makes the point that “It shall be as easy to withdraw as to give consent.”

When taken in conjunction with the first point about affirmative action, this suggests that if consent is provided through an action like a click on a button or link, then to be freely given it must also be withdrawn through a similarly simple and easily accessible action.

Specific and Informed

For consent to data processing to be specific, it must be separated from other types of consent and actions.  This might mean, for example, that agreeing to the terms of service for delivery of an item you have bought online should be a separate action from agreeing to have your data shared with third parties for marketing purposes.

In addition, being informed means knowing about all the different purposes of processing, and knowing the identity of the data controller, as a bare minimum.  It also means being informed of one’s rights, such as the ability to withdraw consent or object to some types of processing, like profiling.

Although these kinds of provisions have been around for a long time, the requirements for meeting them are much more clearly defined in the GDPR.  There has been a long history of smaller websites in particular cutting and pasting privacy notices from other sources without much thought.  That kind of approach will be much higher risk under the GDPR.  To produce a valid notice, organisations will have to have a thorough knowledge of their uses of personal data.

Demonstrating Consent

One of the many significant changes introduced by the GDPR is the move towards greater organisational accountability and a shifting of the burden of proof for compliance.

So one of the conditions for valid consent, in Article 7(1), states: “the controller shall be able to demonstrate that consent was given by the data subject to the processing of their personal data.”

This means not just recording the fact that someone ticked a box in a form, but having an audit trail that links the action to any notice and the actual processing of the data concerned.
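To illustrate what such an audit trail might look like in practice, here is a minimal sketch of a consent record that links the subject’s action to the notice version they saw, the purposes covered, and any later withdrawal.  The field names and structure are entirely hypothetical – the GDPR prescribes no particular format – but they capture the elements discussed above.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List, Optional

@dataclass
class ConsentRecord:
    """Illustrative audit-trail entry: links a consent action to the
    notice shown and the processing purposes it covers."""
    subject_id: str
    action: str                   # e.g. "ticked_marketing_checkbox"
    notice_version: str           # which privacy notice the subject was shown
    purposes: List[str]           # processing purposes consented to
    given_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))
    withdrawn_at: Optional[datetime] = None

    def withdraw(self) -> None:
        """Withdrawal must be as easy as giving consent (Article 7(3))."""
        self.withdrawn_at = datetime.now(timezone.utc)

    def covers(self, purpose: str) -> bool:
        """Is there currently valid consent for this purpose?"""
        return self.withdrawn_at is None and purpose in self.purposes

# Example: record consent, check it, then withdraw it.
record = ConsentRecord(
    subject_id="user-123",
    action="ticked_marketing_checkbox",
    notice_version="2016-07-v2",
    purposes=["email_marketing"],
)
print(record.covers("email_marketing"))  # True while consent stands
record.withdraw()
print(record.covers("email_marketing"))  # False once withdrawn
```

The key design point is that the record is evidential: it does not merely store a boolean flag, but ties the who, when, what-was-shown and what-was-agreed together, so the controller can later demonstrate the basis for each processing activity.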

Failure to be able to verify consent records in some way will itself be a breach of the requirements for legal consent. This not only exposes the organisation to a risk of enforcement, it can also potentially render large swathes of personal data useless for any purposes that are reliant on consent.

Administrative Fines

It is well known that the GDPR creates the ability for regulators to impose huge fines on organisations for compliance failures.  What has been less publicised is the level of detail on how these fines might be meted out.

In the UK we saw throughout 2015 how the ICO handed out its largest fines for unsolicited (read unconsented) marketing.  The GDPR strengthens the hand of regulators for this type of enforcement.

So in Article 83 we see that infringements of the basic principles of processing “including conditions for consent” can be subject to the highest level of fines, which may be the higher of 20 million euros or 4% of “total worldwide turnover of the preceding financial year”. Ouch.

Conclusion

This area of compliance has, until now and for many businesses, been the least likely to be well managed, and the most likely to involve bending or breaking the rules.  Under the GDPR, legally valid, documented consent could well become one of the most important things to get right.

If you need any help preparing for the GDPR, and particularly with issues around use and proof of consent, please get in touch today.