Pokémon Go and Location Privacy

There is one species of Pokémon that even the most dedicated Pokémon Go players are unlikely ever to catch, and that of course makes it all the more desirable.

Privachu like to be left alone to go about their lives. They are not unfriendly and can be quite gregarious. They are also not as rare as one might think, given how difficult they are to get hold of. What makes Privachu different from all other Pokémon is that they choose when and how to reveal themselves, rather than simply broadcasting their location to anyone who might want to find them. And of course they will only reveal themselves to others they trust not to pass the information on to people they do not want to be found by.

OK, they don’t exist really, I’ve just made them up (though if anyone from Niantic wants to create Privachu, I am willing to be reasonable on the royalties – do get in touch).

Pokémon Go, the augmented reality, location-based mobile game, is currently taking the world by storm, but it has also been the source of significant concern about the amount of personal data the app collects and how that data may be shared. This is especially important because the game is played largely by children.

Much of the early privacy concern focussed on the fact that users appeared to be required to give Niantic, the company behind the game, full access to their Google account (one of the main ways of registering for the game), which would include all their contacts and any documents stored in Google Docs.

However, it fairly quickly emerged that this was actually the result of a configuration error, which was rapidly corrected, and that Niantic did not make use of, or try to access, any of the extra information it didn't need to verify the identity of the player. Nevertheless, even this short-lived issue may have affected millions of people, and it should serve as a salutary lesson in putting privacy thinking at the heart of the user experience design process.

The long-term privacy issues with Pokémon Go, however, clearly centre on location. Of course, location-based digital services have been around for at least as long as the smartphone itself. Aside from the obvious ubiquity of connectivity, location-driven services are in many ways the smartphone's killer app, the one that makes all the investment worthwhile.

What is perhaps different about Pokémon Go is that it is not simply collecting location data: it is actively incentivising large numbers of people to visit particular locations where Pokémon can be caught.

Yes, there are big questions around the privacy implications of sharing (or selling) location information with third parties, and those questions are already giving rise to investigations, notably in the USA and Germany.

What I think is more interesting is this: how are decisions made about where to place PokéStops, and which Pokémon are to be found there? There is huge potential here for a kind of targeted manipulation: encouraging particular audiences and profiles to visit specific locations. Niantic would be crazy not to see the potential in selling this capability, and I would be very surprised if, on some level, they are not already doing it or thinking about doing it. There will be a powerful profit motive. Want to drive more visitors to your location? Pay for a particular Pokémon to make an appearance, or your competitor will.

Then of course there are also the unintended applications of the data. There have already been stories of crimes, even a murder, linked to the location data elements of the game. How long before the first major hack is uncovered?

Pokémon Go is going to be an interesting privacy story for quite some time, I think, and not simply because of its huge popularity, though in no small part because of that. The use of location data is only going to grow over the coming years, and the issues are only going to get more complex. The popularity of Pokémon Go, and the huge amount of data it generates, will almost certainly make it a pioneering proving ground for both the problems and, hopefully, the solutions.

Meanwhile, if you’d like to know where to find Privachu, you will have to wait for them to reach out, when they have learnt to trust you.

Optanon GDPR Compliance Manager

We have been working for several months now on a new platform to help organisations assess their readiness to comply with the EU General Data Protection Regulation (GDPR).

GDPR Compliance Manager will be released later this year as part of the stable of Optanon brand products that currently includes our Website Auditor and Cookie Consent solutions.

The platform will enable organisations to work out what changes they will need to put in place to meet the requirements of the GDPR before it comes into force. In addition, it provides planning and documentation functionality to support a change programme, as well as producing the accountability documentation that will be required.

We will be releasing more information in the coming weeks and months, but for now, here is a preview screenshot.


If you would like to know more about how Optanon GDPR Compliance Manager might help you, and arrange a demo, please give us a call or drop us an email.

Privacy and Social Media: Incompatible or Indispensable?

The growth of social media platforms, and particularly their seeming indispensability to the lives of digital natives, is often used as evidence of the death of both the desire for privacy and its attendant social relevance. In a post-Facebook world, aren't privacy worries increasingly confined to the old folks' home and a few wonks? Nobody reads privacy policies, so nobody cares. QED.

Europe’s data privacy rules are about to be updated for the social media age.  A lot of effort over many years has gone into re-writing them.  Some say they will become too restrictive, others not protective enough of consumers’ interests, but all agree they will include the potential for massively increased fines for non-compliance.  But why go to all that effort if nobody really cares anymore?

In October 2014 the highly respected Samaritans, a charity that works to stop vulnerable people from harming and killing themselves, released the Samaritans Radar app with no small amount of fanfare. Anyone worried about a friend could sign up to get an alert if the friend posted something on Twitter that the Radar algorithm interpreted as a sign they needed help. Sounds great, doesn't it? The Samaritans were very proud of taking the public data of tweets and putting it to good use to look out for vulnerable people.

There was an immediate outcry from privacy experts; the app was taken down within a few days under public pressure and was also investigated by the UK data protection regulator, the Information Commissioner's Office (ICO).

Why? All they wanted to do was to use publicly available information to help people help friends they might be concerned about.

The problem was a failure to look at the full picture. The app was making judgements about the mental health of people without their knowledge and sharing them with a third party. Anyone could get this analysis on anyone else, regardless of their actual motives or their relationship with the person concerned.

The app was withdrawn before a full investigation could take place, not because of the risk of enforcement but because of the much bigger potential risk to reputation, which might have undermined the trust the Samaritans rely on to do their very valuable and important work. However, the ICO still concluded that the app “did risk causing distress to individuals and was unlikely to be compliant with the DPA” [the UK Data Protection Act].

This extreme example highlights some important issues. Data privacy laws are complex, and though they may fail to keep up with changes in technology, there are underlying principles that reflect long-established social norms and cultural expectations. Practices may change quickly on the surface, but deep-seated values shift much more slowly.

The world of social media sits at the fulcrum of the balance between the private and the public. This means that a sophisticated understanding of what is both legal and acceptable is vital to the success of social platforms. People don't read privacy policies because they rely on trust much more than on terms and conditions. Established privacy principles and laws play a vital role in building and maintaining that trust. However, trust can be lost very quickly, at a cost much higher than any regulatory fine, if a platform is perceived to have breached it.

Social platforms should pay attention to data privacy laws not just to avoid enforcement but because those laws say something very important about culture and expectations. Platforms might be able to ignore some of the rules some of the time and get away with it for a while, but in the long term my bet is that, faced with a choice between privacy and any individual platform, privacy will win out.

This article was originally published on the Global Marketing Alliance website.

Start Preparing for the DPR: Know Your Data

Organisations should not wait for the EU to finalise new privacy rules before taking action. This was the message from the ICO last week as the UK data regulator released a three-year corporate plan alongside the results of a new annual survey into consumer attitudes to privacy and data protection.

Although it seems likely that the Data Protection Regulation (DPR) won't come into force until late 2017 at the earliest, the text is expected to be agreed by all parties by the end of this year. There is still a lot of uncertainty over the finer details of the law, including what will or will not be considered personal data; however, it is safe to assume that many of the broad rights of individuals, and responsibilities of organisations, are not going to change too much.

There is therefore an opportunity for businesses of all sizes to begin some preparation now. And one good reason to avoid delay is financial. If you wait until the text is agreed before acting, you are likely to find skills and expertise suddenly in short supply and high demand. When that happens, prices rise, and opportunists with no real expertise step in to offer quick fixes that could leave you out of pocket and no better protected than before.

The big question of course is, where do you invest now in order to save later?

Here are what I think are the top three issues:

  1. Transparency with consumers is key. Organisations will need to be able to explain in clear language what data they collect from and about individuals, and how they make use of it.
  2. Be prepared to defend your data use practices. Accountability is an important concept in the DPR; it is about having a legally valid reason for your data use practices, which you are able to justify if challenged by either customers or regulators.
  3. Minimise risk wherever you can. Information about people is an increasingly valuable commodity, and that comes with increasing risks, both to the individuals themselves and to your business. Data breaches and cyber-attacks are high-profile examples, but there are others. Understanding these risks is the first step in counteracting them. Big fines are most likely to be handed out to organisations that fail to manage risks properly – and of course those fines are risks in themselves.

The first step in preparing for the impact of the DPR in all these areas comes down to one thing: Know Your Data.

To get a complete picture you need to document:

  • What data are you collecting on individuals?
  • Where does it come from?
  • What are you using it for?
  • Where and how are you storing it?
  • Who is responsible for it and who has access to it?
  • Are you passing it on to any third parties?

Of course, simply having all this information to hand is not going to make you compliant, but it establishes a baseline on which to build a compliance programme.

What it does do is enable you to answer questions such as: Are we keeping people properly informed about data use? Do we have a justifiable reason for our collection and use of data? Are there steps we can take to reduce unnecessary use or risk?

Whatever the final form of the DPR in terms of its scope and the level of responsibilities placed on organisations, having the answers to these questions will be key to any project designed to ensure compliance.

And if you don’t Know Your Data, all you can be really sure of is that your business is carrying unknown data risks.  In a world of both big regulatory fines and bigger possible brand damage if you are caught doing something wrong with people’s information, that’s a pretty significant risk to carry.

We Can Help

If you just need somewhere to get started, then you can sign up for our DPR Readiness Toolkit, which includes a free spreadsheet template to start documenting your use of personal information.

If you need more detailed advice and help, then please give us a call.

Right to Data Portability

Data Portability has become a major topic of interest since the EU proposed the Data Protection Regulation in 2012; however, it has been discussed for quite a few years now. Most notable was the founding of the DataPortability Project in 2007. But what exactly is data portability? How does it work? Is anyone currently offering it? Why is it important?

The idea of data portability supports the principal concepts of consumer access to, and control over, personal data. It is essentially about giving individuals the ability to ask for and receive their data in a re-usable format. An important facet of data portability, however, is a push toward an open standard for data storage and transfer, with protocols shared by many large organizations. Examples include being able to get copies of bank and credit card transactions in a spreadsheet format, so that you can pass them to someone who can use the data to help with budgeting advice, or being able to download your Facebook profile, photos and data, so you can easily move them over to Google+.
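The bank-statement example above is easy to picture in code. This short Python sketch writes transactions out as CSV, the kind of plain re-usable format a budgeting adviser's tools could read back in; the column names and sample values are illustrative assumptions:

```python
import csv
import io

# Example transactions as they might come back from a data-access request.
transactions = [
    {"date": "2016-05-01", "description": "Grocery store", "amount": "-42.10"},
    {"date": "2016-05-03", "description": "Salary", "amount": "1500.00"},
]

# Serialise to CSV: a plain, widely supported, re-usable format.
buffer = io.StringIO()
writer = csv.DictWriter(buffer, fieldnames=["date", "description", "amount"])
writer.writeheader()
writer.writerows(transactions)
csv_export = buffer.getvalue()

# Any other tool can now re-import the same data unchanged.
reimported = list(csv.DictReader(io.StringIO(csv_export)))
print(reimported == transactions)
```

The round trip is the whole point: because the format is open, the data survives the move from one tool to another without loss.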

Data portability is still in its infancy, so it currently works in many different ways; but regardless of the methodology used, data portability is really just a large-scale application of what software developers know as an API. An API, or application programming interface, is the part of a piece of software that defines how other software can interact with it. So, data portability can be thought of as a large-scale API for personal data.
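The portability-as-API idea can be sketched as an export call that returns a user's data in an open, machine-readable format. This is a hypothetical service standing in for a real platform, not any actual provider's API:

```python
import json

# Hypothetical in-memory user store, standing in for a platform's database.
USERS = {
    "alice": {"profile": {"name": "Alice"}, "posts": ["Hello world"]},
}

def export_user_data(user_id: str) -> str:
    """Hypothetical portability call: return all of a user's data as JSON,
    an open format any other service can parse."""
    if user_id not in USERS:
        raise KeyError(f"unknown user: {user_id}")
    return json.dumps({"user_id": user_id, "data": USERS[user_id]}, indent=2)

def import_user_data(payload: str) -> dict:
    """The receiving service parses the same open format back into objects."""
    return json.loads(payload)

# Round-trip: export from one service, import into another.
moved = import_user_data(export_user_data("alice"))
```

Because both sides agree on the format rather than on each other's internals, the two services never need to share code – which is exactly what a shared open standard for data transfer would provide at scale.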

Currently, data portability works in very simple ways. First, embed codes and links for videos allow anyone to post videos on another site. Pictures and blog posts can also be easily shared, particularly via special buttons on sites that allow automatic reposting to Facebook, Twitter, Digg, Reddit, StumbleUpon, etc. There's also dynamic posting, which means that when content is posted on one social media site, it's automatically posted to another social media account – for example, the automatic posting of a link and description to Twitter when you publish content elsewhere. Probably the most exciting and interesting facet of data portability at the moment, however, is OpenID. OpenID allows you to log in to one site using credentials you have already registered with another. So you no longer need a separate account for every site across the internet; you can just sign in with one of your existing logins.

Some sites have taken this further than others. For instance, Twitter offers little other than dynamic posting.

Google, on the other hand, offers OpenID access and the ability to download about two-thirds of the data it has collected about you. By going to Google Takeout, you can download an archive of +1s, Blogger posts, Buzz, contact lists, Google Drive, Google+ Circles, Google+ Pages, Google+ Photos, Google+ Stream, Hangouts, Profile, Voice, and YouTube data.

Facebook also offers OpenID, but has a larger set of uses for it than Google currently does because of its connection with online games and applications like Words With Friends. These games and applications use a user's Facebook data to link them to their friends, keep track of their scores, and automatically post to their Facebook wall on the user's behalf. Because of this, the dynamic posting options on Facebook are much greater than those available on Twitter. And just like Google, Facebook lets you download your personal data; in fact, Facebook offers a larger data set than Google – it even includes the list of keywords and interests used to send you targeted ads! You can download this information in your settings under the General tab (via a small blue hyperlink at the bottom of the page).

Microsoft offers varied data portability across its software. One major area is the Office 365 product, where you can import or export your data easily using a simple automated process built into the software itself. You have 90 days after the end of your Office 365 subscription to export this data.

So why is data portability seen as important? Well, first, data sharing allows for simplicity: you can use many applications with the same pictures, login, password, profile, friends, etc. at the click of a button. On a more serious level, though, data portability gives people control over their data, even if at the moment few providers make it simple to delete all of your information. Most portability today is about being able to export data and use it on other sites without harming the data already stored on the first site. This will soon change – at least for EU citizens – because of the new Data Protection Regulation, expected to be in full effect sometime in 2016. Even so, the current level of data portability keeps companies in check to a certain extent, because end users can see the data being collected and therefore make wiser decisions about what to share.


The EU Data Protection Directive and Safe Harbor

It is intriguing that, with all the recent discussion of reforming the EU Data Protection Directive, Safe Harbor – the US framework for complying with Directive 95/46/EC – has not been discussed very often, particularly when the necessity for US-based businesses to comply with the new EU Data Protection Regulation is such a hotly debated subject. So, let's delve in and learn a bit about the EU Data Protection Directive and Safe Harbor.

The EU Data Protection Directive, Directive 95/46/EC, was finalized in 1995. As part of a larger framework of policies on privacy and human rights, the directive regulates the processing and sharing of EU citizens' personal data. In a nutshell, it demands that personal data only be processed if it is processed transparently (the individual knows about and has consented to the processing); it is collected only for an explicit, legitimate purpose that is clearly defined; and it is only processed in accordance with the original purpose for which it was collected.

The EU Data Protection Directive also mandates that personal data only be transferred to countries with similar data protection regulations. This was not a major concern until the Internet became more prevalent; but by 2000 there were already over 360 million people online worldwide, and the number of users was increasing every day. Concern therefore arose about what this meant for EU citizens and the personal data they share with US-based organizations. So the European Commission and the US Department of Commerce produced a framework for how US-based companies could comply with these regulations.

US-based companies that wish to comply with Safe Harbor, the US framework for meeting the requirements of the EU Data Protection Directive, must uphold seven principles: notice, choice, onward transfer, access, security, data integrity, and enforcement. Notice and choice are connected: people must be informed about the data collection, its uses, and any transfers to third parties, and told how to opt out of data collection.

Onward transfer is the policy that one entity may only pass data on to a third party if both are already following all of these principles (and, of course, the original collector gave notice and got consent) – unless that third party is contracted by the data collector to process data solely on the data collector's behalf. Security means that the company must take reasonable measures to protect personal data. Data integrity means that the data must be relevant to, and reliable for, the purpose for which it was collected. Access means that people must have access to their data and be able to easily correct inaccurate personal data. Finally, enforcement is the requirement that these principles be enforced by a third party.

After the framework was adopted in the US, the European Commission issued a final decision, 2000/520/EC, declaring “the adequacy of the protection provided by the safe harbour privacy principles”. Since then, however, the Safe Harbor framework has been heavily criticized, leaving one to wonder whether, once the EU Data Protection Directive has been reformed, Safe Harbor will be reformed too, or completely replaced with a new framework.