Consent under the General Data Protection Regulation

The consent of the individual for the use of their information has long been a cornerstone of privacy and data protection law around the world. It is widely seen as one of the simplest and most transparent ways to ensure fair and legal processing. Yet consent has come under increasing attack over its suitability to achieve this in a balanced way. In a digital world, with ever more personal data being collected and analysed, on ever smaller screens, or in the case of many Internet of Things (IoT) devices no screen at all, the utility, validity and viability of consent-based data processing is regularly questioned, even if the alternatives seem paternalistic or sneaky.

With this in mind it only seems right to delve into the consent provisions laid out in the General Data Protection Regulation (GDPR) and see what we find. I'm not going to promise a complete analysis here of all the aspects of the regulation that touch on or are touched by the issue of consent, but I will hopefully cover the most salient, practical points of concern.

The Definition

Article 4 of the GDPR provides the core definition of consent as:

any freely given, specific, informed and unambiguous indication of his or her wishes by which the data subject, either by a statement or by a clear affirmative action, signifies agreement to personal data relating to them being processed;

Although the final text only requires consent to be explicit for certain types of data processing, the definition here sets quite a high bar for all forms of consent.

Affirmative Action

Notably, we have this idea of "a clear affirmative action", and in Recital 25 this is spelled out in terms of both what is and what is not valid:

This could include ticking a box when visiting an Internet website, choosing technical settings for information society services or by any other statement or conduct which clearly indicates in this context the data subject’s acceptance of the proposed processing of their personal data.

Silence, pre-ticked boxes or inactivity should therefore not constitute consent.

This last element particularly seems to destroy the notion of ‘implied consent’ where simply using a service, particularly a digital one, can be taken as an indication of agreement.

So the subject must take an action, and that action will have to be a clear indication of consent.  This would appear to rule out any other actions a user might make on their device that could easily be misinterpreted, a subject I may return to at a later date.

Freely Given

There is a particularly high bar for determining whether or not consent is freely given and this may create the greatest difficulties for certain types of digital services.

There must be a “genuine and free choice”, which is particularly emphasised in Article 7(4):

When assessing whether consent is freely given, utmost account shall be taken of the fact whether, among others, the performance of a contract, including the provision of a service, is made conditional on the consent to the processing of data that is not necessary for the performance of this contract.

Many so-called 'free' web services rely on monetisation through behavioural advertising, which itself means profiling of visitors. If access to those services is made conditional on allowing profiling, then there can be no valid consent for the profiling activity.

One of the recent trends we have seen is publishers preventing visitors using ad-blockers from viewing content. This strategy may have to be re-thought, particularly as Recital 32 makes clear: "consent should not be regarded as freely-given if the data subject… is unable to refuse or withdraw consent without detriment."

Article 7(3) also makes the point that "It shall be as easy to withdraw consent as give it."

When taken in conjunction with the first point about affirmative action, this suggests that if consent is provided through an action like a click on a button or link, then to be freely given it must also be withdrawn through a similarly simple and easily accessible action.

Specific and Informed

For consent to data processing to be specific, it must be separated from other types of consent and actions. This might mean, for example, that agreeing to the terms of service for delivery of an item you have bought online should be a separate action from agreeing to have your data shared with third parties for marketing purposes.

In addition, being informed means knowing about all the different purposes of processing, and knowing the identity of the data controller, as a bare minimum. It also means being informed of one's rights, such as the ability to withdraw consent or object to some types of processing, like profiling.

Although these kinds of provisions have been around a long time, the requirements to meet them are much more defined in the GDPR. There has been a long history of smaller websites in particular cutting and pasting privacy notices from other sources without much thought. That kind of approach will be much higher risk under the GDPR. To produce a valid notice, organisations will have to have a thorough knowledge of their uses of personal data.

Demonstrating Consent

One of the many significant changes introduced by the GDPR is the move towards greater organisational accountability and a shifting of the burden of proof for compliance.

So one of the conditions for valid consent, in Article 7(1), states: "the controller shall be able to demonstrate that consent was given by the data subject to the processing of their personal data."

This means not just recording the fact that someone ticked a box in a form, but having an audit trail that links the action to any notice and the actual processing of the data concerned.
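As a sketch of what such an audit trail might capture, here is a minimal consent record and a check in the spirit of Article 7(1). All of the field names, and the idea of a pseudonymous subject reference, are illustrative assumptions, not anything the GDPR prescribes:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass(frozen=True)
class ConsentRecord:
    """One auditable consent event. Field names are illustrative only."""
    subject_id: str      # pseudonymous reference to the data subject
    notice_version: str  # exact version of the privacy notice shown
    purposes: tuple      # the specific purposes consented to
    action: str          # e.g. "ticked_marketing_checkbox"
    timestamp: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))


def can_demonstrate_consent(records, subject_id, purpose):
    """Is there a recorded, purpose-specific action for this subject?"""
    return any(r.subject_id == subject_id and purpose in r.purposes
               for r in records)
```

The point of linking the notice version and the recorded action is that, if challenged, the organisation can show not just that a box was ticked, but what the subject was told at the time and which processing purposes the tick covered.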

Failure to be able to verify consent records in some way will itself be a breach of the requirements for legal consent. This not only exposes the organisation to a risk of enforcement, it can also potentially render large swathes of personal data useless for any purposes that are reliant on consent.

Administrative Fines

It is well known that the GDPR creates the ability for regulators to impose huge fines on organisations for compliance failures. What has been less publicised is the detail of how these fines might be meted out.

In the UK we saw throughout 2015 how the ICO handed out its largest fines for unsolicited (read unconsented) marketing.  The GDPR strengthens the hand of regulators for this type of enforcement.

So in Article 79 we see that infringements of the basic principles of processing, "including conditions for consent", can be subject to the highest level of fines, which may be the higher of 20 million Euros or 4% of "total worldwide turnover of the preceding financial year". Ouch.


For many businesses, this area of compliance has until now been among the least well managed, and the most likely place for rules to be bent or broken. Under the GDPR, legally valid, documented consent could well become one of the most important things to get right.

If you need any help preparing for the GDPR, and particularly with issues around use and proof of consent, please get in touch today.


Optanon GDPR Compliance Manager

We have been working for several months now on a new platform to help organisations assess their readiness to comply with the EU General Data Protection Regulation (GDPR).

GDPR Compliance Manager will be released later this year as part of the stable of Optanon brand products that currently includes our Website Auditor and Cookie Consent solutions.

The platform will enable organisations to work out what changes they will need to put in place to meet the requirements of the GDPR before it comes into force.  In addition it provides planning and documentation functionality to support a change programme as well as produce the accountability documentation that will be required.

We will be releasing more information in the coming weeks and months, but for now, here is a preview screen shot.


If you would like to know more about how Optanon GDPR Compliance Manager might help you, and arrange a demo, please give us a call or drop us an email.

General Data Protection Regulation Top Ten Issues

The ink is barely dry on the draft, but the EU General Data Protection Regulation (GDPR) looks set to change the regulatory environment for personal information not just in the EU, but around the world. Its aim is to create a legal infrastructure for the use of personal data that is fit for purpose, both today and in the future.

The GDPR was designed to increase legal certainty with regards to information flows both within the EU’s borders and beyond. It also introduces stronger consumer protections, with requirements for greater transparency and accountability about how data is used by businesses, not-for-profits and governments alike.

This is intended to give individuals increased trust in data practices.  Consumer research in the last few years has shown consistently high levels of concern and lack of trust in this area, and this is believed to be a potential brake on the future growth of digital technologies.

However, in order to achieve these goals the GDPR does come with some stings in its tail. It places much greater requirements on businesses to communicate effectively with customers, and obtain much clearer consent for the use of their data.  Organisations also have to provide customer choice mechanisms, and there is a greater emphasis on documenting data processing activity. And then of course there are the fines.

At over 200 pages it is a very wide ranging instrument.  However, for those who haven’t had time to read it yet, these are what we think the top 10 issues for most organisations will be.

1.  A broader definition of Personal Data

As we predicted earlier, the scope of what constitutes 'personal data' has explicitly been broadened to include any information 'relating to' an individual. This specifically includes 'online identifiers', so cookies and the advertising IDs seen in the mobile ecosystem will be caught, along with anything that contributes to identifying an individual, or links to such identifying information. This has widespread implications for online tracking in particular.

2.  A higher bar for consent

Whilst the final text shied away from explicit consent as a requirement, except when special categories of (sensitive) data are concerned, there is still much emphasis on gaining consent through active user mechanisms like tick boxes.

A key part of the test of the validity of consent is whether consumers understand what they are agreeing to, and are given a meaningful choice. There is also a significant shift in the burden of proof.  You will need to be able to provide evidence that you obtained consent from specific data subjects, which is going to require much better record keeping for many organisations.

3.  Data Protection Officers

Although not a universal requirement, many organisations will be required to appoint a Data Protection Officer (DPO) to oversee data uses and ensure compliance with the law. They will be mandatory in the public sector, but for private sector organisations the key test will be whether the organisation is involved in "systematic monitoring of data subjects on a large scale". It is not clear at this time how 'large scale' will be interpreted.

Earlier, more detailed requirements for the skills and experience of the DPO, and guarantees over their employment, have been dropped, but a key issue in the short to medium term will be a lack of the right people to fill such roles.

DPOs however can be outsourced, which may create a market for new services, especially to cater for the needs of smaller businesses.  The DPO responsibilities can also be given to someone alongside other work within the organisation, as long as this does not create a conflict of interest.  So training existing staff into the role could be a viable option for many.

4.  Transparency and Accountability

The GDPR scraps the need for controllers to register with their Data Protection Authority (DPA), but replaces this with a requirement to both better inform data subjects about practices and rights, and to keep records that can be made available on request – such as in the event of a data breach or a compliance complaint.  Such records are about demonstrating that the organisation has thought through the impact of its systems and processes, and made informed choices about how to comply with the GDPR.  The Data Protection or Privacy Impact Assessment (PIA) is one example of such documentation.  It is intended that a PIA will show that an organisation has considered the risks associated with its particular personal data practices, and taken reasonable steps to control or mitigate them.

There are also new requirements on the level of detail that organisations must provide to data subjects about their practices, as well as a need to make sure that this information is both accessible and easy to understand. In particular there is a need to explain the logic behind decisions made on the basis of analysing personal data – which may have particular significance in some sectors that have relied on such processes being largely secret. Organisations are also expected to inform subjects about their rights and how to exercise them.

5.  Data Protection by Design and Default

Although references to this have been cut back in comparison with earlier versions of the text, the GDPR requires that the design of systems and processes gives consideration to compliance with the principles of data protection. Particular emphasis is placed on the ideas of only collecting data necessary to fulfil specific purposes, discarding it when it is no longer required, and protecting data subject rights.

It also sets up the possibility for the development of certifications and codes of practice that organisations can follow to help meet these requirements. Keep an eye out for these as they develop. In particular we expect DPAs to get involved in this area: they will be losing their registration fees and will therefore need new sources of income. In the UK the Information Commissioner's Office (ICO) has already been developing this idea, so expect it to continue. Trade bodies are also likely to have a role to play here.

6.  The Right to Erasure and Data Portability

These new data subject rights are likely to pose challenges for many organisations. The right to erasure is a clarification of the much talked about ‘right to be forgotten’.   Although the circumstances when the right can be exercised have been made clearer, the balancing against other rights and obligations is still needed.

The right to have a copy of your data in a machine readable form to transfer to another provider may be difficult at first, but it could also lead to better systems interoperability in the longer term – which is already a growing technology trend.  In particular this provision may facilitate the development of the market for ‘personal data stores’, an idea that has long been talked about, but not yet fully realised as providers have struggled with sustainable and scalable business models.

7.  Removal of Subject Access Request Fees

Data subjects have a right to know whether or not an organisation is processing their personal data, what that data is and the purposes of the processing. The GDPR removes the ability to charge an upfront fee for providing such information, and there is a risk that requests will increase as a result, pushing up costs. Current allowable fees do not cover the full cost of a Subject Access Request (SAR), but are seen as a deterrent to time wasters. If companies are no longer able to charge fees, it is feared this could open the floodgates to many more SARs.

Companies will be allowed to charge for subsequent copies of the same data, which may reduce the risk of this to some extent. However, it may be worth investing in making sure you can respond to such requests as efficiently as possible, which will not be easy in many cases.

8.  Reporting Data Breaches

Data controllers will be required to report data breaches to their DPA, unless the breach is unlikely to represent a risk to the rights and freedoms of the individuals concerned. However, this qualification may be difficult to judge, so in many cases it will be safer to notify. The notification must be made within 72 hours of becoming aware of the breach, unless there are exceptional circumstances, which will have to be justified.

Where the risk to individuals is high, the data subjects themselves will also need to be notified, although no specific time scale is set for this. It is also worth noting that the DPA can instruct an organisation to inform data subjects if it has not already done so, so we can expect to see further guidance on the circumstances when it would be correct to do so.

9.  Fines

The GDPR very deliberately raises the bar in terms of the ability for DPAs to issue fines for breaches of the rules. They can go as high as 4% of global turnover. Not only are these designed to ensure data protection becomes a board level issue, by taking into account worldwide revenues they seek to sidestep attempts by multinationals to engage in fine-avoidance business structures.

It is also worth noting that fines can be levied without the necessity to demonstrate harm – although the largest ones will likely be reserved for cases where data subjects have directly suffered damages.

10.  Data Processor Responsibilities

Organisations that only process data on instructions from their client are not directly covered by the current data protection regime. Their actions are assumed to be governed by agreement with the customer, who is the data controller and therefore directly responsible for the actions of the processor. However, this all changes under the GDPR: processors now have direct legal obligations and responsibilities.

In particular this means that processors can in certain circumstances be held directly liable and be required to pay compensation to a data subject. It will therefore become very important to establish the contractual relationships and liabilities of the different parties in a controller/processor relationship, and the costs of some services by processors may rise to offset additional risks and insurance costs.


We hope you find this useful.  In future posts we will look at more details of what you can do to prepare, as well as looking into each of these areas in more detail.

In the meantime, if you have any questions and would like to know more about how the GDPR might affect your business, do get in touch and we will be happy to help.

The GDPR Has Landed

After nearly four years, and some of the most intense negotiations and lobbying in EU history, agreement on the text of what some are calling the most significant piece of privacy legislation ever was reached on Tuesday night (Dec 15th 2015).  Tweets from various key players even equated the agreement with the election of a new Pope, with talk of white smoke going up.

This was followed up this morning (Thursday 17th Dec.) with a vote by the European Parliament’s Justice Committee to formally adopt the text – although the support was not unanimous.  There will be a few more hurdles to get over but these are largely expected to be rubber stamping exercises.

It is expected that the Regulation will become EU law in the early part of 2016, with a two year lead-in period before it becomes enforceable. The final wording of the document has been released, although it will be subject to cleaning up in terms of clause numbering, and of course it will also have to be translated into every language of the EU.

Time to Get to Work

With the text in place, the real work of organisations to make sure they will be in compliance with the new rules can now begin. There will undoubtedly be a lot of analysis of the requirements in the months to come, to work out exactly what it all means, and we of course will bring you as much of this as we can.

One of the things that makes the legislation so significant is its global reach.  Any organisation, regardless of its location in the world, that touches the personal data of EU residents and citizens in any way, is going to be affected by this law.

Although there are parts of it for everyone from privacy fundamentalists to Silicon Valley libertarians to dislike, almost all agree that this represents a new era of strong data protection and privacy for the digital age.

I suspect many privacy people around the world are going to be spending a big chunk of the holiday season working out how to hit the ground running come the New Year. I know I will.

Happy Christmas!

Touchnote Hack – A Notification Failure?

Touchnote, the personalised postcard company, has reported that it is the latest UK online service to have lost the personal information of millions of individuals as a result of an attack on its systems.

It seems I am one of them, but it looks like they are not planning on telling me.  This seems to me a big hole in their breach notification strategy.

I am not one of their customers, but somebody I know is and I have received postcards via their service.  So my address details are in their systems.  I know that my data is part of the breach only because the person who is the account holder told me they had received an email confirming that the information in their account was amongst the stolen records – and that includes my address.

However, Touchnote seems not to be too concerned about this, as its online notice about the incident contains the following:

We also confirm the card recipient’s name and postal address regrettably has been stolen as part of this data theft. However there is no action required by the recipient as this information alone cannot cause identity breach.

Now, I have taken care to avoid my address information being widely available. I opted out of the edited electoral register, and registered with the Mail Preference Service. As a result I don't get much promotional mail, which I am happy about as it saves everybody time and money, not to mention the environment.

I am now expecting this to change as the criminals will no doubt seek to get a return on their investment by selling my details on.

Worse than this, an obvious attack would now be to send me some mail, perhaps pretending to be Touchnote, to try and sell me some kind of identity protection.  It is not too hard to imagine how a cleverly worded letter, perhaps referencing the online news about the hack, could find enough people who would fall for some con that will result in them handing over money to the criminals. It would be even more effective if the criminals also had copies of the images used in past postcards – although this appears not to be the case.

Consider that the nature of the Touchnote service probably means the addresses of lots of elderly relatives who love to get personalised postcards of the grandchildren on holiday. There are already too many stories of the exploitation of the elderly and vulnerable in this way to not consider this a serious threat. Then of course such people also get added to ‘suckers’ lists and further monetised.

So Touchnote may be doing a reasonable job of informing its customers, but logically this represents less than half of the people who have had their information compromised. At the moment it appears the company has not thought about the risks to the rest.

It is worth noting that under the new breach notification rules in the forthcoming EU GDPR, all data subjects have a right to be notified when a breach takes place. Companies need to realise that this means a lot more people than just their paying customers.

Editorial Note: A correction was made at 15.30 on the date of publication.  The original article stated it was not known if Touchnote had lost uploaded photos.  It was later stated that no photos were accessed illegally in the hack. This is now clear in the article.

Building Websites on Privacy by Design Principles

One of the many significant changes being introduced by the European General Data Protection Regulation (GDPR) is the requirement to adopt principles of privacy by design (PbD) when creating or revising processes or technology.

Given that websites are regularly being re-designed and developed, often by out-sourced agencies, it is quite likely that when the requirements become law, web development projects will be the biggest, most immediate category of impacted software development activities.

Websites are also often the first and only point of contact between an organisation and its prospects and customers, who are also usually the most numerous category of data subject.  The website therefore sets the tone for the brand and its attitude to privacy principles.

With this in mind, it is rapidly going to become important to have an understanding of what this will mean for web design. This article provides a brief overview of the issues that businesses and web designers will need to think about.

Overview of PbD

Much more than an empty phrase buried in a long legal document, Privacy by Design (PbD) is a concept developed by respected Canadian privacy regulator Ann Cavoukian back in the 1990s. It provides a framework of seven principles to guide the development of new systems and processes handling personal information.

In brief, following these principles means building privacy into the design of a system as the default setting, ensuring personal data is kept secure and destroyed when it is no longer needed, providing users with transparency and meaningful choice with respect to the use of their data, and avoiding unnecessary trade-offs between privacy and other interests.

More detail is available here (PDF).

Looking at these principles, it is easy to see that the vast majority of websites would fail even the most lenient test of their application. More than that, when recently presented with a requirement for a partial implementation of these principles, pushed onto them through the law, many website owners exhibited strong resistance.

I am of course talking about the ePrivacy Directive, aka cookie law.  The requirement for consent for the use of cookies embodies the PbD principles of privacy by default, transparency and meaningful choice.

More than 4 years after the law first came into effect, it is given little more than lip service by millions of sites, although more and more responsible companies are moving in the direction of greater privacy choices.

More than that, it is frequently derided by web professionals, many of whom have as little understanding of the law as they accuse lawmakers of having of technology.

The Impact on Website Design

A privacy by design approach to web design and development needs to take into account the two broad modes by which visitor privacy is impacted:

  • Volunteered personal data
  • Automated personal data collection

The volunteered data part is relatively easy, and many websites tackle this reasonably well, although there are a few things to look out for.  It is the automated bit that presents more challenges.  We will however look at both.

Volunteered Data

The most obvious source of volunteered data is when visitors submit their information through web forms. Though on the surface this is a straightforward case of getting consent through a privacy policy and checkbox, there are a few things to consider:

The site must be clear about all the potential uses for the data, not just the uses the subject expects or is providing their details for. Where any of those uses are additional to the core reason for collection, the 'privacy as the default' principle would require the data subject to opt in to them; not to all future uses in one go, but to each specific use.

Even where opt-in consent has been obtained, there would also need to be an easily accessible option/control mechanism to opt out again at any time.

What happens to the submitted form data is another crucial design issue. Is it emailed, sent to a CRM, stored in the website database? It is common for all three to happen, resulting in multiple copies of the data. But if you are sending the data to another system, leaving it in the web database as well is an unnecessary security vulnerability, and one many sites are exposed to. If you are not using the data operationally in your site (such as for logging in), clear down the data submitted through forms on a regular basis.
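That regular clear-down can be a very small routine. As a sketch, assuming submissions are simple records with a timestamp (the 30-day retention period is an illustrative choice, not a figure from the Regulation):

```python
from datetime import datetime, timedelta, timezone

# Illustrative retention period; the right figure depends on your purposes.
RETENTION = timedelta(days=30)


def purge_stale_submissions(submissions, now=None):
    """Keep only form submissions within the retention period.

    `submissions` is a list of dicts with a timezone-aware
    'submitted_at' datetime, standing in for rows in the website
    database. Returns the records that should be retained.
    """
    now = now or datetime.now(timezone.utc)
    return [s for s in submissions if now - s["submitted_at"] <= RETENTION]
```

Run on a schedule (a cron job, say), this implements the 'discard when no longer required' idea from the PbD principles without any extra infrastructure.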

Not all volunteered data is captured directly through web forms, however. Can people set a language preference on your site? Do any interactions result in content being personalised? This could be considered volunteered personal data. How are people notified about this? How is the information saved, and for how long? These are all valid PbD considerations.

Automatically Collected Data

This is generally the largest volume of data generated from your website.  Much of it happens through the use of cookies and other mechanisms which are set and read by the various applications your website uses.

In Europe there are specific rules around visitor consent and the use of cookies. We will not go into the detail of those requirements here, except to point out that by applying privacy by design thinking to the use of cookies, you will likely also be largely compliant with the cookie rules.

Of course not all cookies contain or can be considered as personal data.  However the extended scope of the definition of personal data under the GDPR means that many types of cookies will likely fall directly and clearly in scope of the new rules.

In particular, cookies that act as unique device or user identifiers – such as those used for online tracking and user login, are likely to be considered as personal data under the EU GDPR.

This will therefore mean a need to evaluate all elements of your website that set cookies, and to identify whether they carry personal data. The next stage would be to consider whether privacy-friendly alternatives exist or, if not, how to implement user controls.
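One way to start such an evaluation is a rough triage of each cookie found in an audit. This is only a sketch: the site domain is a placeholder, and the `identifies_user` flag would come from a manual review of what each cookie actually stores:

```python
# Illustrative first-party domain; substitute your own site.
FIRST_PARTY = "example.org"


def audit_cookie(cookie):
    """Rough triage of one cookie for a privacy review.

    `cookie` is a dict with 'name', 'domain' and 'identifies_user';
    the last flag reflects a manual judgement about whether the value
    is a unique user/device identifier (likely personal data under
    the GDPR's broadened definition).
    """
    third_party = not cookie["domain"].endswith(FIRST_PARTY)
    return {
        "name": cookie["name"],
        "third_party": third_party,
        # Identifiers, and anything set by a third party, warrant
        # a closer look for consent and notice requirements.
        "needs_consent_review": cookie["identifies_user"] or third_party,
    }
```

The output is only a starting point for the human review; no script can decide on its own whether a privacy-friendly alternative exists.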

This has particular implications for technologies that set third party cookies. In particular, it would no longer be possible to make the argument that 'we are not responsible for third party cookies'. PbD requires site owners to shift the focus from the cookies per se to the decision to use the underlying technology. No website owner can realistically say 'we are not responsible for the technology we add to our website'.

A PbD approach to site design therefore requires that every bit of the technology infrastructure of a site will need to be evaluated for its impact on privacy, and require the provision of suitable default settings, notice and control.

PbD principles would suggest that you can’t just add a standard Facebook Like button to your pages by default. You would need to ask users to opt-in to such features, whilst also making sure that they are aware of the privacy implications of doing so.
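In server-rendered terms, that opt-in might look something like the following sketch. The consent key, the placeholder markup and the button wording are all assumptions for illustration; a real implementation would also need the client-side wiring to record the choice:

```python
def render_social_widget(consents):
    """Return the third-party embed only if the visitor has opted in.

    `consents` is a dict of the visitor's stored consent choices.
    Without opt-in, we render an inert placeholder that explains what
    enabling the feature would load, rather than the real embed, so
    no third-party request is made by default.
    """
    if consents.get("social_embeds"):
        # Illustrative stand-in for a real social widget embed.
        return '<div class="fb-like" data-href="https://example.org/"></div>'
    return ('<button class="load-widget">'
            'Enable social features (loads third-party content)</button>')
```

The design point is that the default path makes no call to the third party at all: privacy as the default setting, with the richer feature a deliberate choice.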

This also holds true for a vast array of technologies and services that are provided by third parties as scripts and code embedded into pages.  Analytics, videos, music, discussion forums, and of course advertising – all of these page elements are typically served up from separate host domains that are more or less invisible to the average visitor.

All of the most common examples of these services involve some level of personal data collection. A requirement to follow PbD principles means giving consideration to their impact throughout the development process. This is no easy task, as many technologies designed to be integrated into other sites are not clear about their data collection practices.

The Impact on the User Interface

PbD requires a thorough examination of the architecture of a website and its privacy impacts. It also requires mechanisms for visitors to be able to make realistic privacy choices.

This of course means that there is a need for interfaces to support such choices.  And this may be one of the greatest challenges for web design.

The kind of notices that we have seen arising from attempts to comply with the cookie law will not readily suffice – they are neither granular enough nor present enough choice.  What will be needed are more dynamic interfaces, showing and hiding content and functionality based on the choices made.

Such interfaces are not uncommon – the best web design already configures content and services around users; this is what ‘personalisation’ is. However, it is generally not clear to the user when and why personalisation occurs.  Privacy by Design means not only making the fact of personalisation explicit, but also giving visitors explicit choices about whether or not it should take place.
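The show-and-hide logic such interfaces need can be kept quite simple at its core: each third-party feature is tagged with a consent category, and nothing loads without an affirmative opt-in. The sketch below is illustrative – the category names and sample embeds are assumptions, not from any particular consent framework – but the default-off rule mirrors the ‘clear affirmative action’ standard discussed earlier.

```typescript
// Consent-gated rendering logic: given the visitor's recorded choices,
// decide which embedded third-party features may load.
type ConsentCategory = "analytics" | "social" | "advertising";

interface Embed {
  id: string;
  category: ConsentCategory;
}

// Absent or false both mean "do not load" – consent must be explicit.
type ConsentState = Partial<Record<ConsentCategory, boolean>>;

function embedsToLoad(embeds: Embed[], consent: ConsentState): string[] {
  return embeds
    .filter((e) => consent[e.category] === true)
    .map((e) => e.id);
}
```

For example, a visitor who has opted in only to ‘social’ would get the Like button rendered but no analytics or advertising scripts; the page would show a placeholder or simply omit those elements.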

Allied to this, designers will also have to decide whether to give access to content and services to people who make privacy choices that go against the economics of the services being provided.

So if a visitor comes to a free news site that is paid for by privacy-invasive advertising, yet chooses not to receive the advertising, designers will have to decide whether that visitor should be denied access to the content.


The aim of this article has been to raise just a few of the issues that are going to face the web design profession once the new European data protection rules are finalised.

Clearly of course these are not just decisions for ‘designers’ in the traditional sense – they are also examples of some fundamental questions for digital strategy.

The new law will mean there will be no getting away from questions like these when it comes to a new web build.  So the time is also fast approaching when some answers will be needed.


Learning From Ashley Madison


The recent theft and subsequent leaking of the personal information of users of Ashley Madison, the dating site for married people, and the other brands in its stable, is not the biggest data breach the world has seen in the last few years, but it is quite probably the most controversial.

Whatever people choose to think about the basic premise of the business, or the people that signed up to its services, the hacking and subsequent release of the data is illegal and quite likely to lead to serious harm for some of the users of the site.

It is therefore right that everybody who deals in the handling of personal data should look to see what they can learn from this event.

Context is King

The sensitivity of information, and therefore the lengths one should go to in order to protect it, often depends more on context than on the information itself.

Email addresses are personal information, yet they aren’t generally thought of as particularly sensitive or in need of close protection.  After all, they are about communication, and so designed to be shared.

An email address stored in a database of people supposedly looking for an extra-marital fling, however, is a completely different ball game.  As has been pointed out elsewhere, some of the leaked emails indicate users in Saudi Arabia, where adultery is a capital offence.  Though we don’t know it yet, the hackers may have condemned some people to death.

At the very least it seems inevitable that the marriages and careers of many people will be ruined.  It doesn’t even matter if no wrongdoing took place; the suspicion raised by the mere presence of an email address in the data will be enough to change some people’s lives forever.

Transparency, Transparency, Transparency

Amazingly enough, the privacy policy on the site is not that long or complicated.  However, it is clear that different versions are served up to different users.  On first access I noted my location was recorded as in the UK, and I got a policy from Praecellens Limited, operating out of Cyprus.  However, I could switch my location to the USA, and then be served the policy from Avid Dating Life Inc. of Canada.

What strikes me is that even a cursory reading rings huge alarm bells.  For a start, the Cyprus policy, presumably intended for EU readers, is different, but it still uses US-style language, with lots of references to PII rather than Personal Data. So immediately it seems like a half-hearted job.

More importantly, it makes clear that although some information ‘may be considered as sensitive’ – the policy allows for any personal information to be sold to unspecified third parties for marketing purposes.  At the same time the policy also stresses how important privacy is to the business.

Of course we know that nobody reads privacy policies, and this seems to prove it.  I find it difficult to believe that anyone contemplating embarking on a clandestine affair would knowingly agree to such unspecified information sharing that could easily lead to legal disclosure of their use of the site.  All of which tells me that there needs to be clearer ways of surfacing this kind of information, and clearer indications of consent – something of course being called for under the EU Data Protection Regulation.

Beware the All Seeing Cookie

Running a very brief scan over a few of the public pages on the site we identified trackers from Google, Facebook and Twitter on the ‘Infidelity News’ blog.  These are all organisations that can tie online behaviour directly to real identities, meaning the site is directly leaking at the very least ‘interest’ data about identified individuals in a way that could immediately impact their wider social profiles unless they are extremely careful.
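The ‘very brief scan’ mentioned above can be approximated with a few lines of code: collect the script URLs a page loads, and list the distinct hosts that don’t belong to the site itself. The sketch below is a simplified illustration (the sample URLs are hypothetical, and a real scan would also cover iframes, images and XHR requests, not just scripts).

```typescript
// Given the script URLs found on a page, list the distinct third-party
// hosts serving them – a rough proxy for embedded trackers.
function thirdPartyScriptHosts(scriptUrls: string[], siteHost: string): string[] {
  const hosts = new Set<string>();
  for (const u of scriptUrls) {
    const host = new URL(u).hostname;
    // Keep only hosts that are neither the site nor one of its subdomains.
    if (host !== siteHost && !host.endsWith("." + siteHost)) {
      hosts.add(host);
    }
  }
  return [...hosts].sort();
}
```

Each host on the resulting list is an organisation that learns something about the visitor the moment the page loads – which is exactly the leakage described above.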

However, the site clearly ignores EU cookie law requirements for consent.  It doesn’t even notify visitors, let alone give them some control.  Yet this is very clearly the sort of site that users might want to keep out of their browsing history.  Not giving users simple controls is not only a breach of the cookie rules, it shows either a cavalier attitude to privacy, or ignorance of the power of the cookie to identify individuals.

Privacy is not Security

It also seems that, despite the proclaimed importance of privacy, little thought was put into it when designing the system.

Email addresses were allowed onto the system unverified – breaking data protection rules about accuracy of data, as well as opening up non-users of the system to potential harm. Although the company claims that sensitive information is encrypted at rest on disk, as noted above, in this case even emails are sensitive, and were clearly not encrypted. Or at least not encrypted well enough to prevent their release.

Similarly, it has been widely reported that the password reset feature can be used to effectively reveal the email addresses of users registered on the site.
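This class of flaw – user enumeration – is avoidable by design: the reset endpoint should respond identically whether or not the address is registered, with the actual email sent out-of-band. The sketch below illustrates the principle; the in-memory user store and mail stub are stand-ins, not how any real site implements it.

```typescript
// Enumeration-resistant password reset: the response never reveals
// whether an address is registered. Hypothetical in-memory store.
const registered = new Set<string>(["member@example.com"]);

function sendResetEmail(email: string): void {
  // Placeholder for real, out-of-band mail delivery.
}

function requestPasswordReset(email: string): string {
  if (registered.has(email)) {
    sendResetEmail(email); // side effect only; not reflected in the response
  }
  // Identical message either way, so the endpoint confirms nothing.
  return "If an account exists for that address, a reset link has been sent.";
}
```

A careful implementation also needs to keep response timing uniform, since a measurable delay when the address exists leaks the same information.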

Some reports have suggested that the security on the site is generally better than on many others, which highlights quite well that security and privacy are two different realms. I don’t know whether or not the company carried out any kind of privacy impact or risk assessment.  However, it seems obvious now that not enough attention was paid to privacy concerns in the development of the platform and its services.

A Watershed Event?

The nature of the business made it an obvious target for malicious attack. More thought given to privacy would not have made a breach any less likely to happen, but it might have reduced its impact.

The very nature of the potential damage here could in fact become a force for change in the way that the law looks at privacy harms.  Most law courts reduce harm in data breaches to financial loss.  Many actions fail because direct financial harm is very difficult to establish.

In this case, financial harm is likely to be way down the priority list of members.  It will be the harm to their personal lives – in many cases irreparable – that will almost certainly be the focus of the inevitable lawsuits.  How the courts deal with this could open the door to wider recognition of non-financial harms in breaches of privacy – and that may make this a watershed event.

UPDATE 24 August:

Sadly, just three days after writing this post, my worst fears appear to have come true: two Ashley Madison users who had their personal details published, have reportedly taken their own lives as a direct result. My deepest sympathies go out to their loved ones.

Collateral Damage in the Cloud: The Jurisdictional War over Personal Data

It may already be a little clichéd to talk of data as the new oil, but personal data is undoubtedly a lubricant of frictionless digital economics. The wheels of many free services would stop turning if consumers didn’t keep filling the tank with their Likes, tweets and cat videos.

However, just as both consumers and businesses have got used to the idea of sending all this information into the cloud without concerning themselves about where it actually goes, the business model of global services powered by distributed data is coming under attack. New legal frameworks are threatening to create or strengthen digital borders, stemming the flow of personal data migration.

Though legal restrictions on the global movement of personal data are not entirely new, the effectiveness of existing frameworks has more recently been called into question. As this data has become more important, valuable – and of course voluminous – tensions between different interests and cultural attitudes have increased to the point where ‘balkanisation’ of web services and the underlying infrastructure of the web is a very real possibility.

The EU-U.S. Safe Harbour programme

One of the biggest data trade deals is the EU-U.S. Safe Harbour programme, the most relied-upon legal instrument facilitating the exit of personal data from the EU to U.S. companies. That deal has been under threat ever since Edward Snowden went public over the collection and use of personal information by the U.S. and other allied intelligence agencies.

Alongside stories about lapses in the regulation of U.S. companies signed up to Safe Harbour, existing arrangements have been the subject of intense negotiation over the last two years. As things currently stand, although no-one really wants it, the EU could pull the plug on Safe Harbour if its demands for change are not met by U.S. authorities.

If the data taps are forced off, what then? Much of the transatlantic movement of data would have to be brought to a halt. Even setting aside the economic consequences, the modifications to services required to ensure that data was prevented from flowing illegally would be significant.

Update: recent reports suggest agreement on a revised Safe Harbour deal is close.

One element of web balkanisation is the idea that companies may be forced to keep personal data within the jurisdictional boundaries of its original point of collection or risk not being allowed to trade within that country. In Russia, this is already a reality. From 1 September this year, all personal data on Russian citizens must be located in Russian data centres. Although the publicly-stated reason for this is to protect the privacy rights of Russians, there remain suspicions that the primary purpose is to ensure that the State can better monitor its own citizens. Whatever the reason, international companies wanting to do business in Russia are expected to comply.

In the wake of Snowden, similar requirements were proposed in Brazil – although these were dropped at the last minute from an Internet Civil Rights Bill enacted in 2014. However data localisation laws do exist in parts of South East Asia, and India too is reportedly considering the idea.

Back in Europe there are the continuing negotiations over the draft General Data Protection Regulation (GDPR) to consider. This legislative juggernaut is also seeking to extend the jurisdictional boundaries of protection of personal data. Rather than go for a strict localisation approach, the GDPR is about attaching specific rights to the data regardless of where it ends up. Will companies need to develop solutions that tag location origination to personal data to then identify what rights apply?
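If rights do travel with the data as the GDPR envisages, one plausible engineering response to the question above is to stamp each record with its jurisdiction of collection and look up the applicable regime wherever the data ends up. The sketch below is purely illustrative – the record shape and the country-to-regime table are assumptions, not anything the draft Regulation prescribes.

```typescript
// Tag personal data with its jurisdiction of collection so the
// applicable rights regime can be determined later, regardless of
// where the record is physically stored.
interface PersonalRecord {
  subjectId: string;
  collectedIn: string; // ISO 3166 country code at point of collection
}

// Hypothetical mapping from jurisdiction to rights regime.
const regimeByCountry: Record<string, string> = {
  DE: "GDPR",
  FR: "GDPR",
  RU: "localisation",
  US: "sectoral",
};

function applicableRegime(record: PersonalRecord): string {
  return regimeByCountry[record.collectedIn] ?? "unknown";
}
```

Even this toy version makes the operational cost visible: every downstream system handling the record has to preserve and honour the tag, which is a far heavier lift than a simple ‘where is the data centre?’ rule.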

Ireland at the centre of data sovereignty storms

There is also much talk about the ‘European Cloud’, but it is not really clear what this means. Twitter has recently announced that data of account holders outside of North America will be controlled from Dublin, and therefore subject to Irish data protection laws. Dropbox has also followed suit. However, what this may mean in the future in terms of where the data may need to be physically located, and the legal obligations of the data centres involved, is very much unresolved.

Facebook has claimed Dublin as its EU regulatory home for years, but Belgian authorities have recently been challenging this assertion, demanding that Belgian law should apply to the personal data of Belgian residents.

Set against a general tide of increasing consumer privacy protections, there are conflicting demands for greater access to personal data for law enforcement. The UK government has talked about there being no place criminals can hide online, an apparent call for back doors and keys to encrypted services, with little recognition of how this might also make systems more vulnerable to the increasingly voluminous and sophisticated malicious attacks on digital systems – many of which are after exactly the same data.

One case making its way through the U.S. courts at the moment involves Hotmail account data held in Dublin. U.S. law enforcement wants the information for an ongoing investigation and is arguing that Microsoft Corp., as the owner of the data centre, is obliged to hand it over under U.S. law. The company would also be obliged to keep that hand-over secret, which could be in breach of Irish law. Microsoft is fighting the case, but so far it is losing the battle. If it does eventually have to hand the data over, there could be a devastating loss of trust amongst its global customer base, as well as consequences for how Microsoft may have to re-structure its services.

Frontline responsibility for compliance with such laws normally lies with the service providers that are collecting the data, but they will look to their infrastructure vendors for both appropriate solutions and assurance of compliance. As the Microsoft case demonstrates, it’s not always straightforward, and there are no indications that the stream of contentions and controversies is going to dry up anytime soon.


This article was originally published on the Stack:


Will Google Break up its Privacy Policy?


Google announced yesterday a major re-structuring which will see it broken up into a number of smaller businesses, owned by a new holding company, Alphabet, with a typically unconventional domain –

According to Larry Page’s letter on the home page of the new site, the aim of this exercise is to make the financials of the company more transparent, and for individual businesses within the group to be more accountable.

There is a lot of detail missing about how the different businesses will be separated, but it appears that the ‘new’ Google, as a subsidiary of Alphabet, though a ‘bit slimmer’, will retain the majority of the core business, with the more experimental arms in health, the X Lab incubator and its investment arms becoming their own entities.

It might be tempting to suggest that the EU’s call to break up the business due to competition concerns might in some way have influenced the move.  However, as the core businesses of search, advertising, Android, maps, and YouTube look set to remain together under Google, this seems unlikely.

All of which is very interesting, but as a privacy professional the first question that came to my mind was ‘what does this mean for the privacy policy?’  Any answer to that would be speculative at this time, but it is tempting to make some guesses anyway.

Remember that in 2012, Google decided to unify all its data practices into a single privacy policy, allowing it to share data across all its businesses, and creating a push towards unifying user identities. This greatly expanded its capacity to track people across all its services, deepening its grasp on the digital lives of millions.

This approach has been under sustained attack in the EU ever since, and to some degree the company has already had to respond to calls for greater clarity.  In some countries the demand has been to keep the data siloed so that users could move between services in relative anonymity.

A recent reversal of the YouTube policy of requiring users to sign-in with Google Plus identities could have been a response to such calls as much as recognition that Google Plus was not going to be able to compete with Facebook.

When Google bought home automation business Nest, however, it was made clear that Nest data would remain separate, regulated by its own policy.  Nest will almost certainly be its own business under Alphabet.

The more experimental work in health – a smart glucose-detecting contact lens, and Calico, a business looking at ageing and longevity – naturally involves the collection of highly sensitive personal information, and will of necessity need stricter privacy policies, which will be easier to be accountable for as separate businesses.

In terms of the core digital data gathering and aggregation, I wouldn’t expect anything to change too much in the short term.  However, what this re-structuring does is pave the way to make it easier to break up the internet businesses further in the future.  Should the EU data protection authorities demand stronger privacy protections for consumers, and the new General Data Protection Regulation give them bigger and sharper teeth, then this re-structured group could more easily introduce changes in policy in response, by dividing up its data hoard into separate businesses without looking like it is bowing down to regulatory pressure.

I wonder how much such thinking figured in the decisions announced this week? We will surely never know.

GDPR Negotiations – Seconds Out, Round Three

The EU Council has confirmed, as widely expected, it has reached an agreement on the General Data Protection Regulation, almost three and a half years since the first publication of the proposals by the Commission.

The latest published version of the text shows significant differences of opinion remain amongst member states, but this milestone now paves the way for the Trilogue stage to begin.

This involves all three institutions of the EU – Council, Commission and Parliament, with a first meeting due to take place on 24th June.

With each body having separately agreed their own positions on the text they want, the Trilogue is the process of negotiating between the different positions, to come to a final text that all are prepared to sign up to.

This will not be an easy task, as significant differences of opinion exist on some fundamental issues.  If this were a boxing match, then you would have the Council in one corner, with a business friendly proposal, the Parliament in the other with a more privacy first, consumer focussed approach, and the Commission in the middle acting as referee.

The fight could still get intense, and someone might end up with a bloody nose, but with considerable pressure on finalising the legislation by the end of this year the big question now is who will land the winning blow?