Right to be forgotten and right to delisting
The right to be forgotten entitles an individual to request that a search engine remove certain publicly known information, thereby preventing third parties from accessing it.
The EU provided one of the first legal bases for data protection law through Directive 95/46/EC, whose Articles 12(b) and 14 ground the right to be forgotten, although courts worldwide came to regulate internet protection for individuals both before and after the Directive.
As an example, the Italian Supreme Court (Corte di Cassazione) has repeatedly held that the right to be forgotten applies whenever information lacks relevance and currency (see Cassazione Civile 16111/2013), as have trial courts (see Tribunale di Roma 23771/2015).
EU Court of Justice: Google v. Costeja González
The most important legal precedent on the application of data protection law to search engines is Google Spain SL, Google Inc. v Agencia Española de Protección de Datos, Mario Costeja González. On 13 May 2014 the EU Court of Justice ruled that “it must be pointed out that it is not necessary in order to find such a right that the inclusion of the information in question in the list of results causes prejudice to the data subject”, clarifying that, under Articles 7 and 8 of the EU Charter of Fundamental Rights and Article 12(b) and subparagraph (a) of the first paragraph of Article 14 of Directive 95/46/EC, users can request search engines to delist links to search results affecting their privacy and, if the request is refused, bring the matter before the competent judicial authority.
Parties involved: individuals, search engines and DPAs
The data protection law on which the right to be forgotten rests entitles a natural person (the data subject) whose personal data are processed by a controller to request the removal of any information that directly or indirectly identifies him or her.
Search engines as data controllers
The data controllers to whom removal requests are addressed are search engine operators (Google, Bing and Yandex are the most frequently involved in legal precedents), which process personal data rather than merely loading it onto an internet page as publishers do. Since this processing enables internet users to view private information related to an individual’s name, it is liable to affect the fundamental rights to privacy and data protection. For this reason, search engines can be required to subordinate their economic interests to the fundamental rights of the data subject.
Data Protection Authorities
Within the EU, delisting and removal are currently governed by national data protection laws, which apply whenever a search engine unjustifiably refuses a request. Data subjects can bring their requests before Data Protection Authorities or national courts, since these rights can be exercised against the national subsidiaries of search engines in the respective Member States.
Public figures’ right to be forgotten can be weakened
While private individuals enjoy broader protection, public figures’ right to be forgotten can be weakened by their role in public life. Public figures are commonly identified by their media exposure as politicians or high-level professionals.
Resolution no. 1165 on the right to privacy of public figures
In 1998, the Parliamentary Assembly of the Council of Europe issued Resolution no. 1165 on the right to privacy, defining public figures as “persons holding public office and/or using public resources and, more broadly speaking, all those who play a role in public life, whether in politics, the economy, the arts, the social sphere, sport or in any other domain.”
Genuinely private information and ordinary private information
Information about their private life, such as health or family matters, should be considered genuinely private and shielded from search, while all other information, even if private, should be considered relevant to their role in public life, and the related search results should not be delisted.
In 2012, the European Court of Human Rights held that the particular protection accorded to a private individual’s life cannot be extended to public figures when the information concerns facts capable of contributing to a debate in a democratic society.
Criteria commonly applied by DPAs to order delisting
When a request to delist a search result is lodged, DPAs should consider the thirteen criteria issued by the Article 29 Working Party set up under Directive 95/46/EC. The first criterion is whether the applicant is a natural person and whether the search result relates to his or her name or pseudonym; the second concerns the applicant’s role in public life and the consequent relevance of a public figure’s private information.
Third, when the data subject is a minor, the “best interest of the child” makes delisting more likely to be approved. The fourth criterion is the accuracy of the data, or the amount of factual information they contain: delisting is ordered when data are evidently inaccurate, inadequate or misleading.
Fifth, the data must be relevant and not excessive in light of the general public’s interest in accessing the information, considering whether they relate to the data subject’s working life, their age, their capacity to offend, and whether they appear as verified fact rather than opinion. The sixth criterion is determined under Article 8 of Directive 95/46/EC: sensitive data have a greater impact on the data subject’s private life, and DPAs are more likely to order delisting when the information to be removed concerns health, sexuality or religion.
Seventh, when data are out of date and no longer reasonably current, they are more likely to be delisted. The eighth criterion concerns the prejudice caused to the data subject: delisting is warranted when the information has a disproportionately negative impact on his or her privacy and there is no wider public interest in it. Ninth, DPAs should consider whether the information puts the data subject at risk of identity theft or stalking.
Tenth, DPAs evaluate the context of publication and the data subject’s consent. The eleventh criterion is whether the original content was published for journalistic purposes, although this does not conflict with the legal provision under which search engines organize search results. Twelfth, DPAs consider delisting appropriate when the publisher of the data has no legal power or obligation to make the personal data publicly available. Last, when the data relate to a criminal offence, the seriousness of the offence and the time elapsed since it occurred should be taken into account.
The criterion of relevance and non-excessiveness of data
One of the most immediate criteria for delisting is the relevance and non-excessiveness of the data. Relevance is closely related to the age of the data, since information published long ago is more likely to be considered less relevant than recent information.
Sub-criteria: work, offence, verified fact
Data must also be relevant and not excessive according to three sub-criteria. First, whether the data relate to the data subject’s working life, since information about professional life is likely to cause less harm than information about private life; relevance is proportional to the data subject’s current working life, depending on the nature of the work and the public interest in the information, provided the data are not excessive. Second, whether the information is excessive and constitutes an expression of offence against the data subject, insofar as it amounts to a criminal offence itself or otherwise violates the law. Third, whether the information reflects an opinion or a verified fact, bearing in mind that opinions are not to be delisted merely for being unpleasant.
Consent of data subject as an important criterion
The context of publication plays a role in the approval of a delisting request. When the content was not made public voluntarily by the data subject, or was never intended to be made public, DPAs should consider delisting more appropriate. If the legal basis for publication is consent, and that consent is later revoked or the data subject is unable to revoke it, the publication becomes unlawful and the related search results should be delisted.
February 18, 2016 decision of the Italian Data Protection Authority on the relevance of data
By decision of February 18, 2016, issued under Article 2, paragraph 1, Article 3, paragraph 1, and Article 11, paragraph 1, letters b) and e), and paragraph 2 of D. Lgs. 196/2003, the Italian DPA ordered Google Inc. to delist search results linking to irrelevant and excessive information. An Italian data subject had requested the removal of web links referring to a drug-related criminal offence of his, dating back to 2004. Google’s search results linked to online newspapers reporting the data subject’s name and personal information that made him easily identifiable. The data subject lodged a complaint on the ground that the information was no longer relevant and concerned remote facts.
EU and Italian legal precedent as basis for the decision
The DPA, in line with the EU Court of Justice’s judgment in Case C-131/12, Google Spain v. González, observed that there is no public interest in accessing information about facts dating so far back, especially when the subject has definitively moved on and changed his life. The lack of relevance was considered highly harmful to the individual’s social and working life, but Google, when asked, had refused to remove the links, citing a persistent public interest in the processing of the data.
Guidelines on the implementation of the Court of Justice judgment
The DPA considered both the criteria of the Article 29 Working Party’s Guidelines on the implementation of the Court of Justice judgment, issued on November 26, 2014, which state that information dating back several years is less relevant than current information, and the precedent of Corte di Cassazione 16111/2013, which holds that the right to be forgotten is harmed whenever information lacks currency and effectiveness.
For these reasons, it ordered Google Inc., pursuant to Article 143, paragraph 1, letter b) and Article 154, paragraph 1, letter c), to remove and delist the search results linking to the newspaper URLs reporting the no longer relevant information.