Christina Catenacci, BA, LLB, LLM, PhD
The Office of the Privacy Commissioner of Canada (OPC) investigated a complaint concerning the scraping of Facebook user data by the app “thisisyourdigitallife” (TYDL) and the subsequent sale of that data to Cambridge Analytica (CA) for psychographic modelling purposes between November 2013 and December 2015. The OPC applied to the Federal Court of Canada (FCC), arguing that Facebook breached the Personal Information Protection and Electronic Documents Act (PIPEDA) through its practice of sharing Facebook users’ personal information with third-party applications (apps) hosted on the Facebook platform.
The FCC dismissed the OPC’s application, finding that the OPC had not shown that Facebook failed to obtain meaningful consent from users for disclosure of their data, nor that Facebook failed to adequately safeguard user data. The OPC appealed. The Federal Court of Appeal (FCA) allowed the OPC’s appeal, finding that the FCC erred in its analysis of meaningful consent and safeguarding under PIPEDA. The FCA concluded that Facebook breached PIPEDA’s requirement to obtain meaningful consent from users prior to data disclosure and failed in its obligation to safeguard user data.
Most are familiar with the Cambridge Analytica scandal involving Facebook, but some may not know how it all started. It began when Facebook launched its Platform, which enabled third parties to build apps that could run on Facebook and be installed by users. Facebook also provided an application programming interface, the Graph API, which allowed third-party apps to receive user information. By 2013, 41 million apps were available on Facebook.
Facebook required third-party apps to agree to its Platform Policy and Terms of Service in order to gain access to Platform. For example, one provision required apps to request only the user data necessary to operate, and to use friends’ data only in the context of the user’s experience on the app. Another required apps to have a privacy policy telling users what data the apps would collect and how they would use or share it.
Facebook admitted that it did not assess or verify the actual content of the apps’ privacy policies. In fact, it only verified that the hyperlink to an app’s privacy policy linked to a functioning web page.
In November 2013, Dr. Kogan, a Cambridge professor, launched the TYDL app on Platform. The app presented users with a personality quiz. Through Platform, Dr. Kogan was able to access the Facebook profile information of every user who installed TYDL, as well as the information of each installing user’s Facebook friends.
Just 272 Canadian users installed TYDL, yet this enabled the disclosure of the data of over 600,000 Canadians. In December 2015, the media reported that user data obtained by TYDL had been sold to CA and a related entity, and that the data was used to develop psychographic models for targeting political messages at Facebook users in the lead-up to the 2016 American presidential election. Only at that point did Facebook remove TYDL from Platform. Even then, Facebook never notified affected users, and it did not ban Dr. Kogan or CA from Platform.
The OPC subsequently received a complaint raising concerns about Facebook’s compliance with PIPEDA and commenced an investigation.
The OPC found that Facebook’s superficial and ineffective safeguards and consent mechanisms had allowed TYDL to gain unauthorized access to the personal information of millions of Facebook users, some of which was subsequently used for political purposes.
At the conclusion of its investigation, the OPC made the following findings:
In a statement, the OPC said that Facebook’s refusal to act responsibly was deeply troubling given the vast amount of sensitive information people had entrusted to the company.
Had the OPC possessed order-making powers, it could have declared that Facebook violated Canadian privacy law and imposed an enforceable remedy. Because it lacked that power, it had to apply to the FCC for an enforceable order reflecting its findings.
The FCC decided that the matter should be dismissed. The court reviewed Facebook’s data policy, terms of service, Platform policy, user controls, and educational resources explaining privacy basics. It noted that Facebook had teams of employees dedicated to detecting, investigating, and combating violations of Facebook’s policies, and that Facebook took about six million enforcement actions during the period in question, although Facebook did not provide the reasons for those actions.
The FCC also examined TYDL’s privacy policy, noting that it was unclear whether the policy was ever shown to users and that Facebook did not verify the contents of third-party policies.
At this point, the court stated that the purpose of PIPEDA was to balance two competing interests, namely an individual’s right of privacy and an organization’s need to collect, use, or disclose personal information, and that it had to interpret PIPEDA in a flexible, common-sense, and pragmatic manner. The question before the court was whether Facebook made reasonable efforts to ensure that users and users’ Facebook friends were advised of the purposes for which their information would be used by the apps.
The FCC held that:
The FCC dismissed the application. The OPC appealed.
The FCA allowed the OPC’s appeal. That is, the court agreed with the OPC that the FCC made errors in its analysis when it sided with Facebook. The OPC argued that:
The FCA found the following:
Interestingly, the FCA commented that it was the responsibility of the FCC to define an objective, reasonable expectation of meaningful consent. It stated, “To decline to do so in the absence of subjective and expert evidence was an error.” Moreover, the FCA noted the curious double reasonableness requirement and stated, “If a reasonable individual were unable to understand how their information would be used or disclosed—as here—this ends the inquiry. An organization cannot exercise reasonable efforts while still seeking consent in a manner that is itself inherently unreasonable.”
The FCA also noted that the data policy offered only mundane examples of how apps could use user data and did not contemplate the large-scale data scraping that occurred in this case. In particular, the FCA pointed out that the language of the policy was simply too broad to be effective.
The FCA also pointed out that the word “consent” had content, and in this case that content was legislatively prescribed: it included an understanding of the nature, purpose, and consequences of the disclosure. The FCC had to ask whether the reasonable person would have understood that, in downloading a personality quiz, they were consenting to the risk that the app would scrape their data and the data of their friends, to be used in a manner contrary to Facebook’s own internal rules. It stated, “Had the question been asked of the reasonable person, they could have made an informed decision.” The court emphasized that other contextual evidentiary points supported this understanding of the reasonable person. For instance, the contractual context showed that these were consumer contracts of adhesion, standard-form agreements offered on a take-it-or-leave-it basis with no opportunity to negotiate terms.
In terms of safeguarding, the FCA stated that the unauthorized disclosures in this situation were a direct result of Facebook’s policy and user design choices. In fact, Facebook invited millions of apps onto its platform and failed to adequately supervise them. The FCA stated that the FCC “failed to engage with the relevant evidence on this point, and this was an error of law.”
Facebook did not review the apps’ privacy policies even though the apps were able to download users’ data and that of their friends. Nor did Facebook act on TYDL’s request for unnecessary information, which was a red flag. The FCA stated, “Facebook’s failure to take action upon seeing red flags amounted to Facebook turning a blind eye to its obligation to adequately safeguard user data.” This was part of a larger pattern: Facebook never notified users about the scraping and selling of their data once it became aware of the practice, and it did not ban Dr. Kogan or CA from Platform.
The FCA also clarified that Facebook’s conduct after the disclosure to TYDL was irrelevant to the safeguarding analysis, since the safeguarding principle dealt with an organization’s internal handling of data, not its post-disclosure monitoring of data. That said, Facebook’s post-disclosure actions contextually supported the finding that it did not take sufficient care to safeguard the data in its possession prior to disclosure.
The FCA also mentioned that Facebook was entitled to rely on the good faith performance of contracts, but only to a point. It was telling that Mark Zuckerberg admitted that it would be difficult to guarantee that there were no bad actors using its Platform. The FCA stated that it was incongruent to expect a bad actor to carry out a contract in good faith. Facebook therefore should have taken further measures to monitor third-party contractual compliance.
When it came to balancing under PIPEDA, the FCA highlighted that PIPEDA’s purpose, as set out in section 3, referred to an individual’s right of privacy and an organization’s need to collect, use, or disclose personal information; these were the interests to be balanced. An organization had no inherent right to data, and its need had to be measured against the nature of the organization itself. There was a critical difference between an individual’s right to privacy and a company’s need for data.
The FCA held that Facebook’s practices between 2013 and 2015 breached Principle 3, Principle 7, and section 6.1 of PIPEDA, and that a declaration should issue to that effect.
The FCA noted that the Federal Trade Commission in the United States fined Facebook $5 billion for its role in this scandal. However, time had passed and practices had evolved since the period in question. The FCA stated, “The Court will not issue orders which would be of no force or effect.” It noted that the events that gave rise to this application took place a decade ago.
Therefore, the FCA allowed the OPC’s appeal with costs and declared that Facebook’s practices between 2013 and 2015 constituted a violation of PIPEDA. The FCA stated that the parties would need to agree on a remedial order on consent; failing that, they would have to make further submissions.
As this case shows, organizations need to comply with PIPEDA’s consent and safeguarding provisions; it is not good enough to say that there are too many apps on a company’s platform and that it is too difficult to read the apps’ policies.
More specifically, in this case Facebook needed to have sufficient policies of its own and to review and monitor the policies of third-party apps to confirm compliance with Facebook’s requirements. It was important for Facebook (and the apps) to obtain meaningful consent from each user, both the users who installed the apps and the users who were the installers’ friends. Moreover, safeguarding obligations do not end once information is disclosed to apps; Facebook still needed to adequately supervise the apps and ensure compliance with its policies.