CPPA: problems and criticisms – appropriate purposes

Digital Charter Implementation Act

Canada is planning to revamp its comprehensive privacy law by repealing the existing statute, PIPEDA, and enacting Bill C-27, the Digital Charter Implementation Act (“DIA”), which would enact the Consumer Privacy Protection Act (CPPA), the Personal Information and Data Protection Tribunal Act (PIDTA), and the Artificial Intelligence and Data Act (AIDA). Bill C-27 replaced Bill C-11 (the former draft of the CPPA and PIDTA). While the DIA attempts to rectify some of the criticisms of Bill C-11, many of the problems remain and new ones have emerged in the new Bill. This blog series will address some of the more important problems with the DIA, including issues in the CPPA, PIDTA and AIDA. My prior post focused on the purposes set out in Bill C-27’s preamble and gave an overview of how the bill fails to meet them. This post focuses on the problems with the amendments to the appropriate purposes provision and recommends changes to it.

The existing law under PIPEDA

Section 5(3) of PIPEDA contains the existing appropriate purposes limitation:

An organization may collect, use or disclose personal information only for purposes that a reasonable person would consider are appropriate in the circumstances.

The appropriate purposes provision is an incredibly potent vehicle that the Commissioner can use to find that any collection, use, or disclosure of personal information violates PIPEDA, even if these activities otherwise fully comply with the law. It acts as an override clause that trumps all of the other provisions of the law and can be used to upset the otherwise carefully calibrated rights Parliament enacted to balance individuals’ rights to privacy with the needs of organizations to use personal information for their purposes.

The proposed amendments to the law (discussed below) vastly broaden the scope of the provision and make its application much more complex. They should therefore be of critical concern.

Court interpretations of s5(3)

The courts have considered the principles to be used in interpreting the appropriate purposes limitation.

In Eastmond v. Canadian Pacific Railway, 2004 FC 852, the Commissioner had made a finding that CP’s use of surveillance cameras in the workplace violated the appropriate purposes limitation and recommended that they be taken down. When CP did not remove them, the applicant applied to the Federal Court for an order that they be removed. The Court disagreed with the findings of the Commissioner, holding that the cameras did not violate s5(3).

The court canvassed in detail the test to be used in applying the appropriate purposes limitation. The Commissioner had advanced this four-factor test:

    • Is the measure demonstrably necessary to meet a specific need?
    • Is it likely to be effective in meeting that need?
    • Is the loss of privacy proportional to the benefit gained?
    • Is there a less privacy-invasive way of achieving the same end?

The Court was prepared to take into account and be guided by the factors. It held, however, that the factors the Privacy Commissioner took into account in the case “may not necessarily be relevant in other contexts”. The Court noted that the analysis must be done in a contextual manner. It also held that the appropriate purposes for one act, e.g., collection, may be different from the appropriate purposes for another act, e.g., use or disclosure. It also ruled that the test “suggests flexibility and variability in accordance with the circumstances”. According to the Court:

[131]  Parliament clearly provided the appropriateness of purposes or why personal information needs to be collected must be analysed in a contextual manner looking at the particular circumstances of why, how, when and where collection takes place. Also, the appropriate purposes for collection may be different than the appropriate purposes for use and the appropriate purposes for disclosure of collected information, all of which suggests flexibility and variability in accordance with the circumstances.

The appropriate purposes limitation was also considered by the Federal Court in Turner v. Telus Communications Inc., 2005 FC 1601, aff’d 2007 FCA 21. In that case, the Applicants sought a declaration that Telus had contravened sections 5(1) and 5(3) of PIPEDA by requiring employees to provide biometric personal information for Telus’ use in authenticating identity. The Court did not consider the proper test to apply in interpreting the appropriate purposes limitation. Rather, it considered certain factors and did a brief analysis of them against the evidence adduced on the application. It did so without any comment on whether the factors considered would be appropriate for use in other cases. According to the Court:

[48] Taking into account the foregoing, and against the above brief analysis of: the degree of sensitivity associated with voice prints as personal information; the security measures implemented by Telus; the bona fide business interests of Telus as established on the evidence before the Court and to which the collection of voice prints is directed; the effectiveness of the use of voice prints to meet those objectives; the reasonableness of the collection of voice prints against alternative methods of achieving the same levels of security at comparable cost and with comparable operational benefits; and the proportionality of the loss of privacy as against the costs and operational benefits in the light of the security that Telus provides; I conclude that the collection of the voice print information here at issue would be seen by a reasonable person to be appropriate in the circumstances, as they existed at all times relevant to this matter, and against the security measures adopted by Telus.

The interpretation of the appropriate purposes limitation was again considered in A.T. v. Globe24h.com, 2017 FC 114. In this case, the Federal Court made an order that the Respondent, Sebastian Radulescu, contravened PIPEDA by collecting, using and disclosing on his website, www.Globe24h.com, personal information contained in Canadian court and tribunal decisions for inappropriate purposes and without the consent of the individuals concerned. In reaching its decision the Court purported to summarize the principles the Federal Court had in the past considered in construing the appropriate purposes limitation (including the Turner v Telus case):

[73] Subsection 5(3) creates an overarching requirement that an organization “collect, use or disclose personal information only for purposes that a reasonable person would consider are appropriate in the circumstances.” This must also be read in light of the underlying purpose of Part 1 of PIPEDA provided by section 3.

[74]  In considering whether an organization complies with subsection 5(3) of PIPEDA, this Court has in the past considered whether (1) the collection, use or disclosure of personal information is directed to a bona fide business interest, and (2) whether the loss of privacy is proportional to any benefit gained: Turner v Telus Communications Inc, 2005 FC 1601, [2005] FCJ No 1981 at para 48, aff’d 2007 FCA 21.

OPC Guidance and findings on s5(3)

The OPC has provided guidance on s5(3). The 2018 Guidance, provided after the CP, Telus, and Globe24 cases were decided, is contained in Guidance on inappropriate data practices: Interpretation and application of subsection 5(3) (May 2018). The Guidance, in part, reads as follows:

A guiding principle

Subsection 5(3) “is a guiding principle that underpins the interpretation of the various provisions of PIPEDA”. In turn, it must be read in light of the underlying purpose of Part 1 of PIPEDA which is to balance the right of privacy of individuals with respect to their personal information and the need of organizations to collect, use or disclose personal information. In applying subsection 5(3), one is therefore required to engage in a “balancing of interests” between the individual and the organization concerned.

Reasonable person lens

Subsection 5(3) requires a balancing of these interests “viewed through the eyes of a reasonable person.”

Evaluating an organization’s purposes under 5(3)

The evaluation of subsection 5(3) requires an examination of whether the purposes are appropriate “in the circumstances.” As such, the analysis must be conducted “in a contextual manner” and look at the particular facts surrounding the collection, use and disclosure, “all of which suggests flexibility and variability in accordance with the circumstances”.

In applying subsection 5(3), the courts have generally taken into consideration whether: “1) the collection, use or disclosure of personal information is directed to a bona fide business interest, and 2) whether the loss of privacy is proportional to any benefit gained.” In Turner v. Telus Communications Inc, the Federal Court, in a decision affirmed by the Federal Court of Appeal, set out the following factors for evaluating whether an organization’s purpose was in compliance with subsection 5(3):

  • The degree of sensitivity of the personal information at issue;

  • Whether the organization’s purpose represents a legitimate need / bona fide business interest;

  • Whether the collection, use and disclosure would be effective in meeting the organization’s need;

  • Whether there are less invasive means of achieving the same ends at comparable cost and with comparable benefits; and

  • Whether the loss of privacy is proportional to the benefits.

The OPC summarized its approach to the appropriate purposes limitation in the Clearview AI case, stating the following:

In accordance with the OPC’s Guidance on inappropriate data practices: Interpretation and application of subsection 5(3), the OPC considers the factors set out by the courts in order to assist in determining whether a reasonable person would find that an organization’s collection, use and disclosure of information is for an appropriate purpose in the circumstances. These factors are to be applied in a contextual manner, which suggests flexibility and variability in accordance with the circumstances. In applying s.5(3), the courts have determined that the OPC is required to engage in a “balancing of interests” between the individual’s right to privacy and the commercial needs of the organization concerned. This balancing of interests must be “viewed through the eyes of a reasonable person.” Similar factors are also considered by OIPC BC in determining whether the purpose is reasonable.

In subsequent cases, while the OPC continues to refer to a list of factors taken from the Telus case, it still (mostly) follows its Guidance, qualifying their use by saying that it “generally” applies these factors. See, for example, Telecommunications firm failed to obtain appropriate consent for voiceprint authentication program, 2022 CanLII 91035 (PCC).

The proposed amended appropriate purposes limitation

Subsections 12(1) and (2) of the CPPA would replace s5(3) with the following; the new text comprises “in a manner and” and “whether or not consent is required under this Act” in subsection (1), and all of subsection (2).

Appropriate purposes

12 (1) An organization may collect, use or disclose personal information only in a manner and for purposes that a reasonable person would consider appropriate in the circumstances, whether or not consent is required under this Act.

Factors to consider (all new)

(2) The following factors must be taken into account in determining whether the manner and purposes referred to in subsection (1) are appropriate:

(a) the sensitivity of the personal information;

(b) whether the purposes represent legitimate business needs of the organization;

(c) the effectiveness of the collection, use or disclosure in meeting the organization’s legitimate business needs;

(d) whether there are less intrusive means of achieving those purposes at a comparable cost and with comparable benefits; and

(e) whether the individual’s loss of privacy is proportionate to the benefits in light of any measures, technical or otherwise, implemented by the organization to mitigate the impacts of the loss of privacy on the individual.

As can be seen, the factors in subsection (2) are somewhat based on the factors referred to in the Telus case and in the OPC Guidance. It may even be that the Bill C-27 drafters attempted to codify these factors.

Insofar as Bill C-27 purports to codify the existing law, the OPC Guidance, or the jurisprudence, it materially fails to do so.

The factors referenced in Telus were never intended as a statement of the factors always to be used in assessing appropriate purposes. They were merely referred to as factors the court reviewed in that case. In fact, the Globe24 case, which relied on Telus, summarized the law as considering only two factors, namely (1) whether the collection, use or disclosure of personal information is directed to a bona fide business interest, and (2) whether the loss of privacy is proportional to any benefit gained.

Requiring all factors to be taken into account in every case is contrary to the teaching in CP, which emphasized that even the four factors then proposed by the Commissioner did not need to be applied in every case. Further, the decision stressed the need for a contextual review which, contrary to a list of mandatory factors, requires “flexibility and variability in accordance with the circumstances”. It is also inconsistent with the OPC Guidance, which summarized all of the caselaw; with decisions of the OPC which, in cases like Clearview AI, also summarize the law; and with other decisions which apply the factors while noting that they are “generally” applied, not that they “must be taken into account” in every situation.

Material changes in the appropriate purposes limitation

The mandatory factors now included in s12(2) have also been reworded from the factors the Commissioner “generally” applies, making the appropriate purposes assessment even more complicated than under the non-mandatory factors listed in the OPC Guidance, as shown in the comparison below.

CPPA s12(2): (a) the sensitivity of the personal information;
OPC Guidance: The degree of sensitivity of the personal information at issue;

CPPA s12(2): (b) whether the purposes represent legitimate business needs of the organization;
OPC Guidance: Whether the organization’s purpose represents a legitimate need / bona fide business interest;

CPPA s12(2): (c) the effectiveness of the collection, use or disclosure in meeting the organization’s legitimate business needs;
OPC Guidance: Whether the collection, use and disclosure would be effective in meeting the organization’s needs;

CPPA s12(2): (d) whether there are less intrusive means of achieving those purposes at a comparable cost and with comparable benefits; and
OPC Guidance: Whether there are less invasive means of achieving the same ends at comparable cost and with comparable benefits; and

CPPA s12(2): (e) whether the individual’s loss of privacy is proportionate to the benefits in light of any measures, technical or otherwise, implemented by the organization to mitigate the impacts of the loss of privacy on the individual.
OPC Guidance: Whether the loss of privacy is proportional to the benefits.

 

Moreover, the scope of the prohibition has also been vastly expanded by the addition of the “manner” in which the information is collected, used or disclosed. This is a new, open-ended overriding prohibition that further overlaps with many of the other CPPA provisions, including those pertaining to consent, transfers of information for processing, disposal, use of personal information to make automated decisions, and uses of de-identified information. It does so even though Parliament has set out explicit rules for these important elements of the Bill.

By way of example only, the CPPA has a provision that requires persons that use automated decision systems to be transparent about their uses of such systems and to provide explanations, upon request, of how decisions were made. However, the CPPA does not go as far as the GDPR in prohibiting certain types of decisions from being made solely by automated means. It can be presumed that Parliament made a decision not to go that far in the CPPA. However, the CPPA, particularly with this expansion, could now permit the Commissioner to take the position that making automated decisions, or making automated decisions without a person in the loop, is an inappropriate purpose or manner of using personal information. In theory, the Commissioner could also find that certain artificial intelligence systems’ uses of personal information are illegal for reasons that may overlap with or even be inconsistent with AIDA, something that Parliament also would not have intended.

In short, it is not good policy to have such a broad override provision in the CPPA. It is a further mistake to leave such broad discretion to the Commissioner to unilaterally decide that a practice is illegal because the Commissioner has concluded a reasonable person would not find it appropriate.

The new test is unworkable and would trump Parliament’s intended balance

The factors, as applied to all possible conduct under the CPPA, would be impractical and often impossible to apply. Even attempting to do so would be onerous, time-consuming, and expensive. The provision, and in particular the last factor, appears to be as complicated and fact-intensive to apply as the Oakes test under our Charter of Rights and Freedoms. While such a test is appropriate in the Charter context, it is manifestly not a feasible or workable test for organizations to apply in the privacy context. Such a test would create impossible burdens, especially for the creation of innovative products or services. It could impede innovation, contrary to one of the key purposes of the CPPA as set out in the preamble.

The most important provisions in the CPPA already include numerous guardrails, including those that require that they be consistent with the reasonable expectations of individuals. These guardrails did not exist, or did not exist to the same extent, in PIPEDA. Accordingly, there is no sound policy basis to extend the ambit of this overarching provision in the CPPA.

The appropriate purposes limitation is inappropriate with the new enforcement regime

The appropriate purposes limitation fit comfortably within the PIPEDA enforcement regime, but creates major challenges under the new CPPA enforcement regime.

Under PIPEDA, if the Commissioner found a practice to be offside the limitation he could make a finding and recommend that the practice be stopped. But, if the person disagreed, the matter had to be adjudicated by a court which could reach a different decision than the Commissioner, as the court did in the CP case.

But, under Bill C-27, the Commissioner could make a finding that an activity is not appropriate and could make interim and final enforcement orders shutting down the practice. These orders would be virtually unappealable to the new Tribunal, which can only set aside a decision based on an error of law. Yet, the error would almost invariably be an error of fact or of mixed fact and law. This would leave decisions on the scope of the CPPA almost exclusively to the Commissioner, who alone could decide if a practice is for an inappropriate purpose or is carried out in an inappropriate manner.

Leaving such important questions to the Commissioner without appropriate judicial oversight raises questions as to whether the rule of law has been taken into account in the expansion of s12(2) in conjunction with the new enforcement and appeal regime. Even leaving such decisions to the courts raises questions about a democratic process that leaves key decisions about the scope of our privacy laws to the courts. See Barry Sookman, Liability under the CPPA, and McCarthy Tetrault, The CPPA’s Privacy Law Enforcement Regime.

Interoperability with other laws

The proposed amendments to the appropriate purposes limitation would also move the federal legislation even further away from, and make it less interoperable with, national and international data protection laws.

Provincially, the expanded law would go significantly further than British Columbia’s private sector privacy law, which only permits the collection, use, or disclosure of personal information “for purposes that a reasonable person would consider appropriate in the circumstances”. It would also go much further than Alberta’s private sector privacy law, which restricts the collection, use or disclosure of personal information “only for purposes that are reasonable”. There is no comparable limitation in Quebec’s newly amended privacy law.

There are also no similar overarching principles in the privacy laws of our major trading partners, including those in the United States. The GDPR requires that personal data be “processed lawfully, fairly and in a transparent manner in relation to the data subject”, but there is no similarly expansive provision that creates such an invasive privacy regime within a privacy regime, even in the EU.

Summary and recommendations

The appropriate purposes limitation is an overarching limitation which trumps all of the carefully calibrated provisions in the CPPA. It should therefore apply only in clearly determinable circumstances where there is a cogent and pressing need to disregard all of the other rules in the CPPA to find a practice that is otherwise legal to be illegal – if it is still needed or appropriate at all in the context of the CPPA. It is hard to imagine many cases where an organization would be able to obtain a valid consent under the CPPA while being seriously offside the appropriate purposes limitation. The converse is unfortunately not true. The provision is, in essence, a privacy law within a broad and carefully crafted privacy law. Its scope should not be made even broader than it is now, and the test for its applicability should be flexible and easy for organizations to apply.

I therefore recommend:

Recommend: That s12(2) and the addition of “the manner”, be deleted.

Recommend: That appeals from s12(1) be de novo appeals that can ultimately be determined by the Tribunal, with an ability to appeal such decisions to the courts.

Recommend: In the alternative, s12(2) and the addition of “the manner” be deleted and replaced with the test formulated in the Globe24 case, “whether (1) the collection, use or disclosure of personal information is directed to a bona fide business interest, and (2) whether the loss of privacy is proportional to any benefit gained.”

Recommend only as a last resort: That section 12(2) be amended so that the factors “may” be considered by a court, and that the relatively more straightforward factors listed in the Telus case replace s12(2).

 

 

 
