The Digital Charter Implementation Act: problems and criticisms – the preamble

Canada is planning to revamp its comprehensive privacy law by repealing the existing statute, PIPEDA, and enacting Bill C-27, the Digital Charter Implementation Act (“DCIA”). The bill would enact three separate but interrelated laws: the Consumer Privacy Protection Act (CPPA), the Personal Information and Data Protection Tribunal Act (PIDTA), and the Artificial Intelligence and Data Act (AIDA). Bill C-27 replaced Bill C-11 (the former draft of the CPPA and PIDTA). While the DCIA attempts to rectify some of the criticisms of Bill C-11, many of those problems remain and new problems have emerged. This blog series will address some of the more important problems with the DCIA, including issues in the CPPA, PIDTA and AIDA. This post addresses the disparate purposes of the DCIA and outlines some of the ways the bill fails to meet them.

Preambles to federal laws are important. They provide readers, including courts, with Parliament’s intentions in enacting laws. When the wording of a statute is ambiguous, or when a law is challenged on constitutional grounds, preambles are often referred to as aids in determining Parliament’s intentions.

The DCIA contains the following statements in the preamble:

Whereas there is a need to modernize Canada’s legislative framework so that it is suited to the digital age;

Whereas the protection of the privacy interests of individuals with respect to their personal information is essential to individual autonomy and dignity and to the full enjoyment of fundamental rights and freedoms in Canada;

Whereas Parliament recognizes the importance of the privacy and data protection principles contained in various international instruments;

Whereas trust in the digital and data-driven economy is key to ensuring its growth and fostering a more inclusive and prosperous Canada;

Whereas Canada is a trading nation and trade and commerce rely on the analysis, circulation and exchange of personal information and data across borders and geographical boundaries;

Whereas the design, development and deployment of artificial intelligence systems across provincial and international borders should be consistent with national and international standards to protect individuals from potential harm;

Whereas organizations of all sizes operate in the digital and data-driven economy and an agile regulatory framework is necessary to facilitate compliance with rules by, and promote innovation within, those organizations;

Whereas individuals expect a regulatory framework that ensures transparency and accountability with respect to how organizations handle their personal information and that is backed by meaningful enforcement;

Whereas the modernization of national standards for privacy protection to align them with international standards ensures a level playing field for organizations across Canada and assists them in maintaining their competitive position;

Whereas a modern regulatory framework governing the protection of personal information should promote the responsible collection, use and disclosure of such information by organizations for purposes that are in the public interest;

Whereas Parliament recognizes that artificial intelligence systems and other emerging technologies should uphold Canadian norms and values in line with the principles of international human rights law;

And whereas this Act aims to support the Government of Canada’s efforts to foster an environment in which Canadians can seize the benefits of the digital and data-driven economy and to establish a regulatory framework that supports and protects Canadian norms and values, including the right to privacy;

The recitals contain, among others, these important themes:

    • That privacy is important to enable individuals to enjoy their fundamental rights, which rights are contained in international instruments.
    • Organizations must be transparent and accountable with respect to their handling of personal information.
    • Canada’s privacy laws should be aligned with and interoperable with international standards and should foster trans-border data flows.
    • Canada’s privacy laws should support innovation.
    • Canadians need protection from harms caused by AI systems consistent with national and international standards and international human rights laws.

It is apparent that none of the listed goals can be absolute. The DCIA should therefore carefully balance them, so that advancing one goal does not undermine the others. In the quickly developing digital economy, driven by emerging technologies including AI and the many potential uses of data, getting the balance right is of critical importance to Canadians and the Canadian economy.

In my view, the DCIA misses the mark in many respects. These will be more fully explained in future posts. But, by way of overview:

    • The CPPA is very prescriptive and rule-based and is much less flexible than PIPEDA. Examples are the stringent test and limitations on the use of implied consents.
    • The CPPA creates ambiguous new standards that are often difficult or impossible to apply in practice, along with detailed record keeping and assessment standards. Examples are the appropriate purposes limitation, the numerous standards based on what is reasonable or appropriate to expect, the standards for obtaining valid consents, the legitimate interests exception, and the requirement for explainability of decisions made using automated systems.
    • The CPPA is far more onerous than international standards. In many respects it is not interoperable with the GDPR, the laws of our major trading partner the United States, provincial standards, or other international standards. Examples are the standards for obtaining consents, exceptions to consent such as for R&D, the provisions related to anonymization of personal information, the disposal/erasure of data, and service provider obligations.
    • The CPPA has extremely harsh penalties. These are exacerbated by threadbare procedural protections, including the processes before the Commissioner and the absence of appeal rights on findings of fact and mixed questions of fact and law made by the Commissioner. When one considers the ambiguous standards contained throughout the CPPA, combined with weak procedural safeguards and the unappealability of key decisions including findings of liability (which can also trigger class actions with no due diligence defence), the very high AMPs that can be imposed (fines by any other name), and the Commissioner’s new order-making powers (without even an automatic right to appeal the Commissioner’s interim orders), the lack of fairness and natural justice and the risks to Canadian innovators and innovation are evident.
    • It is unclear what the references to privacy being a fundamental right enshrined in international instruments are intended to accomplish. They may be seeking to overcome the long line of established jurisprudence holding that even though privacy laws have a quasi-constitutional status, such laws, including their exceptions, must still be construed using ordinary principles of statutory construction and not read narrowly merely because of that status. If that is the intent, one might have expected an amendment to the purposes clause in the CPPA, which still, appropriately, continues to express the need for balance between “the right of privacy of individuals with respect to their personal information and the need of organizations to collect, use or disclose personal information for purposes that a reasonable person would consider appropriate in the circumstances”.
    • With the enactment of AIDA, Canada could be one of the first countries to enact a general law regulating the use of artificial intelligence systems and the data used for such systems. There are many known and other potentially serious problems with AI systems, and some form of regulatory framework is likely required. For a recent book on the topic, see Henry A. Kissinger et al, The Age of A.I.: And Our Human Future. However, there is currently no international consensus on the appropriate approach. The issue is still being studied in the EU (where there are a draft AI Act and a draft AI Liability Directive) and the UK. Canada’s proposed framework, even though subject to future regulations defining its scope, may be premature and could impose strict rules inconsistent with international standards. This could impede innovation in one of the most important technologies of the fourth industrial revolution. Moreover, considerably more thought needs to be given to which entities are most appropriate to oversee such regulation.

These themes will be further expounded upon in subsequent blog posts.
