Public consultation on the Privacy Act – Submission – Samuelson-Glushko Canadian Internet Policy & Public Interest Clinic

Samuelson-Glushko Canadian Internet Policy & Public Interest Clinic (CIPPIC)

Tamir Israel, Staff Lawyer

With written contribution & analysis from:

[Information was severed]

February 14, 2021

CC-BY-SA 4.0 2021 Samuelson-Glushko Canadian Internet Policy & Public Interest Clinic (CIPPIC)

Electronic version first published by the Samuelson-Glushko Canadian Internet Policy & Public Interest Clinic. The Samuelson-Glushko Canadian Internet Policy & Public Interest Clinic (CIPPIC) is a legal clinic based at the Centre for Law, Technology & Society at the University of Ottawa, Faculty of Law.

The Samuelson-Glushko Canadian Internet Policy & Public Interest Clinic has licensed this work under a Creative Commons Attribution Share-Alike 4.0 (International) License.

https://creativecommons.org/licenses/by-sa/4.0/

Samuelson-Glushko Canadian Internet Policy & Public Interest Clinic

University of Ottawa, Faculty of Law, Common Law Section
57 Louis Pasteur Street
Ottawa, ON K1N 6N5
Website: https://cippic.ca
Email: admin@cippic.ca
Twitter: @cippic

Corrections & Questions

Any errors, omissions or policy positions remain solely the responsibility of the primary author. Please send all questions and corrections to the primary author directly, at: tamir@cippic.ca


Introduction

The Samuelson-Glushko Canadian Internet Policy & Public Interest Clinic (CIPPIC) is pleased to provide its input into the government’s consultation on the Privacy Act. As noted in the Department of Justice’s Consultation Document, government data handling practices have evolved substantially since the adoption of the Act in 1983, as have the regulatory tools used by independent agencies around the world to ensure the human right to privacy is respected as personal information is gathered, analyzed and stored in ever-expansive ways.

This initiative, which seeks to modernize the Privacy Act, is therefore timely and welcome.

Section 1. Putting Human Rights & Privacy First

Privacy is a human right, protected by Articles 12 and 17 of the Universal Declaration of Human Rights and the International Covenant on Civil and Political Rights, respectively, and includes the right to effective measures precluding unauthorized processing, the right to individual access, the right to openness of data handling processes, and the right to data accuracy.Footnote 1

In Canadian law, privacy is protected by sections 7 and 8 of the Charter, and laws that protect the right to privacy are classified as quasi-constitutional in nature.Footnote 2 This classification is “a reminder of the extent to which the protection of privacy is necessary to the preservation of a free and democratic society.”Footnote 3

The Privacy Act plays a critical role in securing the human right to privacy and, by extension, the fundamental values of autonomy, dignity and informational self-determination that underpin it.Footnote 4 The Act’s over-arching framework must be commensurate with its important constitutional nexus if the protection of privacy is to be fully realized.

A human rights centric approach recognizes the pre-eminent importance of privacy in relation to other, non-constitutional values such as administrative expediency.Footnote 5

Recommendation 1: Purpose clause must place privacy protection first

The Privacy Act’s purpose clause should be replaced with the following:

Purpose

X. The purpose of this Act is to extend the laws of Canada to protect individuals with respect to their personal information and in particular to protect the fundamental right to privacy in an era in which technology increasingly facilitates the accumulation, circulation and exchange of information.

A human rights approach must also ensure that incursions on the right to privacy only occur where it is proportionate and necessary to achieve a specifically articulated and legitimate objective.Footnote 6

Currently, a government agency may only collect personal information where it “relates directly” to an operational program or activity.Footnote 7 For decades, this was understood to limit collection to where it is both necessary and proportionate,Footnote 8 a requirement that was confirmed by non-binding Treasury Board of Canada Secretariat policies.Footnote 9 However, it is no longer clear that the Privacy Act legally requires government agencies to limit collection, use or disclosure of personal information to what is necessary and proportionate.Footnote 10 This is inconsistent with the standard for administrative collection imposed by other public sector laws across the country.Footnote 11

Under the existing Act, use and disclosure are constrained primarily by the purposes that animated the initial collection of personal information. The current statutory framework places no explicit limits on retention or on the types of purposes that can be advanced by an operating program. This is out of touch with modern data protection obligations, and holds private sector companies to a more stringent standard than their public sector counterparts.Footnote 12

In order to ensure the human right to privacy is adequately protected, the Privacy Act must encode an over-arching limitation ensuring that personal information will only be collected, used, retained or disclosed where proportionate and necessary to an objective that a reasonable person would consider appropriate in the circumstances.

Recommendation 2: Data practices must be necessary & proportionate

The Privacy Act must include an over-arching necessity and proportionality obligation:

X. Personal information will only be collected, used, disclosed or retained by a government institution where it is [strictly] necessary for and proportionate to purposes that a reasonable person would consider appropriate in the circumstances.

Section 2. Furthering Reconciliation

As noted in the Consultation Document, the Privacy Act plays an important role in state interactions with Indigenous Peoples. This modernization initiative offers an opportunity to improve reconciliation by formalizing a role for Indigenous groups to participate directly in decision-making that impacts the privacy interests of their communities.

The Truth and Reconciliation Commission of Canada’s (TRCC) Calls to Action emphasize, at Call to Action 43, the importance of the Indigenous right to self-determination as expressed in the United Nations Declaration on the Rights of Indigenous Peoples (UNDRIP).Footnote 13 Beyond self-determination, UNDRIP calls upon states to facilitate Indigenous Peoples’ “right to maintain and strengthen their distinct political, legal, economic, social and cultural institutions”Footnote 14 as well as the right to “maintain and develop their own indigenous decision-making institutions.”Footnote 15 Article 23 goes on to describe Indigenous Peoples’ right to develop and deliver programs through their own institutions,Footnote 16 while Article 31 describes the kinds of traditional knowledge and expressions thereof that Indigenous Peoples are entitled to maintain, control, and protect, including “human and genetic resources.”Footnote 17

The OCAP (Ownership, Control, Access, and Possession) Principles guide Indigenous communities “in making decisions regarding why, how, and by whom information is collected, used, or shared.”Footnote 18 These principles reflect the rights of Indigenous Peoples to maintain and control information about their communities, including reports, statistics, records, and other data. Entities such as the First Nations Information Governance Centre (FNIGC) have developed expertise and governance structures, based on the OCAP framework, to ensure that Indigenous community input is incorporated into decision-making that impacts data practices.

Embracing the OCAP Principles in the Privacy Act could be achieved through consultation with Indigenous groups. It could involve formalizing a role for the FNIGC and similar governance mechanisms representing other Indigenous groups within the parameters of the Act. This gatekeeping role would operate in addition to the protections in the Act, providing an additional layer of control. Depending on the nature and scope of the information, data could be repatriated to Indigenous groups, and governance bodies such as the FNIGC could operate as a ‘gatekeeper’ to that personal data.Footnote 19 Certain government activities, including health and demographic research, could be made contingent on approval by the FNIGC, in reliance on its community input mechanisms. Notably, FNIGC input must apply to data that is exempted from or imperfectly covered by the current Act, including de-identified and aggregate data.Footnote 20 Government use of this type of data can have significant impact on First Nations communities, and its use should reflect their communal privacy interests. Finally, the full realization of the OCAP Principles would necessarily entail increased funding for bodies such as the FNIGC and for other Indigenous communities to increase their own data governance capacity and exercise greater control over the information collected about them and held by federal government departments and agencies.

Additional protections for specific categories of data might also be adopted into the Privacy Act in consultation with Indigenous groups. For example, the Canadian Bar Association’s Privacy and Access Law and Aboriginal Law Sections have identified a risk that personal data of residential school survivors might be disclosed without their explicit consent.Footnote 21 Disclosure without input in this context could “result in deep discord within the communities whose histories are intertwined with that of the residential schools system” and would undermine reconciliation.Footnote 22

Recommendation 3: Incorporate Indigenous Community Control
  • In consultation with Indigenous groups, explore formalizing a role for community input mechanisms such as the First Nations Information Governance Centre.
  • Control and input provided through Indigenous governance bodies should expand the protections available in the Act, and apply to personal data that is currently imperfectly captured by the Act such as de-identified or statistical demographic data.
  • Funding must also be made available for Indigenous communities seeking to exercise community control over their information.
  • In consultation with Indigenous groups, some categories of data might require additional and explicit protection within the Act.

Section 3. An Effective Regulatory Framework

An effective regulatory framework must place an independent regulator at its core. Currently, the role delegated to the Privacy Commissioner is deficient in two key respects.

First, the consultation document notes that the Act lacks a comprehensive compliance framework. This is an understatement. Currently, the Office of the Privacy Commissioner of Canada is only able to issue non-binding recommendations when responding to complaints, and even its access to enforcement through the de novo review mechanism is curtailed and limited to individual access denials. Meaningful respect for the right to privacy in the modern era demands an independent regulator with the ability to apply the law where necessary.

Second, under the Act, the Treasury Board of Canada Secretariat (TBS) is the primary vehicle for implementing the Act’s substantive obligations. This provides no latitude for the Privacy Commissioner to impose regulatory policies regarding specific practices. While TBS has an important and central ongoing role to play in privacy management, in an age where technological innovation fuels rapid and tectonic changes in government practices, independent regulatory frameworks have an instrumental role to play.

As a result, the Privacy Act lacks a meaningful regulatory framework commensurate with the important quasi-constitutional rights it is intended to protect.

The current framework for regulatory control and oversight under the Privacy Act is deficient. Under this framework, the Act is applied by government institutions through their respective policies and practices, through TBS which issues government-wide policies and directives, through judicial review of government conduct, or through the auspices of the Privacy Commissioner, who can issue non-binding findings in response to complaints or investigations. A modernized Privacy Act would address the shortcomings in this framework.

3.1 Improving accountability for frontline agencies

Frontline responsibility for applying the Act rightly rests, pragmatically, with government institutions and agencies through their respective policies and practices.Footnote 23 There is currently no explicit accountability obligation in the Act, and the requirement to operate a structured privacy management program arises primarily from non-binding TBS policies.Footnote 24 The consultation document recommends encoding this obligation within the text of the statute.Footnote 25

Accountability obligations are becoming a regular feature of data protection frameworks, including those applying to public sector bodies.Footnote 26

Formalizing an accountability framework within the text of the statute would improve compliance by explicitly empowering the Privacy Commissioner to review privacy management programs and impose conditions on their structure and constituent elements.

Recommendation 4: Formalize Accountability obligations

Accountability obligations currently hosted primarily in TBS policies should be formalized within the text of the Act and subject to review and detailed elaboration by the Privacy Commissioner.

The consultation document also recommends codifying an obligation to conduct privacy impact assessments within the text of the statute. As with other accountability mechanisms, there is currently no explicit obligation within the Privacy Act to conduct privacy impact assessments (PIAs). PIAs are intended to allow for early analysis and review of the privacy and related human rights and ethical implications of an intended activity, with the objective of building trust and confidence in government data practices.Footnote 27 TBS policies recognize a non-binding requirement to conduct privacy impact assessments for new or substantially modified programs and activities where personal information is implicated.Footnote 28

Several features of the current PIA framework undermine its ability to instill public confidence in government privacy practices. To begin with, the requirement to conduct a privacy impact assessment is a non-binding policy, meaning it can be ignored on a case-by-case basis, while the standard for conducting PIAs can be changed by TBS at any time.Footnote 29 Second, while TBS requires agencies to notify the Privacy Commissioner of any privacy-impacting initiatives at an early stage of development, agencies are only required to consult with internal government stakeholders until the PIA has been finalized.Footnote 30 Finally, there is no obligation to consult with external stakeholders or with the public in general during the program development process, and it is only recommended that some elements of a PIA be made public once the PIA has been finalized and approved.Footnote 31 As a result, highly invasive programs may progress through testing, piloting and advanced implementation stages before meaningful external input is provided.Footnote 32

The Privacy Act must impose a legal impact assessment obligation, triggered by the intention to adopt or expand any program or course of action that implicates personal information. The Privacy Act must also provide the Privacy Commissioner with latitude to impose binding obligations regarding the nature and rigour with which different impact assessments occur, taking into account factors such as the sensitivity of the personal information at issue and the risk of harm. The impact assessment process must be broad enough to permit a range of considerations,Footnote 33 and must include space for public input, in particular with respect to the adoption of potentially controversial technologies and programs.Footnote 34 It is notable in this respect that regulators in Australia and the United Kingdom have indicated the need to consult with relevant civil society stakeholders and the general public as integral steps in meeting the legally mandated privacy impact assessment requirements in their respective privacy statutes.Footnote 35

Finally, the Privacy Act or accompanying binding regulatory guidance from the Privacy Commissioner must mandate that full privacy impact assessments be published by default.Footnote 36 Any redactions or omissions must be justified with reference to clearly established rationales, such as those encoded as exceptions to the right of access in section 12 of the Act. In its inaugural report, the government’s National Security Transparency Advisory Group (NS-TAG) noted the “reflexive secrecy” of many government agencies and flagged the need for more aggressive and proactive disclosure in order to foster the democratic resilience and credibility of public agencies.Footnote 37 There is simply no justification for continuing the current Canadian practice, under which short and incomplete summaries of PIAs are published by default and individuals seeking further information are forced to rely on the auspices of the Access to Information Act.

Recommendation 5: Encode Rigorous Privacy Impact Assessment Obligations
  • Encode an obligation to conduct robust impact assessments at the earliest possible opportunity in the adoption and development cycle of privacy-impacting programs.
  • Assessments must be conducted under the close supervision of the privacy commissioner and, where appropriate, with public and civil society input, and finalized impact assessments must be made public by default.
  • The scope of impact assessments must be broad enough to encompass ethical and human rights impacts arising from the processing of personal information.
  • The privacy commissioner must be able to issue binding regulatory guidance on the conditions for undertaking privacy impact assessments, the criteria which guide how rigorous a given assessment must be, and the need to consult with expert stakeholders or the general public.

3.2 Applying & Enforcing the Act

The Act must also be amended to provide a more robust regulatory framework.

The addition of a mechanism to conclude legally binding agreements with government agencies is a helpful recommendation. The Privacy Commissioner must also be empowered to issue binding orders when resolving investigations under section 29 of the Act, including the power to order remedial measures where appropriate to ensure compliance with the Act. The appeal path from such orders must not replicate the de novo review currently employed as the primary enforcement mechanism under PIPEDA (which regulates privacy in the private sector) and for elements of the Privacy Act. De novo review is an overly intensive enforcement mechanism that ignores the specialized expertise of a quasi-judicial body such as the Office of the Privacy Commissioner.

In addition, the Privacy Commissioner must be free to conduct regulatory oversight and control of public agencies outside the strictures of the complaint-driven process. In this respect, the capacity to conduct commissioner-initiated investigations under section 29(3) must be broadened by removing the reasonable grounds limitation. It is unclear why a threshold investigative standard should operate to limit the Commissioner’s capacity to ensure public agencies are respecting the Privacy Act.

In addition, the Commissioner must be empowered to issue binding regulatory guidance in a proactive manner. Currently, TBS is the primary entity through which non-binding directives and guidelines are applied across all government agencies.Footnote 38 While procedural safeguards such as the need for a public inquiry and a statutory appeal would be important to incorporate into the development of binding regulatory policies, the Commissioner must be empowered to issue proactive guidance even in the absence of an existing violation of the Act.

Recommendation 6: Grant the Privacy Commissioner True Regulatory Powers
  • Empower the Privacy Commissioner to enter into legally binding agreements with government agencies.
  • Empower the Privacy Commissioner to issue binding orders subject to deferential appeal by any impacted individual or agency when resolving investigations into violations of the Act.
  • Empower the Privacy Commissioner to conduct investigations into agency practices even in the absence of reasonable grounds to believe an investigation is required.
  • Empower the Privacy Commissioner to proactively issue binding regulatory frameworks, following a public proceeding and subject to deferential appeal by any impacted individual or agency.

Finally, in order to incentivize compliance with the Act and to ensure public agencies are held to at least the same standard as private companies, a private right of action should be incorporated into the Privacy Act. Ideally, this right of action would include statutory damages for specific types of violations. At minimum, however, the mechanism should allow for damages where violations of the Privacy Act result in harm. A model for such a provision exists in Bill C-11, which currently proposes to amend PIPEDA by recognizing the right of any individual to seek damages in any superior court of record for any loss or injury suffered as a result of a contravention of that statute.Footnote 39 A similar cause of action would be important to ensure accountability for public agencies.

Recommendation 7: Encode a Private Right of Action for Resulting Harms

Encode a private right of action that would at minimum recognize the right of individuals to seek damages in any superior court for any loss or injury resulting from a confirmed contravention of the Act.

Section 4. Improving Transparency Measures

The ability to collect information about individuals without their direct participation has grown exponentially in the years since the Privacy Act was initially enacted. Currently, government agencies are only required to report annual statistics regarding the frequency and nature of certain types of electronic surveillance, while the majority of third party collection remains unreported.Footnote 40 It is important for the public to understand the scope and parameters of government data collection from external sources, particularly in light of the tendency for rapid changes in data collection practices of this type. The Privacy Act should require government departments and agencies to report annually on the scale and nature of personal data collected directly from third party sources.

As the Consultation Document notes, agencies should also be required to publish privacy notices, explaining to the public their personal information practices. The Privacy Act should include an additional openness obligation, requiring government agencies to elaborate on their practices when receiving requests from individuals. Most private sector organizations in Canada are subject to such openness obligations, and government agencies should not be held to a lower standard.Footnote 41

Finally, the Privacy Act should require government agencies to notify individuals where their sensitive personal data has been collected without their participation, subject to exceptions relating to investigative necessity. Currently, government agencies are able to surreptitiously undertake intrusive programs with no public accountability at all and without impacted individuals ever becoming aware.

For example, in 2018, Statistics Canada undertook an intrusive and wide-ranging program that collected the sensitive financial information of 24 million Canadians from financial institutions.Footnote 42 This program only came to public attention through unofficial channels, whereas the agency itself stressed its view that it was under no legal obligation to inform impacted individuals of their inclusion in the program.Footnote 43 The operation and secrecy of intrusive, disproportionate and wide-ranging programs of this nature undermines trust in public agencies, and the lack of any legal recourse exacerbates this lack of trust.

Recommendation 8: Encode Statistical Reporting & Individual Notice Requirements
  • Obligate government agencies to publish annual statistics on the scope and nature of their third party data collection and disclosure programs.
  • Obligate government agencies to publish privacy notices outlining their privacy practices.
  • Obligate government agencies to take reasonable steps to inform individuals of the details of their privacy practices upon request.
  • Obligate government agencies to notify individuals when their sensitive personal information is collected, used or disclosed without individual participation.

Section 5. Safeguards & Breach Notification

While the Privacy Act currently implicitly requires government institutions to prevent unauthorized disclosure, there is no explicit obligation to adopt security safeguards or to notify impacted individuals should a data breach occur. In addition, the Act should explicitly prohibit agencies from undermining cryptography and other central security mechanisms.

Prominent security breaches in both the private and public sector are no longer infrequent occurrences and can greatly undermine trust in government data stewardship.

The reform initiative should formalize the obligation to adopt security safeguards and to notify the Privacy Commissioner, as well as any impacted individuals, should a breach of these safeguards occur. The Act should establish rigorous standards for these new safeguard and notification obligations.

Recommendation 9: Set high standard for safeguards & breach notification
  • The Act should obligate the adoption of robust technical and organizational safeguards to prevent unauthorized access to personal information under government control.
  • The Act should obligate agencies to notify the Privacy Commissioner of Canada of any non-trivial breach of security safeguards and to notify any potentially impacted individuals.

The Privacy Act should also adopt measures that will improve the background conditions for cybersecurity. Cybersecurity is effectively an arms race to secure data against adversaries with constantly evolving capabilities and sophistication. Securing data in this context is difficult, and it is important to adopt measures that will create a more favourable environment for those seeking to do so.

To this end, the Act should explicitly prohibit government actions that undermine security at a systemic level. Historically, various government agencies have controversially acted to weaken core security tools such as encryption protocols and implementation mechanisms.Footnote 44 As recommended in 2019 by the House of Commons Standing Committee on Public Safety and National Security (SECU), government agencies should prioritize general cybersecurity over other objectives, such as the impetus to undermine critical safeguards in order to facilitate greater access to personal data.Footnote 45 The Act should explicitly prohibit agencies from undermining or limiting encryption.

Additionally, the Act should provide a regulatory basis for addressing security vulnerabilities and exploits, including the impetus for a framework addressing vulnerability equities and disclosure programs.

Too often, security researchers in Canada have faced serious legal threats and liability for good faith attempts to notify government agencies of security lapses.Footnote 46 CIPPIC has additionally worked with security researchers who were deterred from disclosing uncovered vulnerabilities due to the potential threat of civil or criminal liability. Many government agencies and companies around the world have formalized ‘safe harbours’ for individuals who wish to disclose security vulnerabilities.Footnote 47

Formalized Vulnerability Disclosure Programs yield a three-fold benefit. First, they establish parameters for appropriate security analysis to guide researchers. Potentially problematic security research practices (e.g. exfiltrating personal information where doing so is unnecessary for a proof of concept, or publishing an exploit before an agency has had the opportunity to address it) are frequently forestalled simply by providing authoritative and clear guidance on what is and is not appropriate. Second, such programs substantially improve the security of government systems by removing barriers that otherwise chill security researchers from disclosing critical vulnerabilities. Finally, it is fundamentally unfair to punish well-meaning researchers who are simply attempting to point out security flaws in government networks and services. Even where such researchers ultimately mount a legal defence and are acquitted, the cost and trauma are an unjust reward for purely non-malicious endeavours. While the details and specifics of any safe harbours for security researchers are perhaps best addressed by individual agencies with guidance from TBS and under the oversight of the Privacy Commissioner, the impetus for this safe harbour must be encoded in the Act.
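By way of illustration, the core parameters of a disclosure policy are simple enough to express programmatically. The following sketch is purely illustrative: the scope, deadline and prohibited-conduct values are hypothetical placeholders, not drawn from any existing agency program.

```python
from dataclasses import dataclass, field
from datetime import date, timedelta

# Hypothetical policy parameters; real values would be set by each
# agency under TBS guidance and Privacy Commissioner oversight.
DISCLOSURE_DEADLINE = timedelta(days=90)   # coordinated-disclosure window
IN_SCOPE_DOMAINS = {"example.gc.ca"}       # systems covered by the safe harbour
PROHIBITED_ACTS = {"data_exfiltration", "service_disruption"}

@dataclass
class VulnReport:
    reporter: str
    target_domain: str
    received: date
    acts_performed: set = field(default_factory=set)

def triage(report: VulnReport) -> str:
    """Classify an incoming report against the hypothetical policy."""
    if report.target_domain not in IN_SCOPE_DOMAINS:
        return "out of scope: refer the reporter to the responsible agency"
    if report.acts_performed & PROHIBITED_ACTS:
        return "review: conduct may fall outside the safe harbour"
    publish_by = report.received + DISCLOSURE_DEADLINE
    return f"accepted: remediate and coordinate publication by {publish_by}"

print(triage(VulnReport("researcher", "example.gc.ca", date(2021, 2, 1))))
```

The point of the sketch is that a safe harbour becomes predictable for researchers precisely when its scope, prohibited conduct and disclosure timelines are stated this explicitly in advance.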

Government agencies also frequently become aware of vulnerabilities in off-the-shelf systems in the course of their general activities. For some agencies, questions surrounding the treatment of known vulnerabilities are complicated by conflicting lawful access objectives. For such organizations, a known but undisclosed vulnerability is an opportunity that can be exploited to access a secure system for public safety, national security or broadly framed foreign intelligence purposes. As a result, known vulnerabilities sometimes remain unreported and are even stockpiled by some agencies, even while the same vulnerabilities undermine the security of other Canadian government systems. In extreme cases, unauthorized access to vulnerability stockpiles collected by government and private sector agencies has led to widespread harm, as the unpatched vulnerabilities in health, financial, educational and government systems were widely exploited.Footnote 48 We note that effective defensive security does not require the stockpiling of previously unreported zero day exploits.Footnote 49

With this in mind, when organizations balance competing offensive and defensive objectives in order to assess exploit disclosure, it is critical that this occur with independent input and within a legal framework. Many agencies operate a Vulnerability Equities Program (VEP), but the parameters and operation of VEPs remain deeply secret and are not clearly subject to rigorous regulatory oversight.Footnote 50 Other agencies may lack a formal VEP altogether, despite relying on vulnerabilities in the broader course of their public safety and national security activities. The Privacy Act should therefore require agencies to emphasize systemic security when assessing whether to publicize discovered vulnerabilities. The Privacy Commissioner should also be empowered to compel specific agencies to adopt and publicize VEPs, and to audit the operation of these programs.

Recommendation 10: Secure conditions for robust encryption & vulnerability disclosure
  • Prohibit agencies from actively undermining critical security protocols and tools such as encryption.
  • Create a legislative safe harbour for responsible security disclosure of vulnerabilities to agencies.
  • The Act should obligate agencies to emphasize systemic cybersecurity considerations when assessing whether to publicize discovered vulnerabilities.
  • Empower the Privacy Commissioner to compel specific agencies to adopt and publicize formal Vulnerability Equities Programs and to audit the operation of these programs.

Section 6. Publicly Available & De-identified Data

The Consultation Document contemplates changing two central gate-keeping concepts in the Act. The Document asks whether the Act’s exception for publicly available information should be narrowed and whether a definition should be adopted for when personal information is not ‘identifiable’ and therefore falls outside the protections of the Act. Both concepts must be defined in an expansive manner, so that categories of data are not excluded from the Act’s protective scope.

6.1 Publicly Available Information

Sub-section 69(2) of the Privacy Act currently excludes publicly available personal information from the Act’s use and disclosure protections. The Consultation Document suggests that ‘publicly available’ should be defined and the scope of the exception should be narrowed. We agree.

It should be noted at the outset that Canadian courts have long recognized that privacy expectations persist in public spaces. As the Consultation Document states, privacy laws have also historically applied to publicly available personal information in Canada.

This is for good reason, and the monitoring of social media sites provides a cogent example. Social media is user-generated content. It comes in many forms, from text to images to videos, and is a window into the lives of the content producer. Approximately 95.9% of all Twitter users have fewer than 500 followers,Footnote 51 while the average Facebook user has 338 friends and considers only 28% of them to be authentic.Footnote 52 Many platforms also have Terms of Service that restrict third-party capturing, storing and indexing. The social context in which these interactions occur reinforces a strong expectation that they will not be repurposed to achieve any of a range of state objectives.

The detailed and sensitive information that can be collected through social media and the deep insights that can be drawn from open source intelligence demonstrate the need for a regulatory framework of some form. This is particularly the case as Indigenous Peoples and members of other marginalized communities are frequently the objects of monitoring. The Royal Canadian Mounted Police’s Project Sitka, for example, mapped Indigenous protest movements in Canada through intensive social media monitoring. Despite the predominantly peaceful nature of the protests, Project Sitka tracked hundreds of individuals through social media and recorded their affiliation with various political protests and causes.Footnote 53 The National Energy Board has similarly engaged in monitoring of Indigenous and other participants in a forthcoming regulatory hearing.Footnote 54 And in 2010, Aboriginal Affairs and Northern Development Canada surreptitiously monitored the personal Facebook postings of a First Nations activist who was engaged in litigation with the Department.Footnote 55 In each of these instances, the potential for abuse is high, underscoring the need to maintain a regulatory regime.

The Act should therefore be amended to remove the broad existing exception for use or disclosure of publicly available information. Some targeted exceptions might be adopted in its place, permitting specific data practices in relation to personal information published on non-interactive sites, such as newspapers, or for purposes that relate directly to those for which the personal data was disclosed.

Recommendation 11: Ensure publicly available information is sufficiently protected

Remove the existing exceptions for use and disclosure of publicly available information and replace with targeted exceptions for specific data handling of personal information from non-interactive publications.

6.2 Identity Obfuscation Techniques

The Consultation Document also suggests encoding an explicit role for identity obfuscation techniques within the Privacy Act. As noted in the document, de-identification and other identity obfuscation techniques can operate as a meaningful privacy safeguard. However, de-identification techniques should not operate as wide-ranging exceptions to the Act’s core protections.Footnote 56

Few identity obfuscation techniques are flawless, particularly when applied at large scale, while re-identification capabilities continue to develop.Footnote 57 As a result, organizations frequently underestimate the risk of re-identification, which has led to a number of instances where individuals have been re-identified by third parties. In addition, de-identified datasets and models can increasingly impact individuals in myriad ways.Footnote 58
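The fragility of de-identification can be illustrated with a simple uniqueness measurement over quasi-identifiers. The sketch below uses an invented toy dataset; the quasi-identifier columns and the k-anonymity test are standard re-identification-risk techniques, not a prescribed methodology.

```python
from collections import Counter

# Toy "de-identified" records: direct identifiers removed, but
# quasi-identifiers retained. All data here are invented.
records = [
    {"fsa": "K1N", "birth_year": 1984, "sex": "F"},
    {"fsa": "K1N", "birth_year": 1984, "sex": "F"},
    {"fsa": "K2P", "birth_year": 1962, "sex": "M"},
    {"fsa": "M5V", "birth_year": 1990, "sex": "F"},
]

def k_anonymity(rows, quasi_identifiers):
    """Smallest group size over the quasi-identifier columns.
    k = 1 means at least one record is unique on these attributes and
    therefore linkable to external data (voter rolls, social media)."""
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in rows)
    return min(groups.values())

print(k_anonymity(records, ["fsa", "birth_year", "sex"]))  # -> 1: linkage risk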

As a result, de-identified data which constitutes personal information must remain subject to the Act’s core protections, including its consistent purpose and consent requirements. Enrollment in a personal information bank where personal information has been de-identified to some degree must continue to constitute a ‘use’, a ‘disclosure’, or a ‘creation’ of personal data.

De-identification can play an important role as a technical safeguard for personal information, and in particular can implicate the proportionality of a given data program. For example, depending on the use and personal information at issue, less rigorous notification or consent mechanisms may be appropriate where robust de-identification is applied.

Recommendation 12: Identity obfuscation techniques as safeguards, not exclusions
  • The Act and all of its protections should continue to generally apply to the collection, use, disclosure and retention of information about an identifiable individual.
  • Identity obfuscation techniques should be recognized as important technical safeguards, as well as a factor in assessing the proportionality of any given data practice.
  • Re-identification risk might be directly incorporated into specific measures in the Act, such as determining whether consent can be implied in order to achieve a non-consistent but publicly important and non-adversarial objective.
  • Re-identification attempts should be directly prohibited and penalized in the Act.

Section 7. Cross-Border Disclosures

The Consultation Document suggests encoding an accountability regime for the processing or storage of personal data by organizations that reside outside of Canada.

This form of outsourcing can pose a risk to privacy, in particular because such organizations are subject to foreign state access laws and frequently operate under different norms. Cross-border flows are particularly complicated by the broad scope of national security regimes, which place few limits on the ability of signals intelligence agencies to access private data of non-citizens.Footnote 59 Private sector companies are limited in their ability to resist lawful access demands from foreign state agencies, a limitation that increases when personal information is stored or processed in foreign jurisdictions.Footnote 60

Currently the Privacy Act regulates disclosures of personal information, including the need to prevent unauthorized disclosure. As a result, risks associated with cross-border personal information flows are considered as a component of the broader impact assessment that accompanies any disclosure of personal information from a government agency to a private sector entity for storage or processing. To that end, in some contexts, measures must be taken to ensure personal information does not flow outside of Canada whereas in other contexts contractual or technical measures must be taken to ensure comparable privacy and security standards.Footnote 61 The level of risk is currently assessed based on the sensitivity of the information in question, the reasonable and contractual expectations of impacted individuals, and the probability and gravity of a resulting harm.Footnote 62

The Act could encode an explicit obligation on government agencies to ensure a comparable level of protection when personal information is disclosed in a manner that will expose it to the laws and standards of foreign jurisdictions. Realization of this obligation may be contingent on the existing impact assessment process, under the supervision of TBS and the regulatory oversight of the Privacy Commissioner of Canada. This risk assessment process could be augmented by a personal information classification schema, which would assign different risk levels depending on the sensitivity of the personal information, the incentives for misuse that it attracts, and any benefits that are contingent on the personal information being processed abroad.
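By way of illustration only, such a classification schema could take the form of a simple scoring rubric; the categories, weights and thresholds below are invented placeholders rather than proposed values.

```python
# Hypothetical cross-border risk rubric combining the three factors named
# above: sensitivity, incentives for misuse, and any benefit contingent on
# foreign processing. All categories and weights are invented.
SENSITIVITY = {"public": 0, "internal": 1, "sensitive": 2, "biometric": 3}

def transfer_risk(sensitivity: str, misuse_incentive: int, foreign_benefit: int) -> str:
    """Combined score; misuse_incentive rated 0-3, foreign_benefit 0-2."""
    score = SENSITIVITY[sensitivity] + misuse_incentive - foreign_benefit
    if score >= 4:
        return "high: keep data in Canada or require equivalent legal protection"
    if score >= 2:
        return "medium: contractual and technical safeguards required"
    return "low: standard safeguards"

print(transfer_risk("sensitive", misuse_incentive=3, foreign_benefit=1))  # -> high
```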

Recommendation 13: Formalize Cross-Border Transfer Regime

The Privacy Act should include an explicit obligation governing cross-border disclosure:

X. A Government Institution must ensure that any international transfer of personal information under its control or that results from a disclosure of personal information will provide a comparable level of privacy protection.

Section 8. Automated Decision-making

The Privacy Act must adopt and maintain robust measures for oversight of automated decision-making. Reliance on algorithmic tools is growing rapidly with significant implication for impacted individuals, underscoring the need to retain and expand current obligations in the Act.Footnote 63

Currently, the Privacy Act and related non-binding Treasury Board of Canada Secretariat (TBS) policies impose various obligations with respect to reliance on algorithmic tools.

Sub-section 6(2) of the Privacy Act requires agencies to “take all reasonable steps to ensure that personal information that is used for an administrative purpose by the institution is as accurate, up-to-date and complete as possible.”Footnote 64 “Administrative purpose” is currently defined as “the use of [personal] information in a decision making process that directly affects [an] individual”.Footnote 65

This obligation leads to a number of implications for automated decision-making systems. First, algorithmic systems rely on large datasets in order to ‘learn’ new automated tasks. Where these automated tasks include decisions that impact individuals, the underlying datasets must be accurate. Second, data-driven analytical tools (including algorithmic systems) rely on personal data in their operation and must be carefully calibrated to ensure that they lead to accurate results when relied upon in government decision-making processes that impact individuals.Footnote 66 As many automated decision-making tools are characterized by deep racial biases, disproportionate impact on marginalized groups must be actively avoided.Footnote 67
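The kind of bias audit this obligation implies can be illustrated by disaggregating a system’s error rates by group. The sketch below uses invented labels and predictions; a false-positive-rate comparison is one standard fairness metric among several.

```python
# Minimal disparate-impact audit: compare false-positive rates across
# groups. Labels, predictions and group tags are invented for illustration.
y_true = [0, 0, 1, 0, 0, 1, 0, 0]   # ground-truth outcomes
y_pred = [1, 0, 1, 0, 1, 1, 1, 0]   # automated system's decisions
group  = ["a", "a", "a", "a", "b", "b", "b", "b"]

def false_positive_rate(truth, pred):
    """Share of true negatives that the system wrongly flags."""
    negatives = [(t, p) for t, p in zip(truth, pred) if t == 0]
    return sum(p for _, p in negatives) / len(negatives)

for g in sorted(set(group)):
    t = [y for y, grp in zip(y_true, group) if grp == g]
    p = [y for y, grp in zip(y_pred, group) if grp == g]
    print(g, round(false_positive_rate(t, p), 3))
# -> a 0.333, b 0.667: group "b" is wrongly flagged twice as often
```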

The Consultation Document recommends expanding sub-section 6(2) beyond personal information used in automated decision-making by broadening the current definition of ‘administrative purpose’. While a more general ‘fit for purpose’ definition of administrative purpose would improve the protective scope of the Privacy Act, any amendments should, at minimum, retain an emphasis on the importance of personal information used in decision-making processes. An improved provision would explicitly encode a general obligation to ensure that any automated tools used in administrative decision-making are accurate.Footnote 68 In terms of specific vetting criteria, an ongoing role for TBS and the Office of the Privacy Commissioner of Canada is appropriate. In this context, TBS can issue policies, the OPC can impose binding regulatory guidance, and the OPC can apply fact-specific obligations when conducting impact assessments and investigations.

Automated decision-making processes are currently also guided by non-binding TBS policies, including the TBS Directive on Automated Decision-Making and a TBS Algorithmic Impact Assessment tool.Footnote 69 The Directive adopts specific measures addressing transparency, accuracy and the need for human supervision, with measures of increasing rigour recommended based on the anticipated ‘impact’ of a given decision-making system, while the Impact Assessment tool assists agencies in determining the gravity of this impact.Footnote 70

Elements of this framework should be encoded directly into the Privacy Act, specifically the obligation to ensure human supervision of any automated decision-making system that impacts individuals. A lower threshold for triggering the obligation of human intervention should be adopted, modelled on the European regime, which prohibits fully automated decision-making where any legal or similarly significant effects may result for an individual.Footnote 71 The Privacy Commissioner should establish, through binding regulatory guidance, what criteria and considerations meet these impact thresholds,Footnote 72 while TBS would manage implementation.Footnote 73
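The human-intervention threshold described above amounts to a simple gate in any decision pipeline. The following sketch is hypothetical: the impact test stands in for criteria that the Privacy Commissioner would define, and the model is a placeholder.

```python
# Hypothetical gate implementing a right to a human decision-maker,
# modelled loosely on the European approach: any legal or similarly
# significant effect routes the file to a human reviewer.
def decide(file, automated_model, has_significant_effect):
    score = automated_model(file)
    if has_significant_effect(file):
        # Fully automated decisions are prohibited at this impact level;
        # the score becomes advisory input for a human official.
        return {"decision": "refer_to_human", "advisory_score": score}
    return {"decision": "approve" if score > 0.5 else "deny", "score": score}

# Example usage with stand-in callables.
result = decide(
    {"benefit_claim": True},
    automated_model=lambda f: 0.8,
    has_significant_effect=lambda f: f.get("benefit_claim", False),
)
print(result)  # -> refer_to_human: a human official makes the final call
```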

The Act should similarly adopt public transparency obligations: requiring agencies to notify the public in advance of the adoption of algorithmic tools in any decision-making process, to explain the underlying general logic of an algorithmic assessment to any individual impacted by it, to provide details regarding the composition of any training datasets, and to publicize theoretical and ongoing accuracy and bias rates.

The Act’s accuracy obligation could be met through a range of measures, depending on the impact of the anticipated tool. The Privacy Commissioner of Canada should be explicitly empowered to determine the specific criteria that should trigger different levels of rigour in meeting this obligation. Adoption of impactful automated decision-making systems must continue to be preceded by independent peer review, as well as consultation with stakeholders, including in particular with members of systemically marginalized communities.Footnote 74 TBS should also be responsible for periodically auditing decision-making systems for accuracy and bias, and the Privacy Commissioner may require independent third party audits in appropriate instances.

Recommendation 14: Encode a Regulatory Framework for Automated Decision-Making
  • The Act’s accuracy obligations should explicitly encompass an obligation to ensure data-driven assessment tools are as accurate as possible.
  • The Act should include an explicit right to a human decision-maker at least where administrative conduct impacts the legal or other substantial interests of individuals.
  • The Act should explicitly obligate state agencies to be transparent in their use of automated decision-making tools. Measures include disclosing what tools are in use, publishing the composition of training datasets employed, and publishing accuracy and racial bias rates on an ongoing basis.
  • The Act should encode an individual right to be notified of any automated assessment tool that contributed to a decision that impacted them and to request an explanation of the logic underlying that assessment.
  • The Act should empower the Privacy Commissioner of Canada to compel different accuracy and transparency measures depending on the potential impact of the automated decision-making system in question and to audit systems for impact and bias.

Section 9. Biometric Protections

Biometric information poses a distinct threat to privacy. The underlying information is inherently sensitive, in that it is intricately linked to an individual’s physical being or core behavioural traits. State deployments of biometric capabilities are also frequently intrusive in nature, while the creation and adoption of such capabilities frequently occur surreptitiously, without public input.Footnote 75

Legal regimes around the world increasingly adopt dedicated legal protections for biometric information.Footnote 76 Regional public and private sector entities have imposed moratoriums on some particularly intrusive forms of biometric recognition.Footnote 77

The Privacy Act should explicitly recognize the sensitive nature of biometric information. Any specific course of conduct by a government agency that involves biometric information should trigger the highest and most robust protections available. This could involve, for example, requiring a more overt form of consent or a more stringent proportionality and necessity analysis.

The Act should also empower the Privacy Commissioner to impose specific conditions on the adoption and use of particularly intrusive biometric tools, including the requirement for prior approval by the Commissioner in the absence of express statutory authorization.Footnote 78

Recommendation 15: Additional Protection for Biometrics
  • The Act should explicitly recognize biometric information as sensitive, triggering the most robust protections available under the Act.
  • The Act should empower the Privacy Commissioner to impose specific conditions on the adoption and use of particularly intrusive biometric systems.