Written by Sébastien Gittens, Stephen Burns and HC Lee
Foreshadowing potential significant changes to the federal Personal Information Protection and Electronic Documents Act (PIPEDA), the Office of the Privacy Commissioner of Canada (OPC) is currently seeking input from interested third parties regarding its recommended changes to PIPEDA in the context of artificial intelligence (the "Consultation Document").
Based on the Consultation Document, the OPC believes that PIPEDA "falls short in its application to AI systems", and that AI "presents fundamental challenges to all foundational privacy principles as formulated in PIPEDA." Given such perceived shortcomings, the OPC has created a list of 11 "proposals for consideration" to amend PIPEDA—many of which have direct parallels to the EU's General Data Protection Regulation (GDPR).
The key takeaways from these proposals include the following:
- Defining "Artificial Intelligence" in PIPEDA. The OPC is considering defining "artificial intelligence" in PIPEDA to accommodate any AI-specific rules that may be introduced therein.
- Adopting a rights-based approach to protect individual privacy rights. The OPC is of the view that a GDPR-like approach should be adopted, ensuring that PIPEDA "… be given a rights-based foundation that recognizes privacy in its proper breadth and scope, and provides direction on how the rest of the Act's provisions should be interpreted." Reflecting the 2019 Resolution of Canada's Federal, Provincial and Territorial Information and Privacy Commissioners, the OPC accordingly asserts that AI should be "designed, developed and used in respect of fundamental human rights, by ensuring protection of privacy principles such as transparency, accountability, and fairness."
- Creating a right to object to automated decision-making, with exceptions. The OPC proposes that individuals should have a right to object to automated decision-making and profiling similar to the rights under GDPR.
- Creating a right to explanation and increased transparency when individuals interact with or are subject to automated processing. The OPC proposes that individuals should have a right to an explanation of the reasoning underlying any automated processing of their data, and the consequences of such reasoning for their rights and interests. The OPC also proposes further transparency measures in PIPEDA, including performing privacy impact assessments regarding AI systems or public filing of algorithms (similar to U.S. Securities and Exchange Commission filings), with penalties for non-disclosure and non-compliance.
- Requiring Privacy by Design and Human Rights by Design in processing, including data collection. The OPC proposes that the following requirements be incorporated into PIPEDA: (i) the "Data Protection by Design" concept articulated in Article 25 of the GDPR (which includes "putting in place appropriate technical and organizational measures designed to implement the data protection principles and safeguard individual rights and freedoms"); and (ii) the completion of an algorithmic impact assessment prior to the production of any automated decision system.
- Making compliance with purpose specification and data minimization principles realistic and effective. The OPC acknowledges that it "… may be difficult to specify purposes that only become apparent after a machine has identified linkages." As such, the OPC seeks input and discussion on whether the principles of purpose specification and data minimization are workable in the AI context, or whether there are alternatives to achieve the same goals.
- Including in the law alternative grounds for processing, and solutions to protect privacy, when obtaining meaningful consent is not practicable. The OPC believes that meaningful consent should be required "in the first instance for transparency and to preserve human agency." However, the OPC proposes that alternative grounds should be available in instances where meaningful consent is not possible and prescribed conditions are met (e.g., demonstrating that obtaining consent was considered impracticable and a privacy impact assessment was performed). The OPC seeks input on the content and criteria for such alternative grounds.
- Allowing for flexibility in using non-identifiable information and ensuring there are enhanced measures to protect against re-identification. To address the risk of re-identification, the OPC is of the view that PIPEDA should continue to apply to de-identified/non-identifiable information, but that there could be flexibility to use such information (e.g., certain PIPEDA principles (such as consent) could either not apply or their application could be relaxed). In addition, the OPC proposes that there should be penalties for negligent or malicious actions resulting in re-identification of personal information.
- Requiring organizations to ensure data and algorithmic traceability. The OPC proposes that PIPEDA should include a requirement for algorithmic traceability to meet the goals of accountability, accuracy, transparency and data minimization, as well as access and correction. The OPC seeks input on the necessity of algorithmic traceability, or whether there are alternative means to meet these stated goals.
- Requiring demonstrable accountability for development and implementation of AI processing. The OPC proposes mandating "demonstrable accountability" under PIPEDA. This would require organizations to prove compliance with legal requirements on request. Methods of achieving such accountability include traceability, explanation rights, privacy impact assessments, independent third-party auditing and record keeping requirements to facilitate proactive inspections by the OPC.
- Empowering the OPC to issue binding orders and financial penalties to organizations for non-compliance with the law. The OPC believes that PIPEDA should include enhanced enforcement mechanisms to provide "quick and effective" remedies for individuals. In addition to an ability to impose financial penalties, the OPC seeks a range of order making powers, including "the ability to require an organization to stop collecting, using or disclosing personal information, to destroy personal information collected in contravention of the legislation, and more generally to order the application of such remedial measures as are appropriate to ensure the protection of the personal information, among others."
Needless to say, many of the OPC's proposed changes to PIPEDA may significantly impact how organizations conduct business, particularly with respect to the collection and use of personal information through AI and machine learning processes. Moreover, several of these 11 "proposals for consideration" could potentially be applied by Parliament so as to have a more general effect (i.e., even though they have been framed in the Consultation Document through the lens of AI only).
Accordingly, organizations: (i) should review the Consultation Document with due consideration; and (ii) are encouraged to provide feedback to the OPC with respect to the Consultation Document by March 13, 2020.
The Privacy & Data Protection team at Bennett Jones is available to assist your organization in doing so, and to answer any questions you might have about your organization's privacy obligations.
Please note that this publication presents an overview of notable legal trends and related updates. It is intended for informational purposes and not as a replacement for detailed legal advice. If you need guidance tailored to your specific circumstances, please contact one of the authors to explore how we can help you navigate your legal needs.
For permission to republish this or any other publication, contact Amrita Kochhar at kochhara@bennettjones.com.