2025 Wrapped: Top Five Privacy Developments in Canada
Overview
Although 2025 has drawn to a close, privacy law in Canada continues to evolve rapidly in response to artificial intelligence and large-scale data collection. Courts and regulators alike have signalled a willingness to more closely scrutinize how organizations collect, use, and safeguard personal information. This article highlights the five most notable privacy developments in Canada in 2025.
(Earlier this year, we prepared a Mid-Year Update, which you may wish to review as well; we have tried to limit the overlap between it and this Year in Review.)
1. Clearview AI, Inc. v. Alberta (Information and Privacy Commissioner), 2025 ABKB 287
One of the most closely watched privacy decisions of 2025 arose from Clearview AI’s practice of scraping billions of images from publicly accessible websites to build a facial recognition database. In Clearview AI, the Alberta Court of King’s Bench considered whether Alberta’s Personal Information Protection Act (“PIPA”) could constitutionally restrict the collection of publicly available personal information online.
While the Court accepted that Alberta’s privacy legislation could apply to foreign-based technology companies where there is a real and substantial connection to the province, it concluded that aspects of PIPA were unconstitutional as applied to Clearview AI’s activities, on the basis that they unjustifiably limited freedom of expression under section 2(b) of the Charter. In particular, the Court took issue with the extent to which PIPA restricted the collection of publicly accessible information for expressive purposes.
The decision has significant implications for web scraping, AI training, and the future of provincial private-sector privacy statutes. It underscores the tension between privacy protection and freedom of expression in an era where vast amounts of personal information are publicly available online, and it is expected to prompt legislative reconsideration of how provincial privacy laws address AI-driven data practices.
For more information on this case, see our recent blog post, Clearview AI v. Alberta: A Turning Point for Privacy Law and Internet Scraping.
2. Privacy Commissioner of Canada v. Aylo (Pornhub) – Enforcement under PIPEDA
In 2025, the Office of the Privacy Commissioner of Canada (“OPC”) took the notable step of commencing an enforcement application in Federal Court against Aylo, the operator of Pornhub, for alleged contraventions of the Personal Information Protection and Electronic Documents Act (“PIPEDA”).
The OPC’s application focuses on whether meaningful consent was obtained for the collection, use, and disclosure of highly sensitive personal information, including intimate images and videos. The OPC is seeking binding court orders, including deletion of personal information collected without valid consent and requirements to implement stronger consent practices and safeguards.
This proceeding is significant not only because of the sensitivity of the personal information at issue, but also because it reflects a broader shift in the OPC’s enforcement posture. Rather than relying primarily on recommendations and voluntary compliance, the OPC has shown an increased willingness to pursue judicial remedies to compel compliance with PIPEDA, particularly where vulnerable individuals and sensitive data are involved.
3. RateMDs Inc. v. Bleuler, 2025 BCCA 329
In RateMDs Inc. v. Bleuler, the British Columbia Court of Appeal set aside the certification of a proposed class action brought by health professionals whose profiles appeared on RateMDs.com, bringing the proceeding to an end. The respondent physician alleged that the publication of her name, contact information, and user-generated ratings and rankings on the website constituted a violation of privacy and an unauthorized use of her name under the Privacy Act.
The Court of Appeal held that the claims were bound to fail because the pleadings did not establish a reasonable expectation of privacy in the information at issue. While acknowledging the relative novelty of the statutory privacy torts, the Court emphasized that novelty alone cannot sustain a cause of action. The information was professional in nature and widely available from public sources, undermining any claim to privacy.
Relying on its earlier decision in Insurance Corporation of British Columbia v. Ari, the Court confirmed that informational privacy may, in some circumstances, include a degree of control over the use of personal information. However, control cannot exist independently of a reasonable expectation of privacy grounded in the sensitivity of the information. Absent such an expectation, no enforceable right to control arose, and the respondent failed to disclose a viable cause of action for certification.
The Court also rejected the claim for unauthorized use of name, finding no basis to conclude that the appellants commercially exploited the respondent’s identity. The information published mirrored that available through publicly accessible sources, including professional regulatory bodies.
4. Legislative Reform Stagnation at the Federal Level and Increased Regulatory Guidance
2025 also marked a year of continued uncertainty in federal privacy law reform. Bill C-27, which proposed to replace PIPEDA with the Consumer Privacy Protection Act and introduce the Artificial Intelligence and Data Act (“AIDA”), did not proceed and effectively died when Parliament was prorogued in January 2025.
In the absence of legislative reform, regulators have sought to fill the gap through guidance and policy initiatives. In 2025, the OPC released updated guidance on biometrics, AI-related data use, and consent, emphasizing necessity, proportionality, and transparency. These materials, while not binding law, provide a clear signal of regulatory expectations and are increasingly relied upon in investigations and enforcement activity.
Organizations should be mindful that, even without a new federal statute, the OPC’s interpretation of PIPEDA continues to evolve in ways that raise the compliance bar for data-intensive and AI-driven business models.
5. Hospital for Sick Children v. Ontario (Information and Privacy Commissioner), 2025 ONSC 5208
In Hospital for Sick Children v. Ontario (Information and Privacy Commissioner), the Ontario Superior Court of Justice examined the scope of breach notification obligations arising from ransomware attacks that disrupted access to personal information, even in the absence of evidence that the data had been viewed or exfiltrated.
Both the Hospital for Sick Children and the Halton Children’s Aid Society experienced ransomware incidents that temporarily prevented staff from accessing records containing personal information. Each organization notified the Information and Privacy Commissioner of Ontario (the “IPC”), and SickKids also issued a public notice. The Commissioner determined that the loss of access itself amounted to an unauthorized use and a loss of personal information under the Personal Health Information Protection Act and the Child, Youth and Family Services Act, respectively, and ordered Halton to provide public notice in accordance with the legislation.
On judicial review, the Court upheld the IPC’s interpretation of the statutory scheme. It confirmed that personal information can be considered “lost” or “used without authorization” where a cyber incident deprives an organization of control over, or access to, that information. The Court dismissed both judicial review applications and Halton’s appeal, reinforcing that notification obligations may be triggered even where there is no evidence of data theft. The decision underscores the expansive approach Ontario regulators and courts are taking to breach notification in the healthcare and child services sectors.
Bonus: Facial Detection Advertising in Public Spaces Under Privacy Investigation
In November 2025, digital advertising screens near Toronto’s Union Station came under public and regulatory scrutiny after it was revealed that the displays use facial detection technology to analyze passersby and dynamically tailor advertisements based on inferred characteristics such as age and gender.
The OPC subsequently opened an investigation to assess whether this practice complies with PIPEDA, including whether facial analysis constitutes the collection of personal information and whether individuals were provided with meaningful notice or an opportunity to consent in a crowded public environment.
Although the operator has stated that images are processed in real time and not retained, the investigation highlights growing regulatory concern with the discreet use of biometric technologies in public spaces. The outcome may help define the boundaries for real-time analytics and targeted advertising in Canada going forward. We will keep a close eye on the outcome of the OPC investigation.
Conclusion
Overall, 2025 was a consequential year for Canadian privacy law. Courts continued to refine the contours of privacy rights in the digital age, regulators demonstrated a greater willingness to pursue enforcement, and legislatures faced mounting pressure to modernize outdated statutory frameworks. As AI and data-driven technologies become further embedded in everyday operations, privacy compliance will remain a central legal and business risk. Organizations are encouraged to proactively assess their data practices and seek legal advice to navigate this increasingly complex landscape.