Responsible Use of AI Scribes in the Health Care Sector: Understanding the IPC Checklist

Torkin Manes LegalPoint

On January 28, 2026, the Information and Privacy Commissioner of Ontario (the “IPC”) released a checklist aimed at assisting healthcare sector organizations that are contemplating developing, procuring or using AI Scribes (the “IPC Checklist”).

Ontario’s healthcare sector has seen rapid growth in administrative AI systems designed to help manage documentation responsibilities (“AI Scribes”). AI Scribes can transcribe patient care visits, generate summaries and integrate them directly into an electronic medical record. They can also produce reports, referral letters and other written communications for patients.

The IPC Checklist is designed to complement the IPC’s guidance on AI Scribes: Key Considerations for the Health Sector, which provides recommendations for assessing vendors of AI Scribes and establishes an accountability framework to ensure compliance with Ontario’s health privacy laws. Set out below is a summary of the IPC Checklist.

The IPC Checklist

The IPC Checklist is divided into four parts to support healthcare sector organizations that wish to use, procure and/or develop their own AI Scribes.

1. PART I - Preparing an AI Governance and Accountability Framework

Organizations should have comprehensive AI governance and accountability frameworks before deploying any AI system, including AI Scribes. These frameworks should cover the entire AI lifecycle, including design (if applicable), procurement, use, monitoring and assessment, as well as decommissioning. The framework should be integrated within the organization’s existing governance structures and embedded as part of the organizational culture, with oversight from an AI governance committee. The AI governance committee should provide interdisciplinary representation, open feedback and accountability for AI Scribes used by the organization.

2. PART II - Developers of AI Scribes

Developers of AI Scribes should design and govern their systems to prevent harm, protect human rights and resist unexpected or malicious misuse. Performance, accuracy and potential bias must be continuously monitored, with human oversight and procedures in place to pause, retrain or decommission the system if necessary. Risks relating to discrimination, high-risk contexts, privacy and cybersecurity must be addressed, with assessments conducted and updated throughout the AI lifecycle. Affected individuals and communities should be consulted and informed about the AI Scribes’ purpose, operation, impacts and data use.

3. PART III - Custodians Who Procure AI Scribes

Before procuring AI Scribes, organizations should first thoroughly understand their operational needs to determine whether an AI Scribe is the most appropriate solution. Organizations should conduct comprehensive third-party vendor assessments, evaluating the system’s intended purpose, learning capabilities, performance limitations, bias safeguards, prior risk assessments, cybersecurity measures, and transparency and reporting mechanisms. Finally, organizations must ensure that any disclosure or sharing of personal health information complies with Ontario’s Personal Health Information Protection Act (“PHIPA”), including obtaining consent for the disclosure or sharing where required.

4. PART IV - Custodians Who Use AI Scribes

Health information custodians using AI Scribes must ensure records are accurate, reviewed and securely stored, with valid PHIPA-compliant consent and clear communication to individuals about AI use, risks and alternatives. Policies should allow access and correction of AI-generated records, maintain transparency about system limitations, and provide knowledgeable contacts for questions. AI Scribes should undergo ongoing monitoring for performance, bias and safety concerns, and have the ability to be paused or deactivated if harmful outputs occur.

Key Takeaways and Practical Considerations

The IPC Checklist is meant to aid health information custodians in the responsible, transparent and accountable use of AI Scribes, which should in turn help to reduce administrative burdens and improve interactions between providers and patients.

To support responsible, transparent and accountable use of AI Scribes, health organizations should implement clear safeguards, including human oversight, privacy protections, reliability checks and transparency measures. Adherence to these practices can help organizations manage potential risks while realizing the benefits of AI technologies in improving health care delivery and patient outcomes.

The alignment of the IPC Checklist with the IPC’s Key Considerations for the Health Sector highlights the increasing expectation in Canada that health sector organizations establish and uphold AI governance frameworks designed to protect personal information and ensure effective human oversight.

Although the IPC Checklist is not legally binding, inadequate governance, oversight or control of these systems in the health sector can create legal, regulatory and reputational risks. Health sector custodians and institutions are therefore advised to use this checklist alongside their existing privacy, human rights and risk management policies when evaluating or implementing AI Scribes.

Canada has not yet enacted comprehensive legislation governing AI, and it is uncertain whether future federal initiatives will align with the IPC’s approach. In the meantime, health sector organizations that proactively follow the IPC Checklist will be better positioned to meet future compliance obligations and stakeholder expectations. The IPC Checklist provides a practical reference for responsibly managing AI Scribe technologies while maintaining compliance with applicable privacy and human rights requirements in health care.

For more information about the legal implications of the use of generative AI or other AI technology, please contact Roland Hung and Laura Crimi of Torkin Manes’ Technology and Privacy & Data Management Groups.

The authors would like to acknowledge Torkin Manes LLP Articling Student Kayla Oliveira for her invaluable contribution in drafting this bulletin.