Canada Artificial Intelligence and Data Act (AIDA)
Canada - AIDA (the Artificial Intelligence and Data Act) focuses on the regulation of Artificial Intelligence and was introduced as part of Bill C-27. According to the Department of Justice Canada:
Part 3 of the Digital Charter Implementation Act, 2022, the Artificial Intelligence and Data Act, sets out new measures to regulate international and interprovincial trade and commerce in artificial intelligence systems. It would establish common requirements for the design, development, and use of artificial intelligence systems, including measures to mitigate risks of harm and biased output. It would also prohibit specific practices with data and artificial intelligence systems that may result in serious harm to individuals or their interests.
The Department of Justice Canada also states that:
Measures under the Artificial Intelligence and Data Act may also implicate rights under section 8. For example, under the Artificial Intelligence and Data Act, the Minister responsible may compel the production of certain information from persons subject to the Act for the purpose of verifying compliance with the Act. Regulated persons may also need to provide auditors with records. In addition, the Artificial Intelligence and Data Act authorizes the Minister to publish information about artificial intelligence systems posing a serious risk of harm; to order a person to publish information related to their compliance with the Act; and to share information with other government entities specified in the Act.
The following considerations support the consistency of the Minister’s authorities to collect, disclose, and share information with section 8. Privacy interests are diminished in contexts where government access to business information and records is essential to ensuring compliance with the regulatory obligations a regulated entity must meet. Under the Artificial Intelligence and Data Act, the Minister’s powers to gather, compel the production of, or disclose relevant information for regulatory or administrative purposes are consistent with similar powers upheld as reasonable in other regulatory contexts. They are tailored and subject to limits. They authorize access to information that is reasonable and necessary to advance the Act’s important regulatory objectives. In addition, there are restrictions on the uses to which information gathered under the Act may be put and protections for personal and confidential business information. These protections include limits on the circumstances in which personal and confidential business information may be disclosed by the Minister.
The regulation of AI under the AIDA mainly focuses on persons carrying out a "regulated activity," which means any of the following in the course of international or interprovincial trade and commerce:
processing or making available for use any data relating to human activities to design, develop or use an AI system; or
designing, developing or making available for use an AI system or managing its operations.
Therefore, any organisation developing or deploying AI systems within Canada's jurisdiction must comply with the AIDA and, when required, provide documentary proof of its compliance.
The degree of regulation of private-sector AI systems under the AIDA will depend in part on whether a system falls within the definition of a "high-impact system"; such systems are subject to a higher degree of regulation. As presently drafted, AI systems subject to the AIDA fall into only two categories: high-impact systems and those that are not (i.e., AI systems within the scope of the AIDA that do not meet the definition of a high-impact system). What constitutes a "high-impact system" may end up resembling what the EU defines as a "high-risk" AI system, or it may borrow some of its framework from the Directive.
Those responsible for an AI system will be required to assess whether the AI system is a high-impact system. Where an AI system meets this definition, the person responsible must:
establish measures to identify, assess and mitigate the risks of harm or biased output that could result from the use of the system;
establish measures to monitor compliance with the mitigation measures and the effectiveness of those mitigation measures;
where the system is made available for use, publish on a public website a plain-language description of the system that explains, among other things, how the system is intended to be used, the types of content that it is intended to generate and the types of decisions, recommendations or predictions it is intended to make, and the risk mitigation measures established;
where the person is managing the operation of the system, publish on a public website a plain-language description of the system that explains, among other things, how the system is used, the types of content that it generates and the decisions, recommendations or predictions that it makes, and the mitigation measures established; and
notify the Minister if use of the system results or is likely to result in material harm.
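As a rough illustration, the per-system obligations above could be tracked as a simple compliance checklist, for example in Python (the class and field names are hypothetical and not drawn from the Act itself):

```python
from dataclasses import dataclass, fields


@dataclass
class HighImpactObligations:
    """Illustrative checklist of AIDA obligations for a high-impact system."""
    risk_mitigation_measures: bool = False      # identify, assess and mitigate risks of harm or bias
    compliance_monitoring: bool = False         # monitor mitigation measures and their effectiveness
    public_description_published: bool = False  # plain-language description on a public website
    minister_notified_of_harm: bool = True      # True if no material harm, or the Minister was notified


def outstanding(obligations: HighImpactObligations) -> list[str]:
    """Return the names of obligations not yet satisfied."""
    return [f.name for f in fields(obligations)
            if not getattr(obligations, f.name)]


status = HighImpactObligations(risk_mitigation_measures=True)
print(outstanding(status))  # ['compliance_monitoring', 'public_description_published']
```

This is only a sketch of how the obligations compose; the actual requirements apply as legal duties, not software flags.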
The Minister responsible for administering the AIDA will have considerable powers under the act to promote and ensure compliance.
One of the options available to the Minister to ensure compliance is the ability to conduct, or direct a qualified person to conduct, an audit (at the expense of the person being audited) where the Minister believes there has been a contravention of certain sections of the act. Unlike under the existing PIPEDA framework, which provides minimal enforcement powers, the potential implications of an audit for those operating AI systems are substantial. The Minister may:
require any person responsible for a high-impact system to cease using it or making it available for use where the Minister has reasonable grounds to believe that the use of the system gives rise to a serious risk of imminent harm;
require the audited person to implement any measure specified in the order to address anything referred to in an audit report; or
require a person to publish on a publicly available website certain information, including audit details, so long as it does not require the disclosure of confidential business information.
Those subject to an order must comply with the order.
As written, the AIDA creates a variety of Administrative Monetary Penalties. In some cases, these penalties can rise to the greater of $10,000,000 or 3% of the person’s gross global revenues in its previous financial year, or, in the case of an individual, a fine at the discretion of the court. AIDA also creates criminal offences related to:
The possession or use of Personal Information knowing or believing that such information was obtained as a result of the commission of an offence;
Making an Artificial Intelligence system available for use knowing that it is likely to cause serious harm to an individual or their property and the use of the system actually causes such harm; and
Making an Artificial Intelligence system available for use with the intent to defraud the public and cause substantial economic loss to an individual, where its use actually causes such loss.
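The arithmetic behind the corporate penalty ceiling described above (the greater of $10,000,000 or 3% of gross global revenues) can be sketched in Python. This is illustrative only; actual penalties are determined by the court or by regulation:

```python
def max_corporate_penalty(gross_global_revenue: float) -> float:
    """Greater of $10,000,000 or 3% of the previous financial year's
    gross global revenues (illustrative of the AIDA penalty cap)."""
    return max(10_000_000.0, 0.03 * gross_global_revenue)


# A company with $1B in revenue: 3% = $30M, which exceeds the $10M floor.
print(max_corporate_penalty(1_000_000_000))  # 30000000.0

# A company with $100M in revenue: 3% = $3M, so the $10M floor applies.
print(max_corporate_penalty(100_000_000))    # 10000000.0
```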
The Seclea Platform provides support for managing and tracking compliance with AIDA. The Canada - AIDA compliance template on the Seclea Platform deals with: