General Risk (GER)

When determining the risk posed by AI systems, a number of concerns specific to AI itself should be taken into consideration. These concerns should be tailored to the specifics of the system under evaluation and the environment in which it will be used.

Organisations are responsible for ensuring that the policies, processes, procedures, and practices for mapping, measuring, and managing AI-related risks are in place, transparent, and effectively implemented.

Controls related to this risk category are listed below:

  • GER 01 - Roles and Responsibilities

  • GER 02 - Executive Responsibility

  • GER 03 - AI Risk Organisational Practices

  • GER 04 - Transparent Risk Management

  • GER 05 - Risk Management Monitoring

  • GER 06 - AI Risk Decision Making

  • GER 07 - AI Organisational Documentation

  • GER 08 - Organisational Information Sharing Mechanism

  • GER 09 - AI's Business Value

  • GER 10 - Organisation's AI Mission

  • GER 11 - Organisation's Risk Tolerance

  • GER 12 - Allocated Resources for Risk Management

  • GER 13 - 3rd Party Risk Policies

  • GER 14 - 3rd Party Contingency

  • GER 15 - Mapping 3rd Party Risk

  • GER 16 - Internal Risk Controls for 3rd Party Risk

  • GER 17 - AI Legal and Regulatory Requirements

  • GER 18 - AI Classification

  • GER 19 - AI System Requirements

  • GER 20 - AI System Benefits

  • GER 21 - AI Potential Costs

  • GER 22 - AI Application Scope

  • GER 23 - Approaches and Metrics

  • GER 24 - Metrics Appropriateness and Effectiveness

  • GER 25 - Stakeholder Assessment Consultation

  • GER 26 - Risk Tracking and Management

  • GER 27 - Risk Tracking Assessments

  • GER 28 - Measurement Approaches for Identifying Risk

  • GER 29 - Development and Deployment Decision

  • GER 30 - Risk Mitigation Activities

  • GER 31 - Risk Management of Mapped Risks

  • GER 32 - Post-Deployment Risk Management

  • GER 33 - 3rd Party Risks Are Managed