Recommended Practice for Organizational Governance of Artificial Intelligence
Last updated: 18 Jul 2024
Scope
This recommended practice specifies governance criteria, such as safety, transparency, accountability, responsibility, and the minimization of bias, together with process steps for effective implementation, performance auditing, training, and compliance in the development or use of artificial intelligence within organizations.
Purpose
This recommended practice provides steps that an organization developing or using artificial intelligence should follow to enable the responsible, ethical, and accountable development or use of artificial intelligence within its operations. These steps integrate internal measures, including the adoption, training, implementation, and compliance assurance of substantive and procedural governance requirements within the organization. This recommended practice also describes how to contribute to and interact with external governance instruments, including government regulations, professional standards and codes of conduct, recommendations from relevant academic bodies, non-governmental organizations, and stakeholders, and public values relating to the responsible development and use of artificial intelligence.
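
For illustration only, the sketch below shows one way an organization might record the internal steps and governance criteria named above in a simple tracking structure. It is not part of the recommended practice, and all names (Criterion, ProcessStep, GovernanceRecord, the example system name) are assumptions chosen for this example, not terms defined by the standard.

from dataclasses import dataclass, field
from enum import Enum, auto


class Criterion(Enum):
    # Governance criteria listed in the Scope (illustrative, not normative).
    SAFETY = auto()
    TRANSPARENCY = auto()
    ACCOUNTABILITY = auto()
    RESPONSIBILITY = auto()
    BIAS_MINIMIZATION = auto()


@dataclass
class ProcessStep:
    # One internal governance step (e.g., adoption, training, implementation,
    # compliance assurance) and the criteria it is meant to address.
    name: str
    criteria: list[Criterion]
    completed: bool = False


@dataclass
class GovernanceRecord:
    # Tracks an organization's progress through the internal steps for one AI system.
    system_name: str
    steps: list[ProcessStep] = field(default_factory=list)

    def outstanding(self) -> list[str]:
        # Names of steps not yet completed, e.g., as input to a performance audit.
        return [s.name for s in self.steps if not s.completed]


if __name__ == "__main__":
    # Hypothetical system name and step assignments, purely for demonstration.
    record = GovernanceRecord(
        system_name="loan-screening-model",
        steps=[
            ProcessStep("adoption", [Criterion.ACCOUNTABILITY]),
            ProcessStep("training", [Criterion.TRANSPARENCY, Criterion.BIAS_MINIMIZATION]),
            ProcessStep("implementation", [Criterion.SAFETY]),
            ProcessStep("compliance assurance", list(Criterion)),
        ],
    )
    record.steps[0].completed = True
    print("Outstanding steps:", record.outstanding())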