This refers to the legal or other means by which companies, governments, or individuals are held to account for the AI algorithms or systems they produce or deploy. If something goes wrong, they may be taken to court, fined, imprisoned, or at least subjected to public opprobrium. The belief is that if the costs of negligence are high enough, then companies and other actors will act responsibly.
Used on pages 491, 563
Also known as accountable