brittleness

Terms from Artificial Intelligence: humans at the heart of algorithms

Page numbers are for draft copy at present; they will be replaced with correct numbers when final book is formatted. Chapter numbers are correct and will not change now.

An AI system may work well in good circumstances, but be brittle, that is, break down dramatically when given slightly unexpected inputs. This may happen because the system has a very restricted domain of expertise, say a particular disease (a common issue for expert systems); it may be due to explicit or implicit assumptions made during system design, for example that people using a natural language system always speak with 'correct' grammar; or it may be due to the limited range of training data for machine learning. The last of these has been especially evident in large language models, which are based on massive text corpora and often give helpful and indeed insightful responses, but may also give ridiculous responses or hallucinate.
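As a minimal sketch of the idea (entirely hypothetical, not drawn from any real system), the fragment below shows a tiny rule-based 'diagnoser' whose domain of expertise is very restricted and which implicitly assumes inputs use exactly the vocabulary it was built with — so it works in good circumstances but breaks down on slightly unexpected input:

```python
# Hypothetical example of brittleness: a toy rule-based classifier.
# Its 'knowledge' is a fixed lookup table, so any input phrased even
# slightly differently from the built-in vocabulary falls through.

RULES = {
    "fever": "possible infection",
    "rash": "possible allergy",
}

def diagnose(symptom: str) -> str:
    # Works well when the input matches the expected vocabulary exactly...
    if symptom in RULES:
        return RULES[symptom]
    # ...but fails on slightly unexpected input a human would cope with.
    return "unknown"

print(diagnose("fever"))       # matches a rule
print(diagnose("high fever"))  # brittle: no rule matches, returns "unknown"
```

Machine-learning systems degrade in a similar way when an input lies outside the range of their training data, though the failure is statistical rather than an exact-match miss.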