LoRA (low-rank adaptation of large language models) freezes a model's pretrained weights and injects small, trainable low-rank matrices into its inner layers, greatly reducing the number of parameters updated during fine-tuning and thus the cost of adapting the model. This has economic and environmental advantages.
Used on page 566
Also known as low-rank adaptation of large language models
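The low-rank idea can be sketched in a few lines. In this illustrative NumPy snippet (dimensions, variable names, and the `alpha` scaling are assumptions, not part of the original text), the frozen weight `W` is augmented by the product of two small trainable factors `B` and `A`:

```python
import numpy as np

# Minimal sketch of a LoRA-style update; names and sizes are illustrative.
d_out, d_in, r = 8, 8, 2           # rank r is much smaller than the layer size
alpha = 4                          # common scaling hyperparameter (assumed)

rng = np.random.default_rng(0)
W = rng.standard_normal((d_out, d_in))       # frozen pretrained weight
A = rng.standard_normal((r, d_in)) * 0.01    # trainable down-projection
B = np.zeros((d_out, r))                     # trainable up-projection, zero-initialised

def lora_forward(x):
    # Effective weight is W + (alpha / r) * B @ A, computed without forming it.
    return W @ x + (alpha / r) * (B @ (A @ x))

x = rng.standard_normal(d_in)
# With B initialised to zero, the adapted layer matches the frozen layer at start.
assert np.allclose(lora_forward(x), W @ x)
```

Only `A` and `B` are trained, so the number of updated parameters is `r * (d_in + d_out)` instead of `d_in * d_out`, which is where the cost saving comes from.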