Your glossary of AI concepts, explained simply
144 concepts
Recently updated and popular entries

Batch size (the number of examples per update) and learning rate (the step size for weight updates) are two of the most important hyperparameters controlling how neural networks train.
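A minimal stochastic gradient descent sketch can make both knobs concrete. The toy one-weight model and dataset below are assumptions for illustration: `batch_size` sets how many examples contribute to each gradient estimate, and `learning_rate` scales each weight update.

```python
import random

random.seed(0)
# Toy dataset (an assumption for illustration): learn w for y = 3 * x.
data = [(x, 3 * x) for x in range(1, 21)]

def sgd(data, batch_size, learning_rate, epochs=50):
    w = 0.0  # single trainable weight
    for _ in range(epochs):
        random.shuffle(data)
        # batch_size controls how many examples go into each update.
        for i in range(0, len(data), batch_size):
            batch = data[i:i + batch_size]
            # Gradient of mean squared error w.r.t. w, averaged over the batch.
            grad = sum(2 * (w * x - y) * x for x, y in batch) / len(batch)
            # learning_rate scales the step taken against the gradient.
            w -= learning_rate * grad
    return w

w = sgd(data, batch_size=4, learning_rate=0.001)
print(round(w, 2))
```

With these settings the weight converges to 3; a much larger learning rate would overshoot and diverge, while a larger batch size would give smoother but less frequent updates.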

AI orchestration coordinates multiple AI models, tools, and data sources into unified workflows, managing the flow between components in complex AI systems.
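The coordination idea can be sketched as a pipeline that passes each component's output to the next. Every component here (`retrieve`, `generate`) is a hypothetical stand-in for a real data source or model call, not any particular library's API.

```python
def retrieve(query: str) -> str:
    # Stand-in for a data-source lookup (hypothetical component).
    return f"context for: {query}"

def generate(prompt: str) -> str:
    # Stand-in for a model call (hypothetical component).
    return f"answer({prompt})"

def orchestrate(query: str, steps) -> str:
    # The orchestrator manages the flow: each step receives the
    # previous step's output, forming one unified workflow.
    result = query
    for step in steps:
        result = step(result)
    return result

out = orchestrate("what is beam search?", [retrieve, generate])
print(out)
```

Real orchestration layers add routing, retries, and branching on top of this basic flow, but the core is the same: components composed into one managed pipeline.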

Beam search generates text by exploring multiple candidate sequences in parallel, keeping the top-k most promising paths to find the highest-probability output.
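A small self-contained sketch shows the mechanics. The tiny next-token distribution below is an assumption for illustration; the search itself keeps only the `beam_width` highest-scoring partial sequences at each step, scored by summed log-probabilities.

```python
import math

def next_token_probs(prefix):
    # Toy next-token model (hypothetical): prefix -> {token: probability}.
    table = {
        (): {"the": 0.5, "a": 0.5},
        ("the",): {"cat": 0.6, "dog": 0.4},
        ("a",): {"cat": 0.3, "dog": 0.7},
    }
    # After two tokens, always emit the end-of-sequence marker.
    return table.get(tuple(prefix), {"<eos>": 1.0})

def beam_search(beam_width=2, max_len=3):
    # Each beam entry is (cumulative log-probability, token sequence).
    beams = [(0.0, [])]
    for _ in range(max_len):
        candidates = []
        for logp, seq in beams:
            if seq and seq[-1] == "<eos>":
                candidates.append((logp, seq))  # finished; carry forward
                continue
            for tok, p in next_token_probs(seq).items():
                candidates.append((logp + math.log(p), seq + [tok]))
        # Keep only the top-k (beam_width) most promising sequences.
        beams = sorted(candidates, key=lambda c: c[0], reverse=True)[:beam_width]
    return beams[0]

logp, seq = beam_search()
print(seq)  # -> ['a', 'dog', '<eos>']
```

Note that greedy decoding would commit to "the" (tied for best first token) and end up with probability 0.5 × 0.6 = 0.30, while the beam keeps "a" alive and finds "a dog" at 0.5 × 0.7 = 0.35, the higher-probability output.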

Catastrophic forgetting occurs when training a neural network on new data overwrites previously learned knowledge, causing the network to lose earlier capabilities.
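The effect can be demonstrated in miniature. The one-weight model and two conflicting toy tasks below are assumptions for illustration: after sequential training on task B, the model's error on task A balloons because the same weight was overwritten.

```python
def train(w, data, lr=0.01, steps=200):
    # Plain gradient descent on mean squared error for a one-weight model.
    for _ in range(steps):
        for x, y in data:
            w -= lr * 2 * (w * x - y) * x
    return w

task_a = [(1.0, 2.0)]   # task A wants w = 2
task_b = [(1.0, -1.0)]  # task B wants w = -1 (conflicts with A)

w = train(0.0, task_a)
error_a_before = abs(w * 1.0 - 2.0)   # small: A was just learned
w = train(w, task_b)                  # sequential training on the new task
error_a_after = abs(w * 1.0 - 2.0)    # large: knowledge of A overwritten
print(error_a_before, error_a_after)
```

Mitigations such as replaying old examples or penalizing changes to important weights all target this same failure mode: the new gradient signal freely overwrites parameters the old task depended on.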