A regularisation technique that randomly sets a fraction of neuron activations to zero during training, forcing the network to learn redundant representations and preventing overfitting. Dropout is widely used in both vision and language models.
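The idea can be sketched as "inverted dropout", the variant most frameworks use: during training each activation is zeroed with probability `p` and the survivors are scaled by `1/(1-p)`, so at inference time dropout is simply a no-op. The function name and parameters below are illustrative, not from any particular library.

```python
import numpy as np

def dropout(x, p=0.5, training=True, rng=None):
    """Inverted dropout (a sketch).

    During training, each activation is zeroed with probability p and
    the survivors are scaled by 1/(1-p), so the expected activation
    matches inference, where dropout does nothing.
    """
    if not training or p == 0.0:
        return x
    rng = rng or np.random.default_rng()
    mask = rng.random(x.shape) >= p      # keep each unit with probability 1-p
    return np.where(mask, x / (1.0 - p), 0.0)
```

Because the scaling happens at train time, the network can be deployed unchanged: calling the layer with `training=False` returns the input as-is.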