A regularisation technique that randomly sets a fraction of neuron activations to zero during training, forcing the network to learn redundant representations and preventing overfitting. At inference time all units are kept, with activations scaled so their expected magnitude matches training. Dropout is widely used in both vision and language models.
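A minimal sketch of the technique in NumPy, using the common "inverted dropout" formulation (the function name and drop probability `p` here are illustrative, not from the source):

```python
import numpy as np

def dropout(x, p=0.5, training=True, rng=None):
    """Inverted dropout: zero each activation with probability p during
    training and scale survivors by 1/(1-p), so the expected activation
    is unchanged and inference needs no special handling."""
    if not training or p == 0.0:
        return x
    rng = rng or np.random.default_rng()
    mask = rng.random(x.shape) >= p  # keep each unit with probability 1 - p
    return x * mask / (1.0 - p)

acts = np.ones((4, 3))
out = dropout(acts, p=0.5, rng=np.random.default_rng(0))  # surviving entries become 2.0
```

Because of the 1/(1-p) rescaling at training time, the same forward pass can be used at inference simply by passing `training=False`.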