A training paradigm where a model generates its own supervision signal from unlabelled data—for example, by predicting masked words in a sentence. Self-supervised pre-training on internet-scale text or images is what enables foundation models to acquire broad world knowledge.
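The masking idea can be illustrated with a minimal sketch: a hypothetical helper (not from any specific library) that turns raw text into a (masked input, targets) training pair, where the targets come from the sentence itself rather than from human labels.

```python
import random

def make_masked_example(sentence, mask_token="[MASK]", p=0.15, rng=None):
    """Create a (masked input, targets) pair from raw text.

    The supervision signal -- the hidden words -- is derived from
    the sentence itself, so no human labelling is required.
    """
    rng = rng or random.Random(0)
    tokens = sentence.split()
    masked, targets = [], {}
    for i, tok in enumerate(tokens):
        if rng.random() < p:
            masked.append(mask_token)   # hide this word from the model
            targets[i] = tok            # ...and make it the prediction target
        else:
            masked.append(tok)
    return " ".join(masked), targets

# Example (p=1.0 masks every word, for illustration):
masked, targets = make_masked_example("the cat sat", p=1.0)
# masked  -> "[MASK] [MASK] [MASK]"
# targets -> {0: "the", 1: "cat", 2: "sat"}
```

A model trained on millions of such pairs learns to fill in the blanks, which is exactly the pre-training objective described above.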
Book a consultation to discuss how AI concepts apply to your challenges.