A training paradigm in which a model generates its own supervision signal from unlabelled data—for example, by predicting masked words in a sentence. Self-supervised pre-training on internet-scale text or images is what enables foundation models to acquire broad world knowledge.
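A minimal sketch of the masked-word idea: raw text alone supplies both the input and the target, so no human labels are required. The function name and mask token below are illustrative, not from any particular library.

```python
import random

def make_masked_example(sentence, mask_token="[MASK]", seed=0):
    """Turn a raw sentence into a (masked input, target word) pair.
    The supervision signal comes from the text itself: one word is
    hidden, and predicting it becomes the training objective."""
    rng = random.Random(seed)
    words = sentence.split()
    i = rng.randrange(len(words))  # pick one position to mask
    target = words[i]
    masked = words[:]
    masked[i] = mask_token
    return " ".join(masked), target

inp, tgt = make_masked_example("the cat sat on the mat")
```

At scale, a model trained to fill in such masks across billions of sentences learns word meanings and factual associations without any hand-labelled dataset.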