A family of autoregressive, decoder-only transformer architectures pre-trained to predict the next token. GPT models are the basis for ChatGPT and most commercial LLM APIs. Their autoregressive design makes them excellent text generators but less efficient for classification tasks than encoder models.
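Autoregressive generation means the model repeatedly consumes everything produced so far and emits one more token. The sketch below illustrates that loop; `toy_next_token` is a hypothetical stand-in for a real transformer forward pass, not any actual GPT API.

```python
def toy_next_token(context):
    # Hypothetical stand-in: a real GPT scores every vocabulary token
    # given the full context; here we echo a fixed continuation.
    continuation = {"The": "cat", "cat": "sat", "sat": "down"}
    return continuation.get(context[-1], "<eos>")

def generate(prompt, max_new_tokens=5):
    tokens = prompt.split()
    for _ in range(max_new_tokens):
        nxt = toy_next_token(tokens)
        if nxt == "<eos>":
            break
        # The new token is appended to the context before the next step:
        # this feedback loop is what "autoregressive" refers to.
        tokens.append(nxt)
    return " ".join(tokens)

print(generate("The"))  # the sequence grows one token per step
```

This one-token-at-a-time loop is why decoder-only models excel at open-ended generation, while a single-pass encoder can be cheaper for classification.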