A family of autoregressive, decoder-only transformer architectures pre-trained to predict the next token. GPT models are the basis for ChatGPT and most commercial LLM APIs. Their autoregressive design makes them excellent text generators but less efficient for classification tasks than encoder models.
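The autoregressive loop described above can be sketched in a few lines: predict one token, append it to the sequence, and feed the extended sequence back in. The sketch below uses a toy bigram lookup (`toy_next_token` is a hypothetical stand-in, not a real GPT, which would condition on the entire context with a transformer); only the generation loop itself mirrors how decoder-only models produce text.

```python
# Minimal sketch of autoregressive (next-token) decoding.
# `toy_next_token` is a hypothetical stand-in for a decoder-only
# transformer: it picks the most frequent follower seen in a tiny corpus.

from collections import Counter, defaultdict

def train_bigram(tokens):
    """Count which token follows which in the training sequence."""
    followers = defaultdict(Counter)
    for a, b in zip(tokens, tokens[1:]):
        followers[a][b] += 1
    return followers

def toy_next_token(followers, context):
    # A real GPT conditions on the whole context; this toy model
    # only looks at the last token.
    last = context[-1]
    if last not in followers:
        return "<eos>"
    return followers[last].most_common(1)[0][0]

def generate(followers, prompt, max_new_tokens=5):
    seq = list(prompt)
    for _ in range(max_new_tokens):
        nxt = toy_next_token(followers, seq)
        if nxt == "<eos>":
            break
        seq.append(nxt)  # feed the extended sequence back in
    return seq

corpus = "the cat sat on the mat".split()
model = train_bigram(corpus)
print(generate(model, ["the"]))  # → ['the', 'cat', 'sat', 'on', 'the', 'cat']
```

The key property shown is that each new token depends on everything generated so far, which is why decoding is inherently sequential (and why generation is slower than the single parallel pass an encoder model uses for classification).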