GPT (Generative Pre-trained Transformer)
Neural Networks
Large language model for text generation
What is GPT (Generative Pre-trained Transformer)?
A transformer-based model trained on vast amounts of text that generates human-like text one token at a time. It can write prose, answer questions, generate code, and more.
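The token-by-token generation loop can be sketched with a toy stand-in model. This is not a real transformer: the hypothetical `next_token` function below uses a fixed bigram lookup table purely to illustrate the autoregressive loop that GPT-style models run.

```python
# Toy sketch of GPT-style autoregressive generation.
# A real GPT scores every vocabulary token with a transformer; this stand-in
# "model" just looks up the last token in a fixed bigram table.

def next_token(context: list[str]) -> str:
    """Return a plausible next token for the context (toy bigram rules)."""
    bigrams = {
        "the": "cat",
        "cat": "sat",
        "sat": "on",
        "on": "the",
    }
    # Fall back to an end-of-sequence marker when the table has no entry.
    return bigrams.get(context[-1], "<eos>")

def generate(prompt: list[str], max_new_tokens: int = 5) -> list[str]:
    """Append one token at a time until <eos> or the length limit is hit."""
    tokens = list(prompt)
    for _ in range(max_new_tokens):
        tok = next_token(tokens)
        if tok == "<eos>":
            break
        tokens.append(tok)
    return tokens

print(" ".join(generate(["the"])))  # prints: the cat sat on the cat
```

The same loop structure underlies real GPT inference: the model conditions on everything generated so far, picks (or samples) the next token, appends it, and repeats.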
Real-World Examples
- ChatGPT conversations
- Code generation with GitHub Copilot
- Article writing
- Email drafting
When to Use This
Use GPT models for text generation, text completion, or conversational AI.