Generative pre-trained transformer
Type of large language model
Generative pre-trained transformers (GPT) are a type of large language model (LLM) and a prominent framework for generative artificial intelligence. They are artificial neural networks used in natural language processing tasks. GPTs are based on the transformer architecture, pre-trained on large datasets of unlabelled text, and able to generate novel human-like content. As of 2023, most LLMs share these characteristics and are sometimes referred to broadly as GPTs.
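The defining architectural trait mentioned above is the decoder-only transformer, in which each token position can attend only to itself and earlier positions, which is what makes autoregressive text generation possible. A minimal toy sketch of that causal self-attention pattern (illustrative only, using NumPy and random weights in place of an actual pre-trained model; all function and variable names here are hypothetical):

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def causal_self_attention(X, Wq, Wk, Wv):
    """Single-head causal self-attention over a (T, d) sequence X.
    Each position attends only to itself and earlier positions --
    the decoder-only pattern used by GPT-style models."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)            # (T, T) attention logits
    T = X.shape[0]
    future = np.triu(np.ones((T, T), dtype=bool), k=1)
    scores[future] = -np.inf                 # mask out future positions
    return softmax(scores, axis=-1) @ V      # weighted mix of past values

# Toy usage with random "weights" standing in for pre-trained parameters
rng = np.random.default_rng(0)
T, d = 5, 8
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
X = rng.normal(size=(T, d))
out = causal_self_attention(X, Wq, Wk, Wv)
```

Because of the causal mask, editing a later token cannot change the attention output at earlier positions, which is the property that lets a GPT generate text one token at a time, each conditioned only on what came before.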