GPT-3 (Q95726734)
2020 transformer-based large language model
Also known as:
- Generative Pre-trained Transformer 3
- Generative Pretrained Transformer 3
- GPT3
Language | Label | Description | Also known as |
---|---|---|---|
English | GPT-3 | 2020 transformer-based large language model | Generative Pre-trained Transformer 3; Generative Pretrained Transformer 3; GPT3 |
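As an illustration only (not part of the item data), the label, description, and aliases above can be retrieved programmatically from the public Wikidata entity-data endpoint. A minimal Python sketch, assuming network access and the standard Special:EntityData JSON export format:

```python
import json
from urllib.request import Request, urlopen

# Minimal sketch: fetch the JSON export for item Q95726734 from the public
# Wikidata entity-data endpoint and print the English label, description,
# and aliases shown in the table above.
URL = "https://www.wikidata.org/wiki/Special:EntityData/Q95726734.json"

req = Request(URL, headers={"User-Agent": "gpt3-item-example/0.1 (demo script)"})
with urlopen(req) as resp:
    entity = json.load(resp)["entities"]["Q95726734"]

print("Label:      ", entity["labels"]["en"]["value"])
print("Description:", entity["descriptions"]["en"]["value"])
print("Also known as:")
for alias in entity["aliases"].get("en", []):
    print(" -", alias["value"])
```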
Statements
28 May 2020 (0 references)

Model sizes:

Size | Also known as | Parameters |
---|---|---|
125M | Small | 125,000,000 |
350M | Medium | 350,000,000 |
760M | Large | 760,000,000 |
1.3B | | 1,300,000,000 |
175B | GPT-3 | 175,000,000,000 |

Each size statement carries 1 reference; the reference on the 175B statement includes the quotation (English): "To study the dependence of ML performance on model size, we train 8 different sizes of model, ranging over three orders of magnitude from 125 million parameters to 175 billion parameters, with the last being the model we call GPT-3."

22 July 2020 (1 reference)
21 October 2020
GPT-3 (0 references)
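The quoted reference describes the model sizes as "ranging over three orders of magnitude". As a small worked check (illustration only, using just the parameter counts listed on this item):

```python
import math

# Parameter counts for the variants listed in the statements above
# (the quoted paper trains 8 sizes; only the 5 listed on this item appear here).
sizes = {
    "125M (Small)": 125_000_000,
    "350M (Medium)": 350_000_000,
    "760M (Large)": 760_000_000,
    "1.3B": 1_300_000_000,
    "175B (GPT-3)": 175_000_000_000,
}

ratio = sizes["175B (GPT-3)"] / sizes["125M (Small)"]
print(f"175B / 125M = {ratio:,.0f}x  (~10^{math.log10(ratio):.1f})")
# -> 1,400x, i.e. a bit over three orders of magnitude, consistent with the quote.
```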
Identifiers
Generative Pretrained Transformers (17 July 2020; 23 May 2023; 0 references)
Sitelinks
Wikipedia (29 entries)
- arwiki جي بي تي-3
- bgwiki GPT-3
- bnwiki জিপিটি-৩
- cawiki GPT-3
- ckbwiki جی-پی-تی-٣
- cswiki GPT-3
- dewiki Generative Pre-trained Transformer 3
- enwiki GPT-3
- eswiki GPT-3
- etwiki GPT-3
- euwiki GPT-3
- fawiki جیپیتی ۳
- fiwiki GPT-3
- frwiki GPT-3
- hewiki GPT-3
- hiwiki जीपीटी3
- itwiki GPT-3
- jawiki GPT-3
- kaawiki GPT-3
- kowiki GPT-3
- nlwiki GPT-3
- ptwiki GPT-3
- quwiki GPT-3
- ruwiki GPT-3
- svwiki GPT-3
- trwiki GPT-3
- ukwiki GPT-3
- viwiki GPT-3
- zhwiki GPT-3
Wikibooks (0 entries)
Wikinews (0 entries)
Wikiquote (0 entries)
Wikisource (0 entries)
Wikiversity (0 entries)
Wikivoyage (0 entries)
Wiktionary (0 entries)
Multilingual sites (1 entry)
- commonswiki Category:GPT-3
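The sitelink counts above can also be queried from the public Wikidata Query Service. A minimal sketch (illustration only, assuming the standard SPARQL endpoint and the wikibase:sitelinks count it exposes, which covers all sites including Commons):

```python
import json
from urllib.parse import urlencode
from urllib.request import Request, urlopen

# Minimal sketch: ask the Wikidata Query Service how many sitelinks
# item Q95726734 has attached to it.
QUERY = "SELECT ?n WHERE { wd:Q95726734 wikibase:sitelinks ?n . }"
url = "https://query.wikidata.org/sparql?" + urlencode({"query": QUERY, "format": "json"})

req = Request(url, headers={"User-Agent": "gpt3-item-example/0.1 (demo script)"})
with urlopen(req) as resp:
    rows = json.load(resp)["results"]["bindings"]

for row in rows:
    print("sitelinks:", row["n"]["value"])
```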