GPT-Neo

A series of large language models trained on the Pile. GPT-Neo was our first attempt to produce GPT-3-style language models, and it comes in 125M, 1.3B, and 2.7B parameter variants.
