Peter Albert, "Guide: Finetune GPT-NEO (2.7 Billion Parameters) on one GPU": GPT-NEO is a series of language models from EleutherAI that tries to replicate OpenAI's GPT-3 language model. EleutherAI's current models… (Apr 10, 2021)
Peter Albert, "Guide: Finetune GPT2 (1.5 B)": Finetune GPT2-XL (1.5 billion parameters, the biggest model) on a single 16 GB VRAM V100 Google Cloud instance with Huggingface… (Mar 28, 2021)