GPT-Neo 2.7B


  • GPT-Neo 2.7B is a transformer model designed using EleutherAI's replication of the GPT-3 architecture. GPT-Neo refers to the class of models, while 2.7B represents the number of parameters of this particular pre-trained model.
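As a sketch of how such a pre-trained model is typically used, the snippet below wraps loading it for text generation via the Hugging Face `transformers` library; the hub identifier `EleutherAI/gpt-neo-2.7B` and the use of the `pipeline` API are assumptions about the hosting setup, and the actual download is deferred to a function since the weights are several gigabytes.

```python
# Assumed Hugging Face hub id for this checkpoint.
MODEL_ID = "EleutherAI/gpt-neo-2.7B"

def load_generator():
    """Build a text-generation pipeline for GPT-Neo 2.7B.

    Deferred import and construction: calling this downloads the
    (multi-gigabyte) weights, so nothing heavy happens at import time.
    """
    from transformers import pipeline
    return pipeline("text-generation", model=MODEL_ID)

def generate(prompt: str, max_length: int = 50) -> str:
    """Generate a continuation of `prompt` with sampling enabled."""
    generator = load_generator()
    out = generator(prompt, do_sample=True, max_length=max_length)
    return out[0]["generated_text"]
```

A typical call would be `generate("EleutherAI has")`, which returns the prompt followed by sampled continuation text.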