microsoft-phi-2
By chukoz71
Phi-2 is a Transformer with 2.7 billion parameters. It was trained on the same data sources as Phi-1.5, augmented with a new data source consisting of various synthetic NLP texts and websites filtered for safety and educational value. When assessed on benchmarks testing common sense, language understanding, and logical reasoning, Phi-2 demonstrated nearly state-of-the-art performance among models with fewer than 13 billion parameters.
This repository has no tags.
Learn how you can create one:
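A minimal sketch of one way to create a tag with the KitOps CLI, assuming a Kitfile in the current directory. The registry host (jozu.ml) and tag name (latest) are illustrative assumptions, not values from this page; the namespace follows the uploader name above:

    # Pack the project into a ModelKit and apply a tag (full reference is assumed)
    kit pack . -t jozu.ml/chukoz71/microsoft-phi-2:latest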
To push a ModelKit, use this push command:
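A hedged sketch of the push step using standard KitOps syntax, with the same assumed registry host, namespace, and tag as above:

    # Authenticate to the registry (host is an assumption)
    kit login jozu.ml
    # Push the tagged ModelKit to the remote repository
    kit push jozu.ml/chukoz71/microsoft-phi-2:latest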