
The safest way to get your AI projects from dev to prod

Jozu Hub makes it easy to develop and manage models, datasets, codebases, and parameters.

Discover

Most popular repositories


Trending repositories


Top verified repositories


Newest repositories

gpt2-distilled-lora-alpaca


microsoft-phi-2


Phi-2 is a Transformer with 2.7 billion parameters. It was trained on the same data sources as Phi-1.5, augmented with a new data source consisting of various NLP synthetic texts and filtered websites (for safety and educational value). When assessed against benchmarks testing common sense, language understanding, and logical reasoning, Phi-2 demonstrated nearly state-of-the-art performance among models with fewer than 13 billion parameters.

llama3-githubactions


microsoft_phi-2


Phi-2 is a Transformer with 2.7 billion parameters. It was trained on the same data sources as Phi-1.5, augmented with a new data source consisting of various NLP synthetic texts and filtered websites (for safety and educational value). When assessed against benchmarks testing common sense, language understanding, and logical reasoning, Phi-2 demonstrated nearly state-of-the-art performance among models with fewer than 13 billion parameters.

microsoft_phi-4


phi-4 is a state-of-the-art open model built on a blend of synthetic datasets, data from filtered public-domain websites, and acquired academic books and Q&A datasets. The goal of this approach was to ensure that small, capable models were trained with data focused on high quality and advanced reasoning. phi-4 underwent a rigorous enhancement and alignment process, incorporating both supervised fine-tuning and direct preference optimization, to ensure precise instruction adherence and robust safety measures.

fraud-detection-model


A fraud-detection model built with scikit-learn.
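The listing does not say which estimator or features this repository uses, so the following is only a minimal sketch of what a scikit-learn fraud-detection model typically looks like: a classifier trained on labeled transaction features, with class weighting to handle the rarity of fraud. The synthetic data, feature count, and choice of `RandomForestClassifier` are all illustrative assumptions, not details from the repository.

```python
# Hypothetical sketch of a fraud-detection classifier with scikit-learn.
# The data here is synthetic; a real model would use transaction features.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in for transaction features (e.g. amount, hour, prior txns).
n = 2000
X = rng.normal(size=(n, 4))
# Mark a small minority of rows as fraud, correlated with the first feature.
y = ((X[:, 0] + rng.normal(scale=0.5, size=n)) > 1.8).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=0
)

# class_weight="balanced" compensates for the rare positive (fraud) class.
clf = RandomForestClassifier(
    n_estimators=200, class_weight="balanced", random_state=0
)
clf.fit(X_train, y_train)

# Score held-out transactions; ROC AUC is a common metric for imbalanced data.
scores = clf.predict_proba(X_test)[:, 1]
auc = roc_auc_score(y_test, scores)
print(f"ROC AUC: {auc:.3f}")
```

In practice such a model would be evaluated with precision-recall metrics as well, since fraud datasets are heavily imbalanced and accuracy alone is misleading.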