Browse
Showing 106 - 120 of 128 repositories. Every repository's latest ModelKit includes all five component types: Model, Datasets, Codebases, Docs, and Configuration.
- test10: test1
- llama3-githubactions5 (latest ModelKit tag: 3)
- randomforest0
- kitops-huggingface-dagger1 (latest ModelKit tag: 1)
- sv-test0
- aopq1-220: Work tool
- test0: test description
- kit-ray0: Integrate Ray with KitOps
- quick-start0
- testrepo0: Lorem ipsum is a dummy or placeholder text commonly used in graphic design, publishing, and web development to fill empty spaces in a layout that does not yet have content.
- fraud-detection-model0: Fraud detection model using sklearn
- microsoft_phi-40: phi-4 is a state-of-the-art open model built upon a blend of synthetic datasets, data from filtered public domain websites, and acquired academic books and Q&A datasets. The goal of this approach was to ensure that small capable models were trained with data focused on high quality and advanced reasoning. phi-4 underwent a rigorous enhancement and alignment process, incorporating both supervised fine-tuning and direct preference optimization to ensure precise instruction adherence and robust safety measures.
- microsoft_phi-20: Phi-2 is a Transformer with 2.7 billion parameters. It was trained using the same data sources as Phi-1.5, augmented with a new data source that consists of various NLP synthetic texts and filtered websites (for safety and educational value). When assessed against benchmarks testing common sense, language understanding, and logical reasoning, Phi-2 showcased nearly state-of-the-art performance among models with fewer than 13 billion parameters.
- llama3-githubactions1 (latest ModelKit tag: 3)
- microsoft-phi-20: Phi-2 is a Transformer with 2.7 billion parameters. It was trained using the same data sources as Phi-1.5, augmented with a new data source that consists of various NLP synthetic texts and filtered websites (for safety and educational value). When assessed against benchmarks testing common sense, language understanding, and logical reasoning, Phi-2 showcased nearly state-of-the-art performance among models with fewer than 13 billion parameters.