mixtral-8x7b

Use the Pull Tag button to download this ModelKit.

Or, read our KitOps documentation to learn how to use kit unpack --filter to download only the components you need.
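For command-line use, the same ModelKit can also be fetched with the kit CLI. A minimal sketch follows; the registry reference shown is a hypothetical placeholder, so substitute the repository and tag given by the Pull Tag button.

    # Pull the complete ModelKit (roughly 49.6 GB) into local storage.
    # The reference below is a placeholder; use the repository and tag from the Pull Tag button.
    kit pull jozu.ml/mistral-ai/mixtral-8x7b:0.1.0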

ModelKit Tag Metadata
Digest:
Author: Mistral.ai
Date added:
Size: 49.6 GB
Total pulls: 406

Package
Name: mixtral-8x7b-v0.1
Version: 0.1.0
Authors: Mistral.ai
Description: The Mixtral-8x7B Large Language Model (LLM) is a pretrained generative Sparse Mixture of Experts.

Model
Name: mixtral-8x7b-47B-instruct-q8_0
Path: ./mixtral-8x7b-47B-instruct-q8_0.gguf
License: Apache-2.0
Parts: N/A
Parameters: N/A
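
Because the quantized GGUF file above accounts for nearly all of the ModelKit's 49.6 GB, it is often worth unpacking only the model component. The commands below are a sketch that assumes the kit unpack --filter flag mentioned above, a placeholder registry reference, and an output-directory flag; check the KitOps documentation for the exact syntax.

    # Unpack only the model layer (the .gguf file), skipping the code and config files.
    # Reference, filter value, and --dir flag are assumptions; verify against the KitOps docs.
    kit unpack jozu.ml/mistral-ai/mixtral-8x7b:0.1.0 --filter model --dir ./mixtral-8x7b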

Codebases
README.md: Readme file.
config.json: Configuration file.
generation_config.json: Generation configuration file.
special_tokens_map.json: Mapping of special tokens.
tokenizer.json: Tokenizer configuration.
tokenizer_config.json: Detailed tokenizer configuration.