
mixtral-8x7b


Use the Pull Tag button to download this ModelKit.

Alternatively, see the KitOps documentation to learn how to use kit unpack --filter to download only the components you need.
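The two options above can be sketched as shell commands. The registry reference below is an assumption pieced together from the package metadata on this page (author Mistral.ai, name mixtral-8x7b-v0.1, version 0.1.0) — substitute the exact reference shown by the Pull Tag button, and check the KitOps documentation for the supported --filter values:

```shell
# Pull the complete ModelKit (~26.4GB) to the local KitOps store.
# The registry path and tag here are illustrative, not exact.
kit pull jozu.ml/mistral-ai/mixtral-8x7b:0.1.0

# Or unpack only selected components — e.g. just the model weights —
# using the --filter flag mentioned in the KitOps docs.
kit unpack jozu.ml/mistral-ai/mixtral-8x7b:0.1.0 --filter=model -d ./mixtral-8x7b
```

Filtering matters for a ModelKit this size: the quantized GGUF weights dominate the 26.4GB package, so pulling only the codebase or docs components is far cheaper when that is all you need.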

ModelKit Tag Metadata
Digest:
Author: Mistral.ai
Date added:
Size: 26.4GB
Total pulls: 617
Package
Name: mixtral-8x7b-v0.1
Version: 0.1.0
Authors: Mistral.ai
Description: The Mixtral-8x7B Large Language Model (LLM) is a pretrained generative Sparse Mixture of Experts.
Model
Name: mixtral-8x7b
Path: ./mixtral-8x7b-47B-text-q4_0.gguf
License: Apache-2.0
Parts: N/A
Parameters: N/A
Codebases
- config.json
- generation_config.json
- special_tokens_map.json
- tokenizer.json
- tokenizer_config.json

Docs
- README.md