Ollama Models API

mixtral

A set of Mixture of Experts (MoE) models with open weights by Mistral AI, available in 8x7b and 8x22b parameter sizes.
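A minimal sketch of pulling one of these models and querying it through a local Ollama server (this assumes Ollama is installed and listening on its default port, 11434; the `mixtral:8x7b` tag is inferred from the parameter sizes above):

```shell
# Pull the 8x7b variant of mixtral from the Ollama library
ollama pull mixtral:8x7b

# Request a completion from the local REST API (non-streaming)
curl http://localhost:11434/api/generate -d '{
  "model": "mixtral:8x7b",
  "prompt": "Why is the sky blue?",
  "stream": false
}'
```

With `"stream": false` the server returns a single JSON object containing the full response, rather than a stream of partial chunks.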

360.9K Pulls · Updated 2 weeks ago · 69 Tags