Access 2 Hugging Face H4 models on OpenRouter including Zephyr 141B-A35B and Zephyr 7B. Compare pricing, context windows, and capabilities.
[Chart: Hugging Face H4 tokens processed on OpenRouter]
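To compare pricing and context windows programmatically, OpenRouter exposes a public model-listing endpoint. Below is a minimal sketch in Python using `requests`; the response fields (`data[].id`, `context_length`, `pricing`) and the `huggingfaceh4/` slug prefix are assumptions based on OpenRouter's API conventions, not guarantees from this page.

```python
# Sketch: list the Hugging Face H4 models on OpenRouter with pricing
# and context window. Response shape is an assumption; verify against
# the current OpenRouter API docs.
import requests

resp = requests.get("https://openrouter.ai/api/v1/models", timeout=30)
resp.raise_for_status()

for model in resp.json().get("data", []):
    # Assumption: H4 models are listed under the "huggingfaceh4/" prefix.
    if model.get("id", "").startswith("huggingfaceh4/"):
        pricing = model.get("pricing", {})
        print(
            model["id"],
            "| context:", model.get("context_length"),
            "| prompt $/tok:", pricing.get("prompt"),
            "| completion $/tok:", pricing.get("completion"),
        )
```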
Zephyr 141B-A35B is a Mixture of Experts (MoE) model with 141B total parameters and 35B active parameters. It is an instruct fine-tune of Mixtral 8x22B, trained on a mix of publicly available and synthetic datasets. #moe
Zephyr is a series of language models trained to act as helpful assistants. Zephyr-7B-β is the second model in the series: a fine-tuned version of mistralai/Mistral-7B-v0.1, trained on a mix of publicly available and synthetic datasets using Direct Preference Optimization (DPO).
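Either model can be called through OpenRouter's OpenAI-compatible chat completions endpoint. The sketch below assumes the slug `huggingfaceh4/zephyr-7b-beta` and an API key in the `OPENROUTER_API_KEY` environment variable; substitute the exact slug shown on the model's page.

```python
# Sketch: one chat completion against Zephyr 7B via OpenRouter.
# The model slug and env var name are assumptions for illustration.
import os
import requests

resp = requests.post(
    "https://openrouter.ai/api/v1/chat/completions",
    headers={"Authorization": f"Bearer {os.environ['OPENROUTER_API_KEY']}"},
    json={
        "model": "huggingfaceh4/zephyr-7b-beta",  # assumed slug
        "messages": [
            {"role": "user", "content": "Summarize DPO in one sentence."}
        ],
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```

Because the endpoint is OpenAI-compatible, the same request works with any OpenAI-style client by pointing its base URL at `https://openrouter.ai/api/v1`.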