
Dolphin 2.9.2 Mixtral 8x22B 🐬

cognitivecomputations/dolphin-mixtral-8x22b

Created Jun 8, 2024 · 65,536 context

Dolphin 2.9.2 is designed for instruction following, conversational use, and coding. It is a finetune of Mixtral 8x22B Instruct, featuring a 64k context window; fine-tuning was performed at a 16k sequence length using ChatML prompt templates.
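Since the model was fine-tuned with ChatML templates, prompts sent to raw text-completion endpoints should follow that format. A minimal sketch of the template (the system prompt text is an illustrative placeholder, not part of the model card):

```python
def build_chatml_prompt(messages):
    """Render a list of {role, content} dicts into a ChatML string.

    ChatML wraps each turn in <|im_start|>ROLE ... <|im_end|> markers and
    ends with a trailing <|im_start|>assistant line to cue the model's reply.
    """
    parts = []
    for m in messages:
        parts.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>")
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)

prompt = build_chatml_prompt([
    {"role": "system", "content": "You are Dolphin, a helpful assistant."},
    {"role": "user", "content": "Explain mixture-of-experts in one sentence."},
])
```

Chat-style API endpoints normally apply this template server-side, so manual formatting is only needed when driving the model through a plain completion interface.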

This model is a successor to Dolphin Mixtral 8x7B.

The model is uncensored: alignment and bias mitigations have been stripped, so an external alignment layer is required for ethical deployment. Users are cautioned to use this highly compliant model responsibly, as discussed in the blog post on uncensored models at erichartford.com/uncensored-models.
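On OpenRouter the model is addressed by the slug shown above through an OpenAI-compatible chat completions endpoint. A hedged sketch of a request, assuming that convention (the endpoint path, field names, and `OPENROUTER_API_KEY` environment variable follow OpenRouter's documented OpenAI-compatible API; the system prompt doubles as the "external alignment layer" the description calls for):

```python
import json
import os
import urllib.request

# Request payload in the OpenAI-compatible chat format; the system message
# is where deployers would place their own alignment/safety instructions.
payload = {
    "model": "cognitivecomputations/dolphin-mixtral-8x22b",
    "messages": [
        {"role": "system", "content": "You are Dolphin, a helpful assistant. "
                                      "Decline harmful requests."},
        {"role": "user", "content": "Summarize mixture-of-experts routing."},
    ],
}

def send(payload):
    # Requires OPENROUTER_API_KEY in the environment; not executed offline.
    req = urllib.request.Request(
        "https://openrouter.ai/api/v1/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {os.environ['OPENROUTER_API_KEY']}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

Because the request uses the chat `messages` format rather than raw completion, the provider applies the ChatML template on the server.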

#moe #uncensored

Recent activity on Dolphin 2.9.2 Mixtral 8x22B 🐬

Total usage per day on OpenRouter: not enough data to display yet.