Ai2 and Contextual AI Release OLMoE, a 100% open-source Mixture of Experts Model
OLMoE-1B-7B outperforms all open 1B models and is competitive with dense models that have significantly higher inference costs and memory footprints.