Qwen1.5-MoE-A2.7B is a small mixture-of-experts (MoE) model with only 2.7 billion activated parameters, yet it matches the performance of state-of-the-art 7B models such as Mistral 7B and Qwen1.5-7B.
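The reason activated parameters can be far fewer than total parameters is that an MoE layer routes each token to only a few experts. Below is a minimal, hedged sketch of top-k expert routing in NumPy; the expert count, dimensions, and function names are illustrative assumptions, not Qwen1.5-MoE's actual configuration.

```python
import numpy as np

# Illustrative top-k MoE routing sketch; sizes are assumptions,
# not Qwen1.5-MoE's real architecture.
rng = np.random.default_rng(0)

num_experts = 8   # hypothetical total expert count
top_k = 2         # experts activated per token
d_model = 16

def moe_layer(x, expert_weights, router_weights):
    """Route each token to its top_k experts; only those experts'
    parameters are used ("activated") for that token."""
    logits = x @ router_weights                       # (tokens, num_experts)
    topk = np.argsort(logits, axis=-1)[:, -top_k:]    # chosen expert indices
    sel = np.take_along_axis(logits, topk, axis=-1)   # their router logits
    # Softmax over only the selected experts' logits -> gating weights
    gate = np.exp(sel - sel.max(axis=-1, keepdims=True))
    gate /= gate.sum(axis=-1, keepdims=True)
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        for j in range(top_k):
            e = topk[t, j]
            out[t] += gate[t, j] * (x[t] @ expert_weights[e])
    return out

x = rng.standard_normal((4, d_model))                       # 4 tokens
experts = rng.standard_normal((num_experts, d_model, d_model))
router = rng.standard_normal((d_model, num_experts))
y = moe_layer(x, experts, router)
print(y.shape)  # (4, 16)
```

Each token touches only `top_k` of the `num_experts` expert matrices, so the per-token activated parameter count is a fraction of the model's total, which is how a 2.7B-activated model can be much cheaper to run than a dense 7B model.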
About Qwen 1.5 MoE
- Qwen 1.5 MoE - Highly efficient mixture-of-experts (MoE) model from Alibaba.
- Qwen 1.5 MoE is listed under Open Source and Artificial Intelligence.
- Open Source - sharing is caring: build great things together.
- Artificial Intelligence - AI saves us time and scales personalized services like shopping like never before. But watch out, the robots are getting smarter.
- Visit Tiny Alternatives for more updates about Qwen 1.5 MoE.
- Check the top 10 alternatives to Qwen 1.5 MoE on Tiny Alternatives.
- Qwen 1.5 MoE was first published on 2024-04-03 13:00:22.