Mixture of Experts
Omar Sanseviero @osanseviero Grok weights are out. Download them quickly at https://huggingface.co/xai-org/grok-1
huggingface-cli download xai-org/grok-1 --repo-type model --include ckpt/tensor* --local-dir checkpoints/ckpt-0 --local-dir-use-symlinks False
Learn about mixture of experts at https://hf.co/blog/moe
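The CLI invocation above maps directly onto the `huggingface_hub` Python API. The sketch below is only an illustrative equivalent, not part of the original announcement; it assumes `huggingface_hub` is installed and mirrors the same repo, include pattern, and target directory as the command.

```python
# Sketch: Python equivalent of the huggingface-cli download above.
# Assumes `pip install huggingface_hub`; arguments mirror the CLI flags.
from huggingface_hub import snapshot_download

snapshot_download(
    repo_id="xai-org/grok-1",
    repo_type="model",
    allow_patterns=["ckpt/tensor*"],   # same filter as --include ckpt/tensor*
    local_dir="checkpoints/ckpt-0",
    local_dir_use_symlinks=False,
)
```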
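For context on the linked post: in a mixture-of-experts layer the "experts" are parallel sub-networks inside the model, and a learned router sends each token to the top-k of them. The following is a generic top-2 routing sketch in plain Python/NumPy for illustration only; it is not Grok-1's actual implementation, and all names and sizes in it are assumed.

```python
# Minimal sketch of top-2 mixture-of-experts routing (illustrative only).
import numpy as np

rng = np.random.default_rng(0)
d_model, n_experts, top_k = 8, 4, 2

# Each "expert" here is just a random linear map; real MoEs use small FFNs.
experts = [rng.standard_normal((d_model, d_model)) for _ in range(n_experts)]
router = rng.standard_normal((d_model, n_experts))  # learned in practice

def moe_layer(x):
    """Route one token vector x to its top-k experts and mix their outputs."""
    logits = x @ router                      # router score per expert
    top = np.argsort(logits)[-top_k:]        # indices of the top-k experts
    weights = np.exp(logits[top])
    weights /= weights.sum()                 # softmax over the selected experts
    return sum(w * (x @ experts[i]) for w, i in zip(weights, top))

token = rng.standard_normal(d_model)
print(moe_layer(token).shape)  # (8,) -- same dimensionality as the input
```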
Replying to @osanseviero
Your “Mixture of Experts Explained” is very revealing. Yes, I looked at your “Open Source MoEs” project links.
Are those few groups with big computers supposed to be the “Mixture of Experts”? It is not that hard if you actually share and use global open tokens. When I see the flopping door on the suborbital test, I know why you do things the way you do. Tell him to stop saying “open”.
Richard Collins, The Internet Foundation