DEV Community

# moe

Posts

์ž‘์€ ๋ชจ๋ธ์˜ ์‹œ๋Œ€ โ€” SLM, MoE, Distillation, Quantization ์ด์ •๋ฆฌ

์ž‘์€ ๋ชจ๋ธ์˜ ์‹œ๋Œ€ โ€” SLM, MoE, Distillation, Quantization ์ด์ •๋ฆฌ

Comments
2 min read
14GB Model Compressed to 3.5GB, 95% Quality Survived – The Four Pillars of Model Efficiency

Comments
4 min read
14GB ๋ชจ๋ธ์„ 3.5GB๋กœ ์ค„์˜€๋”๋‹ˆ ์„ฑ๋Šฅ์˜ 95%๊ฐ€ ์‚ด์•„๋‚จ์•˜๋‹ค โ€” ๋ชจ๋ธ ๊ฒฝ๋Ÿ‰ํ™” 4์ข… ์„ธํŠธ

14GB ๋ชจ๋ธ์„ 3.5GB๋กœ ์ค„์˜€๋”๋‹ˆ ์„ฑ๋Šฅ์˜ 95%๊ฐ€ ์‚ด์•„๋‚จ์•˜๋‹ค โ€” ๋ชจ๋ธ ๊ฒฝ๋Ÿ‰ํ™” 4์ข… ์„ธํŠธ

Comments
1 min read
The Era of Small Models – SLM, MoE, Distillation, and Quantization Explained

Comments
8 min read
Run a Local Neuro Sama with Just **1 Line of Code** Using the LivinGrimoire software design pattern

1 reaction
Comments
2 min read
The Sparse Future: MoEs Eat the World

Comments
2 min read
The CoderPunk Guide to Mixture of Experts: Requipping AI Like Fairy Tail's Elza

2 reactions
Comments
4 min read
Mixture of Experts (MoE)

Comments
2 min read