Trending MoE-related GitHub AI project repositories

Discover the most popular open-source projects and tools related to MoE, and stay on top of the latest development trends and innovations.

sglang
Python

SGLang is a fast serving framework for large language models and vision language models.

sgl-project
13.6K
1.6K
2025-04-27

Anime Scene Search by Image

soruly
4.7K
237
2024-10-13
Bangumi
TypeScript

:electron: An unofficial https://bgm.tv UI-first app client for Android and iOS, built with React Native. An ad-free, hobby-driven, non-profit ACG tracker in the style of Douban, and a third-party client for bgm.tv. Redesigned for mobile, it ships many enhanced features that are hard to implement on the web version, plus extensive customization options. Currently supports iOS / Android / WSA, mobile / simple pad layouts, light / dark themes, and the mobile web.

czy0729
4.3K
144
2025-04-27
Moeditor
JavaScript

(discontinued) Your all-purpose markdown editor.

Moeditor
4.1K
273
2020-07-07
MoeGoe
Python

Executable file for VITS inference

CjangCjengh
2.4K
247
2023-08-22
Moe-Counter
JavaScript

Moe counter badge with multiple themes!

journey-ad
2.2K
234
2025-02-06

An open-source, concise, and aesthetically pleasing third-party client for KuGou that supports Windows / macOS / Linux :electron:

iAJue
2.2K
145
2025-04-22
MoE-LLaVA
Python

Mixture-of-Experts for Large Vision-Language Models

PKU-YuanGroup
2.2K
133
2024-12-03
MoBA
Python

MoBA: Mixture of Block Attention for Long-Context LLMs

MoonshotAI
1.8K
103
2025-04-03
fastmoe
Python

A fast MoE impl for PyTorch

laekov
1.7K
196
2025-02-10

DeepSeekMoE: Towards Ultimate Expert Specialization in Mixture-of-Experts Language Models

deepseek-ai
1.7K
277
2024-01-16
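The DeepSeekMoE paper's headline idea is splitting the expert pool into always-active shared experts plus a large set of fine-grained routed experts chosen per token. A minimal NumPy sketch of that split, where every name, shape, and function is an illustrative assumption rather than the repository's actual API:

```python
import numpy as np

def deepseek_style_moe(x, shared, routed, w_gate, k=2):
    """Illustrative shared + routed expert combination.

    x:       (tokens, d_model) token representations
    shared:  list of callables applied to every token (always active)
    routed:  list of callables; a gate picks the top-k per token
    w_gate:  (d_model, n_routed) gating weights
    """
    out = sum(f(x) for f in shared)              # shared experts see all tokens
    logits = x @ w_gate                          # (tokens, n_routed)
    top = np.argsort(logits, axis=-1)[:, -k:]    # top-k routed expert ids
    g = np.take_along_axis(logits, top, axis=-1)
    g = np.exp(g - g.max(axis=-1, keepdims=True))
    g /= g.sum(axis=-1, keepdims=True)           # renormalized gate weights
    for t in range(x.shape[0]):
        for j in range(k):
            out[t] += g[t, j] * routed[top[t, j]](x[t])
    return out
```

The shared experts absorb common knowledge, which (per the paper's argument) frees the many small routed experts to specialize.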
OpenMoE
Python

A family of open-sourced Mixture-of-Experts (MoE) Large Language Models

XueFuzhao
1.5K
78
2024-03-08
paimon-moe
JavaScript

Your best Genshin Impact companion! Helps you plan what to farm with an ascension calculator and database, and tracks your progress with a todo list and wish counter.

MadeBaruna
1.4K
275
2025-04-15
moemail
TypeScript

A cute temporary email service built on the NextJS + Cloudflare stack 🎉

beilunyang
1.1K
488
2025-04-06

PyTorch Re-Implementation of "The Sparsely-Gated Mixture-of-Experts Layer" by Noam Shazeer et al. https://arxiv.org/abs/1701.06538

davidmrau
1.1K
108
2024-04-19
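Several entries in this list (fastmoe, Tutel, this re-implementation) build on the same core mechanism from the Shazeer et al. paper: a gating network scores the experts, keeps the top-k per token, and mixes their outputs. A minimal NumPy sketch of top-k softmax gating, with all names and shapes chosen for illustration rather than taken from any of these repositories:

```python
import numpy as np

def top_k_gating(x, w_gate, k=2):
    """Score experts and keep the top-k per token.

    x:       (tokens, d_model) token representations
    w_gate:  (d_model, n_experts) gating weights
    Returns (indices, weights): chosen expert ids and their
    renormalized softmax weights for each token.
    """
    logits = x @ w_gate                              # (tokens, n_experts)
    top_idx = np.argsort(logits, axis=-1)[:, -k:]    # top-k expert ids
    top_logits = np.take_along_axis(logits, top_idx, axis=-1)
    # softmax over only the selected experts, so weights sum to 1
    e = np.exp(top_logits - top_logits.max(axis=-1, keepdims=True))
    return top_idx, e / e.sum(axis=-1, keepdims=True)

def moe_forward(x, w_gate, experts, k=2):
    """Combine expert outputs weighted by the gate. `experts` is a
    list of callables, each mapping (d_model,) -> (d_model,)."""
    idx, w = top_k_gating(x, w_gate, k)
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        for j in range(k):
            out[t] += w[t, j] * experts[idx[t, j]](x[t])
    return out
```

Production libraries differ mainly in what surrounds this loop: batched expert dispatch, capacity limits, load-balancing losses, and (in fastmoe/Tutel) expert parallelism across devices.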
Aria
Jupyter Notebook

Codebase for Aria - an Open Multimodal Native MoE

rhymes-ai
1.0K
86
2025-01-22

Speech synthesis model / inference GUI repo for galgame characters, based on Tacotron2, HiFi-GAN, VITS, and Diff-SVC

luoyily
986
77
2023-03-03
llama-moe
Python

⛷️ LLaMA-MoE: Building Mixture-of-Experts from LLaMA with Continual Pre-training (EMNLP 2024)

pjlab-sys4nlp
957
56
2024-12-06
moepush
TypeScript

A cute message push service built on the NextJS + Cloudflare stack, supporting multiple push channels ✨

beilunyang
948
164
2025-04-20
Tutel
Python

Tutel MoE: an optimized Mixture-of-Experts library with support for DeepSeek FP8/FP4

microsoft
811
96
2025-04-27
Adan
Python

Adan: Adaptive Nesterov Momentum Algorithm for Faster Optimizing Deep Models

sail-sg
789
67
2024-07-03

A toolkit for inference and evaluation of 'mixtral-8x7b-32kseqlen' from Mistral AI

open-compass
769
79
2023-12-16
moe-theme.el
Emacs Lisp

A customizable, colorful eye-candy theme for Emacs users. Moe, moe, kyun!

kuanyui
759
63
2025-02-04

An app to help you capture thoughts and ideas

mudkipme
747
79
2025-02-15

The codes about "Uni-MoE: Scaling Unified Multimodal Models with Mixture of Experts"

HITsz-TMG
718
43
2025-04-10

An open-source solution for full-parameter fine-tuning of DeepSeek-V3/R1 671B, including complete code and scripts from training to inference, as well as practical experience and conclusions gathered along the way.

ScienceOne-AI
663
82
2025-03-13

Virtual YouTubers on bilibili

dd-center
620
36
2024-09-10
moedict-webkit
Objective-C

Moedict website

g0v
619
99
2025-04-10

A curated reading list of research in Mixture-of-Experts (MoE).

codecaution
618
45
2024-10-30

Satania IS the BEST waifu. No really, she is; if you don't believe me, this website will convince you.

Pizzacus
615
57
2022-10-10

Reverse image search tool (SauceNao, IQDB, Ascii2D, trace.moe, and more)

Decimation
610
27
2025-04-16

Chinese Mixtral MoE LLMs

ymcui
603
44
2024-04-30
Time-MoE
Python

[ICLR 2025 Spotlight] Official implementation of "Time-MoE: Billion-Scale Time Series Foundation Models with Mixture of Experts"

Time-MoE
575
51
2025-03-30

GUI for MoeGoe

CjangCjengh
570
68
2023-08-22

Moebooru, a fork of danbooru1 that has been heavily modified

moebooru
547
83
2025-04-22

An app to help you capture thoughts and ideas

mudkipme
547
47
2025-03-23
MoeList
Kotlin

Another unofficial Android MAL client

axiel7
546
20
2025-04-27

This Telegram bot can identify the anime when you send a screenshot to it

soruly
531
80
2025-04-18
step_into_llm
Jupyter Notebook

MindSpore online courses: Step into LLM

mindspore-courses
459
111
2025-01-06
moerail
JavaScript

Railway station code lookup × EMU trainset routing lookup

Arnie97
441
31
2023-02-27
hydra-moe
Python

No description provided.

SkunkworksAI
412
15
2023-11-03
pixiv.moe
TypeScript

😘 A Pinterest-style layout site that shows illustrations from pixiv.net, ordered by popularity.

kokororin
364
43
2023-03-08

Open-source firmware replacement for Tuya Wi-Fi thermostats from Beca and Moes, with Home Assistant autodiscovery

fashberg
354
70
2023-08-27

🖼 Anime image downloader for booru sites: Pixiv.net, Bilibili.com, Konachan.com, Yande.re, behoimi.org, Safebooru, Danbooru, Gelbooru, SankakuComplex, Kawainyan, MiniTokyo, e-shuushuu, Zerochan, WorldCosplay, Yuriimg, etc.

xplusky
354
25
2025-04-15

:dancer: Anime tracker, database and community. Moved to https://git.akyoto.dev/web/notify.moe

animenotifier
351
45
2022-09-26

A Free and Open Source Java Framework for Multiobjective Optimization

MOEAFramework
336
127
2025-04-15

Implementation of ST-MoE, the latest incarnation of MoE after years of research at Brain, in PyTorch

lucidrains
328
28
2024-06-17

A Telegram bot that imports LINE/kakao stickers or creates/manages new sticker set.

star-39
328
35
2024-06-06

No description provided.

windrises
327
8
2022-12-14
DiT-MoE
Python

Scaling Diffusion Transformers with Mixture of Experts

feizc
313
14
2024-09-09

A blog built with Laravel that supports Markdown syntax

moell-peng
301
81
2022-07-31

moe SS Front End for https://github.com/mengskysama/shadowsocks/tree/manyuser

wzxjohn
298
107
2015-02-27

Implementation of Soft MoE, proposed by Brain's Vision team, in PyTorch

lucidrains
283
8
2025-04-02
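Soft MoE differs from the top-k routers elsewhere in this list: instead of discretely selecting experts, every expert slot receives a softmax-weighted mix of all tokens, and slot outputs are mixed back per token, keeping routing fully differentiable. A NumPy sketch of the idea under assumed names and shapes (not the repository's API):

```python
import numpy as np

def softmax(z, axis):
    e = np.exp(z - z.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def soft_moe(x, phi, experts, slots_per_expert=1):
    """Soft (differentiable) routing sketch.

    x:    (tokens, d_model) token representations
    phi:  (d_model, n_slots) slot parameters,
          with n_slots == len(experts) * slots_per_expert
    """
    logits = x @ phi                      # (tokens, n_slots)
    dispatch = softmax(logits, axis=0)    # per-slot mix over tokens
    combine = softmax(logits, axis=1)     # per-token mix over slots
    slots = dispatch.T @ x                # (n_slots, d_model)
    y = np.stack([experts[i // slots_per_expert](s)
                  for i, s in enumerate(slots)])
    return combine @ y                    # (tokens, d_model)
```

Because no token is ever dropped or hard-assigned, there is no load-balancing loss or capacity factor to tune, at the cost of every expert influencing every token.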

A Python library for the state-of-the-art Bayesian optimization algorithms, with the core implemented in C++.

wujian16
269
63
2020-02-05
MoeSR
JavaScript

An application specialized in image super-resolution for ACGN illustrations and visual novel CGs.

TeamMoeAI
268
8
2024-04-17

Official LISTEN.moe Android app

LISTEN-moe
262
25
2025-04-27

GRadient-INformed MoE

microsoft
262
15
2024-09-26

A Material Design style gallery app for "meizi" (girl) pictures.

HotBitmapGG
251
76
2017-02-14

An improved fork of MoeLoader

usaginya
244
37
2021-07-23

Inferflow is an efficient and highly configurable inference engine for large language models (LLMs).

inferflow
242
25
2024-03-15

萌音影视 - an online video streaming application

iAJue
241
68
2018-10-31
MoH
Python

MoH: Multi-Head Attention as Mixture-of-Head Attention

SkyworkAI
238
10
2024-10-29

True coroutines for PHP>=8.1 without worrying about event loops and callbacks.

moebiusphp
232
4
2022-06-09

A libGDX cross-platform API for InApp purchasing.

libgdx
229
87
2025-01-03

ModuleFormer is a MoE-based architecture that includes two different types of experts: stick-breaking attention heads and feedforward experts. We released a collection of ModuleFormer-based Language Models (MoLM) ranging in scale from 4 billion to 8 billion parameters.

IBM
220
11
2024-04-11

Code for the CVPR 2024 paper "Boosting Continual Learning of Vision-Language Models via Mixture-of-Experts Adapters"

JiazuoYu
211
16
2024-11-17

[ICLR 2025] MoE++: Accelerating Mixture-of-Experts Methods with Zero-Computation Experts

SkyworkAI
211
6
2024-10-16
fiddler
Python

[ICLR'25] Fast Inference of MoE Models with CPU-GPU Orchestration

efeslab
208
19
2024-11-18

Misspelling Oblivious Word Embeddings

facebookresearch
203
22
2019-08-06
MoePhoto
Python

MoePhoto Image Toolbox

opteroncx
189
23
2024-09-23

Mixture-of-Experts (MoE) Language Model

IEIT-Yuan
186
40
2024-09-09
PlanMoE
Python

A repository aimed at accelerating MoE model training with a more efficient scheduling method.

JiangkuoWang
177
19
2025-02-14