Scores are signals, not verdicts. This tracker surfaces velocity, transparency, and local compatibility in open models, so you can choose what fits your hardware, your use case, and your privacy boundary.
Public leaderboards are snapshots, not guarantees. Many benchmarks suffer from data contamination or reward verbosity over accuracy. Use this table to spot momentum, then verify models in your own environment.
Estimate VRAM needs before downloading. INT4 quantization cuts the memory footprint by roughly 75% relative to FP16, usually with minimal quality loss.
| VRAM Rule | ≈ (Parameters × Precision bits) ÷ 8 |
|---|---|
| Entry (12GB) | 7B–8B → direct response, light tasks |
| Recommended (24GB) | Up to 34B → 30–50 tok/s, coding/reasoning |
| Workstation (≥48GB) | 70B+ → production throughput, multilingual |
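The rule of thumb above can be sketched as a quick estimator. The 20% overhead multiplier for KV cache and runtime buffers is an assumed safety margin, not part of the rule itself:

```python
def estimate_vram_gb(params_billions: float, precision_bits: int,
                     overhead: float = 1.2) -> float:
    """Estimate VRAM in GB: (parameters x precision bits) / 8 bits per byte.

    The `overhead` factor (assumption: +20% for KV cache and runtime
    buffers) is a margin on top of the raw weight memory.
    """
    weight_gb = params_billions * precision_bits / 8  # 1B params at 8-bit ~= 1 GB
    return weight_gb * overhead

# A 34B model at INT4 lands around 20 GB, which is why 24 GB cards are
# the recommended tier for that size class:
print(f"{estimate_vram_gb(34, 4):.1f} GB")
```

At FP16 the same 34B model would need roughly 82 GB by this estimate, consistent with the ~75% savings quoted above.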
Type legend: 🟢 pretrained · 🟩 continuously pretrained · 🔶 fine-tuned · 💬 chat (RLHF/DPO/IFT) · 🤝 merge

| Rank | Model | Score | Type | Params |
|---|---|---|---|---|
| 1 | 0-hero/Matter-0.2-7B-DPO | 8.91 | 💬 MistralForCausalLM | 7.242B |
| 2 | 01-ai/Yi-1.5-34B | 25.65 | 🟢 LlamaForCausalLM | 34.389B |
| 3 | 01-ai/Yi-1.5-34B-32K | 26.73 | 🟢 LlamaForCausalLM | 34.389B |
| 4 | 01-ai/Yi-1.5-34B-Chat | 33.36 | 💬 LlamaForCausalLM | 34.389B |
| 5 | 01-ai/Yi-1.5-34B-Chat-16K | 29.4 | 💬 LlamaForCausalLM | 34.389B |
| 6 | 01-ai/Yi-1.5-6B | 16.75 | 🟢 LlamaForCausalLM | 6.061B |
| 7 | 01-ai/Yi-1.5-6B-Chat | 22.78 | 💬 LlamaForCausalLM | 6.061B |
| 8 | 01-ai/Yi-1.5-9B | 22.15 | 🟢 LlamaForCausalLM | 8.829B |
| 9 | 01-ai/Yi-1.5-9B-32K | 19.81 | 🟢 LlamaForCausalLM | 8.829B |
| 10 | 01-ai/Yi-1.5-9B-Chat | 29.53 | 💬 LlamaForCausalLM | 8.829B |
| 11 | 01-ai/Yi-1.5-9B-Chat-16K | 23.77 | 💬 LlamaForCausalLM | 8.829B |
| 12 | 01-ai/Yi-34B | 22.37 | 🟢 LlamaForCausalLM | 34.389B |
| 13 | 01-ai/Yi-34B-200K | 20.01 | 🟢 LlamaForCausalLM | 34.389B |
| 14 | 01-ai/Yi-34B-Chat | 24.23 | 💬 LlamaForCausalLM | 34.389B |
| 15 | 01-ai/Yi-6B | 13.61 | 🟢 LlamaForCausalLM | 6.061B |
| 16 | 01-ai/Yi-6B-200K | 12.0 | 🟢 LlamaForCausalLM | 6.061B |
| 17 | 01-ai/Yi-6B-Chat | 14.12 | 💬 LlamaForCausalLM | 6.061B |
| 18 | 01-ai/Yi-9B | 17.81 | 🟢 LlamaForCausalLM | 8.829B |
| 19 | 01-ai/Yi-9B-200K | 17.73 | 🟢 LlamaForCausalLM | 8.829B |
| 20 | 01-ai/Yi-Coder-9B-Chat | 16.99 | 🔶 LlamaForCausalLM | 8.829B |
| 21 | 1-800-LLMs/Qwen-2.5-14B-Hindi | 36.27 | 🔶 Qwen2ForCausalLM | 14.77B |
| 22 | 1-800-LLMs/Qwen-2.5-14B-Hindi-Custom-Instruct | 31.02 | 🔶 Qwen2ForCausalLM | 14.77B |
| 23 | 1024m/PHI-4-Hindi | 27.49 | 🔶 LlamaForCausalLM | 14.66B |
| 24 | 1024m/QWEN-14B-B100 | 41.92 | 🔶 Qwen2ForCausalLM | 14.77B |
| 25 | 152334H/miqu-1-70b-sf | 29.1 | 🔶 LlamaForCausalLM | 68.977B |
| 26 | 1TuanPham/T-VisStar-7B-v0.1 | 19.14 | 💬 MistralForCausalLM | 7.294B |
| 27 | 1TuanPham/T-VisStar-v0.1 | 19.14 | 🔶 MistralForCausalLM | 7.294B |
| 28 | 3rd-Degree-Burn/L-3.1-Science-Writer-8B | 21.09 | 🔶 LlamaForCausalLM | 8.03B |
| 29 | 3rd-Degree-Burn/Llama-3.1-8B-Squareroot | 11.22 | 🤝 LlamaForCausalLM | 8.03B |
| 30 | 3rd-Degree-Burn/Llama-3.1-8B-Squareroot-v1 | 8.04 | 🤝 LlamaForCausalLM | 8.03B |
| 31 | 3rd-Degree-Burn/Llama-Squared-8B | 12.43 | 🔶 LlamaForCausalLM | 8.03B |
| 32 | 4season/final_model_test_v2 | 23.09 | 🔶 LlamaForCausalLM | 21.421B |
| 33 | AALF/FuseChat-Llama-3.1-8B-Instruct-preview | 28.57 | 💬 LlamaForCausalLM | 8.03B |
| 34 | AALF/FuseChat-Llama-3.1-8B-SFT-preview | 29.23 | 🔶 LlamaForCausalLM | 8.03B |
| 35 | AALF/gemma-2-27b-it-SimPO-37K | 9.51 | 💬 Gemma2ForCausalLM | 27.227B |
| 36 | AALF/gemma-2-27b-it-SimPO-37K-100steps | 10.25 | 💬 Gemma2ForCausalLM | 27.227B |
| 37 | AELLM/gemma-2-aeria-infinity-9b | 31.92 | 🤝 Gemma2ForCausalLM | 9.242B |
| 38 | AELLM/gemma-2-lyco-infinity-9b | 30.05 | 💬 Gemma2ForCausalLM | 10.159B |
| 39 | AGI-0/Art-v0-3B | 12.13 | 🔶 Qwen2ForCausalLM | 3.086B |
| 40 | AGI-0/Artificium-llama3.1-8B-001 | 19.49 | 🔶 LlamaForCausalLM | 8.03B |
| 41 | AGI-0/smartllama3.1-8B-001 | 20.42 | 🔶 LlamaForCausalLM | 8.03B |
| 42 | AI-MO/NuminaMath-7B-CoT | 16.12 | 🔶 LlamaForCausalLM | 6.91B |
| 43 | AI-MO/NuminaMath-7B-TIR | 14.18 | 🔶 LlamaForCausalLM | 6.91B |
| 44 | AI-Sweden-Models/Llama-3-8B-instruct | 14.34 | 🔶 LlamaForCausalLM | 8.03B |
| 45 | AI-Sweden-Models/gpt-sw3-40b | 4.87 | 🟢 GPT2LMHeadModel | 39.927B |
| 46 | AI4free/Dhanishtha | 11.25 | 🟩 Qwen2ForCausalLM | 1.777B |
| 47 | AI4free/t2 | 11.33 | 🤝 Qwen2ForCausalLM | 7.613B |
| 48 | AIDC-AI/Marco-o1 | 27.64 | 💬 Qwen2ForCausalLM | 7.616B |
| 49 | Aashraf995/Creative-7B-nerd | 29.98 | 🤝 Qwen2ForCausalLM | 7.616B |
| 50 | Aashraf995/Gemma-Evo-10B | 34.33 | 🤝 Gemma2ForCausalLM | 10.159B |
| 51 | Aashraf995/Qwen-Evo-7B | 30.28 | 🤝 Qwen2ForCausalLM | 7.616B |
| 52 | Aashraf995/QwenStock-14B | 37.13 | 🤝 Qwen2ForCausalLM | 14.766B |
| 53 | AbacusResearch/Jallabi-34B | 26.19 | 🤝 LlamaForCausalLM | 34.389B |
| 54 | Ahdoot/StructuredThinker-v0.3-MoreStructure | 23.92 | 🤝 Qwen2ForCausalLM | 3.397B |
| 55 | Ahdoot/Test_StealthThinker | 22.07 | 🔶 Qwen2ForCausalLM | 3.086B |
| 56 | AicoresSecurity/Cybernet-Sec-3B-R1-V0 | 20.58 | 🔶 LlamaForCausalLM | 3.213B |
| 57 | AicoresSecurity/Cybernet-Sec-3B-R1-V0-Coder | 22.93 | 🔶 LlamaForCausalLM | 3.213B |
| 58 | AicoresSecurity/Cybernet-Sec-3B-R1-V1 | 20.0 | 🔶 LlamaForCausalLM | 3.213B |
| 59 | AicoresSecurity/Cybernet-Sec-3B-R1-V1.1 | 22.64 | 🔶 LlamaForCausalLM | 3.213B |
| 60 | Alepach/notHumpback-M0 | 5.14 | 💬 LlamaForCausalLM | 3.213B |
| 61 | Alepach/notHumpback-M1 | 4.78 | 💬 LlamaForCausalLM | 3.213B |
| 62 | Alepach/notHumpback-M1-v2 | 5.21 | 💬 LlamaForCausalLM | 3.213B |
| 63 | Alibaba-NLP/gte-Qwen2-7B-instruct | 13.83 | 🔶 Qwen2ForCausalLM | 7.613B |
| 64 | Alsebay/Qwen2.5-7B-test-novelist | 27.17 | 🔶 Qwen2ForCausalLM | 7.616B |
| 65 | Amaorynho/BBAI2006 | 3.46 | 🤝 Qwen2ForCausalLM | 1.09B |
| 66 | Amaorynho/BBAI270V4 | 4.55 | 🤝 Qwen2ForCausalLM | 7.616B |
| 67 | Amaorynho/BBAIIFEV1 | 30.58 | 🤝 LlamaForCausalLM | 8.03B |
| 68 | Amaorynho/BBAI_375 | 3.46 | 🤝 Qwen2ForCausalLM | 1.09B |
| 69 | Amu/t1-1.5B | 12.14 | 🔶 Qwen2ForCausalLM | 1.777B |
| 70 | Amu/t1-3B | 11.16 | 💬 Qwen2ForCausalLM | 3.397B |
| 71 | ArliAI/ArliAI-RPMax-12B-v1.1 | 20.98 | 🔶 MistralForCausalLM | 12.248B |
| 72 | ArliAI/Llama-3.1-8B-ArliAI-RPMax-v1.1 | 23.94 | 🔶 LlamaForCausalLM | 8.03B |
| 73 | Arthur-LAGACHERIE/Precis-1B-Instruct | 8.85 | 🔶 LlamaForCausalLM | 1.236B |
| 74 | Artples/L-MChat-7b | 21.24 | 🤝 MistralForCausalLM | 7.242B |
| 75 | Artples/L-MChat-Small | 15.23 | 🤝 PhiForCausalLM | 2.78B |
| 76 | Aryanne/QwentileSwap | 43.92 | 🤝 Qwen2ForCausalLM | 32.764B |
| 77 | Aryanne/SHBA | 29.88 | 🤝 LlamaForCausalLM | 8.03B |
| 78 | Aryanne/SuperHeart | 25.56 | 🤝 LlamaForCausalLM | 8.03B |
| 79 | AtAndDev/Qwen2.5-1.5B-continuous-learnt | 16.52 | 🔶 Qwen2ForCausalLM | 1.544B |
| 80 | AtAndDev/Qwen2.5-1.5B-continuous-learnt | 17.48 | 💬 Qwen2ForCausalLM | 1.544B |
| 81 | Ateron/Glowing-Forest-12B | 22.61 | 🤝 MistralForCausalLM | 12.248B |
| 82 | Ateron/Lotus-Magpic | 25.5 | 🤝 MistralForCausalLM | 12.248B |
| 83 | Ateron/Way_of_MagPicaro | 20.63 | 🤝 MistralForCausalLM | 12.248B |
| 84 | AuraIndustries/Aura-4B | 16.06 | 💬 LlamaForCausalLM | 4.513B |
| 85 | AuraIndustries/Aura-8B | 27.36 | 💬 LlamaForCausalLM | 8.03B |
| 86 | AuraIndustries/Aura-MoE-2x4B | 16.8 | 💬 MixtralForCausalLM | 7.231B |
| 87 | AuraIndustries/Aura-MoE-2x4B-v2 | 17.52 | 💬 MixtralForCausalLM | 7.231B |
| 88 | Aurel9/testmerge-7b | 20.97 | 🤝 MistralForCausalLM | 7.242B |
| 89 | Ayush-Singh/Llama1B-sft-2 | 3.17 | 🔶 LlamaForCausalLM | 1.236B |
| 90 | Azure99/Blossom-V6-14B | 32.81 | 💬 Qwen2ForCausalLM | 14.77B |
| 91 | Azure99/Blossom-V6-7B | 31.05 | 💬 Qwen2ForCausalLM | 7.616B |
| 92 | Azure99/blossom-v5-32b | 27.72 | 💬 Qwen2ForCausalLM | 32.512B |
| 93 | Azure99/blossom-v5-llama3-8b | 14.6 | 💬 LlamaForCausalLM | 8.03B |
| 94 | Azure99/blossom-v5.1-34b | 30.3 | 💬 LlamaForCausalLM | 34.389B |
| 95 | Azure99/blossom-v5.1-9b | 26.47 | 💬 LlamaForCausalLM | 8.829B |
| 96 | BAAI/Gemma2-9B-IT-Simpo-Infinity-Preference | 22.61 | 🔶 Gemma2ForCausalLM | 9.242B |
| 97 | BAAI/Infinity-Instruct-3M-0613-Llama3-70B | 35.58 | 🔶 LlamaForCausalLM | 70.554B |
| 98 | BAAI/Infinity-Instruct-3M-0613-Mistral-7B | 22.29 | 🔶 MistralForCausalLM | 7.242B |
| 99 | BAAI/Infinity-Instruct-3M-0625-Llama3-70B | 36.91 | 🔶 LlamaForCausalLM | 70.554B |
| 100 | BAAI/Infinity-Instruct-3M-0625-Llama3-8B | 22.06 | 💬 LlamaForCausalLM | 8.03B |
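Combining the VRAM rule with the scores above turns shortlisting into a mechanical filter. A minimal sketch, using a hypothetical excerpt of the rows (model, score, params in billions) and assuming INT4 weights plus a 20% runtime overhead:

```python
# Hypothetical excerpt of the table: (model, score, params in billions).
ROWS = [
    ("Aryanne/QwentileSwap", 43.92, 32.764),
    ("1024m/QWEN-14B-B100", 41.92, 14.77),
    ("Aashraf995/QwenStock-14B", 37.13, 14.766),
    ("BAAI/Infinity-Instruct-3M-0625-Llama3-70B", 36.91, 70.554),
    ("01-ai/Yi-1.5-34B-Chat", 33.36, 34.389),
]

def fits_budget(params_b: float, vram_gb: float,
                bits: int = 4, overhead: float = 1.2) -> bool:
    """Apply the (parameters x precision bits) / 8 rule; the 20% overhead
    for KV cache and buffers is an assumed margin."""
    return params_b * bits / 8 * overhead <= vram_gb

# Highest-scoring rows from this excerpt that fit a 24 GB card at INT4:
picks = [(model, score) for model, score, params in ROWS
         if fits_budget(params, 24)]
for model, score in picks:
    print(f"{score:>6.2f}  {model}")
```

By this estimate the 70B model is the only one in the excerpt that needs the workstation tier; everything up to 34B squeezes into 24 GB, matching the table's recommended-tier guidance.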