⚡ AI TREND FLASH

Scores are signals, not verdicts. This tracker surfaces velocity, transparency, and local compatibility in open models, so you can choose what fits your hardware, your use case, and your privacy boundary.

🦊 How to read this table

Public leaderboards are snapshots, not guarantees. Many benchmarks suffer from data contamination or reward verbosity over accuracy. Use this table to spot momentum, then verify models in your own environment.

⚡ Local-First Quick Reference

Estimate VRAM needs before downloading. INT4 quantization cuts the memory footprint by ~75% relative to FP16, usually with minimal quality loss.

VRAM rule: VRAM (GB) ≈ Parameters (billions) × Precision (bits) ÷ 8

| Tier | VRAM | Model size | Typical use |
|---|---|---|---|
| Entry | 12 GB | 7B–8B | direct response, light tasks |
| Recommended | 24 GB | up to 34B | 30–50 tok/s, coding/reasoning |
| Workstation | ≥48 GB | 70B+ | production throughput, multilingual |
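The rule above can be sketched in code. The 20% overhead factor for KV cache and activations is an assumption added for illustration, not part of the rule itself:

```python
def vram_gb(params_b: float, precision_bits: int, overhead: float = 1.2) -> float:
    """Rough VRAM estimate: parameters (billions) * bits per weight / 8,
    scaled by an assumed ~20% overhead for KV cache and activations."""
    return params_b * precision_bits / 8 * overhead

# A 34.4B model (e.g. the Yi-1.5-34B rows below) at INT4 vs FP16:
print(round(vram_gb(34.4, 4), 1))   # ~20.6 GB, fits a 24 GB card
print(round(vram_gb(34.4, 16), 1))  # ~82.6 GB, needs multi-GPU
```

At INT4 the 34B class lands just inside the "Recommended" 24 GB tier, which is why that tier tops out there.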
| Rank | Model | Benchmark | Type | Size |
|---|---|---|---|---|
| 1 | 0-hero/Matter-0.2-7B-DPO | 8.91 | 💬 MistralForCausalLM | 7.242B |
| 2 | 01-ai/Yi-1.5-34B | 25.65 | 🟢 LlamaForCausalLM | 34.389B |
| 3 | 01-ai/Yi-1.5-34B-32K | 26.73 | 🟢 LlamaForCausalLM | 34.389B |
| 4 | 01-ai/Yi-1.5-34B-Chat | 33.36 | 💬 LlamaForCausalLM | 34.389B |
| 5 | 01-ai/Yi-1.5-34B-Chat-16K | 29.4 | 💬 LlamaForCausalLM | 34.389B |
| 6 | 01-ai/Yi-1.5-6B | 16.75 | 🟢 LlamaForCausalLM | 6.061B |
| 7 | 01-ai/Yi-1.5-6B-Chat | 22.78 | 💬 LlamaForCausalLM | 6.061B |
| 8 | 01-ai/Yi-1.5-9B | 22.15 | 🟢 LlamaForCausalLM | 8.829B |
| 9 | 01-ai/Yi-1.5-9B-32K | 19.81 | 🟢 LlamaForCausalLM | 8.829B |
| 10 | 01-ai/Yi-1.5-9B-Chat | 29.53 | 💬 LlamaForCausalLM | 8.829B |
| 11 | 01-ai/Yi-1.5-9B-Chat-16K | 23.77 | 💬 LlamaForCausalLM | 8.829B |
| 12 | 01-ai/Yi-34B | 22.37 | 🟢 LlamaForCausalLM | 34.389B |
| 13 | 01-ai/Yi-34B-200K | 20.01 | 🟢 LlamaForCausalLM | 34.389B |
| 14 | 01-ai/Yi-34B-Chat | 24.23 | 💬 LlamaForCausalLM | 34.389B |
| 15 | 01-ai/Yi-6B | 13.61 | 🟢 LlamaForCausalLM | 6.061B |
| 16 | 01-ai/Yi-6B-200K | 12.0 | 🟢 LlamaForCausalLM | 6.061B |
| 17 | 01-ai/Yi-6B-Chat | 14.12 | 💬 LlamaForCausalLM | 6.061B |
| 18 | 01-ai/Yi-9B | 17.81 | 🟢 LlamaForCausalLM | 8.829B |
| 19 | 01-ai/Yi-9B-200K | 17.73 | 🟢 LlamaForCausalLM | 8.829B |
| 20 | 01-ai/Yi-Coder-9B-Chat | 16.99 | 🔶 LlamaForCausalLM | 8.829B |
| 21 | 1-800-LLMs/Qwen-2.5-14B-Hindi | 36.27 | 🔶 Qwen2ForCausalLM | 14.77B |
| 22 | 1-800-LLMs/Qwen-2.5-14B-Hindi-Custom-Instruct | 31.02 | 🔶 Qwen2ForCausalLM | 14.77B |
| 23 | 1024m/PHI-4-Hindi | 27.49 | 🔶 LlamaForCausalLM | 14.66B |
| 24 | 1024m/QWEN-14B-B100 | 41.92 | 🔶 Qwen2ForCausalLM | 14.77B |
| 25 | 152334H/miqu-1-70b-sf | 29.1 | 🔶 LlamaForCausalLM | 68.977B |
| 26 | 1TuanPham/T-VisStar-7B-v0.1 | 19.14 | 💬 MistralForCausalLM | 7.294B |
| 27 | 1TuanPham/T-VisStar-v0.1 | 19.14 | 🔶 MistralForCausalLM | 7.294B |
| 28 | 3rd-Degree-Burn/L-3.1-Science-Writer-8B | 21.09 | 🔶 LlamaForCausalLM | 8.03B |
| 29 | 3rd-Degree-Burn/Llama-3.1-8B-Squareroot | 11.22 | 🤝 LlamaForCausalLM | 8.03B |
| 30 | 3rd-Degree-Burn/Llama-3.1-8B-Squareroot-v1 | 8.04 | 🤝 LlamaForCausalLM | 8.03B |
| 31 | 3rd-Degree-Burn/Llama-Squared-8B | 12.43 | 🔶 LlamaForCausalLM | 8.03B |
| 32 | 4season/final_model_test_v2 | 23.09 | 🔶 LlamaForCausalLM | 21.421B |
| 33 | AALF/FuseChat-Llama-3.1-8B-Instruct-preview | 28.57 | 💬 LlamaForCausalLM | 8.03B |
| 34 | AALF/FuseChat-Llama-3.1-8B-SFT-preview | 29.23 | 🔶 LlamaForCausalLM | 8.03B |
| 35 | AALF/gemma-2-27b-it-SimPO-37K | 9.51 | 💬 Gemma2ForCausalLM | 27.227B |
| 36 | AALF/gemma-2-27b-it-SimPO-37K-100steps | 10.25 | 💬 Gemma2ForCausalLM | 27.227B |
| 37 | AELLM/gemma-2-aeria-infinity-9b | 31.92 | 🤝 Gemma2ForCausalLM | 9.242B |
| 38 | AELLM/gemma-2-lyco-infinity-9b | 30.05 | 💬 Gemma2ForCausalLM | 10.159B |
| 39 | AGI-0/Art-v0-3B | 12.13 | 🔶 Qwen2ForCausalLM | 3.086B |
| 40 | AGI-0/Artificium-llama3.1-8B-001 | 19.49 | 🔶 LlamaForCausalLM | 8.03B |
| 41 | AGI-0/smartllama3.1-8B-001 | 20.42 | 🔶 LlamaForCausalLM | 8.03B |
| 42 | AI-MO/NuminaMath-7B-CoT | 16.12 | 🔶 LlamaForCausalLM | 6.91B |
| 43 | AI-MO/NuminaMath-7B-TIR | 14.18 | 🔶 LlamaForCausalLM | 6.91B |
| 44 | AI-Sweden-Models/Llama-3-8B-instruct | 14.34 | 🔶 LlamaForCausalLM | 8.03B |
| 45 | AI-Sweden-Models/gpt-sw3-40b | 4.87 | 🟢 GPT2LMHeadModel | 39.927B |
| 46 | AI4free/Dhanishtha | 11.25 | 🟩 Qwen2ForCausalLM | 1.777B |
| 47 | AI4free/t2 | 11.33 | 🤝 Qwen2ForCausalLM | 7.613B |
| 48 | AIDC-AI/Marco-o1 | 27.64 | 💬 Qwen2ForCausalLM | 7.616B |
| 49 | Aashraf995/Creative-7B-nerd | 29.98 | 🤝 Qwen2ForCausalLM | 7.616B |
| 50 | Aashraf995/Gemma-Evo-10B | 34.33 | 🤝 Gemma2ForCausalLM | 10.159B |
| 51 | Aashraf995/Qwen-Evo-7B | 30.28 | 🤝 Qwen2ForCausalLM | 7.616B |
| 52 | Aashraf995/QwenStock-14B | 37.13 | 🤝 Qwen2ForCausalLM | 14.766B |
| 53 | AbacusResearch/Jallabi-34B | 26.19 | 🤝 LlamaForCausalLM | 34.389B |
| 54 | Ahdoot/StructuredThinker-v0.3-MoreStructure | 23.92 | 🤝 Qwen2ForCausalLM | 3.397B |
| 55 | Ahdoot/Test_StealthThinker | 22.07 | 🔶 Qwen2ForCausalLM | 3.086B |
| 56 | AicoresSecurity/Cybernet-Sec-3B-R1-V0 | 20.58 | 🔶 LlamaForCausalLM | 3.213B |
| 57 | AicoresSecurity/Cybernet-Sec-3B-R1-V0-Coder | 22.93 | 🔶 LlamaForCausalLM | 3.213B |
| 58 | AicoresSecurity/Cybernet-Sec-3B-R1-V1 | 20.0 | 🔶 LlamaForCausalLM | 3.213B |
| 59 | AicoresSecurity/Cybernet-Sec-3B-R1-V1.1 | 22.64 | 🔶 LlamaForCausalLM | 3.213B |
| 60 | Alepach/notHumpback-M0 | 5.14 | 💬 LlamaForCausalLM | 3.213B |
| 61 | Alepach/notHumpback-M1 | 4.78 | 💬 LlamaForCausalLM | 3.213B |
| 62 | Alepach/notHumpback-M1-v2 | 5.21 | 💬 LlamaForCausalLM | 3.213B |
| 63 | Alibaba-NLP/gte-Qwen2-7B-instruct | 13.83 | 🔶 Qwen2ForCausalLM | 7.613B |
| 64 | Alsebay/Qwen2.5-7B-test-novelist | 27.17 | 🔶 Qwen2ForCausalLM | 7.616B |
| 65 | Amaorynho/BBAI2006 | 3.46 | 🤝 Qwen2ForCausalLM | 1.09B |
| 66 | Amaorynho/BBAI270V4 | 4.55 | 🤝 Qwen2ForCausalLM | 7.616B |
| 67 | Amaorynho/BBAIIFEV1 | 30.58 | 🤝 LlamaForCausalLM | 8.03B |
| 68 | Amaorynho/BBAI_375 | 3.46 | 🤝 Qwen2ForCausalLM | 1.09B |
| 69 | Amu/t1-1.5B | 12.14 | 🔶 Qwen2ForCausalLM | 1.777B |
| 70 | Amu/t1-3B | 11.16 | 💬 Qwen2ForCausalLM | 3.397B |
| 71 | ArliAI/ArliAI-RPMax-12B-v1.1 | 20.98 | 🔶 MistralForCausalLM | 12.248B |
| 72 | ArliAI/Llama-3.1-8B-ArliAI-RPMax-v1.1 | 23.94 | 🔶 LlamaForCausalLM | 8.03B |
| 73 | Arthur-LAGACHERIE/Precis-1B-Instruct | 8.85 | 🔶 LlamaForCausalLM | 1.236B |
| 74 | Artples/L-MChat-7b | 21.24 | 🤝 MistralForCausalLM | 7.242B |
| 75 | Artples/L-MChat-Small | 15.23 | 🤝 PhiForCausalLM | 2.78B |
| 76 | Aryanne/QwentileSwap | 43.92 | 🤝 Qwen2ForCausalLM | 32.764B |
| 77 | Aryanne/SHBA | 29.88 | 🤝 LlamaForCausalLM | 8.03B |
| 78 | Aryanne/SuperHeart | 25.56 | 🤝 LlamaForCausalLM | 8.03B |
| 79 | AtAndDev/Qwen2.5-1.5B-continuous-learnt | 16.52 | 🔶 Qwen2ForCausalLM | 1.544B |
| 80 | AtAndDev/Qwen2.5-1.5B-continuous-learnt | 17.48 | 💬 Qwen2ForCausalLM | 1.544B |
| 81 | Ateron/Glowing-Forest-12B | 22.61 | 🤝 MistralForCausalLM | 12.248B |
| 82 | Ateron/Lotus-Magpic | 25.5 | 🤝 MistralForCausalLM | 12.248B |
| 83 | Ateron/Way_of_MagPicaro | 20.63 | 🤝 MistralForCausalLM | 12.248B |
| 84 | AuraIndustries/Aura-4B | 16.06 | 💬 LlamaForCausalLM | 4.513B |
| 85 | AuraIndustries/Aura-8B | 27.36 | 💬 LlamaForCausalLM | 8.03B |
| 86 | AuraIndustries/Aura-MoE-2x4B | 16.8 | 💬 MixtralForCausalLM | 7.231B |
| 87 | AuraIndustries/Aura-MoE-2x4B-v2 | 17.52 | 💬 MixtralForCausalLM | 7.231B |
| 88 | Aurel9/testmerge-7b | 20.97 | 🤝 MistralForCausalLM | 7.242B |
| 89 | Ayush-Singh/Llama1B-sft-2 | 3.17 | 🔶 LlamaForCausalLM | 1.236B |
| 90 | Azure99/Blossom-V6-14B | 32.81 | 💬 Qwen2ForCausalLM | 14.77B |
| 91 | Azure99/Blossom-V6-7B | 31.05 | 💬 Qwen2ForCausalLM | 7.616B |
| 92 | Azure99/blossom-v5-32b | 27.72 | 💬 Qwen2ForCausalLM | 32.512B |
| 93 | Azure99/blossom-v5-llama3-8b | 14.6 | 💬 LlamaForCausalLM | 8.03B |
| 94 | Azure99/blossom-v5.1-34b | 30.3 | 💬 LlamaForCausalLM | 34.389B |
| 95 | Azure99/blossom-v5.1-9b | 26.47 | 💬 LlamaForCausalLM | 8.829B |
| 96 | BAAI/Gemma2-9B-IT-Simpo-Infinity-Preference | 22.61 | 🔶 Gemma2ForCausalLM | 9.242B |
| 97 | BAAI/Infinity-Instruct-3M-0613-Llama3-70B | 35.58 | 🔶 LlamaForCausalLM | 70.554B |
| 98 | BAAI/Infinity-Instruct-3M-0613-Mistral-7B | 22.29 | 🔶 MistralForCausalLM | 7.242B |
| 99 | BAAI/Infinity-Instruct-3M-0625-Llama3-70B | 36.91 | 🔶 LlamaForCausalLM | 70.554B |
| 100 | BAAI/Infinity-Instruct-3M-0625-Llama3-8B | 22.06 | 💬 LlamaForCausalLM | 8.03B |