LLM-FR Leaderboard 🏆
This leaderboard evaluates intelligence modeling in the French language. It is not intended to serve as a reference for LLM evaluations. It is provided for informational and educational purposes only. Please consult other, more official leaderboards for authoritative assessments.
Note: the evaluations have been adapted for reasoning language models: all tasks run in generative mode, with no limit on token generation.
- Pr-Fouras: riddles of the "Père Fouras" (example: fan site)
- Kangourou-TO: math quizzes from the Kangourou competition. Text Only: only questions without figures.
- Sornette: classification of texts (GORAFI, Wikipedia, "le saviez-vous", ...) into 4 categories: "burlesque et fantaisiste" (burlesque and whimsical), "ludique et didactique" (playful and educational), "insidieux et mensonger" (insidious and deceptive), "moral et accablant" (moral and damning)
- Mix-Fr: 🎲 mixture of public datasets translated into French
Model Types:
- 🪨 - Base, Pretrained, Foundation Model
- 💬 - Chat Model (Instruct, RLHF, DPO, ...)
- 🛠💻 - Fine-tuned Model
- 🤔 - Reasoning Model
- "headers": [
- "R",
- "T",
- "Model",
- "Average β¬οΈ",
- "Pr-Fouras",
- "Kangourou-TO",
- "Sornette",
- "Mix-Fr",
- "#Params (B)",
- "β³ Evaluation Time (min)",
- "Evaluation Date",
- "Precision",
- "Hub License",
- "Hub β€οΈ"
- "data": [
- [
- "1 π₯",
- "π€",
- "<a target="_blank" href="https://huggingface.co/openai/gpt-oss-120b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">openai/gpt-oss-120b</a>",
- 79.22,
- 54.26,
- 94.74,
- 88.67,
- 47.37,
- 116.83,
- 37.12,
- "2025-08-07T13-24-36",
- "bfloat16",
- "other",
- 2680
- [
- "2 π₯",
- "π€",
- "<a target="_blank" href="https://huggingface.co/Qwen/Qwen3-235B-A22B-Thinking-2507" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Qwen/Qwen3-235B-A22B-Thinking-2507</a>",
- 78.39,
- 65.21,
- 88.64,
- 81.33,
- 57.29,
- 235.09,
- 145.71,
- "2025-08-03T03-08-04",
- "bfloat16",
- "apache-2.0",
- 274
- [
- "3 π₯",
- "π€",
- "<a target="_blank" href="https://huggingface.co/deepseek-ai/DeepSeek-R1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">deepseek-ai/DeepSeek-R1</a>",
- 75.54,
- 72.99,
- 88.98,
- 64.67,
- 53.69,
- 684.53,
- 141.28,
- "2025-03-23T02-08-19",
- "bfloat16",
- "mit",
- 11562
- [
- 4,
- "π€",
- "<a target="_blank" href="https://huggingface.co/Qwen/QwQ-32B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Qwen/QwQ-32B</a>",
- 70.66,
- 55.96,
- 82.7,
- 73.33,
- 48.06,
- 32.76,
- 53.85,
- "2025-03-17T12-38-20",
- "bfloat16",
- "apache-2.0",
- 2303
- [
- 5,
- "π€",
- "<a target="_blank" href="https://huggingface.co/deepseek-ai/DeepSeek-R1-0528" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">deepseek-ai/DeepSeek-R1-0528</a>",
- 70.06,
- 72.51,
- 71,
- 66.67,
- 44.67,
- 684.53,
- 241.86,
- "2025-06-03T02-34-14",
- "bfloat16",
- "mit",
- 1640
- [
- 6,
- "π¬",
- "<a target="_blank" href="https://huggingface.co/meta-llama/Llama-4-Maverick-17B-128E-Instruct-FP8" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meta-llama/Llama-4-Maverick-17B-128E-Instruct-FP8</a>",
- 69.79,
- 58.64,
- 85.41,
- 65.33,
- 63.1,
- 401.65,
- 22.31,
- "2025-05-22T14-59-21",
- "bfloat16",
- "other",
- 111
- [
- 7,
- "π€",
- "<a target="_blank" href="https://huggingface.co/openai/gpt-oss-20b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">openai/gpt-oss-20b</a>",
- 69.59,
- 38.69,
- 94.74,
- 75.33,
- 44.61,
- 20.91,
- 34.26,
- "2025-08-07T13-21-33",
- "bfloat16",
- "other",
- 2280
- [
- 8,
- "π€",
- "<a target="_blank" href="https://huggingface.co/Qwen/Qwen3-235B-A22B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Qwen/Qwen3-235B-A22B</a>",
- 64.1,
- 61.56,
- 85.41,
- 45.33,
- 53.89,
- 235.09,
- 140.01,
- "2025-05-12T17-52-42",
- "bfloat16",
- "apache-2.0",
- 775
- [
- 9,
- "π¬",
- "<a target="_blank" href="https://huggingface.co/Qwen/Qwen3-235B-A22B-Instruct-2507" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Qwen/Qwen3-235B-A22B-Instruct-2507</a>",
- 62.87,
- 66.67,
- 87.96,
- 34,
- 61.83,
- 235.09,
- 65.84,
- "2025-08-02T16-48-35",
- "bfloat16",
- "apache-2.0",
- 572
- [
- 10,
- "π€",
- "<a target="_blank" href="https://huggingface.co/Qwen/Qwen3-30B-A3B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Qwen/Qwen3-30B-A3B</a>",
- 61.18,
- 45.5,
- 92.03,
- 46,
- 58.33,
- 30.53,
- 52.05,
- "2025-05-12T12-42-01",
- "bfloat16",
- "apache-2.0",
- 520
- [
- 11,
- "π¬",
- "<a target="_blank" href="https://huggingface.co/mistralai/Mistral-Large-Instruct-2411" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mistralai/Mistral-Large-Instruct-2411</a>",
- 60.19,
- 58.39,
- 61.5,
- 60.67,
- 45.84,
- 122.61,
- 12.85,
- "2025-03-17T09-55-33",
- "bfloat16",
- "other",
- 210
- [
- 12,
- "π¬",
- "<a target="_blank" href="https://huggingface.co/deepseek-ai/DeepSeek-V3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">deepseek-ai/DeepSeek-V3</a>",
- 58.82,
- 59.85,
- 58.62,
- 58,
- 50.78,
- 684.53,
- 39.28,
- "2025-03-22T18-52-10",
- "bfloat16",
- null,
- 3660
- [
- 13,
- "π€",
- "<a target="_blank" href="https://huggingface.co/Qwen/Qwen3-32B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Qwen/Qwen3-32B</a>",
- 58.09,
- 44.04,
- 93.55,
- 36.67,
- 55.94,
- 32.76,
- 39.52,
- "2025-05-12T00-56-08",
- "bfloat16",
- "apache-2.0",
- 293
- [
- 14,
- "π¬",
- "<a target="_blank" href="https://huggingface.co/Qwen/Qwen2.5-72B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Qwen/Qwen2.5-72B-Instruct</a>",
- 57.02,
- 49.88,
- 53.19,
- 68,
- 47.96,
- 72.71,
- 13.17,
- "2025-03-17T10-40-16",
- "bfloat16",
- "other",
- 771
- [
- 15,
- "π€",
- "<a target="_blank" href="https://huggingface.co/mistralai/Magistral-Small-2506" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mistralai/Magistral-Small-2506</a>",
- 56.7,
- 47.69,
- 75.75,
- 46.67,
- 35.38,
- 23.57,
- 32.84,
- "2025-06-17T16-35-21",
- "bfloat16",
- "apache-2.0",
- 472
- [
- 16,
- "π€",
- "<a target="_blank" href="https://huggingface.co/Qwen/Qwen3-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Qwen/Qwen3-8B</a>",
- 56.25,
- 41.12,
- 88.98,
- 38.67,
- 53.41,
- 8.19,
- 29.4,
- "2025-06-02T23-08-32",
- "bfloat16",
- "apache-2.0",
- 357
- [
- 17,
- "π€",
- "<a target="_blank" href="https://huggingface.co/deepseek-ai/DeepSeek-R1-Distill-Llama-70B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">deepseek-ai/DeepSeek-R1-Distill-Llama-70B</a>",
- 56.12,
- 40.39,
- 59.97,
- 68,
- 53,
- 70.55,
- 33.14,
- "2025-03-18T01-30-43",
- "bfloat16",
- "mit",
- 634
- [
- 18,
- "π¬",
- "<a target="_blank" href="https://huggingface.co/mistralai/Mistral-Small-3.2-24B-Instruct-2506" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mistralai/Mistral-Small-3.2-24B-Instruct-2506</a>",
- 54.67,
- 50.36,
- 60.99,
- 52.67,
- 41.39,
- 24.01,
- 5.3,
- "2025-07-07T13-03-01",
- "bfloat16",
- "apache-2.0",
- 320
- [
- 19,
- "π¬",
- "<a target="_blank" href="https://huggingface.co/Qwen/Qwen2.5-32B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Qwen/Qwen2.5-32B-Instruct</a>",
- 52.88,
- 38.44,
- 52.85,
- 67.33,
- 44.93,
- 32.76,
- 8.46,
- "2025-03-22T18-06-37",
- "bfloat16",
- "apache-2.0",
- 240
- [
- 20,
- "π€",
- "<a target="_blank" href="https://huggingface.co/tiiuae/Falcon-H1-7B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tiiuae/Falcon-H1-7B-Instruct</a>",
- 51.87,
- 36.5,
- 58.45,
- 60.67,
- 44.12,
- 7.59,
- 20.34,
- "2025-07-07T16-41-48",
- "bfloat16",
- "apache-2.0",
- 13
- [
- 21,
- "π π»",
- "<a target="_blank" href="https://huggingface.co/MaziyarPanahi/calme-3.2-instruct-78b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/calme-3.2-instruct-78b</a>",
- 51.67,
- 53.53,
- 30.8,
- 70.67,
- 43.08,
- 77.96,
- 13.89,
- "2025-03-18T01-15-17",
- "bfloat16",
- "other",
- 107
- [
- 22,
- "π€",
- "<a target="_blank" href="https://huggingface.co/zai-org/GLM-4.5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zai-org/GLM-4.5</a>",
- 50.03,
- 60.83,
- 77.27,
- 12,
- 46.85,
- 358.34,
- 227.8,
- "2025-08-02T19-45-03",
- "bfloat16",
- "mit",
- 968
- [
- 23,
- "π€",
- "<a target="_blank" href="https://huggingface.co/Qwen/Qwen3-1.7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Qwen/Qwen3-1.7B</a>",
- 47.73,
- 14.11,
- 75.07,
- 54,
- 39.05,
- 2.03,
- 16.27,
- "2025-08-26T11-50-17",
- "bfloat16",
- "apache-2.0",
- 238
- [
- 24,
- "π¬",
- "<a target="_blank" href="https://huggingface.co/meta-llama/Llama-3.1-405B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meta-llama/Llama-3.1-405B-Instruct</a>",
- 47.49,
- 62.29,
- 42.84,
- 37.33,
- 46.22,
- 405.85,
- 29.9,
- "2025-03-18T01-20-29",
- "bfloat16",
- "llama3.1",
- 568
- [
- 25,
- "π¬",
- "<a target="_blank" href="https://huggingface.co/meta-llama/Llama-4-Scout-17B-16E-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meta-llama/Llama-4-Scout-17B-16E-Instruct</a>",
- 47.42,
- 51.34,
- 66.93,
- 24,
- 57.87,
- 108.64,
- 16.38,
- "2025-05-20T12-08-50",
- "bfloat16",
- "other",
- 913
- [
- 26,
- "π π»",
- "<a target="_blank" href="https://huggingface.co/jpacifico/Chocolatine-2-14B-Instruct-v2.0.3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">jpacifico/Chocolatine-2-14B-Instruct-v2.0.3</a>",
- 45.83,
- 31.87,
- 38.94,
- 66.67,
- 40.5,
- 14.77,
- 5.08,
- "2025-03-18T00-50-57",
- "bfloat16",
- "apache-2.0",
- 11
- [
- 27,
- "π¬",
- "<a target="_blank" href="https://huggingface.co/google/gemma-3-27b-it" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">google/gemma-3-27b-it</a>",
- 44.87,
- 54.5,
- 29.44,
- 50.67,
- 39.8,
- 27.43,
- 13.61,
- "2025-05-12T00-31-17",
- "bfloat16",
- "gemma",
- 1338
- [
- 28,
- "π€",
- "<a target="_blank" href="https://huggingface.co/deepseek-ai/DeepSeek-R1-0528-Qwen3-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">deepseek-ai/DeepSeek-R1-0528-Qwen3-8B</a>",
- 44.31,
- 27.25,
- 62.35,
- 43.33,
- 38.35,
- 8.19,
- 41.71,
- "2025-06-02T16-36-06",
- "bfloat16",
- "mit",
- 592
- [
- 29,
- "π¬",
- "<a target="_blank" href="https://huggingface.co/meta-llama/Llama-3.3-70B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meta-llama/Llama-3.3-70B-Instruct</a>",
- 43.82,
- 48.91,
- 26.56,
- 56,
- 50.97,
- 70.55,
- 10.96,
- "2025-03-17T10-36-45",
- "bfloat16",
- "llama3.3",
- 1744
- [
- 30,
- "π¬",
- "<a target="_blank" href="https://huggingface.co/mistralai/Mistral-Small-3.1-24B-Instruct-2503" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mistralai/Mistral-Small-3.1-24B-Instruct-2503</a>",
- 43.77,
- 42.82,
- 51.83,
- 36.67,
- 40.13,
- 24.01,
- 4.97,
- "2025-06-17T14-59-05",
- "bfloat16",
- "apache-2.0",
- 1275
- [
- 31,
- "π¬",
- "<a target="_blank" href="https://huggingface.co/mistralai/Mistral-Small-24B-Instruct-2501" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mistralai/Mistral-Small-24B-Instruct-2501</a>",
- 43.07,
- 34.31,
- 46.23,
- 48.67,
- 36.84,
- 23.57,
- 6.06,
- "2025-03-18T00-47-09",
- "bfloat16",
- "apache-2.0",
- 878
- [
- 32,
- "π¬",
- "<a target="_blank" href="https://huggingface.co/Qwen/Qwen2.5-7B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Qwen/Qwen2.5-7B-Instruct</a>",
- 32.71,
- 23.84,
- 20.96,
- 53.33,
- 35.79,
- 7.62,
- 4.02,
- "2025-03-22T18-11-42",
- "bfloat16",
- "apache-2.0",
- 581
- [
- 33,
- "π€",
- "<a target="_blank" href="https://huggingface.co/deepseek-ai/DeepSeek-R1-Distill-Qwen-32B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">deepseek-ai/DeepSeek-R1-Distill-Qwen-32B</a>",
- 31.81,
- 30.9,
- 5.87,
- 58.67,
- 47.67,
- 32.76,
- 22.51,
- "2025-03-22T18-53-00",
- "bfloat16",
- "mit",
- 1284
- [
- 34,
- "π¬",
- "<a target="_blank" href="https://huggingface.co/tiiuae/Falcon3-10B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tiiuae/Falcon3-10B-Instruct</a>",
- 30.76,
- 27.49,
- 11.47,
- 53.33,
- 38.6,
- 10.31,
- 2.51,
- "2025-03-18T00-50-17",
- "bfloat16",
- "other",
- 97
- [
- 35,
- "π¬",
- "<a target="_blank" href="https://huggingface.co/Qwen/Qwen2.5-14B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Qwen/Qwen2.5-14B-Instruct</a>",
- 29.21,
- 39.9,
- 26.39,
- 21.33,
- 38.61,
- 14.77,
- 6.34,
- "2025-03-18T00-51-23",
- "bfloat16",
- "apache-2.0",
- 208
- [
- 36,
- "π¬",
- "<a target="_blank" href="https://huggingface.co/google/gemma-3-12b-it" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">google/gemma-3-12b-it</a>",
- 29.04,
- 43.31,
- 21.81,
- 22,
- 39.22,
- 12.19,
- 13.57,
- "2025-05-12T00-39-21",
- "bfloat16",
- "gemma",
- 354
- [
- 37,
- "π¬",
- "<a target="_blank" href="https://huggingface.co/Qwen/Qwen2.5-3B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Qwen/Qwen2.5-3B-Instruct</a>",
- 27.09,
- 15.57,
- 25.03,
- 40.67,
- 29.62,
- 3.09,
- 5.94,
- "2025-03-22T18-02-12",
- "bfloat16",
- "other",
- 221
- [
- 38,
- "π π»",
- "<a target="_blank" href="https://huggingface.co/kurakurai/Luth-1.7B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">kurakurai/Luth-1.7B-Instruct</a>",
- 25.9,
- 10.71,
- 33.01,
- 34,
- 35.57,
- 1.72,
- 4.39,
- "2025-08-26T11-49-23",
- "bfloat16",
- "apache-2.0",
- 10
- [
- 39,
- "π¬",
- "<a target="_blank" href="https://huggingface.co/tiiuae/Falcon3-7B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tiiuae/Falcon3-7B-Instruct</a>",
- 22.68,
- 23.6,
- 9.09,
- 35.33,
- 34.86,
- 7.46,
- 2.17,
- "2025-03-22T22-37-49",
- "bfloat16",
- "other",
- 64
- [
- 40,
- "π¬",
- "<a target="_blank" href="https://huggingface.co/K-intelligence/Midm-2.0-Base-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">K-intelligence/Midm-2.0-Base-Instruct</a>",
- 19.73,
- 9.49,
- 24.36,
- 25.33,
- 31.13,
- 11.55,
- 7,
- "2025-07-04T14-53-35",
- "bfloat16",
- "mit",
- 69
- [
- 41,
- "π€",
- "<a target="_blank" href="https://huggingface.co/Qwen/Qwen3-0.6B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Qwen/Qwen3-0.6B</a>",
- 18.97,
- 0.24,
- 41.99,
- 14.67,
- 30.7,
- 0.75,
- 12.13,
- "2025-08-26T11-52-01",
- "bfloat16",
- "apache-2.0",
- 565
- [
- 42,
- "πͺ¨",
- "<a target="_blank" href="https://huggingface.co/Qwen/Qwen2.5-72B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Qwen/Qwen2.5-72B</a>",
- 16.95,
- 24.33,
- 5.19,
- 21.33,
- 23.58,
- 72.71,
- 19.13,
- "2025-08-04T09-49-52",
- "bfloat16",
- "other",
- 76
- [
- 43,
- "π¬",
- "<a target="_blank" href="https://huggingface.co/utter-project/EuroLLM-9B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">utter-project/EuroLLM-9B-Instruct</a>",
- 16.58,
- 12.41,
- 0,
- 37.33,
- 22.89,
- 9.15,
- 4.5,
- "2025-03-22T21-35-54",
- "bfloat16",
- "apache-2.0",
- 158
- [
- 44,
- "π¬",
- "<a target="_blank" href="https://huggingface.co/internlm/internlm3-8b-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">internlm/internlm3-8b-instruct</a>",
- 15.85,
- 7.06,
- 13.16,
- 27.33,
- 23.53,
- 8.8,
- 10.55,
- "2025-03-22T17-25-47",
- "bfloat16",
- "apache-2.0",
- 208
- [
- 45,
- "π¬",
- "<a target="_blank" href="https://huggingface.co/google/txgemma-27b-chat" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">google/txgemma-27b-chat</a>",
- 15.25,
- 45.74,
- 0,
- 0,
- 33.31,
- 27.23,
- 7.24,
- "2025-03-26T21-30-12",
- "bfloat16",
- "other",
- 13
- [
- 46,
- "π¬",
- "<a target="_blank" href="https://huggingface.co/meta-llama/Llama-3.2-3B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meta-llama/Llama-3.2-3B-Instruct</a>",
- 12.96,
- 7.54,
- 0,
- 31.33,
- 16.77,
- 3.22,
- 9.29,
- "2025-03-17T10-36-26",
- "bfloat16",
- "llama3.2",
- 954
- [
- 47,
- "π¬",
- "<a target="_blank" href="https://huggingface.co/OpenLLM-France/Lucie-7B-Instruct-v1.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">OpenLLM-France/Lucie-7B-Instruct-v1.1</a>",
- 9.28,
- 8.03,
- 1.8,
- 18,
- 14.13,
- 6.71,
- 4.22,
- "2025-03-22T21-34-23",
- "bfloat16",
- "apache-2.0",
- 8
- [
- 48,
- "π π»",
- "<a target="_blank" href="https://huggingface.co/kurakurai/Luth-0.6B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">kurakurai/Luth-0.6B-Instruct</a>",
- 6.42,
- 2.68,
- 8.58,
- 8,
- 24.47,
- 0.6,
- 2.98,
- "2025-08-26T11-42-47",
- "bfloat16",
- "apache-2.0",
- 8
- [
- 49,
- "π€",
- "<a target="_blank" href="https://huggingface.co/open-r1/OpenR1-Qwen-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">open-r1/OpenR1-Qwen-7B</a>",
- 5,
- 0.49,
- 14.52,
- 0,
- 15.13,
- 7.62,
- 25.45,
- "2025-03-18T02-23-13",
- "bfloat16",
- "apache-2.0",
- 40
- [
- 50,
- "π¬",
- "<a target="_blank" href="https://huggingface.co/utter-project/EuroLLM-1.7B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">utter-project/EuroLLM-1.7B-Instruct</a>",
- 1.7,
- 5.11,
- 0,
- 0,
- 8.69,
- 1.66,
- 3.59,
- "2025-03-22T21-37-24",
- "bfloat16",
- "apache-2.0",
- 70
- [
- 51,
- "π¬",
- "<a target="_blank" href="https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meta-llama/Llama-3.2-1B-Instruct</a>",
- 1.68,
- 1.7,
- 0,
- 3.33,
- 12.21,
- 1.24,
- 8.12,
- "2025-03-26T16-11-34",
- "bfloat16",
- "llama3.2",
- 842
- [
- "metadata": null
Some good practices before submitting a model
1) Make sure you can load your model and tokenizer using AutoClasses:
```python
from transformers import AutoConfig, AutoModel, AutoTokenizer

revision = "main"  # or the specific revision (branch/commit) you are submitting
config = AutoConfig.from_pretrained("your model name", revision=revision)
model = AutoModel.from_pretrained("your model name", revision=revision)
tokenizer = AutoTokenizer.from_pretrained("your model name", revision=revision)
```
If this step fails, follow the error messages to debug your model before submitting it. It's likely your model has been improperly uploaded.
Note: make sure your model is public!
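If in doubt, you can flip the repository's visibility programmatically with huggingface_hub; a minimal sketch, assuming a hypothetical repo id and a token with write access:
```python
from huggingface_hub import HfApi

api = HfApi()
# Make the repo public; "your-org/your-model" is a placeholder repo id.
api.update_repo_visibility("your-org/your-model", private=False)
```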
Note: if your model needs trust_remote_code=True, we do not support this option yet, but we are working on adding it, stay posted!
2) Convert your model weights to safetensors
It's a newer format for storing weights that is safer and faster to load and use. It will also allow us to add the number of parameters of your model to the Extended Viewer!
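If your checkpoint is still in the pickle-based .bin format, one simple way to convert it is to reload and re-save it with transformers; a minimal sketch, where the model name and output directory are placeholders:
```python
from transformers import AutoModel, AutoTokenizer

# Load the existing checkpoint (pytorch_model.bin weights).
model = AutoModel.from_pretrained("your model name")
tokenizer = AutoTokenizer.from_pretrained("your model name")

# Re-save with safe_serialization=True to write model.safetensors files,
# then upload the resulting directory to the Hub.
model.save_pretrained("./your-model-safetensors", safe_serialization=True)
tokenizer.save_pretrained("./your-model-safetensors")
```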
3) Make sure your model has an open license!
This is a leaderboard for Open LLMs, and we'd love for as many people as possible to know they can use your model 🤗
4) Fill out your model card
When we add extra information about models to the leaderboard, it is automatically taken from the model card.
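One way to populate the card's structured metadata is with huggingface_hub; a minimal sketch, assuming a hypothetical repo id (the leaderboard and the Hub read fields like license from the card's YAML header):
```python
from huggingface_hub import ModelCard, ModelCardData

# Structured metadata that ends up in the card's YAML header.
card_data = ModelCardData(
    language="fr",
    license="apache-2.0",
    library_name="transformers",
)
# Render the default card template and publish it to the repo.
card = ModelCard.from_template(card_data, model_id="your-org/your-model")
card.push_to_hub("your-org/your-model")
```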
In case of model failure
If your model is displayed in the FAILED category, its execution stopped.
Make sure you have followed the above steps first.
If everything is done, check that you can launch the EleutherAI Harness (lm-evaluation-harness) on your model locally (you can add --limit to limit the number of examples per task); a sketch of such a local run follows.
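The leaderboard's own French tasks are not public task names, so the snippet below uses a standard harness task purely as a smoke test; a minimal sketch, assuming lm-eval >= 0.4 and a hypothetical model id:
```python
import lm_eval

# Check that the harness can load and run your model end to end.
# "hellaswag" is a stand-in task; limit=10 keeps the run fast,
# like the --limit flag on the command line.
results = lm_eval.simple_evaluate(
    model="hf",
    model_args="pretrained=your-org/your-model,revision=main,dtype=bfloat16",
    tasks=["hellaswag"],
    limit=10,
)
print(results["results"])
```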
The submission queues display the following columns:
model | revision | private | precision | weight_type | status
---|---|---|---|---|---