
arxiv:2503.00865

Babel: Open Multilingual Large Language Models Serving Over 90% of Global Speakers

Published on Mar 2, 2025 · Submitted by Yiran Zhao on Mar 6, 2025
#1 Paper of the day

Abstract

Babel is an open multilingual LLM that expands its parameter count through layer extension, covering numerous languages and achieving superior performance in multilingual tasks compared to other open LLMs.

AI-generated summary

Large language models (LLMs) have revolutionized natural language processing (NLP), yet open-source multilingual LLMs remain scarce, with existing models often limited in language coverage. Such models typically prioritize well-resourced languages, while widely spoken but under-resourced languages are often overlooked. To address this disparity, we introduce Babel, an open multilingual LLM that covers the top 25 languages by number of speakers, supports over 90% of the global population, and includes many languages neglected by other open multilingual LLMs. Unlike traditional continued pretraining approaches, Babel expands its parameter count through a layer extension technique that elevates Babel's performance ceiling. We introduce two variants: Babel-9B, designed for efficient inference and fine-tuning, and Babel-83B, which sets a new standard for open multilingual LLMs. Extensive evaluations on multilingual tasks demonstrate its superior performance compared to open LLMs of comparable size. In addition, using open-source supervised fine-tuning datasets, Babel achieves remarkable performance, with Babel-9B-Chat leading among 10B-sized LLMs and Babel-83B-Chat setting a new standard for multilingual tasks, reaching the same level as commercial models.
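To make the layer extension idea concrete, the toy sketch below illustrates one common way to grow a decoder stack: duplicate selected layers so the extended model is initialized from the original weights and then continues training. This is an illustration under assumed details, not the authors' exact recipe; names such as `ToyDecoder` and `extend_by_duplication` are hypothetical.

```python
# Toy sketch of "layer extension" (depth up-scaling by duplicating existing
# transformer layers). Illustration only: the layer choice, initialization,
# and continued training used for Babel follow the paper, not this snippet.
import copy
import torch
import torch.nn as nn

class ToyDecoder(nn.Module):
    """A small stand-in for a decoder-only LLM: a stack of transformer layers."""
    def __init__(self, d_model: int = 64, n_heads: int = 4, n_layers: int = 8):
        super().__init__()
        self.layers = nn.ModuleList(
            nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
            for _ in range(n_layers)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        for layer in self.layers:
            x = layer(x)
        return x

def extend_by_duplication(model: ToyDecoder, insert_every: int = 4) -> ToyDecoder:
    """Insert a copy after every `insert_every`-th layer, growing depth (and
    parameter count) while reusing existing weights to initialize new layers."""
    new_layers = []
    for i, layer in enumerate(model.layers):
        new_layers.append(layer)
        if (i + 1) % insert_every == 0:
            new_layers.append(copy.deepcopy(layer))
    model.layers = nn.ModuleList(new_layers)
    return model

model = extend_by_duplication(ToyDecoder())
print(len(model.layers))                    # 8 original layers + 2 copies = 10
print(model(torch.randn(1, 16, 64)).shape)  # torch.Size([1, 16, 64])
```

The appeal over plain continued pretraining is that the added layers raise the model's capacity ceiling while starting from weights that already encode useful knowledge.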

Community

Paper author · Paper submitter

🌟 Key Highlights:
1️⃣ Covering 90% of the global population: supports the top 25 languages by number of speakers, prioritizing widely spoken but previously underexplored languages in open multilingual models.

2️⃣ Innovative architecture: unlike traditional continued pretraining approaches, Babel expands its parameter count through model extension, raising its performance ceiling.

3️⃣ Two powerful variants
💡 Babel-9B: designed for efficient inference and fine-tuning (see the usage sketch after this list).
💡 Babel-83B: a new benchmark for open multilingual LLMs.
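For readers who want to try the released checkpoints, here is a minimal, hypothetical inference sketch using the Hugging Face transformers chat-template API. The repository id `Tower-Babel/Babel-9B-Chat` is an assumption and should be verified against the "Models citing this paper" list below.

```python
# Hypothetical usage sketch for a Babel chat checkpoint via transformers.
# The repo id below is an assumption -- check the model list on the Hub.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Tower-Babel/Babel-9B-Chat"  # assumed repository id; verify on the Hub
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

# Build a chat prompt with the tokenizer's chat template and generate a reply.
messages = [{"role": "user", "content": "Translate 'hello' into Vietnamese."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
outputs = model.generate(inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```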

This is an automated message from the Librarian Bot. I found the following papers similar to this paper.

The following papers were recommended by the Semantic Scholar API

Please give a thumbs up to this comment if you found it helpful!

If you want recommendations for any paper on Hugging Face, check out this Space

You can directly ask Librarian Bot for paper recommendations by tagging it in a comment: @librarian-bot recommend


Models citing this paper 4

Datasets citing this paper 0

No datasets link this paper yet

Cite arxiv.org/abs/2503.00865 in a dataset README.md to link it from this page.

Spaces citing this paper 1

Collections including this paper 10