AI & ML interests
Massive Text Embedding Benchmark
Papers
MAEB: Massive Audio Embedding Benchmark
HUME: Measuring the Human-Model Performance Gap in Text Embedding Task
Organization Card
MTEB is a Python framework for evaluating embeddings and retrieval systems for both text and images. It covers more than 1,000 languages and a diverse set of tasks, ranging from classics such as classification and clustering to use-case-specific tasks such as legal, code, or healthcare retrieval.
To get started with mteb, check out our documentation.
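As a quick illustration of the typical workflow, the sketch below selects a couple of tasks and evaluates a Sentence Transformers model on them. It is a minimal sketch, assuming the documented mteb Python interface (`get_tasks`, `MTEB`, `run`); the model name, task names, and output folder are illustrative.

```python
# Minimal sketch: evaluate a Sentence Transformers model on two MTEB tasks.
# Assumes `pip install mteb sentence-transformers`; names and paths are illustrative.
import mteb
from sentence_transformers import SentenceTransformer

# Any embedding model exposing an `encode` interface can be evaluated.
model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")

# Select tasks by name; tasks can also be filtered, e.g. by language or task type.
tasks = mteb.get_tasks(tasks=["Banking77Classification", "STS12"])

evaluation = mteb.MTEB(tasks=tasks)
results = evaluation.run(model, output_folder="results/all-MiniLM-L6-v2")

# Per-task scores are returned as result objects and also written as JSON
# files to the output folder.
print(results)
```

The same workflow applies to whole benchmarks, which bundle predefined sets of tasks, and the resulting scores can be compared against the leaderboard entries for the same tasks.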
| Overview | |
|---|---|
| Leaderboard | The interactive leaderboard of the benchmark |
| Get Started | |
| Get Started | Overview of how to use mteb |
| Defining Models | How to use existing models and define custom ones (a minimal custom-model sketch follows this table) |
| Selecting tasks | How to select tasks, benchmarks, splits, etc. |
| Running Evaluation | How to run evaluations, including cache management, speeding up evaluations, etc. |
| Loading Results | How to load and work with existing model results |
| Overview | |
| Tasks | Overview of available tasks |
| Benchmarks | Overview of available benchmarks |
| Models | Overview of available models |
| Contributing | |
| Adding a model | How to submit a model to MTEB and to the leaderboard |
| Adding a dataset | How to add a new task/dataset to MTEB |
| Adding a benchmark | How to add a new benchmark to MTEB and to the leaderboard |
| Contributing | How to contribute to MTEB and set it up for development |
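For the "Defining Models" entry above, the sketch below shows the custom-model pattern in its simplest form: an object with an `encode` method that returns one embedding vector per input text. This is a hedged sketch; the exact encoder protocol (keyword arguments passed to `encode`, required model metadata) varies between mteb versions, so treat the signature as an assumption and check the Defining Models documentation.

```python
# Hypothetical custom encoder: any object with an `encode` method returning a
# (n_texts, dim) array can, in principle, be passed to the evaluation loop.
# The exact signature mteb expects may differ by version; check the docs.
import numpy as np
import mteb


class RandomEncoder:
    """Toy encoder returning random vectors; swap in a real embedding model."""

    def __init__(self, dim: int = 384, seed: int = 0):
        self.dim = dim
        self.rng = np.random.default_rng(seed)

    def encode(self, sentences, **kwargs) -> np.ndarray:
        # mteb passes batches of texts plus task-specific keyword arguments.
        return self.rng.standard_normal((len(sentences), self.dim))


tasks = mteb.get_tasks(tasks=["Banking77Classification"])  # illustrative task
evaluation = mteb.MTEB(tasks=tasks)
evaluation.run(RandomEncoder(), output_folder="results/random-encoder")
```

To surface such a model on the leaderboard, follow the "Adding a model" guide listed above.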
Spaces (5)
- MTEB Leaderboard: Embedding Leaderboard (pinned, running on CPU Upgrade, 7.05k likes)
- MTEB Legacy Leaderboard: Explore and filter MTEB model benchmark results (running, 37 likes)
- Leaderboard Dev: Dedicated display for RTEB benchmark results (running, featured, 11 likes)
- MTEB Arena: Display MTEB Arena interface (running, 116 likes)
Datasets (1,436)
- mteb/results (167k downloads, 1 like)
- mteb/worldqa (3.28k rows)
- mteb/arguana (11.5k rows, 9.21k downloads, 1 like)
- mteb/MVBench (3.9k rows)
- mteb/panda-70m (3.4k rows, 7 downloads)
- mteb/AudioCaps_AV (666 rows, 5 downloads)
- mteb/VALOR-32K (3.49k rows, 7 downloads)
- mteb/AVE-Dataset (3.71k rows, 6 downloads)
- mteb/VGGSound_AV_RETRIEVAL (696 rows, 6 downloads)
- mteb/Shot2Story20K_test (4.02k rows, 5 downloads)