Paper page - SimWorld: An Open-ended Realistic Simulator for Autonomous Agents in Physical and Social Worlds
Papers
arxiv:2512.01078

SimWorld: An Open-ended Realistic Simulator for Autonomous Agents in Physical and Social Worlds

Published on Nov 30, 2025 · Submitted by Lingjun Mao on Dec 3, 2025
Authors: Jiawei Ren, Yan Zhuang, Xiaokang Ye, Lingjun Mao, Xuhong He, Jianzhi Shen, Mrinaal Dogra, Yiming Liang, Ruixuan Zhang, Tianai Yue, Yiqing Yang, Eric Liu, Ryan Wu, Kevin Benavente, Rajiv Mandya Nagaraju, Muhammad Faayez, Xiyan Zhang, Dhruv Vivek Sharma, Xianrui Zhong, Ziqiao Ma, Tianmin Shu, Zhiting Hu, Lianhui Qin
Abstract

SimWorld, a new Unreal Engine 5-based simulator, enables the development and evaluation of LLM/VLM agents in realistic, real-world-like settings with diverse physical and social reasoning scenarios.

AI-generated summary

While LLM/VLM-powered AI agents have advanced rapidly in math, coding, and computer use, their applications in complex physical and social environments remain challenging. Building agents that can survive and thrive in the real world (for example, by autonomously earning income or running a business) requires massive-scale interaction, reasoning, training, and evaluation across diverse embodied scenarios. However, existing world simulators for such development fall short: they often rely on limited hand-crafted environments, simulate simplified game-like physics and social rules, and lack native support for LLM/VLM agents. We introduce SimWorld, a new simulator built on Unreal Engine 5, designed for developing and evaluating LLM/VLM agents in rich, real-world-like settings. SimWorld offers three core capabilities: (1) realistic, open-ended world simulation, including accurate physical and social dynamics and language-driven procedural environment generation; (2) a rich interface for LLM/VLM agents, with multimodal world inputs and open-vocabulary actions at varying levels of abstraction; and (3) diverse and extensible physical and social reasoning scenarios that are easily customizable by users. We demonstrate SimWorld by deploying frontier LLM agents (e.g., GPT-4o, Gemini-2.5-Flash, Claude-3.5, and DeepSeek-Prover-V2) on long-horizon multi-agent delivery tasks involving strategic cooperation and competition. The results reveal distinct reasoning patterns and limitations across models. We open-source SimWorld and hope it becomes a foundational platform for advancing real-world agent intelligence across disciplines: https://simworld.org.
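The abstract describes an agent interface built around multimodal world inputs and open-vocabulary text actions. As a rough illustration of what such an observe-reason-act loop looks like in general, here is a minimal, self-contained Python sketch; the names (`WorldClient`, `observe`, `act`, `run_episode`) are hypothetical stand-ins and do not reflect SimWorld's actual API.

```python
# Hypothetical sketch of a language-agent/simulator loop.
# WorldClient, Observation, observe, and act are illustrative names,
# NOT SimWorld's real interface.
from dataclasses import dataclass, field

@dataclass
class Observation:
    """Multimodal world input: an image frame plus a text description."""
    frame: bytes
    description: str

@dataclass
class WorldClient:
    """Toy stand-in for a connection to a simulator."""
    step_count: int = 0
    log: list = field(default_factory=list)

    def observe(self) -> Observation:
        # A real simulator would return a rendered frame here.
        return Observation(frame=b"", description=f"step {self.step_count}")

    def act(self, command: str) -> None:
        # Open-vocabulary actions arrive as free-form text; the simulator
        # would ground them (e.g. "walk to the red truck").
        self.log.append(command)
        self.step_count += 1

def run_episode(world: WorldClient, policy, max_steps: int = 3) -> list:
    """Generic observe -> reason -> act loop for a language agent."""
    for _ in range(max_steps):
        obs = world.observe()
        world.act(policy(obs))  # policy would be an LLM/VLM call
    return world.log

# A trivial callable standing in for the LLM policy.
actions = run_episode(WorldClient(), lambda obs: f"explore ({obs.description})")
print(actions)  # ['explore (step 0)', 'explore (step 1)', 'explore (step 2)']
```

The point of the sketch is only the control flow: the agent receives multimodal observations, produces a free-text action at whatever level of abstraction it chooses, and the environment is responsible for grounding it.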

Community

Paper author · Paper submitter · edited Dec 3, 2025

Hi! I recently submitted my paper to Daily Papers, but I accidentally used the wrong thumbnail as the cover.

Paper link: https://huggingface.co/papers/2512.01078

My Hugging Face username: mao1207

Would it be possible to remove this Daily Papers entry so that I can resubmit it with the correct video?

cc @akhaliq
Thank you so much!

This is an automated message from the Librarian Bot. I found the following papers similar to this paper.

The following papers were recommended by the Semantic Scholar API:

- FreeAskWorld: An Interactive and Closed-Loop Simulator for Human-Centric Embodied AI (2025)
- World-in-World: World Models in a Closed-Loop World (2025)
- A Comprehensive Survey on World Models for Embodied AI (2025)
- SocialNav: Training Human-Inspired Foundation Model for Socially-Aware Embodied Navigation (2025)
- ATLAS: Actor-Critic Task-Completion with Look-ahead Action Simulation (2025)
- PAN: A World Model for General, Interactable, and Long-Horizon World Simulation (2025)
- UltraCUA: A Foundation Model for Computer Use Agents with Hybrid Action (2025)

Please give a thumbs up to this comment if you found it helpful!

If you want recommendations for any paper on Hugging Face, check out this Space

You can directly ask Librarian Bot for paper recommendations by tagging it in a comment: @librarian-bot recommend


Models citing this paper 0

No model linking this paper

Cite arxiv.org/abs/2512.01078 in a model README.md to link it from this page.

Datasets citing this paper 1

Spaces citing this paper 0

No Space linking this paper

Cite arxiv.org/abs/2512.01078 in a Space README.md to link it from this page.

Collections including this paper 1