LatentChem: From Textual CoT to Latent Thinking in Chemical Reasoning
\n","updatedAt":"2026-02-11T01:42:35.764Z","author":{"_id":"63d3e0e8ff1384ce6c5dd17d","avatarUrl":"https://cdn-avatars.huggingface.co/v1/production/uploads/1674830754237-63d3e0e8ff1384ce6c5dd17d.jpeg","fullname":"Librarian Bot (Bot)","name":"librarian-bot","type":"user","isPro":false,"isHf":false,"isHfAdmin":false,"isMod":false,"followerCount":318,"isUserFollowing":false}},"numEdits":0,"identifiedLanguage":{"language":"en","probability":0.7347308397293091},"editors":["librarian-bot"],"editorAvatarUrls":["https://cdn-avatars.huggingface.co/v1/production/uploads/1674830754237-63d3e0e8ff1384ce6c5dd17d.jpeg"],"reactions":[],"isReport":false}}],"primaryEmailConfirmed":false,"paper":{"id":"2602.07075","authors":[{"_id":"698a9b8d1b2dc6b37d61af1a","user":{"_id":"648c8f9eb8f4a3542b7f065b","avatarUrl":"/avatars/7320b2b940279755c1c53454fd028594.svg","isPro":false,"fullname":"Xinwu Ye","user":"XinwuYe","type":"user"},"name":"Xinwu Ye","status":"claimed_verified","statusLastChangedAt":"2026-02-10T09:06:51.472Z","hidden":false},{"_id":"698a9b8d1b2dc6b37d61af1b","name":"Yicheng Mao","hidden":false},{"_id":"698a9b8d1b2dc6b37d61af1c","name":"Jia Zhang","hidden":false},{"_id":"698a9b8d1b2dc6b37d61af1d","user":{"_id":"682259cdf0cb2560fcc41f4e","avatarUrl":"https://cdn-avatars.huggingface.co/v1/production/uploads/no-auth/BwG1kcULZ3BNVSYRKEi47.png","isPro":false,"fullname":"Yoyo Liu","user":"yoyoliuuu","type":"user"},"name":"Yimeng Liu","status":"claimed_verified","statusLastChangedAt":"2026-02-10T09:06:48.819Z","hidden":false},{"_id":"698a9b8d1b2dc6b37d61af1e","name":"Li Hao","hidden":false},{"_id":"698a9b8d1b2dc6b37d61af1f","name":"Fang Wu","hidden":false},{"_id":"698a9b8d1b2dc6b37d61af20","name":"Zhiwei Li","hidden":false},{"_id":"698a9b8d1b2dc6b37d61af21","name":"Yuxuan Liao","hidden":false},{"_id":"698a9b8d1b2dc6b37d61af22","name":"Zehong Wang","hidden":false},{"_id":"698a9b8d1b2dc6b37d61af23","name":"Zhiyuan Liu","hidden":false},{"_id":"698a9b8d1b2dc6b37d61af24","user":{"_id":"64e314ad24809d7fa0f20fbc","avatarUrl":"https://cdn-avatars.huggingface.co/v1/production/uploads/noauth/bHE0w_hjDFvU-Aul0_E7g.jpeg","isPro":false,"fullname":"Zhenfei Yin","user":"JeremyYin","type":"user"},"name":"Zhenfei Yin","status":"claimed_verified","statusLastChangedAt":"2026-02-10T09:06:53.788Z","hidden":false},{"_id":"698a9b8d1b2dc6b37d61af25","name":"Li Yuan","hidden":false},{"_id":"698a9b8d1b2dc6b37d61af26","name":"Philip Torr","hidden":false},{"_id":"698a9b8d1b2dc6b37d61af27","name":"Huan Sun","hidden":false},{"_id":"698a9b8d1b2dc6b37d61af28","name":"Xiangxiang Zeng","hidden":false},{"_id":"698a9b8d1b2dc6b37d61af29","name":"Mengdi Wang","hidden":false},{"_id":"698a9b8d1b2dc6b37d61af2a","name":"Le Cong","hidden":false},{"_id":"698a9b8d1b2dc6b37d61af2b","name":"Shenghua Gao","hidden":false},{"_id":"698a9b8d1b2dc6b37d61af2c","name":"Xiangru Tang","hidden":false}],"publishedAt":"2026-02-06T01:28:27.000Z","submittedOnDailyAt":"2026-02-10T01:22:01.665Z","title":"LatentChem: From Textual CoT to Latent Thinking in Chemical Reasoning","submittedOnDailyBy":{"_id":"63357c608adfa81faf2ac180","avatarUrl":"/avatars/ae0314c644f882251baf59b9134fd36f.svg","isPro":false,"fullname":"Xiangru Tang","user":"RTT1","type":"user"},"summary":"Chemical large language models (LLMs) predominantly rely on explicit Chain-of-Thought (CoT) in natural language to perform complex reasoning. 
However, chemical reasoning is inherently continuous and structural, and forcing it into discrete linguistic tokens introduces a fundamental representation mismatch that constrains both efficiency and performance. We introduce LatentChem, a latent reasoning interface that decouples chemical computation from textual generation, enabling models to perform multi-step reasoning directly in continuous latent space while emitting language only for final outputs. Remarkably, we observe a consistent emergent behavior: when optimized solely for task success, models spontaneously internalize reasoning, progressively abandoning verbose textual derivations in favor of implicit latent computation. This shift is not merely stylistic but computationally advantageous. Across diverse chemical reasoning benchmarks, LatentChem achieves a 59.88\\% non-tie win rate over strong CoT-based baselines on ChemCoTBench, while delivering a 10.84times average inference speedup. Our results provide empirical evidence that chemical reasoning is more naturally and effectively realized as continuous latent dynamics rather than discretized linguistic trajectories.","upvotes":18,"discussionId":"698a9b8d1b2dc6b37d61af2d","githubRepo":"https://github.com/xinwuye/LatentChem","githubRepoAddedBy":"user","ai_summary":"LatentChem enables chemical reasoning through continuous latent space computations instead of discrete textual tokens, achieving superior performance and efficiency compared to traditional chain-of-thought approaches.","ai_keywords":["chemical large language models","Chain-of-Thought","latent reasoning","continuous latent space","textual generation","multi-step reasoning","ChemCoTBench","inference speedup"],"githubStars":21},"canReadDatabase":false,"canManagePapers":false,"canSubmit":false,"hasHfLevelAccess":false,"upvoted":false,"upvoters":[{"_id":"648c8f9eb8f4a3542b7f065b","avatarUrl":"/avatars/7320b2b940279755c1c53454fd028594.svg","isPro":false,"fullname":"Xinwu Ye","user":"XinwuYe","type":"user"},{"_id":"63357c608adfa81faf2ac180","avatarUrl":"/avatars/ae0314c644f882251baf59b9134fd36f.svg","isPro":false,"fullname":"Xiangru Tang","user":"RTT1","type":"user"},{"_id":"675e0d5cdd3e9eeed6954f5a","avatarUrl":"https://cdn-avatars.huggingface.co/v1/production/uploads/no-auth/7oMEoBmaFiCR9K2q9Z_7q.png","isPro":false,"fullname":"Fang Wu","user":"fangwu97","type":"user"},{"_id":"665c4047023241e1898cea75","avatarUrl":"/avatars/6e3111ba6f55462af58d28e186af4d98.svg","isPro":false,"fullname":"zhangjia","user":"changjiakawhi","type":"user"},{"_id":"64e314ad24809d7fa0f20fbc","avatarUrl":"https://cdn-avatars.huggingface.co/v1/production/uploads/noauth/bHE0w_hjDFvU-Aul0_E7g.jpeg","isPro":false,"fullname":"Zhenfei Yin","user":"JeremyYin","type":"user"},{"_id":"65f3f43fc9940817ca9a427b","avatarUrl":"https://cdn-avatars.huggingface.co/v1/production/uploads/65f3f43fc9940817ca9a427b/02NN3XjSsbgWDhjrJWtVL.jpeg","isPro":false,"fullname":"Wanghan Xu","user":"CoCoOne","type":"user"},{"_id":"64eadcb03d76028d805a7818","avatarUrl":"/avatars/528e4fded4419caf08589b2ed40437bc.svg","isPro":false,"fullname":"Li Kang","user":"FACEONG","type":"user"},{"_id":"67930201aad25d3eecab81cc","avatarUrl":"/avatars/afa8e19ccd5214979e405caf462d7a72.svg","isPro":false,"fullname":"ZiyangZhou","user":"AzHouangy","type":"user"},{"_id":"62970df979f193515da13dc0","avatarUrl":"https://cdn-avatars.huggingface.co/v1/production/uploads/62970df979f193515da13dc0/A-mgKIcgTXRJ54GCHswTq.jpeg","isPro":false,"fullname":"Yanjun 
Shao","user":"super-dainiu","type":"user"},{"_id":"6434c9dc4b34368fdb07d421","avatarUrl":"https://cdn-avatars.huggingface.co/v1/production/uploads/6434c9dc4b34368fdb07d421/V_afg81iuNyMFfhM7qdgB.jpeg","isPro":false,"fullname":"fansunqi","user":"fansunqi","type":"user"},{"_id":"634ec067aae4bde2c8dfc86f","avatarUrl":"https://cdn-avatars.huggingface.co/v1/production/uploads/634ec067aae4bde2c8dfc86f/OQBLKcspofUqAzmEpvH0-.png","isPro":false,"fullname":"Yamata Zen","user":"yamatazen","type":"user"},{"_id":"675764b1a55640463e079271","avatarUrl":"https://cdn-avatars.huggingface.co/v1/production/uploads/no-auth/6EMvM7M8uP9K7EeVMqAz-.png","isPro":false,"fullname":"Yinxi Li","user":"Yinxxx","type":"user"}],"acceptLanguages":["*"],"dailyPaperRank":0}">
AI-generated summary

LatentChem enables chemical reasoning through continuous latent space computations instead of discrete textual tokens, achieving superior performance and efficiency compared to traditional chain-of-thought approaches.
Abstract

Chemical large language models (LLMs) predominantly rely on explicit Chain-of-Thought (CoT) in natural language to perform complex reasoning. However, chemical reasoning is inherently continuous and structural, and forcing it into discrete linguistic tokens introduces a fundamental representation mismatch that constrains both efficiency and performance. We introduce LatentChem, a latent reasoning interface that decouples chemical computation from textual generation, enabling models to perform multi-step reasoning directly in continuous latent space while emitting language only for final outputs. Remarkably, we observe a consistent emergent behavior: when optimized solely for task success, models spontaneously internalize reasoning, progressively abandoning verbose textual derivations in favor of implicit latent computation. This shift is not merely stylistic but computationally advantageous. Across diverse chemical reasoning benchmarks, LatentChem achieves a 59.88% non-tie win rate over strong CoT-based baselines on ChemCoTBench, while delivering a 10.84× average inference speedup. Our results provide empirical evidence that chemical reasoning is more naturally and effectively realized as continuous latent dynamics rather than discretized linguistic trajectories.
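The abstract describes a latent reasoning interface in which the model rolls its hidden state forward for several steps without decoding intermediate tokens, emitting language only for the final answer. The snippet below is a minimal, illustrative PyTorch sketch of that general idea; the `LatentReasoner` class, the GRU backbone, the fixed number of latent steps, and all sizes are assumptions made for demonstration and do not reflect the authors' actual architecture or training procedure.

```python
# Minimal sketch (not the authors' implementation) of latent-space reasoning:
# the model iterates on its continuous hidden state for several "thinking"
# steps without emitting tokens, then decodes only the final answer.
import torch
import torch.nn as nn


class LatentReasoner(nn.Module):
    def __init__(self, vocab_size=1000, d_model=256, n_latent_steps=4):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        # Stand-in for a transformer backbone; illustrative only.
        self.backbone = nn.GRUCell(d_model, d_model)
        self.lm_head = nn.Linear(d_model, vocab_size)
        self.n_latent_steps = n_latent_steps

    def forward(self, input_ids):
        # Encode the prompt token by token into a continuous state.
        h = torch.zeros(input_ids.size(0), self.embed.embedding_dim)
        for t in range(input_ids.size(1)):
            h = self.backbone(self.embed(input_ids[:, t]), h)

        # Latent "thinking": feed the continuous state back into the model
        # for a fixed number of steps, producing no textual tokens.
        for _ in range(self.n_latent_steps):
            h = self.backbone(h, h)

        # Language is produced only for the final output.
        return self.lm_head(h)


if __name__ == "__main__":
    model = LatentReasoner()
    prompt = torch.randint(0, 1000, (2, 8))   # batch of 2 dummy prompts
    logits = model(prompt)                    # (2, vocab_size) answer logits
    print(logits.shape)
```

Skipping token-by-token generation of intermediate reasoning is also what makes the reported inference speedup plausible: the latent steps avoid repeated decoding and re-encoding of a long textual chain of thought.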