ccvl (CCVL at JHU)


The main goal of the CCVL (Computational Cognition, Vision, and Learning) research group is to develop mathematical models of vision and cognition. These models are intended primarily for designing artificial (computer) vision systems, with learning used to extract knowledge from data; practical applications include vision for the disabled. The models also serve as computational models of biological vision that can be tested with behavioral methods and, in collaborative projects, with invasive and non-invasive neuroscience techniques. We also study how humans and animals perform cognitive tasks such as learning and reasoning, and we apply machine learning to interpreting medical images and studying brain function.


Lab homepage: ccvl.jhu.edu


Lab members: ccvl.jhu.edu/team/


AI & ML interests

Computational Cognition, Vision, and Learning

