CVPR (CVPR Demo Track)

This organization invites participants to show off their conference papers on Hugging Face as Gradio web demos.

Join the organization by clicking here.

Hugging Face Gradio CVPR event

The CVPR organization is accepting Gradio demo submissions for CVPR papers from anyone, for a chance to win prizes from Hugging Face; see the prizes section and the leaderboard below. The deadline to submit demos is June 30th, 2022 (AOE time zone). All participants are welcome to submit Gradio demos for any CVPR paper, and you can submit demos for multiple papers. Find a tutorial on getting started with Gradio on Hugging Face here, and get started with the new Gradio Blocks API here.

Hugging Face Prizes


Leaderboard for Most Popular CVPR Spaces


See the CVPR Leaderboard

\n \"Gradio\n

Hugging Face Spaces & Gradio for Showcasing your CVPR '22 Demo

In this tutorial, we will demonstrate how to showcase your demo with an easy-to-use web interface using the Gradio Python library and host it on Hugging Face Spaces so that conference attendees can easily find and try out your demos. Also see https://gradio.app/introduction_to_blocks/ for a more flexible way to build Gradio demos.

🚀 Create a Gradio Demo from your Model

The first step is to create a web demo from your model. As an example, we will be creating a demo from an image classification model (called `model`), which we will be uploading to Spaces. The full code for steps 1-4 can be found in this colab notebook.
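Here, `model` stands for whatever trained network you want to demo, together with a matching list of class labels. As a rough sketch of that setup (not part of the tutorial; it assumes a torchvision ResNet-18 and a hypothetical imagenet_classes.txt file with one class name per line), it might look like this:

```python
from torchvision import models

# Stand-in classifier for this sketch -- swap in your own trained model.
model = models.resnet18(pretrained=True).eval()

# Hypothetical label file: one of the 1000 ImageNet class names per line.
with open("imagenet_classes.txt") as f:
    labels = [line.strip() for line in f]
```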

1. Install the gradio library

All you need to do is run this in the terminal: `pip install gradio`
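If you want to double-check the install from Python (an optional step, not part of the tutorial), you can print the installed version:

```python
import gradio as gr

# Prints the installed Gradio version string.
print(gr.__version__)
```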

2. Define a function in your Python code that performs inference with your model on a data point and returns the prediction

Here we define our image classification model's prediction function in PyTorch (any framework, like TensorFlow, scikit-learn, JAX, or plain Python, will work as well):

import torch
from PIL import Image
from torchvision import transforms

def predict(inp):
  # `inp` arrives from Gradio as a numpy array; `model` and `labels` are the
  # classifier and class names loaded earlier.
  inp = Image.fromarray(inp.astype('uint8'), 'RGB')
  inp = transforms.ToTensor()(inp).unsqueeze(0)
  with torch.no_grad():
    prediction = torch.nn.functional.softmax(model(inp)[0], dim=0)
  return {labels[i]: float(prediction[i]) for i in range(1000)}
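Before wiring the function into Gradio, you can sanity-check it directly in Python. A small illustrative check (not from the tutorial) that feeds predict a random image-shaped array might look like this:

```python
import numpy as np

# Fake 224x224 RGB image, the kind of array Gradio's Image input passes in.
dummy = np.random.randint(0, 255, size=(224, 224, 3), dtype=np.uint8)

# Print the three most probable (label, probability) pairs.
top3 = sorted(predict(dummy).items(), key=lambda kv: kv[1], reverse=True)[:3]
print(top3)
```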

3. Then create a Gradio Interface using the function and the appropriate input and output types

For the image classification model from Step 2, it would look like this:

import gradio as gr

# Image input component and a Label output showing the top 3 classes.
inputs = gr.inputs.Image()
outputs = gr.outputs.Label(num_top_classes=3)
io = gr.Interface(fn=predict, inputs=inputs, outputs=outputs)
If you need help creating a Gradio Interface for your model, check out the Gradio Getting Started guide.
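For reference, the more flexible Blocks API mentioned earlier can express the same demo. A rough sketch (not part of this tutorial; it assumes Gradio 3.x and the predict function from Step 2) might look like this:

```python
import gradio as gr

# Same image-classification demo, laid out with Blocks instead of Interface.
with gr.Blocks() as demo:
    image = gr.Image()
    label = gr.Label(num_top_classes=3)
    button = gr.Button("Classify")
    button.click(fn=predict, inputs=image, outputs=label)
```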

4. Then launch() your Interface to confirm that it runs correctly locally (or wherever you are running Python)

io.launch()
You should see a web interface like the following where you can drag and drop your data points and see the predictions:

\n\"Gradio\n
Track","name":"CVPR","type":"org","isHf":false,"isHfAdmin":false,"isMod":false,"followerCount":338,"isUserFollowing":false},"colorFrom":"indigo","colorTo":"indigo","createdAt":"2022-06-06T23:15:47.000Z","emoji":"πŸ“‰","id":"CVPR/README","lastModified":"2022-06-26T17:26:21.000Z","likes":0,"pinned":false,"private":false,"sdk":"static","repoType":"space","runtime":{"stage":"RUNNING","hardware":{"current":null,"requested":null},"storage":null,"replicas":{"requested":1,"current":1}},"title":"README","isLikedByUser":false,"trendingScore":0,"tags":["static","region:us"],"featured":false},{"author":"CVPR","authorData":{"_id":"61e118ef39fa65f4b2bc9d2c","avatarUrl":"https://cdn-avatars.huggingface.co/v1/production/uploads/1642793296366-61e1188afc27c0f5e3641eb3.jpeg","fullname":"CVPR Demo Track","name":"CVPR","type":"org","isHf":false,"isHfAdmin":false,"isMod":false,"followerCount":338,"isUserFollowing":false},"colorFrom":"red","colorTo":"purple","createdAt":"2022-06-09T22:52:07.000Z","emoji":"🏒","id":"CVPR/unicl-zero-shot-img-recog","lastModified":"2022-06-23T15:23:13.000Z","likes":28,"pinned":false,"private":false,"sdk":"gradio","repoType":"space","runtime":{"stage":"BUILD_ERROR","hardware":{"current":null,"requested":"cpu-basic"},"storage":null,"gcTimeout":86400,"errorMessage":"Unexpected build error","replicas":{"requested":1},"devMode":false,"domains":[{"domain":"cvpr-unicl-zero-shot-img-recog.hf.space","stage":"READY"}]},"title":"Unicl Zero-Shot Image Recognition Demo","isLikedByUser":false,"trendingScore":0,"tags":["gradio","region:us"],"featured":true},{"author":"CVPR","authorData":{"_id":"61e118ef39fa65f4b2bc9d2c","avatarUrl":"https://cdn-avatars.huggingface.co/v1/production/uploads/1642793296366-61e1188afc27c0f5e3641eb3.jpeg","fullname":"CVPR Demo Track","name":"CVPR","type":"org","isHf":false,"isHfAdmin":false,"isMod":false,"followerCount":338,"isUserFollowing":false},"colorFrom":"indigo","colorTo":"gray","createdAt":"2022-06-13T18:41:56.000Z","emoji":"πŸ“ˆ","id":"CVPR/Leaderboard","lastModified":"2022-06-23T03:09:51.000Z","likes":9,"pinned":false,"private":false,"sdk":"gradio","repoType":"space","runtime":{"stage":"RUNTIME_ERROR","hardware":{"current":null,"requested":"cpu-basic"},"storage":null,"gcTimeout":86400,"errorMessage":"failed to create containerd task: failed to create shim task: context canceled: unknown","replicas":{"requested":1},"devMode":false,"domains":[{"domain":"cvpr-leaderboard.hf.space","stage":"READY"}]},"title":"Leaderboard","isLikedByUser":false,"trendingScore":0,"tags":["gradio","region:us"],"featured":false},{"author":"CVPR","authorData":{"_id":"61e118ef39fa65f4b2bc9d2c","avatarUrl":"https://cdn-avatars.huggingface.co/v1/production/uploads/1642793296366-61e1188afc27c0f5e3641eb3.jpeg","fullname":"CVPR Demo Track","name":"CVPR","type":"org","isHf":false,"isHfAdmin":false,"isMod":false,"followerCount":338,"isUserFollowing":false},"colorFrom":"red","colorTo":"gray","createdAt":"2022-06-21T14:55:43.000Z","emoji":"πŸ€–","id":"CVPR/flava-multimodal-zero-shot","lastModified":"2022-06-22T03:55:55.000Z","likes":2,"pinned":false,"private":false,"sdk":"gradio","repoType":"space","runtime":{"stage":"RUNTIME_ERROR","hardware":{"current":null,"requested":"cpu-basic"},"storage":null,"gcTimeout":86400,"errorMessage":"","replicas":{"requested":1},"devMode":false,"domains":[{"domain":"cvpr-flava-multimodal-zero-shot.hf.space","stage":"READY"}]},"title":"FLAVA MultiModal Zero 
Shot","isLikedByUser":false,"trendingScore":0,"tags":["gradio","region:us"],"featured":false},{"author":"CVPR","authorData":{"_id":"61e118ef39fa65f4b2bc9d2c","avatarUrl":"https://cdn-avatars.huggingface.co/v1/production/uploads/1642793296366-61e1188afc27c0f5e3641eb3.jpeg","fullname":"CVPR Demo Track","name":"CVPR","type":"org","isHf":false,"isHfAdmin":false,"isMod":false,"followerCount":338,"isUserFollowing":false},"colorFrom":"green","colorTo":"green","createdAt":"2022-06-21T19:54:32.000Z","emoji":"🧏","id":"CVPR/SPOTER_Sign_Language_Recognition","lastModified":"2022-06-21T20:36:56.000Z","likes":8,"pinned":false,"private":false,"sdk":"gradio","repoType":"space","runtime":{"stage":"BUILD_ERROR","hardware":{"current":null,"requested":"cpu-basic"},"storage":null,"gcTimeout":86400,"errorMessage":"Unexpected build error","replicas":{"requested":1},"devMode":false,"domains":[{"domain":"cvpr-spoter-sign-language-recognition.hf.space","stage":"READY"}]},"title":"Spoter Sign language recognition demo","isLikedByUser":false,"trendingScore":0,"tags":["gradio","region:us"],"featured":false},{"author":"CVPR","authorData":{"_id":"61e118ef39fa65f4b2bc9d2c","avatarUrl":"https://cdn-avatars.huggingface.co/v1/production/uploads/1642793296366-61e1188afc27c0f5e3641eb3.jpeg","fullname":"CVPR Demo Track","name":"CVPR","type":"org","isHf":false,"isHfAdmin":false,"isMod":false,"followerCount":338,"isUserFollowing":false},"colorFrom":"blue","colorTo":"gray","createdAt":"2022-06-21T14:07:36.000Z","emoji":"πŸ›Ή","id":"CVPR/winoground-explorer","lastModified":"2022-06-21T18:32:08.000Z","likes":6,"pinned":false,"private":false,"sdk":"gradio","repoType":"space","runtime":{"stage":"RUNTIME_ERROR","hardware":{"current":null,"requested":"cpu-basic"},"storage":null,"gcTimeout":86400,"errorMessage":"/home/user/.local/lib/python3.8/site-packages/datasets/load.py:2089: FutureWarning: 'use_auth_token' was deprecated in favor of 'token' in version 2.14.0 and will be removed in 3.0.0.\nYou can remove this warning by passing 'token=hf_**********************************' instead.\n warnings.warn(\nTraceback (most recent call last):\n File \"app.py\", line 7, in \n winoground = load_dataset(\"facebook/winoground\", use_auth_token=auth_token)[\"test\"]\n File \"/home/user/.local/lib/python3.8/site-packages/datasets/load.py\", line 2129, in load_dataset\n builder_instance = load_dataset_builder(\n File \"/home/user/.local/lib/python3.8/site-packages/datasets/load.py\", line 1815, in load_dataset_builder\n dataset_module = dataset_module_factory(\n File \"/home/user/.local/lib/python3.8/site-packages/datasets/load.py\", line 1508, in dataset_module_factory\n raise FileNotFoundError(\nFileNotFoundError: Couldn't find a dataset script at /home/user/app/facebook/winoground/winoground.py or any data file in the same directory. Couldn't find 'facebook/winoground' on the Hugging Face Hub either: FileNotFoundError: Dataset 'facebook/winoground' doesn't exist on the Hub. 
If the repo is private or gated, make sure to log in with `huggingface-cli login`.\n","replicas":{"requested":1},"devMode":false,"domains":[{"domain":"cvpr-winoground-explorer.hf.space","stage":"READY"}]},"title":"WinoGround Explorer","isLikedByUser":false,"trendingScore":0,"tags":["gradio","region:us"],"featured":false},{"author":"CVPR","authorData":{"_id":"61e118ef39fa65f4b2bc9d2c","avatarUrl":"https://cdn-avatars.huggingface.co/v1/production/uploads/1642793296366-61e1188afc27c0f5e3641eb3.jpeg","fullname":"CVPR Demo Track","name":"CVPR","type":"org","isHf":false,"isHfAdmin":false,"isMod":false,"followerCount":338,"isUserFollowing":false},"colorFrom":"red","colorTo":"blue","createdAt":"2022-06-13T10:54:45.000Z","emoji":"πŸ™†β€β™‚οΈ","id":"CVPR/v-doc_abstractive_mac","lastModified":"2022-06-20T13:02:29.000Z","likes":4,"pinned":false,"private":false,"sdk":"gradio","repoType":"space","runtime":{"stage":"RUNTIME_ERROR","hardware":{"current":null,"requested":"cpu-basic"},"storage":null,"gcTimeout":86400,"errorMessage":"Memory limit exceeded (1G)","replicas":{"requested":1},"devMode":false,"domains":[{"domain":"cvpr-v-doc-abstractive-mac.hf.space","stage":"READY"}]},"title":"VDoc-mac","isLikedByUser":false,"trendingScore":0,"tags":["gradio","region:us"],"featured":false},{"author":"CVPR","authorData":{"_id":"61e118ef39fa65f4b2bc9d2c","avatarUrl":"https://cdn-avatars.huggingface.co/v1/production/uploads/1642793296366-61e1188afc27c0f5e3641eb3.jpeg","fullname":"CVPR Demo Track","name":"CVPR","type":"org","isHf":false,"isHfAdmin":false,"isMod":false,"followerCount":338,"isUserFollowing":false},"colorFrom":"indigo","colorTo":"indigo","createdAt":"2022-06-18T05:02:12.000Z","emoji":"⚑","id":"CVPR/Object-Detection-With-DETR-and-YOLOS","lastModified":"2022-06-18T05:08:02.000Z","likes":8,"pinned":false,"private":false,"sdk":"gradio","repoType":"space","runtime":{"stage":"BUILD_ERROR","hardware":{"current":null,"requested":"cpu-basic"},"storage":null,"gcTimeout":86400,"errorMessage":"Unexpected build error","replicas":{"requested":1},"devMode":false,"domains":[{"domain":"cvpr-object-detection-with-detr-and-yolos.hf.space","stage":"READY"}]},"title":"Object Detection With DETR And YOLOS","isLikedByUser":false,"trendingScore":0,"tags":["gradio","region:us"],"featured":false},{"author":"CVPR","authorData":{"_id":"61e118ef39fa65f4b2bc9d2c","avatarUrl":"https://cdn-avatars.huggingface.co/v1/production/uploads/1642793296366-61e1188afc27c0f5e3641eb3.jpeg","fullname":"CVPR Demo Track","name":"CVPR","type":"org","isHf":false,"isHfAdmin":false,"isMod":false,"followerCount":338,"isUserFollowing":false},"colorFrom":"gray","colorTo":"indigo","createdAt":"2022-06-15T14:20:30.000Z","emoji":"πŸ‘οΈ","id":"CVPR/VizWiz-CLIP-VQA","lastModified":"2022-06-17T14:15:19.000Z","likes":8,"pinned":false,"private":false,"sdk":"gradio","repoType":"space","runtime":{"stage":"BUILD_ERROR","hardware":{"current":null,"requested":"cpu-basic"},"storage":null,"gcTimeout":86400,"errorMessage":"Unexpected build error","replicas":{"requested":1},"devMode":false,"domains":[{"domain":"cvpr-vizwiz-clip-vqa.hf.space","stage":"READY"}]},"title":"CLIP-VQA for VizWiz 2022","isLikedByUser":false,"trendingScore":0,"tags":["gradio","region:us"],"featured":false}],"numRepos":33,"currentRepoPage":0,"filters":{},"paperView":false}">

AI & ML interests

CVPR Demo Track @ CVPR 2022

CVPR's Spaces (33)

Spaces in this organization include Text2Human, BrAD, RegionCLIP Zero-Shot Object Detection Demo, Unicl Zero-Shot Image Recognition Demo, Transfiner, WALT DEMO, FLAVA MultiModal Zero Shot, Spoter Sign language recognition demo, WinoGround Explorer, VDoc-mac, Object Detection With DETR And YOLOS, CLIP-VQA for VizWiz 2022, and BigDL-Nano Inference Demo, alongside the organization's Leaderboard and README Spaces.
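As a side note, here is a minimal sketch of how one might fetch this Space listing programmatically, assuming the huggingface_hub Python client is installed (the list_spaces call and its author parameter are taken from that library's public API; verify the exact signature against your installed version):

from huggingface_hub import HfApi

# Sketch: list the Spaces published under the CVPR organization.
api = HfApi()
for space in api.list_spaces(author="CVPR"):
    # Each entry exposes basic metadata such as the repo id and like count.
    print(space.id, space.likes)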