Add the release information of MMScan-devkit in the README #96

Merged · 2 commits · Jan 11, 2025
README.md — 1 change: 1 addition & 0 deletions
@@ -44,6 +44,7 @@ It encompasses over <b>5k scans encapsulating 1M ego-centric RGB-D views, 1M lan
Building upon this database, we introduce a baseline framework named <b>Embodied Perceptron</b>. It is capable of processing an arbitrary number of multi-modal inputs and demonstrates remarkable 3D perception capabilities, both within the two series of benchmarks we set up, i.e., fundamental 3D perception tasks and language-grounded tasks, and <b>in the wild</b>.

## 🔥 News
- \[2025-01\] We are delighted to present the official release of MMScan-devkit, which includes data processing utilities, benchmark evaluation tools, and adaptations of several models for the MMScan benchmarks. We invite you to explore these resources and welcome any feedback or questions you may have!
- \[2024-09\] We are pleased to announce the release of EmbodiedScan v2 beta, which adds original annotations on ~5k newly added scans from ARKitScenes and the beta version of MMScan's annotations on the original 5k scans. Fill in the [form](https://docs.google.com/forms/d/e/1FAIpQLScUXEDTksGiqHZp31j7Zp7zlCNV7p_08uViwP_Nbzfn3g6hhw/viewform) to apply for download access. Any feedback is welcome!
- \[2024-08\] We have preliminarily released the [sample data](https://drive.google.com/file/d/1Y1_LOE35NpsnkneYElvNwuuR6-OAbwPm/view?usp=sharing) of [MMScan](https://tai-wang.github.io/mmscan/); the full release will be ready with ARKitScenes' annotations this month and will be announced via email to the community. Please stay tuned!
- \[2024-06\] The report of our follow-up work, [MMScan](https://tai-wang.github.io/mmscan/), featuring the largest-ever set of hierarchical grounded language annotations, has been released. We welcome discussions about EmbodiedScan and MMScan in Seattle at CVPR 2024!