
# MP3D and Replica

## Download

### Matterport3D

1. Download habitat (needed for both the Matterport and Replica datasets):
   - habitat-api
   - habitat-sim
2. Download the point nav datasets.
3. Download MP3D.
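After downloading, a quick sanity check that the expected folders exist can save confusing errors later. This is a hypothetical helper, not part of the repo; the directory names `mp3d` and `pointnav` are assumptions, and should be matched to the paths you configure in `./options/options.py`:

```python
from pathlib import Path

def missing_dataset_dirs(root: str, required=("mp3d", "pointnav")):
    """Return the names of required sub-folders missing under `root`.

    The folder names are illustrative assumptions; adjust them to
    match the layout you actually downloaded.
    """
    root_path = Path(root)
    return [name for name in required if not (root_path / name).is_dir()]
```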

### Replica

First complete the Matterport3D steps above, then download Replica.

## Train

### Update options

Update the paths in `./options/options.py` for the dataset being used.
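The edits are typically just string constants pointing at the downloaded data. A hypothetical sketch of what such path fields might look like (the names below are illustrative, not the actual option names in `./options/options.py`):

```python
# Hypothetical path settings; the real option names live in ./options/options.py.
MP3D_DATA_PATH = "/data/mp3d/"        # Matterport3D scenes
REPLICA_DATA_PATH = "/data/replica/"  # Replica scenes
POINTNAV_EPISODES = "/data/pointnav/" # point-nav episode datasets

def dataset_root(dataset: str) -> str:
    """Pick the scene path for the dataset being used."""
    roots = {"mp3d": MP3D_DATA_PATH, "replica": REPLICA_DATA_PATH}
    return roots[dataset]
```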

### Training scripts

Use `./train.sh` to train one of the models on a single GPU node.

You can also look at `./submit_slurm_synsin.sh` to see how to modify parameters in the renderer and run on a slurm cluster.

## Evaluate

Run the evaluation to obtain both visibility and invisibility scores.

Run the following bash command. It will output some sample images and save the results to a txt file. Make sure the options are set correctly; otherwise this will throw an error, as the results won't be comparable with our given results.

```bash
# It is IMPORTANT to set batch_size, num_workers, and images_before_reset correctly.
# Pass --dataset replica ONLY if you want to evaluate on Replica.
python evaluation/eval.py \
     --result_folder ${TEST_FOLDER} \
     --old_model ${OLD_MODEL} \
     --batch_size 8 --num_workers 10 --images_before_reset 200 \
     --dataset replica
```
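The same invocation can be assembled programmatically, which makes it harder to mis-set the flags the comments warn about. A sketch under assumptions: the flag names are those from the command above, but the wrapper itself is hypothetical and not part of the repo:

```python
import subprocess

def build_eval_cmd(test_folder, old_model, dataset=None,
                   batch_size=8, num_workers=10, images_before_reset=200):
    """Assemble the evaluation/eval.py command line.

    batch_size, num_workers, and images_before_reset must match the
    settings used for the reported results, or the scores will not be
    comparable.
    """
    cmd = ["python", "evaluation/eval.py",
           "--result_folder", test_folder,
           "--old_model", old_model,
           "--batch_size", str(batch_size),
           "--num_workers", str(num_workers),
           "--images_before_reset", str(images_before_reset)]
    if dataset == "replica":  # only pass --dataset when evaluating on Replica
        cmd += ["--dataset", dataset]
    return cmd

# Example (uncomment to run):
# subprocess.run(build_eval_cmd("results/", "model.pth", dataset="replica"), check=True)
```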