
batch_size doesn't affect evaluation #225

Open
SuroshAhmadZobair opened this issue Jan 14, 2024 · 0 comments
@SuroshAhmadZobair

Hi

Thanks for your work on this repo.
I am trying to evaluate a few models I trained in the past, but evaluation takes about 3 hours per model. I am using an RTX 3090 with a batch size of 128. Yes, I have tried changing the batch size; it does not seem to affect the evaluation time. 14 GB of my GPU memory is also free. No matter what batch size or num_workers I use, I still get the following timing for evaluation:

[01/14 14:27:12 d2.evaluation.evaluator]: Inference done 11/23852. Dataloading: 0.0006 s/iter. Inference: 0.0625 s/iter. Eval: 0.3715 s/iter. Total: 0.4347 s/iter. ETA=2:52:42
[01/14 14:27:18 d2.evaluation.evaluator]: Inference done 24/23852. Dataloading: 0.0007 s/iter. Inference: 0.0572 s/iter. Eval: 0.3645 s/iter. Total: 0.4225 s/iter. ETA=2:47:46
[01/14 14:27:23 d2.evaluation.evaluator]: Inference done 36/23852. Dataloading: 0.0007 s/iter. Inference: 0.0566 s/iter. Eval: 0.3713 s/iter. Total: 0.4287 s/iter. ETA=2:50:10
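
From the breakdown above, the per-iteration Eval step (~0.37 s/iter) dominates Inference (~0.06 s/iter), so I suspect the evaluator's CPU-side processing, not the model forward pass, is the bottleneck. Here is a minimal timing sketch I put together to confirm that (my own code, not from this repo; it assumes a standard detectron2 model, test loader, and evaluator with the usual `reset()`/`process()` interface):

```python
import time
import torch

def timed_eval(model, data_loader, evaluator, max_iters=50):
    """Roughly split per-iteration time into model forward vs. evaluator.process."""
    evaluator.reset()
    model.eval()
    fwd_total = proc_total = 0.0
    n = 0
    with torch.no_grad():
        for inputs in data_loader:
            if n >= max_iters:
                break
            t0 = time.perf_counter()
            outputs = model(inputs)             # detectron2 models take a list of input dicts
            torch.cuda.synchronize()            # wait for the GPU before stopping the clock
            t1 = time.perf_counter()
            evaluator.process(inputs, outputs)  # CPU-side bookkeeping (e.g. COCO mask encoding)
            t2 = time.perf_counter()
            fwd_total += t1 - t0
            proc_total += t2 - t1
            n += 1
    print(f"forward: {fwd_total / n:.4f} s/iter, evaluator.process: {proc_total / n:.4f} s/iter")
```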

How can I speed up this process?
Here is the command I use:

python train_net.py --config-file configs//instance-segmentation/maskformer2_R101.yaml --eval-only SOLVER.IMS_PER_BATCH 128 MODEL.WEIGHTS for_eval_testing/model_0154999.pth
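
For what it's worth, my understanding (please correct me if I'm wrong) is that `SOLVER.IMS_PER_BATCH` only controls the training loader in detectron2; `build_detection_test_loader` defaults to `batch_size=1`, which would explain why the flag above has no effect on evaluation. A minimal sketch of building a test loader with an explicit batch size, assuming a recent detectron2 where `build_detection_test_loader` accepts `batch_size` (the helper name `build_batched_test_loader` is my own):

```python
from detectron2.data import DatasetCatalog, DatasetMapper, build_detection_test_loader

def build_batched_test_loader(cfg, dataset_name, batch_size=8):
    """Test loader with an explicit batch size (detectron2's default is 1)."""
    dataset = DatasetCatalog.get(dataset_name)   # list of dataset dicts
    mapper = DatasetMapper(cfg, is_train=False)  # test-time transforms only
    return build_detection_test_loader(
        dataset,
        mapper=mapper,
        batch_size=batch_size,
        num_workers=cfg.DATALOADER.NUM_WORKERS,
    )
```

Even so, given the log breakdown above, a larger inference batch would mostly shave the ~0.06 s/iter forward time rather than the ~0.37 s/iter Eval time, so I would only expect a modest speedup from this alone.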
