ADD: ApplySomeOf transforms to Random Choice #7586
base: main
Conversation
🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/vision/7586

Note: links to docs will display an error until the docs builds have been completed.

❌ 2 New Failures, 1 Unrelated Failure — as of commit e737d47 with merge base f69eee6:

NEW FAILURES - The following jobs have failed:
FLAKY - The following job failed but was likely due to flakiness present on trunk:

This comment was automatically generated by Dr. CI and updates every 15 minutes.
@NicolasHug can you have a look please
Thanks for the PR @ambujpawar and sorry for such a late review - thanks for the reminder. The PR looks good so far, I left a few comments below, LMK what you think.
@@ -129,16 +130,24 @@ class RandomChoice(Transform):
        p (list of floats or None, optional): probability of each transform being picked.
            If ``p`` doesn't sum to 1, it is automatically normalized. If ``None``
            (default), all transforms have the same probability.
        max_transforms (int, optional): The maximum number of transforms that can be applied.
            If specified, ``p`` is ignored and a random number of transforms sampled from
            [1, ``max_transforms``] is applied.
Let's sample from [0, max_transforms] as in the original feature request.
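A minimal sketch of what that range change implies, using only torch's RNG (the variable names are assumptions, not the PR's code): drawing the count from [0, ``max_transforms``] makes "apply no transform at all" (the identity) a valid outcome.

```python
import torch

# Drawing the count from [0, max_transforms] rather than [1, max_transforms]
# means k == 0 (apply nothing, i.e. the identity) is a valid outcome.
max_transforms = 3
torch.manual_seed(0)
counts = {int(torch.randint(0, max_transforms + 1, ())) for _ in range(200)}
assert counts <= set(range(max_transforms + 1))
assert 0 in counts  # zero transforms is now possible
```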
idx = int(torch.multinomial(torch.tensor(self.p), 1))
transform = self.transforms[idx]
These 2 lines here should probably be part of the if self.p block below.
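A sketch of the restructuring being suggested (a simplified, hypothetical function, not the real method): the multinomial draw only makes sense when ``p`` is given, so it moves inside that branch.

```python
import torch

def pick_transform(transforms_list, p=None):
    # Sketch: the multinomial-based selection runs only when ``p`` is set.
    if p is not None:
        idx = int(torch.multinomial(torch.tensor(p, dtype=torch.float), 1))
        return transforms_list[idx]
    # ... otherwise the max_transforms sampling path would run here ...
    raise NotImplementedError("max_transforms path not sketched")

ts = [lambda x: "pad", lambda x: "crop"]
# With p = [1.0, 0.0] the first transform is always chosen.
assert pick_transform(ts, p=[1.0, 0.0]) is ts[0]
```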
selected_transforms = random.sample(self.transforms, k=random.randint(1, self.max_transforms))
random.shuffle(selected_transforms)
return Compose(selected_transforms)(*inputs)
Let's try to implement that without using the builtin random module and instead just rely on pytorch's RNG. torch.randperm can probably be used for that.
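A possible torch-only replacement for the random-module code above, along the lines the reviewer suggests (the function name and the [1, ``max_transforms``] range are assumptions taken from the current diff, not the final implementation):

```python
import torch

def sample_some_of(transforms_list, max_transforms):
    # Draw how many transforms to apply, then pick that many distinct
    # transforms via torch.randperm -- no builtin random module needed.
    k = int(torch.randint(1, max_transforms + 1, ()))
    indices = torch.randperm(len(transforms_list))[:k]
    return [transforms_list[i] for i in indices]

torch.manual_seed(0)
ts = [lambda x: x + 1, lambda x: x * 2, lambda x: x - 3]
chosen = sample_some_of(ts, max_transforms=2)
assert 1 <= len(chosen) <= 2
assert len(set(map(id, chosen))) == len(chosen)  # no duplicates
```

Note that ``torch.randperm`` already returns the indices in random order, so the separate ``random.shuffle`` step in the original snippet would become unnecessary.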
def test_assertions_p_and_max_transforms(self):
    with pytest.raises(ValueError, match="Only one of `p` and `max_transforms` should be specified."):
        transforms.RandomChoice([transforms.Pad(2), transforms.RandomCrop(28)], p=[1], max_transforms=3)
Perhaps this is due to the fact that this file has changed a lot in the past, but this test is now part of test_pure_tensor_heuristic() and it shouldn't be there. We have a def test_random_choice() somewhere; maybe we should make it a class and put those tests together.
Also, it'd be nice to test the behaviour of the new max_transforms parameter.
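A sketch of such a behavioural test, using a stand-in for the sampling logic so it runs self-contained (``apply_some_of`` is a hypothetical helper mirroring the PR's logic, not the real RandomChoice API):

```python
import torch

def apply_some_of(transforms_list, max_transforms):
    # Hypothetical stand-in mirroring the PR's sampling logic.
    k = int(torch.randint(1, max_transforms + 1, ()))
    indices = torch.randperm(len(transforms_list))[:k]
    def composed(x):
        for i in indices:
            x = transforms_list[i](x)
        return x
    return composed

def test_max_transforms_bounds():
    torch.manual_seed(0)
    calls = []
    # Each toy transform records its index when called and returns x unchanged.
    ts = [(lambda x, i=i: calls.append(i) or x) for i in range(4)]
    apply_some_of(ts, max_transforms=3)("image")
    assert 1 <= len(calls) <= 3           # count stays within the bound
    assert len(set(calls)) == len(calls)  # no transform applied twice

test_max_transforms_bounds()
```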
Extended RandomChoice to include ApplySomeOf transforms.
Will add tests if you're happy with the functional implementation.