
Repeated augmentation layer #372

Closed
sayakpaul opened this issue Apr 27, 2022 · 4 comments · Fixed by #1293
Labels: preprocessing, roadmap

Comments

@sayakpaul (Contributor)

Repeated augmentation [1] has also become an important recipe for training SoTA image classification models. The abstract of [1] sums up what it is:

Large-batch SGD is important for scaling training of deep neural networks. However, without fine-tuning hyperparameter schedules, the generalization of the model may be hampered. We propose to use batch augmentation: replicating instances of samples within the same batch with different data augmentations. Batch augmentation acts as a regularizer and an accelerator, increasing both generalization and performance scaling for a fixed budget of optimization steps. We analyze the effect of batch augmentation on gradient variance and show that it empirically improves convergence for a wide variety of networks and datasets. Our results show that batch augmentation reduces the number of necessary SGD updates to achieve the same accuracy as the state-of-the-art. Overall, this simple yet effective method enables faster training and better generalization by allowing more computational resources to be used concurrently.
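For intuition, here is a minimal sketch of the idea in TensorFlow. The helper name `repeat_and_augment`, the repeat count, and the specific augmentations are illustrative only, not a proposed API: each sample in the batch is replicated a fixed number of times, and each replica receives an independent random augmentation.

```python
import tensorflow as tf

def repeat_and_augment(images, labels, num_repeats=3):
    # Replicate every sample in the batch `num_repeats` times.
    images = tf.repeat(images, repeats=num_repeats, axis=0)
    labels = tf.repeat(labels, repeats=num_repeats, axis=0)
    # random_flip_left_right draws a separate coin per image when
    # given a batch, so each replica is flipped independently.
    images = tf.image.random_flip_left_right(images)
    # random_brightness draws a single delta per call, so map it per
    # image to keep the augmentations independent across replicas.
    images = tf.map_fn(
        lambda img: tf.image.random_brightness(img, max_delta=0.1), images
    )
    return images, labels

# A batch of 8 samples becomes an augmented batch of 24.
images = tf.random.uniform((8, 32, 32, 3))
labels = tf.random.uniform((8,), maxval=10, dtype=tf.int32)
aug_images, aug_labels = repeat_and_augment(images, labels)
```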

References

[1] Augment Your Batch: Improving Generalization Through Instance Repetition

@bhack (Contributor) commented Apr 27, 2022

We talked about this a few months ago at #146 (comment).

At the time, we did not receive any feedback.

@bhack (Contributor) commented Apr 27, 2022

P.S. We still have many performance fallbacks related to the current batch randomization strategy:
#291

@sayakpaul (Contributor, Author)

Makes sense to have a dedicated issue for the layer rather than keeping it within comments. That way it becomes more visible and all the related concerns (like the ones you mentioned) can be discussed specifically.

@bhack (Contributor) commented Apr 27, 2022

Probably, but as you can see we were in a design phase in February, and the comments emerged in that context.
Now, at least, the history is also complete here for anyone landing on this ticket directly.

LukeWood added the preprocessing and roadmap labels May 5, 2022
LukeWood added a commit to LukeWood/keras-cv that referenced this issue Jan 19, 2023
LukeWood added a commit that referenced this issue Jan 25, 2023
* Implement RepeatedAugmentation as a KerasCV API

more reading and fixes #372

* add test case

* fix formatting

* fix formatting

* fix formatting

* fix serialization test

* add repeated augmentation usage docstring

* Update component for repeated augment

* Repeated augmentations fix

* Test MixUp explicitly

* update docstring

* update docstring

* Reformat

* keras_cv/layers/preprocessing/repeated_augmentation.py
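
For reference, a hedged sketch of how the merged layer might be used, based on the class name and file path in the commit messages above. The `augmenters=` argument and the batch-concatenation behavior are assumptions drawn from that commit history; check the current KerasCV docs for the actual signature.

```python
import tensorflow as tf
import keras_cv

# Assumed behavior: each augmenter is applied to the full batch and the
# results are concatenated, multiplying the effective batch size.
repeated_augment = keras_cv.layers.RepeatedAugmentation(
    augmenters=[
        keras_cv.layers.RandAugment(value_range=(0, 255)),
        keras_cv.layers.RandomFlip(),
    ]
)

inputs = {
    "images": tf.random.uniform((16, 64, 64, 3), maxval=255),
    "labels": tf.ones((16,)),
}
# With two augmenters, a batch of 16 comes out as a batch of 32.
outputs = repeated_augment(inputs)
```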
freedomtan pushed a commit to freedomtan/keras-cv that referenced this issue Jul 20, 2023
ghost pushed a commit to y-vectorfield/keras-cv that referenced this issue Nov 16, 2023