Replies: 11 comments 3 replies
-
I have not extensively checked these against the Model Garden components, but just picking the first one in the list:
-
Currently we are prioritizing components that are required to achieve state-of-the-art results on specific tasks, e.g. ImageNet-1k classification, COCO object detection, etc. Any chance I could get some guidance as to where these components excel? Thanks
-
@LukeWood
It can be considered a to-do list; an interested contributor can pick up references from it in the coming days. Let me know if I missed anything. Also, I think a few components should already be on the current priority list, for example Squeeze-and-Excitation or CBAM.
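For readers unfamiliar with the component mentioned above, here is a minimal NumPy sketch of the Squeeze-and-Excitation idea: global-average-pool the feature map per channel, pass the result through a small bottleneck MLP with a sigmoid gate, and rescale the channels. The weights below are random stand-ins purely for illustration; a real Keras layer would learn them.

```python
import numpy as np

def se_block(x, reduction=4):
    """Squeeze-and-Excitation sketch. x: feature map of shape (H, W, C).
    Random weights here are illustrative only; a real layer learns w1, w2."""
    h, w, c = x.shape
    # Squeeze: global average pool over the spatial dims -> (C,)
    z = x.mean(axis=(0, 1))
    # Excitation: bottleneck MLP, relu then sigmoid gate
    rng = np.random.default_rng(0)
    w1 = rng.standard_normal((c, c // reduction))
    w2 = rng.standard_normal((c // reduction, c))
    s = 1.0 / (1.0 + np.exp(-(np.maximum(z @ w1, 0.0) @ w2)))
    # Reweight: broadcast the per-channel gate (C,) over (H, W, C)
    return x * s

out = se_block(np.ones((8, 8, 32)))
```

CBAM follows the same squeeze/gate pattern but adds a spatial-attention branch on top of the channel one.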
-
I think that in this specific field you always need to find a balance between popularity, state of the art, sedimentation (how settled a method is), and the human and computing resources your community has available at a given point in time. E.g. we are still investing resources in resnext-rs while waiting for the citation/popularity threshold for the next SOTA. Often a high-popularity component/model needs to be maintained simply because it is a recurrent baseline for new academic work. An additional dimension/tension is the computing resources required by a model and its components.
-
@innat @LukeWood
Could you post these separately so people can be assigned if interested, and maybe some of the models could be prioritized?
-
Ah, also, can the Swin Transformer be added to the list?
-
@old-school-kid For the Swin Transformer, it has already been requested: #15545
-
@leondgarse |
-
@innat Yes, I would like to contribute, and we had a discussion in keras-cv #45. Speaking of model architecture modules: in my experience, components like halo_attention and bottleneck_transformer usually need to be rewritten for each new model, as most of them differ in some details. Personally, I use most of them as functions, not layers, so I would prefer to port whole models. I also agree we should only host the most common models here, and yes, I would be happy to port them if that is decided. :)
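To illustrate the function-vs-layer trade-off mentioned above, here is a toy sketch. The names `attn_fn` and `AttnLayer` are hypothetical, and the body is a simplified single-head attention stand-in, not the actual halo-attention mechanism; the point is only the packaging: a function is stateless and easy to tweak per model, while a layer owns its weights and works as a drop-in block.

```python
import numpy as np

# Function style: stateless; weights are passed in, so each model can
# thread in its own variant (easy to customize, harder to reuse as-is).
def attn_fn(x, w_q, w_k, w_v):
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    a = np.exp(q @ k.T)
    a /= a.sum(axis=-1, keepdims=True)  # softmax over keys
    return a @ v

# Layer style: owns its weights; reusable as a drop-in building block.
class AttnLayer:
    def __init__(self, dim, seed=0):
        rng = np.random.default_rng(seed)
        self.w_q, self.w_k, self.w_v = (
            rng.standard_normal((dim, dim)) * 0.1 for _ in range(3)
        )

    def __call__(self, x):
        return attn_fn(x, self.w_q, self.w_k, self.w_v)
```

Porting whole models sidesteps the question: the function-style helpers stay private to each architecture instead of being frozen into a shared layer API.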
-
I think we should at least get:
-
@old-school-kid In the first post, I've mentioned many attention modules with citation counts. You can pick the one with the highest citation count.
Also, @leondgarse is maintaining an amazing library.
-
The timm package provides some soft-attention modules for building network blocks, and I think they would be a good fit here, for example:
and many others.
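One well-known member of that family of soft channel-attention blocks is ECA (Efficient Channel Attention), which replaces SE's bottleneck MLP with a cheap 1-D convolution over the pooled channel vector. Below is a framework-free NumPy sketch; the averaging kernel is a fixed stand-in (the real module learns it), so this only shows the data flow, not any particular library's implementation.

```python
import numpy as np

def eca_block(x, k=3):
    """ECA-style channel attention sketch. x: feature map (H, W, C).
    A width-k 1-D conv over the pooled channel vector replaces SE's MLP;
    the kernel here is a fixed stand-in for a learned one."""
    z = x.mean(axis=(0, 1))                       # squeeze -> (C,)
    kernel = np.full(k, 1.0 / k)                  # stand-in conv weights
    zp = np.pad(z, k // 2, mode="edge")           # keep length C after conv
    conv = np.convolve(zp, kernel, mode="valid")  # local channel mixing
    s = 1.0 / (1.0 + np.exp(-conv))               # sigmoid gate, (C,)
    return x * s                                  # reweight channels

out = eca_block(np.ones((4, 4, 8)))
```

Because the gate costs only O(C·k) parameters-worth of compute, blocks like this slot into existing backbones with negligible overhead, which is why they are popular as drop-in components.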