Clustering swap prediction for image-text pre-training
It is essential to delve into strategies for multimodal model pre-training, which has an obvious impact on downstream tasks. Currently, clustering learning has achieved noteworthy benefits across multiple methods. However, due to the availability of open image-text pairs, it is challenging for multimoda...
Saved in:
Main Authors: Fayou, Sun; Meng, Zuqiang; Ngo, Hea Choon; Sek, Yong Wee
Format: Article
Language: English
Published: Nature Research, 2024
Online Access:
http://eprints.utem.edu.my/id/eprint/27536/2/0130221062024105857.PDF
http://eprints.utem.edu.my/id/eprint/27536/
https://www.nature.com/articles/s41598-024-60832-x#:~:text=We%20argue%20that%20the%20advantages,can%20be%20dynamically%20adjusted%20with
https://doi.org/10.1038/s41598-024-60832-x
Similar Items
- Loop and distillation: Attention weights fusion transformer for fine-grained representation
  by: Sun, Fayou, et al.
  Published: (2023)
- Adopting multiple vision transformer layers for fine-grained image representation
  by: Sun, Fayou, et al.
  Published: (2023)
- Adopting attention and cross-layer features for fine-grained representation
  by: Sun, Fayou, et al.
  Published: (2022)
- Snake-swapping for better insight
  by: Times Two, Malaysia
  Published: (1988)
- PAS and Umno in role swap of sorts
  by: Ian Mcintyre
  Published: (2008)