Virtual Meeting using [BigBlueButton](https://webconf.fz-juelich.de/b/wen-mym-pj)

* **Rethinking the Self-Attention in Vision Transformers**<br>
Kyungmin Kim, Bichen Wu, Xiaoliang Dai, Peizhao Zhang, Zhicheng Yan, Peter Vajda, Seon Joo Kim<br>
CVPR 2021 Workshops<br>
[[link](https://openaccess.thecvf.com/content/CVPR2021W/ECV/html/Kim_Rethinking_the_Self-Attention_in_Vision_Transformers_CVPRW_2021_paper.html)]

* complementary, **optional** further reading:<br>
**Synthesizer: Rethinking self-attention for transformer models**<br>
Yi Tay, Dara Bahri, Donald Metzler, Da-Cheng Juan, Zhe Zhao, Che Zheng<br>
ICML 2021, PMLR, pp. 10183–10192<br>
[[link](https://proceedings.mlr.press/v139/tay21a.html)]<br>
[[ArXiv](https://arxiv.org/abs/2005.00743)]<br>
ICLR 2021 (tight reject) discussion [[here](https://openreview.net/forum?id=H-SPvQtMwm)]

## Past Meetings