
ECNU transformers Reading List

This reading list is the paper-reading syllabus of the Intelligent Knowledge Management group at East China Normal University (PI: Prof. Xiaoling Wang) for its seminar in the second half of 2022.

The seminar focuses on the Transformer, a model currently attracting great attention, and covers the latest research progress on its architecture design, pre-training, and fine-tuning. This reading list is a work in progress and will be updated continuously. Pull requests are very welcome! If you have suggestions, please contact [email protected].

Contents

Surveys

Tutorial

Research papers

Transformer Architecture

Sequence modeling

Vision Transformers

Graph Transformers

Pre-training

Language pretraining

Vision pretraining

Vision-Language pretraining

Speech-Language pretraining

Document pretraining

Time-series pretraining

Recommendation pretraining

Fine-tuning

Effective techniques for improving fine-tuning performance

Robustness

Parameter efficient fine-tuning

Prompt learning

Debiasing

Inference speedup