
【Knowledge Distillation】A Collection of Knowledge Distillation Papers

Author: Alpha Code Studio. Published: 2021-05-12
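
Most of the papers collected below extend or analyze the canonical logit-matching formulation of knowledge distillation (Hinton et al., 2015), in which a student is trained on a weighted sum of the usual cross-entropy loss and a KL-divergence term against the teacher's temperature-softened outputs. As a reading aid, here is a minimal illustrative sketch of that baseline objective in PyTorch; the function name and hyperparameter values are placeholders for exposition and are not taken from any specific paper in the list.

import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    # Hard-label term: ordinary cross-entropy against the ground truth.
    ce = F.cross_entropy(student_logits, labels)
    # Soft-label term: KL divergence between the temperature-softened
    # student and teacher distributions; the T*T factor rescales the
    # gradients so the two terms stay comparable across temperatures.
    kl = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    return alpha * ce + (1.0 - alpha) * kl

# Toy usage: a batch of 8 samples over 10 classes.
student_logits = torch.randn(8, 10, requires_grad=True)
teacher_logits = torch.randn(8, 10)
labels = torch.randint(0, 10, (8,))
kd_loss(student_logits, teacher_logits, labels).backward()

Individual papers replace or augment these terms (feature-level, relational, online/mutual, adversarial, and data-free variants all appear below), but this weighted hard/soft objective is the common starting point.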

1. Towards Cross-Modality Medical Image Segmentation with Online Mutual Knowledge Distillation

Conference:

AAAI 2020. AAAI Technical Track: Applications.

Authors:

Kang Li, Lequan Yu, Shujun Wang, Pheng-Ann Heng

Link:

https://aaai.org/ojs/index.php/AAAI/article/view/5421/5277

2. Online Knowledge Distillation with Diverse Peers

Conference:

AAAI 2020. AAAI Technical Track: Machine Learning.

Authors:

Defang Chen, Jian-Ping Mei, Can Wang, Yan Feng, Chun Chen

Link:

https://aaai.org/ojs/index.php/AAAI/article/view/5746/5602

3. Towards Oracle Knowledge Distillation with Neural Architecture Search

Conference:

AAAI 2020. AAAI Technical Track: Machine Learning.

Authors:

Minsoo Kang, Jonghwan Mun, Bohyung Han

Link:

https://aaai.org/ojs/index.php/AAAI/article/view/5866/5722

4. Improved Knowledge Distillation via Teacher Assistant

Conference:

AAAI 2020. AAAI Technical Track: Machine Learning.

Authors:

Seyed-Iman Mirzadeh, Mehrdad Farajtabar, Ang Li, Nir Levine, Akihiro Matsukawa, Hassan Ghasemzadeh

Link:

https://aaai.org/ojs/index.php/AAAI/article/view/5963/5819

5. Knowledge Distillation from Internal Representations

Conference:

AAAI 2020. AAAI Technical Track: Natural Language Processing.

Authors:

Gustavo Aguilar, Yuan Ling, Yu Zhang, Benjamin Yao, Xing Fan, Chenlei Guo

Link:

https://aaai.org/ojs/index.php/AAAI/article/view/6229/6085

6. Ultrafast Video Attention Prediction with Coupled Knowledge Distillation

Conference:

AAAI 2020. AAAI Technical Track: Vision.

Authors:

Kui Fu, Peipei Shi, Yafei Song, Shiming Ge, Xiangju Lu, Jia Li

Link:

https://aaai.org/ojs/index.php/AAAI/article/view/6710/6564

7. Uncertainty-Aware Multi-Shot Knowledge Distillation for Image-Based Object Re-Identification

Conference:

AAAI 2020. AAAI Technical Track: Vision.

Authors:

Xin Jin, Cuiling Lan, Wenjun Zeng, Zhibo Chen

Link:

https://aaai.org/ojs/index.php/AAAI/article/view/6774/6628

8. Structure-Level Knowledge Distillation For Multilingual Sequence Labeling

Conference:

ACL 2020.

Authors:

Xinyu Wang, Yong Jiang, Nguyen Bach, Tao Wang, Fei Huang, Kewei Tu

Link:

https://www.aclweb.org/anthology/2020.acl-main.304.pdf

9. Knowledge Distillation for Multilingual Unsupervised Neural Machine Translation

Conference:

ACL 2020.

Authors:

Haipeng Sun, Rui Wang, Kehai Chen, Masao Utiyama, Eiichiro Sumita, Tiejun Zhao

Link:

https://www.aclweb.org/anthology/2020.acl-main.324.pdf

10. TextBrewer: An Open-Source Knowledge Distillation Toolkit for Natural Language Processing

Conference:

ACL 2020. System Demonstrations.

Authors:

Ziqing Yang, Yiming Cui, Zhipeng Chen, Wanxiang Che, Ting Liu, Shijin Wang, Guoping Hu

Link:

https://www.aclweb.org/anthology/2020.acl-demos.2.pdf

11. End-to-End Speech-Translation with Knowledge Distillation: FBK@IWSLT2020

Conference:

ACL 2020. The 17th International Conference on Spoken Language Translation.

Authors:

Marco Gaido, Mattia A. Di Gangi, Matteo Negri, Marco Turchi

Link:

https://www.aclweb.org/anthology/2020.iwslt-1.8.pdf

12. Exploring the Limits of Simple Learners in Knowledge Distillation for Document Classification with DocBERT

Conference:

ACL 2020. The 5th Workshop on Representation Learning for NLP.

Authors:

Ashutosh Adhikari, Achyudh Ram, Raphael Tang, William L. Hamilton, Jimmy Lin

Link:

https://www.aclweb.org/anthology/2020.repl4nlp-1.10.pdf

13. Neural Networks Are More Productive Teachers Than Human Raters: Active Mixup for Data-Efficient Knowledge Distillation From a Blackbox Model

Conference:

CVPR 2020.

Authors:

Dongdong Wang, Yandong Li, Liqiang Wang, Boqing Gong

Link:

https://openaccess.thecvf.com/content_CVPR_2020/papers/Wang_Neural_Networks_Are_More_Productive_Teachers_Than_Human_Raters_Active_CVPR_2020_paper.pdf

14. Creating Something From Nothing: Unsupervised Knowledge Distillation for Cross-Modal Hashing

Conference:

CVPR 2020.

Authors:

Hengtong Hu, Lingxi Xie, Richang Hong, Qi Tian

Link:

https://openaccess.thecvf.com/content_CVPR_2020/papers/Hu_Creating_Something_From_Nothing_Unsupervised_Knowledge_Distillation_for_Cross-Modal_Hashing_CVPR_2020_paper.pdf

15. Heterogeneous Knowledge Distillation Using Information Flow Modeling

Conference:

CVPR 2020.

Authors:

Nikolaos Passalis, Maria Tzelepi, Anastasios Tefas

Link:

https://openaccess.thecvf.com/content_CVPR_2020/papers/Passalis_Heterogeneous_Knowledge_Distillation_Using_Information_Flow_Modeling_CVPR_2020_paper.pdf

16. Revisiting Knowledge Distillation via Label Smoothing Regularization

Conference:

CVPR 2020.

Authors:

Li Yuan, Francis EH Tay, Guilin Li, Tao Wang, Jiashi Feng

Link:

https://openaccess.thecvf.com/content_CVPR_2020/papers/Yuan_Revisiting_Knowledge_Distillation_via_Label_Smoothing_Regularization_CVPR_2020_paper.pdf

17. Block-Wisely Supervised Neural Architecture Search With Knowledge Distillation

Conference:

CVPR 2020.

Authors:

Changlin Li, Jiefeng Peng, Liuchun Yuan, Guangrun Wang, Xiaodan Liang, Liang Lin, Xiaojun Chang

Link:

https://openaccess.thecvf.com/content_CVPR_2020/papers/Li_Block-Wisely_Supervised_Neural_Architecture_Search_With_Knowledge_Distillation_CVPR_2020_paper.pdf

18. Few Sample Knowledge Distillation for Efficient Network Compression

Conference:

CVPR 2020.

Authors:

Tianhong Li, Jianguo Li, Zhuang Liu, Changshui Zhang

Link:

https://openaccess.thecvf.com/content_CVPR_2020/papers/Li_Few_Sample_Knowledge_Distillation_for_Efficient_Network_Compression_CVPR_2020_paper.pdf

19. Online Knowledge Distillation via Collaborative Learning

Conference:

CVPR 2020.

Authors:

Qiushan Guo, Xinjiang Wang, Yichao Wu, Zhipeng Yu, Ding Liang, Xiaolin Hu, Ping Luo

Link:

https://openaccess.thecvf.com/content_CVPR_2020/papers/Guo_Online_Knowledge_Distillation_via_Collaborative_Learning_CVPR_2020_paper.pdf

20. Circumventing Outliers of AutoAugment with Knowledge Distillation

Conference:

ECCV 2020.

Authors:

Longhui Wei, An Xiao, Lingxi Xie, Xiaopeng Zhang, Xin Chen, Qi Tian

Link:

https://www.ecva.net/papers/eccv_2020/papers_ECCV/papers/123480613.pdf

21. Learning From Multiple Experts: Self-paced Knowledge Distillation for Long-tailed Classification

Conference:

ECCV 2020.

Authors:

Liuyu Xiang, Guiguang Ding, Jungong Han

Link:

https://www.ecva.net/papers/eccv_2020/papers_ECCV/papers/123500239.pdf

22. Knowledge Distillation Meets Self-Supervision

Conference:

ECCV 2020.

Authors:

Guodong Xu, Ziwei Liu, Xiaoxiao Li, Chen Change Loy

Link:

https://www.ecva.net/papers/eccv_2020/papers_ECCV/papers/123540562.pdf

23. Robust Re-Identification by Multiple Views Knowledge Distillation

Conference:

ECCV 2020.

Authors:

Angelo Porrello, Luca Bergamini, Simone Calderara

Link:

https://www.ecva.net/papers/eccv_2020/papers_ECCV/papers/123550103.pdf

24. Local Correlation Consistency for Knowledge Distillation

Conference:

ECCV 2020.

Authors:

Xiaojie Li, Jianlong Wu, Hongyu Fang, Yue Liao, Fei Wang, Chen Qian

Link:

https://www.ecva.net/papers/eccv_2020/papers_ECCV/papers/123570018.pdf

25. AMLN: Adversarial-based Mutual Learning Network for Online Knowledge Distillation

Conference:

ECCV 2020.

Authors:

Xiaobing Zhang, Shijian Lu, Haigang Gong, Zhipeng Luo, Ming Liu

Link:

https://www.ecva.net/papers/eccv_2020/papers_ECCV/papers/123570154.pdf

26. Differentiable Feature Aggregation Search for Knowledge Distillation

Conference:

ECCV 2020.

Authors:

Yushuo Guan, Pengyu Zhao, Bingxuan Wang, Yuanxing Zhang, Cong Yao, Kaigui Bian, Jian Tang

Link:

https://www.ecva.net/papers/eccv_2020/papers_ECCV/papers/123620460.pdf

27. Online Ensemble Model Compression using Knowledge Distillation

Conference:

ECCV 2020.

Authors:

Devesh Walawalkar, Zhiqiang Shen, Marios Savvides

Link:

https://www.ecva.net/papers/eccv_2020/papers_ECCV/papers/123640018.pdf

28. Exclusivity-Consistency Regularized Knowledge Distillation for Face Recognition

Conference:

ECCV 2020.

Authors:

Xiaobo Wang, Tianyu Fu, Shengcai Liao, Shuo Wang, Zhen Lei, Tao Mei

Link:

https://www.ecva.net/papers/eccv_2020/papers_ECCV/papers/123690324.pdf

29. Feature Normalized Knowledge Distillation for Image Classification

Conference:

ECCV 2020.

Authors:

Kunran Xu, Lai Rui, Yishi Li, Lin Gu

Link:

https://www.ecva.net/papers/eccv_2020/papers_ECCV/papers/123700664.pdf

30. Weight Decay Scheduling and Knowledge Distillation for Active Learning

Conference:

ECCV 2020.

Authors:

Juseung Yun, Byungjoo Kim, Junmo Kim

Link:

https://www.ecva.net/papers/eccv_2020/papers_ECCV/papers/123710426.pdf

31. Semantic Relation Preserving Knowledge Distillation for Image-to-Image Translation

Conference:

ECCV 2020.

Authors:

Zeqi Li, Ruowei Jiang, Parham Aarabi

Link:

https://www.ecva.net/papers/eccv_2020/papers_ECCV/papers/123710647.pdf

32. Interpretable Foreground Object Search As Knowledge Distillation

Conference:

ECCV 2020.

Authors:

Boren Li, Po-Yu Zhuang, Jian Gu, Mingyang Li, Ping Tan

Link:

https://www.ecva.net/papers/eccv_2020/papers_ECCV/papers/123730188.pdf

33. Improving Knowledge Distillation via Category Structure

Conference:

ECCV 2020.

Authors:

Zailiang Chen, Xianxian Zheng, Hailan Shen, Ziyang Zeng, Yukun Zhou, Rongchang Zhao

Link:

https://www.ecva.net/papers/eccv_2020/papers_ECCV/papers/123730205.pdf

34. Making Monolingual Sentence Embeddings Multilingual Using Knowledge Distillation

Conference:

EMNLP 2020. Long Paper.

Authors:

Nils Reimers, Iryna Gurevych

Link:

https://www.aclweb.org/anthology/2020.emnlp-main.365.pdf

35. Lifelong Language Knowledge Distillation

Conference:

EMNLP 2020. Long Paper.

Authors:

Yung-Sung Chuang, Shang-Yu Su, Yun-Nung Chen

Link:

https://www.aclweb.org/anthology/2020.emnlp-main.233.pdf

36. Autoregressive Knowledge Distillation through Imitation Learning

Conference:

EMNLP 2020. Long Paper.

Authors:

Alexander Lin, Jeremy Wohlwend, Howard Chen, Tao Lei

Link:

https://www.aclweb.org/anthology/2020.emnlp-main.494.pdf

37. Improving Neural Topic Models Using Knowledge Distillation

Conference:

EMNLP 2020. Long Paper.

Authors:

Alexander Miserlis Hoyle, Pranav Goel, Philip Resnik

Link:

https://www.aclweb.org/anthology/2020.emnlp-main.137.pdf

38. Why Skip If You Can Combine: A Simple Knowledge Distillation Technique for Intermediate Layers

Conference:

EMNLP 2020. Short Paper.

Authors:

Yimeng Wu, Peyman Passban, Mehdi Rezagholizadeh, Qun Liu

Link:

https://www.aclweb.org/anthology/2020.emnlp-main.74.pdf

39. Understanding Knowledge Distillation in Non-autoregressive Machine Translation

Conference:

ICLR 2020.

Authors:

Chunting Zhou, Jiatao Gu, Graham Neubig

Link:

https://openreview.net/pdf?id=BygFVAEKDH

40. P-KDGAN: Progressive Knowledge Distillation with GANs for One-class Novelty Detection

Conference:

IJCAI 2020.

Authors:

Zhiwei Zhang, Shifeng Chen, Lei Sun

Link:

https://www.ijcai.org/proceedings/2020/0448.pdf

41. Private Model Compression via Knowledge Distillation

Conference:

AAAI 2019. AAAI Technical Track: Applications.

Authors:

Ji Wang, Weidong Bao, Lichao Sun, Xiaomin Zhu, Bokai Cao, Philip S. Yu

Link:

https://aaai.org/ojs/index.php/AAAI/article/view/3913/3791

42. Knowledge Distillation with Adversarial Samples Supporting Decision Boundary

Conference:

AAAI 2019. AAAI Technical Track: Machine Learning.

Authors:

Byeongho Heo, Minsik Lee, Sangdoo Yun, Jin Young Choi

Link:

https://www.aaai.org/ojs/index.php/AAAI/article/view/4263/4141

43. Exploiting the Ground-Truth: An Adversarial Imitation Based Knowledge Distillation Approach for Event Detection

Conference:

AAAI 2019. AAAI Technical Track: Natural Language Processing.

Authors:

Jian Liu, Yubo Chen, Kang Liu

Link:

https://aaai.org/ojs/index.php/AAAI/article/view/4649/4527

44. Scalable Syntax-Aware Language Models Using Knowledge Distillation

Conference:

ACL 2019.

Authors:

Adhiguna Kuncoro, Chris Dyer, Laura Rimell, Stephen Clark, Phil Blunsom

Link:

https://www.aclweb.org/anthology/P19-1337.pdf

45. PANLP at MEDIQA 2019: Pre-trained Language Models, Transfer Learning and Knowledge Distillation

Conference:

ACL 2019. The 18th BioNLP Workshop and Shared Task.

Authors:

Wei Zhu, Xiaofeng Zhou, Keqiang Wang, Xun Luo, Xiepeng Li, Yuan Ni, Guotong Xie

Link:

https://www.aclweb.org/anthology/W19-5040.pdf

46. Structured Knowledge Distillation for Semantic Segmentation

Conference:

CVPR 2019.

Authors:

Yifan Liu, Ke Chen, Chris Liu, Zengchang Qin, Zhenbo Luo, Jingdong Wang

Link:

https://openaccess.thecvf.com/content_CVPR_2019/papers/Liu_Structured_Knowledge_Distillation_for_Semantic_Segmentation_CVPR_2019_paper.pdf

47. Relational Knowledge Distillation

Conference:

CVPR 2019.

Authors:

Wonpyo Park, Dongju Kim, Yan Lu, Minsu Cho

Link:

https://openaccess.thecvf.com/content_CVPR_2019/papers/Park_Relational_Knowledge_Distillation_CVPR_2019_paper.pdf

48. Knowledge Distillation via Instance Relationship Graph

Conference:

CVPR 2019.

Authors:

Yufan Liu, Jiajiong Cao, Bing Li, Chunfeng Yuan, Weiming Hu, Yangxi Li, Yunqiang Duan

Link:

https://openaccess.thecvf.com/content_CVPR_2019/papers/Liu_Knowledge_Distillation_via_Instance_Relationship_Graph_CVPR_2019_paper.pdf

49. Refine and Distill: Exploiting Cycle-Inconsistency and Knowledge Distillation for Unsupervised Monocular Depth Estimation

Conference:

CVPR 2019.

Authors:

Andrea Pilzer, Stephane Lathuiliere, Nicu Sebe, Elisa Ricci

Link:

https://openaccess.thecvf.com/content_CVPR_2019/papers/Pilzer_Refine_and_Distill_Exploiting_Cycle-Inconsistency_and_Knowledge_Distillation_for_Unsupervised_CVPR_2019_paper.pdf

50. Patient Knowledge Distillation for BERT Model Compression

Conference:

EMNLP 2019.

Authors:

Siqi Sun, Yu Cheng, Zhe Gan, Jingjing Liu

Link:

https://www.aclweb.org/anthology/D19-1441.pdf

51. Weakly Supervised Cross-lingual Semantic Relation Classification via Knowledge Distillation

Conference:

EMNLP 2019.

Authors:

Yogarshi Vyas, Marine Carpuat

Link:

https://www.aclweb.org/anthology/D19-1532.pdf

52. Natural Language Generation for Effective Knowledge Distillation

Conference:

EMNLP 2019. The 2nd Workshop on Deep Learning Approaches for Low-Resource NLP (DeepLo 2019).

Authors:

Raphael Tang, Yao Lu, Jimmy Lin

Link:

https://www.aclweb.org/anthology/D19-6122.pdf

53. Knowledge Distillation via Route Constrained Optimization

Conference:

ICCV 2019.

Authors:

Xiao Jin, Baoyun Peng, Yichao Wu, Yu Liu, Jiaheng Liu, Ding Liang, Junjie Yan, Xiaolin Hu

Link:

https://openaccess.thecvf.com/content_ICCV_2019/papers/Jin_Knowledge_Distillation_via_Route_Constrained_Optimization_ICCV_2019_paper.pdf

54. Similarity-Preserving Knowledge Distillation

Conference:

ICCV 2019.

Authors:

Frederick Tung, Greg Mori

Link:

https://openaccess.thecvf.com/content_ICCV_2019/papers/Tung_Similarity-Preserving_Knowledge_Distillation_ICCV_2019_paper.pdf

55. On the Efficacy of Knowledge Distillation

Conference:

ICCV 2019.

Authors:

Jang Hyun Cho, Bharath Hariharan

Link:

https://openaccess.thecvf.com/content_ICCV_2019/papers/Cho_On_the_Efficacy_of_Knowledge_Distillation_ICCV_2019_paper.pdf

56. Correlation Congruence for Knowledge Distillation

Conference:

ICCV 2019.

Authors:

Baoyun Peng, Xiao Jin, Jiaheng Liu, Dongsheng Li, Yichao Wu, Yu Liu, Shunfeng Zhou, Zhaoning Zhang

Link:

https://openaccess.thecvf.com/content_ICCV_2019/papers/Peng_Correlation_Congruence_for_Knowledge_Distillation_ICCV_2019_paper.pdf

57. Multilingual Neural Machine Translation with Knowledge Distillation

Conference:

ICLR 2019.

Authors:

Xu Tan, Yi Ren, Di He, Tao Qin, Zhou Zhao, Tie-Yan Liu

Link:

https://openreview.net/pdf?id=S1gUsoR9YX

58. Zero-Shot Knowledge Distillation in Deep Networks

Conference:

ICML 2019.

Authors:

Gaurav Kumar Nayak, Konda Reddy Mopuri, Vaisakh Shaj, Venkatesh Babu Radhakrishnan, Anirban Chakraborty

Link:

http://proceedings.mlr.press/v97/nayak19a/nayak19a.pdf

59. Towards Understanding Knowledge Distillation

Conference:

ICML 2019.

Authors:

Mary Phuong, Christoph Lampert

Link:

http://proceedings.mlr.press/v97/phuong19a/phuong19a.pdf

60. Pedestrian Attribute Recognition by Joint Visual-semantic Reasoning and Knowledge Distillation

Conference:

IJCAI 2019.

Authors:

Qiaozhe Li, Xin Zhao, Ran He, Kaiqi Huang

Link:

https://www.ijcai.org/proceedings/2019/0117.pdf

61. On Knowledge distillation from complex networks for response prediction

Conference:

NAACL 2019.

Authors:

Siddhartha Arora, Mitesh M. Khapra, Harish G. Ramaswamy

Link:

https://www.aclweb.org/anthology/N19-1382.pdf

62. Self-supervised Knowledge Distillation Using Singular Value Decomposition

Conference:

ECCV 2018.

Authors:

Seung Hyun Lee, Dae Ha Kim, Byung Cheol Song

Link:

https://openaccess.thecvf.com/content_ECCV_2018/papers/SEUNG_HYUN_LEE_Self-supervised_Knowledge_Distillation_ECCV_2018_paper.pdf

63. Apprentice: Using Knowledge Distillation Techniques To Improve Low-Precision Network Accuracy

Conference:

ICLR 2018.

Authors:

Asit K. Mishra, Debbie Marr

Link:

https://openreview.net/pdf?id=B1ae1lZRb

64. Progressive Blockwise Knowledge Distillation for Neural Network Acceleration

Conference:

IJCAI 2018.

Authors:

Hui Wang, Hanbin Zhao, Xi Li, Xu Tan

Link:

https://www.ijcai.org/proceedings/2018/0384.pdf

65. KDGAN: Knowledge Distillation with Generative Adversarial Networks

Conference:

NeurIPS 2018.

Authors:

Xiaojie Wang, Rui Zhang, Yu Sun, Jianzhong Qi

Link:

https://papers.nips.cc/paper/7358-kdgan-knowledge-distillation-with-generative-adversarial-networks.pdf

66. Knowledge Distillation by On-the-Fly Native Ensemble

Conference:

NeurIPS 2018.

Authors:

Xu Lan, Xiatian Zhu, Shaogang Gong

Link:

https://papers.nips.cc/paper/7980-knowledge-distillation-by-on-the-fly-native-ensemble.pdf

67. Learning to Specialize with Knowledge Distillation for Visual Question Answering

Conference:

NeurIPS 2018.

Authors:

Jonghwan Mun, Kimin Lee, Jinwoo Shin, Bohyung Han

Link:

https://papers.nips.cc/paper/8031-learning-to-specialize-with-knowledge-distillation-for-visual-question-answering.pdf

68. WebChild 2.0 : Fine-Grained Commonsense Knowledge Distillation

Conference:

ACL 2017. System Demonstrations.

Authors:

Niket Tandon, Gerard de Melo, Gerhard Weikum

Link:

https://www.aclweb.org/anthology/P17-4020.pdf

69. A Gift From Knowledge Distillation: Fast Optimization, Network Minimization and Transfer Learning

Conference:

CVPR 2017.

Authors:

Junho Yim, Donggyu Joo, Jihoon Bae, Junmo Kim

Link:

https://openaccess.thecvf.com/content_cvpr_2017/papers/Yim_A_Gift_From_CVPR_2017_paper.pdf

70. Knowledge Distillation for Bilingual Dictionary Induction

Conference:

EMNLP 2017.

Authors:

Ndapandula Nakashole, Raphael Flauger

Link:

https://www.aclweb.org/anthology/D17-1264.pdf

71. Visual Relationship Detection With Internal and External Linguistic Knowledge Distillation

Conference:

ICCV 2017.

Authors:

Ruichi Yu, Ang Li, Vlad I. Morariu, Larry S. Davis

Link:

https://openaccess.thecvf.com/content_ICCV_2017/papers/Yu_Visual_Relationship_Detection_ICCV_2017_paper.pdf

72. Learning Efficient Object Detection Models with Knowledge Distillation

Conference:

NeurIPS 2017.

Authors:

Guobin Chen, Wongun Choi, Xiang Yu, Tony Han, Manmohan Chandraker

Link:

https://papers.nips.cc/paper/6676-learning-efficient-object-detection-models-with-knowledge-distillation.pdf

73. Sequence-Level Knowledge Distillation

Conference:

EMNLP 2016.

Authors:

Yoon Kim, Alexander M. Rush

Link:

https://www.aclweb.org/anthology/D16-1139.pdf
