(28): Soft-Label Dataset Distillation and Text Dataset Distillation

Contents
Abstract
1. Introduction
2. Related Work
2.1 Knowledge Distillation
2.2 Learning from 'small' data
2.3 Dataset Reduction, Prototype Generation, and Summarization
2.4 Generative Adversarial Networks
2.5 Measuring Problem Dimensionality
3. Extending Dataset Distillation
3.1 Motivation