# (2020 Hung-yi Lee) Machine Learning - Network Compression

2023-09-29 13:51:58

Contents

- Network Pruning
- Knowledge Distillation
- Parameter Quantization
- Architecture Design
  - Low rank approximation
- Dynamic Computation

## Network Pruning

Pruning approach: prune neurons.

## Knowledge Distillation

## Parameter Quantization

## Architecture Design

### Low rank approximation

## Dynamic Computation
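The "prune neurons" idea under Network Pruning can be sketched minimally: score each hidden neuron by some importance measure and drop the least important ones. This is an illustrative NumPy sketch, not the lecture's implementation; the weight names `W1`/`W2` and the L2-norm importance criterion are assumptions.

```python
import numpy as np

# Hypothetical two-layer MLP weights (names W1/W2 are illustrative).
rng = np.random.default_rng(0)
W1 = rng.normal(size=(8, 4))   # input_dim=8 -> hidden_dim=4
W2 = rng.normal(size=(4, 2))   # hidden_dim=4 -> output_dim=2

# Score each hidden neuron by the L2 norm of its incoming weights
# (one common heuristic), then keep the top-k neurons (here k=2).
importance = np.linalg.norm(W1, axis=0)
keep = np.argsort(importance)[-2:]

# Pruning a hidden neuron removes a column of W1 and the matching row of W2,
# so the network genuinely shrinks (unlike weight pruning, which leaves
# sparse matrices of the original size).
W1_pruned = W1[:, keep]
W2_pruned = W2[keep, :]
print(W1_pruned.shape, W2_pruned.shape)  # (8, 2) (2, 2)
```

After pruning, the smaller network is typically fine-tuned to recover accuracy.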
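Knowledge distillation trains a small student network to match a large teacher's softened output distribution. A minimal sketch of the soft-target loss, assuming a temperature `T` and toy logits (all values here are illustrative, not from the lecture):

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax; higher T gives softer probabilities."""
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

# Toy logits for one example over 3 classes.
teacher_logits = np.array([[5.0, 1.0, 0.5]])
student_logits = np.array([[2.0, 1.5, 0.5]])

T = 4.0
p_teacher = softmax(teacher_logits, T)  # soft targets
p_student = softmax(student_logits, T)

# Distillation loss: cross-entropy of the student against the
# teacher's soft targets (the "dark knowledge" between classes).
loss = -(p_teacher * np.log(p_student)).sum()
print(loss)
```

In practice this term is combined with the ordinary cross-entropy on the true labels.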
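Parameter quantization stores weights with fewer bits. A minimal sketch of uniform 8-bit quantization (the scale/zero-point scheme here is one common choice, assumed for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
w = rng.normal(size=1000).astype(np.float32)  # toy float32 weights

# Map the float range [min, max] onto 256 integer levels.
scale = (w.max() - w.min()) / 255.0
zero_point = w.min()
q = np.round((w - zero_point) / scale).astype(np.uint8)  # 4x smaller storage

# Dequantize to recover approximate weights at inference time.
w_hat = q.astype(np.float32) * scale + zero_point
max_err = np.abs(w - w_hat).max()
print(max_err)  # bounded by roughly scale / 2
```

The storage drops from 32 bits to 8 bits per weight at the cost of a small, bounded rounding error.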
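Low rank approximation replaces one large weight matrix W with the product of two thin matrices, cutting the parameter count. A sketch using truncated SVD (the sizes and rank are illustrative assumptions):

```python
import numpy as np

# Hypothetical fully-connected layer weight matrix.
rng = np.random.default_rng(0)
W = rng.normal(size=(100, 100))  # 10,000 parameters

# Truncated SVD: keep the top-k singular directions, so W is
# approximated by U_k @ V_k, i.e. one 100x100 layer becomes
# two thin layers of shapes (100, k) and (k, 100).
k = 10
U, S, Vt = np.linalg.svd(W, full_matrices=False)
U_k = U[:, :k] * S[:k]   # (100, k), singular values folded in
V_k = Vt[:k, :]          # (k, 100)

W_approx = U_k @ V_k
params_before = W.size                 # 10000
params_after = U_k.size + V_k.size     # 2000
print(params_before, params_after)
```

Here the layer shrinks from 10,000 to 2,000 parameters; the approximation error depends on how fast the singular values of W decay.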