% Insert the table of contents
\begin{document}
\maketitle
\tableofcontents
\newpage % start the body on the next page
\textbf{$*$ denotes supplementary experimental results for the manuscript.\\
$**$ denotes newly added …}
\section{Taxonomy}
BN normalizes over the batch: its statistics are computed across N, H, and W, keeping the channel dimension C. BN works poorly with small batch sizes. It is suited to feed-forward networks of fixed depth, such as CNNs, and is not suitable for RNNs. LN normalizes along the channel direction, over C, H, and W, and is mainly used for RNN…
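To make the normalization axes concrete, here is a minimal sketch of the two statistics, assuming an activation tensor $x \in \mathbb{R}^{N \times C \times H \times W}$ (this notation is an assumption added here, not from the original notes). BN computes one mean and variance per channel $c$ over $(N, H, W)$; LN computes them per sample $n$ over $(C, H, W)$:
\begin{align*}
\text{BN:}\quad \mu_c &= \frac{1}{NHW}\sum_{n,h,w} x_{nchw}, & \hat{x}_{nchw} &= \frac{x_{nchw} - \mu_c}{\sqrt{\sigma_c^2 + \epsilon}},\\
\text{LN:}\quad \mu_n &= \frac{1}{CHW}\sum_{c,h,w} x_{nchw}, & \hat{x}_{nchw} &= \frac{x_{nchw} - \mu_n}{\sqrt{\sigma_n^2 + \epsilon}},
\end{align*}
where $\sigma_c^2$ and $\sigma_n^2$ are the variances over the same index sets, and both layers then apply a learnable affine map $\gamma \hat{x} + \beta$. This also explains the small-batch weakness of BN noted above: $\mu_c$ and $\sigma_c^2$ are estimated from $N \cdot H \cdot W$ values and become noisy as $N$ shrinks, whereas LN's statistics do not depend on $N$ at all.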
\section{Knowledge Distillation}
Related papers:
\begin{enumerate}
  \item FitNets: Hints for Thin Deep Nets (ICLR 2015)
  \item A Gift from Knowledge Distillation: Fast Optimization, Network Minimization and Transfer Learning (CVPR 2017)
  \item Matching Guided Distillation (…
\end{enumerate}
\section{Model Compression}
Related papers:
\begin{enumerate}
  \item Learning both Weights and Connections for Efficient Neural Networks (NIPS 2015)
  \item Deep Compression: Compressing Deep Neural Networks with Pruning, Trained Quantization and Huffman Coding (ICLR 2016)
\end{enumerate}