Reposted from: https:///wmz545546/article/details/77603543
- When and how to fine-tune?
- Four main scenarios:
- New dataset is small and similar to the original dataset: beware of overfitting on the small dataset; use the CNN codes (features from the pretrained network) to train a linear classifier.
- New dataset is large and similar to the original dataset: overfitting is less of a concern; try fine-tuning the entire network.
- New dataset is small but very different from the original dataset: still train a linear classifier, but since the datasets differ greatly, it may work better to train the SVM classifier from activations somewhere earlier in the network.
- New dataset is large and very different from the original dataset: initialize from the pretrained model's parameters and fine-tune the whole network on the new dataset.
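A minimal sketch of the first scenario (small, similar dataset): freeze the pretrained network and train only a new linear classifier on its output features ("CNN codes"). The tiny `backbone` below is a hypothetical stand-in for a real pretrained ConvNet (e.g. a torchvision ResNet), and `num_classes = 10` is an assumed value; only the freezing pattern is the point.

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for a pretrained ConvNet backbone
# (in practice e.g. a torchvision model loaded with pretrained weights).
backbone = nn.Sequential(
    nn.Conv2d(3, 8, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
)

# Freeze the pretrained weights: no gradients, no updates.
for p in backbone.parameters():
    p.requires_grad = False

# New linear classifier on the CNN codes, randomly initialized
# and trained from scratch on the small new dataset.
num_classes = 10  # assumed number of classes in the new dataset
head = nn.Linear(8, num_classes)

# Only the head's parameters are given to the optimizer.
optimizer = torch.optim.SGD(head.parameters(), lr=1e-2)

x = torch.randn(4, 3, 32, 32)   # dummy batch of images
with torch.no_grad():           # CNN codes are fixed features
    codes = backbone(x)
logits = head(codes)            # shape: (4, num_classes)
```

For the fourth scenario (large, very different dataset), the same code changes in one place: leave `requires_grad = True` on the backbone and pass all parameters to the optimizer, so the whole network is fine-tuned.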
- Practical advice
- A few additional things to keep in mind when performing transfer learning:
- Constraints from pretrained models: with a pretrained network, the architecture you can use for the new dataset is constrained; for example, you can't arbitrarily take out Conv layers from the pretrained network.
- Learning rates: when fine-tuning the ConvNet weights (which are already relatively good), use a smaller learning rate than for the new linear classifier (whose weights are randomly initialized).
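The learning-rate advice above can be sketched with PyTorch optimizer parameter groups, a common way to give the pretrained weights and the new classifier different learning rates. The `backbone`/`head` modules and the values 1e-4 / 1e-2 are illustrative assumptions, not prescriptions.

```python
import torch
import torch.nn as nn

# Hypothetical stand-ins: `backbone` plays the pretrained ConvNet,
# `head` the newly initialized linear classifier.
backbone = nn.Linear(64, 64)
head = nn.Linear(64, 10)

# Smaller lr for the already-good pretrained weights,
# larger lr for the randomly initialized classifier.
optimizer = torch.optim.SGD(
    [
        {"params": backbone.parameters(), "lr": 1e-4},
        {"params": head.parameters(), "lr": 1e-2},
    ],
    momentum=0.9,
)
```

Each dict becomes one parameter group, so a single optimizer step updates both parts of the model at their own rates.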