Is it mandatory to replace the dense layer when fine-tuning a CNN?

Hi everyone! I have to fine-tune a ResNet-101 model that was pre-trained on 2 classes: normal subjects and disease-affected subjects. My fine-tuning dataset is composed of 2 classes: stable subjects and progressive subjects. These stable and progressive subjects fall between the normal and disease-affected subjects in severity, so the images and classes are not extremely different.

So, my question is: when fine-tuning the ResNet-101 pre-trained model, should I replace the final dense layer? Or, since the images of the fine-tuning dataset are similar to those of the pre-training dataset, can I keep the original pre-trained dense layer? What would be the best practice in this scenario?

Thank you in advance for your help!

The exit layer learns a mapping that’s appropriate only for the context it was trained on, so I’d remove it. Sometimes you even remove more than just the exit layer; it could be a whole exit block.
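For example, with a torchvision ResNet-101 this is just a matter of swapping out `model.fc`. A minimal sketch (the checkpoint file name below is only a placeholder for wherever your pre-trained weights live):

```python
import torch
import torch.nn as nn
from torchvision import models

# Rebuild the architecture and load the weights pre-trained on
# normal vs. disease-affected subjects (path is a placeholder)
model = models.resnet101()
model.fc = nn.Linear(model.fc.in_features, 2)
model.load_state_dict(torch.load("pretrained_normal_vs_affected.pth"))

# Discard the old exit layer and attach a fresh head for the new
# task (stable vs. progressive); its weights start untrained
model.fc = nn.Linear(model.fc.in_features, 2)
```

If you want to replace a whole exit block instead of a single layer, you can make the new head deeper, e.g. `nn.Sequential(nn.Linear(2048, 256), nn.ReLU(), nn.Linear(256, 2))`.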


I’d recommend replacing the final dense layer in your case. Even though your new classes are somewhat similar to the original ones, they’re still distinct categories. Replacing the dense layer lets the model learn a new output mapping specific to distinguishing stable from progressive subjects.

However, you might want to experiment with freezing different numbers of layers during fine-tuning. Since your new dataset is similar to the original, you might be able to freeze more layers than usual, which could help if you have a smaller dataset for fine-tuning.
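Here’s a minimal sketch of the freezing part, again assuming a torchvision ResNet-101; how many stages to unfreeze is exactly the kind of thing worth experimenting with:

```python
import torch
import torch.nn as nn
from torchvision import models

model = models.resnet101()
model.fc = nn.Linear(model.fc.in_features, 2)  # fresh head for stable vs. progressive

# Freeze the entire backbone first
for param in model.parameters():
    param.requires_grad = False

# Then unfreeze only the last residual stage and the new head;
# since your datasets are similar, the earlier features should transfer
for param in model.layer4.parameters():
    param.requires_grad = True
for param in model.fc.parameters():
    param.requires_grad = True

# Hand the optimizer only the trainable parameters
optimizer = torch.optim.Adam(
    (p for p in model.parameters() if p.requires_grad), lr=1e-4
)
```

Freezing more of the backbone also means fewer parameters to overfit, which is why it tends to help with smaller fine-tuning datasets.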

Ultimately, the best approach depends on your specific data and results. I’d suggest trying both methods (keeping vs replacing the dense layer) and comparing their performance. This empirical approach often yields the best results in deep learning tasks.


Thank you so much. I removed the last layer, replaced it, and got better results. Thanks for the answer!


Nice explanation. I understood it all. Thank you so much for your answer! I tried both ways, and I got better results by removing and replacing the last layer. Thanks a lot.