
Inconsistent regularization loss computation when used with pretrained models. #41161

Open
meet-minimalist opened this issue Jul 7, 2020 · 2 comments
Assignees: fchollet
Labels: comp:keras (Keras related issues), stat:awaiting tensorflower (Status - Awaiting response from tensorflower), TF 2.11 (Issues related to TF 2.11), type:bug (Bug)

Comments

@meet-minimalist

System information

  • Have I written custom code (as opposed to using a stock example script provided in TensorFlow): Yes
  • OS Platform and Distribution (e.g., Linux Ubuntu 16.04): Google Colab Environment
  • Mobile device (e.g. iPhone 8, Pixel 2, Samsung Galaxy) if the issue happens on mobile device: N/A
  • TensorFlow installed from (source or binary): Binary (Colab Pre-installed)
  • TensorFlow version (use command below): 2.4.0-nightly
  • Python version: 3.6.9
  • Bazel version (if compiling from source): N/A
  • GCC/Compiler version (if compiling from source): N/A
  • CUDA/cuDNN version: N/A (Colab CPU Environment used)
  • GPU model and memory: N/A

Describe the current behavior
I am trying to use a pretrained MobileNetV2 model for a simple classification task. I want to add a regularization loss for both the MobileNetV2 base model and the FC layers added on top. I have run 3 experiments, as per the attached Colab notebook, and I get different values for the computed regularization loss: the MobileNetV2 regularizer loss is counted twice in the model, and I don't know why this happens.
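
For context, a minimal sketch of the kind of setup being described, assuming the regularizer is attached with the recipe referenced from issue-37511; the L2 factor, input shape, and classification head below are illustrative, not taken from the notebook:

```python
import tensorflow as tf

# Illustrative L2 factor; the notebook's actual value may differ.
reg = tf.keras.regularizers.l2(1e-4)

# Pretrained MobileNetV2 base without its classification head.
base = tf.keras.applications.MobileNetV2(
    input_shape=(224, 224, 3), include_top=False, weights="imagenet")

# Attach the regularizer to every layer that exposes a kernel_regularizer.
for layer in base.layers:
    if hasattr(layer, "kernel_regularizer"):
        layer.kernel_regularizer = reg

# Per the recipe from issue-37511, setting the attribute alone is not enough:
# the model has to be rebuilt from its config for the new regularization
# losses to be registered, so the weights are saved and restored around a
# reload.
weights = base.get_weights()
base = tf.keras.models.model_from_json(base.to_json())
base.set_weights(weights)

# Simple classification head on top of the regularized base.
model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(10, activation="softmax", kernel_regularizer=reg),
])

# model.losses holds the per-layer regularization terms; the reported bug is
# that their total differs depending on how the base model is wrapped.
print(len(model.losses), float(tf.add_n(model.losses)))
```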

Describe the expected behavior
The computed regularization loss should be the same in all the experiments.
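
One rough way to express that expectation as a check, continuing the sketch above (the 1e-4 factor and the name-based kernel filter are assumptions and may not match exactly which weights were regularized, e.g. depthwise kernels):

```python
# Manual L2 penalty computed directly from the kernel weights.
manual = sum(1e-4 * tf.reduce_sum(tf.square(w))
             for w in model.trainable_weights
             if "kernel" in w.name)

# If Keras counts the base model's terms twice, the built-in total will be
# roughly double the manual value instead of matching it.
built_in = tf.add_n(model.losses)
print(float(manual), float(built_in))
```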

Standalone code to reproduce the issue
Colab link: https://colab.research.google.com/drive/1CeKMIAq_g0AOKdakupKIeUjEhkL_TtI1?usp=sharing

Other info / logs

  1. I have tried the same on TF 2.2 and got the same behaviour.
  2. I followed the reference in issue-37511 for adding a regularization loss to a pretrained model.
@amahendrakar
Contributor

Was able to reproduce the issue with TF v2.2 and TF-nightly. Please find the attached gist. Thanks!

@amahendrakar amahendrakar added comp:keras Keras related issues TF 2.2 Issues related to TF 2.2 labels Jul 9, 2020
@gowthamkpr gowthamkpr assigned fchollet and unassigned gowthamkpr Jul 13, 2020
@gowthamkpr gowthamkpr added the stat:awaiting tensorflower Status - Awaiting response from tensorflower label Jul 13, 2020
@saikumarchalla
saikumarchalla commented May 27, 2021

Was able to reproduce the issue with TF 2.11 and Nightly versions. Please find the gist here. Thanks!

@pjpratik pjpratik added TF 2.11 Issues related to TF 2.11 and removed TF 2.2 Issues related to TF 2.2 labels Nov 30, 2022