

What is the right way to use coverage.py with Tensorflow? #33759

Open
nedbat opened this issue Oct 27, 2019 · 14 comments
Labels
comp:core issues related to core part of tensorflow stat:awaiting tensorflower Status - Awaiting response from tensorflower type:feature Feature requests

Comments

nedbat commented Oct 27, 2019

I apologize if this is the wrong way to ask this question. I'm the maintainer of coverage.py, for measuring code coverage in Python projects. A user wrote an issue for me: nedbat/coveragepy#856

After digging into it, I see that his tf.keras.Model.call() function is not executed directly; instead, it is transformed, written to a temporary file, and executed from there. So coverage.py reports his code as unexecuted, even though he can see the effects of its execution.

I also see that the transformed code has an ag_source_map__ parameter which can be used to map back from the transformed code to the original code. A coverage.py plugin could use that information to report coverage usefully.

My questions are:

  1. Is there a reason people haven't reported this to coverage.py before? Is there an existing known way to get coverage reports on this kind of code?
  2. What is a stable public API for getting the transformation mapping?
  3. Would TensorFlow be interested in maintaining a coverage.py plugin to make this work properly?
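
For reference, here is a bare skeleton (not a working implementation) of what such a plugin could look like, built on coverage.py's documented plugin hooks. The filename check is a hypothetical placeholder, and the part that would consult an ag_source_map__-style mapping is only stubbed, since how to obtain that mapping through a stable API is exactly the open question here.

import coverage

class AutographPlugin(coverage.CoveragePlugin):
    def file_tracer(self, filename):
        # Claim only AutoGraph-generated temp files; everything else is
        # handled by coverage.py as usual.
        if "autograph_generated" in filename:  # hypothetical marker
            return AutographFileTracer(filename)
        return None

    def file_reporter(self, filename):
        # "python" tells coverage.py to use its standard Python reporter
        # for the (mapped) source file.
        return "python"

class AutographFileTracer(coverage.FileTracer):
    def __init__(self, filename):
        self._filename = filename

    def source_filename(self):
        # This is where the transformation mapping would be consulted to
        # return the user's original source file instead of the temp file.
        return self._filename  # stub

def coverage_init(reg, options):
    reg.add_file_tracer(AutographPlugin())
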
@oanush oanush self-assigned this Oct 29, 2019
@oanush oanush added the type:support Support issues label Oct 29, 2019
@oanush oanush assigned ymodak and unassigned oanush Oct 29, 2019
@ymodak ymodak added comp:keras Keras related issues type:feature Feature requests and removed comp:keras Keras related issues type:support Support issues labels Oct 29, 2019
@ymodak ymodak added the comp:keras Keras related issues label Nov 18, 2019
@tmcclintock

Has there been any progress on this on the TF side? This is an important feature for TF in production.

csachs (Contributor) commented Mar 26, 2020

I second that request. This issue needs discussion and maybe special handling on the coverage.py side. In particular, for TF2, tf.function-decorated functions are not observed by coverage.py at all (as a workaround, in some cases they can be dynamically un-decorated for unit testing).
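
As a concrete illustration of that workaround, here is a minimal sketch; it assumes the tf.function wrapper exposes its original callable via the python_function attribute, which is an internal detail rather than a guaranteed API.

import tensorflow as tf

@tf.function
def add_one(x):
    return x + 1

def test_add_one_is_traced():
    # Call the underlying Python function so the original source lines are
    # executed directly and coverage.py can trace them.
    plain_fn = add_one.python_function
    assert int(plain_fn(tf.constant(1))) == 2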

nedbat (Author) commented Mar 26, 2020

Does anyone want to take on writing a coverage.py plugin? I can help with the coverage.py side of things.

csachs (Contributor) commented Apr 2, 2020

The issue mentioned there (Sujit-O/pykg2vec#123) pointed me to the function tf.config.experimental_run_functions_eagerly(run_eagerly), which looks like the ideal solution. However, it does not seem to work entirely as advertised (cf. #38170): certain functions stop working when switched between eager and non-eager modes.
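
For anyone trying this, a minimal sketch of how the switch is used (TF 2.x; the function was later deprecated in favour of tf.config.run_functions_eagerly):

import tensorflow as tf

# Run tf.function bodies as plain Python so the tracer sees every line.
tf.config.experimental_run_functions_eagerly(True)

@tf.function
def square(x):
    return x * x  # executed eagerly, so coverage.py records this line

print(square(tf.constant(3.0)))

# Restore normal graph tracing afterwards.
tf.config.experimental_run_functions_eagerly(False)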

@rmothukuru rmothukuru added the stat:awaiting tensorflower Status - Awaiting response from tensorflower label Jun 28, 2021
Zahlii commented Sep 17, 2021

With TensorFlow 2.5, even if I run tf.config.experimental_run_functions_eagerly(True) directly before my tests, coverage is not reported correctly. I want to write tests for a custom loss function around CRFs for a production model, where adequate test coverage is key. Currently I am stuck having to exclude this file from coverage.py.
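
To make the attempted setup concrete, this is roughly the kind of fixture being described (a sketch only; as noted above, it is not sufficient in TF 2.5):

# conftest.py
import pytest
import tensorflow as tf

@pytest.fixture(autouse=True, scope="session")
def run_tf_functions_eagerly():
    # Force eager execution for the whole test session so tf.function
    # bodies run as ordinary Python under the coverage tracer.
    tf.config.run_functions_eagerly(True)
    yield
    tf.config.run_functions_eagerly(False)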

nedbat (Author) commented Sep 17, 2021

I'm the coverage.py maintainer. I'd be glad to work closely with someone from the Tensorflow side to find a solution to this problem.

mohantym (Contributor) commented Sep 18, 2021

Hi @nedbat, I found some articles on unit testing and code coverage for TensorFlow code; they point to using the tf.test API. Attaching them below for reference:
https://theaisummer.com/unit-test-deep-learning/
https://www.tensorflow.org/api_docs/python/tf/test/Benchmark
https://github.com/tensorflow/docs/blob/master/site/en/community/contribute/tests.md
Are these helpful to proceed?
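
For illustration, a minimal sketch of a test written against the tf.test API mentioned above; such tests run as ordinary unittest-style tests, so they can be executed under coverage (for example with coverage run -m pytest):

import tensorflow as tf

class SquareTest(tf.test.TestCase):
    def test_square(self):
        x = tf.constant([2.0, 3.0])
        # assertAllClose is provided by tf.test.TestCase for tensor comparisons.
        self.assertAllClose(tf.square(x), [4.0, 9.0])

if __name__ == "__main__":
    tf.test.main()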

@mohantym mohantym added stat:awaiting response Status - Awaiting response from author and removed stat:awaiting tensorflower Status - Awaiting response from tensorflower labels Oct 7, 2021
@google-ml-butler

This issue has been automatically marked as stale because it has no recent activity. It will be closed if no further activity occurs. Thank you.

@google-ml-butler google-ml-butler bot added the stale This label marks the issue/pr stale - to be closed automatically if no activity label Oct 14, 2021
@google-ml-butler

Closing as stale. Please reopen if you'd like to work on this further.

gsakkis commented Feb 9, 2022

Bump; this is still an issue.

nedbat (Author) commented Feb 9, 2022

@gsakkis:

I'm the coverage.py maintainer. I'd be glad to work closely with someone from the Tensorflow side to find a solution to this problem.

@mohantym (Contributor)

Ok! Reopening as requested.

@mohantym mohantym reopened this Feb 11, 2022
@mohantym mohantym removed stat:awaiting response Status - Awaiting response from author stale This label marks the issue/pr stale - to be closed automatically if no activity labels Feb 11, 2022
@mohantym mohantym removed their assignment Feb 11, 2022
rivershah commented Feb 5, 2023

@fchollet Any further progress or thoughts on this, please? Inaccurate coverage reports are a production concern: essentially, any decorated function shows up as a miss. Could we please look into prioritising this work with @gsakkis?

While tf.config.experimental_run_functions_eagerly(True) does work for some use cases, it would be more helpful to have the transformation-mapping work done.

@sachinprasadhs sachinprasadhs added the stat:awaiting tensorflower Status - Awaiting response from tensorflower label Feb 15, 2023
@ymodak ymodak added comp:core issues related to core part of tensorflow and removed comp:keras Keras related issues labels Feb 16, 2023
hmc-cs-mdrissi commented Mar 12, 2023

I think the easiest fix is to have a global way to fully disable AutoGraph. tf.config.experimental_run_functions_eagerly doesn't actually disable all AutoGraph usage: a layer called in v1 mode ignores that setting entirely. While v1 layers may not support eager execution, they can still support skipping AutoGraph for coverage statistics. tf.data also appears to ignore that setting. Some way for tests under coverage to fully opt out of AutoGraph, e.g. tf.config.disable_autograph, would be nice.

Edit: one hacky approach (it depends on TF internals) that does look to work:

import pytest
from tensorflow.python.autograph.core import config  # TF-internal AutoGraph config
from tensorflow.python.autograph.core.config_lib import DoNotConvert

@pytest.fixture(autouse=True)
def disable_autograph_in_coverage() -> None:
    # Prepend a rule telling AutoGraph not to convert code from "your_module".
    config.CONVERSION_RULES = (DoNotConvert("your_module"),) + config.CONVERSION_RULES

Scanning the TF tests, this approach looks to be used here. A public API for adjusting conversion rules would work too, although I think for test cases a simple global flag to fully turn off tf_convert would be enough.
