
Add validation to module name #49

Merged: mimischi merged 4 commits from validation into master on Apr 8, 2018
Conversation

@mimischi (Contributor) commented Feb 24, 2018

Changes made in this pull request:

  • Added a check of the module name to mdbenchmark.generate: when given a module name, we try to find it on the system (sketched after this list). Users can skip validation with --skip-validation.
  • Added the --skip-validation option to mdbenchmark.generate.
  • Resolves Generate Function Arguments #47.
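
A minimal sketch of the described behaviour; the names below are stand-ins for illustration, not mdbenchmark's actual API:

import sys

# Stand-in for the modules actually installed on the system.
AVAILABLE_MODULES = {'gromacs/2016.3', 'namd/2.12'}

def warn(msg):
    sys.stderr.write('WARNING ' + msg + '\n')

def validate_module_name(name, skip_validation=False):
    """Return True if `name` is found on the system; with --skip-validation
    we only warn and accept the name as given."""
    if skip_validation:
        warn('Not performing module name validation.')
        return True
    return name in AVAILABLE_MODULES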

PR Checklist

  • Added unit / integration tests.
  • Added changelog fragment in ./changelog/?
  • Issue raised/referenced?

# If we are unable to find the modules path on the system or if the
# user told us to skip the validation, we do so.
if (validated_module is None) or skip_validation:
    console.warn('Not performing module name validation.')
@mimischi (author) commented:

This warning will be printed for every module that we define. We should probably move it outside the for-loop so it is only printed once, as in the sketch below.
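
A sketch of hoisting the warning out of the loop; all names here are stand-ins rather than mdbenchmark's actual helpers:

import sys

AVAILABLE_MODULES = ('gromacs/2016.3', 'namd/2.12')  # stand-in module list

def warn(msg):
    sys.stderr.write('WARNING ' + msg + '\n')

def validate_modules(modules, skip_validation=False):
    # Warn once, up front, instead of once per module inside the loop.
    if skip_validation:
        warn('Not performing module name validation.')
        return list(modules)
    return [m for m in modules if m in AVAILABLE_MODULES]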

@codecov bot commented Feb 28, 2018

Codecov Report

Merging #49 into master will increase coverage by 2.29%.
The diff coverage is 100%.


@@            Coverage Diff             @@
##           master      #49      +/-   ##
==========================================
+ Coverage   87.06%   89.35%   +2.29%     
==========================================
  Files          11       11              
  Lines         456      526      +70     
==========================================
+ Hits          397      470      +73     
+ Misses         59       56       -3
Impacted Files                      Coverage Δ
mdbenchmark/mdengines/__init__.py   100% <100%> (ø) ⬆️
mdbenchmark/utils.py                98.24% <100%> (-0.09%) ⬇️
mdbenchmark/generate.py             100% <100%> (+4.16%) ⬆️
mdbenchmark/submit.py               97.56% <0%> (+2.43%) ⬆️

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Last update bfee8f3...67a3045.

if (validated_module is None) or skip_validation:
    console.warn('Not performing module name validation.')
# If the validation fails, we throw an error and exit the script.
elif not validated_module:
@mimischi (author) commented:

We currently do not test this case.
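
A sketch of how this branch could be covered, reusing the cli_runner helper that appears later in this PR; the exact CLI options and the host name are assumptions:

from mdbenchmark import cli
from mdbenchmark.ext.click_test import cli_runner  # noqa: F401, pytest fixture

def test_invalid_module_aborts(cli_runner, tmpdir):
    # A module name that fails validation should abort the run.
    with tmpdir.as_cwd():
        result = cli_runner.invoke(cli.cli, [
            'generate', '--module=doesnotexist/2018',
            '--host=draco', '--name=protein'])
        assert result.exit_code != 0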

@mimischi commented Mar 1, 2018

@kain88-de Any thoughts? Besides my two comments, this PR is feature complete.

We could think about moving the code from mdbenchmark/validate.py to mdbenchmark/mdengines/__init__.py where it would live with detect_md_engine().


from . import console

SUPPORTED_MDENGINES = ('gromacs', 'namd')
@kain88-de commented:

I'd rather get that variable from a definition in the mdengines module; right now we need to maintain the list of supported engines in two places.
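
A sketch of one way to keep a single source of truth; the package layout below is an assumption about how mdengines could expose the list:

# mdengines/__init__.py -- hypothetical layout with one submodule per engine
from . import gromacs, namd

SUPPORTED_ENGINES = {'gromacs': gromacs, 'namd': namd}

# validate.py could then derive its tuple instead of re-declaring it:
#   from mdbenchmark.mdengines import SUPPORTED_ENGINES
#   SUPPORTED_MDENGINES = tuple(SUPPORTED_ENGINES)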

@@ -103,10 +115,40 @@ def generate(name, gpu, module, host, min_nodes, max_nodes, time, list_hosts):
'If you use the {} option make sure you use the GPU compatible NAMD module!',
'--gpu')

# Grab list of all available and supported modules
available_modules = get_available_modules()
@kain88-de commented:

Why not filter the list of modules here?

@mimischi (author) replied:

Do you mean we should duplicate the `for m in module:` code? Otherwise we cannot validate the module names, as we need to iterate through the list.

@kain88-de replied:

modules = [m for m in modules if m in available_modules]

Now all modules are actually available. If you want to warn the user about missing ones, you can do so:

requested_modules = set(modules)
modules = [m for m in modules if m in available_modules]
if requested_modules.difference(modules):
    console.warn("{} are not available modules".format(requested_modules.difference(modules)))

This if-clause only triggers when the difference is non-empty; if it is empty, all modules are available.

@mimischi commented:
@kain88-de What is wrong with the failing tests on Python 2.7? What's wrong with that import statement?

@mimischi commented Mar 18, 2018

This became way bigger than anticipated. I've done a few things in the latest commits:

  • Extracted all argument validation for the generate CLI command into a new validate.py. This way we can test the exceptions separately in test_validate.py.
    • This also cuts down on the size of the generate function.
  • Added a new module name validation block inside the generate function (sketched after this list):
    1. It first checks whether the requested MD engine is supported and throws an error if not.
    2. It then grabs all available modules and compares them to the requested ones.
    3. Finally, it throws an error for every module it could not find and shows the user all available options.

Several things that we could consider:

  1. Refactor the module name validation into a different file: either validate.py or mdengines/__init__.py, or create a new mdengines/validate.py.
  2. Change the style of how we inform the user; the current messages are repetitive and verbose and could use some polish.
  3. Add / update docstrings of the functions. There are some dangling docstrings that I edited but never finished, and some of our older functions have no docstrings at all.
  4. Finally, clean up the tests. I really do not like their current state, but I'm too inexperienced with the intricacies of pytest to use it to its full extent. I'm sure we can cut down on some of the spaghetti code; lots of test code is repetitive (e.g., the calls to cli_runner could probably be reduced with a custom fixture).
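
A sketch of the three-step validation block described above; the names and messages are illustrative, not the exact code in this PR:

import sys

SUPPORTED_ENGINES = ('gromacs', 'namd')  # stand-in for the mdengines list

def validate_modules(requested, available):
    # Step 1: the MD engine of every requested module must be supported.
    for m in requested:
        engine = m.split('/')[0]
        if engine not in SUPPORTED_ENGINES:
            sys.exit("ERROR: There is no support for '{}'.".format(engine))

    # Step 2: compare the requested modules against those on the system.
    missing = sorted(set(requested) - set(available))

    # Step 3: report every module we could not find, plus the alternatives.
    if missing:
        sys.exit("ERROR: Could not find modules: {}.\nAvailable modules are: {}".format(
            ", ".join(missing), ", ".join(sorted(available))))

    return list(requested)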

@kain88-de kain88-de closed this Mar 18, 2018
@kain88-de kain88-de deleted the validation branch March 18, 2018 19:00
@kain88-de commented:
Sorry about closing this, that was an error.

@kain88-de kain88-de restored the validation branch March 18, 2018 20:03
@kain88-de kain88-de reopened this Mar 18, 2018
@mimischi commented Mar 26, 2018

How is one supposed to test the callback functions? Never mind, I found this example in the Flask code base via Stack Overflow:

def test_get_version(test_apps, capsys):
    """Test of get_version."""
    from flask import __version__ as flask_ver
    from sys import version as py_ver

    class MockCtx(object):
        resilient_parsing = False
        color = None

        def exit(self): return

    ctx = MockCtx()
    get_version(ctx, None, "test")
    out, err = capsys.readouterr()
    assert flask_ver in out
    assert py_ver in out
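
Building on that example, the mock context could become a reusable pytest fixture; this is a sketch with a stand-in callback, not code from this PR:

import pytest

class MockCtx(object):
    """Minimal stand-in for a click.Context, mirroring the example above."""
    resilient_parsing = False
    color = None

    def exit(self):
        return

@pytest.fixture
def ctx():
    # Hand a fresh mock context to any test of a click callback.
    return MockCtx()

def print_version(ctx, param, value):
    # Stand-in callback following click's (ctx, param, value) signature.
    if not value or ctx.resilient_parsing:
        return
    print("mdbenchmark, version 1.0")
    ctx.exit()

def test_print_version(ctx, capsys):
    print_version(ctx, None, "test")
    out, _ = capsys.readouterr()
    assert "version" in out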

@mimischi commented Apr 3, 2018

Phew. Refactoring the code after the last merge wasn't as exciting as I had hoped. Everything should be working now.

I still want to make sure that we are testing everything correctly. I quickly went through the CLI manually and found a major bug, which is now fixed. Once that's done, this should finally be ready to merge.

@kain88-de commented:
You can remove the validation tests and clean up the history a bit.

@mimischi commented Apr 3, 2018

Which validation tests? The contents of test_validate.py? I would keep those, because we want to make sure that we throw the expected error messages.

@kain88-de commented:
Or move them to the appropriate files; there is no file called validate.py.

@mimischi commented Apr 3, 2018

The PR is done. @kain88-de I assume we should keep the merge and just squash the commits in between, right?

@kain88-de left a review comment:

LGTM besides the one minor issue.

About squashing: you can squash as much as you like. With the merge commit you have to be careful; git doesn't like rebasing merge commits very much.

from mdbenchmark import cli
from mdbenchmark.ext.click_test import cli_runner
from mdbenchmark.generate import (validate_hosts, validate_module,
                                  validate_name, validate_number_of_nodes)
from mdbenchmark.mdengines import normalize_modules
@kain88-de commented:
This last function should be tested in test_mdengines.

@mimischi (author) replied:
It is currently not directly tested in this module either.

@mimischi mimischi changed the title [WIP] Add validation to module name Add validation to module name Apr 8, 2018
@mimischi mimischi force-pushed the validation branch 2 times, most recently from c243611 to af92fc5 on April 8, 2018 12:57
mimischi and others added 4 commits April 8, 2018 14:59
* Move validation into callbacks
* Fix import errors
* Add fixture to mock context
* Move the skip_validation check into the normalize_modules method
  (sketched after this list). This fixes a bug when providing an
  unsupported MD engine and the `--skip-validation` option at the same time.
* Add docstrings
* Added some missing tests in other parts of the module
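
A sketch of the idea behind that skip_validation fix; the signature of normalize_modules here is an assumption:

import sys

class _Console(object):
    """Tiny stand-in for mdbenchmark's console helper."""
    @staticmethod
    def warn(msg):
        sys.stderr.write('WARNING ' + msg + '\n')

console = _Console()

def normalize_modules(modules, skip_validation, available_modules):
    # Checking skip_validation here (rather than in the caller) means an
    # unsupported MD engine combined with --skip-validation only warns
    # instead of raising an error.
    if skip_validation:
        console.warn('Not performing module name validation.')
        return list(modules)
    return [m for m in modules if m in available_modules]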
@mimischi mimischi merged commit 678836e into master Apr 8, 2018
@mimischi mimischi deleted the validation branch April 8, 2018 13:07