Fix non-idempotent prediction due to in-place update to model-config #11014

Merged · 4 commits merged into mlflow:master from fix-model-config-side-effect on Feb 6, 2024

Conversation

@B-Step62 (Collaborator) commented Feb 5, 2024
🛠 DevTools 🛠

Open in GitHub Codespaces

Install mlflow from this PR

pip install git+https://github.com/mlflow/mlflow.git@refs/pull/11014/merge

Checkout with GitHub CLI

gh pr checkout 11014

What changes are proposed in this pull request?

In the predict() method of _TransformersWrapper, self.model_config is modified in place several times, e.g. (1) params overrides the model config values, and (2) pop() operations remove our custom keys such as include_prompt before the actual pipeline inference. These updates directly change the original model config, so the next prediction uses a different model config that was tweaked by the first call.

E.g.

model_config = {
    "top_k": 2,
    "num_beams": 5,
    "max_length": 30,
}
    
pyfunc_model = _TransformersWrapper(text2text_generation_pipeline, model_config=model_config)
assert pyfunc_model.model_config["top_k"] == 2

# Params will be used to override the values of model_config
params = {
    "top_k": 3,
    "max_length": 50,
}
pyfunc_model.predict("How to learn Python in 3 weeks?", params=params)
    
>       assert pyfunc_model.model_config["top_k"] == 2
E       assert 3 == 2
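
For reference, a minimal sketch of the behavior this PR intends, reusing the hypothetical names from the example above: after the fix, the stored model_config remains untouched across predict() calls.

# Sketch of the expected post-fix behavior (pyfunc_model and params as above).
pyfunc_model.predict("How to learn Python in 3 weeks?", params=params)
assert pyfunc_model.model_config["top_k"] == 2        # original value preserved
assert pyfunc_model.model_config["max_length"] == 30  # not overridden by params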

This also causes another issue in model logging, where an incorrect model config is saved. We sometimes run multiple predictions during model saving, e.g. when the user specifies input_example we generate model output from it to infer the signature. The saved model config then ends up being the one tweaked by those predictions.
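
To make the logging side effect concrete, here is a hedged sketch (the pipeline, model name, and prompt are placeholders chosen for illustration): passing input_example triggers a prediction during signature inference, and before this fix that prediction mutated model_config in place, so the mutated values were what got saved.

import mlflow
from transformers import pipeline

# Hypothetical pipeline and config, for illustration only.
text2text = pipeline("text2text-generation", model="google/flan-t5-small")
model_config = {"top_k": 2, "num_beams": 5, "max_length": 30}

with mlflow.start_run():
    # Signature inference runs predict() on the input example; prior to this
    # fix, that call tweaked model_config before it was persisted.
    mlflow.transformers.log_model(
        transformers_model=text2text,
        artifact_path="model",
        model_config=model_config,
        input_example="How to learn Python in 3 weeks?",
    )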

Note: I hit this issue while developing a fix for another issue and running the test_instructional_pipeline_no_prompt_in_output test, which is skipped in CI (due to the large model size). Other CI tests didn't catch this because most of them specify the signature manually and don't provide an input example, so the test case makes only a single prediction call.

How is this PR tested?

  • Existing unit/integration tests
  • New unit/integration tests
  • Manual tests

Does this PR require documentation update?

  • No. You can skip the rest of this section.
  • Yes. I've updated:
    • Examples
    • API references
    • Instructions

We probably need to update some documentation to warn users about this model config issue, since we suggest that users use this option (example). I can do that as a follow-up.

Release Notes

Is this a user-facing change?

  • No. You can skip the rest of this section.
  • Yes. Give a description of this change to be included in the release notes for MLflow users.

Fix non-idempotent prediction caused by an in-place update to model_config, which could also result in incorrect metadata being saved.

What component(s), interfaces, languages, and integrations does this PR affect?

Components

  • area/artifacts: Artifact stores and artifact logging
  • area/build: Build and test infrastructure for MLflow
  • area/deployments: MLflow Deployments client APIs, server, and third-party Deployments integrations
  • area/docs: MLflow documentation pages
  • area/examples: Example code
  • area/model-registry: Model Registry service, APIs, and the fluent client calls for Model Registry
  • area/models: MLmodel format, model serialization/deserialization, flavors
  • area/recipes: Recipes, Recipe APIs, Recipe configs, Recipe Templates
  • area/projects: MLproject format, project running backends
  • area/scoring: MLflow Model server, model deployment tools, Spark UDFs
  • area/server-infra: MLflow Tracking server backend
  • area/tracking: Tracking Service, tracking client APIs, autologging

Interface

  • area/uiux: Front-end, user experience, plotting, JavaScript, JavaScript dev server
  • area/docker: Docker use across MLflow's components, such as MLflow Projects and MLflow Models
  • area/sqlalchemy: Use of SQLAlchemy in the Tracking Service or Model Registry
  • area/windows: Windows support

Language

  • language/r: R APIs and clients
  • language/java: Java APIs and clients
  • language/new: Proposals for new client languages

Integrations

  • integrations/azure: Azure and Azure ML integrations
  • integrations/sagemaker: SageMaker integrations
  • integrations/databricks: Databricks integrations

How should the PR be classified in the release notes? Choose one:

  • rn/none - No description will be included. The PR will be mentioned only by the PR number in the "Small Bugfixes and Documentation Updates" section
  • rn/breaking-change - The PR will be mentioned in the "Breaking Changes" section
  • rn/feature - A new user-facing feature worth mentioning in the release notes
  • rn/bug-fix - A user-facing bug fix worth mentioning in the release notes
  • rn/documentation - A user-facing documentation change worth mentioning in the release notes

In the predict() method of _TransformersWrapper, self.model_config is modified
in place several times, e.g. via update() and pop(). The next prediction call then
sees a different model_config tweaked by the first call.
Since we run multiple predictions during model saving, this also causes the
saved model_config to differ from what the user specified.

Signed-off-by: B-Step62 <yuki.watanabe@databricks.com>
github-actions bot added the area/models (MLmodel format, model serialization/deserialization, flavors) and rn/bug-fix (Mention under Bug Fixes in Changelogs) labels on Feb 5, 2024
github-actions bot commented Feb 5, 2024

Documentation preview for 43e1642 will be available here when this CircleCI job completes successfully.


data=example,
model_config=model_config,
params=params,
flavor_config=flavor_config,
@B-Step62 (Collaborator, Author) commented Feb 5, 2024


NB: This is a fix for another small bug: flavor_config needs to be passed for inferring the signature (otherwise logging an InstructionalPipeline with input_example will raise an NPE).

# We need a deep copy of the original model_config that was specified by the user; otherwise
# the prediction won't be idempotent. Hence we create an immutable dictionary of the original
# model config here and enforce creating a deep copy at every predict call.
self.model_config = MappingProxyType(model_config or {})
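
As a quick illustration of why the read-only wrapper helps (standard-library behavior, not MLflow code): a MappingProxyType view rejects in-place mutation, so any code path that previously mutated the stored config must now operate on its own mutable copy.

from types import MappingProxyType

model_config = MappingProxyType({"top_k": 2, "max_length": 30})

try:
    model_config["top_k"] = 3  # in-place override is no longer possible
except TypeError as err:
    print(err)  # 'mappingproxy' object does not support item assignment

# Each predict call instead merges overrides into a fresh mutable copy, e.g.:
effective_config = {**model_config, **{"top_k": 3}}
assert effective_config["top_k"] == 3   # override applies to the copy only
assert model_config["top_k"] == 2       # original stays intact
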
Member

nice use of the read-only dict wrapper!

@BenWilson2 (Member) left a comment

LGTM! Great fix for an insidious bug. Marked for inclusion in 2.10.1 patch

BenWilson2 and others added 3 commits February 5, 2024 13:43
Signed-off-by: Ben Wilson <39283302+BenWilson2@users.noreply.github.com>
Co-authored-by: Ben Wilson <39283302+BenWilson2@users.noreply.github.com>
Signed-off-by: Yuki Watanabe <31463517+B-Step62@users.noreply.github.com>
Signed-off-by: B-Step62 <yuki.watanabe@databricks.com>
@harupy (Member) left a comment

LGTM!

@B-Step62 merged commit 19681e4 into mlflow:master on Feb 6, 2024
61 checks passed
@B-Step62 deleted the fix-model-config-side-effect branch on February 6, 2024 01:50
daniellok-db pushed a commit to daniellok-db/mlflow that referenced this pull request Feb 6, 2024
…lflow#11014)

Signed-off-by: B-Step62 <yuki.watanabe@databricks.com>
Signed-off-by: Ben Wilson <39283302+BenWilson2@users.noreply.github.com>
Signed-off-by: Yuki Watanabe <31463517+B-Step62@users.noreply.github.com>
Co-authored-by: Ben Wilson <39283302+BenWilson2@users.noreply.github.com>
daniellok-db pushed a commit that referenced this pull request Feb 6, 2024
…11014)

Signed-off-by: B-Step62 <yuki.watanabe@databricks.com>
Signed-off-by: Ben Wilson <39283302+BenWilson2@users.noreply.github.com>
Signed-off-by: Yuki Watanabe <31463517+B-Step62@users.noreply.github.com>
Co-authored-by: Ben Wilson <39283302+BenWilson2@users.noreply.github.com>
lu-wang-dl pushed a commit to lu-wang-dl/mlflow that referenced this pull request Feb 6, 2024
…lflow#11014)

Signed-off-by: B-Step62 <yuki.watanabe@databricks.com>
Signed-off-by: Ben Wilson <39283302+BenWilson2@users.noreply.github.com>
Signed-off-by: Yuki Watanabe <31463517+B-Step62@users.noreply.github.com>
Co-authored-by: Ben Wilson <39283302+BenWilson2@users.noreply.github.com>
Signed-off-by: lu-wang-dl <lu.wang@databricks.com>
Labels
  • area/models: MLmodel format, model serialization/deserialization, flavors
  • rn/bug-fix: Mention under Bug Fixes in Changelogs

3 participants