Fairness indicator metrics do not show up #138
Comments
Hi zywind, try configuring the metrics through `metrics_specs`:

    metrics_specs=[
        tfma.MetricsSpec(
            metrics=[
                tfma.MetricConfig(class_name='ExampleCount'),
                tfma.MetricConfig(class_name='BinaryAccuracy'),
                tfma.MetricConfig(class_name='FairnessIndicators',
                                  config='{"thresholds": [0.3, 0.5, 0.7]}'),
            ]
        )
    ]

This syntax should work with both the TFX and non-TFX examples you've pasted. Also, could you point me to the guide you have been using for Fairness Indicators so that I could capture it there?
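For reference (not part of the original comment), here is a minimal sketch of how that `metrics_specs` block plugs into a standalone `tfma.run_model_analysis` call; the label key, slice feature, and paths are placeholders, not values from this issue:

```python
import tensorflow_model_analysis as tfma

# Sketch: the metrics_specs suggested above, wired into a full EvalConfig for a
# direct (non-TFX) evaluation run. Label key, slice feature and paths are placeholders.
eval_config = tfma.EvalConfig(
    model_specs=[tfma.ModelSpec(label_key='label')],              # placeholder label key
    metrics_specs=[
        tfma.MetricsSpec(metrics=[
            tfma.MetricConfig(class_name='ExampleCount'),
            tfma.MetricConfig(class_name='BinaryAccuracy'),
            tfma.MetricConfig(class_name='FairnessIndicators',
                              config='{"thresholds": [0.3, 0.5, 0.7]}'),
        ])
    ],
    slicing_specs=[
        tfma.SlicingSpec(),                                       # overall slice
        tfma.SlicingSpec(feature_keys=['gender']),                # placeholder slice feature
    ],
)

eval_shared_model = tfma.default_eval_shared_model(
    eval_saved_model_path='/path/to/saved_model',                 # placeholder path
    eval_config=eval_config)

eval_result = tfma.run_model_analysis(
    eval_shared_model=eval_shared_model,
    eval_config=eval_config,
    data_location='/path/to/eval_data.tfrecord',                  # placeholder path
    output_path='/path/to/output')                                # placeholder path
```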
Thank you @kumarpiyush for the information. I tried your suggestion, but I get a different error; this might be because I'm using an outdated TFMA (v0.26). My company will update our TFMA soon, so I will test this again later. As for the guide, I was just following the standard guide here: https://www.tensorflow.org/tfx/guide/fairness_indicators#compute_fairness_metrics. It suggests using the add_metrics_callbacks parameter. For TFX, there is no guide; I just found the fairness_indicator_thresholds parameter in the Evaluator's API. Adding a guide for TFX's Evaluator would be very useful.
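For readers without the links handy, a rough sketch of the two approaches mentioned in this comment follows; this is not the reporter's actual code, and paths, thresholds, and upstream component handles are placeholders:

```python
import tensorflow_model_analysis as tfma
from tfx.components import Evaluator

# Approach 1 (TFX): the fairness_indicator_thresholds parameter on the Evaluator
# component, as referenced in this comment. Upstream components are placeholders.
evaluator = Evaluator(
    examples=example_gen.outputs['examples'],
    model=trainer.outputs['model'],
    fairness_indicator_thresholds=[0.3, 0.5, 0.7])

# Approach 2 (without TFX): the add_metrics_callbacks pattern from the linked
# guide, which goes through the EvalSavedModel / estimator workflow.
eval_shared_model = tfma.default_eval_shared_model(
    eval_saved_model_path='/path/to/eval_saved_model',    # placeholder path
    add_metrics_callbacks=[
        tfma.post_export_metrics.fairness_indicators(thresholds=[0.3, 0.5, 0.7])
    ])
eval_result = tfma.run_model_analysis(
    eval_shared_model=eval_shared_model,
    data_location='/path/to/eval_data.tfrecord')           # placeholder path
```

Note that the add_metrics_callbacks path is tied to the older EvalSavedModel/estimator workflow, which may be why it does not apply cleanly to a native TF2/Keras model like the one described in this issue.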
@zywind are you by any chance applying transformations to your label? I experienced a similar issue when transforming string labels to one-hot encoded arrays. In your case, you are using the output from the ExampleGen component rather than the Transform component. If your labels are strings, then your error is probably generated by the one_hot function in tensorflow_model_analysis/metrics/metric_util.py. To solve this, you should take the output of the Transform component rather than the ExampleGen component. Again, I don't know your full situation, but this might be a possible reason your pipeline is failing.
@DirkjanVerdoorn Thanks for the suggestion. The labels are integers, so that's not the problem.
Just to follow up on this: our environment is now updated to TFMA 0.31, and I can confirm that the fairness_indicator_thresholds parameter in Evaluator still doesn't work, but @kumarpiyush's method does.
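For anyone landing here later, a minimal sketch of the setup reported to work: pass the FairnessIndicators MetricConfig to the TFX Evaluator through eval_config rather than fairness_indicator_thresholds. The label key, slice feature, and upstream component handles are placeholders:

```python
import tensorflow_model_analysis as tfma
from tfx.components import Evaluator

# Sketch: FairnessIndicators configured via eval_config / metrics_specs and
# passed to the TFX Evaluator. Label key, slice feature and upstream components
# are placeholders, not values from this issue.
eval_config = tfma.EvalConfig(
    model_specs=[tfma.ModelSpec(label_key='label')],              # placeholder
    metrics_specs=[
        tfma.MetricsSpec(metrics=[
            tfma.MetricConfig(class_name='ExampleCount'),
            tfma.MetricConfig(class_name='BinaryAccuracy'),
            tfma.MetricConfig(class_name='FairnessIndicators',
                              config='{"thresholds": [0.3, 0.5, 0.7]}'),
        ])
    ],
    slicing_specs=[tfma.SlicingSpec(),
                   tfma.SlicingSpec(feature_keys=['gender'])],    # placeholder
)

evaluator = Evaluator(
    examples=example_gen.outputs['examples'],   # placeholder upstream components
    model=trainer.outputs['model'],
    eval_config=eval_config)
```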
System information
Have I written custom code (as opposed to using a stock example script provided in TensorFlow Model Analysis): Yes
First method using TFX
Second method without TFX
Describe the problem
I am trying to display fairness metrics with a TF2-based model, but for some reason the fairness metrics (false discovery rate, false positive rate, etc.) do not show up in eval_result or in the fairness indicator widget (see screenshot below). This happens both when I use TFX's Evaluator component and when I run TFMA directly. Is it a bug, or am I doing something wrong?
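For completeness, a minimal sketch (assuming the output directory of the Evaluator or run_model_analysis call, which is a placeholder here) of how results are typically loaded and the Fairness Indicators widget rendered in a notebook:

```python
import tensorflow_model_analysis as tfma
from tensorflow_model_analysis.addons.fairness.view import widget_view

# Load the evaluation results and render the Fairness Indicators widget.
# If FairnessIndicators was never computed, the fairness metrics simply
# will not appear in the widget.
eval_result = tfma.load_eval_result('/path/to/output')   # placeholder path
widget_view.render_fairness_indicator(eval_result=eval_result)
```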