Cannot load saved tflite model when model contains dropout. However quantization-aware model is fine. #51848
Labels: comp:lite, stale, stat:awaiting response, TF 2.5, type:bug
System information
Describe the current behavior
If a model contains dropout layers, it can be converted to tflite, but the resulting tflite model cannot be loaded. The error message is:
Confusingly, if I make the model quantization-aware, it can be successfully converted to tflite and loaded. Does QAT automatically remove dropout?
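For context, a minimal sketch of the QAT path that works (assuming the TensorFlow Model Optimization toolkit and a toy model standing in for the real one):

```python
import tensorflow as tf
import tensorflow_model_optimization as tfmot

# Toy model with a dropout layer (stand-in for the actual model).
model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation="relu", input_shape=(8,)),
    tf.keras.layers.Dropout(0.5),
    tf.keras.layers.Dense(1),
])

# Wrap the model for quantization-aware training.
q_aware_model = tfmot.quantization.keras.quantize_model(model)

# Converting the QAT-wrapped model succeeds, and the result loads fine.
converter = tf.lite.TFLiteConverter.from_keras_model(q_aware_model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()
```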
Describe the expected behavior
If tflite models that contain dropout are not supported, then the desired behaviors are:
Standalone code to reproduce the issue
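The original snippet is not preserved here; a minimal sketch of the failing path described above (toy model, all names assumed) would be:

```python
import tensorflow as tf

# Toy model containing a dropout layer.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation="relu", input_shape=(8,)),
    tf.keras.layers.Dropout(0.5),
    tf.keras.layers.Dense(1),
])

# Conversion itself succeeds.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()

# Save the flatbuffer, then try to load it back.
with open("model.tflite", "wb") as f:
    f.write(tflite_model)

interpreter = tf.lite.Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()  # loading reportedly fails here
```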
I'm wondering how I should deal with dropout layers in tflite. Since dropout is essential during training, how can I export a trained model without it? This is a similar issue, and I'm not satisfied with the workaround proposed there.
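(For illustration only, one possible way to export a trained model without dropout — not necessarily the workaround referenced above — is to clone the model and swap each Dropout layer for a pass-through:)

```python
import tensorflow as tf

# Hypothetical trained model with dropout (stand-in for the real one).
trained_model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation="relu", input_shape=(8,)),
    tf.keras.layers.Dropout(0.5),
    tf.keras.layers.Dense(1),
])

def strip_dropout(layer):
    # Replace Dropout with an identity op; clone every other layer as-is.
    if isinstance(layer, tf.keras.layers.Dropout):
        return tf.keras.layers.Lambda(lambda x: x)
    return layer.__class__.from_config(layer.get_config())

inference_model = tf.keras.models.clone_model(
    trained_model, clone_function=strip_dropout)
# Dropout and Lambda carry no weights, so the weight lists line up.
inference_model.set_weights(trained_model.get_weights())

converter = tf.lite.TFLiteConverter.from_keras_model(inference_model)
tflite_model = converter.convert()
```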