1. System information
OS Platform and Distribution (e.g., Linux Ubuntu 16.04): Ubuntu 20.04
TensorFlow installation (pip package or built from source): pip package
TensorFlow library (version, if pip package or github SHA, if built from source): 2.5.0
2. Code
Provide code to help us reproduce your issues using one of the following options:
def tokenize(self, x):
  # Lowercase / case-fold the input strings.
  x = tf_text.case_fold_utf8(x)
  # Decompose to NFD so accents become separate combining marks,
  # then strip the combining marks (\p{Mn}) and replace control /
  # format characters with spaces.
  x = tf_text.normalize_utf8(x, "NFD")
  x = tf.strings.regex_replace(x, r"\p{Mn}", "")
  x = tf.strings.regex_replace(x, r"\p{Cc}|\p{Cf}", " ")
  # Split on the delimiter pattern; returns a RaggedTensor of tokens
  # plus begin/end offsets (offsets are discarded here).
  x, _, _ = tf_text.regex_split_with_offsets(
      x, self._delim_regex_pattern, self._keep_delim_regex_pattern,
      "BasicTokenizer")
  return x
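The case-folding and accent-stripping steps above can be illustrated in pure Python with the standard library's unicodedata module. This is only a rough stand-in for tf_text.case_fold_utf8, tf_text.normalize_utf8, and the \p{Mn} regex replace; the function name is illustrative and not part of the model:

```python
import unicodedata

def fold_and_strip_accents(text):
    # Lowercase (approximates case_fold_utf8), then decompose to NFD
    # so accents become separate combining-mark code points.
    decomposed = unicodedata.normalize("NFD", text.lower())
    # Drop combining marks (Unicode category Mn), mirroring the
    # regex_replace of r"\p{Mn}" in the tokenizer above.
    return "".join(c for c in decomposed if unicodedata.category(c) != "Mn")

print(fold_and_strip_accents("Café Über"))  # -> "cafe uber"
```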
Testing with simpler delim and keep_delim inputs also results in the conversion error.
The full model is in the following Colab gist: [TensorFlow Model, and tflite conversion Colab].
In the gist, the model is shown to work before and after saving to SavedModel.
The code attempts to convert to tflite from SavedModel.
3. Failure during conversion
TFLite fails to convert when the model contains the regex_split_with_offsets op.
tensorflow_text is imported.
The Select TF Ops (flex) option is enabled, e.g.: converter.target_spec.supported_ops = [ tf.lite.OpsSet.TFLITE_BUILTINS, tf.lite.OpsSet.SELECT_TF_OPS ]
In the example, other tf_text ops run before the problematic regex_split_with_offsets.
When the line with tf_text.regex_split_with_offsets is commented out, conversion succeeds.
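For reference, a minimal end-to-end sketch of the SavedModel-to-TFLite conversion path with Select TF Ops enabled. The trivial Doubler module here is a placeholder for illustration, not the model from the gist:

```python
import tempfile
import tensorflow as tf

class Doubler(tf.Module):
    @tf.function(input_signature=[tf.TensorSpec([None], tf.float32)])
    def __call__(self, x):
        return x * 2.0

module = Doubler()
saved_model_dir = tempfile.mkdtemp()
# Export with an explicit serving signature so the converter can find it.
tf.saved_model.save(module, saved_model_dir,
                    signatures=module.__call__.get_concrete_function())

converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)
# Allow fallback to TF kernels (flex delegate) for ops that have no
# native TFLite builtin.
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,
    tf.lite.OpsSet.SELECT_TF_OPS,
]
tflite_model = converter.convert()
```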
4. (optional) Any other info / logs
The error message says "Graph does not contain node: ". The full trace log is available below and in the gist.
---------------------------------------------------------------------------
Exception Traceback (most recent call last)
/usr/local/lib/python3.7/dist-packages/tensorflow/lite/python/convert.py in toco_convert_protos(model_flags_str, toco_flags_str, input_data_str, debug_info_str, enable_mlir_converter)
293 debug_info_str,
--> 294 enable_mlir_converter)
295 return model_str
/usr/local/lib/python3.7/dist-packages/tensorflow/lite/python/wrap_toco.py in wrapped_toco_convert(model_flags_str, toco_flags_str, input_data_str, debug_info_str, enable_mlir_converter)
37 debug_info_str,
---> 38 enable_mlir_converter)
39
Exception: Graph does not contain node:
During handling of the above exception, another exception occurred:
ConverterError Traceback (most recent call last)
<ipython-input-82-51fa59e54032> in <module>()
5 tf.lite.OpsSet.SELECT_TF_OPS # enable TensorFlow ops.
6 ]
----> 7 tflite_model = converter.convert()
8 with open(tfLite_filepath, 'wb') as f:
9 f.write(tflite_model)
/usr/local/lib/python3.7/dist-packages/tensorflow/lite/python/lite.py in convert(self)
911 converter_kwargs.update(quant_mode.converter_flags())
912
--> 913 result = _convert_saved_model(**converter_kwargs)
914 if self.experimental_new_quantizer:
915 calibrate_and_quantize, flags = quant_mode.quantizer_flags(
/usr/local/lib/python3.7/dist-packages/tensorflow/lite/python/convert.py in convert_saved_model(saved_model_dir, saved_model_version, saved_model_tags, saved_model_exported_names, **kwargs)
725 None, # input_data, unused
726 None, # debug_info_str, unused
--> 727 enable_mlir_converter=True)
728 return data
729
/usr/local/lib/python3.7/dist-packages/tensorflow/lite/python/convert.py in toco_convert_protos(model_flags_str, toco_flags_str, input_data_str, debug_info_str, enable_mlir_converter)
295 return model_str
296 except Exception as e:
--> 297 raise ConverterError(str(e))
298
299 if distutils.spawn.find_executable(_toco_from_proto_bin) is None:
ConverterError: Graph does not contain node:
RaggedTensors are not a simple tensor type. Returned objects must be one of the TensorFlow Lite supported types; returning a RaggedTensor object directly is not supported.
Thank you for the hint. For those interested, I was able to convert to TFLite successfully after adding x = x.to_tensor() on the line after regex_split_with_offsets.
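What to_tensor() does, sketched in plain Python: each ragged row is padded with a default value (0 for numeric tensors, the empty string for string tensors) out to the length of the longest row, producing the rectangular tensor the converter can handle. This is a conceptual sketch, not the TF implementation:

```python
def ragged_to_dense(rows, default=""):
    # Pad every row with `default` to the length of the longest row,
    # mirroring tf.RaggedTensor.to_tensor().
    width = max((len(r) for r in rows), default=0)
    return [r + [default] * (width - len(r)) for r in rows]

tokens = [["hello", "world"], ["hi"]]
print(ragged_to_dense(tokens))  # -> [['hello', 'world'], ['hi', '']]
```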