
TensorFlow Lite Inference Crash with tf.reverse(x, axis=[]) #62679

Open
ganler opened this issue Dec 22, 2023 · 6 comments
Labels
comp:lite (TF Lite related issues) · TFLiteConverter (For issues related to TFLite converter)

Comments

@ganler
Contributor
ganler commented Dec 22, 2023

1. System information

  • OS Platform and Distribution (e.g., Linux Ubuntu 16.04): Ubuntu 22
  • TensorFlow installation (pip package or built from source): pip install tf-nightly (Python 3.9)
  • TensorFlow library (version, if pip package or github SHA, if built from source):
pip show tf-nightly                                           
Name: tf-nightly
Version: 2.16.0.dev20231221
...

2. Code

Colab link: https://colab.research.google.com/drive/1gAsclHMWEf9in0wkF-y1nIbbFrh1m11V?usp=sharing

import tensorflow as tf

tf.reverse(tf.ones((1,), dtype=tf.float32), [])  # no problem

class Foo(tf.Module):
    @tf.function(input_signature=[tf.TensorSpec(shape=[None], dtype=tf.float32)])
    def reverse(self, x):
        # works fine if axis = [0]
        # crashes if axis = []
        return tf.reverse(x, axis=[])

foo = Foo()
converter = tf.lite.TFLiteConverter.from_concrete_functions(
    funcs=[foo.reverse.get_concrete_function()],
    trackable_obj=foo,
)

tflite_model = converter.convert()
interpreter = tf.lite.Interpreter(model_content=tflite_model)
# crash
interpreter.get_signature_runner()(x=tf.ones((1,), dtype=tf.float32))

3. Failure after conversion

The conversion succeeds and the model is fully valid, but the converted model crashes at inference time.

@ganler ganler added the TFLiteConverter For issues related to TFLite converter label Dec 22, 2023
@LakshmiKalaKadali LakshmiKalaKadali added the comp:lite TF Lite related issues label Dec 27, 2023
@LakshmiKalaKadali
Contributor

Hi @pkgoogle
I have reproduced the issue in both tf-nightly and the stable version (TF 2.15). The session crashed in both versions. Here is the gist. Please look into the issue.
Thank You

@pkgoogle

I am able to replicate with the same gist. I should note that tf.reverse(x, axis=[]) is effectively a no-op. @ganler, is that what you intended? That is, you are asking to reverse along no axes.
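
As a quick illustration (a sketch, not from the original report): in eager mode an empty axis list is accepted and simply returns the input unchanged.

import tensorflow as tf

x = tf.constant([1.0, 2.0, 3.0])
y = tf.reverse(x, axis=[])  # no axes to reverse, so this is a no-op
print(tf.reduce_all(tf.equal(x, y)).numpy())  # True: output equals input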

@pkgoogle pkgoogle added the stat:awaiting response Status - Awaiting response from author label Dec 27, 2023
@ganler
Contributor Author
ganler commented Dec 27, 2023

Yes, the usage leading to the crash is a reverse over no axes, which is a no-op. I am reporting it as a bug because it unexpectedly crashes the Python process, which is inconvenient in an automated model-generation pipeline. It would be better to simply eliminate the no-op and move on, rather than crash without an error message. :)
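
Until the converter eliminates the no-op itself, one possible user-side workaround (a sketch only; reverse_or_identity is a hypothetical helper, not a TensorFlow API) is to drop the empty-axis reverse before tracing, so the problematic op never reaches the TFLite model:

import tensorflow as tf

def reverse_or_identity(x, axis):
    # Hypothetical helper: an empty axis list means "reverse nothing",
    # so return the input and avoid emitting the op that crashes the
    # TFLite interpreter. The check runs at trace time on a Python list.
    return tf.reverse(x, axis=axis) if axis else tf.identity(x)

class Foo(tf.Module):
    @tf.function(input_signature=[tf.TensorSpec(shape=[None], dtype=tf.float32)])
    def reverse(self, x):
        return reverse_or_identity(x, axis=[])

foo = Foo()
converter = tf.lite.TFLiteConverter.from_concrete_functions(
    funcs=[foo.reverse.get_concrete_function()],
    trackable_obj=foo,
)
tflite_model = converter.convert()
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.get_signature_runner()(x=tf.ones((1,), dtype=tf.float32))  # no crash expected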

@google-ml-butler google-ml-butler bot removed the stat:awaiting response Status - Awaiting response from author label Dec 27, 2023
@pkgoogle
pkgoogle commented Dec 28, 2023

No worries, I just wanted to ensure you are able to continue with your work while we investigate this case. @nutsiepully, can you please take a look? Thanks.

@pkgoogle pkgoogle added the stat:awaiting tensorflower Status - Awaiting response from tensorflower label Dec 28, 2023
@sawantkumar sawantkumar self-assigned this Jun 12, 2024
@sawantkumar
sawantkumar commented Jun 12, 2024

Hi @ganler, if you have access to a Linux system, you may be able to resolve your issue by using AI-Edge-Torch; you can find more information here: googleblog.

I have created a simple script for converting your model:

import torch
import torch.nn as nn
import ai_edge_torch

class Foo(nn.Module):
    def __init__(self):
        super(Foo, self).__init__()

    def forward(self, x):
        return torch.flip(x, dims=[])

foo = Foo()

sample_input = (torch.randn(1),)

# Convert the model using AI Edge Torch
edge_model = ai_edge_torch.convert(foo.eval(), sample_input)

# Export the model to TFLite format
edge_model.export('foo_model.tflite')

If you want, you can also try visualizing the result in model-explorer.

Please try this out and let us know if it resolves your issue. If you still need further help, feel free to open a new issue in the respective repo.
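
For completeness, one way to sanity-check the exported file is to load it back with the TFLite interpreter (a sketch only; it assumes the foo_model.tflite produced by the script above):

import numpy as np
import tensorflow as tf

# Load the model exported by the AI-Edge-Torch script above.
interpreter = tf.lite.Interpreter(model_path='foo_model.tflite')
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Feed a dummy input matching the model's expected shape and dtype.
x = np.ones(input_details[0]['shape'], dtype=np.float32)
interpreter.set_tensor(input_details[0]['index'], x)
interpreter.invoke()
print(interpreter.get_tensor(output_details[0]['index']))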

@sawantkumar sawantkumar added the stat:awaiting response Status - Awaiting response from author label Jun 18, 2024

This issue is stale because it has been open for 7 days with no activity. It will be closed if no further activity occurs. Thank you.

@github-actions github-actions bot added the stale This label marks the issue/pr stale - to be closed automatically if no activity label Jun 26, 2024
@pkgoogle pkgoogle removed the stat:awaiting tensorflower Status - Awaiting response from tensorflower label Jun 26, 2024
@pkgoogle pkgoogle removed their assignment Jun 26, 2024
@github-actions github-actions bot removed stale This label marks the issue/pr stale - to be closed automatically if no activity stat:awaiting response Status - Awaiting response from author labels Jun 27, 2024