

Which is the smallest SSD model to use in a MCU like ESP-EYE and using tensorflow lite micro? #50656

Open
VicGoico opened this issue Jul 8, 2021 · 3 comments
Labels
comp:micro Related to TensorFlow Lite Microcontrollers stat:awaiting tensorflower Status - Awaiting response from tensorflower type:others issues not falling in bug, perfromance, support, build and install or feature

Comments

@VicGoico
VicGoico commented Jul 8, 2021

Hi,

I am trying to use the smallest SSD model available in TensorFlow, but my problem is that I am working with an ESP-EYE, and this MCU has only 4 MB of memory.

All the SSD models I found in the TensorFlow pre-trained Model Garden are either too big (more than 4 MB) or do not come with a ".tflite" file.
In the latter case, I saw that there is a way to convert the ".pb" file to ".tflite" HERE, and a video that explains which parameters to use VIDEO, but the resulting files grow a lot.
For example, a 5 MB ".pb" file becomes a 25 MB ".tflite" file, and so on.

Here is an example of the Python code I use for the conversion:

import tensorflow as tf

# Convert the model.
converter = tf.compat.v1.lite.TFLiteConverter.from_frozen_graph(
    graph_def_file='tflite/tflite_graph.pb',  # Path to the ".pb" file
    input_arrays=['normalized_input_image_tensor'],
    # Input image shape; change 300 x 300 to match your model.
    input_shapes={'normalized_input_image_tensor': [1, 300, 300, 3]},
    output_arrays=['TFLite_Detection_PostProcess',
                   'TFLite_Detection_PostProcess:1',
                   'TFLite_Detection_PostProcess:2',
                   'TFLite_Detection_PostProcess:3'],
)
converter.allow_custom_ops = True

# converter.quantized_input_stats = {'normalized_input_image_tensor': (0., 1.)}  # mean, std_dev (input range is [-1, 1])
# converter.inference_type = tf.int8  # this is the recommended type
# converter.inference_input_type = tf.uint8  # optional
# converter.inference_output_type = tf.uint8  # optional
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()

# Save the model.
with open('quantized_model.tflite', 'wb') as f:
    f.write(tflite_model)
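One possible reason the converted file stays large is that `Optimize.DEFAULT` on its own only quantizes weights; without a representative dataset, activations stay in float and the graph keeps dequantize ops. Below is a minimal sketch of full integer quantization, assuming the same frozen graph and tensor names as the script above. The random calibration data and the helper names are placeholders, not anything from TensorFlow's docs; real preprocessed images should be used for calibration, and the custom `TFLite_Detection_PostProcess` op may force you to relax `supported_ops`:

```python
import numpy as np

def representative_dataset():
    """Yield calibration samples for quantization.

    Placeholder: random tensors in [-1, 1] standing in for real
    preprocessed 300x300 images. Replace with actual dataset samples.
    """
    for _ in range(100):
        yield [np.random.uniform(-1.0, 1.0,
                                 size=(1, 300, 300, 3)).astype(np.float32)]

def convert_fully_quantized(pb_path='tflite/tflite_graph.pb'):
    # TensorFlow imported here so the calibration helper above can be
    # exercised without TF installed.
    import tensorflow as tf
    converter = tf.compat.v1.lite.TFLiteConverter.from_frozen_graph(
        graph_def_file=pb_path,
        input_arrays=['normalized_input_image_tensor'],
        input_shapes={'normalized_input_image_tensor': [1, 300, 300, 3]},
        output_arrays=['TFLite_Detection_PostProcess',
                       'TFLite_Detection_PostProcess:1',
                       'TFLite_Detection_PostProcess:2',
                       'TFLite_Detection_PostProcess:3'],
    )
    converter.allow_custom_ops = True
    converter.optimizations = [tf.lite.Optimize.DEFAULT]
    converter.representative_dataset = representative_dataset
    # Request int8 kernels; the custom postprocess op is not quantizable,
    # so strict TFLITE_BUILTINS_INT8 may need to be relaxed.
    converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
    converter.inference_input_type = tf.uint8
    converter.inference_output_type = tf.uint8
    return converter.convert()
```

With full integer quantization, the weights shrink to roughly a quarter of their float size, which is also what the int8 kernels in TFLite Micro expect.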

So my questions are:

  1. Do you know of an SSD model smaller than 4 MB (for example, 3 MB or 2 MB) that can detect people?
  2. Is it possible to run an SSD model and make the "Invoke" (prediction) call on the ESP-EYE, or is that too much work for an MCU?
    I found some documentation, but I'm not sure if it works:

Thanks for reading, and have a good day.

See you XD

@VicGoico VicGoico added the type:others issues not falling in bug, perfromance, support, build and install or feature label Jul 8, 2021
@VicGoico VicGoico changed the title Which is the smallest ssd model to use in a MCU like ESP-EYE and using tensorflow lite micro? Which is the smallest SSD model to use in a MCU like ESP-EYE and using tensorflow lite micro? Jul 8, 2021
@UsharaniPagadala UsharaniPagadala added comp:lite TF Lite related issues comp:micro Related to TensorFlow Lite Microcontrollers TFLiteConverter For issues related to TFLite converter labels Jul 8, 2021
@ymodak
Contributor
ymodak commented Jul 9, 2021

Did you refer to the person detection example for TF Lite Micro to get some pointers?
https://github.com/tensorflow/tflite-micro/blob/main/tensorflow/lite/micro/examples/person_detection/training_a_model.md

@ymodak ymodak added the stat:awaiting response Status - Awaiting response from author label Jul 9, 2021
@VicGoico
Author
VicGoico commented Jul 9, 2021

Hi,

I saw that tutorial, but it explains how to train a classification model that can tell whether there is a person in the image, not how many.
In my case, I would need a very light SSD model (less than 2 MB or even 1 MB) that can detect multiple objects in the same image, specifically people.

@VicGoico
Author

Hi,

My main question is:

Could I run an object detection model on an MCU like the ESP-EYE, which has 4 MB of memory, using TFLiteMicro, or does TFLiteMicro only support object classification models, like this example?

Because all the object detection models I have seen are heavier than 4 MB.
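As a quick sanity check before even flashing, the model file can be compared against the 4 MB budget. This is only a sketch (the function name and budget constant are made up for illustration): the usable budget is actually smaller, since the application binary and the TFLM runtime share the same flash, and the tensor arena for activations must additionally fit in RAM:

```python
import os

# ESP-EYE total flash; the model's real budget is smaller because the
# app binary and TFLM runtime live in the same 4 MB.
FLASH_BUDGET_BYTES = 4 * 1024 * 1024

def fits_in_flash(model_path, budget=FLASH_BUDGET_BYTES):
    """Return True if the .tflite file fits in the given flash budget.

    Only checks stored model size; runtime RAM (the tensor arena for
    intermediate activations) is a separate constraint.
    """
    return os.path.getsize(model_path) < budget
```

So even a 3 MB detection model may be too tight once the application itself is accounted for, which is why sub-2 MB models are the realistic target here.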

See you XD

@tensorflowbutler tensorflowbutler removed the stat:awaiting response Status - Awaiting response from author label Jul 16, 2021
@ymodak ymodak added stat:awaiting tensorflower Status - Awaiting response from tensorflower and removed comp:lite TF Lite related issues TFLiteConverter For issues related to TFLite converter labels Jul 16, 2021

5 participants