

Using NnApiDelegate in TFLite 2.11.0 returns same embeddings for all images. It works fine for 2.6.0 #60653

Open
shuaga opened this issue May 22, 2023 · 8 comments
Assignees
Labels
comp:lite TF Lite related issues stat:awaiting tensorflower Status - Awaiting response from tensorflower TF 2.11 Issues related to TF 2.11 TFLiteNNAPIDelegate For issues related to TFLite NNAPI Delegate type:support Support issues

Comments

@shuaga
shuaga commented May 22, 2023

System information

  • Have I written custom code (as opposed to using a stock example script provided in TensorFlow): Yes
  • OS Platform and Distribution (e.g., Linux Ubuntu 16.04): macOS 13.3.1
  • Mobile device (e.g. iPhone 8, Pixel 2, Samsung Galaxy) if the issue happens on a mobile device: Xiaomi Poco F1
  • TensorFlow Lite version: 2.11.0

Describe the problem

In my Android app I'm using the FaceNet model to recognize faces, and I have added an NnApiDelegate to the interpreter options.
The app worked correctly with TFLite 2.6.0. After upgrading to 2.10.0 or 2.11.0, the model returns the same embeddings for every image I provide.
Removing the NnApiDelegate restores correct output in 2.11.0, but it slows down face recognition considerably, so I would prefer not to remove it.
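One way to make the symptom concrete is to run the same preprocessed input through a CPU-only interpreter and an NNAPI-delegated one and compare the outputs. A minimal sketch, assuming a 1x512 float embedding output (the exact tensor shape depends on the FaceNet variant in use):

```kotlin
import java.nio.MappedByteBuffer
import org.tensorflow.lite.Interpreter
import org.tensorflow.lite.nnapi.NnApiDelegate

// Runs one inference and returns the embedding. The 1x512 output shape is an
// assumption about this FaceNet model; adjust it to the real output tensor.
fun embed(model: MappedByteBuffer, input: Any, useNnapi: Boolean): FloatArray {
    val options = Interpreter.Options()
    val delegate = if (useNnapi) NnApiDelegate().also { options.addDelegate(it) } else null
    val interpreter = Interpreter(model, options)
    val output = Array(1) { FloatArray(512) }
    interpreter.run(input, output)
    interpreter.close()
    delegate?.close()
    return output[0]
}
```

With the bug described above, `embed(model, image, useNnapi = true)` returns the same array for different face crops, while `useNnapi = false` returns distinct embeddings.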

Source code / logs

Code for setting up the interpreter in Kotlin:

val interpreterOptions = Interpreter.Options()
interpreterOptions.addDelegate(NnApiDelegate())
interpreter = Interpreter(FileUtil.loadMappedFile(context, model.assetsFilename), interpreterOptions)
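Since NnApiDelegate holds native resources, it helps to keep a reference to it so it can be released together with the interpreter. A hedged sketch (field and function names are illustrative, not from the app):

```kotlin
import android.content.Context
import org.tensorflow.lite.Interpreter
import org.tensorflow.lite.nnapi.NnApiDelegate
import org.tensorflow.lite.support.common.FileUtil

private var nnApiDelegate: NnApiDelegate? = null
private var interpreter: Interpreter? = null

fun setUpInterpreter(context: Context, assetsFilename: String) {
    val options = Interpreter.Options()
    nnApiDelegate = NnApiDelegate().also { options.addDelegate(it) }
    interpreter = Interpreter(FileUtil.loadMappedFile(context, assetsFilename), options)
}

fun tearDownInterpreter() {
    interpreter?.close()   // close the interpreter before its delegates
    interpreter = null
    nnApiDelegate?.close()
    nnApiDelegate = null
}
```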

@synandi synandi added type:support Support issues comp:lite TF Lite related issues TFLiteNNAPIDelegate For issues related to TFLite NNAPI Delegate TF 2.11 Issues related to TF 2.11 labels May 22, 2023
@tilakrayal tilakrayal assigned pjpratik and unassigned synandi May 23, 2023
@pjpratik
Contributor

Hi @shuaga

Can you try benchmarking and share the results?

Also, can you try with TF 2.12 and nightly and let us know if you are observing the same behaviour?

Thanks.

@pjpratik pjpratik added the stat:awaiting response Status - Awaiting response from author label May 23, 2023
@shuaga
Author
shuaga commented May 24, 2023

Hi @pjpratik

I tried with TF 2.12 and the same behaviour was reproduced.

Benchmarking steps done

adb install -r -d -g android_aarch64_benchmark_model.apk
adb push facenet.tflite /data/local/tmp
adb logcat -c
adb shell am start -S -n org.tensorflow.lite.benchmark/.BenchmarkModelActivity --es args '"--graph=/data/local/tmp/facenet.tflite --num_threads=4"'
adb logcat | grep "Inference timings"

The result of this was -
05-24 12:57:09.622 4844 4844 I tflite : Inference timings in us: Init: 25648, First inference: 539451, Warmup (avg): 539451, Inference (avg): 530406

Next -
adb logcat -c
adb shell am start -S -n org.tensorflow.lite.benchmark/.BenchmarkModelActivity --es args '"--graph=/data/local/tmp/facenet.tflite --num_threads=4 --use_nnapi=true"'
adb logcat | grep "Inference timings"

The result this time was -
05-24 12:59:17.100 5300 5300 I tflite : Inference timings in us: Init: 1629033, First inference: 54704, Warmup (avg): 53009.9, Inference (avg): 52862.3

Below are logs containing 'NNAPI' from the logcat:

05-24 12:59:12.259 5300 5300 I tflite_BenchmarkModelActivity: Running TensorFlow Lite benchmark with args: --graph=/data/local/tmp/facenet.tflite --num_threads=4 --use_nnapi=true

05-24 12:59:12.268 5300 5300 I tflite : Use NNAPI: [1]

05-24 12:59:12.280 5300 5300 I tflite : NNAPI accelerators available: [qti-default,qti-dsp,qti-gpu,nnapi-reference]

05-24 12:59:12.288 5300 5300 I tflite : Created TensorFlow Lite delegate for NNAPI.

05-24 12:59:12.288 5300 5300 I tflite : NNAPI delegate created.

05-24 12:59:12.289 5300 5300 W tflite : NNAPI SL driver did not implement SL_ANeuralNetworksDiagnostic_registerCallbacks!

05-24 12:59:12.290 5300 5300 I TypeManager: Failed to read /vendor/etc/nnapi_extensions_app_allowlist ; No app allowlisted for vendor extensions use.

05-24 12:59:12.755 5300 5300 I tflite : Replacing 179 out of 181 node(s) with delegate (TfLiteNnapiDelegate) node, yielding 2 partitions for the whole graph.

05-24 12:59:12.755 5300 5300 W tflite : NNAPI SL driver did not implement SL_ANeuralNetworksDiagnostic_registerCallbacks!

05-24 12:59:13.889 5300 5300 I tflite : Explicitly applied NNAPI delegate, and the model graph will be partially executed by the delegate w/ 1 delegate kernels.
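The accelerator list in the logs suggests a way to narrow this down: pinning the delegate to one accelerator at a time. A minimal sketch, assuming the NnApiDelegate.Options API available in recent TFLite Android releases (accelerator names are taken from the log above):

```kotlin
import org.tensorflow.lite.Interpreter
import org.tensorflow.lite.nnapi.NnApiDelegate

// Pin NNAPI to the reference (CPU) implementation first; if embeddings become
// correct, the problem is likely in a vendor driver (qti-default / qti-dsp /
// qti-gpu) rather than in the delegate itself.
val nnApiOptions = NnApiDelegate.Options()
    .setAcceleratorName("nnapi-reference") // then try "qti-dsp", "qti-gpu", ...
val nnApiDelegate = NnApiDelegate(nnApiOptions)
val interpreterOptions = Interpreter.Options().addDelegate(nnApiDelegate)
```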

Next steps

Do let me know if anything else is needed.

@google-ml-butler google-ml-butler bot removed the stat:awaiting response Status - Awaiting response from author label May 24, 2023
@pjpratik
Contributor

@shuaga Thanks for the information.

@pkgoogle Could you please look into this issue?

@pjpratik pjpratik assigned pkgoogle and unassigned pjpratik May 24, 2023
@pkgoogle

Sure thing, I'll take a look at this.

@pkgoogle
pkgoogle commented Jun 6, 2023

Hi @shuaga, can you upload the .tflite file that encountered this issue? The smaller the better but anything that reproduces it will be fine.

@pkgoogle pkgoogle added the stat:awaiting response Status - Awaiting response from author label Jun 6, 2023
@shuaga
Author
shuaga commented Jun 9, 2023

This is the one that encountered the issue.

facenet_from_github.tflite.zip

@google-ml-butler google-ml-butler bot removed the stat:awaiting response Status - Awaiting response from author label Jun 9, 2023
@shuaga
Author
shuaga commented Jun 9, 2023

Below are the library combinations I've tried, and which ones work and which don't --

Doesn't work --
implementation 'org.tensorflow:tensorflow-lite:2.12.0'
implementation 'org.tensorflow:tensorflow-lite-gpu:2.12.0'
implementation 'org.tensorflow:tensorflow-lite-gpu-api:2.12.0'
implementation 'org.tensorflow:tensorflow-lite-support:0.2.0'

Doesn't work --
implementation 'org.tensorflow:tensorflow-lite:2.12.0'
implementation 'org.tensorflow:tensorflow-lite-gpu:2.12.0'
implementation 'org.tensorflow:tensorflow-lite-gpu-api:2.12.0'
implementation 'org.tensorflow:tensorflow-lite-support:0.4.3'

Works --
implementation 'org.tensorflow:tensorflow-lite:2.6.0'
implementation 'org.tensorflow:tensorflow-lite-gpu:2.6.0'
implementation 'org.tensorflow:tensorflow-lite-gpu-api:2.12.0'
implementation 'org.tensorflow:tensorflow-lite-support:0.2.0'

@pkgoogle
pkgoogle commented Jun 9, 2023

Hi @sirakiin, can you please take a look at this?

@pkgoogle pkgoogle added the stat:awaiting tensorflower Status - Awaiting response from tensorflower label Jun 9, 2023