
TFLite inference with multiple inputs and outputs on Android #70682

Open
panhu opened this issue Jul 1, 2024 · 0 comments
Labels
2.6.0 comp:lite TF Lite related issues type:bug Bug

panhu commented Jul 1, 2024

System information

  • Android Device
  • TensorFlow Lite in Play Services SDK version:
    implementation 'org.tensorflow:tensorflow-lite:2.6.0'
    implementation 'org.tensorflow:tensorflow-lite-select-tf-ops:2.6.0'
    implementation 'org.tensorflow:tensorflow-lite-support:0.4.4'
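One thing worth checking in the dependency list above: the support library 0.4.4 is considerably newer than the 2.6.0 runtime, and mixing artifacts released years apart can fail at runtime rather than at build time. As an assumption (the version numbers here are illustrative, not verified against this project), a set released around the same time would look like:

```gradle
dependencies {
    // Keep the runtime and the select-ops delegate on the same version
    implementation 'org.tensorflow:tensorflow-lite:2.13.0'
    implementation 'org.tensorflow:tensorflow-lite-select-tf-ops:2.13.0'
    // The support library is versioned separately; 0.4.4 is from the same period
    implementation 'org.tensorflow:tensorflow-lite-support:0.4.4'
}
```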

The code is as follows:

```java
import android.content.Context;
import android.content.res.AssetFileDescriptor;
import android.os.Bundle;
import androidx.appcompat.app.AppCompatActivity;
import org.tensorflow.lite.Interpreter;

import java.io.FileInputStream;
import java.io.IOException;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;
import java.util.Arrays;
import java.util.HashMap;
import java.util.Map;

public class DenoiseDemo extends AppCompatActivity {

    private Interpreter tflite;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        try {
            // Initialize the TensorFlow Lite interpreter
            tflite = new Interpreter(loadModelFile(this, "multi_io_model.tflite"));

            // Sample inference data: randomly generated input arrays
            float[][] input1 = new float[1][3];
            float[][] input2 = new float[1][3];
            for (int i = 0; i < 3; i++) {
                input1[0][i] = (float) Math.random();
                input2[0][i] = (float) Math.random();
            }

            // Print the input data
            System.out.println("Input 1: " + Arrays.toString(input1[0]));
            System.out.println("Input 2: " + Arrays.toString(input2[0]));

            // Create the output data structures
            float[] output1 = new float[1]; // output as a 1-D array
            float[] output2 = new float[1]; // output as a 1-D array

            Map<Integer, Object> outputs = new HashMap<>();
            outputs.put(0, output1);
            outputs.put(1, output2);

            // Run model inference
            Object[] inputs = new Object[]{input1, input2};
            tflite.runForMultipleInputsOutputs(inputs, outputs);

            // Print the inference results
            System.out.println("Output 1: " + Arrays.toString(output1));
            System.out.println("Output 2: " + Arrays.toString(output2));

        } catch (Exception e) {
            e.printStackTrace();
            System.out.println("Exception caught during model execution: " + e.getMessage());
        }
    }

    // Load the model file from the assets folder
    private MappedByteBuffer loadModelFile(Context context, String modelFileName) throws IOException {
        AssetFileDescriptor fileDescriptor = context.getAssets().openFd(modelFileName);
        FileInputStream inputStream = new FileInputStream(fileDescriptor.getFileDescriptor());
        FileChannel fileChannel = inputStream.getChannel();
        long startOffset = fileDescriptor.getStartOffset();
        long declaredLength = fileDescriptor.getDeclaredLength();
        return fileChannel.map(FileChannel.MapMode.READ_ONLY, startOffset, declaredLength);
    }

    @Override
    protected void onDestroy() {
        super.onDestroy();
        if (tflite != null) {
            tflite.close();
        }
    }
}
```

Inference works correctly in Python, but when running on Android the outputs are never printed; only the following appears in the log:

```
System.err com.autel.DenoiseDemo W(warning) at org.tensorflow.lite.InterpreterImpl.runForMultipleInputsOutputs(InterpreterImpl.java:101)
System.err com.autel.DenoiseDemo W at org.tensorflow.lite.Interpreter.runForMultipleInputsOutputs(Interpreter.java:95)
```

Why can't the output results be displayed?
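One common reason `runForMultipleInputsOutputs` throws (which those truncated `System.err` frames suggest happened here) is that the Java output buffers do not match the shapes the model actually reports. If the model's outputs have shape `[1, 1]`, the buffers must be `float[1][1]`, not `float[1]`, and the mismatch raises an `IllegalArgumentException` before any result is written, so the printlns after the call never run. A minimal sketch of sizing a buffer to an arbitrary shape, where the `allocateFor` helper is hypothetical (not part of the TFLite API) and on Android the shape would come from `Interpreter.getOutputTensor(i).shape()`:

```java
import java.lang.reflect.Array;

public class OutputBufferDemo {
    // Hypothetical helper: allocate a nested float array matching a tensor
    // shape, e.g. {1, 1} -> float[1][1], {1, 3} -> float[1][3].
    static Object allocateFor(int[] shape) {
        return Array.newInstance(float.class, shape);
    }

    public static void main(String[] args) {
        // On Android the shape would come from tflite.getOutputTensor(0).shape()
        float[][] out = (float[][]) allocateFor(new int[]{1, 1});
        System.out.println(out.length + "x" + out[0].length); // prints "1x1"
    }
}
```

Printing `e.toString()` instead of only `e.getMessage()` in the catch block would also show the exception class and the exact dimension mismatch the interpreter complains about.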
