
Tflite use USB camera with android image classification app #61445

Open
suyash-narain opened this issue Aug 1, 2023 · 8 comments
Labels: Android, comp:lite (TF Lite related issues), stat:awaiting tensorflower (Status - Awaiting response from tensorflower), TF 2.10, type:bug, type:feature (Feature requests)

Comments

@suyash-narain

System information

  • Have I written custom code (as opposed to using a stock example script
    provided in TensorFlow)
    : No
  • TensorFlow installed from (source or binary): binary
  • TensorFlow version (use command below): 2.10
  • Python version: 3.10

Describe the problem

I am new to Android and am building the TFLite image classification app from tensorflow/examples using Android Studio. I want to use a USB camera instead of the phone's back camera to capture images for classification. How can I achieve that?
What changes do I need to make in CameraFragment.kt so that the app can also search for a connected USB camera? Currently the default app only looks for the back camera, as in this code: https://github.com/tensorflow/examples/blob/0bbf4fe43fbf41b7174b9ce4a64d69bd33aadd21/lite/examples/image_classification/android/app/src/main/java/org/tensorflow/lite/examples/imageclassification/fragments/CameraFragment.kt
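For example, I imagine preferring an external camera with a CameraFilter along these lines (an untested sketch of my own; the helper names are mine, and it assumes the androidx.camera camera2 interop artifact is on the classpath, falling back to the back camera when no external camera is found):

```kotlin
// Untested sketch: prefer a camera that reports the EXTERNAL hardware level
// (typical for USB cameras); otherwise fall back to the back camera.
// Assumes androidx.camera:camera-camera2 is available for the interop classes.
@androidx.annotation.OptIn(ExperimentalCamera2Interop::class)
private fun chooseCameraSelector(provider: ProcessCameraProvider): CameraSelector {
    fun isExternal(info: CameraInfo): Boolean =
        Camera2CameraInfo.from(info).getCameraCharacteristic(
            CameraCharacteristics.INFO_SUPPORTED_HARDWARE_LEVEL
        ) == CameraMetadata.INFO_SUPPORTED_HARDWARE_LEVEL_EXTERNAL

    val hasExternal = provider.availableCameraInfos.any(::isExternal)
    return if (hasExternal) {
        // CameraFilter is a single-method interface: keep only external cameras.
        CameraSelector.Builder()
            .addCameraFilter { cameraInfos -> cameraInfos.filter(::isExternal) }
            .build()
    } else {
        CameraSelector.DEFAULT_BACK_CAMERA
    }
}
```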

thanks

@sushreebarsa sushreebarsa added type:bug Bug comp:lite TF Lite related issues TF 2.10 labels Aug 2, 2023
@sushreebarsa sushreebarsa assigned pjpratik and unassigned sushreebarsa Aug 2, 2023
@pjpratik pjpratik assigned pkgoogle and unassigned pjpratik Aug 2, 2023
@pkgoogle pkgoogle added type:support Support issues Android and removed type:bug Bug labels Aug 2, 2023
@pkgoogle
pkgoogle commented Aug 2, 2023

@pkgoogle pkgoogle added the stat:awaiting response Status - Awaiting response from author label Aug 2, 2023
@github-actions

This issue is stale because it has been open for 7 days with no activity. It will be closed if no further activity occurs. Thank you.

@github-actions github-actions bot added the stale This label marks the issue/pr stale - to be closed automatically if no activity label Aug 10, 2023
@suyash-narain
Author

Hi @pkgoogle

I was trying to replicate https://developer.android.com/training/camerax/configuration#camera-selection and add that snippet to select a cameraId via the CameraX API in the official TFLite image classification example (Kotlin). The build is successful, but the app keeps crashing.
Since CameraX is built on top of the Camera2 API, I assume I can use the Camera2 CameraManager to select the cameraId and pass it into the CameraSelector build via a CameraFilter?
What do you suggest?
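For context, this is roughly how I am enumerating the Camera2 IDs to find the USB camera (a sketch of my own; it assumes it runs inside the example's CameraFragment, with `android.content.Context`, `CameraManager`, and `Log` imported, and the log tag is mine):

```kotlin
// Sketch: enumerate Camera2 camera IDs so the USB camera's ID can be found.
// LENS_FACING may be null for external/USB cameras, so it is logged as-is.
private fun logAvailableCameras() {
    val manager =
        requireContext().getSystemService(Context.CAMERA_SERVICE) as CameraManager
    for (id in manager.cameraIdList) {
        val characteristics = manager.getCameraCharacteristics(id)
        val facing = characteristics.get(CameraCharacteristics.LENS_FACING)
        val level = characteristics.get(
            CameraCharacteristics.INFO_SUPPORTED_HARDWARE_LEVEL
        )
        Log.d("CameraEnum", "id=$id facing=$facing hardwareLevel=$level")
    }
}
```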

@google-ml-butler google-ml-butler bot removed stale This label marks the issue/pr stale - to be closed automatically if no activity stat:awaiting response Status - Awaiting response from author labels Aug 16, 2023
@pkgoogle

Hi @suyash-narain,

It's hard for me to tell exactly what you are doing. Can you share your code, or just the portion that is causing the issue? Glad to hear that your build is successful. Do you have any errors or error logs you can share from the crash? Generally, the more information you share, the faster and more likely I will be able to help. Thanks!

@pkgoogle pkgoogle added the stat:awaiting response Status - Awaiting response from author label Aug 17, 2023
@suyash-narain
Author
suyash-narain commented Aug 17, 2023

Hi @pkgoogle,

My code is sourced from https://github.com/tensorflow/examples/blob/master/lite/examples/image_classification/android/app/src/main/java/org/tensorflow/lite/examples/imageclassification/fragments/CameraFragment.kt, though I added a CameraFilter to search for an external camera and use it if found.

The code is below:

/*
 * Copyright 2022 The TensorFlow Authors. All Rights Reserved.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *             http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.tensorflow.lite.examples.imageclassification.fragments

import android.annotation.SuppressLint
import android.content.res.Configuration
import android.graphics.Bitmap
import android.hardware.camera2.CameraCharacteristics
import android.hardware.camera2.CameraManager
import android.os.Bundle
import android.util.DisplayMetrics
import android.util.Log
import android.view.*
import android.widget.AdapterView
import android.widget.Toast
import androidx.annotation.NonNull
import androidx.annotation.OptIn
import androidx.annotation.experimental.Experimental
import androidx.camera.camera2.interop.Camera2CameraInfo
import androidx.camera.camera2.interop.ExperimentalCamera2Interop
import androidx.camera.core.AspectRatio
import androidx.camera.core.ImageProxy
import androidx.camera.core.Camera
import androidx.camera.core.CameraFilter
import androidx.camera.core.CameraInfo
import androidx.camera.core.CameraSelector
import androidx.camera.core.ImageAnalysis
import androidx.camera.core.Preview
import androidx.camera.lifecycle.ProcessCameraProvider
import androidx.core.content.ContextCompat
import androidx.fragment.app.Fragment
import androidx.navigation.Navigation
import androidx.recyclerview.widget.LinearLayoutManager
import org.tensorflow.lite.examples.imageclassification.ImageClassifierHelper
import org.tensorflow.lite.examples.imageclassification.MyCameraFilter
import org.tensorflow.lite.examples.imageclassification.R
import org.tensorflow.lite.examples.imageclassification.databinding.FragmentCameraBinding
import org.tensorflow.lite.task.vision.classifier.Classifications
import java.util.concurrent.ExecutorService
import java.util.concurrent.Executors


class CameraFragment : Fragment(), ImageClassifierHelper.ClassifierListener {

    companion object {
        private const val TAG = "Image Classifier"
    }

    private var _fragmentCameraBinding: FragmentCameraBinding? = null
    private val fragmentCameraBinding
        get() = _fragmentCameraBinding!!

    private lateinit var imageClassifierHelper: ImageClassifierHelper
    private lateinit var bitmapBuffer: Bitmap
    private val classificationResultsAdapter by lazy {
        ClassificationResultsAdapter().apply {
            updateAdapterSize(imageClassifierHelper.maxResults)
        }
    }
    private var preview: Preview? = null
    private var imageAnalyzer: ImageAnalysis? = null
    private var camera: Camera? = null
    private var cameraProvider: ProcessCameraProvider? = null

    /** Blocking camera operations are performed using this executor */
    private lateinit var cameraExecutor: ExecutorService

    override fun onResume() {
        super.onResume()

        if (!PermissionsFragment.hasPermissions(requireContext())) {
            Navigation.findNavController(requireActivity(), R.id.fragment_container)
                .navigate(CameraFragmentDirections.actionCameraToPermissions())
        }
    }

    override fun onDestroyView() {
        _fragmentCameraBinding = null
        super.onDestroyView()

        // Shut down our background executor
        cameraExecutor.shutdown()
    }

    override fun onCreateView(
        inflater: LayoutInflater,
        container: ViewGroup?,
        savedInstanceState: Bundle?
    ): View {
        _fragmentCameraBinding = FragmentCameraBinding.inflate(inflater, container, false)

        return fragmentCameraBinding.root
    }

    @SuppressLint("MissingPermission")
    override fun onViewCreated(view: View, savedInstanceState: Bundle?) {
        super.onViewCreated(view, savedInstanceState)

        imageClassifierHelper =
            ImageClassifierHelper(context = requireContext(), imageClassifierListener = this)

        with(fragmentCameraBinding.recyclerviewResults) {
            layoutManager = LinearLayoutManager(requireContext())
            adapter = classificationResultsAdapter
        }

        cameraExecutor = Executors.newSingleThreadExecutor()

        fragmentCameraBinding.viewFinder.post {
            // Set up the camera and its use cases
            setUpCamera()
        }

        // Attach listeners to UI control widgets
        initBottomSheetControls()
    }

    // Initialize CameraX, and prepare to bind the camera use cases
    private fun setUpCamera() {
        val cameraProviderFuture = ProcessCameraProvider.getInstance(requireContext())
        cameraProviderFuture.addListener(
            {
                // CameraProvider
                cameraProvider = cameraProviderFuture.get()

                // Build and bind the camera use cases
                bindCameraUseCases()
            },
            ContextCompat.getMainExecutor(requireContext())
        )
    }

    private fun initBottomSheetControls() {
        // When clicked, lower classification score threshold floor
        fragmentCameraBinding.bottomSheetLayout.thresholdMinus.setOnClickListener {
            if (imageClassifierHelper.threshold >= 0.1) {
                imageClassifierHelper.threshold -= 0.1f
                updateControlsUi()
            }
        }

        // When clicked, raise classification score threshold floor
        fragmentCameraBinding.bottomSheetLayout.thresholdPlus.setOnClickListener {
            if (imageClassifierHelper.threshold < 0.9) {
                imageClassifierHelper.threshold += 0.1f
                updateControlsUi()
            }
        }

        // When clicked, reduce the number of objects that can be classified at a time
        fragmentCameraBinding.bottomSheetLayout.maxResultsMinus.setOnClickListener {
            if (imageClassifierHelper.maxResults > 1) {
                imageClassifierHelper.maxResults--
                updateControlsUi()
                classificationResultsAdapter.updateAdapterSize(size = imageClassifierHelper.maxResults)
            }
        }

        // When clicked, increase the number of objects that can be classified at a time
        fragmentCameraBinding.bottomSheetLayout.maxResultsPlus.setOnClickListener {
            if (imageClassifierHelper.maxResults < 3) {
                imageClassifierHelper.maxResults++
                updateControlsUi()
                classificationResultsAdapter.updateAdapterSize(size = imageClassifierHelper.maxResults)
            }
        }

        // When clicked, decrease the number of threads used for classification
        fragmentCameraBinding.bottomSheetLayout.threadsMinus.setOnClickListener {
            if (imageClassifierHelper.numThreads > 1) {
                imageClassifierHelper.numThreads--
                updateControlsUi()
            }
        }

        // When clicked, increase the number of threads used for classification
        fragmentCameraBinding.bottomSheetLayout.threadsPlus.setOnClickListener {
            if (imageClassifierHelper.numThreads < 4) {
                imageClassifierHelper.numThreads++
                updateControlsUi()
            }
        }

        // When clicked, change the underlying hardware used for inference. Current options are CPU
        // GPU, and NNAPI
        fragmentCameraBinding.bottomSheetLayout.spinnerDelegate.setSelection(0, false)
        fragmentCameraBinding.bottomSheetLayout.spinnerDelegate.onItemSelectedListener =
            object : AdapterView.OnItemSelectedListener {
                override fun onItemSelected(
                    parent: AdapterView<*>?,
                    view: View?,
                    position: Int,
                    id: Long
                ) {
                    imageClassifierHelper.currentDelegate = position
                    updateControlsUi()
                }

                override fun onNothingSelected(parent: AdapterView<*>?) {
                    /* no op */
                }
            }

        // When clicked, change the underlying model used for object classification
        fragmentCameraBinding.bottomSheetLayout.spinnerModel.setSelection(0, false)
        fragmentCameraBinding.bottomSheetLayout.spinnerModel.onItemSelectedListener =
            object : AdapterView.OnItemSelectedListener {
                override fun onItemSelected(
                    parent: AdapterView<*>?,
                    view: View?,
                    position: Int,
                    id: Long
                ) {
                    imageClassifierHelper.currentModel = position
                    updateControlsUi()
                }

                override fun onNothingSelected(parent: AdapterView<*>?) {
                    /* no op */
                }
            }
    }

    // Update the values displayed in the bottom sheet. Reset classifier.
    private fun updateControlsUi() {
        fragmentCameraBinding.bottomSheetLayout.maxResultsValue.text =
            imageClassifierHelper.maxResults.toString()

        fragmentCameraBinding.bottomSheetLayout.thresholdValue.text =
            String.format("%.2f", imageClassifierHelper.threshold)
        fragmentCameraBinding.bottomSheetLayout.threadsValue.text =
            imageClassifierHelper.numThreads.toString()
        // Needs to be cleared instead of reinitialized because the GPU
        // delegate needs to be initialized on the thread using it when applicable
        imageClassifierHelper.clearImageClassifier()
    }

    override fun onConfigurationChanged(newConfig: Configuration) {
        super.onConfigurationChanged(newConfig)
        imageAnalyzer?.targetRotation = fragmentCameraBinding.viewFinder.display.rotation
    }

    // Declare and bind preview, capture and analysis use cases
    @SuppressLint("UnsafeOptInUsageError")
    private fun bindCameraUseCases() {

        // CameraProvider
        val cameraProvider =
            cameraProvider ?: throw IllegalStateException("Camera initialization failed.")

        //CameraSelector - makes assumption that we're only using the back camera

        //val cameraSelector =
        //    CameraSelector.Builder().requireLensFacing(CameraSelector.LENS_FACING_BACK).build()
//        @androidx.annotation.OptIn(ExperimentalCamera2Interop::class)
//        val cam2info = cameraProvider.availableCameraInfos.map{ Camera2CameraInfo.from(it)}.sortedByDescending{it.getCameraCharacteristic(CameraCharacteristics.INFO_SUPPORTED_HARDWARE_LEVEL)}
//        Log.i(TAG, "[camera id] available cameras:"+cam2info)
//        val cameraSelector = CameraSelector.Builder().addCameraFilter{it.filter {caminfo -> val camid = Camera2CameraInfo.from(caminfo).cameraId
//        camid == "0"}}.build()
       // val cameraSelector = CameraSelector.Builder().addCameraFilter{it.filter {camInfo ->
       //     Camera2CameraInfo.from(camInfo).getCameraCharacteristic(CameraCharacteristics.INFO_SUPPORTED_HARDWARE_LEVEL)==CameraCharacteristics.INFO_SUPPORTED_HARDWARE_LEVEL_EXTERNAL}}.build()
       //cameraSelector = selectCam(cameraProvider)

        val mCameraId = 103
        val cameraSelector = CameraSelector.Builder().addCameraFilter(MyCameraFilter("$mCameraId")).build()

        // Preview. Only using the 4:3 ratio because this is the closest to our models
        preview =
            Preview.Builder()
                .setTargetAspectRatio(AspectRatio.RATIO_4_3)
                .setTargetRotation(fragmentCameraBinding.viewFinder.display.rotation)
                .build()

        // ImageAnalysis. Using RGBA 8888 to match how our models work
        imageAnalyzer =
            ImageAnalysis.Builder()
                .setTargetAspectRatio(AspectRatio.RATIO_4_3)
                .setTargetRotation(fragmentCameraBinding.viewFinder.display.rotation)
                .setBackpressureStrategy(ImageAnalysis.STRATEGY_KEEP_ONLY_LATEST)
                .setOutputImageFormat(ImageAnalysis.OUTPUT_IMAGE_FORMAT_RGBA_8888)
                .build()
                // The analyzer can then be assigned to the instance
                .also {
                    it.setAnalyzer(cameraExecutor) { image ->
                        if (!::bitmapBuffer.isInitialized) {
                            // The image rotation and RGB image buffer are initialized only once
                            // the analyzer has started running
                            bitmapBuffer = Bitmap.createBitmap(
                                image.width,
                                image.height,
                                Bitmap.Config.ARGB_8888
                            )
                        }

                        classifyImage(image)
                    }
                }

        // Must unbind the use-cases before rebinding them
        cameraProvider.unbindAll()

        try {
            // A variable number of use-cases can be passed here -
            // camera provides access to CameraControl & CameraInfo
            camera = cameraProvider.bindToLifecycle(this, cameraSelector, preview, imageAnalyzer)

            // Attach the viewfinder's surface provider to preview use case
            preview?.setSurfaceProvider(fragmentCameraBinding.viewFinder.surfaceProvider)
        } catch (exc: Exception) {
            Log.e(TAG, "Use case binding failed", exc)
        }
    }

    private fun getScreenOrientation() : Int {
        val outMetrics = DisplayMetrics()

        val display: Display?
        if (android.os.Build.VERSION.SDK_INT >= android.os.Build.VERSION_CODES.R) {
            display = requireActivity().display
            display?.getRealMetrics(outMetrics)
        } else {
            @Suppress("DEPRECATION")
            display = requireActivity().windowManager.defaultDisplay
            @Suppress("DEPRECATION")
            display.getMetrics(outMetrics)
        }

        return display?.rotation ?: 0
    }

    private fun classifyImage(image: ImageProxy) {
        // Copy out RGB bits to the shared bitmap buffer
        image.use { bitmapBuffer.copyPixelsFromBuffer(image.planes[0].buffer) }

        // Pass Bitmap and rotation to the image classifier helper for processing and classification
        imageClassifierHelper.classify(bitmapBuffer, getScreenOrientation())
    }

    @SuppressLint("NotifyDataSetChanged")
    override fun onError(error: String) {
        activity?.runOnUiThread {
            Toast.makeText(requireContext(), error, Toast.LENGTH_SHORT).show()
            classificationResultsAdapter.updateResults(null)
            classificationResultsAdapter.notifyDataSetChanged()
        }
    }

    @SuppressLint("NotifyDataSetChanged")
    override fun onResults(
        results: List<Classifications>?,
        inferenceTime: Long
    ) {
        activity?.runOnUiThread {
            // Show result on bottom sheet
            classificationResultsAdapter.updateResults(results)
            classificationResultsAdapter.notifyDataSetChanged()
            fragmentCameraBinding.bottomSheetLayout.inferenceTimeVal.text =
                String.format("%d ms", inferenceTime)
        }
    }
}

In the code above, cameraId 103 was determined on my Android device with the USB camera connected, using the command 'dumpsys media.camera'.

I created another file, MyCameraFilter.kt, which I use in CameraFragment.kt as below:

package org.tensorflow.lite.examples.imageclassification

import android.annotation.SuppressLint
import android.util.Log
import androidx.camera.core.CameraFilter
import androidx.camera.core.CameraInfo
import androidx.camera.core.impl.CameraInfoInternal
import androidx.core.util.Preconditions

class MyCameraFilter(private val mId: String) :  CameraFilter {

    private val TAG = "CameraIdCameraFilter"

    @SuppressLint("RestrictedApi")
    override fun filter(cameraInfos: MutableList<CameraInfo>): MutableList<CameraInfo> {

        val result = mutableListOf<CameraInfo>()
        cameraInfos.forEach {
            Preconditions.checkArgument(
                it is CameraInfoInternal,
                "the camera info doesn't contain internal implementation "
            )
            it as CameraInfoInternal
            val id = it.cameraId
            Log.d(TAG, "id: $id")

            if (id.contains(mId)) {
                result.add(it)
            }
        }
        return result
    }
}

I tried adding CameraSelector.LENS_FACING_EXTERNAL, which is part of @ExperimentalLensFacing, but Android Studio did not accept the annotation at all. Do you have any suggestions on how to move forward? My device only supports a USB camera, and since CameraX is built on Camera2, I assume I can use the Camera2 API and CameraX interchangeably.

How else can I use a USB camera to perform image classification?
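For reference, the experimental selector I was trying to write looks like this (an untested sketch; it assumes a CameraX version new enough to ship `androidx.camera.core.ExperimentalLensFacing` and `CameraSelector.LENS_FACING_EXTERNAL`, which older versions do not have, and that may be why Android Studio rejected the annotation for me):

```kotlin
// Untested sketch: select the external lens facing via the experimental API.
// Requires a CameraX version that provides ExperimentalLensFacing and
// CameraSelector.LENS_FACING_EXTERNAL; older versions will not compile this.
@androidx.annotation.OptIn(ExperimentalLensFacing::class)
private fun externalCameraSelector(): CameraSelector =
    CameraSelector.Builder()
        .requireLensFacing(CameraSelector.LENS_FACING_EXTERNAL)
        .build()
```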

thanks

@google-ml-butler google-ml-butler bot removed the stat:awaiting response Status - Awaiting response from author label Aug 17, 2023
@suyash-narain
Author

Logcat error log in Android Studio when running the classification app:

2023-07-31 14:19:29.809 744-967 StartingSurfaceDrawer com.android.systemui D fillViewWithIcon surfaceWindowView android.window.SplashScreenView{8f16232 V.E...... ......ID 0,0-0,0}
2023-07-31 14:19:29.826 517-537 Compatibil...geReporter system_server D Compat change id reported: 135634846; UID 10077; state: DISABLED
2023-07-31 14:19:29.829 517-537 Compatibil...geReporter system_server D Compat change id reported: 177438394; UID 10077; state: DISABLED
2023-07-31 14:19:29.829 517-537 Compatibil...geReporter system_server D Compat change id reported: 135772972; UID 10077; state: DISABLED
2023-07-31 14:19:29.830 517-537 Compatibil...geReporter system_server D Compat change id reported: 135754954; UID 10077; state: ENABLED
2023-07-31 14:19:29.831 517-544 Compatibil...geReporter system_server D Compat change id reported: 143937733; UID 10077; state: ENABLED
2023-07-31 14:19:29.850 285-285 Zygote pid-285 D Forked child process 2466
2023-07-31 14:19:29.854 517-544 ActivityManager system_server I Start proc 2466:org.tensorflow.lite.examples.imageclassification/u0a77 for pre-top-activity {org.tensorflow.lite.examples.imageclassification/org.tensorflow.lite.examples.imageclassification.MainActivity}
2023-07-31 14:19:29.874 2466-2466 Zygote pid-2466 I seccomp disabled by setenforce 0
2023-07-31 14:19:29.887 2466-2466 eclassificatio pid-2466 I Late-enabling -Xcheck:jni
2023-07-31 14:19:29.952 394-427 adbd adbd I jdwp connection from 2466
2023-07-31 14:19:30.084 2466-2466 re-initialized> pid-2466 W type=1400 audit(0.0:1035): avc: granted { execute } for path="/data/data/org.tensorflow.lite.examples.imageclassification/code_cache/startup_agents/566ca8ec-agent.so" dev="dm-2" ino=20224 scontext=u:r:untrusted_app:s0:c77,c256,c512,c768 tcontext=u:object_r:app_data_file:s0:c77,c256,c512,c768 tclass=file app=org.tensorflow.lite.examples.imageclassification
2023-07-31 14:19:30.113 2466-2466 eclassificatio pid-2466 W DexFile /data/data/org.tensorflow.lite.examples.imageclassification/code_cache/.studio/instruments-45c255fe.jar is in boot class path but is not in a known location
2023-07-31 14:19:30.836 2466-2466 eclassificatio pid-2466 W Current dex file has more than one class in it. Calling RetransformClasses on this class might fail if no transformations are applied to it!
2023-07-31 14:19:31.168 2466-2466 eclassificatio pid-2466 W Current dex file has more than one class in it. Calling RetransformClasses on this class might fail if no transformations are applied to it!
2023-07-31 14:19:31.173 2466-2466 eclassificatio pid-2466 W Redefining intrinsic method java.lang.Thread java.lang.Thread.currentThread(). This may cause the unexpected use of the original definition of java.lang.Thread java.lang.Thread.currentThread()in methods that have already been compiled.
2023-07-31 14:19:31.173 2466-2466 eclassificatio pid-2466 W Redefining intrinsic method boolean java.lang.Thread.interrupted(). This may cause the unexpected use of the original definition of boolean java.lang.Thread.interrupted()in methods that have already been compiled.
2023-07-31 14:19:31.221 2466-2466 eclassificatio pid-2466 W Current dex file has more than one class in it. Calling RetransformClasses on this class might fail if no transformations are applied to it!
2023-07-31 14:19:31.807 2466-2466 eclassificatio pid-2466 W Current dex file has more than one class in it. Calling RetransformClasses on this class might fail if no transformations are applied to it!
1969-12-31 16:00:00.000 0-0 I ---------------------------- PROCESS STARTED (2466) for package org.tensorflow.lite.examples.imageclassification ----------------------------
2023-07-31 14:19:32.372 2466-2466 eclassificatio pid-2466 W Current dex file has more than one class in it. Calling RetransformClasses on this class might fail if no transformations are applied to it!
2023-07-31 14:19:32.389 2466-2466 Compatibil...geReporter org....examples.imageclassification D Compat change id reported: 171979766; UID 10077; state: ENABLED
2023-07-31 14:19:32.796 2466-2466 GraphicsEnvironment org....examples.imageclassification V ANGLE Developer option for 'org.tensorflow.lite.examples.imageclassification' set to: 'default'
2023-07-31 14:19:32.797 2466-2466 GraphicsEnvironment org....examples.imageclassification V ANGLE GameManagerService for org.tensorflow.lite.examples.imageclassification: false
2023-07-31 14:19:32.798 2466-2466 GraphicsEnvironment org....examples.imageclassification V Neither updatable production driver nor prerelease driver is supported.
2023-07-31 14:19:32.812 2466-2466 NetworkSecurityConfig org....examples.imageclassification D No Network Security Config specified, using platform default
2023-07-31 14:19:32.814 2466-2466 NetworkSecurityConfig org....examples.imageclassification D No Network Security Config specified, using platform default
2023-07-31 14:19:33.691 2466-2466 eclassificatio org....examples.imageclassification W Accessing hidden method Landroid/view/View;->computeFitSystemWindows(Landroid/graphics/Rect;Landroid/graphics/Rect;)Z (unsupported, reflection, allowed)
2023-07-31 14:19:33.695 2466-2466 eclassificatio org....examples.imageclassification W Accessing hidden method Landroid/view/ViewGroup;->makeOptionalFitsSystemWindows()V (unsupported, reflection, allowed)
2023-07-31 14:19:33.701 2466-2477 System org....examples.imageclassification W A resource failed to call close.
2023-07-31 14:19:33.827 2466-2466 Compatibil...geReporter org....examples.imageclassification D Compat change id reported: 171228096; UID 10077; state: ENABLED
2023-07-31 14:19:33.844 517-1259 TaskPersister system_server E File error accessing recents directory (directory doesn't exist?).
2023-07-31 14:19:34.158 2466-2466 tflite org....examples.imageclassification I Initialized TensorFlow Lite runtime.
2023-07-31 14:19:34.488 2466-2466 RenderThread org....examples.imageclassification I type=1400 audit(0.0:1036): avc: denied { open } for path="/dev/properties/u:object_r:vendor_default_prop:s0" dev="tmpfs" ino=256 scontext=u:r:untrusted_app:s0:c77,c256,c512,c768 tcontext=u:object_r:vendor_default_prop:s0 tclass=file permissive=1 app=org.tensorflow.lite.examples.imageclassification
2023-07-31 14:19:34.488 2466-2466 RenderThread org....examples.imageclassification I type=1400 audit(0.0:1037): avc: denied { getattr } for path="/dev/properties/u:object_r:vendor_default_prop:s0" dev="tmpfs" ino=256 scontext=u:r:untrusted_app:s0:c77,c256,c512,c768 tcontext=u:object_r:vendor_default_prop:s0 tclass=file permissive=1 app=org.tensorflow.lite.examples.imageclassification
2023-07-31 14:19:34.488 2466-2466 RenderThread org....examples.imageclassification I type=1400 audit(0.0:1038): avc: denied { map } for path="/dev/properties/u:object_r:vendor_default_prop:s0" dev="tmpfs" ino=256 scontext=u:r:untrusted_app:s0:c77,c256,c512,c768 tcontext=u:object_r:vendor_default_prop:s0 tclass=file permissive=1 app=org.tensorflow.lite.examples.imageclassification
2023-07-31 14:19:34.516 199-199 hwservicemanager hwservicemanager I getTransport: Cannot find entry android.hardware.configstore@1.0::ISurfaceFlingerConfigs/default in either framework or device VINTF manifest.
2023-07-31 14:19:34.718 2466-2483 cutils-trace org....examples.imageclassification E Error opening trace file: Permission denied (13)
2023-07-31 14:19:34.776 744-965 StartingSurfaceDrawer com.android.systemui D Task start finish, remove starting surface for task 92
2023-07-31 14:19:34.776 744-965 StartingSurfaceDrawer com.android.systemui V Removing splash screen window for task: 92
2023-07-31 14:19:34.781 517-534 ActivityTaskManager system_server I Displayed org.tensorflow.lite.examples.imageclassification/.MainActivity: +5s8ms
2023-07-31 14:19:34.902 2466-2498 CameraManagerGlobal org....examples.imageclassification I Connecting to camera service
2023-07-31 14:19:34.915 895-993 ServiceManager cameraserver W Permission failure: android.permission.CAMERA_OPEN_CLOSE_LISTENER from uid=10077 pid=2466
2023-07-31 14:19:34.975 2466-2498 CameraRepository org....examples.imageclassification D Added camera: 103
2023-07-31 14:19:35.072 517-899 InputManager-JNI system_server W Input channel object '2117409 Splash Screen org.tensorflow.lite.examples.imageclassification (client)' was disposed without first being removed with the input manager!
2023-07-31 14:19:35.266 2466-2498 Camera2CameraInfo org....examples.imageclassification I Device Level: INFO_SUPPORTED_HARDWARE_LEVEL_EXTERNAL
2023-07-31 14:19:35.281 2466-2498 CameraValidator org....examples.imageclassification D Verifying camera lens facing on device, lensFacingInteger: null
2023-07-31 14:19:35.365 2466-2466 CameraIdCameraFilter org....examples.imageclassification D id: 103
2023-07-31 14:19:35.479 2466-2466 DeferrableSurface org....examples.imageclassification D Surface created[total_surfaces=1, used_surfaces=0](androidx.camera.core.SurfaceRequest$2@58f49df}
2023-07-31 14:19:35.496 2466-2466 CameraOrientationUtil org....examples.imageclassification D getRelativeImageRotation: destRotationDegrees=0, sourceRotationDegrees=0, isOppositeFacing=false, result=0
2023-07-31 14:19:35.498 2466-2466 CameraOrientationUtil org....examples.imageclassification D getRelativeImageRotation: destRotationDegrees=0, sourceRotationDegrees=0, isOppositeFacing=false, result=0
2023-07-31 14:19:35.502 2466-2466 DeferrableSurface org....examples.imageclassification D Surface created[total_surfaces=2, used_surfaces=0](androidx.camera.core.impl.ImmediateSurface@176268a}
2023-07-31 14:19:35.509 2466-2498 Camera2CameraImpl org....examples.imageclassification D {Camera@cf9cb2a[id=103]} Use case androidx.camera.core.Preview-e28319fa-d481-4935-af63-c99e3540a85160384050 INACTIVE
2023-07-31 14:19:35.513 2466-2466 CameraOrientationUtil org....examples.imageclassification D getRelativeImageRotation: destRotationDegrees=0, sourceRotationDegrees=0, isOppositeFacing=false, result=0
2023-07-31 14:19:35.514 2466-2466 PreviewView org....examples.imageclassification D Surface requested by Preview.
2023-07-31 14:19:35.515 2466-2498 UseCaseAttachState org....examples.imageclassification D Active and attached use case: [] for camera: 103
2023-07-31 14:19:35.523 2466-2498 Camera2CameraImpl org....examples.imageclassification D {Camera@cf9cb2a[id=103]} Use case androidx.camera.core.ImageAnalysis-8231df21-375a-4253-a87a-2ae8de627879255109251 ACTIVE
2023-07-31 14:19:35.524 2466-2498 UseCaseAttachState org....examples.imageclassification D Active and attached use case: [] for camera: 103
2023-07-31 14:19:35.532 2466-2498 Camera2CameraImpl org....examples.imageclassification D {Camera@cf9cb2a[id=103]} Use cases [androidx.camera.core.Preview-e28319fa-d481-4935-af63-c99e3540a85160384050, androidx.camera.core.ImageAnalysis-8231df21-375a-4253-a87a-2ae8de627879255109251] now ATTACHED
2023-07-31 14:19:35.532 2466-2466 PreviewView org....examples.imageclassification D Preview transformation info updated. TransformationInfo{cropRect=Rect(0, 0 - 960, 720), rotationDegrees=0, targetRotation=0}
2023-07-31 14:19:35.533 2466-2466 AndroidRuntime org....examples.imageclassification D Shutting down VM

--------- beginning of crash
2023-07-31 14:19:35.535 2466-2498 UseCaseAttachState org....examples.imageclassification D All use case: [androidx.camera.core.ImageAnalysis-8231df21-375a-4253-a87a-2ae8de627879255109251, androidx.camera.core.Preview-e28319fa-d481-4935-af63-c99e3540a85160384050] for camera: 103
2023-07-31 14:19:35.536 2466-2466 AndroidRuntime org....examples.imageclassification E FATAL EXCEPTION: main
Process: org.tensorflow.lite.examples.imageclassification, PID: 2466
java.lang.NullPointerException: Attempt to invoke virtual method 'int java.lang.Integer.intValue()' on a null object reference
at androidx.camera.view.PreviewView$1.lambda$onSurfaceRequested$1$androidx-camera-view-PreviewView$1(PreviewView.java:203)
at androidx.camera.view.PreviewView$1$$ExternalSyntheticLambda0.onTransformationInfoUpdate(Unknown Source:6)
at androidx.camera.core.SurfaceRequest.lambda$setTransformationInfoListener$7(SurfaceRequest.java:456)
at androidx.camera.core.SurfaceRequest$$ExternalSyntheticLambda3.run(Unknown Source:4)
at android.os.Handler.handleCallback(Handler.java:938)
at android.os.Handler.dispatchMessage(Handler.java:99)
at android.os.Looper.loopOnce(Looper.java:201)
at android.os.Looper.loop(Looper.java:288)
at android.app.ActivityThread.main(ActivityThread.java:7839)
at java.lang.reflect.Method.invoke(Native Method)
at com.android.internal.os.RuntimeInit$MethodAndArgsCaller.run(RuntimeInit.java:548)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:1003)
2023-07-31 14:19:35.539 2466-2498 UseCaseAttachState org....examples.imageclassification D Active and attached use case: [androidx.camera.core.ImageAnalysis-8231df21-375a-4253-a87a-2ae8de627879255109251] for camera: 103
2023-07-31 14:19:35.542 517-899 ActivityTaskManager system_server W Force finishing activity org.tensorflow.lite.examples.imageclassification/.MainActivity
2023-07-31 14:19:35.548 517-2502 DropBoxManagerService system_server I add tag=data_app_crash isTagEnabled=true flags=0x2
2023-07-31 14:19:35.555 2466-2498 Camera2CameraImpl org....examples.imageclassification D {Camera@cf9cb2a[id=103]} Resetting Capture Session
2023-07-31 14:19:35.558 2466-2498 Camera2CameraImpl org....examples.imageclassification D {Camera@cf9cb2a[id=103]} Releasing session in state INITIALIZED
2023-07-31 14:19:35.562 2466-2498 Camera2CameraImpl org....examples.imageclassification D {Camera@cf9cb2a[id=103]} Attempting to force open the camera.
2023-07-31 14:19:35.564 2466-2498 CameraStateRegistry org....examples.imageclassification D tryOpenCamera(Camera@cf9cb2a[id=103]) [Available Cameras: 1, Already Open: false (Previous state: null)] --> SUCCESS
2023-07-31 14:19:35.570 2466-2466 Process org....examples.imageclassification I Sending signal. PID: 2466 SIG: 9
---------------------------- PROCESS ENDED (2466) for package org.tensorflow.lite.examples.imageclassification ----------------------------
2023-07-31 14:19:35.666 517-899 ActivityManager system_server I Process org.tensorflow.lite.examples.imageclassification (pid 2466) has died: fg TOP
2023-07-31 14:19:35.668 517-545 libprocessgroup system_server I Successfully killed process cgroup uid 10077 pid 2466 in 0ms
2023-07-31 14:19:35.670 285-285 Zygote pid-285 I Process 2466 exited due to signal 9 (Killed)
2023-07-31 14:19:35.670 517-921 WindowManager system_server I WIN DEATH: Window{d0379fb u0 org.tensorflow.lite.examples.imageclassification/org.tensorflow.lite.examples.imageclassification.MainActivity}
2023-07-31 14:19:35.671 517-921 InputManager-JNI system_server W Input channel object 'd0379fb org.tensorflow.lite.examples.imageclassification/org.tensorflow.lite.examples.imageclassification.MainActivity (client)' was disposed without first being removed with the input manager!
2023-07-31 14:19:35.687 343-343 BpTransact...edListener surfaceflinger E Failed to transact (-32)
2023-07-31 14:19:35.700 517-899 ActivityTaskManager system_server W Can't find TaskDisplayArea to determine support for multi window. Task id=92 attached=false
2023-07-31 14:19:35.701 517-899 ActivityTaskManager system_server W Can't find TaskDisplayArea to determine support for multi window. Task id=92 attached=false
2023-07-31 14:19:35.730 517-537 ActivityManager system_server W setHasOverlayUi called on unknown pid: 2466
2023-07-31 14:19:35.798 1109-1153 OpenGLRenderer com.android.launcher3 I Davey! duration=99848ms; Flags=1, FrameTimelineVsyncId=7224, IntendedVsync=236882707405, Vsync=236882707405, InputEventId=0, HandleInputStart=236883940860, AnimationStart=236883945013, PerformTraversalsStart=236883948167, DrawStart=236907705936, FrameDeadline=236916091065, FrameInterval=236883932629, FrameStartTime=16691830, SyncQueued=236914034936, SyncStart=236914191167, IssueDrawCommandsStart=236914506936, SwapBuffers=236919260321, FrameCompleted=336731089942, DequeueBufferDuration=178385, QueueBufferDuration=1787154, GpuCompleted=336731089942, SwapBuffersCompleted=236929178167, DisplayPresentTime=168144668009,
2023-07-31 14:19:35.803 517-527 system_server system_server I NativeAlloc concurrent copying GC freed 76952(4326KB) AllocSpace objects, 17(452KB) LOS objects, 37% free, 10191KB/15MB, paused 2.919ms,344us total 204.204ms
2023-07-31 14:19:35.848 1109-1153 OpenGLRenderer com.android.launcher3 I Davey! duration=99807ms; Flags=0, FrameTimelineVsyncId=7237, IntendedVsync=236966153846, Vsync=236966153846, InputEventId=0, HandleInputStart=236967284475, AnimationStart=236967287013, PerformTraversalsStart=236967289936, DrawStart=236967508013, FrameDeadline=236999537506, FrameInterval=236967278936, FrameStartTime=16691830, SyncQueued=236967943783, SyncStart=236968116167, IssueDrawCommandsStart=236968318706, SwapBuffers=236975167321, FrameCompleted=336774274327, DequeueBufferDuration=35230, QueueBufferDuration=1930308, GpuCompleted=336774274327, SwapBuffersCompleted=236977859629, DisplayPresentTime=168161398317,
2023-07-31 14:19:36.048 517-537 ActivityTaskManager system_server W Activity top resumed state loss timeout for ActivityRecord{55231d3 u0 org.tensorflow.lite.examples.imageclassification/.MainActivity t-1 f}}
2023-07-31 14:19:36.945 284-322 netd netd I setProcSysNet(4, 2, wlan0, retrans_time_ms, 750) <0.75ms>
2023-07-31 14:19:36.946 284-322 netd netd I setProcSysNet(4, 2, wlan0, ucast_solicit, 10) <0.30ms>
2023-07-31 14:19:36.948 284-322 netd netd I setProcSysNet(6, 2, wlan0, retrans_time_ms, 750) <0.44ms>
2023-07-31 14:19:36.950 284-322 netd netd I setProcSysNet(6, 2, wlan0, ucast_solicit, 10) <0.43ms>
2023-07-31 14:19:38.698 517-1259 TaskPersister system_server E File error accessing recents directory (directory doesn't exist?).

@pkgoogle

Hi @suyash-narain, those are the two approaches that make sense. LENS_FACING_EXTERNAL is experimental, so it doesn't always work. You said you tried it — was there a build error? That information would be helpful.

Hi, @miaout17, can you please take a look? Thanks.

@pkgoogle pkgoogle added stat:awaiting tensorflower Status - Awaiting response from tensorflower type:bug Bug type:feature Feature requests and removed type:support Support issues labels Aug 21, 2023
@suyash-narain
suyash-narain commented Aug 21, 2023

Hi @pkgoogle @miaout17

Android Studio doesn't accept LENS_FACING_EXTERNAL or its corresponding annotation; it flags it as an error (red font with a red underline) before the build, so I couldn't build with it.

Using `val cameraSelector = CameraSelector.Builder().addCameraFilter(MyCameraFilter("$mCameraId")).build()` doesn't lead to any build error, but the app crashes at runtime and the camera is not detected, as can be seen from the error log posted above.
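For reference, here is a minimal sketch of an alternative filter that avoids referencing `CameraSelector.LENS_FACING_EXTERNAL` entirely, by reading the Camera2 `LENS_FACING` characteristic through the CameraX interop API instead (the `CameraMetadata.LENS_FACING_EXTERNAL` constant does exist in the framework `android.hardware.camera2` package). This is an untested suggestion, assuming CameraX with the `camera-camera2` artifact and a device HAL that actually reports the USB camera as an external lens — whether it does can be checked in the `CameraValidator`/`CameraIdCameraFilter` logcat lines like the ones above:

```kotlin
import android.hardware.camera2.CameraCharacteristics
import android.hardware.camera2.CameraMetadata
import androidx.camera.camera2.interop.Camera2CameraInfo
import androidx.camera.camera2.interop.ExperimentalCamera2Interop
import androidx.camera.core.CameraSelector

// Sketch: build a CameraSelector that keeps only cameras whose Camera2
// LENS_FACING characteristic is LENS_FACING_EXTERNAL (i.e. a USB camera),
// without using the experimental CameraSelector.LENS_FACING_EXTERNAL constant.
@androidx.annotation.OptIn(ExperimentalCamera2Interop::class)
fun externalCameraSelector(): CameraSelector =
    CameraSelector.Builder()
        .addCameraFilter { cameraInfos ->
            // CameraFilter must return a subset of the list it was given.
            cameraInfos.filter { info ->
                Camera2CameraInfo.from(info)
                    .getCameraCharacteristic(CameraCharacteristics.LENS_FACING) ==
                        CameraMetadata.LENS_FACING_EXTERNAL
            }
        }
        .build()
```

Note that the crash in the log above is a `NullPointerException` inside `PreviewView` when it reads the camera's lens-facing integer, so if the HAL reports `lensFacingInteger: null` for id 103 (as the first log line shows), a selector alone may not be enough and the device's camera HAL configuration may also need attention.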
