
Captions can't be turned on using Google Assistant #1413

Open
peter-kovacs-accedo opened this issue May 31, 2024 · 3 comments

@peter-kovacs-accedo

Version

Media3 main branch

More version details

I'm trying to achieve the same functionality on Android TV as was possible with ExoPlayer and MediaSessionConnector.

Previously, using MediaSessionConnector.CaptionCallback, it was possible to enable subtitles via Google Assistant. I can't find any equivalent for this in Media3. Is this feature supported with on-device integration, or am I missing something?

Note: from our investigation we may be able to work around this by accessing the (MediaSessionCompat) MediaSession#Impl field, but it is private, and accessing it might result in unwanted behaviour. Also, MediaSessionLegacyStub's onSetCaptioningEnabled implementation is a "no-op".
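For reference, the wiring we used previously with ExoPlayer 2 looked roughly like this (a minimal sketch; `connectLegacySession` is just an illustrative name, and the exact `onCommand` signature varies slightly between ExoPlayer 2 releases):

```kotlin
import android.content.Context
import android.os.Bundle
import android.os.ResultReceiver
import android.support.v4.media.session.MediaSessionCompat
import com.google.android.exoplayer2.Player
import com.google.android.exoplayer2.ext.mediasession.MediaSessionConnector

// Sketch of the ExoPlayer 2 / MediaSessionConnector approach this issue refers to.
fun connectLegacySession(context: Context, player: Player): MediaSessionConnector {
  val sessionCompat = MediaSessionCompat(context, "playback")
  val connector = MediaSessionConnector(sessionCompat)
  connector.setPlayer(player)
  connector.setCaptionCallback(object : MediaSessionConnector.CaptionCallback {
    override fun onSetCaptioningEnabled(player: Player, enabled: Boolean) {
      // The app decides here which subtitle/CC track to enable (or disables text).
    }

    override fun hasCaptions(player: Player): Boolean {
      // Return true if the current stream exposes subtitle/CC tracks.
      return true
    }

    override fun onCommand(
      player: Player,
      command: String,
      extras: Bundle?,
      cb: ResultReceiver?
    ): Boolean = false // No custom commands handled in this sketch.
  })
  sessionCompat.isActive = true
  return connector
}
```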

Used versions:
androidx.media3: v1.3.1
katniss: v7.10.20240225.1_cl.4
OS: A10, A11
androidx.media: v1.7.0 (for compat)

Devices that reproduce the issue

Android TV and STBs

Devices that do not reproduce the issue

No response

Reproducible in the demo app?

Yes

Reproduction steps

While playback is running, with the MediaSession set up, run the following adb command:
adb shell am start -a android.search.action.GLOBAL_SEARCH --es query 'enable\ captions'

Previously this invoked the MediaSessionCompat#onSetCaptioningEnabled callback (and it still does, but the callback is hidden).
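For reference, the session in the repro is set up roughly like this (a minimal sketch; `startPlayback` is just an illustrative name):

```kotlin
import android.content.Context
import androidx.media3.common.MediaItem
import androidx.media3.exoplayer.ExoPlayer
import androidx.media3.session.MediaSession

// Minimal Media3 setup used while reproducing: an ExoPlayer wrapped in a MediaSession.
// Playing the HLS test stream below additionally needs the media3-exoplayer-hls module.
fun startPlayback(context: Context): MediaSession {
  val player = ExoPlayer.Builder(context).build()
  val session = MediaSession.Builder(context, player).build()
  player.setMediaItem(
    MediaItem.fromUri(
      "https://devstreaming-cdn.apple.com/videos/streaming/examples/bipbop_16x9/bipbop_16x9_variant.m3u8"
    )
  )
  player.prepare()
  player.play()
  return session
}
```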

Expected result

Callback should be accessible or exposed.

Actual result

Google Assistant says "Sorry, I can't do that on this app."

Media

https://devstreaming-cdn.apple.com/videos/streaming/examples/bipbop_16x9/bipbop_16x9_variant.m3u8

Bug Report

@marcbaechinger
Contributor
marcbaechinger commented May 31, 2024

Thanks for your excellent report. I specifically like the adb command. TIL!

This is indeed a regression compared to the legacy session and the MediaSessionConnector.

Google Assistant says "Sorry, I can't do that on this app."

This is probably because Media3 never advertises PlaybackStateCompat.ACTION_SET_CAPTIONING_ENABLED.
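For comparison, this is roughly the raw legacy surface involved: the session advertises the action in its PlaybackStateCompat and receives the toggle via MediaSessionCompat.Callback (a sketch only; with the connector the playback state is normally built for you, and `advertiseCaptioning` is an illustrative name):

```kotlin
import android.support.v4.media.session.MediaSessionCompat
import android.support.v4.media.session.PlaybackStateCompat

// Sketch of the legacy API surface that Assistant checks before offering caption control.
fun advertiseCaptioning(session: MediaSessionCompat) {
  session.setPlaybackState(
    PlaybackStateCompat.Builder()
      .setState(PlaybackStateCompat.STATE_PLAYING, 0L, 1f)
      .setActions(
        PlaybackStateCompat.ACTION_PLAY_PAUSE or
          PlaybackStateCompat.ACTION_SET_CAPTIONING_ENABLED
      )
      .build()
  )
  session.setCallback(object : MediaSessionCompat.Callback() {
    override fun onSetCaptioningEnabled(enabled: Boolean) {
      // Invoked when a controller (e.g. Assistant) toggles captions on or off.
    }
  })
}
```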

I remember the conversation around this when we implemented it in the connector. We ended up adding MediaSessionConnector.CaptionCallback, which an app needs to override, instead of handling it automatically on the library side. The reason was that the legacy session API is simplified to a caption on/off toggle, and for many streams it isn't deterministically clear which subtitle or closed-caption track should be enabled. It may reasonably be a CC track in the language of the device, but even that can't always be determined unambiguously for some stream/device combinations, for instance when the device language isn't available as a CC/subtitle track. So the decision at the time was to leave this to the app, which needs to implement that callback.

Sorry for the long context. :) For Media3 controllers, and once Assistant is migrated to Media3, this wouldn't be an issue, as Assistant could inspect the subtitle tracks on its end and then enable and select the subtitle track of its choice.

We still need to provide backward compatibility to be on par with the connector, as you correctly state. The API on the legacy session side is a bit limited and simplified, as explained above. Without thinking about it too much for now, a few points come to mind that we need to consider and discuss:

  • provide a way to let an app configure whether PlaybackStateCompat.ACTION_SET_CAPTIONING_ENABLED is set or not. Alternatively, this could be (fuzzily) inferred from the available tracks in the stream, but there probably needs to be a way for apps to override it.
  • provide a way to let the app decide which CC/subtitle track to enable and select when a user enables captions via the legacy API (see the sketch below)

Implementation note: the functionality above is required for legacy/platform controllers only, not for Media3 controllers. The API needs to be designed accordingly.
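For illustration, the app-side decision in the second bullet could map the legacy on/off toggle onto Media3 track selection roughly like this (a sketch only; `setCaptionsEnabled` and the language heuristic are the app's own, not library API):

```kotlin
import androidx.media3.common.C
import androidx.media3.common.Player

// Sketch: translate a boolean "captions enabled" request into Media3 track selection.
fun setCaptionsEnabled(player: Player, enabled: Boolean, preferredLanguage: String?) {
  player.trackSelectionParameters = player.trackSelectionParameters
    .buildUpon()
    .setTrackTypeDisabled(C.TRACK_TYPE_TEXT, /* disabled= */ !enabled)
    // Which language to prefer is exactly the ambiguous decision discussed above.
    .setPreferredTextLanguage(if (enabled) preferredLanguage else null)
    .build()
}
```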

@marcbaechinger
Contributor
marcbaechinger commented May 31, 2024

Addendum: @icbaker brought up an important point: even in the case of a Media3 controller, apps may want to choose the correct subtitle track instead of letting a controller (like Assistant) choose. The reason is that an app may have stored the user's preferred language in the app, and it isn't necessarily the device or region language.

Hence, even for Media3 controllers, we'd probably need a way for the session to either take that language decision on its own or to override the controller's choice. The latter is again probably difficult for a session to distinguish: is this an explicit user choice, or a best guess by the controller that the session should override? A solution for this would be a Media3 API like MediaController.setCloseCaptionsEnabled(boolean)/MediaSession.Callback.onSetCloseCaptionsEnabled(boolean).

(Note: this is just brainstorming, written down here for visibility. None of the concrete APIs mentioned here have been decided on for implementation.)
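Purely to visualise that brainstormed shape (this is not an existing Media3 API, and nothing here is decided):

```kotlin
// Hypothetical only, not part of Media3: the brainstormed controller/session pair.
interface BrainstormedCaptionControls {
  // Controller side, e.g. what Assistant would call.
  fun setCloseCaptionsEnabled(enabled: Boolean)
}

interface BrainstormedCaptionSessionCallback {
  // Session side: the app decides which text track (if any) this maps to,
  // or overrides the controller's choice.
  fun onSetCloseCaptionsEnabled(enabled: Boolean)
}
```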

@peter-kovacs-accedo
Author

The previous versions allowed us to query the available tracks, choose from them, and then apply the change on the player in the callback (see the sketch after the list below).

I was also thinking about how it could be done, extending your brainstorming:

  • Select the first available CC track (or the second, as in most cases the first is "disabled", or select the system language, etc.). Not nice, but it's a start. It also runs into the problem you've already mentioned: subtitle track naming isn't unified when trying to select the system/app language.
  • Provide a specific key for custom actions (it's already possible to add custom buttons to the session, which also appear in Chrome) and make this accessible to Assistant as well (https://developer.android.com/media/legacy/mediasession#custom-action).
  • A conversation with Assistant: could it ask which track the user wants?
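Here is the kind of selection I mean for the first point, sketched with the current Media3 Player API (the "first supported text track" heuristic is just the naive start mentioned above, and `enableFirstTextTrack` is an illustrative name):

```kotlin
import androidx.media3.common.C
import androidx.media3.common.Player
import androidx.media3.common.TrackSelectionOverride

// Sketch: query the text tracks and enable the first supported one as a naive default.
fun enableFirstTextTrack(player: Player): Boolean {
  val textGroup = player.currentTracks.groups.firstOrNull { group ->
    group.type == C.TRACK_TYPE_TEXT && group.isSupported
  } ?: return false
  player.trackSelectionParameters = player.trackSelectionParameters
    .buildUpon()
    .setTrackTypeDisabled(C.TRACK_TYPE_TEXT, false)
    .setOverrideForType(TrackSelectionOverride(textGroup.mediaTrackGroup, /* trackIndex= */ 0))
    .build()
  return true
}
```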

Thanks for the updates. It is a pleasure to get some insight into how the decisions were made 🙂 and I appreciate how quickly you picked up the topic!
