

Would like an example of playing music with background execution in Flutter #9250

Open
eseidelGoogle opened this issue Oct 31, 2018 · 34 comments
Labels
co.proposal Community ask to change an approach or process for docs.flutter.dev d.new-feature Adds new Flutter content e2-days Effort: < 5 days from.flutter-sdk Reported via move from flutter/flutter lang.native-code Involves Swift, ObjC, Java, or Kotlin code for mobile apps p3-low Valid but not urgent concern. Resolve when possible. Encourage upvote to surface. st.triage.ltw Indicates Lead Tech Writer has triaged

Comments

@eseidelGoogle
Contributor

e.g. https://codelabs.developers.google.com/codelabs/android-music-player/index.html except in Flutter.

This came up from discussions with a customer. FYI @bkonyi @RedBrogdon

@bkonyi
Contributor
bkonyi commented Oct 31, 2018

Would we want this example for both Android and iOS?

@bkonyi bkonyi self-assigned this Oct 31, 2018
@eseidelGoogle
Contributor Author

I wouldn't necessarily start with the goal of publishing. It would be nice to publish if others think it's worthwhile, but I think the most useful part of this exercise is validating that the background execution code we have can support (common?) use cases like this.

If I were doing this, I'd try it on Android first and see if that works. If you/Matt/Android/others decide it's worth publishing, then I'd try to do it on both platforms.

@ryanheise
ryanheise commented Nov 28, 2018

I think MP3 players and the like are a very common type of project, so this will be extremely useful.

I have just published a new package that does background execution of audio called audio_service:

https://pub.dartlang.org/packages/audio_service

I have only implemented the Android side of things but am hoping someone with the relevant iOS knowledge is out there and willing to contribute.

What this plugin does is set up all the platform-level machinery typical for playing audio (on Android: starting a media browser service in the foreground, acquiring a wake lock and audio focus, etc.), but it lets you write the audio-playing code itself in Dart, making use of other Dart plugins to do the actual playback. This is (for starters) because some users of this plugin may want to play MP3s in the background, while other users may want to play text-to-speech or some other custom audio (think of an RSS reader that reads out articles with text-to-speech).

One thing I've noticed is that a lot of plugins out there won't actually work when registered with the FlutterNativeView's registry, because they do something like registrar.activity().getApplicationContext(), and there activity() will be null and produce a NullPointerException. The best I can do is contact the authors of those plugins as I encounter them and suggest a fix.

Anyway, please feel free to try out audio_service and contributors are welcome (especially on the iOS side)
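For context, the Dart-side plumbing that background-execution plugins like this rely on looks roughly like the sketch below: the UI isolate hands the platform side a handle to a top-level Dart entrypoint, and the platform side later starts a background isolate that runs that entrypoint. The channel and method names here are hypothetical, purely to illustrate the pattern.

import 'dart:ui';

import 'package:flutter/services.dart';
import 'package:flutter/widgets.dart';

// Hypothetical channel name, for illustration only.
const MethodChannel _channel = MethodChannel('example.audio/background');

Future<void> startBackgroundAudio() async {
  // Look up a handle for the top-level entrypoint so the platform side can
  // later spin up a background isolate that runs it.
  final CallbackHandle? handle =
      PluginUtilities.getCallbackHandle(backgroundEntrypoint);
  await _channel.invokeMethod('start', handle!.toRawHandle());
}

// Must be a top-level (or static) function so it can be looked up by handle.
void backgroundEntrypoint() {
  WidgetsFlutterBinding.ensureInitialized();
  // Set up a MethodChannel here and run the audio code, much like the
  // callbackDispatcher example further down this thread.
}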

@ryanheise

I have a question about how to manage spawning and destroying the background isolate, whether we should leave one running forever or repeatedly start and stop one as needed.

There is overhead in spawning and initialising a new isolate, so when the media player is paused, I think it makes sense to keep the background isolate alive so that playback can resume quickly. This is as opposed to when the user stops the media player, where I guess it's OK to completely shut down the background isolate. (Or is it?)

All of the Google-provided examples so far keep the background isolate running forever (or until it is forcibly destroyed by the system to reclaim memory). So if you wanted to do it, what would be the recommended way to manually shut down the isolate, either from within the isolate or externally? Or is it just recommended to follow the provided examples and just leave the isolate running forever?

@zoechi
zoechi commented Dec 7, 2018

@ryanheise the isolate ends itself like main() does if it has no more work to do.
This usually means when the event queue is empty, i.e. when there are no more futures being awaited or streams subscribed to.
You could also use https://api.dartlang.org/stable/2.1.0/dart-isolate/Isolate/kill.html but that would not be a good fit for your situation.

I think you need to do some profiling (memory usage when kept open vs startup time) to figure out if it is better for your use case to keep it open or close-and-reopen.
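Here's a minimal plain-Dart sketch (not tied to any plugin) of the two shutdown paths: natural exit when there is no more work to do, and a forced Isolate.kill(). The worker function is just for illustration.

import 'dart:isolate';

Future<void> main() async {
  final exitSignal = ReceivePort();
  final isolate =
      await Isolate.spawn(worker, 'play', onExit: exitSignal.sendPort);

  // The spawned isolate ends on its own once its event queue is empty and it
  // holds no open ReceivePorts -- the same rule that applies to main().
  await exitSignal.first;
  print('worker exited on its own');

  // Forcing termination from the outside would instead look like this
  // (a no-op here, since the worker has already exited):
  isolate.kill(priority: Isolate.immediate);
}

void worker(String message) {
  // Nothing else is scheduled and no ports are open, so returning from this
  // function ends the isolate.
  print('worker received: $message');
}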

@ryanheise

@zoechi Thanks very much for the explanation. I'm guessing that platform channels also fall into this category, so I would need to call channel.setMethodCallHandler(null) within the isolate before the isolate will shut down gracefully?

I've updated the audio_service plugin to retain the isolate on pause and (hopefully) terminate the isolate on stop.

@zoechi
zoechi commented Dec 7, 2018

As far as I remember channel.setMethodCallHandler(null) is not necessary.

@ryanheise

@zoechi Hmm, then how does the background isolate normally stay alive long enough to receive messages on the method channel? For example, here's the isolate entrypoint for the Geofencing plugin by @bkonyi

void callbackDispatcher() {
  const MethodChannel _backgroundChannel =
      MethodChannel('plugins.flutter.io/geofencing_plugin_background');
  WidgetsFlutterBinding.ensureInitialized();

  _backgroundChannel.setMethodCallHandler((MethodCall call) async {
    print("Callback Dispatcher Invoked: ${call.arguments}");
    final List<dynamic> args = call.arguments;
    final Function callback = PluginUtilities.getCallbackFromHandle(
        CallbackHandle.fromRawHandle(args[0]));
    assert(callback != null);
    final List<String> triggeringGeofences = args[1].cast<String>();
    final List<double> locationList = <double>[];
    // 0.0 becomes 0 somewhere during the method call, resulting in wrong
    // runtime type (int instead of double). This is a simple way to get
    // around casting in another complicated manner.
    args[2]
        .forEach((dynamic e) => locationList.add(double.parse(e.toString())));
    final Location triggeringLocation = locationFromList(locationList);
    final GeofenceEvent event = intToGeofenceEvent(args[3]);
    callback(triggeringGeofences, triggeringLocation, event);
  });
  print('GeofencingPlugin dispatcher started');
  _backgroundChannel.invokeMethod('GeofencingService.initialized');
}

It seems to me that the only thing keeping the isolate alive is the method channel. But if you're right, it raises the question of what we would need to do to let the isolate die.

@zoechi
zoechi commented Dec 9, 2018

Not sure. Perhaps someone else with more insight can shed some light.

@ghost
ghost commented Jun 6, 2019

Is there any audio player that supports background music playback on both platforms?

@seyidnaali
seyidnaali commented Jun 24, 2019

I'm using the fluttery_audio plugin to play songs from a remote source; you can use it:
https://pub.dev/packages/fluttery_audio

----- I migrated my project to Android and ran into this error with the plugin. Can someone help me, please?

This is the error:

Android -> Flutter: onBufferingUpdate()
E/AndroidRuntime(29083): FATAL EXCEPTION: AudioPlayer
E/AndroidRuntime(29083): java.lang.RuntimeException: Methods marked with @UiThread must be executed on the main thread. Current thread: AudioPlayer
E/AndroidRuntime(29083):     at io.flutter.embedding.engine.FlutterJNI.ensureRunningOnMainThread(FlutterJNI.java:605)
E/AndroidRuntime(29083):     at io.flutter.embedding.engine.FlutterJNI.dispatchPlatformMessage(FlutterJNI.java:515)
E/AndroidRuntime(29083):     at io.flutter.embedding.engine.dart.DartMessenger.send(DartMessenger.java:76)
E/AndroidRuntime(29083):     at io.flutter.embedding.engine.dart.DartExecutor.send(DartExecutor.java:166)
E/AndroidRuntime(29083):     at io.flutter.view.FlutterNativeView.send(FlutterNativeView.java:155)
E/AndroidRuntime(29083):     at io.flutter.plugin.common.MethodChannel.invokeMethod(MethodChannel.java:98)
E/AndroidRuntime(29083):     at io.flutter.plugin.common.MethodChannel.invokeMethod(MethodChannel.java:84)
E/AndroidRuntime(29083):     at io.fluttery.flutteryaudio.FlutteryAudioPlugin$1.onPlayerPlaybackUpdate(FlutteryAudioPlugin.java:83)
E/AndroidRuntime(29083):     at io.fluttery.flutteryaudio.AudioPlayer$1.run(AudioPlayer.java:218)
E/AndroidRuntime(29083):     at android.os.Handler.handleCallback(Handler.java:873)
E/AndroidRuntime(29083):     at android.os.Handler.dispatchMessage(Handler.java:99)
E/AndroidRuntime(29083):     at android.os.Looper.loop(Looper.java:193)
E/AndroidRuntime(29083):     at android.os.HandlerThread.run(HandlerThread.java:65)

@arthurb123

I'm using the fluttery_audio plugin to play songs from a remote source; you can use it:
https://pub.dev/packages/fluttery_audio

----- I migrated my project to Android and ran into this error with the plugin. Can someone help me, please?

[same stack trace as above]

I am having the exact same issue and am getting the same stack trace when using fluttery_audio.
Have you by any chance found a solution to your problem?

@ryanheise

You can try audioplayer which doesn't have this error:

https://pub.dev/packages/audioplayer

You can use audioplayer in combination with audio_service to play audio in the background:

https://pub.dev/packages/audio_service

Background audio is Android-only so far, but help is wanted for the iOS side:

ryanheise/audio_service#10

@Sun3
Contributor
Sun3 commented Sep 18, 2019

Any updates on a Flutter plugin that plays audio from a local Flutter asset folder (or the web), plays in the background, and shows media controls on the lock screen? Writing it in Kotlin might be a good way to go, since it's the default Android language when creating a new Flutter project.

I have looked at every audio plugin and none of them really does all of the above; most come close, but they also don't look updated.

Thank you. I do believe Flutter really needs a good audio player plugin, like the video_player plugin from the Flutter team.

@ryanheise

Hi @Sun3

If you were to write this in native Android, you would use MediaPlayer to play audio, and you would use MediaBrowserService to support background play with controls on the lock screen.

In Flutter, my audio_service plugin creates the MediaBrowserService for you, and the audioplayer plugin (among many others) creates a MediaPlayer for you. audio_service simply wraps a service around any other audio code you want to write, in order to keep it running in the background and present the lock screen controls.

So, you can do what you describe with audio_service, and the example project already demonstrates playing audio from the web in the background and showing media controls on the lock screen. I'd be interested to know what you believe can't be done with this plugin and I'd be happy to help.

Once you create an audio service, you can write whatever audio playing code you like inside of it. My example demonstrates playing audio from the web, and also playing text to speech, but you could also play audio from assets by using Flutter's AssetBundle to copy the asset data into a regular file and then use the audioplayer plugin as before to play it, or if it's a short audio file you could use the soundpool plugin.
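In case it helps, here's a minimal sketch of that asset-copy step, assuming the path_provider plugin for a writable directory and the audioplayer plugin for playback. The asset name is made up, so substitute one registered in your pubspec.yaml.

import 'dart:io';

import 'package:audioplayer/audioplayer.dart';
import 'package:flutter/services.dart' show rootBundle;
import 'package:path_provider/path_provider.dart';

Future<void> playAssetAudio() async {
  // Copy the bundled asset into a real file that the platform player can open.
  final data = await rootBundle.load('assets/audio/intro.mp3');
  final dir = await getTemporaryDirectory();
  final file = File('${dir.path}/intro.mp3');
  await file.writeAsBytes(data.buffer.asUint8List());

  // Hand the file path to the audioplayer plugin.
  final audioPlayer = AudioPlayer();
  audioPlayer.play(file.path);
}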

The video_player plugin does not support background execution, although I am creating a fork of this project to support it. Once that is done, you should also be able to add background and lock screen controls with audio_service.

@Sun3
Contributor
Sun3 commented Sep 19, 2019

They are all great plugins.

The audioplayer plugin doesn't handle showing media controls on the lock screen and doesn't handle loading local audio files located in the Flutter app project structure. It was last updated 10 months ago.

The audio_service plugin doesn't support loading local audio files located in the Flutter app project structure.

The assets_audio_player plugin does load audio assets from the Flutter assets folder but does not support media controls on the lock screen. It was last updated about 6 months ago.

@ryanheise
ryanheise commented Sep 19, 2019

The audio_service plugin doesn't support loading local audio files located in the Flutter app project structure.

I actually do exactly what you describe in my own app using those plugins, and I explained in my previous comment how I did it.

  • audio_service allows you to run any audio code in the background with lock screen controls
  • AssetBundle lets you read data from an asset
  • audioplayer lets you play that audio

I think the AssetBundle API is the ingredient you're missing. It might be convenient if the audioplayer plugin added support for playing directly from assets, but it is literally only one extra line of code to do that step yourself with AssetBundle.

Perhaps you're after an all-in-one plugin, but since Flutter is designed to be composable, allowing components to be mixed and matched, I think composing separate plugins is more in the spirit of Flutter. Consider that there are many different use cases for the types of audio you might want to play in the background, such as:

  • Streaming audio from a URL/URI
  • Text To Speech
  • Local assets
  • Synthesised audio
  • Custom audio sequences involving combinations of the above within the same media session
  • etc.

That last point means there is actually an infinite number of different use cases, and audio_service tries to stay agnostic of all of that, allowing you to write your "own" code to play whatever audio you want, so it does not limit you. audio_service does this by asking you to implement a callback like this:

onStart: () async {
  // ... write here STANDARD Dart code to play your audio ...
}

This way, I can support all of the above infinite use cases easily.

If you want to play audio from an asset, then you can insert code in that callback to do just that:

onStart: () async {
  // `filePath` (a writable location, e.g. from path_provider), `name` (the
  // asset's file name) and `_completer` (completed on stop) are assumed to be
  // defined elsewhere in the background task.
  await File(filePath).writeAsBytes(
      (await rootBundle.load('assets/$name')).buffer.asUint8List());
  final AudioPlayer audioPlayer = AudioPlayer();
  audioPlayer.play(filePath);
  await _completer.future;
}

@Sun3
Contributor
Sun3 commented Sep 19, 2019

I understand what you are saying, and yes :) I would love to see one plugin handle audio for different scenarios. I would still like to suggest that Google maintain an audio plugin (like the video_player plugin they support), written in Swift and Kotlin, since those are the new default languages selected for new projects. Will plugins written in Java/Objective-C remain compatible and be updated for future OS releases and API deprecations on each platform? I know Java is not going away anytime soon, but Objective-C has been deprecated for a while and apps are written in Swift (and now SwiftUI).

My personal experience from past apps is that when you introduce too many plugins or outside libraries, you can run into issues later when updates are needed: one plugin updates and another doesn't. I still remember DLL hell from my Windows programming days.

@ryanheise I really do thank you for your feedback and suggestions and you have great plugins.

Your suggestion to load audio assets from the Flutter asset bundle does work: load the asset, then copy it to the appropriate iOS or Android local folder.

In the plugin itself, for example, loading the Flutter asset directly from the Android code would look like this:

// Resolve the Flutter asset key and open the asset directly from the Android
// AssetManager (v1 embedding Registrar API).
val assetManager = registrar.context().assets
val key = registrar.lookupKeyForAsset("assets/audios/my_audio_file.mp3")
val fd = assetManager.openFd(key)

Thanks

@ryanheise

@Sun3 Putting aside the suggestions directed at Google, I'll just comment on the idea of building a monolithic plugin that does everything conceivable. The key point you have to understand here is that the only way to handle all scenarios is to use a callback. What you need, for good reasons, is a separate API to begin and end a background audio session, and a separate API to play text to speech, and a separate API to play an audio file, and a separate API to do synthesised speech, and so on and so on, and you want all of these to work together.

  1. Developers should be able to link in only the APIs they use. Not all apps want to use the text-to-speech API in the background, so they should be able to use the text-to-speech API without the background audio API, for example. They should also be able to use the audio player API without having to link in the library for text-to-speech.
  2. Developers should be able to explicitly begin and end a background audio session, and then execute a programmable "sequence" of audio output code within that session. For example, imagine an MP3 player with a playlist that plays a sequence of audio files within the same background audio session. Then imagine a similar app that uses text-to-speech to read out a sequence of news items in your feed, all within the same audio session. Then imagine another app that does something similar with a mixed feed, where some items have MP3 enclosures and others have just plain text: the app should BEGIN a background audio session, loop over each item in the feed, play it as an audio file or read it out with text-to-speech depending on the item, and then END the audio session (see the sketch after this list). If you think the best way for Google to handle this use case is to build exactly that scenario into their plugin, keep in mind that it was just one very specific case. Many apps run all sorts of custom audio sequences within the same background audio session, and the only way to handle this is to treat the management of the background audio session as a separate API.
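To make point 2 concrete, here is a hypothetical sketch of the "mixed feed" case. FeedItem, playMp3 and speakText are made-up placeholders for whatever audio-file and text-to-speech plugins an app chooses to compose; only the control flow within a single background session matters here.

class FeedItem {
  final String? mp3Url;
  final String text;
  FeedItem({this.mp3Url, required this.text});
}

Future<void> playFeedInBackground(
  List<FeedItem> feed,
  Future<void> Function(String url) playMp3,
  Future<void> Function(String text) speakText,
) async {
  // BEGIN: this would run inside the background audio session (for example,
  // from an onStart-style callback).
  for (final item in feed) {
    if (item.mp3Url != null) {
      await playMp3(item.mp3Url!); // item has an MP3 enclosure
    } else {
      await speakText(item.text); // plain-text item, read it aloud
    }
  }
  // END: returning lets the background session be shut down.
}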

So sure, you can encourage Google to come up with their own officially maintained plugin for the various motivating reasons you have stated (and I am open to that), but even if they do, they will likely come up with a very similar design to the one I arrived at if they want to support all use cases. That is, separate APIs that can be mixed and matched rather than a single plugin that links in text-to-speech and playing audio files and synthesised audio, and custom sequences involving program logic.

(It is also worth noting that Google actively encourages the community to contribute to Flutter in an open source fashion rather than building official plugins for everything. I'd also like to point out that my plugin is designed to do a single job and do it well, but it's also a library, which means other projects can link it in. If you would like to contribute in some way to solving your specific problem for other people, one thing you could do is create your own audio plugin that links in audio_service and audioplayer and provides the exact API that you want, and share it on pub.dev.)

@Sun3
Contributor
Sun3 commented Sep 20, 2019

@ryanheise I wasn't suggesting building a monolithic plugin that does everything conceivable. As I previously stated, I think you have great plugins. I am only asking Google to consider creating an audio plugin like the video_player plugin they support. Maintainability is a strong concern when building enterprise-level apps; the language a plugin is written in and its ability to keep up with future OS updates are things I learned early on can kill your app.

I appreciate all of your suggestions.

@ryanheise

I wasn't suggesting to build a monolithic plugin that does everything conceivable [...] I am only asking Google to consider [...]

Your suggestions to Google are separate, so just to be clear, I have nothing more to say on that personally. I was more concerned about your feedback on the plugins developed by the open source community: that Plugin A didn't offer the services of Plugin B or C, that Plugin B didn't offer the services of Plugin C, and that Plugin C didn't offer the services of Plugin B (A = Audio, B = Background, C = Assets). For example, you commented about my plugin (B):

The audio_service plugin doesn't support loading local audio files located in the Flutter app project structure.

when in fact you are free to mix and match A + B + C to meet your requirements. And then you clarified:

I would love to see one plugin handle audio for different scenarios.

So this is really what I understood your request to be in terms of features.

You may not want all the other conceivable combinations yourself, but I'm asking you to consider that other users want A + B + "D", others want A + C + E, and yet others just want "D", and so on ("D" could be something like text-to-speech, "E" could be generated/synthesised audio). So designing a set of APIs needs to take this into account, and from a bird's-eye view it begins to make sense to make each of these a separate API that you can mix together in whatever combination fits your specific use case. Again, this point is independent of whether you would rather Google build these plugins.

@Sun3
Contributor
Sun3 commented Sep 21, 2019

All of your points in terms of features are great; I do not disagree with them :)

@consuelogranata

Hi! I use audioplayers to listen to a radio channel (streaming). It works well in the background except when the user receives a call. In this case playback stops but doesn't restart automatically at the end of the call. Any solution for that? Thank you!!!

@remoteportal

It's been a year... has anyone contributed the iOS part yet?

@ryanheise

Hi @remoteportal

Not yet. A number of community members have contributed pointers to the iOS code that will need to be integrated, but unfortunately (and understandably) the issue is that we are all open source volunteers who have various other priorities that get in the way. However, very recently, two contributors have offered to try to work on the iOS side: ryanheise/audio_service#10 (comment)

@ryanheise

@consuelogranata , audio_service provides the necessary callbacks to detect when you receive a call so that you can pause, and when audio focus returns to your app so that you can resume audio playback.

@ryanheise

@remoteportal I was recently able to get my hands on a Mac, watched a Lynda.com tutorial on Objective C, and was able to hack together a partial iOS implementation. We also have others contributing iOS pull requests on this base, so it appears the ball is rolling.

Most features are actually implemented on iOS, except for displaying certain information in "Now Playing", such as album art and the current playback position, which we hope to have down the track.

You can try out audio_service here.

@suragch
Contributor
suragch commented Dec 16, 2019

In addition to a solution for Android and iOS, I also need to play audio on the web. An official package would be nice.

@kf6gpe
Contributor
kf6gpe commented Jan 27, 2020

This looks like it's two things: one is about providing a sample, the other is about providing a plugin for media playback. We appreciate your contributions! I'm going to add the plugin label so that this surfaces to the plugin team.

@rohansohonee

https://github.com/rohansohonee/ufmp
UFMP is a sample app that I have put together using @ryanheise's plugins. This sample can serve as a starting resource for building an audio app. Feedback and improvements are welcome.

@bkonyi
Contributor
bkonyi commented Jan 27, 2022

Clearing my assignment as I won't have time to look at this in the foreseeable future. @RedBrogdon, maybe this is something DevRel could pull together at some point?

@darshankawar
Member

Adding 'will need additional triage' since the reported issue is quite old, and to properly track it going forward.

@Hixie Hixie transferred this issue from flutter/flutter Aug 15, 2023
@Hixie
Contributor
Hixie commented Aug 15, 2023

I've transferred this to the website repo since if we do this it should probably be as a cookbook example.

@ryanheise

FYI, @azamor-luccas created a demo of this earlier in the year:

ryanheise/audio_service#982

@danagbemava-nc danagbemava-nc added st.triage.triage-team Triage team reviewing and categorizing the issue p3-low Valid but not urgent concern. Resolve when possible. Encourage upvote to surface. e2-days Effort: < 5 days co.proposal Community ask to change an approach or process for docs.flutter.dev d.new-feature Adds new Flutter content from.flutter-sdk Reported via move from flutter/flutter and removed st.triage.triage-team Triage team reviewing and categorizing the issue labels Aug 16, 2023
@atsansone atsansone added from.team Reported by Dash docs team member st.triage.ltw Indicates Lead Tech Writer has triaged and removed from.team Reported by Dash docs team member labels Aug 21, 2023
@atsansone atsansone added the lang.native-code Involves Swift, ObjC, Java, or Kotlin code for mobile apps label May 20, 2024