Integrate photo, audio, and video content into your apps.

Posts under Media tag

61 Posts

INPlayMediaIntent `mediaSearch` mediaName unreliable when searching for playlists
We are working on an app that uses INPlayMediaIntent to let users select and play music with Siri. In building out this feature, we have noticed that when users select playlists to play, Siri consistently leaves information out of the intent that we use to resolve the media in the app, and there is generally no rhyme or reason as to why some information is left out. Walking through a couple of test cases, here is each phrase and the corresponding mediaSearch that we receive when testing:

"Hey Siri, play the playlist happy songs in the app" (this is a working example)

```
▿ Optional<INMediaSearch>
  - some : <INMediaSearch: 0x114050780> {
        reference = 0; mediaType = 5; sortOrder = 0;
        albumName = <null>; mediaName = happy songs;
        genreNames = (); artistName = <null>; moodNames = ();
        releaseDate = <null>; mediaIdentifier = <null>;
    }
```

"Hey Siri, play the playlist my favorites in the app" (this fails with a null mediaName)

```
▿ Optional<INMediaSearch>
  - some : <INMediaSearch: 0x114050600> {
        reference = 0; mediaType = 5; sortOrder = 0;
        albumName = <null>; mediaName = <null>;
        genreNames = (); artistName = <null>; moodNames = ();
        releaseDate = <null>; mediaIdentifier = <null>;
    }
```

"Hey Siri, play the playlist working out playlist in the app" (this fails as the term "playlist" is excluded)

```
▿ Optional<INMediaSearch>
  - some : <INMediaSearch: 0x114050ae0> {
        reference = 0; mediaType = 5; sortOrder = 0;
        albumName = <null>; mediaName = working out;
        genreNames = (); artistName = <null>; moodNames = ();
        releaseDate = <null>; mediaIdentifier = <null>;
    }
```

"Hey Siri, play the playlist recently added in the app" (this fails with a null mediaName)

```
▿ Optional<INMediaSearch>
  - some : <INMediaSearch: 0x1140507e0> {
        reference = 0; mediaType = 5; sortOrder = 0;
        albumName = <null>; mediaName = <null>;
        genreNames = (); artistName = <null>; moodNames = ();
        releaseDate = <null>; mediaIdentifier = <null>;
    }
```

Based on the above, Siri seems to ignore playlists named "Recently Added" and "My Favorites", as well as playlists that have the word "playlist" in their name, such as "Working Out Playlist". To rectify this, we attempted to set the INVocabulary for the playlist titles that a user has in the app, as suggested in this WWDC session: https://developer.apple.com/videos/play/wwdc2020/10060/

```swift
let vocabulary = INVocabulary.shared()
vocabulary.setVocabularyStrings(NSOrderedSet(array: [
    "my favorites",
    "recently added",
    "working out playlist"
]), of: .mediaPlaylistTitle)
```

This seems to have no effect. We understand the note in https://developer.apple.com/documentation/sirikit/registering_custom_vocabulary_with_sirikit/ stating that "a few minutes" should be allowed before testing custom vocabulary, but waiting upwards of 20 minutes and even restarting the device did not result in any of the custom vocabulary making a difference. If these playlist names are set in AppIntentVocabulary.plist, "Recently Added" and "My Favorites" can be discovered as playlists, but the other failed test cases remain failing. The obvious shortcoming here is that plist entries are not dynamic.
```xml
<key>ParameterVocabularies</key>
<array>
    <dict>
        <key>ParameterNames</key>
        <array>
            <string>INPlayMediaIntent.playlistTitle</string>
        </array>
        <key>ParameterVocabulary</key>
        <array>
            <dict>
                <key>VocabularyItemIdentifier</key>
                <string>working out playlist</string>
                <key>VocabularyItemSynonyms</key>
                <array>
                    <dict>
                        <key>VocabularyItemPhrase</key>
                        <string>working out playlist</string>
                    </dict>
                </array>
            </dict>
            <dict>
                <key>VocabularyItemIdentifier</key>
                <string>recently added</string>
                <key>VocabularyItemSynonyms</key>
                <array>
                    <dict>
                        <key>VocabularyItemPhrase</key>
                        <string>recently added</string>
                    </dict>
                </array>
            </dict>
            <dict>
                <key>VocabularyItemIdentifier</key>
                <string>my favorites</string>
                <key>VocabularyItemSynonyms</key>
                <array>
                    <dict>
                        <key>VocabularyItemPhrase</key>
                        <string>my favourites</string>
                    </dict>
                    <dict>
                        <key>VocabularyItemPhrase</key>
                        <string>my favorites</string>
                    </dict>
                </array>
            </dict>
        </array>
    </dict>
</array>
```

Given the above, our questions are as follows:

1. Is there documentation surrounding how Siri may pass along the mediaSearch in INPlayMediaIntent, and how/why information may be left out?
2. Why does setting custom vocabulary with INVocabulary seem to have no effect, yet the same vocabulary in AppIntentVocabulary.plist does have an effect?
3. Is the behavior we are experiencing expected, or should this be reported as a bug?

We've published the test app that we are using for debugging this functionality at this link: https://github.com/awojnowski/SiriTest
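In the meantime, a defensive intent handler can at least keep the nil-mediaName cases from failing outright. Below is a minimal sketch of that idea; `PlaylistStore` and its members are hypothetical stand-ins for the app's own lookup, not part of SiriKit.

```swift
import Intents

final class PlayMediaHandler: NSObject, INPlayMediaIntentHandling {

    func resolveMediaItems(for intent: INPlayMediaIntent,
                           with completion: @escaping ([INPlayMediaMediaItemResolutionResult]) -> Void) {
        // Siri sometimes delivers a playlist request with mediaName = nil
        // (e.g. "recently added"), so treat nil as "ask again" rather than a failure.
        guard let name = intent.mediaSearch?.mediaName else {
            completion([.needsValue()])
            return
        }
        // Hypothetical lookup against the app's own playlist store.
        if let playlist = PlaylistStore.shared.playlist(named: name) {
            let item = INMediaItem(identifier: playlist.id,
                                   title: playlist.title,
                                   type: .playlist,
                                   artwork: nil)
            completion(INPlayMediaMediaItemResolutionResult.successes(with: [item]))
        } else {
            completion([.unsupported()])
        }
    }

    func handle(intent: INPlayMediaIntent,
                completion: @escaping (INPlayMediaIntentResponse) -> Void) {
        completion(INPlayMediaIntentResponse(code: .handleInApp, userActivity: nil))
    }
}
```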
Replies: 3 · Boosts: 0 · Views: 139 · Activity: 3d
Can't share Video to Facebook
I have Facebook SDK version 17.0.2 and Xcode 15. Sharing photos and links works fine, but when I try sharing videos, I get the following error:

```
Failed to log access with error: access=<PATCCAccess 0x301d12b20>
accessor:<<PAApplication 0x301d27e30 identifierType:auditToken identifier:{pid:18440, version:47210}>>
identifier:A9159DCD-76B1-4C77-A01E-DA611929B50B kind:intervalEvent timestampAdjustment:0
visibilityState:0 assetIdentifierCount:0 accessCount:0 tccService:kTCCServicePhotos,
error=Error Domain=NSCocoaErrorDomain Code=4097 "connection to service with pid 15679
named com.apple.privacyaccountingd"
UserInfo={NSDebugDescription=connection to service with pid 15679 named com.apple.privacyaccountingd}
```
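The log references kTCCServicePhotos, so one thing worth ruling out is the app's Photos authorization state before handing the video to the share dialog. A minimal sketch of that check, where `share` is a placeholder for whatever invokes the Facebook share dialog:

```swift
import Photos

func ensurePhotosAccess(then share: @escaping () -> Void) {
    // The error mentions kTCCServicePhotos, so confirm read access to the
    // library before the SDK tries to pull the video asset.
    switch PHPhotoLibrary.authorizationStatus(for: .readWrite) {
    case .authorized, .limited:
        share()
    case .notDetermined:
        PHPhotoLibrary.requestAuthorization(for: .readWrite) { status in
            guard status == .authorized || status == .limited else { return }
            DispatchQueue.main.async(execute: share)
        }
    default:
        break // denied or restricted: direct the user to Settings
    }
}
```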
Replies: 1 · Boosts: 0 · Views: 174 · Activity: 2w
Camera issues on iPhone 14 Pro Max after iOS 18 beta 3
Four days after installing iOS 18 beta 3, my iPhone no longer has macro control, 0.5x zoom, and the other Pro-model camera features. I've tried restarting my phone, but nothing changed. Settings keeps saying that the phone has an unknown part on the camera or that it's not genuine. I bought it brand new from T-Mobile last year. I need help figuring out whether this is just a beta issue or actual physical damage.
Replies: 3 · Boosts: 1 · Views: 297 · Activity: Jul ’24
PhotoAsset in TagView
I'm trying to recreate the "Tag people" functionality in Instagram, where a carousel of media the user has selected is displayed to them and they can go through and tag people in each item. I'm trying to achieve this (but with food items instead of people) with a TabView over PHAssets; however, the result is some funky behaviour I'm pulling my hair out trying to understand. The items are tagged correctly, but scrolling in the TabView only works sporadically. It occasionally scrolls fine, but all of a sudden won't let me scroll past one image (see attached video for an example).

```swift
import SwiftUI
import Photos

struct TagItemView: View {
    @Binding var selectedAssets: [PHAsset]
    @State private var showTagSheet = false
    @State private var currentAsset: PHAsset? {
        didSet {
            if let currentAsset = currentAsset {
                assetTags = tags[currentAsset.localIdentifier] ?? []
            }
        }
    }
    @State private var tags: [String: [String]] = [:] // Dictionary to store tags for each media item
    @State private var assetTags: [String] = [] // Tags for the current asset

    var body: some View {
        VStack {
            mediaCarousel
            tagsView
            Spacer()
        }
        .background(Color.black.ignoresSafeArea())
        .onAppear {
            if let firstAsset = selectedAssets.first {
                currentAsset = firstAsset
            }
        }
        .onChange(of: currentAsset) { newAsset in
            if let currentAsset = newAsset {
                assetTags = tags[currentAsset.localIdentifier] ?? []
                print("currentAsset changed: \(currentAsset.localIdentifier)")
                print("assetTags: \(assetTags)")
            }
        }
        .sheet(isPresented: $showTagSheet) {
            TagSheetView(selectedAsset: $currentAsset, tags: $tags, showTagSheet: $showTagSheet, assetTags: $assetTags)
        }
    }

    private var mediaCarousel: some View {
        VStack {
            TabView(selection: $currentAsset) {
                ForEach(selectedAssets, id: \.self) { asset in
                    if asset.mediaType == .image {
                        TagItemImageView(asset: asset)
                            .tag(asset.localIdentifier)
                            .onAppear {
                                currentAsset = asset
                                print("Asset in view (onAppear): \(asset.localIdentifier)")
                            }
                            .onTapGesture {
                                currentAsset = asset
                                showTagSheet = true
                            }
                    } else if asset.mediaType == .video {
                        TagItemVideoView(asset: asset)
                            .tag(asset.localIdentifier)
                            .onAppear {
                                currentAsset = asset
                                print("Asset in view (onAppear): \(asset.localIdentifier)")
                            }
                            .onTapGesture {
                                currentAsset = asset
                                showTagSheet = true
                            }
                    }
                }
            }
            .tabViewStyle(PageTabViewStyle(indexDisplayMode: .always))
            .frame(height: UIScreen.main.bounds.height * 0.4) // Fixed height for carousel
        }
    }

    private var tagsView: some View {
        ScrollView {
            if !assetTags.isEmpty {
                ItemView(assetTags: assetTags, removeTag: { tag in
                    removeTag(tag, from: currentAsset!)
                })
                .transition(.opacity)
            } else {
                InstructionsView()
                    .transition(.opacity)
            }
        }
        .background(Color.black)
        .padding(.top, 8)
        .padding(.horizontal, 15)
    }

    private func removeTag(_ tag: String, from asset: PHAsset) {
        guard var assetTags = tags[asset.localIdentifier] else { return }
        assetTags.removeAll { $0 == tag }
        tags[asset.localIdentifier] = assetTags
        if currentAsset?.localIdentifier == asset.localIdentifier {
            self.assetTags = assetTags
        }
    }
}
```
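One likely culprit for anyone hitting the same behaviour: the TabView's selection binding is a `PHAsset?`, but each page is tagged with `asset.localIdentifier` (a `String`), so the selection never matches a tag, and the `onAppear` handlers that reassign `currentAsset` fight the paging gesture. A minimal sketch of the fix under that assumption, where `AssetPage` is a hypothetical stand-in for the image/video page views:

```swift
import SwiftUI
import Photos

struct CarouselView: View {
    let selectedAssets: [PHAsset]
    @State private var currentAssetID: String?

    var body: some View {
        TabView(selection: $currentAssetID) {
            ForEach(selectedAssets, id: \.localIdentifier) { asset in
                AssetPage(asset: asset) // hypothetical page view
                    .tag(Optional(asset.localIdentifier)) // Optional<String> to match the String? binding
            }
        }
        .tabViewStyle(.page(indexDisplayMode: .always))
        // Seed the selection once; let TabView drive it afterwards instead
        // of reassigning it from each page's onAppear.
        .onAppear { currentAssetID = selectedAssets.first?.localIdentifier }
    }
}
```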
Replies: 0 · Boosts: 0 · Views: 169 · Activity: 3w
Multiview HLS with HDR
I have an HDR10+ encoded video that plays back on the Apple Vision Pro if loaded as a .mov, but when that video is encoded using the latest (1.23b) Apple HLS tools to generate an fMP4, the resulting m3u8 cannot be played back on the Apple Vision Pro; I only get a "Cannot Open" error. To generate the m3u8, I'm just calling mediafilesegmenter (with -iso-fragmented) and then variantplaylistcreator. This completes with no errors, and the m3u8 plays back on the Mac using VLC, but not on the Apple Vision Pro. The relevant part of the m3u8 is:

```
#EXT-X-STREAM-INF:AVERAGE-BANDWIDTH=40022507,BANDWIDTH=48883974,VIDEO-RANGE=PQ,CODECS="ec-3,hvc1.1.60000000.L180.B0",RESOLUTION=4096x4096,FRAME-RATE=24.000,CLOSED-CAPTIONS=NONE,AUDIO="audio1",REQ-VIDEO-LAYOUT="CH-STEREO"
{{url}}
```

Has anyone been able to use the HLS tools to generate fMP4s of MV-HEVC videos with HDR10?
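When the player UI only says "Cannot Open", the underlying AVPlayerItem error is usually more specific. A small diagnostic sketch (not a fix) that loads the variant playlist and logs the failure detail:

```swift
import AVFoundation

// Probe an HLS URL and print the item-level error and error log, which
// typically names the offending attribute or segment.
final class StreamProbe {
    private let player: AVPlayer
    private var observation: NSKeyValueObservation?

    init(url: URL) {
        let item = AVPlayerItem(url: url)
        player = AVPlayer(playerItem: item)
        observation = item.observe(\.status) { item, _ in
            switch item.status {
            case .failed:
                print("item error:", item.error ?? "unknown")
                let comments = item.errorLog()?.events.compactMap(\.errorComment) ?? []
                print("error log:", comments)
            case .readyToPlay:
                print("stream opened fine")
            default:
                break
            }
        }
        player.play()
    }
}
```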
Replies: 0 · Boosts: 0 · Views: 329 · Activity: Jul ’24
Location not visible in video recorded in third party app
I recently bought an Insta360 Flow gimbal. When recording video with the Insta360 Flow app, I cannot see the location in the Apple Photos app or in any other Apple apps. However, I can see the location in the Windows Photos app once I download the videos to my Windows PC. The location is also visible in an Android app once I share it through my Google account. With an EXIF app, I can see the location metadata in the EXIF table as well, but again it is not shown as a location. exiftool on my PC can also see the metadata, including location, as in the attached screenshot. Compared to video shot with the built-in Camera app, I cannot find any difference in terms of location metadata. What could be wrong? I contacted Insta360 app support; they do not seem to understand what's going on, just asking very simple questions again and again, like whether I have enabled GPS location access and whether I am shooting video. I also contacted Apple support; they are just saying it's a third-party issue and refusing to help further. If it's really a third-party issue, how come the location data is actually embedded as metadata, and a Windows PC and an Android device can see the location? BTW, I AirDropped this video to all my Apple devices, like my iPhone 15 ultra, iPad Air, and a very old iPhone, and none of them can see the location.
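One way to narrow this down: Apple's apps generally read a video's location from the asset-level QuickTime metadata item (com.apple.quicktime.location.ISO6709) rather than from EXIF-style tags, so a third-party app can embed location in a form exiftool sees but Photos ignores. A short sketch to check whether that specific entry is present in the gimbal app's output:

```swift
import AVFoundation

// Print the QuickTime ISO 6709 location string, if the file has one.
func printLocationMetadata(for url: URL) async throws {
    let asset = AVURLAsset(url: url)
    let metadata = try await asset.load(.metadata)
    let locationItems = AVMetadataItem.metadataItems(
        from: metadata,
        filteredByIdentifier: .quickTimeMetadataLocationISO6709
    )
    if let value = try await locationItems.first?.load(.stringValue) {
        print("QuickTime location:", value) // e.g. "+48.8577+002.2950/"
    } else {
        print("No com.apple.quicktime.location.ISO6709 entry found")
    }
}
```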
Replies: 1 · Boosts: 0 · Views: 399 · Activity: Jul ’24
What is the proper way to handle videos in SwiftData?
I'm creating an application with SwiftUI which gets images and videos from the Photos picker and then stores them with SwiftData for later use. I save both images and videos as Data with @Attribute(.externalStorage). But it seems wrong to me to store the videos that way; they can be several gigabytes in size. What is the correct way to handle something like this? Is it to store the URL and then, each time the user wants to see a video, save a temporary copy? If that's the case, can anyone show me how this should be done? Any comments appreciated. Guillermo
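A minimal sketch of the usual approach, assuming the goal is keeping multi-gigabyte blobs out of the store: copy each picked video into the app's Application Support directory and persist only the file name in the model (full paths go stale because the app container path can change between launches). All names here are illustrative:

```swift
import Foundation
import SwiftData

@Model
final class VideoItem {
    var fileName: String // relative name only; rebuild the full path on demand

    init(fileName: String) { self.fileName = fileName }

    var fileURL: URL {
        URL.applicationSupportDirectory
            .appending(path: "Videos", directoryHint: .isDirectory)
            .appending(path: fileName)
    }
}

// Copy a picked video out of its temporary location and create a model row.
func importVideo(from tempURL: URL) throws -> VideoItem {
    let dir = URL.applicationSupportDirectory.appending(path: "Videos")
    try FileManager.default.createDirectory(at: dir, withIntermediateDirectories: true)
    let name = UUID().uuidString + "." + tempURL.pathExtension
    try FileManager.default.copyItem(at: tempURL, to: dir.appending(path: name))
    return VideoItem(fileName: name)
}
```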
Replies: 0 · Boosts: 0 · Views: 277 · Activity: Jun ’24
Event callback issues with MediaSession API and iOS
Hi, I'm working on a web project that uses the MediaSession API to interface with the media notification on iOS. The issue I'm experiencing occurs after pressing the play button in the media session modal, where the session seems NOT to fire the event handler callback and also kills the media session itself. It's strange behaviour considering that the pause callback works fine.

```js
const audio_source = new Audio(url);

// The property is camelCase (mediaSession), and metadata must be a
// MediaMetadata instance rather than a plain object.
navigator.mediaSession.metadata = new MediaMetadata({
    // Metadata here
});

navigator.mediaSession.setActionHandler('play', (details) => {
    audio_source.play();
});

navigator.mediaSession.setActionHandler('pause', (details) => {
    audio_source.pause();
});
```
Replies: 0 · Boosts: 0 · Views: 278 · Activity: Jun ’24
Command Center / Dynamic Island missing icons and animations
Hello all! I'm setting up a really simple media player in my SwiftUI app. The code is the following:

```swift
import AVFoundation
import MediaPlayer

// NSObject inheritance is needed for the #selector-based notification observer below.
class AudioPlayerProvider: NSObject {
    private var player: AVPlayer

    override init() {
        self.player = AVPlayer()
        super.init()
        self.player.automaticallyWaitsToMinimizeStalling = false
        self.setupAudioSession()
        self.setupRemoteCommandCenter()
    }

    private func setupAudioSession() {
        do {
            try AVAudioSession.sharedInstance().setCategory(.playback, mode: .default)
            try AVAudioSession.sharedInstance().setActive(true)
        } catch {
            print("Failed to set up audio session: \(error.localizedDescription)")
        }
    }

    private func setupRemoteCommandCenter() {
        let commandCenter = MPRemoteCommandCenter.shared()
        commandCenter.playCommand.addTarget { [weak self] _ in
            guard let self = self else { return .commandFailed }
            self.play()
            return .success
        }
        commandCenter.pauseCommand.addTarget { [weak self] _ in
            guard let self = self else { return .commandFailed }
            self.pause()
            return .success
        }
    }

    func loadAudio(from urlString: String) {
        guard let url = URL(string: urlString) else { return }
        let asset = AVAsset(url: url)
        let playerItem = AVPlayerItem(asset: asset)
        self.player.pause()
        self.player.replaceCurrentItem(with: playerItem)
        NotificationCenter.default.addObserver(self,
                                               selector: #selector(self.streamFinished),
                                               name: .AVPlayerItemDidPlayToEndTime,
                                               object: self.player.currentItem)
    }

    func setMetadata(title: String, artist: String, duration: Double) {
        let nowPlayingInfo = [
            MPMediaItemPropertyTitle: title,
            MPMediaItemPropertyArtist: artist,
            MPMediaItemPropertyPlaybackDuration: duration,
            MPNowPlayingInfoPropertyPlaybackRate: 1.0,
        ] as [String: Any]
        MPNowPlayingInfoCenter.default().nowPlayingInfo = nowPlayingInfo
    }

    @objc private func streamFinished() {
        self.player.seek(to: .zero)
        try? AVAudioSession.sharedInstance().setActive(false)
        MPNowPlayingInfoCenter.default().playbackState = .stopped
    }

    func play() {
        MPNowPlayingInfoCenter.default().playbackState = .playing
        self.player.play()
    }

    func pause() {
        MPNowPlayingInfoCenter.default().playbackState = .paused
        self.player.pause()
    }
}
```

Pretty textbook stuff. The code works when called from views. It also shows up in the Lock Screen / Dynamic Island (when in the background), but here lie the problems:

1. The play/pause button does not appear in the Command Center or in the Dynamic Island. If I tap the position where these buttons should show up, the command works; just the icons are not appearing.
2. The waveform animation does not animate when playing.

Many audio apps work just fine, so my code must be lacking something, but I don't know what. What is missing? Thanks in advance!
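For what it's worth, on iOS the system's Now Playing UI takes its play/pause state (and whether the waveform animates) from the playback rate and elapsed time in nowPlayingInfo; MPNowPlayingInfoCenter's playbackState property is documented for macOS/Catalyst. So one common fix is to update both values whenever playback starts or stops. A hedged sketch, assuming it's called from play() and pause():

```swift
import AVFoundation
import MediaPlayer

// Refresh the dynamic Now Playing fields on every play/pause transition.
func updateNowPlaying(for player: AVPlayer, rate: Float) {
    var info = MPNowPlayingInfoCenter.default().nowPlayingInfo ?? [:]
    // 0.0 reads as paused, 1.0 as playing; the waveform only animates
    // while the reported rate is non-zero.
    info[MPNowPlayingInfoPropertyPlaybackRate] = rate
    info[MPNowPlayingInfoPropertyElapsedPlaybackTime] = player.currentTime().seconds
    MPNowPlayingInfoCenter.default().nowPlayingInfo = info
}
```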
Replies: 1 · Boosts: 0 · Views: 505 · Activity: May ’24
RTC peer connection error when trying to accept a live video call in a WKWebView run in Mac Catalyst
I am developing a WKWebView app that runs on both mobile and Mac platforms using a React website. Currently, an admin user can send a call request, for which the user gets a pop-up to answer. On iOS it works as intended: the user is asked to grant permission to use the camera and microphone, and the user is immediately connected upon accepting. However, I never receive the pop-up when running in Mac Catalyst. I have made sure to enable permissions in the sandbox settings, as well as adding the NSCameraUsageDescription and NSMicrophoneUsageDescription keys to the property list. In the Safari debugging console I receive this error upon clicking to accept the call:

```
ReferenceError: Can't find variable: RTCPeerConnection
```

I've attempted to use the web view callback requestMediaCapturePermissionFor, and this did not seem to trigger on attempting to answer the call. Is this an error on my development end, or is this an issue I should look to the React code to fix? I appreciate any feedback. Thanks.
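For comparison, here is a sketch of the capture-permission plumbing a WKWebView needs for getUserMedia prompts to surface; the delegate method is real WebKit API (iOS 15+/macOS 12+). It won't by itself explain a missing RTCPeerConnection, but it confirms the capture path is wired up. On Catalyst, the sandbox also needs the camera and audio-input hardware entitlements (com.apple.security.device.camera / com.apple.security.device.audio-input) in addition to the usage-description keys.

```swift
import UIKit
import WebKit

class WebViewController: UIViewController, WKUIDelegate {
    var webView: WKWebView!

    override func viewDidLoad() {
        super.viewDidLoad()
        let config = WKWebViewConfiguration()
        config.allowsInlineMediaPlayback = true
        config.mediaTypesRequiringUserActionForPlayback = []
        webView = WKWebView(frame: view.bounds, configuration: config)
        webView.uiDelegate = self
        view.addSubview(webView)
    }

    // Called when the page requests camera/microphone capture.
    func webView(_ webView: WKWebView,
                 requestMediaCapturePermissionFor origin: WKSecurityOrigin,
                 initiatedByFrame frame: WKFrameInfo,
                 type: WKMediaCaptureType,
                 decisionHandler: @escaping (WKPermissionDecision) -> Void) {
        decisionHandler(.grant)
    }
}
```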
Replies: 0 · Boosts: 0 · Views: 400 · Activity: May ’24
Send Custom Interactive Layout in iMessage Extension
Hi, I am integrating an iMessage app where I have audio which I want to send as a message. My requirement is to send a custom layout with a play button on it, so that tapping it plays/pauses the audio. Also, on tapping a sent message, the view's presentation style changes to expanded, while I want no presentation change at all: I just want to tap the message to play audio, nothing else. Recently I tried making a custom layout and sending a screenshot of it as an image, but the issue is that I can't make this view interactive. I can play audio on tap of the message, but I also want to update the layout of the selected message.

```swift
func sendCustomViewMessage(url: URL) {
    // Initialize the custom view and render it to an image.
    let customView = MessageView(frame: CGRect(x: 0, y: 0, width: 150, height: 50))
    customView.audioURL = url
    let customViewImage = imageFromView(view: customView) // Convert custom view to UIImage

    let layout = MSMessageTemplateLayout()
    layout.image = customViewImage // Set the image of the message layout
    layout.mediaFileURL = url
    layout.caption = "Firt Message"

    let message = MSMessage()
    message.layout = layout
    message.url = url
    self.activeConversation?.insert(message, completionHandler: nil)
}
```

I have been searching for days but couldn't find an appropriate solution. Can anyone help me with this?
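If the goal is a tappable bubble that stays interactive without expanding the extension, the supported route is an interactive message built with MSMessageLiveLayout: in the transcript presentation style, Messages renders your MSMessagesAppViewController inline, where you can draw the play/pause button yourself and size it via contentSizeThatFits(_:). A sketch of the sending side (the inline view controller work is elided):

```swift
import Messages

func sendLiveMessage(url: URL, in conversation: MSConversation) {
    // Fallback layout shown on devices that don't have the iMessage app.
    let fallback = MSMessageTemplateLayout()
    fallback.caption = "Audio message"
    fallback.mediaFileURL = url

    // Reuse the session so taps update the existing bubble instead of
    // inserting a new one.
    let message = MSMessage(session: conversation.selectedMessage?.session ?? MSSession())
    message.layout = MSMessageLiveLayout(alternateLayout: fallback)
    message.url = url
    conversation.insert(message, completionHandler: nil)
}
```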
Replies: 0 · Boosts: 0 · Views: 272 · Activity: May ’24
Problem with iOS photo permission
I'm encountering a problem on some iPhone models with photo gallery authorization. On some devices, the authorization prompt only displays "Add Photos" and "None", which means the app can't access the gallery. On most other devices, all three authorizations appear: "Full Access", "Limited Access", and "None". Example of a device affected by the bug: iPhone 11, iOS 17.3. I tested on a simulator with the same version, and it works with all three authorizations. In Info.plist, I have the following entries:

```
NSCameraUsageDescription: The application wants to have access to your camera to help you add photos to your worksites.
NSPhotoLibraryAddUsageDescription: The application wants access to your photos to help you add photos to your building sites.
NSPhotoLibraryUsageDescription: The application wants access to your photos to help you add photos to your worksites.
```
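One hypothesis worth checking rather than a confirmed cause: which prompt iOS shows depends on the PHAccessLevel being requested. Requesting .addOnly produces only "Add Photos Only" / "None", while .readWrite produces "Full Access" / "Limited Access" / "None", so it may be worth verifying which access level the code path on the affected devices actually hits. A sketch of requesting read access explicitly:

```swift
import Photos

// Explicitly request read/write access, which triggers the three-option prompt.
func requestGalleryAccess() {
    PHPhotoLibrary.requestAuthorization(for: .readWrite) { status in
        print("Photo library authorization:", status.rawValue)
    }
}
```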
Replies: 1 · Boosts: 0 · Views: 363 · Activity: Apr ’24
Mail Privacy Protection (MPP) / Private Relay Question
Hello, we have noticed a change in the last few weeks in how Mail Privacy Protection (MPP) is operating. Specifically, MPP pre-caches images within email newsletters, with the fetches protected via Private Relay. The end result of the pre-caching is that every image in the newsletter is retrieved from our servers even if the user does not open the newsletter. This has been in place since '21. What we've noticed in the last month or so is that the amount of pre-caching has dropped significantly, on the order of 20-25%. We can compare this with newsletters opened in non-MPP environments to confirm that email sends are consistent; it is only the pre-cached events that seem to have changed. Does anyone know of any changes to the logic of Private Relay / MPP that would impact how it pre-caches data from email newsletters? Thank you.
Replies: 0 · Boosts: 0 · Views: 431 · Activity: Apr ’24
Life Radio Tirol Homepage: Problem with MP3 192kb Elements in Safari only
Hello everyone! I'm new to this forum and I have a question about playing MP3 files from our homepage on Apple devices. We have a service section with the latest news, traffic, and weather on our site. The problem is that the MP3 files can be played, but the player shows 0:00 and is not able to seek to the middle or end of the file. This happens ONLY on iPhone, in our app and in Safari. I am recording the MP3s with ffmpeg. The strange thing is that no bitrate is shown in the file details, and also no length. The file is auto-recorded with a script.
Replies: 0 · Boosts: 0 · Views: 306 · Activity: Apr ’24
SwiftUI PhotosPicker does not work
iOS 17.4.1, iPhone 15 Pro. I pick photos from the user's photo library using:

```swift
...
.photosPicker(isPresented: $addPhotos, selection: $pickedPhotos, matching: .images)
.onChange(of: pickedPhotos) {
    `import`(photoItems: pickedPhotos)
}
```

The picker UI works OK, but then when I import the photos:

```swift
private func `import`(photoItems: [PhotosPickerItem]) {
    for photoItem in photoItems {
        Log.debug("picked: \(photoItem)")
        Task {
            do {
                let imageData = try await photoItem.loadTransferable(type: Data.self)
                guard let imageData else {
                    Log.error("failed to load image data")
                    return
                }
                guard let image = UIImage(data: imageData) else {
                    Log.error("failed to create image from data")
                    return
                }
                // use image
                ...
            } catch {
                Log.error("failed to load image data: \(error)")
            }
        }
    }
}
```

Logging the picked photo gives:

```
PhotosPickerItem(_itemIdentifier: "C7E2F753-43F6-413D-BA42-509C60BE9D77/L0/001",
  _shouldExposeItemIdentifier: false,
  _supportedContentTypes: [
    <_UTCoreType 0x1ebcd1c10> public.jpeg (not dynamic, declared),
    <_UTCoreType 0x1ebcd1d70> public.heic (not dynamic, declared),
    <UTType 0x300fe0430> com.apple.private.photos.thumbnail.standard (not dynamic, declared),
    <UTType 0x300fe03f0> com.apple.private.photos.thumbnail.low (not dynamic, declared)
  ],
  _itemProvider: <PUPhotosFileProviderItemProvider: 0x303fdff00> {types = (
    "public.jpeg", "public.heic",
    "com.apple.private.photos.thumbnail.standard",
    "com.apple.private.photos.thumbnail.low"
  )})
```

Looks like there's a valid photo? But then the loadTransferable() call fails with:

```
5C9D59CB-3606-48C1-9B37-1F18D642B3AD grantAccessClaim reply is an error:
Error Domain=NSCocoaErrorDomain Code=4097 "Couldn’t communicate with a helper application."
UserInfo={NSUnderlyingError=0x308244f30 {Error Domain=PFPAssetRequestErrorDomain Code=0
"The operation couldn’t be completed. (PFPAssetRequestErrorDomain error 0.)"
UserInfo={NSURL=file:///private/var/mobile/Containers/Shared/AppGroup/36CF50FB-38FC-440E-9662-35C23B5E636C/File%20Provider%20Storage/photospicker/uuid=C7E2F753-43F6-413D-BA42-509C60BE9D77&library=1&type=1&mode=2&loc=true&cap=true.jpeg,
NSLocalizedDescription=The operation couldn’t be completed. (PFPAssetRequestErrorDomain error 0.)}}}

Error loading public.data: Error Domain=NSItemProviderErrorDomain Code=-1000
"Cannot load representation of type public.jpeg"
UserInfo={NSLocalizedDescription=Cannot load representation of type public.jpeg,
NSUnderlyingError=0x3081a2550 {Error Domain=NSCocoaErrorDomain Code=4097
"Couldn’t communicate with a helper application."
UserInfo={NSUnderlyingError=0x308244f30 {Error Domain=PFPAssetRequestErrorDomain Code=0
"The operation couldn’t be completed. (PFPAssetRequestErrorDomain error 0.)"
UserInfo={NSURL=file:///private/var/mobile/Containers/Shared/AppGroup/36CF50FB-38FC-440E-9662-35C23B5E636C/File%20Provider%20Storage/photospicker/uuid=C7E2F753-43F6-413D-BA42-509C60BE9D77&library=1&type=1&mode=2&loc=true&cap=true.jpeg,
NSLocalizedDescription=The operation couldn’t be completed. (PFPAssetRequestErrorDomain error 0.)}}}}}

2024-04-03 12:16:07.8010 error PhotosView.import: failed to load image data: importNotSupported [
```

As usual I rebooted my phone, as these things tend to be pretty buggy in iOS, but I got the same error. Note this is not in a simulator, which seems to have long-standing bugs related to photo picking; this is on a freshly upgraded 17.4.1 device. I can't find any documentation related to these errors, and all googling turns up a few other cases but no solutions. Should this API actually work, or is it better to go back to the old UIKit stuff?
I use loadTransferable(type: Data.self) since UIImage.self is not Transferable, and this hack has seemed to work OK for some months.
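One workaround worth trying, sketched under the assumption that the failure is specific to requesting the raw Data representation: wrap the image in a custom Transferable so the picker hands over a decodable image type instead. PickedImage and TransferError are illustrative names.

```swift
import SwiftUI
import UIKit
import UniformTypeIdentifiers

// A custom Transferable wrapper, since UIImage itself isn't Transferable.
struct PickedImage: Transferable {
    let image: UIImage

    enum TransferError: Error { case importFailed }

    static var transferRepresentation: some TransferRepresentation {
        DataRepresentation(importedContentType: .image) { data in
            guard let uiImage = UIImage(data: data) else {
                throw TransferError.importFailed
            }
            return PickedImage(image: uiImage)
        }
    }
}

// Usage: let picked = try await photoItem.loadTransferable(type: PickedImage.self)
```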
Replies: 4 · Boosts: 1 · Views: 917 · Activity: Apr ’24