Integrate video and other forms of moving visual media into your apps.

Posts under Video tag

84 Posts
Immersive experience from sample
Hi, we are currently building an app for immersive experiences of our custom content. The content is a video displayed on custom geometry in an immersive space on the Vision Pro. I have enabled the AVPlayerViewController system controls that detach when entering the immersive space, as in the sample: https://developer.apple.com/documentation/visionos/building-an-immersive-media-viewing-experience. For our case, we do not need the 2D screen to remain visible after entering the immersive space, only the environment. So my question is: how do I remove the screen showing the video while keeping the controls, as in the Apple TV app's immersive experiences? Thanks in advance.
Replies: 2 · Boosts: 0 · Views: 218 · Activity: 1w
iOS 17 remembers fullscreen state; playsinline not working
I encountered an issue when playing WebRTC video using the H5 video tag on an iPhone with iOS 17. After performing the following operations, the video automatically plays in fullscreen mode:
1. Create the video tag, specify the playback source, and set the autoplay attribute to true.
2. Call the API to enter fullscreen playback mode.
3. Exit fullscreen by swiping with a finger.
4. Stop playback and destroy the video tag.
5. Repeat step 1; at this point, the video does not correctly default to inline playback but instead automatically plays in fullscreen.
It seems as if the system has cached the previous fullscreen state even though I exited fullscreen, and the same thing happens in different browsers. I have set the 'playsinline' and 'webkit-playsinline' attributes on the video tag. Has anyone encountered the same issue? I would appreciate any solutions that can be shared.
Replies: 0 · Boosts: 0 · Views: 133 · Activity: 2w
iOS 18 - Webcam video not rendered within page without interacting with browser chrome
We have a service providing identification via video: users get on a WebRTC video call and follow some instructions, all the while seeing themselves full screen. So this web site displays the user's camera full screen with a few overlays. In the iOS 18 beta, this stopped working; the video isn't rendered anymore. The remote end of the conversation can see the video fine, so the camera itself and WebRTC are working. There are no errors in the console. But when I tap the site settings button in the address bar and the font size/reader/translate page dialog pops up, the video suddenly renders just fine, immediately. This suggests it was there the entire time; Safari just didn't bother actually showing it. It's hard to create a minimal reproducible example for this; the behaviour just suddenly changed starting with the iOS 18 beta. Has anyone observed something similar, or does anyone have any tricks that may enable a workaround?
Replies: 0 · Boosts: 0 · Views: 248 · Activity: 3w
VideoMaterial to display SBS stereoscopic 3D video? [visionOS]
Hi, I love the VideoMaterial API and the power it gives to play video on any mesh. But I am trying to play a side-by-side 3D video using VideoMaterial:

RealityView { content in
    // Generate the mesh.
    let mesh = MeshResource.generatePlane(width: 300.0, height: 300.0, cornerRadius: 0)
    // Create the VideoMaterial.
    let vidMaterial = VideoMaterial(avPlayer: AVPlayer(url: URL(string: "https://someurl/test/master.m3u8")!))
    vidMaterial.controller.preferredViewingMode = .stereo // <-- no idea why it doesn't work for SBS video in the simulator
    vidMaterial.avPlayer?.play()
    // Create a new entity and attach a ModelComponent with the video material.
    let planeEntity = Entity()
    planeEntity.components.set(ModelComponent(mesh: mesh, materials: [vidMaterial]))
    content.add(planeEntity)
}

This code works well for plain 2D video playback, but how do I display a side-by-side or top-bottom 3D video? I found GeometrySwitchCameraIndex in a custom ShaderGraphMaterial, but if I use an image texture as the input node, how do I pass the video frame as a texture into my custom shader to achieve the 3D effect? Or maybe there is an even better way to deal with this? There is also the additional .preferredViewingMode API on the VideoMaterial's controller that can be set to .stereo, but it doesn't give any stereo effect. Perhaps it's only for MV-HEVC media playback?
Replies: 1 · Boosts: 0 · Views: 274 · Activity: 3w
Setting the right height to correctly display VR180 3D video
Hi, I'm developing a simple app to visualize embedded VR180 3D video. I used a semisphere and projected the video as its material. The semisphere sits in the environment at a fixed y value of 1.35, which is good for a seated person but not ideal for a standing person, because the stereoscopic vision is not correct. In the Apple TV+ and Kandao applications, I noticed that the translation of the video is anchored to the Apple Vision Pro. I tried using an AnchorEntity on the head with trackingMode .once, but then there is the problem of rotation: the semisphere starts with the rotation of the head. Is there a solution, for example, to anchor the semisphere only to the translation and not to the rotation of the head?
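One direction I'm experimenting with is to read the device transform myself and copy only its translation onto the semisphere. A minimal sketch, assuming visionOS ARKit world tracking; the entity name videoSphere and the one-shot timing are my own assumptions:

import ARKit
import QuartzCore
import RealityKit

let session = ARKitSession()
let worldTracking = WorldTrackingProvider()

// Position the semisphere at the device's translation once,
// leaving its rotation untouched.
func alignToHead(_ videoSphere: Entity) async throws {
    try await session.run([worldTracking])
    // May need a brief delay after run() before the anchor becomes available.
    guard let device = worldTracking.queryDeviceAnchor(atTimestamp: CACurrentMediaTime()) else { return }
    let m = device.originFromAnchorTransform
    // Use only the translation column of the transform; rotation stays identity.
    videoSphere.position = SIMD3<Float>(m.columns.3.x, m.columns.3.y, m.columns.3.z)
}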
Replies: 4 · Boosts: 0 · Views: 272 · Activity: Jul ’24
Multiview HLS with HDR
I have an HDR10+ encoded video that, if loaded as a .mov, plays back on the Apple Vision Pro, but when that video is encoded using the latest (1.23b) Apple HLS tools to generate an fMP4, the resulting m3u8 cannot be played back on the Apple Vision Pro; I only get a "Cannot Open" error. To generate the m3u8, I'm just calling mediafilesegmenter (with -iso-fragmented) and then variantplaylistcreator. This completes with no errors, but the m3u8 will play back on the Mac using VLC, not on the Apple Vision Pro. The relevant part of the m3u8 is:

#EXT-X-STREAM-INF:AVERAGE-BANDWIDTH=40022507,BANDWIDTH=48883974,VIDEO-RANGE=PQ,CODECS="ec-3,hvc1.1.60000000.L180.B0",RESOLUTION=4096x4096,FRAME-RATE=24.000,CLOSED-CAPTIONS=NONE,AUDIO="audio1",REQ-VIDEO-LAYOUT="CH-STEREO"
{{url}}

Has anyone been able to use the HLS tools to generate fMP4s of MV-HEVC videos with HDR10?
Replies: 0 · Boosts: 0 · Views: 329 · Activity: Jul ’24
AVPlayerViewController (AppleTV) - Dolby (multi-channel audio) visualisation
We noticed that AVPlayerViewController does not always show the "Multi-channel" label in the player's audio settings when playing a video asset with surround sound as an audio track (see image). We serve only a multichannel audio track in the HLS master manifest, like this:

#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="audio_0",CHANNELS="6",NAME="Surround",LANGUAGE....

Different tvOS versions give us different outcomes on whether or not the "Multi-channel" label is shown:

Does not show: Model A1842 (tvOS 17.5.1); Model A1625 (tvOS 16.6)
Does show (see image): Model A1625 (tvOS 15.6)

This gives us the impression that whether the label is shown depends on the tvOS version. Any reason why? This label is an ideal way for the user to see that the audio track has surround sound.
Replies: 0 · Boosts: 1 · Views: 293 · Activity: Jul ’24
Location not visible in video recorded in third party app
I recently bought an Insta360 Flow gimbal. When recording video with the instaflow app, I cannot see the location in the Apple Photos app or any other Apple apps. However, I can see the location in the Windows Photos app once I download the videos to my Windows PC. The location is also visible in an Android app once I share the video through my Google account. With an EXIF app, I can see the location in the EXIF metadata table as well, but again it is not shown as a location. exiftool on my PC can also see the metadata, including the location, as in the attached screenshot. Compared to video shot with the built-in Camera app, I cannot find any difference in the location metadata. What could be wrong? I contacted insta360 app support, but they do not seem to understand what's going on and just keep asking very basic questions like "Did you enable GPS location access?" and "Are you shooting video?" I also contacted Apple support; they just say it's a third-party issue and refuse to help further. If it's really a third-party issue, how come the location data is actually embedded as metadata, and a Windows PC and an Android device can see it? BTW, I AirDropped this video to all my Apple devices (an iPhone 15 ultra, an iPad Air, and a very old iPhone), and none of them can see the location.
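In case it helps others debug the same thing, here is a minimal sketch for dumping the QuickTime ISO 6709 location item, which, as far as I can tell, is what Apple software reads for movie locations (the file URL is a placeholder):

import AVFoundation

// Print the com.apple.quicktime.location.ISO6709 metadata item, if present.
func printLocationMetadata(of url: URL) async throws {
    let asset = AVURLAsset(url: url)
    let metadata = try await asset.load(.metadata)
    let items = AVMetadataItem.metadataItems(
        from: metadata,
        filteredByIdentifier: .quickTimeMetadataLocationISO6709)
    if items.isEmpty {
        print("No QuickTime ISO 6709 location item found")
    }
    for item in items {
        let value = try await item.load(.stringValue)
        print("Location:", value ?? "<non-string value>")
    }
}

If the gimbal app writes GPS only into a different key (for example, an EXIF-style tag), that could explain why exiftool sees it but the Apple apps do not.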
Replies: 1 · Boosts: 0 · Views: 398 · Activity: Jul ’24
How can I get camera access in an iOS 12 app with embedded HTML?
I can confirm the app already has camera access, but in the embedded HTML I still cannot open the camera. The HTML page works in Safari but not when it is embedded in the app. This is the error message: DOMException: undefined is not an object (evaluating 'navigator.mediaDevices.getUserMedia'). I also tried 'navigator.getUserMedia' and 'navigator.mediaDevices.enumerateDevices()'; none of these work.
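For what it's worth, getUserMedia only became available inside WKWebView in iOS 14.3, so with an iOS 12 deployment target an embedded page cannot access the camera at all; the page would have to be handed off to Safari. A minimal sketch of the setup that works on newer systems, assuming NSCameraUsageDescription is set in Info.plist (the URL is a placeholder):

import UIKit
import WebKit

final class WebCameraViewController: UIViewController, WKUIDelegate {
    override func viewDidLoad() {
        super.viewDidLoad()
        let config = WKWebViewConfiguration()
        config.allowsInlineMediaPlayback = true // keep the camera preview inline
        let webView = WKWebView(frame: view.bounds, configuration: config)
        webView.uiDelegate = self
        view.addSubview(webView)
        webView.load(URLRequest(url: URL(string: "https://example.com/capture")!))
    }

    // iOS 15+: answer the page's capture request instead of denying it silently.
    func webView(_ webView: WKWebView,
                 requestMediaCapturePermissionFor origin: WKSecurityOrigin,
                 initiatedByFrame frame: WKFrameInfo,
                 type: WKMediaCaptureType,
                 decisionHandler: @escaping (WKPermissionDecision) -> Void) {
        decisionHandler(.grant)
    }
}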
Replies: 0 · Boosts: 0 · Views: 347 · Activity: Jul ’24
Disable iOS Screen Mirroring for Apps
Hello Apple, I am concerned about the new iOS Screen Mirroring that is available on iOS. I have an app that is only meant to be viewed on iPhones (not Macs or other computers), for security reasons. I am assuming that Screen Mirroring uses AirPlay underneath. Is there an API planned or coming that can disable this functionality, or is there a way for my app to opt out of iOS Screen Mirroring? Thanks.
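I haven't found an opt-out API; the closest thing I'm aware of is detecting capture and hiding sensitive content while it's active. A minimal sketch, assuming detection is an acceptable fallback (the class name is mine):

import UIKit

// UIScreen.isCaptured is true while the screen is mirrored, recorded,
// or sent over AirPlay; observe changes and hide sensitive views.
final class CaptureGuard {
    var onChange: ((Bool) -> Void)?
    private var observer: NSObjectProtocol?

    init() {
        observer = NotificationCenter.default.addObserver(
            forName: UIScreen.capturedDidChangeNotification,
            object: nil,
            queue: .main) { [weak self] _ in
            self?.onChange?(UIScreen.main.isCaptured)
        }
    }

    deinit {
        if let observer { NotificationCenter.default.removeObserver(observer) }
    }
}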
Replies: 1 · Boosts: 0 · Views: 549 · Activity: Jun ’24
DestinationVideo -- MV-HEVC Files
In the code example provided, there is a Bool in the Video object to mark a video as 3D:

/// A Boolean value that indicates whether the video contains 3D content.
let is3D: Bool

I have a hosted spatial video that I know works correctly in the AVP player. When I point the Videos.json file to this URL and set is3D=true, my 3D video doesn't show up and I get the following error:

iPVC/1-0 Received playback error: [Error Domain=AVFoundationErrorDomain Code=-11850 "Operation Stopped" UserInfo={NSLocalizedFailureReason=The server is not correctly configured., NSLocalizedDescription=Operation Stopped, NSUnderlyingError=0x30227c510 {Error Domain=CoreMediaErrorDomain Code=-12939 "byte range length mismatch - should be length 2 is length 2434" UserInfo={NSDescription=byte range length mismatch - should be length 2 is length 2434, NSURL=https: <omitted for post> }}}]

Can anyone tell me what might be going on? The error says my server is not configured correctly. For context, I'm using Google Drive to deliver dynamic images/videos using:

https://drive.google.com/uc?export=download&id=<file ID>

The above works great for my images and 2D videos. Is there something I need to do specifically when delivering MV-HEVC videos?
Replies: 1 · Boosts: 0 · Views: 508 · Activity: Jun ’24
Controlling spatial video from a floating window
I've created a fully immersive visionOS project and added a spatial video player in the ImmersiveView Swift file. I have a few buttons in a separate VideosView Swift file on a floating window, and I'd like to switch the video playing in ImmersiveView when I click a button in VideosView. The video player works great in ImmersiveView:

RealityView { content in
    if let videoEntity = try? await Entity(named: "Video", in: realityKitContentBundle) {
        guard let url = Bundle.main.url(forResource: "video1", withExtension: "mov") else {
            fatalError("Video was not found!")
        }
        let asset = AVURLAsset(url: url)
        let playerItem = AVPlayerItem(asset: asset)
        let player = AVPlayer()
        videoEntity.components[VideoPlayerComponent.self] = .init(avPlayer: player)
        content.add(videoEntity)
        player.replaceCurrentItem(with: playerItem)
        player.play()
    } else {
        print("file not found!")
    }
}

Buttons in the floating window, from VideosView:

struct VideosView: View {
    var body: some View {
        VStack {
            Button(action: {}) { Text("video 1").font(.title) }
            Button(action: {}) { Text("video 2").font(.title) }
            Button(action: {}) { Text("video 3").font(.title) }
        }
    }
}

In general, how do I control the video player across views, and how do I replace the video when each button is selected? Any help/code/links would be greatly appreciated.
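One pattern that might work here (a sketch I haven't verified against the immersive scene; all names are my own): a single shared observable model owns one AVPlayer, ImmersiveView builds its VideoPlayerComponent from model.player, and each button just swaps the current item.

import SwiftUI
import AVFoundation

@MainActor
final class VideoModel: ObservableObject {
    // One player shared by ImmersiveView's VideoPlayerComponent and the buttons.
    let player = AVPlayer()

    func play(resource: String) {
        guard let url = Bundle.main.url(forResource: resource, withExtension: "mov") else { return }
        player.replaceCurrentItem(with: AVPlayerItem(url: url))
        player.play()
    }
}

struct VideosView: View {
    @EnvironmentObject var model: VideoModel

    var body: some View {
        VStack {
            ForEach(["video1", "video2", "video3"], id: \.self) { name in
                Button(name) { model.play(resource: name) }
                    .font(.title)
            }
        }
    }
}

The model would be injected once at the App level with .environmentObject(model), so the floating window and the immersive view see the same instance.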
Replies: 1 · Boosts: 0 · Views: 499 · Activity: May ’24
iOS to Android H264 encoding issue.
I'm trying to cast the screen from an iOS device to an Android device. I'm leveraging ReplayKit on iOS to capture the screen and VideoToolbox to compress the captured video data into H.264 format using CMSampleBuffers. Both iOS and Android are configured for H.264 compression and decompression. While screen casting works flawlessly within the same platform (iOS to iOS or Android to Android), I'm encountering an error ("not in avi mode") on the Android receiver when casting from iOS. My research suggests that the underlying container formats for H.264 might differ between iOS and Android. Data transmission over the TCP socket seems to be functioning correctly. My question is: is there a way to ensure a common container format for H.264 compression and decompression across the iOS and Android platforms?

Here's a breakdown of the iOS sender details:
Device: iPhone 13 mini running iOS 17
Development environment: Xcode 15 with a minimum deployment target of iOS 16
Screen capture: ReplayKit for capturing the screen and obtaining CMSampleBuffers
Video compression: VideoToolbox for H.264 compression
Compression properties:
- kVTCompressionPropertyKey_ConstantBitRate: 6144000 (bitrate)
- kVTCompressionPropertyKey_ProfileLevel: kVTProfileLevel_H264_Main_AutoLevel (profile and level)
- kVTCompressionPropertyKey_MaxKeyFrameInterval: 60 (maximum keyframe interval)
- kVTCompressionPropertyKey_RealTime: true (real-time encoding)
- kVTCompressionPropertyKey_Quality: 1 (quality; note that 1.0 is the maximum, not the lowest)
NAL unit handling: a custom header is added to NAL units

Android receiver details:
Device: Redmi 7A running Android 10
Video decoding: MediaCodec API for receiving and decoding the H.264 stream
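In case it helps to reproduce, here is the conversion I believe is needed: a sketch assuming VideoToolbox hands back AVCC framing (4-byte big-endian length-prefixed NAL units) while Android's MediaCodec expects Annex B start codes. SPS/PPS from the sample's format description would also need to be sent with start codes ahead of keyframes.

import CoreMedia
import Foundation

// Convert one compressed CMSampleBuffer from AVCC framing to Annex B framing.
func annexBData(from sampleBuffer: CMSampleBuffer) -> Data? {
    guard let blockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer) else { return nil }
    var totalLength = 0
    var pointer: UnsafeMutablePointer<CChar>?
    guard CMBlockBufferGetDataPointer(blockBuffer, atOffset: 0,
                                      lengthAtOffsetOut: nil,
                                      totalLengthOut: &totalLength,
                                      dataPointerOut: &pointer) == kCMBlockBufferNoErr,
          let base = pointer else { return nil }

    let startCode: [UInt8] = [0x00, 0x00, 0x00, 0x01]
    var output = Data()
    var offset = 0
    while offset + 4 <= totalLength {
        // Read the 4-byte big-endian NAL unit length that AVCC uses.
        var nalLength: UInt32 = 0
        memcpy(&nalLength, base + offset, 4)
        nalLength = CFSwapInt32BigToHost(nalLength)
        offset += 4
        guard offset + Int(nalLength) <= totalLength else { break }
        // Emit a start code followed by the raw NAL unit bytes.
        output.append(contentsOf: startCode)
        output.append(Data(bytes: base + offset, count: Int(nalLength)))
        offset += Int(nalLength)
    }
    return output
}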
Replies: 0 · Boosts: 0 · Views: 472 · Activity: May ’24
Question regarding the kVTVideoEncoderList_IsHardwareAccelerated flag
I am a bit confused about whether certain Video Toolbox (VT) encoders support hardware acceleration. When I query the list of VT encoders (VTCopyVideoEncoderList(nil, &encoderList)) on an iPhone 14 Pro device, for the avc1 (AVC / H.264) and hevc1 (HEVC / H.265) encoders the kVTVideoEncoderList_IsHardwareAccelerated flag is not there, which, based on the documentation in VTVideoEncoderList.h, means that the encoders do not support hardware acceleration:

optional. CFBoolean. If present and set to kCFBooleanTrue, indicates that the encoder is hardware accelerated.

In fact, no encoder in this list returns this flag as true, and most of them do not include the flag at all in their dictionaries. On the other hand, when I create a compression session using VTCompressionSessionCreate() and pass kVTVideoEncoderSpecification_EnableHardwareAcceleratedVideoEncoder as true in the encoder specification, then query kVTCompressionPropertyKey_UsingHardwareAcceleratedVideoEncoder, I get a CFBoolean value of true for both the H.264 and H.265 encoders. In fact, I get a true value (for both of the aforementioned encoders) even if I don't specify kVTVideoEncoderSpecification_EnableHardwareAcceleratedVideoEncoder when creating the compression session (note that this flag was introduced in iOS 17.4). So the question is: are those encoders actually hardware accelerated on my device, and if so, why isn't that reflected in the VTCopyVideoEncoderList() call?
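For concreteness, here are the two checks I'm comparing, as a sketch (session is assumed to be an already-created VTCompressionSession):

import VideoToolbox

// Check 1: enumerate the encoders and print the hardware-acceleration flag;
// on my device the flag is absent from every dictionary.
var encoderList: CFArray?
VTCopyVideoEncoderList(nil, &encoderList)
if let encoders = encoderList as? [[CFString: Any]] {
    for encoder in encoders {
        let name = encoder[kVTVideoEncoderList_EncoderName] ?? "?"
        let flag = encoder[kVTVideoEncoderList_IsHardwareAccelerated] ?? "<absent>"
        print(name, flag)
    }
}

// Check 2: ask a live session whether it is actually using a hardware encoder.
func isUsingHardwareEncoder(_ session: VTCompressionSession) -> Bool {
    var value: CFTypeRef?
    let status = VTSessionCopyProperty(
        session,
        key: kVTCompressionPropertyKey_UsingHardwareAcceleratedVideoEncoder,
        allocator: kCFAllocatorDefault,
        valueOut: &value)
    return status == noErr && (value as? Bool ?? false)
}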
Replies: 3 · Boosts: 1 · Views: 543 · Activity: May ’24