PhotoKit


Work with image and video assets managed by the Photos app, including those from iCloud Photos and Live Photos, using PhotoKit.


Posts under PhotoKit tag

101 Posts
Post not yet marked as solved
4 Replies
1.4k Views
When selecting additional photos after previously granting limited authorization, I get this crash on iOS 17.0:

*** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '-[PHPhotoLibrary presentLimitedLibraryPickerFromViewController:completionHandler:]: unrecognized selector sent to instance 0x105ea2a60'

This happens when using either the completion-handler or the async method:

PHPhotoLibrary.shared().presentLimitedLibraryPicker(from: viewController) { newlySelectedPhotoIDs in ...

OR

let newlySelectedPhotoIDs = await PHPhotoLibrary.shared().presentLimitedLibraryPicker(from: viewController)

The debugger output is unexpected, given that these methods are declared in the header:

(lldb) po PHPhotoLibrary.shared().responds(to: #selector(PHPhotoLibrary.presentLimitedLibraryPicker(from:)))
false
(lldb) po PHPhotoLibrary.shared().responds(to: #selector(PHPhotoLibrary.presentLimitedLibraryPicker(from:completionHandler:)))
false
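A minimal defensive sketch, not an explanation of the crash: check at runtime that the selector is actually implemented before calling it, so the app degrades instead of throwing an unrecognized-selector exception. The fallback branch is a placeholder.

import PhotosUI
import UIKit

func presentLimitedPickerSafely(from viewController: UIViewController) {
    let library = PHPhotoLibrary.shared()
    let selector = #selector(PHPhotoLibrary.presentLimitedLibraryPicker(from:completionHandler:))
    if library.responds(to: selector) {
        library.presentLimitedLibraryPicker(from: viewController) { newlySelectedPhotoIDs in
            // Handle the newly selected asset identifiers here.
            print(newlySelectedPhotoIDs)
        }
    } else {
        // Placeholder fallback, e.g. present a PHPickerViewController instead.
        print("presentLimitedLibraryPicker(from:completionHandler:) is unavailable")
    }
}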
Posted by awal. Last updated.
Post marked as solved
2 Replies
1.5k Views
I have been searching high and low for anyone who has encountered the same issue. My problem is that when an image is not actually on the device but rather in iCloud, my app fails to retrieve it. How would I go about remedying this? My PHPicker implementation is as follows; any help would be greatly appreciated.

extension PageViewController: PHPickerViewControllerDelegate {

  func presentPicker() {
    ArtModel.holdingBay.removeAll()
    var configuration = PHPickerConfiguration()
    configuration.filter = .images
    configuration.selectionLimit = 5
    let picker = PHPickerViewController(configuration: configuration)
    picker.delegate = self
    present(picker, animated: true)
  }

  func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult]) {
    if results.count > 0 {
      for result in results {
        result.itemProvider.loadObject(ofClass: UIImage.self, completionHandler: { (object, error) in
          if let image = object as? UIImage {
            DispatchQueue.main.async {
              ArtModel.holdingBay.append(image)
            }
          }
        })
      }
    }
    picker.dismiss(animated: true, completion: self.holdingBayCheck)
  }

  func holdingBayCheck() {
    if ArtModel.holdingBay.count > 0 {
      performSegue(withIdentifier: K.goToCrop, sender: self)
    }
  }

}
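A hedged sketch of one common approach when loadObject fails for iCloud-only originals: initialise the configuration with the shared photo library so each result carries an assetIdentifier, then let PHImageManager download the original with network access allowed. Note the assumption that the app also holds (limited or full) read authorization, which the plain picker flow does not require.

import Photos
import PhotosUI
import UIKit

// Present the picker bound to the shared library so assetIdentifier is populated.
func presentPickerWithLibraryAccess(from viewController: UIViewController & PHPickerViewControllerDelegate) {
    var configuration = PHPickerConfiguration(photoLibrary: .shared())
    configuration.filter = .images
    configuration.selectionLimit = 5
    let picker = PHPickerViewController(configuration: configuration)
    picker.delegate = viewController
    viewController.present(picker, animated: true)
}

// Resolve one picker result to a UIImage, allowing the iCloud download.
func loadImage(for result: PHPickerResult, completion: @escaping (UIImage?) -> Void) {
    guard let identifier = result.assetIdentifier,
          let asset = PHAsset.fetchAssets(withLocalIdentifiers: [identifier], options: nil).firstObject else {
        completion(nil)
        return
    }
    let options = PHImageRequestOptions()
    options.isNetworkAccessAllowed = true      // permit the iCloud fetch
    options.deliveryMode = .highQualityFormat
    PHImageManager.default().requestImage(for: asset,
                                          targetSize: PHImageManagerMaximumSize,
                                          contentMode: .aspectFit,
                                          options: options) { image, _ in
        completion(image)
    }
}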
Posted by njdeane. Last updated.
Post marked as solved
3 Replies
999 Views
I'm trying to provide a mechanism for the user to select a background image from their photo library. I've got this code, which theoretically is getting the data, but I can't figure out how to convert it to an Image in SwiftUI. I've found plenty of examples for iOS but none workable on macOS. This is my implementation of the PhotosPicker on iOS:

PhotosPicker(selection: $vm.selectedBGphoto, matching: .images) {
    Label("Select Background Image", systemImage: "photo.on.rectangle")
}
.tint(.purple)
.controlSize(.large)
.onChange(of: vm.selectedBGphoto) { newItem in
    Task {
        if let data = try? await newItem?.loadTransferable(type: Data.self) {
            if let uiImage = UIImage(data: data) {
                vm.selectedBGdata = data
                vm.bgImage = Image(uiImage: uiImage)
            }
        }
    }
}

This DOES NOT work on macOS because of the UIImage reference. How do I convert what I get from PhotosPicker into an Image on macOS?
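A minimal cross-platform sketch: branch on the available UI framework and use NSImage on macOS, since Image has an nsImage initializer there. The helper name is illustrative.

import SwiftUI
#if canImport(UIKit)
import UIKit
#elseif canImport(AppKit)
import AppKit
#endif

// Convert the Data returned by loadTransferable into a SwiftUI Image on either platform.
func makeImage(from data: Data) -> Image? {
    #if canImport(UIKit)
    guard let uiImage = UIImage(data: data) else { return nil }
    return Image(uiImage: uiImage)
    #elseif canImport(AppKit)
    guard let nsImage = NSImage(data: data) else { return nil }
    return Image(nsImage: nsImage)
    #else
    return nil
    #endif
}

Inside the onChange task this would replace the UIImage branch, e.g. vm.bgImage = makeImage(from: data).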
Posted by botofogo. Last updated.
Post not yet marked as solved
0 Replies
374 Views
I currently use the AVFoundation framework to get the photo output, but the image aspect ratio is 4:3. However, the Camera app on the iPhone 13 Pro offers several aspect ratios when taking a ProRAW image: 4:3, 16:9, and 1:1. So how can I get a 1:1 or 16:9 aspect ratio ProRAW image? After doing some research, I found that no matter which camera you use on the iPhone 11, 12, 13, 14, 15, or the Pro models, the resulting image is always 4:3, and the 1:1 and 16:9 versions come from cropping the 4:3 image. If that is true, how can I crop the ProRAW file without any data loss?

My development environment:
iPhone 13 Pro
iOS 16.7
Xcode 14.3.1

This is the session configuration code for the camera device configuration:

session.beginConfiguration()
/*
 Do not create an AVCaptureMovieFileOutput when setting up the session because
 Live Photo is not supported when AVCaptureMovieFileOutput is added to the session.
*/
session.sessionPreset = .photo

// Add video input.
do {
    var defaultVideoDevice: AVCaptureDevice?
    if let backCameraDevice = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back) {
        // If a rear dual camera is not available, default to the rear wide angle camera.
        defaultVideoDevice = backCameraDevice
    } else if let frontCameraDevice = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .front) {
        // If the rear wide angle camera isn't available, default to the front wide angle camera.
        defaultVideoDevice = frontCameraDevice
    }
    guard let videoDevice = defaultVideoDevice else {
        print("Default video device is unavailable.")
        setupResult = .configurationFailed
        session.commitConfiguration()
        return
    }
    let videoDeviceInput = try AVCaptureDeviceInput(device: videoDevice)
    if session.canAddInput(videoDeviceInput) {
        session.addInput(videoDeviceInput)
        self.videoDeviceInput = videoDeviceInput
    } else {
        print("Couldn't add video device input to the session.")
        setupResult = .configurationFailed
        session.commitConfiguration()
        return
    }
} catch {
    print("Couldn't create video device input: \(error)")
    setupResult = .configurationFailed
    session.commitConfiguration()
    return
}

// Check the lens list.
let camerasOptions = videoDeviceDiscoverySession.devices
var availableCameras: [AVCaptureDevice.DeviceType] = []
if camerasOptions.isEmpty {
    print("no camera devices")
} else {
    for camera in camerasOptions {
        if camera.deviceType == .builtInUltraWideCamera || camera.deviceType == .builtInWideAngleCamera || camera.deviceType == .builtInTelephotoCamera {
            if !availableCameras.contains(camera.deviceType) {
                availableCameras.append(camera.deviceType)
            }
        }
    }
}
DispatchQueue.main.async {
    self.lensList = availableCameras
}

// Add the photo output.
if session.canAddOutput(photoOutput) {
    session.addOutput(photoOutput)
    photoOutput.isHighResolutionCaptureEnabled = true
    photoOutput.maxPhotoQualityPrioritization = .quality
    print(photoOutput.isAppleProRAWSupported)
    // Use the Apple ProRAW format when the environment supports it.
    photoOutput.isAppleProRAWEnabled = photoOutput.isAppleProRAWSupported
    DispatchQueue.main.async {
        self.isSupportAppleProRaw = self.photoOutput.isAppleProRAWSupported
    }
} else {
    print("Could not add photo output to the session")
    setupResult = .configurationFailed
    session.commitConfiguration()
    return
}

session.commitConfiguration()
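If the 16:9 and 1:1 results really are crops of the 4:3 sensor output, one hedged approach is to crop the delivered capture yourself. The sketch below crops the processed CGImage representation of the AVCapturePhoto to a centered 16:9 band; it does not touch the RAW mosaic data itself, ignores orientation, and assumes a CGImage representation is available for the capture.

import AVFoundation
import CoreGraphics

// Produce a centered 16:9 crop of the capture's processed image.
func cropped16x9(from photo: AVCapturePhoto) -> CGImage? {
    guard let cgImage = photo.cgImageRepresentation() else { return nil }
    let width = CGFloat(cgImage.width)
    let fullHeight = CGFloat(cgImage.height)
    let targetHeight = width * 9.0 / 16.0              // assumes landscape 4:3 input
    let cropRect = CGRect(x: 0,
                          y: (fullHeight - targetHeight) / 2,
                          width: width,
                          height: targetHeight)
    return cgImage.cropping(to: cropRect)
}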
Posted by qihuijia. Last updated.
Post not yet marked as solved
1 Reply
1k Views
Hello, in one of my apps I'm trying to modify the pixel buffer from a ProRAW capture in order to write the modified DNG. This is what I try to do:

After capturing a ProRAW photo, I work in the delegate function

func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) { ... }

In here I can access photo.pixelBuffer and get its base address:

guard let buffer = photo.pixelBuffer else { return }
CVPixelBufferLockBaseAddress(buffer, [])
let pixelFormat = CVPixelBufferGetPixelFormatType(buffer)

// I check that the pixel format corresponds to ProRAW. This is successful; the code enters the if block.
if (pixelFormat == kCVPixelFormatType_64RGBALE) {
    guard let pointer = CVPixelBufferGetBaseAddress(buffer) else { return }

    // We have 16 bits per component, 4 components.
    let count = CVPixelBufferGetWidth(buffer) * CVPixelBufferGetHeight(buffer) * 4
    let mutable = pointer.bindMemory(to: UInt16.self, capacity: count)

    // As a test, I want to replace all pixels with 65000 to get a white image.
    let finalBufferArray: [Float] = Array.init(repeating: 65000, count: count)
    vDSP_vfixu16(finalBufferArray, 1, mutable, 1, vDSP_Length(finalBufferArray.count))

    // I create a vImage pixel buffer. Note that I'm referencing photo.pixelBuffer to be sure that I modified the underlying pixelBuffer of the AVCapturePhoto object.
    let imageBuffer = vImage.PixelBuffer<vImage.Interleaved16Ux4>(referencing: photo.pixelBuffer!, planeIndex: 0)

    // Inspect the CGImage.
    let cgImageFormat = vImage_CGImageFormat(bitsPerComponent: 16, bitsPerPixel: 64, colorSpace: CGColorSpace(name: CGColorSpace.displayP3)!, bitmapInfo: CGBitmapInfo(rawValue: CGImageAlphaInfo.last.rawValue | CGBitmapInfo.byteOrder16Little.rawValue))!
    let cgImage = imageBuffer.makeCGImage(cgImageFormat: cgImageFormat)!

    // I send the CGImage to the main view controller. This is successful; I can see a white image when rendering the CGImage into a UIImage. This lets me think that I successfully modified photo.pixelBuffer.
    firingFrameDelegate?.didSendCGImage(image: cgImage)
}

// Now I try to write the data. Unfortunately, this does not work. photo.fileDataRepresentation() writes the data corresponding to the original, unmodified pixelBuffer.
if let photoData = photo.fileDataRepresentation() {
    // Sending the data to the view controller and rendering it in the UIImage displays the original photo, not the modified pixelBuffer.
    firingFrameDelegate?.didSendData(data: photoData)
    thisPhotoData = photoData
}

CVPixelBufferUnlockBaseAddress(buffer, [])

The same happens if I try to write the data to disk: the DNG file displays the original photo and not the data corresponding to the modified photo.pixelBuffer. Do you know why this code does not work? Do you have any ideas on how I can modify the ProRAW pixel buffer so that I can write the modified buffer into a DNG file? My goal is to write a modified file, so I'm not sure I can use Core Image or vImage to output a ProRAW file.
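As a possible workaround (an assumption, not a confirmed answer to the DNG question): since fileDataRepresentation() appears to serialize the original capture data, the modified CVPixelBuffer can at least be persisted by rendering it through Core Image, here as a 16-bit TIFF rather than a DNG.

import CoreImage

// Write the edited pixel buffer to disk as a 16-bit TIFF.
func writeModifiedBuffer(_ buffer: CVPixelBuffer, to url: URL) throws {
    let image = CIImage(cvPixelBuffer: buffer)
    let context = CIContext()
    try context.writeTIFFRepresentation(of: image,
                                        to: url,
                                        format: .RGBA16,
                                        colorSpace: CGColorSpace(name: CGColorSpace.displayP3)!,
                                        options: [:])
}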
Posted by salvo_89. Last updated.
Post not yet marked as solved
0 Replies
329 Views
It seems like there is no way to do this, but I am having trouble wrapping my head around this possibility. How is this not a thing? Where is PhotoPicker for tvOS? Why is it missing?
Posted by olihall. Last updated.
Post not yet marked as solved
2 Replies
770 Views
In iOS 17.0.3, photos taken using Apple's native camera app can't be loaded immediately (within approximately 30 seconds) through PHPickerViewController. Specifically, the method itemProvider.canLoadObject(ofClass: UIImage.self) returns false. However, after about 30 seconds post-capture, the photos load without any hindrance. Initially, I considered an issue with my own photo-loading code, but the same problem persists even with Apple's official PHPickerDemo sample code.

[PHPickerDemo - SelectingPhotosAndVideosInIOS.zip] https://developer.apple.com/documentation/photokit/selecting_photos_and_videos_in_ios

ViewController.swift, line 89 (PHPickerDemo):

func displayNext() {
    guard let assetIdentifier = selectedAssetIdentifierIterator?.next() else { return }
    currentAssetIdentifier = assetIdentifier

    let progress: Progress?
    let itemProvider = selection[assetIdentifier]!.itemProvider
    if itemProvider.canLoadObject(ofClass: PHLivePhoto.self) {
        progress = itemProvider.loadObject(ofClass: PHLivePhoto.self) { [weak self] livePhoto, error in
            DispatchQueue.main.async {
                self?.handleCompletion(assetIdentifier: assetIdentifier, object: livePhoto, error: error)
            }
        }
    } else if itemProvider.canLoadObject(ofClass: UIImage.self) { // <== returns FALSE
        progress = itemProvider.loadObject(ofClass: UIImage.self) { [weak self] image, error in
            DispatchQueue.main.async {
                self?.handleCompletion(assetIdentifier: assetIdentifier, object: image, error: error)
            }
        }
    }
    ...omitted...
}

Environment & Settings:
iPhone 12
iOS 17.0.3
Settings -> Camera -> Formats -> High Efficiency (Enabled)

Reproduction Steps:
Take a photo in normal photo mode with Apple's native camera app (not in portrait mode).
Launch the PHPickerDemo app.
Tap the photo icon located in the top right.
Observe that the photo fails to load.

Workarounds:
Wait for over 30 seconds prior to selecting the photo.
Opt to shoot in portrait mode rather than the standard photo mode.
Switch on Settings -> Camera -> Formats -> Most Compatible.

I am developing a photo editing app, and I have received many emails from users stating that they cannot select photos since updating to iOS 17. Thanks.
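A hedged workaround sketch, not a documented fix for the behaviour described above: when canLoadObject(ofClass: UIImage.self) reports false for a freshly captured photo, falling back to a raw data load of the generic image type may still succeed. The completion handler is invoked off the main queue, so callers should hop back before touching UI.

import PhotosUI
import UIKit
import UniformTypeIdentifiers

func loadImage(from itemProvider: NSItemProvider, completion: @escaping (UIImage?) -> Void) {
    if itemProvider.canLoadObject(ofClass: UIImage.self) {
        _ = itemProvider.loadObject(ofClass: UIImage.self) { object, _ in
            completion(object as? UIImage)
        }
    } else if itemProvider.hasItemConformingToTypeIdentifier(UTType.image.identifier) {
        // Fallback: decode the raw image data ourselves.
        _ = itemProvider.loadDataRepresentation(forTypeIdentifier: UTType.image.identifier) { data, _ in
            completion(data.flatMap { UIImage(data: $0) })
        }
    } else {
        completion(nil)
    }
}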
Posted. Last updated.
Post not yet marked as solved
1 Reply
1.3k Views
I have a UICollectionView that displays photos from the device's album using the Photos framework. The photos are correctly displayed, but if I scroll fast (like when you tap at the top of the screen to jump back to the top of the collection view), some photos end up at the wrong indexPath. I just need to scroll a bit to move the bad photo off screen, and everything goes back into place. I clean the cell during prepareForReuse by canceling the current request. I presume it's a problem with the asynchronous request of PHImageManager, but I don't know how to avoid it, and I want to keep the asynchronous request so the collection view stays smooth. Here is some code:

View Controller:

extension AlbumDetailViewController: UICollectionViewDataSource {
    func numberOfSectionsInCollectionView(collectionView: UICollectionView) -> Int {
        return 1
    }

    func collectionView(collectionView: UICollectionView, numberOfItemsInSection section: Int) -> Int {
        return photoList.count
    }

    func collectionView(collectionView: UICollectionView, cellForItemAtIndexPath indexPath: NSIndexPath) -> UICollectionViewCell {
        let cell = collectionView.dequeueReusableCellWithReuseIdentifier("PhotoCell", forIndexPath: indexPath) as! PhotoCollectionCell
        cell.setImage(photoList.objectAtIndex(indexPath.row) as! PHAsset)
        return cell
    }
}

Custom CollectionViewCell:

class PhotoCollectionCell: UICollectionViewCell {
    @IBOutlet weak var imageView: UIImageView!

    var requestId: PHImageRequestID!
    let manager = PHImageManager.defaultManager()

    override func awakeFromNib() {
        super.awakeFromNib()
        // Initialization code
    }

    override func prepareForReuse() {
        self.imageView.image = nil
        manager.cancelImageRequest(self.requestId)
    }

    func setImage(asset: PHAsset) {
        let option = PHImageRequestOptions()
        option.resizeMode = .Fast
        option.deliveryMode = .HighQualityFormat
        self.requestId = manager.requestImageForAsset(asset, targetSize: CGSize(width: self.frame.size.width * UIScreen.mainScreen().scale, height: self.frame.size.height * UIScreen.mainScreen().scale), contentMode: PHImageContentMode.Default, options: option, resultHandler: { (result, info) -> Void in
            self.imageView.image = result
        })
    }
}

Thank you
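A hedged sketch of the usual fix, written against the current API names rather than the Swift 2 code above: remember which asset the cell most recently requested and drop results that arrive for an earlier, reused request. The representedAssetIdentifier idea is borrowed from common PhotoKit sample patterns and is an assumption here.

import Photos
import UIKit

final class PhotoCollectionCell: UICollectionViewCell {
    @IBOutlet weak var imageView: UIImageView!
    private var representedAssetIdentifier: String?
    private var requestId: PHImageRequestID?
    private let manager = PHImageManager.default()

    override func prepareForReuse() {
        super.prepareForReuse()
        imageView.image = nil
        if let requestId { manager.cancelImageRequest(requestId) }
        representedAssetIdentifier = nil
    }

    func setImage(_ asset: PHAsset) {
        representedAssetIdentifier = asset.localIdentifier
        let options = PHImageRequestOptions()
        options.resizeMode = .fast
        options.deliveryMode = .highQualityFormat
        let scale = UIScreen.main.scale
        let targetSize = CGSize(width: bounds.width * scale, height: bounds.height * scale)
        requestId = manager.requestImage(for: asset,
                                         targetSize: targetSize,
                                         contentMode: .default,
                                         options: options) { [weak self] image, _ in
            // Ignore results that belong to an asset this cell no longer represents.
            guard let self, self.representedAssetIdentifier == asset.localIdentifier else { return }
            self.imageView.image = image
        }
    }
}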
Posted. Last updated.
Post not yet marked as solved
2 Replies
3.9k Views
While trying to save a photo to a custom album on some devices, we get the following error:

The operation couldn’t be completed. (PHPhotosErrorDomain error 3300). code: 3300

This is the part of the code where the issue happens:

PHPhotoLibrary.shared().performChanges({
    let assetChangeRequest = PHAssetChangeRequest.creationRequestForAsset(from: image)
    guard let placeholder = assetChangeRequest.placeholderForCreatedAsset else { return }
    let albumChangeRequest = PHAssetCollectionChangeRequest(for: album)
    albumChangeRequest?.addAssets([placeholder] as NSArray)
}, completionHandler: { success, error in
    completion(success, error)
})

I would be thankful for any tips, since I am out of ideas.
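A hedged sketch, not a confirmed explanation of error 3300: one assumption worth checking is the access level, since inserting into a custom album is a library modification and may need .readWrite rather than .addOnly authorization. The sketch verifies the status before running the change request.

import Photos
import UIKit

func save(_ image: UIImage, to album: PHAssetCollection, completion: @escaping (Bool, Error?) -> Void) {
    PHPhotoLibrary.requestAuthorization(for: .readWrite) { status in
        guard status == .authorized else {
            completion(false, nil)   // denied, restricted, or limited access
            return
        }
        PHPhotoLibrary.shared().performChanges({
            let assetChangeRequest = PHAssetChangeRequest.creationRequestForAsset(from: image)
            guard let placeholder = assetChangeRequest.placeholderForCreatedAsset,
                  let albumChangeRequest = PHAssetCollectionChangeRequest(for: album) else { return }
            albumChangeRequest.addAssets([placeholder] as NSArray)
        }, completionHandler: completion)
    }
}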
Posted by blukz. Last updated.
Post not yet marked as solved
2 Replies
352 Views
Hi, I am trying to store persistent changes to the photo library. I know I can use a PHPersistentChangeToken to get the changes since that token, but I am not sure how to serialize this object and save it into a file. I am also wondering if you can serialize a PHPersistentChangeFetchResult and store that in a file. Cheers.
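A minimal sketch, assuming PHPersistentChangeToken adopts NSSecureCoding so it can go through NSKeyedArchiver; the change fetch result is a live object and is not archived here.

import Foundation
import Photos

// Persist the token to disk so the next launch can fetch changes since it.
func saveToken(_ token: PHPersistentChangeToken, to url: URL) throws {
    let data = try NSKeyedArchiver.archivedData(withRootObject: token, requiringSecureCoding: true)
    try data.write(to: url, options: .atomic)
}

// Restore a previously saved token, if any.
func loadToken(from url: URL) throws -> PHPersistentChangeToken? {
    let data = try Data(contentsOf: url)
    return try NSKeyedUnarchiver.unarchivedObject(ofClass: PHPersistentChangeToken.self, from: data)
}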
Posted by nqeng. Last updated.
Post not yet marked as solved
0 Replies
332 Views
My goal is to get/save the captured photo (from the default camera) immediately from my app while the app is in the background. When I capture a photo with the default camera, the photoLibraryDidChange(_:) function does not execute at that time. But when I reopen my app, the function executes and delivers the images that were captured during that period. How can I get photoLibraryDidChange(_:) to execute while the app is in the background?
Posted by ashikur16. Last updated.
Post not yet marked as solved
3 Replies
518 Views
Hello, after updating the physical device to iOS 17, it seems there's an issue with the image picker's functionality. In our app, even though NSItemProvider indicates that canLoadObject(ofClass: UIImage.self) returns true, the method loadObject(ofClass: UIImage.self) { (object, error) in ... } consistently returns nil. There's also a possibility that the same phenomenon is occurring with the standard Notes app's image picker, preventing images from being passed to the app.
Posted. Last updated.
Post not yet marked as solved
13 Replies
7.1k Views
Hello! I am playing around with the PHPickerViewController and so far I was able to get the selected images by loading them into UIImage instances, but I don't know how to get the selected video. Below is the relevant implementation of func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult]):

let provider = result.itemProvider
guard provider.hasItemConformingToTypeIdentifier(AVFileType.mov.rawValue) else { return }
provider.loadItem(forTypeIdentifier: AVFileType.mov.rawValue, options: nil) { (fileURL, error) in
    if let error = error {
        print(error)
        return
    }
    guard let videoURL = fileURL as? URL else { return }
    DispatchQueue.main.async {
        let fm = FileManager.default
        let destination = fm.temporaryDirectory.appendingPathComponent("video123.mov")
        try! fm.copyItem(at: videoURL, to: destination)
        let playerVC = AVPlayerViewController()
        playerVC.player = AVPlayer(url: destination)
        self.present(playerVC, animated: true, completion: nil)
    }
}

I get a crash trying to copy the item. It says the source file does not exist, but the path looks real to me: "The file “3C2BCCBC-4474-491B-90C2-93DF848AADF5.mov” couldn’t be opened because there is no such file." I tried it without copying first and just passing the URL to AVPlayer, but nothing would play. I am testing this on a simulator. Thanks for help!
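A hedged sketch of the approach that usually works here: loadFileRepresentation hands back a temporary URL that is only valid until its completion handler returns, so the copy has to happen synchronously inside the handler rather than after a hop to the main queue.

import AVKit
import PhotosUI
import UniformTypeIdentifiers

func handleVideo(_ result: PHPickerResult, presentOn viewController: UIViewController) {
    let provider = result.itemProvider
    guard provider.hasItemConformingToTypeIdentifier(UTType.movie.identifier) else { return }
    provider.loadFileRepresentation(forTypeIdentifier: UTType.movie.identifier) { url, error in
        guard let url else {
            print(error?.localizedDescription ?? "no url")
            return
        }
        let destination = FileManager.default.temporaryDirectory
            .appendingPathComponent(UUID().uuidString)
            .appendingPathExtension("mov")
        do {
            // Copy before the handler returns; the source file is deleted afterwards.
            try FileManager.default.copyItem(at: url, to: destination)
        } catch {
            print(error)
            return
        }
        DispatchQueue.main.async {
            let playerVC = AVPlayerViewController()
            playerVC.player = AVPlayer(url: destination)
            viewController.present(playerVC, animated: true)
        }
    }
}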
Posted by nemecek_f. Last updated.
Post not yet marked as solved
1 Reply
605 Views
I am trying to display HDR images (ProRAW) within a UIImageView using preferredImageDynamicRange. This was shown in a 2023 WWDC video.

let imageView = UIImageView()
if #available(iOS 17.0, *) {
    self.imageView.preferredImageDynamicRange = UIImage.DynamicRange.high
}
self.imageView.clipsToBounds = true
self.imageView.isMultipleTouchEnabled = true
self.imageView.contentMode = .scaleAspectFit
self.photoScrollView.addSubview(self.imageView)

I pull the image from PHImageManager:

let options = PHImageRequestOptions()
options.deliveryMode = .highQualityFormat
options.isNetworkAccessAllowed = true
PHImageManager.default().requestImage(for: asset, targetSize: self.targetSize(), contentMode: .aspectFit, options: options, resultHandler: { image, info in
    guard let image = image else { return }
    DispatchQueue.main.async {
        self.imageView.image = image
        if #available(iOS 17.0, *) {
            self.imageView.preferredImageDynamicRange = UIImage.DynamicRange.high
        }
    }
})

Issue: The image shows successfully, yet not in HDR mode (no bright specular highlights, as seen when the same ProRAW image is viewed in the native Camera app). What am I missing here?
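A hedged guess: besides setting preferredImageDynamicRange on the image view, the image request itself may need to opt in to HDR content. The allowHighDynamicRange flag on PHImageRequestOptions is assumed here from the WWDC23 HDR material and should be verified against the iOS 17 SDK.

import Photos
import UIKit

func requestHDRImage(for asset: PHAsset, targetSize: CGSize, into imageView: UIImageView) {
    let options = PHImageRequestOptions()
    options.deliveryMode = .highQualityFormat
    options.isNetworkAccessAllowed = true
    if #available(iOS 17.0, *) {
        options.allowHighDynamicRange = true   // assumed iOS 17 opt-in for HDR renditions
    }
    PHImageManager.default().requestImage(for: asset,
                                          targetSize: targetSize,
                                          contentMode: .aspectFit,
                                          options: options) { image, _ in
        DispatchQueue.main.async {
            imageView.image = image
            if #available(iOS 17.0, *) {
                imageView.preferredImageDynamicRange = .high
            }
        }
    }
}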
Posted by sle39lvr. Last updated.
Post not yet marked as solved
1 Reply
348 Views
We have received a lot of user feedback saying that our app caused videos in the user's system photo album to stop playing. We were able to reproduce this after exercising some modules of our app many times. While monitoring the device log and tapping into the system album, we received the following abnormal output:

VideoContentProvider received result:<AVPlayerItem: 0x281004850, asset = <AVURLAsset: 0x28128fce0, URL = file:///var/mobile/Media/DCIM/100APPLE/IMG_0085.MP4>>, info:{ PHImageResultRequestIDKey = 316; }, priority:oneup automatic, strategy:<PXDisplayAssetVideoContentDeliveryStrategy: 0x2836c3000>quality: medium+(med-high), segment:{ nan - nans }, streaming:YES, network:YES, audio:YES, targetSize:{1280, 1280}, displayAsset:8E30C461-B089-4142-82D9-3A8CFF3B5DE9

<PUBrowsingVideoPlayer: 0xc46a59770> Asset : <PHAsset: 0xc48f5fc50> 8E30C461-B089-4142-82D9-3A8CFF3B5DE9/L0/001 mediaType=2/524288, sourceType=1, (828x1792), creationDate=2023-07-19 上午7:36:41 +0000, location=0, hidden=0, favorite=0, adjusted=0

VideoSession : <PXVideoSession 0xc48a1ec50> { Content Provider: <PXPhotoKitVideoContentProvider: 0x282d441e0>, Asset <PHAsset: 0xc48f5fc50> 8E30C461-B089-4142-82D9-3A8CFF3B5DE9/L0/001 mediaType=2/524288, sourceType=1, (828x1792), creationDate=2023-07-19 上午7:36:41 +0000, location=0, hidden=0, favorite=0, adjusted=0 , Media Provider: <PUPhotoKitMediaProvider: 0x28104da70> Desired Play State: Paused Play State: Paused Stalled: 0 At Beginning: 1 End: 0 Playback: ‖ Paus √ b0 a0 s0 l1 f0 e0 r0.0 0.000/60.128 VideoOutput: (null) Got First Pixel Buffer: NO Pixel Buffer Frame Drops: 0 Buffering: 0 }: Starting disabling of video loading for reason: OutOfFocus

<PUBrowsingVideoPlayer: 0xc46de66e0> Asset : <PHAsset: 0xc48f5f1d0> 11ECA95E-0B79-4C7C-97C6-5958EE139BAB/L0/001 mediaType=2/0, sourceType=1, (1080x1920), creationDate=2023-09-21 上午7:54:46 +0000, location=1, hidden=0, favorite=0, adjusted=0 VideoSession : (null): Starting disabling of video loading for reason: OutOfFocus

I think this message is important:

VideoSession : (null): Starting disabling of video loading for reason: OutOfFocus

Restarting the iPhone resolves the anomaly. Do you know the reason, or how we can avoid this bug? The bug looks like the ones described here:
https://discussionschinese.apple.com/thread/254766045
https://discussionschinese.apple.com/thread/254787836
Posted by YangMike. Last updated.
Post not yet marked as solved
9 Replies
3.3k Views
I use the following code to parse photo metadata, and this works well. However, I am unable to pull the new iOS 14 "caption" from this metadata (it worked in early iOS 14 betas, but has since stopped working in the GM). Does anyone know how I can get the caption data from a PHAsset? Thanks! Stephen

let options = PHContentEditingInputRequestOptions()
options.isNetworkAccessAllowed = true
asset.requestContentEditingInput(with: options, completionHandler: { (contentEditingInput, _) -> Void in
    if let url = contentEditingInput?.fullSizeImageURL {
        let fullImage = CIImage(contentsOf: url)

        // get all the metadata
        self.allPhotoMetadata = fullImage?.properties ?? [:]

        // {TIFF}
        if let tiffDict = self.allPhotoMetadata["{TIFF}"] as? [String: Any] {
            if tiffDict["Make"] != nil {
                self.cameraData[cameraKeys.make] = tiffDict["Make"]
            }
            if tiffDict["Model"] != nil {
                self.cameraData[cameraKeys.model] = tiffDict["Model"]
            }
            if tiffDict["ImageDescription"] != nil {
                self.imageData[imageKeys.caption] = tiffDict["ImageDescription"]
            }
        }

        // {IPTC}
        if let iptcDict = self.allPhotoMetadata["{IPTC}"] as? [String: Any] {
            // If we didn't find a caption in the TIFF dict, try to get it from the IPTC data:
            // first try Caption/Abstract, then ArtworkContentDescription.
            if self.imageData[imageKeys.caption] == nil {
                if iptcDict["Caption/Abstract"] != nil {
                    self.imageData[imageKeys.caption] = iptcDict["Caption/Abstract"]
                } else if iptcDict["ArtworkContentDescription"] != nil {
                    self.imageData[imageKeys.caption] = iptcDict["ArtworkContentDescription"]
                }
            }
        }
    }
})
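If CIImage.properties no longer exposes the value, a hedged alternative is to read the IPTC dictionary straight from the file with Image I/O. This assumes the Photos caption is written into IPTC Caption/Abstract, which is not confirmed here; the helper name is illustrative.

import ImageIO

func captionAbstract(from url: URL) -> String? {
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil),
          let properties = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [CFString: Any],
          let iptc = properties[kCGImagePropertyIPTCDictionary] as? [CFString: Any] else {
        return nil
    }
    return iptc[kCGImagePropertyIPTCCaptionAbstract] as? String
}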
Posted by sorth. Last updated.
Post not yet marked as solved
2 Replies
1.1k Views
Users can run our apps on Macs with Apple Silicon via the "iPad Apps on Mac" feature. The apps use PHPhotoLibrary.requestAuthorization(for: .addOnly, handler: callback) to request write-only access to the user's Photo Library during image export. This works as intended on macOS, but a huge problem arises when the user denies access (by accident or intentionally) and later decides that they want us to add their image to Photos: there is no way to grant this permission again. In System Preferences → Privacy & Security → Photos, the app is just not listed – in fact, none of the "iPad Apps on Mac" apps appear here. Not even tccutil reset all my.bundle.id works; it just reports tccutil: Failed to reset all approval status for my.bundle.id. Uninstalling, restarting the Mac, and reinstalling the app also doesn't work. The system seems to remember the initial decision. Is this an oversight in the integration of those apps with macOS, or are we missing something fundamental here? Is there maybe a way to prompt the user again?
Posted. Last updated.