PhotoKit


Work with image and video assets managed by the Photos app, including those from iCloud Photos and Live Photos, using PhotoKit.

PhotoKit Documentation

Posts under PhotoKit tag

101 Posts
Post not yet marked as solved
2 Replies
1.1k Views
Users can run our apps on Macs with Apple Silicon via the "iPad Apps on Mac" feature. The apps use PHPhotoLibrary.requestAuthorization(for: .addOnly, handler: callback) to request write-only access to the user's Photo Library during image export. This works as intended on macOS, but a huge problem arises when the user denies access (by accident or intentionally) and later decides that they want us to add their image to Photos: There is no way to grant this permission again. In System Preferences → Privacy & Security → Photos, the app is just not listed – in fact, none of the "iPad Apps on Mac" apps appear here. Not even tccutil reset all my.bundle.id works. It just reports tccutil: Failed to reset all approval status for my.bundle.id. Uninstalling, restarting the Mac, and reinstalling the app also doesn't work. The system seems to remember the initial decision. Is this an oversight in the integration of those apps with macOS, or are we missing something fundamental here? Is there maybe a way to prompt the user again?
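One defensive pattern (a sketch, not a fix for the missing Settings pane): check the current authorization status before requesting, so a prior denial can at least be detected and explained to the user instead of failing silently. Only the .notDetermined state will ever produce the system prompt.

```swift
import Photos

// Sketch: surface a prior denial instead of failing silently. Only
// .notDetermined ever triggers the system permission prompt.
func ensureAddOnlyAccess(completion: @escaping (Bool) -> Void) {
    switch PHPhotoLibrary.authorizationStatus(for: .addOnly) {
    case .authorized, .limited:
        completion(true)
    case .notDetermined:
        // The one state in which the system will actually show the dialog.
        PHPhotoLibrary.requestAuthorization(for: .addOnly) { newStatus in
            completion(newStatus == .authorized || newStatus == .limited)
        }
    case .denied, .restricted:
        // Access was denied earlier; tell the user to re-enable it in
        // Settings. (On "iPad Apps on Mac" the pane may be missing, which
        // is exactly the problem described above.)
        completion(false)
    @unknown default:
        completion(false)
    }
}
```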
Posted by
Post not yet marked as solved
3 Replies
984 Views
Dear Experts, PHAsset.creationDate is an NSDate, which does not have a timezone associated with it, right? Consider a photo viewer app. If I take a photo of the sunrise at 0600 local time while I am away, when I get home and view the photo in the app, I believe I want the timestamp shown with the photo to be 0600. Do you agree? But NSDate is just a time-point, and I don't think Foundation (or anything else in iOS) has a type that combines a time-point with a time zone. Nor does PHAsset have any other useful attributes - unless I were to determine the time zone from the location! Am I missing anything?
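Agreed that Foundation has no combined date-plus-zone type; a Date only becomes a wall-clock time once paired with a TimeZone obtained separately (for instance parsed from the EXIF OffsetTimeOriginal tag of the original file, or geocoded from PHAsset.location — both assumptions about where the zone comes from). Given such a zone, the formatting itself is straightforward:

```swift
import Foundation

// Sketch: PHAsset.creationDate is a bare point in time, so the wall-clock
// time at the capture location must be reconstructed with a TimeZone
// supplied from elsewhere.
func localTimestamp(for creationDate: Date, in captureZone: TimeZone) -> String {
    let formatter = DateFormatter()
    formatter.locale = Locale(identifier: "en_US_POSIX")
    formatter.dateFormat = "yyyy-MM-dd HH:mm"
    formatter.timeZone = captureZone // format in the capture zone, not the device zone
    return formatter.string(from: creationDate)
}

// Example: a photo taken at 06:00 in Tokyo still reads 06:00 when viewed at home.
let shot = ISO8601DateFormatter().date(from: "2023-06-01T06:00:00+09:00")!
print(localTimestamp(for: shot, in: TimeZone(identifier: "Asia/Tokyo")!))
// → 2023-06-01 06:00
```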
Post marked as solved
1 Reply
818 Views
I use PHPicker to let users import photos, but UIImage doesn't support .AVIF, so I want to get the original data of the .AVIF picture. This is my code:

```swift
func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult]) {
    picker.dismiss(animated: true)
    for image in results {
        loadImagePickerResult(image: image)
    }
}

func loadImagePickerResult(image: PHPickerResult) {
    if image.itemProvider.canLoadObject(ofClass: UIImage.self) {
        image.itemProvider.loadObject(ofClass: UIImage.self) { [weak self] newImage, error in
            guard let self = self else { return }
            if let _ = error {
                // handle error
            } else if let needAddImage = newImage as? UIImage {
                let imageItem = ContentImageItem()
                imageItem.image = needAddImage
                self.viewModel.selectedImageList.append(imageItem)
                DispatchQueue.main.async {
                    self.scrollImageView.reloadData()
                    self.checkConfirmState()
                }
            }
        }
    } else if image.itemProvider.hasItemConformingToTypeIdentifier(kUTTypeImage as String) {
        image.itemProvider.loadItem(forTypeIdentifier: kUTTypeImage as String, options: nil) { [weak self] item, error in
            guard let self = self else { return }
            guard let url = item as? URL else { return }
            var imageData: Data?
            do {
                imageData = try Data(contentsOf: url, options: [.mappedIfSafe, .alwaysMapped])
            } catch {
                // handle error
            }
            guard let selectedImageData = imageData else { return }
            // selectedImageData is empty data
        }
    }
}
```

When I choose an .AVIF picture, the itemProvider can load the image with the "kUTTypeImage" type identifier and I successfully get the local path of the picture, but when I use Data(contentsOf:) to read the original data, I only get empty data. Is there any problem with this code? Does anyone have experience handling this? FileManager.default.contents(atPath: url.path) and NSData(contentsOf:) also return empty Data.
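A sketch of one alternative, assuming the empty reads are caused by the provider's temporary file not surviving outside its callback: ask the item provider for the raw bytes directly instead of a file URL. UTType.image (the modern replacement for kUTTypeImage) also covers AVIF files that UIImage cannot decode.

```swift
import PhotosUI
import UniformTypeIdentifiers

// Hypothetical helper: hand back the original file bytes of a picked image.
func loadOriginalData(from result: PHPickerResult,
                      completion: @escaping (Data?) -> Void) {
    result.itemProvider.loadDataRepresentation(
        forTypeIdentifier: UTType.image.identifier) { data, error in
        if let error = error {
            print("load failed: \(error)")
        }
        // `data` is the complete original file, AVIF included; decode it
        // with a library of your choice or hand it to Core Image.
        completion(data)
    }
}
```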
Post not yet marked as solved
0 Replies
510 Views
```swift
item.loadTransferable(type: Data.self) { result in
    switch result {
    case .success(let data):
        guard let data = data, let image = UIImage(data: data) else { break }
        imageView.image = image
    case .failure(let error):
        ...
    }
}
```

I load a RAW-format photo with the code above, but the displayed image is very blurry and messed up. Not all RAW photos behave like this, though. What is the reason?
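A hedged guess at the cause: UIImage(data:) may decode only the small embedded preview of some RAW files, which looks blurry when scaled up. One way to test this theory is to run the bytes through Core Image's RAW pipeline (iOS 15+) instead:

```swift
import CoreImage

// Sketch: decode RAW data explicitly rather than relying on UIImage(data:).
func decodeRAW(_ data: Data) -> CIImage? {
    // CIRAWFilter understands the camera-specific RAW formats Core Image supports.
    guard let rawFilter = CIRAWFilter(imageData: data, identifierHint: nil) else {
        return nil // not a RAW format Core Image recognizes
    }
    return rawFilter.outputImage
}
```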
Post marked as solved
3 Replies
830 Views
We are developing a widget for an iOS app that displays photos from the user's local photo library. To get photos, we use the PHImageManager.requestImage method with the following parameters:

```swift
let requestOptions = PHImageRequestOptions()
requestOptions.isNetworkAccessAllowed = false
requestOptions.isSynchronous = true
requestOptions.resizeMode = .exact
requestOptions.deliveryMode = .highQualityFormat

// context is the TimelineProviderContext of the widget
PHImageManager.requestImage(for: asset,
                            targetSize: context.displaySize,
                            contentMode: .aspectFill,
                            options: requestOptions) { image, error in
    ...
}
```

We load about 12 images per update cycle, calling the requestImage method that many times. While executing this code we can regularly observe that a requestImage call increases memory consumption by 20MB, even when loading an image of about 3MB. (The image size was retrieved from the PHAssetResource.) The memory spikes before the callback closure is executed. We tinkered with displaySize and the PHImageRequestOptions and tried to reduce our overall memory consumption, to no avail. We integrated a DispatchGroup into our code to ensure that requesting images is strictly sequential, and we do so in an autoreleasepool to free up memory after processing the image data. I would like to stress that we are reasonably sure the memory spike does not occur in the closure that receives the requested image but in the time between calling the requestImage method and the callback. We tested this by placing one breakpoint on the line where we call requestImage and another on the first line of the callback closure. After continuing past the first breakpoint, the process ends immediately, we get a "" warning, and the second breakpoint is never hit.
This is a huge problem for us because iOS terminates our widget process every time memory consumption spikes past the 30MB limit (our widget consumes about 13-15MB of memory while updating the timeline). The issue was observed on an iPhone 11 running iOS 16.4 and occurs while loading JPG photos between 0.7MB and 4.0MB. We kindly ask for help on why memory spikes when using PHImageManager.requestImage, how to prevent it, or whether this is a known issue.
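Not an explanation of the spike, but a workaround sketch that keeps peak memory proportional to the target size: write the asset to a file first (for example with PHAssetResourceManager — that fetching step is assumed, not shown) and downsample it with ImageIO, which decodes directly into a thumbnail-sized bitmap instead of inflating the full-resolution image.

```swift
import ImageIO
import CoreGraphics

// Sketch: bounded-memory decode of an on-disk image for a widget.
func downsampledImage(at url: URL, maxPixelSize: CGFloat) -> CGImage? {
    // Don't cache the full-size decode; we only ever want the thumbnail.
    let sourceOptions = [kCGImageSourceShouldCache: false] as CFDictionary
    guard let source = CGImageSourceCreateWithURL(url as CFURL, sourceOptions) else {
        return nil
    }
    let thumbOptions = [
        kCGImageSourceCreateThumbnailFromImageAlways: true,
        kCGImageSourceShouldCacheImmediately: true,
        kCGImageSourceCreateThumbnailWithTransform: true,
        kCGImageSourceThumbnailMaxPixelSize: maxPixelSize
    ] as CFDictionary
    // Peak memory is bounded by maxPixelSize, not the original resolution.
    return CGImageSourceCreateThumbnailAtIndex(source, 0, thumbOptions)
}
```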
Post not yet marked as solved
1 Reply
458 Views
I have my callback registered, and it's called repeatedly without anything changing in the photo album. I'm getting the assets via PHAssetCollection.fetchAssetCollections. In the callback, I update my PHFetchResult and PHCollection to the ones passed in the callback, and I process the changeDetails, yet the callback keeps getting called. Any ideas?
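For comparison, a minimal observer sketch: repeated callbacks like this usually mean the stored fetch result is not being swapped for changeDetails' fetchResultAfterChanges (or the swap races the next notification), so the same diff stays "new" forever. The pattern below replaces the stored result before processing the diff.

```swift
import Photos

// Sketch of a change observer that replaces its fetch result atomically.
final class AlbumObserver: NSObject, PHPhotoLibraryChangeObserver {
    private var fetchResult: PHFetchResult<PHAsset>

    init(fetchResult: PHFetchResult<PHAsset>) {
        self.fetchResult = fetchResult
        super.init()
        PHPhotoLibrary.shared().register(self)
    }

    deinit {
        PHPhotoLibrary.shared().unregisterChangeObserver(self)
    }

    func photoLibraryDidChange(_ changeInstance: PHChange) {
        guard let details = changeInstance.changeDetails(for: fetchResult) else {
            return // nothing in this fetch result changed
        }
        DispatchQueue.main.async {
            // Replace the stored result *before* applying the diff, so the
            // next notification is computed against the new baseline.
            self.fetchResult = details.fetchResultAfterChanges
            // ... apply details.removedIndexes / insertedIndexes / changedIndexes
        }
    }
}
```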
Post not yet marked as solved
1 Reply
689 Views
Good day. I'm using PHPicker to handle multiple image selections. When a user initially selects, for example, 3 images from the photo library, they can re-open the photo library and deselect any of the selected photos. But I'm not sure how to handle the deselect action. Right now I have this code that appends the images to an array, which I then show in the UI:

```swift
func makeUIViewController(context: Context) -> some UIViewController {
    var configuration = PHPickerConfiguration(photoLibrary: PHPhotoLibrary.shared())
    configuration.filter = .images // filter only to images
    configuration.selectionLimit = 4 // max 4 selection
    configuration.preselectedAssetIdentifiers = self.imageIdentifierArray
    configuration.selection = .ordered
    //print(self.imageIdentifierArray)
    let photoPickerViewController = PHPickerViewController(configuration: configuration)
    photoPickerViewController.delegate = context.coordinator // Use Coordinator for delegation
    return photoPickerViewController
}

func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult]) {
    picker.dismiss(animated: true, completion: nil)
    let identifiers = results.compactMap(\.assetIdentifier)
    self.parent.imageIdentifierArray = identifiers
    // Handle selected photos
    for result in results where result.itemProvider.canLoadObject(ofClass: UIImage.self) {
        result.itemProvider.loadObject(ofClass: UIImage.self) { [weak self] image, error in
            guard let self = self, let image = image as? UIImage else { return }
            DispatchQueue.main.async {
                selectedPhotos.append(image)
            }
        }
    }
}
```

I would like to know how I can update the selectedPhotos array whenever a user deselects one of the selected photos in the photo library. My guess is that the problem is something about result.itemProvider.canLoadObject, because when I deselect a pre-selected image, canLoadObject returns false and the loop doesn't run for it again; it runs only when I reselect the photo I deselected.
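One approach, sketched with a hypothetical pure helper: instead of appending inside the canLoadObject loop, rebuild the selection from the identifiers the picker reports each time it finishes. Anything missing from the new identifier list was deselected and is dropped; only genuinely new identifiers need loading.

```swift
// Sketch: merge the picker's latest identifier list into the current
// selection. Deselected items disappear; already-loaded images are kept;
// new identifiers are returned so only they need to be loaded.
func mergeSelection<Image>(current: [String: Image],
                           pickedIdentifiers: [String]) -> (kept: [String: Image], toLoad: [String]) {
    var kept: [String: Image] = [:]
    var toLoad: [String] = []
    for id in pickedIdentifiers {
        if let image = current[id] {
            kept[id] = image  // still selected, already loaded
        } else {
            toLoad.append(id) // newly selected, needs loading
        }
    }
    return (kept, toLoad)     // anything absent from pickedIdentifiers is dropped
}
```

In the delegate you would call this with the current identifier-to-image dictionary and results.compactMap(\.assetIdentifier), assign kept back, and load only the toLoad identifiers.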
Post not yet marked as solved
0 Replies
271 Views
I've looked over the official documentation, and many posts online, and it seems there is no API in PhotoKit for developing functions to organize and manage the hidden photos or the Recently Deleted list. So, does Apple have any plan to support such APIs in the future? Or am I just wrong, and there is some other way to do it? If so, please show me; I'd really appreciate it.
Post marked as solved
3 Replies
1.4k Views
Using the new inline PhotosPicker style in iOS 17, it isn't clear how to handle the cancel button's input, and I cannot seem to find an answer in the documentation.

```swift
PhotosPicker(
    "Select picture",
    selection: $selected,
    selectionBehavior: .default,
    photoLibrary: .shared()
)
.photosPickerStyle(.inline)
```

Does anybody have a solution, or is this a bug that needs to be fixed?
Post not yet marked as solved
0 Replies
615 Views
ImageSaver class:

```swift
class ImageSaver: NSObject {
    var successHandler: (() -> Void)?
    var errorHandler: ((Error) -> Void)?

    func writeToPhotoAlbum(image: UIImage) {
        UIImageWriteToSavedPhotosAlbum(image, self, #selector(saveCompleted), nil)
    }

    @objc func saveCompleted(_ image: UIImage, didFinishSavingWithError error: Error?, contextInfo: UnsafeRawPointer) {
        if let error = error {
            errorHandler?(error)
        } else {
            successHandler?()
        }
    }
}
```

Button action to trigger the download of a PNG file from a URL:

```swift
Button {
    let imageSaver = ImageSaver()
    imageSaver.successHandler = {
        print("Success!")
    }
    imageSaver.errorHandler = {
        print("Oops: \($0.localizedDescription)")
    }
    // I'm sure the image URL is valid; I can download it
    // from Google Chrome successfully
    let imageRequestUrl = URL(string: lastImageUrl)!
    print("imageRequestUrl:\(imageRequestUrl)")
    let req = URLRequest(url: imageRequestUrl)
    let task = URLSession.shared.dataTask(with: req) { d, res, err in
        if let data = d, let image = UIImage(data: data) {
            imageSaver.writeToPhotoAlbum(image: image)
        }
    }
    task.resume()
}
```

And this is the error:

findWriterForTypeAndAlternateType:119: unsupported file format 'org.webmproject.webp'

More info: when I change the image URL to another one, like some other open-source PNG file, there's no error. Please help, why is it so weird?
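A workaround sketch, assuming the failure is Photos' writer rejecting the WebP bytes backing the downloaded UIImage: re-encode to PNG first, so the image handed to the saver is in a format the photo library certainly supports.

```swift
import UIKit

// Sketch: strip the original (WebP) backing data by round-tripping through
// PNG before calling the ImageSaver from the post above.
func saveReencoded(_ image: UIImage, with saver: ImageSaver) {
    guard let pngData = image.pngData(),
          let reencoded = UIImage(data: pngData) else { return }
    saver.writeToPhotoAlbum(image: reencoded)
}
```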
Post not yet marked as solved
0 Replies
804 Views
The updated Photos access dialog in iOS 17 states: Photos may contain metadata, such as location, depth information, or captions. How do I access the caption a user added to a photo in my app? This wasn’t possible in iOS 16, is there new API in 17? I previously requested this ability via metadata in FB10205012 and via PHAsset in FB8244665. If it remains inaccessible I’ve submitted FB12437093 to request captions be removed from this wording.
Post marked as solved
2 Replies
793 Views
I'm trying the sample app from here: https://developer.apple.com/documentation/vision/detecting_moving_objects_in_a_video

I made a tweak to read the video from the library instead of the document picker:

```swift
var recordedVideoURL: AVAsset?

@IBAction func uploadVideoForAnalysis(_ sender: Any) {
    var configuration = PHPickerConfiguration()
    configuration.filter = .videos
    configuration.selectionLimit = 1
    let picker = PHPickerViewController(configuration: configuration)
    picker.delegate = self
    present(picker, animated: true, completion: nil)
}
```

The delegate method:

```swift
extension HomeViewController: PHPickerViewControllerDelegate {
    func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult]) {
        guard let selectedResult = results.first else {
            print("assetIdentifier: null")
            dismiss(animated: true, completion: nil)
            return
        }
        selectedResult.itemProvider.loadFileRepresentation(forTypeIdentifier: UTType.movie.identifier) { [weak self] url, error in
            guard error == nil, let url = url else {
                print(error?.localizedDescription ?? "Failed to load image")
                return
            }
            let asset = AVAsset(url: url)
            self?.recordedVideoURL = asset
            DispatchQueue.main.async { [weak self] in
                self?.dismiss(animated: true) { // dismiss the picker
                    self?.performSegue(withIdentifier: ContentAnalysisViewController.segueDestinationId, sender: self!)
                    self?.recordedVideoURL = nil
                }
            }
        }
    }
}
```

Everything else is pretty much the same. Then, in the camera controller, it raised an error: "The requested URL was not found on this server." I put a debug breakpoint there, and it showed the error came from the line that initializes the asset reader, let reader = try AVAssetReader(asset: asset):

```swift
func startReadingAsset(_ asset: AVAsset, reader: AVAssetReader? = nil) {
    videoRenderView = VideoRenderView(frame: view.bounds)
    setupVideoOutputView(videoRenderView)
    videoFileReadingQueue.async { [weak self] in
        do {
            guard let track = asset.tracks(withMediaType: .video).first else {
                throw AppError.videoReadingError(reason: "No video tracks found in the asset.")
            }
            let reader = try AVAssetReader(asset: asset)
            let settings = [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_420YpCbCr8BiPlanarFullRange]
            let output = AVAssetReaderTrackOutput(track: track, outputSettings: settings)
            if reader.canAdd(output) {
                reader.add(output)
            } else {
                throw AppError.videoReadingError(reason: "Couldn't add a track to the asset reader.")
            }
            if !reader.startReading() {
                throw AppError.videoReadingError(reason: "Couldn't start the asset reader.")
            }
            ...
```

I tried to create a reader directly in the asset-creation block, and it worked:

```swift
selectedResult.itemProvider.loadFileRepresentation(forTypeIdentifier: UTType.movie.identifier) { [weak self] url, error in
    guard error == nil, let url = url else {
        print(error?.localizedDescription ?? "Failed to load image")
        return
    }
    let asset = AVAsset(url: url)
    do {
        let reader = try AVAssetReader(asset: asset)
        self?.assetReader = reader
        print("reader: \(reader)")
    } catch let e {
        print("No reader: \(e)")
    }
    ...
```

But if I just move it a little bit, into the DispatchQueue.main.async block, it prints out "No reader: The requested URL was not found on this server." Therefore, I have to keep an instance of the reader and pass it to the camera view controller. Can someone please explain why this is happening? What's the logic behind it?
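A likely explanation, stated as an assumption: the URL handed to loadFileRepresentation points at a temporary file that the system deletes as soon as the completion handler returns, so an AVAssetReader created later (in the DispatchQueue.main.async block or in the next view controller) finds nothing at that path. Copying the file out synchronously inside the handler sidesteps the problem:

```swift
import Foundation

// Copy a picker-provided temporary file to a location the app controls.
// The original is deleted when the loadFileRepresentation handler returns;
// the copy survives until the app removes it.
func persistPickedMovie(at url: URL) throws -> URL {
    let destination = FileManager.default.temporaryDirectory
        .appendingPathComponent(UUID().uuidString)
        .appendingPathExtension(url.pathExtension)
    try FileManager.default.copyItem(at: url, to: destination)
    return destination
}
```

Build the AVAsset from the returned URL inside the handler and it stays readable from any queue or view controller afterwards.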
Post not yet marked as solved
1 Reply
756 Views
In the WWDC talk about the new embedded photo picker, the presenter says, "If you want to implement your own replacement for some picker features, use the .photosPickerDisabledCapabilities modifier" (https://developer.apple.com/videos/play/wwdc2023/10107/?time=270). However, he did not demonstrate how to do so. For example, I want to use the compact version of the photo picker, with the header hidden, and place a custom "Add" button to use instead. Additionally, using any version of the picker, how could I add a button to the toolbar where "Add" and "Cancel" are? Is it possible to apply styling to the existing toolbar buttons, for instance changing the text color? Thanks!
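A sketch of what that might look like, with the caveat that the exact capability and accessory-visibility values below are assumptions to verify against the iOS 17 PhotosUI headers: disabling .selectionActions removes the built-in Add/Cancel handling so a custom button can confirm the selection, and hiding the accessories removes the header.

```swift
import SwiftUI
import PhotosUI

// Sketch (iOS 17+): a compact embedded picker with its built-in selection
// actions and header disabled, driven by a custom confirm button instead.
struct CompactPicker: View {
    @State private var selection: [PhotosPickerItem] = []

    var body: some View {
        VStack {
            PhotosPicker("Select pictures", selection: $selection)
                .photosPickerStyle(.compact)
                .photosPickerDisabledCapabilities(.selectionActions)
                .photosPickerAccessoryVisibility(.hidden, edges: .all)
                .frame(height: 200)
            Button("Add \(selection.count) photos") {
                // hand the selection to the rest of the app
            }
        }
    }
}
```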
Post not yet marked as solved
0 Replies
583 Views
Hello Team, I'm trying to delete a photo from Photos. For that I used this method:

```objc
[[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
    [PHAssetChangeRequest deleteAssets:@[assetToDelete]];
} completionHandler:^(BOOL success, NSError *error) {
}];
```

This method pops up a dialog with Don't Allow or Delete. But sometimes, on some iPhones, the deleteAssets request does not respond, so the completion handler is never called, and after that I can't perform any operation on PHPhotoLibrary. If I restart the iPhone, it works again. Many users of my app have complained about this issue. I have an iPhone 11 with iOS 15.3, but some iOS 12, 14, and 16 users also face the same issue. So what exactly is the issue? Is it related to iOS or to the method? Thanks, Ankur
Post not yet marked as solved
0 Replies
728 Views
We started implementing the new deferred photo processing and noticed a critical piece missing: it is not possible to set any custom metadata on the photo proxy before it has finished processing. The WWDC video mentions that we as developers need to update the photo later. In practice this proves quite complicated to implement, for the following reasons:

- It is not known when a photo will be processed. Even if the app is in the foreground, it is still up to the operating system to decide when to process a photo.
- There is no callback for when a photo is processed; there are only photo library changes, which can be tracked only while the app is in the foreground. A photo could be processed by the operating system while the app is in the background or not running at all.
- AVAsset has no public property available to indicate whether a photo is still waiting to be processed or has been processed.

Combined, this requires complicated logic on the app side. The metadata needs to be cached on disk so that it can be added at a later stage, and even then it is hard or impossible to match the cached metadata with a previously deferred photo. We filed FB12418208; we would love to see this fixed before iOS 17 comes out. I imagine other camera app developers will have the same issue.
Post not yet marked as solved
1 Reply
747 Views
I am trying to create an album for an app with PhotoKit and store images in it. Is there any way to do this under the NSPhotoLibraryAddUsageDescription permission? At first glance, NSPhotoLibraryAddUsageDescription seems the best choice for this app, since I will not be loading any images. However, there are two album operations that can only be done under NSPhotoLibraryUsageDescription:

Creating an album. Even though creating an album does not involve loading media, it requires NSPhotoLibraryUsageDescription, which allows the user's media to be read. This is a bit unconvincing.

Saving images in the created album. Before saving, you must check whether the app has already created the album; you need to fetch it by name as a key. This is where NSPhotoLibraryUsageDescription is needed. I understand that NSPhotoLibraryUsageDescription is needed for fetching, but if iOS forbade creating albums with an already-existing name and ignored such attempts, this fetching would not be necessary.

In summary, I just want to create an album for my app and store media in it, but to do so I need to ask the user for permission to read their photos, which goes against the idea that the permissions I request should be minimal and only what I need. If there is a way to do this under the NSPhotoLibraryAddUsageDescription permission I would like to know. I am new to Swift, so sorry if I am wrong.
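For reference, a sketch of the fetch-or-create flow the post describes; as noted, the fetch step requires full read/write authorization (NSPhotoLibraryUsageDescription), and I'm not aware of a way to do it under add-only access:

```swift
import Photos

// Sketch: return the app's album if it exists, otherwise create it.
// PHAssetCollection.fetchAssetCollections needs read/write authorization.
func fetchOrCreateAlbum(named title: String,
                        completion: @escaping (PHAssetCollection?) -> Void) {
    let options = PHFetchOptions()
    options.predicate = NSPredicate(format: "title = %@", title)
    if let existing = PHAssetCollection.fetchAssetCollections(
            with: .album, subtype: .any, options: options).firstObject {
        completion(existing)
        return
    }
    var placeholder: PHObjectPlaceholder?
    PHPhotoLibrary.shared().performChanges({
        let request = PHAssetCollectionChangeRequest
            .creationRequestForAssetCollection(withTitle: title)
        placeholder = request.placeholderForCreatedAssetCollection
    }) { success, _ in
        guard success, let id = placeholder?.localIdentifier else {
            completion(nil)
            return
        }
        completion(PHAssetCollection.fetchAssetCollections(
            withLocalIdentifiers: [id], options: nil).firstObject)
    }
}
```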
Post not yet marked as solved
2 Replies
958 Views
I have a Safari WebKit view inside my mobile app which renders a webpage. The webpage has a file upload option. When I tap it, 3 options are shown, as in the screenshot. I am trying to make the web view allow only camera capture and hide the option to upload already-existing files. Is there any Safari permission I can remove from the configuration to hide the upload-from-files options?
Post not yet marked as solved
0 Replies
501 Views
I'm using PHPickerViewController in my app and I want to keep it open even after the user selects an image. My flow is the following: after the user selects a photo, another view controller is presented modally over the picker, displaying the selected photo. The user can dismiss this view controller and come back to the picker. The issue is that if the keyboard is open in the PHPickerViewController when I select an image, the keyboard is not dismissed no matter what I do. I tried the following:

```swift
func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult]) {
    guard let provider = results.first?.itemProvider else { return }
    if provider.canLoadObject(ofClass: UIImage.self) {
        provider.loadObject(ofClass: UIImage.self) { image, _ in
            guard let image = image as? UIImage else {
                print("Error converting image")
                return
            }
            DispatchQueue.main.async { [weak self] in
                self?.dismissKeyboard() // -> HERE I TRY TO DISMISS THE KEYBOARD
                self?.presentImagePreviewViewController(image: image)
            }
        }
    }
}

func dismissKeyboard() {
    // NONE OF THESE DISMISS THE KEYBOARD
    phPickerViewController?.view.endEditing(true)
    phPickerViewController?.view.resignFirstResponder()
    view.endEditing(true)
    view.resignFirstResponder()
    UIApplication.shared.sendAction(#selector(UIApplication.resignFirstResponder), to: nil, from: nil, for: nil)
    UIApplication.shared.sendAction(#selector(UIResponder.resignFirstResponder), to: nil, from: nil, for: nil)
}
```

I am assuming this is because the picker runs in a separate process and doesn't get the dismiss-keyboard notification, as per the documentation: "Displaying the photo library doesn't need user permission because it's running in a separate process." Can I dismiss the keyboard in any other way?