Discuss using the camera on Apple devices.

Posts under Camera tag

171 Posts
Post not yet marked as solved
3 Replies
10k Views
I have a camera app and want to test it in the Simulator, but I can't get it to work. I've read that this is simply not possible. I find it hard to believe that Apple doesn't grant the iOS Simulator access to the host Mac's camera. How can that be, given what we pay for this hardware? Is the Simulator able to use an externally connected USB camera?
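For reference, a minimal sketch of the usual workaround, assuming the app substitutes a stub video source when no camera hardware is present; targetEnvironment(simulator) is a standard Swift compilation condition, and the fallback behavior here is only illustrative:

import AVFoundation

// Sketch: detect at build time that we are running in the Simulator,
// where AVCaptureDevice offers no camera hardware, and fall back.
func makeCameraInput() -> AVCaptureDeviceInput? {
#if targetEnvironment(simulator)
    // No camera in the Simulator; callers should substitute a stub source
    // (e.g. a looping test video or images from the photo library).
    return nil
#else
    guard let device = AVCaptureDevice.default(.builtInWideAngleCamera,
                                               for: .video,
                                               position: .back),
          let input = try? AVCaptureDeviceInput(device: device) else {
        return nil
    }
    return input
#endif
}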
Posted
by
Post not yet marked as solved
1 Reply
572 Views
We are making an iOS application for disaster prevention in Japan, where disasters have recently been frequent in every season. We supply it to local governments in Japan, it is used by the general public, and it is working well. One local government has asked us to also support deaf people and help them when a disaster occurs. The local government already has a disaster-prevention broadcast system, but it uses loudspeakers, so during heavy rain the general public cannot hear it. Our application forwards the disaster-prevention broadcasts to iOS smartphones, which is very helpful to the public. We are now making a new version that delivers not only the broadcast text but also flashes the iPhone's flashlight. However, after building it we found that the flashlight only lights while our application is open, as in the link below, so a deaf person may well not notice the alert. Our application already has a vibration function, but if a deaf person keeps the phone in a bag, I think they would never notice a heavy-rain or tsunami warning.
Here is our application: https://apps.apple.com/jp/app/cosmocast/id1247774270?mt=8
Here is the rule for the flashlight: https://stackoverflow.com/questions/32136442/iphone-flashlight-not-working-while-app-is-in-background/32137434
We want to help deaf people and also senior citizens during heavy rain and tsunami with our application, by turning on the flashlight at the same time the push notification arrives. Does anyone know a good way to do this? Thank you and best regards. Tomita.
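For reference, this is roughly how the torch is toggled with AVFoundation; as the Stack Overflow link above notes, this only works while the app is in the foreground, so it is a sketch of the foreground case only, not a solution for firing the flash from a push notification:

import AVFoundation

// Sketch: turn the torch on or off (foreground only). This cannot run from
// a background push notification; the torch is released when the app
// leaves the foreground.
func setTorch(on: Bool) {
    guard let device = AVCaptureDevice.default(for: .video),
          device.hasTorch else { return }
    do {
        try device.lockForConfiguration()
        device.torchMode = on ? .on : .off
        device.unlockForConfiguration()
    } catch {
        print("Torch configuration failed: \(error)")
    }
}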
Posted
by
Post not yet marked as solved
1 Reply
867 Views
Hi, I was trying to set people up with an iPad as their primary system. They have the Apple keyboard, a mouse, and a 36-inch external monitor. However, an external camera connected via USB or Bluetooth cannot be used for FaceTime, Webex, Zoom, Teams, etc.; these apps always default to the built-in Apple camera. Is there a way to get apps to recognize an external camera as the primary camera on an iPad? Thanks
Posted
by
Post not yet marked as solved
3 Replies
3.4k Views
Hello, I set up Universal Links between the website and the application. Everything was done according to the official documentation and works fine: when the site is opened, Safari offers to launch the application, and there is also a banner above the page. In SwiftUI I listen for the URL via .onOpenURL(perform:), and everything works. However, if the link is encoded in a QR code and scanned with the camera, a banner appears prompting you to open the application, and the app does open, but the URL is not passed to the .onOpenURL(perform:) handler. What could be the problem?
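For context, a minimal sketch of the listener described above; ContentView and handleIncomingURL are placeholder names, and the point is only where .onOpenURL(perform:) sits in a SwiftUI app:

import SwiftUI

@main
struct SampleApp: App {   // hypothetical app entry point
    var body: some Scene {
        WindowGroup {
            ContentView()
                .onOpenURL { url in
                    // Called for universal links; reportedly not called
                    // when the link is opened from a scanned QR code.
                    handleIncomingURL(url)
                }
        }
    }
}

struct ContentView: View {
    var body: some View { Text("Hello") }
}

func handleIncomingURL(_ url: URL) {
    print("Received URL: \(url)")
}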
Posted
by
Post not yet marked as solved
3 Replies
1.5k Views
I ran into a strange problem in a camera app using AVFoundation. I use the following code:
captureDevice = AVCaptureDevice.default(AVCaptureDevice.DeviceType.builtInUltraWideCamera, for: AVMediaType.video, position: .back)
and then:
let isAutoFocusSupported = captureDevice.isFocusModeSupported(.autoFocus)
isAutoFocusSupported should be true. On the iPhone 13 Pro it is true, but on the iPhone 13 and 13 mini it is false. Why?
Post not yet marked as solved
10 Replies
5.7k Views
Hi, I just noticed, using the Apple demo app for TrueDepth images on different devices, that there are significant differences in the quality of the data provided. Data captured on iPhones before the iPhone 13 lineup gives quite smooth surfaces, as you may know from the many 3D scanner apps that display data from the front-facing TrueDepth camera. Data captured on, e.g., an iPhone 13 Pro now shows wavy overlaid structures and, visually, much lower accuracy compared to older iPhones.
iPhone 12 Pro: data as point cloud, object about 25 cm from the phone:
iPhone 13 Pro: data as point cloud, same setup
I tried it on different iPhone 13 devices with the same result, all running the latest iOS, and with images captured using the same code. Captures made with some of the standard 3D scanner apps for the TrueDepth camera produce similarly lower-quality images and point clouds.
Is this due to different hardware (a smaller TrueDepth camera module) in the new iPhone release, or a software issue within iOS, e.g. in the driver for the TrueDepth camera?
Are there any improvements or fixes already announced by Apple?
Best, Holger
Posted
by
Post not yet marked as solved
1 Reply
933 Views
CVPixelBuffer.h defines:
kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange = '420v', /* Bi-Planar Component Y'CbCr 8-bit 4:2:0, video-range (luma=[16,235] chroma=[16,240]). baseAddr points to a big-endian CVPlanarPixelBufferInfo_YCbCrBiPlanar struct */
kCVPixelFormatType_420YpCbCr10BiPlanarVideoRange = 'x420', /* 2 plane YCbCr10 4:2:0, each 10 bits in the MSBs of 16bits, video-range (luma=[64,940] chroma=[64,960]) */
But when I set either of these formats as the camera output, the output pixel buffer's values exceed the documented range: I see [0, 255] for 420YpCbCr8BiPlanarVideoRange and [0, 1023] for 420YpCbCr10BiPlanarVideoRange. Is this a bug, or is something wrong with my output? If it is not a bug, how do I choose the correct matrix to convert the YUV data to RGB?
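For reference, a minimal sketch of how the format in question is typically requested on a video data output; the delegate and queue label are placeholders:

import AVFoundation

// Sketch: request the 8-bit bi-planar video-range format from the camera.
// If the delivered luma really spans [0, 255], the buffer is effectively
// full-range and a full-range YCbCr-to-RGB matrix would be needed.
func configureVideoOutput(for session: AVCaptureSession,
                          delegate: AVCaptureVideoDataOutputSampleBufferDelegate) {
    let output = AVCaptureVideoDataOutput()
    output.videoSettings = [
        kCVPixelBufferPixelFormatTypeKey as String:
            kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange
    ]
    output.setSampleBufferDelegate(delegate,
                                   queue: DispatchQueue(label: "camera.video.queue"))
    if session.canAddOutput(output) {
        session.addOutput(output)
    }
}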
Posted
by
Post not yet marked as solved
0 Replies
1.7k Views
I am building an iOS app that uses GPS EXIF data from both the camera and the image library. My problem is that while I am able to get GPS data from images in the phone's library, I have not been able to get any GPS data when using the camera within my app.
I first built this app about a year ago, and at that time I was able to get GPS data from both the library AND the camera from within the app. I believe I was still building for iOS 12 at that point. I suspect the new security features that came with iOS 13 or 14 now disallow my app's access to the GPS data when using the camera. This issue is new as of iOS 13 or 14; the code I had was working fine with earlier versions of iOS.
I am having no issues getting GPS data from the EXIF of images in the device library. Images taken with the NATIVE iOS CAMERA APP are saved to the library with full GPS data. However, I am not able to get GPS data directly from the camera image's EXIF when using the camera from within my app. When saving an image taken by the camera from within my app, the image is saved to the library with NO GPS data. I am able, at any time, to ask the device for current GPS coordinates. As far as I can tell, device settings are all correct, and location services are available at all times. My feeling is that iOS is stripping the GPS data from the image's EXIF before handing the image data to my app.
I have searched the Apple developer forums, Apple documentation, Stack Exchange, and so on for several weeks now, and I seem no closer to knowing whether the camera API even returns this data, or whether I will need to talk to location services and add the GPS data myself (which is what I am working on now, as I have about given up on getting it from the camera).
Info.plist keys I am currently setting:
LSRequiresIPhoneOS
NSCameraUsageDescription
NSLocationAlwaysUsageDescription
NSLocationWhenInUseUsageDescription
NSMicrophoneUsageDescription
NSPhotoLibraryUsageDescription
NSPhotoLibraryAddUsageDescription
Am I missing a required plist key? I have searched for the name of a key I might be missing but have found nothing other than people trying to hack together post-capture location merging. This has been very frustrating. Any insight is appreciated.
Is it currently possible to get GPS data directly from the camera's EXIF output any more? Do I need to ask the device for the current GPS values and insert the GPS data into the image EXIF on my own? Is there any example code of getting GPS data from the camera? Is there any example code of inserting GPS data into the EXIF before saving the file to the device?
Sample Swift code which processes the camera image:
func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey : Any]) {
    let pickedImage = info[UIImagePickerController.InfoKey.originalImage] as? UIImage
    // let pickedImage = info[UIImagePickerController.InfoKey.editedImage] as? UIImage
    // Using editedImage vs. originalImage has no effect on the availability of the GPS data.
    userImage.image = pickedImage
    picker.dismiss(animated: true, completion: nil)
}
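On the last question, a minimal sketch of one way to write GPS tags into image data yourself with ImageIO, assuming you already have a CLLocation from Core Location; the function name is hypothetical, and the keys follow the kCGImagePropertyGPS* constants:

import ImageIO
import CoreLocation

// Sketch: merge a GPS dictionary into JPEG data before saving it.
func addGPS(to jpegData: Data, location: CLLocation) -> Data? {
    guard let source = CGImageSourceCreateWithData(jpegData as CFData, nil),
          let type = CGImageSourceGetType(source) else { return nil }

    let gps: [CFString: Any] = [
        kCGImagePropertyGPSLatitude: abs(location.coordinate.latitude),
        kCGImagePropertyGPSLatitudeRef: location.coordinate.latitude >= 0 ? "N" : "S",
        kCGImagePropertyGPSLongitude: abs(location.coordinate.longitude),
        kCGImagePropertyGPSLongitudeRef: location.coordinate.longitude >= 0 ? "E" : "W",
        kCGImagePropertyGPSAltitude: location.altitude
    ]
    let properties = [kCGImagePropertyGPSDictionary: gps] as CFDictionary

    let output = NSMutableData()
    guard let destination = CGImageDestinationCreateWithData(output as CFMutableData,
                                                             type, 1, nil) else { return nil }
    CGImageDestinationAddImageFromSource(destination, source, 0, properties)
    guard CGImageDestinationFinalize(destination) else { return nil }
    return output as Data
}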
Posted
by
Post not yet marked as solved
1 Reply
1.3k Views
I have been unable to capture Live Photos using UIImagePickerController. I can capture still photos and even video (which is not my scenario, but I checked just to make sure), but the camera does not capture Live Photos. The documentation suggests it should (source): To obtain the motion and sound content of a live photo for display (using the PHLivePhotoView class), include the kUTTypeImage and kUTTypeLivePhoto identifiers in the allowed media types when configuring an image picker controller. When the user picks or captures a Live Photo, the editingInfo dictionary contains the livePhoto key, with a PHLivePhoto representation of the photo as the corresponding value.
I've set up my controller:
let camera = UIImagePickerController()
camera.sourceType = .camera
camera.mediaTypes = [UTType.image.identifier, UTType.livePhoto.identifier]
camera.delegate = context.coordinator
In the delegate I check for the Live Photo:
func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey : Any]) {
    if let live = info[.livePhoto] as? PHLivePhoto {
        // handle live photo
    } else if let takenImage = info[.originalImage] as? UIImage, let metadata = info[.mediaMetadata] as? [AnyHashable:Any] {
        // handle still photo
    }
}
But I never get the Live Photo. I've tried adding NSMicrophoneUsageDescription to the Info.plist, thinking it needs microphone permission, but that did not help. Of course, I've added NSCameraUsageDescription to grant camera permission. Has anyone successfully captured Live Photos using UIImagePickerController?
Posted
by
Post not yet marked as solved
2 Replies
2.9k Views
Hello folks! How can I get a real-world distance between the device (iPad Pro, 5th gen) and an object measured with the LiDAR scanner? Say I have a reticle in the middle of my camera view and want to measure precisely from my position to the point I'm aiming at, almost like Apple's Measure app. sceneDepth doesn't give me anything useful. I also looked at the sample code "Capturing Depth Using the LiDAR Camera". Any ideas how to do this? A push in the right direction would also be very helpful. Thanks in advance!
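A minimal sketch of one approach, under the assumption that you already receive an AVDepthData (e.g. from the LiDAR sample's depth output): convert it to 32-bit float and read the value at the centre pixel, which is the distance in metres to whatever the reticle covers:

import AVFoundation

// Sketch: depth (in metres) at the centre of a depth map.
func centerDepth(from depthData: AVDepthData) -> Float? {
    // Ensure 32-bit float depth; LiDAR depth may arrive as half float.
    let converted = depthData.converting(toDepthDataType: kCVPixelFormatType_DepthFloat32)
    let map = converted.depthDataMap

    CVPixelBufferLockBaseAddress(map, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(map, .readOnly) }

    guard let base = CVPixelBufferGetBaseAddress(map) else { return nil }
    let width = CVPixelBufferGetWidth(map)
    let height = CVPixelBufferGetHeight(map)
    let bytesPerRow = CVPixelBufferGetBytesPerRow(map)

    // Step to the middle row, then read the middle column as Float32.
    let rowStart = base.advanced(by: (height / 2) * bytesPerRow)
    let row = rowStart.assumingMemoryBound(to: Float32.self)
    return row[width / 2]
}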
Posted
by
Post not yet marked as solved
18 Replies
11k Views
I found that almost all third-party applications cannot focus when they ask me to take a photo of an ID card. The same issue is described on Twitter: https://twitter.com/mlinzner/status/1570759943812169729?s=20&t=n_Z5lTJac3L5QVRK7fbJZg Who should fix it, Apple or the third-party developers?
Posted
by
Post not yet marked as solved
2 Replies
900 Views
The sample code Capturing Depth Using the LiDAR Camera works fine on my phone with iOS 15.6. But after upgrading the phone to iOS 16 and Xcode to 14.0, the sample code now runs for only a few minutes on the phone before showing the following error:
LiDARDepth[1454:256343] Execution of the command buffer was aborted due to an error during execution. Ignored (for causing prior/excessive GPU errors) (00000004:kIOGPUCommandBufferCallbackErrorSubmissionsIgnored)
Did anyone get the same error after upgrading to Xcode 14 or iOS 16? Can an expert take a look at the sample code and point out a possible solution? Thanks!
Posted
by
Post not yet marked as solved
2 Replies
1k Views
My goal is to implement a moving background in a virtual camera, implemented as a Camera Extension, on macOS 13 and later. The moving background is available to the extension as an H.264 file in its bundle. I thought I could create an AVAsset from the movie's URL, make an AVPlayerItem from the asset, attach an AVQueuePlayer to the item, then attach an AVPlayerLooper to the queue player. I make an AVPlayerVideoOutput, add it to each of the looper's items, and set a delegate on the video output. This works in a normal app, which I use as a convenient environment to debug my extension code. In my camera video rendering loop, I check self.videoOutput.hasNewPixelBuffer; it returns true at regular intervals, and I can fetch video frames with the video output's copyPixelBuffer and composite those frames with the camera frames. However, it doesn't work in an extension: hasNewPixelBuffer is never true. The looping player reports a 'failed' status, with an error which simply says "the operation could not be completed". I've tried simplifying things by removing the AVPlayerLooper and using an AVPlayer instead of an AVQueuePlayer, so the movie would only play once through. But still, I never get any frames in the extension. Could this be a sandbox issue, since an AVPlayer usually renders to a user interface and camera extensions don't have UIs? My fallback solution is to use an AVAssetImageGenerator, which I attempt to drive by firing off a Task for each frame: each time I want to render one, I ask for another frame to keep the pipeline full. Unfortunately the Tasks don't finish in the same order they are started, so I have to build frame-reordering logic into the frame buffer (something a player would handle for me). I'm also not sure whether AVAssetImageGenerator takes advantage of any hardware acceleration, and it seems inefficient because each Task is for one frame only and cannot maintain any state from previous frames. Perhaps there's a much simpler way to do this and I'm just missing it? Anyone?
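For reference, a condensed sketch of the simplified single-pass variant described above (plain AVPlayer, no looper), using AVPlayerItemVideoOutput; this is the pattern that works in a regular app but reportedly yields no frames inside the extension:

import AVFoundation
import QuartzCore

// Sketch: pull decoded movie frames from an AVPlayer for compositing.
final class MovieFrameSource {
    private let player: AVPlayer
    private let output: AVPlayerItemVideoOutput

    init(movieURL: URL) {
        let item = AVPlayerItem(url: movieURL)
        let videoOutput = AVPlayerItemVideoOutput(pixelBufferAttributes: [
            kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
        ])
        item.add(videoOutput)
        self.output = videoOutput
        self.player = AVPlayer(playerItem: item)
        player.play()
    }

    // Call once per camera frame from the rendering loop.
    func nextFrame() -> CVPixelBuffer? {
        let itemTime = output.itemTime(forHostTime: CACurrentMediaTime())
        guard output.hasNewPixelBuffer(forItemTime: itemTime) else { return nil }
        return output.copyPixelBuffer(forItemTime: itemTime, itemTimeForDisplay: nil)
    }
}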
Posted
by
Post marked as solved
3 Replies
1.9k Views
When capturing RAW (not ProRAW) photos using AVCapturePhotoOutput, the resulting images are subject to a strange overexposed effect when viewed in Apple software. I have been able to recreate this in multiple iOS apps which allow RAW capture. Some users report previously normal images transforming over the span of a few minutes. I have actually watched this happen in real time: if you observe the camera roll after taking a few RAW photos, the highlights in some will randomly blow out of proportion after whatever is causing this issue kicks in. The issue can also be triggered by zooming in to one of these images from the stock Photos app. Once the overexposure happens on a given image, there doesn't appear to be a way to get it to display normally again. However, if you AirDrop an image to a different device and then back, you can see it display normally at first and then break once more. Interestingly, the photo displays completely fine when viewed in Affinity Photo or in random photo viewers on Ubuntu. Sometimes the issue is not that bad, but it is often egregious, resulting in completely white areas of a previously balanced photo (see https://discussions.apple.com/thread/254424489). This definitely seems like a bug, but is there any way to prevent it? Could there be an issue with color profiles? This is not the same issue in which users think RAW photos are broken because they are viewing the associated JPG; this happens even with photos that have no embedded JPG or HEIC preview.
Very similar (supposedly fixed) issue on macOS: https://www.macworld.com/article/1444389/overexposed-raw-image-export-macos-monterey-photos-fixed.html
Numerous similar complaints:
https://discussions.apple.com/thread/254424489
https://discussions.apple.com/thread/253179308
https://discussions.apple.com/thread/253773180
https://discussions.apple.com/thread/253954972
https://discussions.apple.com/thread/252354516
Posted
by
Post not yet marked as solved
0 Replies
555 Views
Since iOS 16, in third-party apps the camera does not focus on objects at a distance of 10 to 12 cm. This makes every app that needs close-range focus, such as barcode and QR scanners, effectively unusable. You can verify this by opening the camera from Instagram or WhatsApp, for example: it does not focus at close range. I have not yet found a solution as a developer. Can anyone help me with this problem on iOS 16? Will there be an Apple update to fix it? I have tested it with an iPhone 14 and an iPhone S6, both on iOS 16.
Posted
by
Post not yet marked as solved
1 Reply
659 Views
Regarding the extrinsicMatrix attribute of the AVCameraCalibrationData class, the description provided is as follows: "A matrix relating a camera's position and orientation to a world or scene coordinate system." I'm trying to build an app for 3D reconstruction/scanning that only uses the AVFoundation framework (not ARKit). I'm able to extract RGB frames, depth maps, and the camera's intrinsic matrix. However, the extrinsicMatrix is always an identity matrix. The documentation mentions this: "The camera's pose is expressed with respect to a reference camera (camera-to-world view). If the rotation matrix is an identity matrix, then this camera is the reference camera." My questions are:
Does the extrinsicMatrix parameter refer to a global coordinate system at all? If so, which coordinate system is it?
Are there settings I can configure that would cause the extrinsicMatrix to change as the camera moves?
If the extrinsicMatrix can't be used in this manner, can you recommend another way to estimate camera motion between frames for accurate 3D reconstruction?
Thanks in advance, and I'd be happy to provide more info if needed. I'm using an iPhone 14 Pro and the .builtInDualWideCamera as the AVCaptureDevice.
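For context, a minimal sketch of where the calibration data is usually read in an AVFoundation-only pipeline: it arrives attached to the delivered AVDepthData, assuming depth/calibration delivery is already enabled on the session configuration (that setup is not shown here):

import AVFoundation
import simd

// Sketch: read intrinsics and extrinsics from depth data delivered by
// an AVCaptureDepthDataOutput.
func logCalibration(from depthData: AVDepthData) {
    guard let calibration = depthData.cameraCalibrationData else {
        print("No calibration data attached")
        return
    }
    let intrinsics: matrix_float3x3 = calibration.intrinsicMatrix
    // 3x4 matrix relating this camera to the reference camera; an identity
    // rotation means this camera is itself the reference camera, so it does
    // not track device motion over time by itself.
    let extrinsics: matrix_float4x3 = calibration.extrinsicMatrix
    print(intrinsics, extrinsics)
}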
Posted
by
Post not yet marked as solved
1 Reply
329 Views
Hi, I'd like to ask about Enterprise MDM. We currently use the MDM camera lock, but in certain situations the camera is made available. However, after leaving the application the camera is supposed to be locked again, yet it stays open unless you lock the phone and then unlock it with the passcode after the camera restriction is re-applied. Only after unlocking the phone with the passcode does the camera lock again. Has anyone resolved this behavior?
Posted
by
Post not yet marked as solved
0 Replies
398 Views
Is the 240 fps video taken by the iPhone 12 really captured at 240 frames per second, or is it produced by frame interpolation? Is there any technical documentation that confirms this?
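One way to see what the hardware itself advertises, as a sketch: enumerate the capture device's formats and inspect the supported frame-rate ranges; a format captured natively at 240 fps will report a maxFrameRate of 240 (the device type chosen here is only an example):

import AVFoundation

// Sketch: list camera formats whose native frame-rate range reaches 240 fps.
func logHighFrameRateFormats() {
    guard let device = AVCaptureDevice.default(.builtInWideAngleCamera,
                                               for: .video,
                                               position: .back) else { return }
    for format in device.formats {
        for range in format.videoSupportedFrameRateRanges where range.maxFrameRate >= 240 {
            let dims = CMVideoFormatDescriptionGetDimensions(format.formatDescription)
            print("\(dims.width)x\(dims.height) supports up to \(range.maxFrameRate) fps")
        }
    }
}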
Posted
by