extrinsicMatrix inside AVCameraCalibrationData

Regarding the extrinsicMatrix attribute of the AVCameraCalibrationData class, the description provided is as follows:

A matrix relating a camera’s position and orientation to a world or scene coordinate system.

I'm trying to build an app for 3D reconstruction/scanning that only uses the AVFoundation framework (not ARKit). I'm able to extract RGB frames, depth maps, and the camera's intrinsic matrix. However, the extrinsicMatrix is always an identity matrix.

The documentation mentions this:

The camera’s pose is expressed with respect to a reference camera (camera-to-world view). If the rotation matrix is an identity matrix, then this camera is the reference camera.

My questions are:

  1. Does the extrinsicMatrix parameter refer to a global coordinate system at all? If so, which coordinate system is it?
  2. Are there settings I can configure that would cause the extrinsicMatrix to change as the camera moves?
  3. If the extrinsicMatrix can't be used this way, can you recommend another way to estimate camera motion between frames for accurate 3D reconstruction?

Thanks in advance, and I'd be happy to provide more info if needed. I'm using an iPhone 14 Pro and the .builtInDualWideCamera as the AVCaptureDevice.
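For context, here is a minimal sketch of how the calibration data in question can be read from a depth-enabled photo capture. This is an illustrative delegate only; it assumes a session already configured with the .builtInDualWideCamera and an AVCapturePhotoOutput whose depth data and camera calibration data delivery have been enabled, and the variable names are my own:

```swift
import AVFoundation
import simd

// Illustrative sketch: reading intrinsics and extrinsics from a capture.
// Assumes the AVCapturePhotoSettings used for the capture had
// isDepthDataDeliveryEnabled and isCameraCalibrationDataDeliveryEnabled set.
final class CaptureDelegate: NSObject, AVCapturePhotoCaptureDelegate {
    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        guard let calibration = photo.depthData?.cameraCalibrationData else { return }

        // 3x3 pinhole intrinsics (focal lengths and principal point, in pixels).
        let intrinsics: matrix_float3x3 = calibration.intrinsicMatrix

        // 4x3 [R|t] pose relative to a reference camera on the device.
        // An identity rotation means this camera IS the reference camera.
        let extrinsics: matrix_float4x3 = calibration.extrinsicMatrix

        print(intrinsics, extrinsics)
    }
}
```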


Replies

Hi I'm having the same problem. Have you managed to resolve it?

  • @vieraleonel1 Hi, the extrinsicMatrix doesn't refer to a world coordinate system. It refers to the alignment of the cameras on the iPhone. One of the cameras is the reference camera with an identity extrinsicMatrix, and the others have extrinsic matrices describing their alignment relative to the reference camera.
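To illustrate the point above: the same fixed rig geometry can be queried directly between two physical cameras via AVCaptureDevice.extrinsicMatrix(from:to:) (iOS 13+). A hedged sketch, with the helper name my own; note the result is a constant device rig transform and does not change as the phone moves through the scene:

```swift
import AVFoundation
import simd

// Illustrative sketch: the relative pose between two physical cameras on the
// same device. AVCaptureDevice.extrinsicMatrix(from:to:) returns the 4x3
// [R|t] matrix packed into a Data value, or nil if unavailable.
func relativeRigPose(from cameraA: AVCaptureDevice,
                     to cameraB: AVCaptureDevice) -> matrix_float4x3? {
    guard let data = AVCaptureDevice.extrinsicMatrix(from: cameraA, to: cameraB)
    else { return nil }
    // Reinterpret the raw bytes as the simd matrix type.
    return data.withUnsafeBytes { $0.load(as: matrix_float4x3.self) }
}
```

Since this transform is baked into the hardware calibration, estimating motion between frames (question 3) requires something else entirely, e.g. visual odometry on the RGB+depth stream or ARKit's tracked camera transforms.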
