Use CoreImage filters on Vision Pro (visionOS) view

I have an iOS app that uses the (camera) video feed and applies CoreImage filters to simulate a specific real-world effect (for educational purposes).
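For context, this is roughly the kind of pipeline I mean on iOS (a minimal sketch, assuming an `AVCaptureSession` is already configured; the sepia filter stands in for the actual educational effect):

```swift
import AVFoundation
import CoreImage

final class FilteredFeedHandler: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    // Reuse one CIContext; creating one per frame is expensive.
    private let ciContext = CIContext()
    // Placeholder filter; the real app applies its own effect here.
    private let filter = CIFilter(name: "CISepiaTone")!

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        let input = CIImage(cvPixelBuffer: pixelBuffer)
        filter.setValue(input, forKey: kCIInputImageKey)
        filter.setValue(0.8, forKey: kCIInputIntensityKey)
        guard let filtered = filter.outputImage,
              let cgImage = ciContext.createCGImage(filtered, from: filtered.extent) else { return }
        // Hand the filtered frame off to the UI (e.g., draw it into a view or Metal layer).
        _ = cgImage
    }
}
```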

Now I want to make a similar app for visionOS and apply the same CoreImage filters to the content (live view) users see while wearing the Apple Vision Pro headset.

Is there a way to do this with the current APIs, and what would you recommend?

I saw that we cannot get the video feed from the camera(s). Is there a way to do it with ARKit and apply the filters somehow using that?

I know visionOS is a young platform, but any help would be great!

Thank you!

Replies

So far, there is no API in visionOS that gives developers access to the live video feed.

This is by design, most likely to protect the user's privacy: on an iPhone you explicitly consent to sharing your surroundings with an app by pointing the camera at things, but you can't really avoid that while wearing Apple Vision Pro.
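What you can still do on visionOS is run Core Image over content your own app loads or renders (images, video files, textures), since the CoreImage framework itself is available there. A minimal sketch, assuming SwiftUI and a hypothetical bundled asset named "sample":

```swift
import SwiftUI
import CoreImage
import CoreImage.CIFilterBuiltins

struct FilteredImageView: View {
    // "sample" is a hypothetical asset bundled with the app.
    private let cgImage: CGImage? = {
        guard let uiImage = UIImage(named: "sample"),
              let input = CIImage(image: uiImage) else { return nil }
        let filter = CIFilter.sepiaTone()
        filter.inputImage = input
        filter.intensity = 0.8
        return CIContext().createCGImage(filter.outputImage ?? input, from: input.extent)
    }()

    var body: some View {
        if let cgImage {
            // The filtered image is shown inside the app's own window;
            // the system-composited passthrough is not accessible to apps.
            Image(decorative: cgImage, scale: 1.0)
        }
    }
}
```

The key difference from iOS: the filter runs only on content your app owns, never on the user's real surroundings.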