'MultipeerConnectivityService' is unavailable in visionOS?

Hi!

Despite the documentation saying otherwise, MultipeerConnectivityService appears to be unavailable in visionOS. Am I missing something, or is this an issue with the current beta or (hopefully not 😬) the documentation?

https://developer.apple.com/documentation/realitykit/multipeerconnectivityservice

do {
    // MultipeerSession.shared.session is an MCSession owned elsewhere in the app
    entity.scene?.synchronizationService = try MultipeerConnectivityService(session: MultipeerSession.shared.session)
    print("RealityKit synchronization started.")
} catch {
    fatalError("RealityKit synchronization could not be started. Error: \(error.localizedDescription)")
}

Xcode complains that 'MultipeerConnectivityService' is unavailable in visionOS, even though the scene's synchronizationService property can be accessed...
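
In the meantime, a platform check keeps the project compiling for visionOS while leaving the call in place for other platforms. A minimal sketch, assuming the same MultipeerSession.shared.session MCSession helper from the snippet above (the startSynchronization wrapper is just for illustration):

import RealityKit
import MultipeerConnectivity

func startSynchronization(for entity: Entity) {
    #if !os(visionOS)
    do {
        // Same call as above; only compiled where MultipeerConnectivityService exists.
        entity.scene?.synchronizationService = try MultipeerConnectivityService(session: MultipeerSession.shared.session)
        print("RealityKit synchronization started.")
    } catch {
        fatalError("RealityKit synchronization could not be started. Error: \(error.localizedDescription)")
    }
    #else
    // No multipeer synchronization service on visionOS as of this beta.
    print("RealityKit multipeer synchronization is unavailable on visionOS.")
    #endif
}

This only silences the build error, of course; it doesn't give you synchronization on visionOS.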


Replies

As of now, visionOS does not appear to support synchronization of RealityKit entities.

There are "SharePlay" WWDC23 videos on sharing Shared Space and Immersive Space experiences, so you may want to see if those can solve your use case.

  • I will also add that despite Vision Pro being designed to not keep the user isolated from the world around them, SharePlay experiences don’t seem to consider users as being in the same space.

    I’d bet it’s being saved as a 2.0 thing. SharePlay this year even feels like a 2.0 version of multipeer sharing.


@J0hn SharePlay is not exactly what I am looking for, unfortunately. The APIs related to entity synchronization all seem to be marked as supporting visionOS: https://developer.apple.com/documentation/realitykit/content-synchronization

Wonder if these APIs will make it into the 1.0 release...

SharePlay with visionOS appears to hide location data of other people from the app. For example, data about the other personas (outside of their immersion status) is not exposed via APIs. I am guessing this is for privacy reasons (?).

I am not sure how Apple handles (or will handle) people in the same physical room. So far, I haven't seen any examples or WWDC videos covering this. I look forward to some clarification and examples.

One possible workaround for people in the same physical room is to anchor the virtual content to an image. Print that image on a piece of paper and place it on the floor or a table. The two participants should see the same virtual content in the same location and same orientation because it is tied to something physical (the printed paper).
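
For what it's worth, here's a rough sketch of that idea using RealityKit's image-anchoring target as it exists on iOS (the "AR Resources" group and "SharedMarker" image name are placeholders, and availability of this anchoring target on visionOS may differ):

import RealityKit

// Anchor content to a reference image in the app's AR resource group.
// Every device that detects the same printed image places the content
// at the same physical spot and orientation.
let imageAnchor = AnchorEntity(.image(group: "AR Resources", name: "SharedMarker"))

// Attach whatever shared content should appear on top of the marker.
let sharedContent = ModelEntity(mesh: .generateBox(size: 0.1),
                                materials: [SimpleMaterial(color: .blue, isMetallic: false)])
imageAnchor.addChild(sharedContent)

// Then add the anchor to the scene, e.g. arView.scene.addAnchor(imageAnchor) on iOS.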

  • " For example, data about the other personas (outside of their immersion status) is not exposed via APIs. I am guessing this is for privacy reasons (?)." Could each persona share it's world anchor and keep it updated within the SharePlay session?


Well, Bonjour MultipeerConnectivityService works. I added a test to the MuPeer package here

Or at least, it works across visionOS and iPhone simulators. But it doesn't show up on an iPad running iPadOS 17 Beta 5.
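
For anyone comparing setups, here's a minimal sketch of a plain Multipeer Connectivity advertise/browse pair along those lines (the "mupeer-test" service type and display name are placeholders, not taken from the MuPeer package). On a physical device, missing NSLocalNetworkUsageDescription and NSBonjourServices entries (e.g. "_mupeer-test._tcp") in Info.plist are a common reason peers never show up:

import MultipeerConnectivity

final class PeerTest: NSObject, MCNearbyServiceAdvertiserDelegate, MCNearbyServiceBrowserDelegate {
    let peerID = MCPeerID(displayName: "TestPeer")
    lazy var session = MCSession(peer: peerID, securityIdentity: nil, encryptionPreference: .required)
    lazy var advertiser = MCNearbyServiceAdvertiser(peer: peerID, discoveryInfo: nil, serviceType: "mupeer-test")
    lazy var browser = MCNearbyServiceBrowser(peer: peerID, serviceType: "mupeer-test")

    func start() {
        advertiser.delegate = self
        browser.delegate = self
        advertiser.startAdvertisingPeer()
        browser.startBrowsingForPeers()
    }

    // Accept every invitation for this test.
    func advertiser(_ advertiser: MCNearbyServiceAdvertiser, didReceiveInvitationFromPeer peerID: MCPeerID,
                    withContext context: Data?, invitationHandler: @escaping (Bool, MCSession?) -> Void) {
        invitationHandler(true, session)
    }

    // Invite every peer the browser finds.
    func browser(_ browser: MCNearbyServiceBrowser, foundPeer peerID: MCPeerID,
                 withDiscoveryInfo info: [String: String]?) {
        print("Found peer: \(peerID.displayName)")
        browser.invitePeer(peerID, to: session, withContext: nil, timeout: 10)
    }

    func browser(_ browser: MCNearbyServiceBrowser, lostPeer peerID: MCPeerID) {
        print("Lost peer: \(peerID.displayName)")
    }
}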

  • Thanks, good to know! You mean Multipeer Connectivity rather than RealityKit's MultipeerConnectivityService, right? This is still good news, because MC is required by MultipeerConnectivityService. So apparently you can network across simulators? Nice.

  • @RK123 I am using import MultipeerConnectivity. The question had me worried, as I am marshalling MIDI events to a visual synth. Just tested on iPadOS 17 Beta 5 with the same result (not found).
