Bring your Unity VR app to a fully immersive space

Posts under wwdc2023-10093 tag

4 Posts

1 reply · 736 views
I’m trying to make a passthrough experience featuring an emissive/glowy/bloomy object. From everything I’ve tried and researched so far, it looks like that won’t be possible with RealityKit rendering, and I’ll have to use Unity with its URP. But will it be possible to use that renderer while still having passthrough video to ground the experience in the real world, or is that only possible with fully immersive experiences? If there’s a completely different approach that is better, I’m willing and eager to learn. Thank you in advance! -Dan
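
For context on the trade-off Dan describes: RealityKit exposes no full-screen bloom pass while passthrough is visible, so a common workaround is to fake the glow at the material level with a strong emissive term plus a semi-transparent unlit "halo" shell. A minimal Swift sketch; the helper name and all values are illustrative, not an Apple API:

```swift
import RealityKit
import UIKit

// Hypothetical helper: approximate a bloom-like glow at the material level,
// since RealityKit has no post-process bloom in passthrough (mixed) rendering.
func makeGlowOrb() -> ModelEntity {
    // Core: a PBR material with a strong emissive term.
    var core = PhysicallyBasedMaterial()
    core.emissiveColor = .init(color: .cyan)
    core.emissiveIntensity = 3.0 // values above 1 read as "glowing"

    let orb = ModelEntity(mesh: .generateSphere(radius: 0.05), materials: [core])

    // Halo: a slightly larger, semi-transparent unlit shell fakes the bloom falloff.
    var halo = UnlitMaterial(color: .cyan)
    halo.blending = .transparent(opacity: 0.25)
    let shell = ModelEntity(mesh: .generateSphere(radius: 0.08), materials: [halo])
    orb.addChild(shell)
    return orb
}
```

How convincing this looks depends on the scene; a true HDR bloom still requires a Metal or Unity pipeline, which on Vision Pro means a fully immersive space.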

0 replies · 1.2k views
Hi, I have a question about Apple Vision Pro’s support for Unity programmable shaders. As I understand it, shaders applied to materials are not supported, but RenderTextures are supported (they can be used as texture input to a Shader Graph for display through RealityKit). Does the above apply to all of Shared Space, Full Space, and the fully immersive space? Or is the fully immersive space irrelevant here because it renders with Metal rather than RealityKit?
Best regards,
Sadao Tokuyama
https://1planet.co.jp/
https://twitter.com/tokufxug
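
For readers weighing these paths, the RealityKit half of the pipeline the post describes looks roughly like the sketch below: load a Shader Graph material authored in Reality Composer Pro, then feed it a texture at runtime. The scene name GlowScene.usda, the material path, the displayTexture input, and the realityKitContentBundle symbol are all placeholders for whatever a given Reality Composer Pro package actually defines:

```swift
import RealityKit
import RealityKitContent // assumed: the Reality Composer Pro package's module

// Load a Shader Graph material from a Reality Composer Pro package and bind a
// texture to one of its exposed inputs. All names here are placeholders.
func applyDynamicTexture(to model: ModelEntity, texture: TextureResource) async throws {
    var material = try await ShaderGraphMaterial(
        named: "/Root/DisplayMaterial",   // material prim path inside the scene
        from: "GlowScene.usda",           // scene file in the RC Pro package
        in: realityKitContentBundle
    )
    // The graph must expose a texture input with this parameter name.
    try material.setParameter(name: "displayTexture",
                              value: .textureResource(texture))
    model.model?.materials = [material]
}
```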

2 replies · 742 views
Can we create apps that aren’t just screen-based? For example, playing a musical instrument using visionOS: the screen could be utilised to show the music notes, or a video to play along with (imagine a course that teaches you to play the instrument). I would like to understand the potential capabilities and limitations of Vision Pro + visionOS.
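
visionOS apps are indeed not limited to windows. A minimal sketch of the scene structure such an instrument-teaching app could use, pairing a 2D window for the lesson with a mixed-immersion space for the 3D instrument; all identifiers here (InstrumentApp, "Lesson", "Instrument") are hypothetical:

```swift
import SwiftUI
import RealityKit

@main
struct InstrumentApp: App {
    var body: some Scene {
        // A conventional 2D window: sheet music, play-along video, lesson UI.
        WindowGroup(id: "Lesson") {
            Text("Sheet music and lesson controls go here")
        }

        // A mixed-immersion space: 3D content placed in the user's real room.
        ImmersiveSpace(id: "Instrument") {
            RealityView { content in
                // Stand-in geometry for a 3D instrument model.
                let keys = ModelEntity(
                    mesh: .generateBox(size: [0.8, 0.05, 0.2]),
                    materials: [SimpleMaterial(color: .white, isMetallic: false)]
                )
                keys.position = [0, 1.0, -0.6] // roughly tabletop height, in meters
                content.add(keys)
            }
        }
        .immersionStyle(selection: .constant(.mixed), in: .mixed)
    }
}
```

The space itself is opened at runtime with the openImmersiveSpace environment action; hand tracking via ARKit would then drive the actual playing interaction.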

1 reply · 1.5k views
Do I understand correctly that to use Unity (Universal Render Pipeline) for Vision Pro’s fully immersive apps, we can use Unity’s Shader Graph as usual to make custom shaders, but for immersive mixed-reality apps we cannot use Shader Graph anymore and instead have to author shaders in Reality Composer Pro? How does bringing a Reality Composer Pro shader into Unity work: does it simply work in Unity, or does it require special adaptation? Are there cases where we should avoid Reality Composer Pro and use Unity’s Shader Graph for immersive Vision apps? For instance, we may lose real-time lighting adaptation for virtual objects, but on the other hand we would be able to use Shader Graph.
Posted by NikitaSh.
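
On the "special adaptation" question: in practice, any dynamic behavior that a Unity C# script would have driven through Shader Graph has to be rewritten in Swift against the Reality Composer Pro material's exposed inputs. A hypothetical sketch, assuming the graph exposes a float input named glowStrength; the function and entity setup are illustrative only:

```swift
import Foundation
import RealityKit

// Drive an exposed Shader Graph input from Swift, e.g. once per frame.
// The "glowStrength" input name is hypothetical.
func pulse(_ entity: ModelEntity, elapsed t: Float) {
    guard var material = entity.model?.materials.first as? ShaderGraphMaterial else { return }
    // Oscillate the exposed float input between 0 and 1.
    try? material.setParameter(name: "glowStrength",
                               value: .float(0.5 + 0.5 * sin(t)))
    entity.model?.materials = [material] // write the modified copy back
}
```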