Post not yet marked as solved
I’m trying to build a pass-through experience that includes an emissive, glowy, bloom-style object. From everything I’ve tried and researched so far, it looks like that won’t be possible with RealityKit rendering, and I’ll have to use Unity with its URP. But will it be possible to use that renderer while still having pass-through video to ground the experience in the real world? Or is that only possible with fully immersive experiences?
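For reference, the closest I’ve gotten in RealityKit is an emissive material like the sketch below (the entity shape, color, and intensity values are just placeholders). It renders a self-lit surface, but since RealityKit applies no bloom post-processing, the object never actually glows beyond its own silhouette:

```swift
import RealityKit
import UIKit

// Sketch: an "emissive" material in RealityKit.
// The surface is self-lit, but with no bloom post-process
// the light doesn't bleed past the mesh edges.
func makeGlowSphere() -> ModelEntity {
    var material = PhysicallyBasedMaterial()
    material.emissiveColor = .init(color: .cyan)  // placeholder color
    material.emissiveIntensity = 2.0              // placeholder boost

    return ModelEntity(
        mesh: .generateSphere(radius: 0.1),
        materials: [material]
    )
}
```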
If there’s a completely different approach that works better, I’m willing and eager to learn it.
Thank you in advance!
-Dan
Hi,
I have a question about Apple Vision Pro’s support for Unity programmable shaders. My understanding is:

- Shaders applied to a Material are not supported.
- RenderTextures are supported (they can be used as texture input to a Shader Graph for display through RealityKit).

Do the points above apply to Shared Space, Full Space, and fully immersive space alike? Or is fully immersive space irrelevant here, because it renders with Metal rather than RealityKit?
Best regards.
Sadao Tokuyama
https://1planet.co.jp/
https://twitter.com/tokufxug
Can we create apps that aren’t just screen-based? For example, playing a musical instrument using visionOS. The screen could be utilised to show the music notes, or a play-along video (imagine a course that teaches you to play the instrument). I’d like to understand the potential capabilities and limitations of Vision Pro + visionOS.
Do I understand correctly that, to build fully immersive Vision Pro apps with Unity (Universal Render Pipeline), we can use Unity’s Shader Graph as usual to make custom shaders, but for immersive mixed-reality apps we cannot use Shader Graph anymore and instead have to create shaders in Reality Composer?
How does bringing a Reality Composer shader into Unity work? Does it simply work in Unity, or does it require special adaptation?
Are there cases where we should avoid Reality Composer and use Unity’s Shader Graph for immersive Vision apps instead? For instance, we might lose real-time lighting adaptation for virtual objects, but on the other hand we would be able to use Shader Graph.
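For comparison, this is roughly how a shader graph authored in Reality Composer Pro is consumed natively through RealityKit on visionOS (the material path, scene file name, and content bundle below are placeholder names from a template-style project, not a verified Unity workflow):

```swift
import RealityKit
import RealityKitContent  // Reality Composer Pro content bundle (placeholder name)

// Sketch: loading a Reality Composer Pro shader-graph material in RealityKit.
// "/Root/GlowMaterial" and "Scene.usda" are placeholders for whatever the
// project actually authors in Reality Composer Pro.
func applyAuthoredMaterial(to entity: ModelEntity) async throws {
    let material = try await ShaderGraphMaterial(
        named: "/Root/GlowMaterial",
        from: "Scene.usda",
        in: realityKitContentBundle
    )
    entity.model?.materials = [material]
}
```

How (or whether) that translates into a Unity asset is exactly what I’d like clarified.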