Graphics and Games

Build captivating gaming experiences for Apple platforms.

Posts under Graphics and Games tag

27 Posts
Post not yet marked as solved
0 Replies
861 Views
In regular Metal, I can do all sorts of tricks with texture masking to create composite objects and effects, similar to CSG. Since AR mode in visionOS currently requires RealityKit, with no ability to use custom shaders, I'm a bit stuck. I'm fairly sure that what I want is impossible and requires a feature request, but here it goes.

Here's a 2D example: say I have some fake circular flashlights shining into the scene, depth-wise, and everything else is black except for some rectangles that are "lit" by the circles.

How it works: in Metal, my per-instance data contain a texture index for a mask texture. The mask texture has an alpha of 0 where the instance should not be visible and an alpha of 1 otherwise. In an initial render pass, I draw the circular lights to this mask texture. In pass 2, I attach the full-screen mask texture (the circular lights) to all mesh instances that I want hidden in the darkness. A custom fragment shader multiplies the color that would otherwise be output by the alpha of the full-screen mask sample at the given fragment, i.e. out_color *= mask.a. With blending and the clear color set up the way I have them, an object is hidden wherever the mask alpha is 0; the background clear color is black. If I don't attach the masking texture, you can see that behind the scenes the full rectangle is still there.

In visionOS AR mode, the point is for the system to apply lighting, depth, and occlusion information to the world. For my effect to work, I need to generate an intermediate representation of my world (after pass 2) that shows some of that world in darkness. I know I can use Metal separately from RealityKit to prepare a texture and apply it to a RealityKit mesh using DrawableQueue. However, as far as I know there is no way to supply a full-screen depth buffer for RealityKit to mix with whatever it is doing with the AR passthrough depth and occlusion behind the scenes, so my Metal texture would just be a flat quad in the scene rather than something mixed with the world. Furthermore, I don't see a way to apply a full-screen quad to the scene at all.

I think my use case is impossible in visionOS AR mode without customizable rendering in Metal (separate issue: I still think that in single full-app mode it should be possible to grant access to the camera and custom rendering more securely) and/or a RealityKit feature enabling the mixing of depth and occlusion textures for compositing. I love these sorts of masking/texture effects because they're simple and elegant to pull off, and I can imagine creating several useful and fun experiences using this masking and custom depth info with AR passthrough.

Please advise on how I could achieve this effect in the meantime. In any case, my specific feature request is the ability to provide full-screen depth and occlusion textures to RealityKit, so that Metal rendering can be used as a pre-pass with RealityKit as the final composition step.
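For anyone reproducing the pure-Metal version of this effect, here is a minimal sketch of the pass-2 stage described above, assuming the full-screen mask is bound at fragment texture slot 0. The function names, vertex layout, placeholder "lit" color, and pixel format are illustrative assumptions; only the out_color *= mask.a step comes from the post.

import Metal

// Pass 2: each masked instance multiplies its output color by the
// alpha of the full-screen mask, so fragments under mask.a == 0
// disappear into the black clear color.
let maskShaderSource = """
#include <metal_stdlib>
using namespace metal;

vertex float4 passthroughVertex(uint vid [[vertex_id]],
                                const device float2 *positions [[buffer(0)]])
{
    return float4(positions[vid], 0.0, 1.0);
}

fragment float4 maskedFragment(float4 position [[position]],
                               texture2d<float> maskTexture [[texture(0)]])
{
    // The mask is full-screen, so the fragment's window position
    // doubles as the texel coordinate.
    float4 mask = maskTexture.read(uint2(position.xy));
    float4 out_color = float4(1.0, 0.5, 0.2, 1.0); // stand-in for the lit color
    out_color *= mask.a; // the masking step from the post
    return out_color;
}
"""

func makeMaskedPipeline(device: MTLDevice) throws -> MTLRenderPipelineState {
    let library = try device.makeLibrary(source: maskShaderSource, options: nil)
    let descriptor = MTLRenderPipelineDescriptor()
    descriptor.vertexFunction = library.makeFunction(name: "passthroughVertex")
    descriptor.fragmentFunction = library.makeFunction(name: "maskedFragment")
    descriptor.colorAttachments[0].pixelFormat = .bgra8Unorm
    return try device.makeRenderPipelineState(descriptor: descriptor)
}

Pass 1 would render the circular lights into the mask texture over a black clear color; pass 2 then draws the maskable instances with this pipeline and the mask bound at texture index 0.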
Post marked as solved
16 Replies
1.8k Views
Apparently, shadows aren't generated for procedural geometry in RealityKit: https://codingxr.com/articles/shadows-lights-in-realitykit/ Has this been fixed? My projects tend to involve a lot of procedurally generated meshes as opposed to imported models. This will be even more important when visionOS is out. On a similar note, it used to be that ground shadows were not per-entity; I'd like to enable or disable them per entity. Is that possible? Since RealityKit is currently going to be the only way to use passthrough AR in visionOS, more flexibility will be required. I can't simply apply my own preferences.
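On the per-entity question, here is a hedged sketch of what current RealityKit appears to offer; the availability notes are my reading of the documentation, not something confirmed in this thread. On visionOS, GroundingShadowComponent opts an individual entity in or out of grounding shadows; on iOS/macOS, a directional light must be configured to cast shadows before procedural meshes (like imported ones) produce any.

import RealityKit

// A procedurally generated mesh, the case in question.
let proceduralBox = ModelEntity(
    mesh: .generateBox(size: 0.2),
    materials: [SimpleMaterial(color: .white, isMetallic: false)]
)

#if os(visionOS)
// visionOS: grounding shadows are toggled per entity via a component.
proceduralBox.components.set(GroundingShadowComponent(castsShadow: true))
#else
// iOS/macOS: shadows appear only once a light is configured to cast
// them; the maximumDistance and depthBias values here are arbitrary.
let sun = DirectionalLight()
sun.light = DirectionalLightComponent(color: .white, intensity: 2_000)
sun.shadow = DirectionalLightComponent.Shadow(maximumDistance: 3, depthBias: 1)
sun.look(at: .zero, from: [1, 2, 1], relativeTo: nil)
#endif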
Post not yet marked as solved
2 Replies
985 Views
I render with func drawIndexedPrimitives(type primitiveType: MTLPrimitiveType, indexType: MTLIndexType, indexBuffer: MTLBuffer, indexBufferOffset: Int, indirectBuffer: MTLBuffer, indirectBufferOffset: Int). When I use an argument buffer to set textures (e.g. for the fragment shader), everything is fine. But is there a way to set textures without an argument buffer, or is an argument buffer the only way?
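Argument buffers are not the only way: a texture can still be bound directly on the render command encoder before the draw call, and direct bindings apply to indirect draws as well, since the indirect buffer only carries the draw arguments. A minimal sketch with placeholder parameter names; slot 0 arrives in the shader as [[texture(0)]] on the fragment function.

import Metal

func encodeDraw(encoder: MTLRenderCommandEncoder,
                texture: MTLTexture,
                indexBuffer: MTLBuffer,
                indirectBuffer: MTLBuffer) {
    // Bind the texture straight to fragment slot 0 -- no argument buffer.
    encoder.setFragmentTexture(texture, index: 0)

    // Same indirect-draw overload as in the post.
    encoder.drawIndexedPrimitives(type: .triangle,
                                  indexType: .uint16,
                                  indexBuffer: indexBuffer,
                                  indexBufferOffset: 0,
                                  indirectBuffer: indirectBuffer,
                                  indirectBufferOffset: 0)
}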
Posted by Lareid.
Post not yet marked as solved
14 Replies
14k Views
Hi there, I just bought an IIYAMA G-MASTER GB3461WQSU-B1, which has a native resolution of 3440x1440, but my MacBook Pro (Retina, 15-inch, Mid 2014) doesn't recognise the monitor and I can't run it at its full resolution. It is currently recognised as a PL3461WQ 34.5-inch (2560 x 1440). Is there anything I can do to get it sorted, or will I have to wait until this monitor's driver is added to the Big Sur list? Thanks
Post not yet marked as solved
1 Reply
2.1k Views
Sadly, I was not able to register for the Games Q&A at WWDC. My question is as follows: for making a Universal Binary, one compatible with Big Sur and up, will the tools in Xcode 15 help me port a Win32 project over, or do the tools only work assuming macOS Sonoma and up, exclusively on Apple Silicon?
Posted by pi1024e.
Post not yet marked as solved
290 Replies
152k Views
Hello! Help please: after updating the system to Big Sur, my MacBook Pro 13 began to slow down; previously there were no such problems. MacBook Pro (Retina, 13-inch, Early 2015), 2.7 GHz dual-core Intel Core i5, 8 GB 1867 MHz DDR3, Intel Iris Graphics 6100 1536 MB.
Posted by Seplaz.