Z-Buffer culling / Hi-Z 

I am looking for info about Hi-Z buffer usage across modern GPUs.

I see a lot of people manually generate a mip'ed depth texture, but as far as I can tell every modern GPU already maintains a Hi-Z buffer automatically (as long as you don't write depth in the fragment shader, don't use a ton of depth buffers, etc.)?

Anybody know if you can rely on or check this across GPUs?

Can I reproject it in a shader without losing GPU support?
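
For context, the manually generated mip'ed depth texture mentioned above is typically a max-reduction chain: each level stores the farthest depth of a 2x2 block in the level below. Here's a minimal CPU-side sketch of one downsample pass, assuming standard depth (0 = near); `buildHiZLevel` is an illustrative name, and on the GPU this would be one compute or fullscreen pass per mip:

```cpp
#include <algorithm>
#include <cstdint>
#include <vector>

// One max-reduction pass: level N+1 keeps the farthest depth of each
// 2x2 texel block of level N, so any depth sampled from a coarse level
// is a conservative (far) bound for the region it covers.
// Note: a fully conservative chain for odd-sized levels also folds the
// leftover row/column into the edge texels; omitted here for brevity.
std::vector<float> buildHiZLevel(const std::vector<float>& src,
                                 uint32_t w, uint32_t h)
{
    const uint32_t dw = std::max(w / 2, 1u);
    const uint32_t dh = std::max(h / 2, 1u);
    std::vector<float> dst(dw * dh);
    for (uint32_t y = 0; y < dh; ++y) {
        for (uint32_t x = 0; x < dw; ++x) {
            // Clamp so 1-wide/1-high levels still read inside the source.
            const uint32_t x0 = std::min(2 * x, w - 1), x1 = std::min(2 * x + 1, w - 1);
            const uint32_t y0 = std::min(2 * y, h - 1), y1 = std::min(2 * y + 1, h - 1);
            dst[y * dw + x] = std::max(std::max(src[y0 * w + x0], src[y0 * w + x1]),
                                       std::max(src[y1 * w + x0], src[y1 * w + x1]));
        }
    }
    return dst;
}
```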

@norado Are you maybe thinking of Early-Z (which discards fragments early, when the depth test can already be decided before shading), rather than hierarchical Z (Hi-Z, which keeps a tree of coarse depth values)?
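
On the Early-Z point: the usual way to lose it is writing gl_FragDepth, which is the caveat from the original question. If you do have to write depth, the conservative-depth qualifier (core since GLSL 4.20 / ARB_conservative_depth) can keep the early test alive, as long as depth only moves in the promised direction. A small sketch:

```cpp
// Fragment shader (embedded as a C++ string) sketching the Early-Z
// caveat: an unqualified gl_FragDepth write disables Early-Z, but
// redeclaring it with layout(depth_greater) promises the driver that
// the written depth never gets *smaller* than the rasterized depth,
// so the early depth test can still reject fragments safely.
const char* fragSrc = R"GLSL(
#version 420
layout(depth_greater) out float gl_FragDepth; // conservative-depth promise
out vec4 color;
void main() {
    // Push depth away from the camera only; violating the promise
    // (writing a smaller depth) gives undefined results.
    gl_FragDepth = gl_FragCoord.z + 0.001;
    color = vec4(1.0);
}
)GLSL";
```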

Tiled GPUs (e.g. mobile) might not use Hi-Z (it would require synchronization across tiles), and it sounds like some mobile GPUs don't implement Early-Z either.

For GPUs that do use Hi-Z, there's also no guarantee that the hierarchy levels reduce by factors of 2 (which they would need to in order to correspond to a manually mipmapped Hi-Z buffer).
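
So since the hardware structure can't be relied on (or even inspected) portably, the portable route is the manual chain plus an explicit test in your own shader or CPU code. A minimal sketch of the standard Hi-Z culling test against such a chain, assuming standard depth (0 = near) and the max-reduction levels from the earlier sketch; names like `HiZ` and `occluded` are illustrative:

```cpp
#include <algorithm>
#include <cmath>
#include <cstdint>
#include <vector>

// Manual Hi-Z chain: mips[0] is the full-resolution depth buffer and
// each further level is a 2x2 max reduction (see the earlier sketch).
struct HiZ {
    std::vector<std::vector<float>> mips;
    uint32_t w = 0, h = 0; // full-resolution dimensions

    float fetchMax(uint32_t level, uint32_t x, uint32_t y) const {
        const uint32_t lw = std::max(w >> level, 1u);
        const uint32_t lh = std::max(h >> level, 1u);
        return mips[level][std::min(y, lh - 1) * lw + std::min(x, lw - 1)];
    }
};

// Conservative visibility test for a screen-space rect (in pixels) with
// the object's nearest depth. Picks the mip where the rect spans at most
// 2x2 texels, so four fetches cover its whole footprint. Returns true
// only if the object is definitely hidden; with reversed-Z you would
// build a min chain and flip the comparison instead.
bool occluded(const HiZ& hiz, float x0, float y0, float x1, float y1,
              float nearestDepth)
{
    const float extent = std::max(x1 - x0, y1 - y0);
    uint32_t level = (uint32_t)std::ceil(std::log2(std::max(extent, 1.0f)));
    level = std::min<uint32_t>(level, (uint32_t)hiz.mips.size() - 1);

    const uint32_t tx0 = (uint32_t)x0 >> level, ty0 = (uint32_t)y0 >> level;
    const uint32_t tx1 = (uint32_t)x1 >> level, ty1 = (uint32_t)y1 >> level;
    const float maxOccluderDepth = std::max(
        std::max(hiz.fetchMax(level, tx0, ty0), hiz.fetchMax(level, tx1, ty0)),
        std::max(hiz.fetchMax(level, tx0, ty1), hiz.fetchMax(level, tx1, ty1)));

    // Hidden iff the object's nearest point lies behind the farthest
    // depth already written anywhere under its footprint.
    return nearestDepth > maxOccluderDepth;
}
```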
