ShaderChunk: Use transformed UV when reading displacement map #17487
Conversation
Related: #7826. There are obviously good arguments for both approaches (the current one and your proposal). I personally tend to keep the status quo and would argue that displacement maps are often tied to the geometry. Besides, changing this will potentially break code and thus produce migration effort. As a workaround, you can always monkey-patch the displacement shader chunk with …
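A monkey-patch along those lines might look like the sketch below. The chunk text here is a simplified stand-in for the real `THREE.ShaderChunk.displacementmap_vertex` string, which differs between three.js versions; the string replacement is the general technique, not the library's API.

```javascript
// Simplified stand-in for the displacement map vertex chunk; the real
// text lives in THREE.ShaderChunk.displacementmap_vertex and may differ.
const displacementChunk = [
  '#ifdef USE_DISPLACEMENTMAP',
  '\ttransformed += normalize( objectNormal ) * ( texture2D( displacementMap, uv ).x * displacementScale + displacementBias );',
  '#endif'
].join('\n');

// Monkey-patch: sample with the transformed coordinates (vUv) instead of
// the raw `uv` attribute, so displacement lines up with the other maps.
const patchedChunk = displacementChunk.replace(
  '( displacementMap, uv )',
  '( displacementMap, vUv )'
);

// In an app you would then assign the patched string back before
// compiling any materials, e.g.:
// THREE.ShaderChunk.displacementmap_vertex = patchedChunk;
```

Because `ShaderChunk` entries are plain strings spliced into every built-in material's shader at compile time, a patch like this affects all materials compiled afterwards.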
For the sake of consistency, I think it may be better to accept the PR. Also, I like the new possibilities this enables. Let's hope not that many things break... 🤞
Thanks!
Thanks @mrdoob! I do see the concerns others brought up; it's one of those scenarios where either default could be considered right or wrong depending on your perspective and what you're trying to achieve. Maybe it makes sense for all maps to support separate uvTransform matrices? That gets tricky too, though: it potentially makes the simple case of all textures sharing one transform more complicated, since you now have to keep a bunch of separate values in sync. I also don't know how much that many additional matrix uniforms would hinder performance, or whether we'd run into GPU uniform limits.
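For context, the shared transform being discussed packs a texture's offset, repeat, and rotation into a single 3x3 matrix. A plain-JS sketch of that math, mirroring what three.js's `Matrix3.setUvTransform` computes (the function and parameter names here are illustrative, not part of the library):

```javascript
// Build a row-major 3x3 UV transform from offset (tx, ty), repeat (sx, sy),
// rotation (radians), and rotation center (cx, cy).
function makeUvTransform(tx, ty, sx, sy, rotation, cx, cy) {
  const c = Math.cos(rotation);
  const s = Math.sin(rotation);
  return [
    sx * c,  sx * s, -sx * ( c * cx + s * cy) + cx + tx,
    -sy * s, sy * c, -sy * (-s * cx + c * cy) + cy + ty,
    0, 0, 1,
  ];
}

// Apply the transform to a (u, v) pair, as the shader does per sample.
function applyUv(m, u, v) {
  return [m[0] * u + m[1] * v + m[2], m[3] * u + m[4] * v + m[5]];
}

// Example: repeat (2, 2), offset (0.1, 0), no rotation.
const m = makeUvTransform(0.1, 0, 2, 2, 0, 0, 0);
applyUv(m, 0.25, 0.25); // → [0.6, 0.5]
```

Keeping one such matrix per map slot is what would multiply the uniform count; with a single shared transform, every map samples through the same matrix.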
I recommend against doing that. It was implemented this way for a reason. This issue will be resolved with … /ping @sunag
Supporting this in NodeMaterial is great, and I agree that either default is the wrong thing for some scenarios, but it seems like we could support this use case without requiring a switch to NodeMaterial by exposing separate uvTransform uniforms for each texture slot. Scaling the various maps independently of each other can also be used for some interesting graphical effects, e.g. to break up repetitive tiling or to do simple texture animations. How do engines like Unity handle this? Do they expose these values in their default shaders, or require you to write custom ones for this kind of thing?
Right. But it is the correct thing for the default scenario.
That was proposed in #8278, but my understanding is we decided instead to support the feature with …
Right. I made the same point. This is causing breakage for anyone who has a displacement map coupled to their geometry and is using … This is also causing breakage for a custom depth material using a displacement map, since …
All other textures which use the first UV slot sample with the transformed UVs (i.e., after offset / repeat / rotation are applied), but displacementMap samples with the untransformed UVs, leading to a visual mismatch when the textures are supposed to line up. This patch changes the behavior to be consistent.
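The mismatch the description refers to can be illustrated numerically. A minimal sketch (parameter names are invented for the example; the real transform also supports rotation about a center, as above):

```javascript
// Apply offset/repeat to a raw (u, v) coordinate, as the built-in
// shaders do for color maps via the shared uvTransform.
function transformUv(u, v, { offsetX = 0, offsetY = 0, repeatX = 1, repeatY = 1 }) {
  return [u * repeatX + offsetX, v * repeatY + offsetY];
}

// With repeat = (2, 2), the color map samples the transformed UVs while
// (pre-patch) the displacement map sampled the raw attribute UVs:
const raw = [0.25, 0.25];
const transformed = transformUv(0.25, 0.25, { repeatX: 2, repeatY: 2 });
// transformed → [0.5, 0.5]: the two maps read different texels, so the
// displaced surface no longer lines up with the color pattern.
```

After the patch, both maps read through the same transformed coordinates, so the example above samples the same texel for both.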