
Alternative to .onBeforeCompile for material extension #14232

Closed
Usnul opened this issue Jun 7, 2018 · 22 comments
@Usnul
Contributor

Usnul commented Jun 7, 2018

The current approach of letting the user do whatever they want with the shader code from within .onBeforeCompile is very powerful, yet at the same time quite messy and fragile.

One of the biggest issues is the dynamic nature of the material code: if you allow arbitrary modifications to the code, you cannot assume that the material code will remain the same between builds, so you have to construct the code string every time just to check. This creates waste both in CPU cycles and in GC overhead, since that string will likely end up as garbage.

My proposal is quite simple: more restrictive transformers which can be registered onto a material. Transformers can be entirely static for most use cases, and where that's not enough we could offer dynamic transformers, with the main difference that the library can detect which kind is in use and optimize the cases where only static transformers are registered. Another fairly substantial benefit is the added semantic information, which the library can use for various optimizations and error checking.

My observation boils down to the fact that most .onBeforeCompile functions do just one thing:

  • find a RegEx pattern in the shader source string
  • replace that occurrence with a fixed string

Here is an example to illustrate current usage:

material.onBeforeCompile = function ( shader ) {
	// find a pattern in the shader source and replace it with a fixed string
	shader.vertexShader = shader.vertexShader.replace( 'a', 'b' );
};

What I propose would be:

const transform = new ShaderTransformReplace(VertexShader, 'a','b');
material.addShaderTransform(transform);

Since no arbitrary code is being executed anymore, we know that the material shader does not change as long as the list of transforms hasn't changed (in addition to the other inputs we already track, like defines and lighting). A rough sketch of what such a transform class could look like follows the list of ideas below.

A couple of ideas on top of this proposal:

  • if the user adds transforms ('a','b') and ('a','c'), we can detect that and remove the latter transform or warn the user, as we know the pattern will no longer be matched.
  • if the shader code is 'aaa' and the user adds the transform ('b','c'), we can detect that such a transform would be pointless and remove it or warn the user.
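
To make this concrete, here is a minimal sketch of what such a transform class could look like, including a static matches() check of the kind the second idea above describes. This is an illustration only: ShaderTransformReplace, the VertexShader/FragmentShader constants, apply() and cacheKey() are hypothetical names for this proposal, not an existing three.js API.

// Hypothetical sketch only; none of these names exist in three.js today.
const VertexShader = 'vertex';
const FragmentShader = 'fragment';

class ShaderTransformReplace {

	constructor( stage, pattern, replacement ) {

		this.stage = stage;             // VertexShader or FragmentShader
		this.pattern = pattern;         // string or RegExp to look for
		this.replacement = replacement; // fixed string to substitute

	}

	// A static check the library could run up front: a transform whose pattern
	// never occurs in the source is pointless (the second idea in the list above).
	matches( source ) {

		if ( this.pattern instanceof RegExp ) return this.pattern.test( source );
		return source.indexOf( this.pattern ) !== - 1;

	}

	// Apply the replacement to the shader object handed to the compiler.
	apply( shader ) {

		if ( this.stage === VertexShader ) {

			shader.vertexShader = shader.vertexShader.replace( this.pattern, this.replacement );

		} else {

			shader.fragmentShader = shader.fragmentShader.replace( this.pattern, this.replacement );

		}

	}

	// Because the transform is fully static, it can contribute to a program cache key
	// without the renderer having to rebuild and diff the final shader string.
	cacheKey() {

		return this.stage + '|' + String( this.pattern ) + '|' + this.replacement;

	}

}

A renderer could then combine the cacheKey() values of all registered transforms with the existing program cache key (defines, lighting, and so on) to decide whether a compiled program can be reused, which is the optimization this proposal is after.
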
@pailhead
Contributor

pailhead commented Jun 7, 2018

Related:

#14231
#14206
#14166
#14099
#14098
#14031
#13198
#10791
#13446
#14009
#14011
#13192
#7581
#13364
#12977
#11562

My observation from these is that, overall, these kinds of modifications to the built-in materials are not advised and are seen as an anti-pattern compared to both currently available alternatives and future proposed ones (see #14098 specifically, where copying the templates and casting to ShaderMaterial is advised over onBeforeCompile and modifications).

Are you familiar with NodeMaterial (#7522)? It looks like it should solve this without the need for hacks like onBeforeCompile.

@pailhead
Contributor

pailhead commented Jun 7, 2018

we can detect

I find this proposal too restrictive. #14231 proposes something that may be a bit more flexible and leaves this kind of warning to whoever implements a feature using the extension API. Three.js throws a lot of warnings as is.

I've put together this demo with #14231:


It shows how to create a simple subclass of THREE.MeshStandardMaterial that uses spec/gloss, uses it within GLTFLoader, and then independently adds map transformations and instancing. There is one possible conflict, but I believe it's due to the current state of the templates, not the approach. Where you would throw a warning, the demo actually takes "a" and modifies it.

#14231 takes a very minimalist approach; it aims to touch the fewest files and the fewest lines while staying legible and compliant with the linter. I imagine a much more powerful interface could be conceived if it had the privilege of introducing its own classes, like you're proposing:

new ShaderTransformReplace()

@Usnul
Contributor Author

Usnul commented Jun 7, 2018

I think your approach is valid, although I don't like the "zoo" of include sources. I would have preferred an include library, which may cascade internally and have overrides, but keeps that away from the encapsulating logic.

The overall problem that I see with .onBeforeCompile is that it's so flexible as to invalidate a large array of useful assumptions. Your approach is also static, which I like.

Maybe the title ".onBeforeCompile considered harmful" would have been more traditional :)

The node-based approach is solid in my eyes, but the implementation leaves a lot to be desired. If you want a decent node-based system, you need a solid build pipeline, which I didn't find.

I think following your approach is better, provided the chunk library is encapsulated. Something like:

class Material {
    constructor() {
        this.chunkLibrary = new ShaderChunkLibrary( { parent: THREE.DefaultChunkLibrary } );
    }
    // ...
}

and then you could do:

myMaterial.chunkLibrary.set('chunky-hippo','hippo = chunky');
myMaterial.chunkLibrary.set('begin_vertex',
     `float _ind = aIndex + uTime;
      transformed *= 1. + 0.5 * sin(${frequency}.0 * ${Math.PI} * _ind );`
);

@pailhead
Contributor

pailhead commented Jun 7, 2018

I'd land the includes as is, without introducing new classes. I took extreme care to keep the amount of code change to a minimum, since it has so far been impossible to convince anyone that this may be useful and that it's completely opt-in.

All these PRs have been blocked by NodeMaterial. I don't care at all what happens with NodeMaterial, but I do not understand why it's a blocker for completely unrelated things (the only thing they have in common is that they work on materials).

Could you possibly voice some thoughts in the other PRs?

@donmccurdy
Collaborator

@Usnul would you have time to write up your concerns and suggestions for the NodeMaterial implementation in #7522, if you haven't already?

@pailhead
Contributor

pailhead commented Jun 7, 2018

Could you possibly voice some thoughts in the other PRs?

Coming from the context of just "material extension" (as stated in the title), please do not limit yourself to #7522; it's only one of the proposals to achieve this. Please look at the other PRs, plural. Thank you :)

I would have preferred an include library,

There's no reason not to build one with #14098, but my opinion is that the user should not be limited to one. They should be allowed to use whichever library they choose, and to use the language/environment (JavaScript) to do so.

@pailhead pailhead mentioned this issue Jun 7, 2018
@Usnul
Contributor Author

Usnul commented Jun 8, 2018

@pailhead

Please, look at the other PRs plural. Thank you :)

I have spent some hours reading through some of the topics you have linked. It's a bit more than I have time for currently, and I do not wish to comment without having a good grasp of the conversation.

There's no reason not to build one with #14098, but my opinion is that the user should not be limited to one. They should be allowed to use whichever library they choose, and to use the language/environment (JavaScript) to do so.

I think there's a misunderstanding. I do not mean a JavaScript library; I mean a library in the sense of a collection: a collection of GLSL code snippets, each with a reference name.

my opinion is that the user should not be limited to one.

As far as "one" goes - I believe that's a misunderstanding too, I was not clear enough I believe. What I have sketched out is a cascading structure, where a library may have a parent and when resolving a snippet by reference - parent is to be queries if library itself doesn't contain the reference. Think css.

You could have multiple parents too, if that would make sense.
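
To illustrate that cascading resolution, here is a minimal sketch, assuming the ShaderChunkLibrary name from my earlier comment; it is not an existing three.js API.

// Hypothetical sketch of a cascading chunk collection; not an existing API.
class ShaderChunkLibrary {

	constructor( options = {} ) {

		this.parent = options.parent || null; // another ShaderChunkLibrary, or null
		this.chunks = new Map();              // chunk name -> GLSL snippet

	}

	set( name, glsl ) {

		this.chunks.set( name, glsl );
		return this;

	}

	// Resolve like CSS: prefer the local snippet, otherwise fall back to the parent chain.
	resolve( name ) {

		if ( this.chunks.has( name ) ) return this.chunks.get( name );
		if ( this.parent !== null ) return this.parent.resolve( name );
		return undefined;

	}

}

// Usage: a material-level library overrides one chunk and inherits the rest.
const defaults = new ShaderChunkLibrary();
defaults.set( 'begin_vertex', 'vec3 transformed = vec3( position );' );

const materialLibrary = new ShaderChunkLibrary( { parent: defaults } );
materialLibrary.set( 'begin_vertex', 'transformed *= 2.0;' );

materialLibrary.resolve( 'begin_vertex' ); // local override
materialLibrary.resolve( 'common' );       // falls through to the parent (undefined here)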

@pailhead
Contributor

pailhead commented Jun 8, 2018

I still don't quite understand it. I didn't mean a JavaScript library either, but if a dictionary is suitable for this, I meant that you should be able to assemble that dictionary yourself (or just reference it from somewhere). We could continue this in #14098, since it does seem to apply and doesn't have to be as generic a discussion.

@Usnul
Contributor Author

Usnul commented Jun 9, 2018

@pailhead I think we're on the same page.

@pailhead
Contributor

pailhead commented Jun 9, 2018

@Usnul

I believe we are. My confidence has been shaken by the many rejections, but this helps :)

You've done @donmccurdy the courtesy of commenting on the #7522 API, where I believe we're also on the same page.

Can you do me the courtesy of commenting on #14245, or even #7522? When a single user shows up and asks for an arguably obscure feature, it gets implemented across 13 core /renderer files in a matter of hours or days.

When many other users submit PRs or raise issues, they are blocked by #7522.

I want to draw a diagram of processes and dependencies to better illustrate what I see as a problem here; it seems to be the only way. In lieu of that, for now I can try to explain it.

Let's consider 10 different projects/apps/examples/effects that use features that are not available today in core or in /examples.

Out of these 10 features, five can be coded on top of three.js, and submitted to /examples.

^ This is a good thing. All it takes is deeming the example useful ("cool, people would want to do this"), and since it's just new files being added to /examples, the risk of landing this is minimal to non-existent.

The other five, though, cannot be coded on top of three.js and require some core changes. Adding five different PRs that touch renderers, shaders, etc. would carry a tremendous amount of overhead. Based on historical data, these five would take longer to land than the first five.

A perfect example of this can be seen here: the overhead starts in #14139 and continues in #14239.

  1. user describes a need for a feature
  2. user asks(?) for a feature (says it would be "fun" not "useful"; it's a bit ambiguous)
  3. request for clarification is made
  4. community provides clarification
  5. user provides clarification
  6. community provides feedback
  7. maintainer commits to providing the feature
  8. community suggests an API
  9. maintainer submits an RFI
  10. community replies to the RFI
  11. a need for test data is established
  12. test data is requested
  13. test data is provided (ISSUE END)
  14. maintainer touches 13 core files
  15. maintainer submits a PR
  16. community provides mixed feedback
  17. PR is accepted

This problem ("object space normal maps would be fun") can be solved with the system proposed here (and in the past) through a much simpler process:

  1. user codes a feature (either in one file, or built to one file)
  2. user submits a PR with the one-file example and the dependent data, including an explanation
  3. community provides feedback on the usefulness of the example
  4. two new files in /examples are either rejected or accepted

With the looming refactor from #7522, I fail to see how further coupling WebGLRenderer with the built-in materials benefits the roadmap.

I think I struggle to understand how the stuff in /examples fits into the big picture of this library.
Are some examples more valuable than others? Are some or all of them there temporarily, as a sort of testing ground before they get merged? Do some carry more weight than others when it comes to influencing core changes?

With a static system like the one specifically proposed in #14231, #14239 becomes this:

myMaterial.shaderUniforms.object_space_normal_normalMatrix = { value: myObject.normalMatrix, stage: 'fragment', type: 'mat3' };
myMaterial.shaderIncludes.normal_fragment_maps = `
normal = texture2D( normalMap, vUv ).xyz * 2.0 - 1.0;
#ifdef FLIP_SIDED
	normal = - normal;
#endif
#ifdef DOUBLE_SIDED
	normal = normal * ( float( gl_FrontFacing ) * 2.0 - 1.0 );
#endif
normal = normalize( object_space_normal_normalMatrix * normal );
`;

@pailhead
Contributor

pailhead commented Jun 9, 2018

With a system like the one proposed in this issue, new shader features could be "staged" or tested in /examples first, before being implemented in the core. I have no idea whether this would actually have any value, but it could keep some features opt-in for a certain period of time. Take #14239, for example: if I haven't used this feature for 92 of three.js's releases, I probably won't jump in to use it in the 93rd. But any update of any app to 93 will compel me to include this code.

var osnm = require('object-space-normal-maps')

Would allow me to opt in.

@mrdoob
Owner

mrdoob commented Jun 18, 2018

#14245 (comment)

@mrdoob mrdoob closed this as completed Jun 18, 2018
@Usnul
Contributor Author

Usnul commented Jun 18, 2018

@mrdoob
I appreciate your response, but it doesn't address the topic: if node-based shaders are the way forward, what's the plan? They have been in the oven for a long while now; meanwhile we have onBeforeCompile, which people (myself included) build upon, but that's a bad long-term solution. Is the plan to wait a couple of years until the node-based solution is ready, while a ton of code gets built on top of onBeforeCompile?

I believe that onBeforeCompile is architecturally harmful. If you read the post you will know some of the reasons why; I would be happy to elaborate further if you want. The comment you linked does not address this concern. I want a viable replacement, and not in two years' time.

@donmccurdy
Collaborator

donmccurdy commented Jun 18, 2018

@Usnul from your comments in #7522 (comment) (thank you for writing those, by the way!) I assumed your worry was mainly about the internal structure and code clarity of NodeMaterial, and that you weren't concerned about, or hadn't had time to consider, its functionality or readiness for use. Is that accurate?

I would like to think we can have NodeMaterial in good shape within months, not years. On that timescale, a general shader transform system would be unnecessary.

@mrdoob
Owner

mrdoob commented Jun 18, 2018

@Usnul

I believe that onBeforeCompile is harmful architecturally.

Yeah, adding that method was a mistake on my part. But sometimes you need to experiment until you find the right solution...

I want a viable replacement, and not in 2 years time.

We do what we can.

@pailhead
Contributor

It might be worth giving some insight into what's in store for onBeforeCompile. It's used in NodeMaterial to automatically run .build() atm.

@Usnul
Contributor Author

Usnul commented Jun 19, 2018

@donmccurdy
My takeaway from this is that there is no plan to touch onBeforeCompile currently, with the expectation that node-based shaders will be ready soon (within a couple of months).

With regard to node-based shaders in general: yes, I find the solution desirable; my issue is with how it's done, and that's separate from the current thread, as you rightly point out.

@mrdoob
I apologize if I come off as harsh. I make no demands; I merely express my wishes to clarify my position. I am thankful for the work of you and the other capable and diverse individuals who have contributed, are contributing, and will contribute to making three.js great. Again, I'm sorry if that was not clear.

@pailhead
Thanks for the pointer. Documentation for that code will be a substantial undertaking.

@pailhead
Contributor

Documentation for that code will be a substantial undertaking.

Is there a known timeline for this? It's hard to follow the NodeMaterial PRs at the moment because they are several thousand lines of code.

@mrdoob
Owner

mrdoob commented Jun 23, 2018

Is there a known timeline for this?

No. Sorry.

@pailhead
Contributor

I didn't catch this before:

I would like to think we can have NodeMaterial in good shape within months, not years.

This is a paradox and cannot happen, since it has already been several years in development. Unless we don't count the years that have passed since 2015 :)

@donmccurdy
Collaborator

donmccurdy commented Jul 23, 2018

I meant to say within months from this discussion, not months from development beginning in ~2015.

@pailhead
Contributor

Thanks, that makes sense; so anywhere between 2 and 11 months is what we're hoping for 🙂
