
[✨ feature] Emissive Volume Rendering #678

Open · wants to merge 35 commits into master from feature/emissive-media
Conversation

@Microno95 (Contributor) commented Apr 18, 2023

Description

This PR adds emissive event sampling to the null-scattering volpathmis integrator, covering sampling of the volume rendering equation (VRE) and MIS for emission. It supersedes the previous PR #40.

It adds emissive volume support through a new volume emitter: a volumelight emitter is attached to a given shape and is then sampled as part of the medium sampling routines in volpath/volpathmis/prbvolpath.

The current implementation does not modify any of the endpoint interfaces and thus does not work with MIS as of yet. I am unsure of the best way to modify the interface: the cleanest solution seems to be for the endpoint sampling methods to take an extra random float so a 3D point can be sampled, but whether this should be done by changing the existing Point2f sample to Point3f or by adding an additional argument is unclear to me.

The current implementation is feature complete except for NEE for volume emitters. I have added and tested the Python interfaces by bringing the prbvolpath implementation to parity with volpath.

Checklist

  • My code follows the style guidelines of this project
  • My changes generate no new warnings - the earlier warnings about unused parameters in volumelight (from methods that were still to be implemented) have been resolved; there are no longer any new warnings.
  • My code also compiles for cuda_* and llvm_* variants. I have compiled every variant and run all the tests; everything passes except for the tutorial notebooks, which throw a Unicode character encoding error that I think is unrelated. I will check whether the same error occurs on a fresh master copy - it might be a Windows-only issue.
  • I have commented my code
  • I have made corresponding changes to the documentation
  • I have added tests that prove my fix is effective or that my feature works - this is in progress; I have added emissive volume scenes and tests and opened a PR against the mitsuba-renderer/mitsuba-data repo to merge the emissive volume test data: Emissive Participating Media reference data mitsuba-data#17
  • I cleaned the commit history and removed any "Merge" commits
  • I give permission that the Mitsuba 3 project may redistribute my contributions under the terms of its license

Notes

  • The implementation uses non-analogue probabilities as described in Monte Carlo Methods for Volumetric Light Transport Simulation. This leads to more consistent results when a given radiance channel is zero and better convergence in all the tests I've done.
  • Both volpath and prbvolpath have implementations of the analogue, maximum, and mean event probability calculations for testing (a rough sketch of these three modes follows these notes). The issue of inconsistency only appears with analogue probabilities in non-scalar modes, and only for volpathmis and prbvolpath.
  • Sorry for the delay in the implementation! The move to Mitsuba 3 meant I had to rewrite a lot of the code, but I managed to be significantly more thorough this time around so everything compiles and all the tests pass.
  • I've attached an example render as well!
    [example render: homogeneous-emissive-cuda]
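
As a rough illustration of the three event-probability modes mentioned in the notes above (plain numpy with hypothetical names, not the PR's actual code): whichever scalar probability p is used to choose a medium event, weighting the per-channel contribution by (sigma_x / sigma_t) / p keeps the estimator unbiased.

```python
import numpy as np

def event_probability(sigma_x, sigma_t, mode="max"):
    """Scalar probability of choosing a given medium event (scatter / emit / null)."""
    ratio = sigma_x / sigma_t                  # per-channel (e.g. RGB) ratios
    if mode == "analogue":
        return float(ratio[0])                 # e.g. follow only the hero/sampled channel
    if mode == "max":
        return float(np.max(ratio))
    if mode == "mean":
        return float(np.mean(ratio))
    raise ValueError(mode)

sigma_s = np.array([0.5, 0.0, 0.5])            # zero scattering in the green channel
sigma_t = np.array([1.0, 1.0, 1.0])
for mode in ("analogue", "max", "mean"):
    p = event_probability(sigma_s, sigma_t, mode)
    weight = (sigma_s / sigma_t) / p if p > 0 else np.zeros(3)
    print(mode, p, weight)
```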

@Microno95 (Contributor, Author) commented Apr 19, 2023

I've identified the issue with the failing tutorial notebook: it's specific to the prbvolpath integrator and my incorrect implementation of accumulating radiance. I'm not quite sure what the correct implementation is - my understanding is that the radiance accumulated from the medium should be treated similarly to intersections with emitters, but this seems to break the differentiability of the remaining medium parameters.

@njroussel (Member) commented:

Hi @Microno95

I quickly skimmed through this PR. It's looking good! I'll need to spend more time to carefully review it and think about some of the apparently necessary interface changes 😄

@Microno95 (Contributor, Author) commented:

No worries! I've made a list of the features needed for NEE sampling of volumes; hopefully it helps with deciding on the interface, and I'm very happy to discuss it in more detail as well 😊

Extensions to shape:

  • Sample a 3d point inside a shape
  • Check that a point is inside a shape
  • Calculate (or estimate) the volume of a shape

I think these should belong to the Shape class, like sample_position, surface_area and pdf_position, but generalised to the enclosed volume. The only issue is how to handle shapes that are not watertight, but those cause problems in other respects anyway, so the onus falls on the user.

The emitter would then call the appropriate method, forwarding the extra random sample as needed.
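
To make this concrete, here is a rough sketch of what the three extensions could look like for an analytic sphere (hypothetical names and plain numpy, not Mitsuba's actual Shape API):

```python
import numpy as np

class SphereShape:
    """Toy stand-in for a Shape with the three proposed volume-related methods."""
    def __init__(self, center, radius):
        self.center, self.radius = np.asarray(center, float), float(radius)

    def volume(self):                                  # analogue of surface_area()
        return 4.0 / 3.0 * np.pi * self.radius ** 3

    def contains(self, p):                             # "is this point inside the shape?"
        return np.linalg.norm(np.asarray(p) - self.center) <= self.radius

    def sample_position_volume(self, sample3):         # analogue of sample_position(),
        r = self.radius * sample3[0] ** (1.0 / 3.0)    # but consumes a 3D random sample
        z = 1.0 - 2.0 * sample3[1]
        phi = 2.0 * np.pi * sample3[2]
        s = np.sqrt(max(0.0, 1.0 - z * z))
        p = self.center + r * np.array([s * np.cos(phi), s * np.sin(phi), z])
        return p, 1.0 / self.volume()                  # sampled point and its volume pdf

shape = SphereShape([0.0, 0.0, 0.0], 1.0)
p, pdf = shape.sample_position_volume(np.random.default_rng(0).random(3))
assert shape.contains(p)
```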

@dvicini (Member) commented May 31, 2023

Nice work! Sorry I somehow missed this new PR.

Just a high level comment for now: This adds quite a lot of code complexity - would it make sense to have special integrators for emissive media? It's somewhat of a special case and the existing volpath integrators are already hard to parse (e.g., for someone coming from using NeRF...)

Also, would it make sense to move some of those "medium probabilities" functions to the medium class?

@Microno95 (Contributor, Author) commented:

I have significantly simplified the sampling of emission and reduced the complexity of the integrators in my latest code, which I will push soon.

Generally, I think it does make sense to move these to the Medium class, as the different medium event sampling modes are better suited to different media. E.g., if a medium has zero scattering in, say, the green channel, the analogue probabilities underestimate the number of scattering events when using hero wavelength sampling and lead to significant fireflies.

It turns out you can sample radiance at every medium interaction (at least when not using NEE for medium emitters - that's a separate case still to be worked on), and this has reduced the inclusion of emissive media effects to a single additional line in both the volpath and volpathmis integrators and only four additional lines in prbvolpath (a few more for the derivative tracking).
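
For illustration only (plain numpy, not the integrator code): a tiny sketch of why gathering the emitted radiance at each sampled medium interaction is enough, for a purely absorbing, homogeneous, emissive slab, assuming emission proportional to the absorption coefficient.

```python
import numpy as np
rng = np.random.default_rng(0)

sigma_t, Le, depth, n = 1.0, 2.0, 0.75, 200_000   # sigma_a == sigma_t, no scattering
t = -np.log(1.0 - rng.random(n)) / sigma_t        # free-flight distance sampling
inside = t < depth                                # interaction lands inside the slab
L_mc = np.where(inside, Le, 0.0).mean()           # add the emitted radiance at each interaction
L_ref = Le * (1.0 - np.exp(-sigma_t * depth))     # analytic radiance leaving the slab
print(L_mc, L_ref)                                # the two agree up to Monte Carlo noise
```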

I have held off on pushing any updates while waiting to hear how I should go about updating the endpoint/emitter interface to support a third random sample.

@dvicini (Member) commented May 31, 2023

I see, thanks for clarifying.

I don't think adding an extra sample to the endpoint interface is a big deal.

How do you deal with sampling the volume of complex shapes? Do you simply sample the bounding box and fail the sample if it falls outside the shape? I think currently it's still tricky to use an arbitrary number of random samples within a virtual function call (e.g., to do rejection sampling)

@Microno95 (Contributor, Author) commented:

Does it make sense to add the third volume sample by changing the Point2f sample to a Point3f sample? That seems like the least cluttered way and the most logical extension, but I'm happy to make it an additional parameter in the interface instead.

I have not yet decided on the best way to sample triangle-mesh-based shapes. My two ideas were to either sample within the bounding box and fail the sample if it falls outside, as you said, or to build a discrete distribution using a simple tetrahedral decomposition of the mesh.

The upside of using a tetrahedral representation (or really any discretisation of the volume) is that it'll be faster to sample and easier to check if a sample falls inside/outside the shape.

The alternative is either iterating over every triangle to calculate a winding number, or using ray intersections (counting the number of crossings). The issue with ray intersections is choosing a direction for the test, which I've found can be problematic depending on the mesh, while the winding-number calculation scales linearly with the number of triangles.
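
For the first option, here is a small sketch of the single-try bounding-box approach (plain numpy, with an analytic inside test standing in for the mesh test discussed below): a sample that lands outside the shape simply "fails" and contributes zero, which stays unbiased without needing an unbounded number of random samples inside the virtual function call.

```python
import numpy as np

bbox_min = np.array([-1.0, -1.0, -1.0])
bbox_max = np.array([ 1.0,  1.0,  1.0])
inside = lambda p: np.linalg.norm(p) <= 1.0        # unit sphere standing in for the shape

def sample_interior_single_try(sample3):
    """Sample uniformly inside the bounding box; the pdf is w.r.t. the bbox volume."""
    p = bbox_min + sample3 * (bbox_max - bbox_min)
    pdf = 1.0 / np.prod(bbox_max - bbox_min)
    return (p, pdf) if inside(p) else (None, pdf)  # None = failed sample, contributes zero

rng = np.random.default_rng(0)
ok = sum(sample_interior_single_try(rng.random(3))[0] is not None for _ in range(10_000))
print(ok / 10_000, 4.0 / 3.0 * np.pi / 8.0)        # acceptance rate ≈ volume ratio ≈ 0.52
```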

@dvicini (Member) commented Jun 2, 2023

I think changing the sample to be Point3f isn't a big deal, better than adding another parameter.

I don't think adding a tetrahedral representation makes sense (from an engineering point of view). As far as I know - correct me on that - computing a tetrahedralization is far from trivial and not something that can easily be implemented in a standalone helper function.

I would suggest checking whether a point is inside the mesh by simply tracing a ray in a fixed (or randomized) direction and checking whether the triangle we hit faces away from the ray direction. This should work for watertight meshes, which is in any case the assumption for any of the volumetric integrators to work correctly.
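
A minimal, self-contained sketch of that test (plain numpy, not Mitsuba code): trace a ray in a fixed direction and declare the point inside if the nearest hit triangle is back-facing, i.e. its geometric normal points away from the ray direction. This assumes a watertight mesh with consistently outward-facing winding.

```python
import numpy as np

def ray_triangle(o, d, v0, v1, v2, eps=1e-9):
    """Moller-Trumbore intersection; returns the hit distance t or None."""
    e1, e2 = v1 - v0, v2 - v0
    p = np.cross(d, e2)
    det = np.dot(e1, p)
    if abs(det) < eps:                       # ray parallel to the triangle plane
        return None
    inv = 1.0 / det
    s = o - v0
    u = np.dot(s, p) * inv
    if u < 0.0 or u > 1.0:
        return None
    q = np.cross(s, e1)
    v = np.dot(d, q) * inv
    if v < 0.0 or u + v > 1.0:
        return None
    t = np.dot(e2, q) * inv
    return t if t > eps else None

def inside_watertight_mesh(p, faces, d=np.array([0.577, 0.577, 0.577])):
    """p is inside iff the closest hit triangle faces away from the ray direction d."""
    best_t, best_n = np.inf, None
    for v0, v1, v2 in faces:
        t = ray_triangle(p, d, v0, v1, v2)
        if t is not None and t < best_t:
            best_t, best_n = t, np.cross(v1 - v0, v2 - v0)   # geometric normal
    return best_n is not None and np.dot(best_n, d) > 0.0

# Tiny test on a unit tetrahedron (counter-clockwise winding, outward normals)
V = [np.array(v, float) for v in [(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)]]
faces = [(V[0], V[2], V[1]), (V[0], V[1], V[3]), (V[0], V[3], V[2]), (V[1], V[2], V[3])]
print(inside_watertight_mesh(np.array([0.1, 0.1, 0.1]), faces))  # True
print(inside_watertight_mesh(np.array([1.0, 1.0, 1.0]), faces))  # False
```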

@Microno95 (Contributor, Author) commented:

That makes sense! I will work on those and push an update asap.

@Microno95 (Contributor, Author) commented Jun 13, 2023

Hey @dvicini, I am in the process of implementing the interface for next event estimation of volume emitters, but I'm struggling with estimating the PDF of a direction sampled by the emitter, specifically when implementing the sample_direction and pdf_direction methods.

The issue is that while I know the PDF of sampling a given point in the medium, the PDF of a direction is with respect to solid angle whereas the PDF of sampling a point is with respect to volume, and I'm not sure how to make the two commensurate.

I know the following relation should hold

$$ \int_\Omega p(\vec{\omega}) \mathrm{d\Omega} = \int_V p(x) \mathrm{dV} $$

i.e. the integral of the solid-angle PDF over all directions should equal the integral of the volume PDF over the whole volume (and both should equal 1 for a normalised PDF). Knowing that, I can rewrite the equation as follows.

$$ \int_\Omega p(\vec{\omega}) \mathrm{d\Omega} = \int_\Omega\int_r p(x) r^2\mathrm{dr}\mathrm{d\Omega} $$

And (this is the part I'm unsure about), this should imply that

$$ p(\vec{\omega}) = \int_r p(x)\, r^2\,\mathrm{dr} $$

If all that is correct, then the PDF of sampling a given direction should be straightforward to compute from the line segments of the ray that pass through the medium, which are easily found via ray intersections. I'm not sure how this should be treated for medium vs. BSDF interactions given that $\mathrm{d\Omega}$ has two different domains, how to handle the case where a given point is inside the medium, or how to handle refractive interfaces (do they matter?).

@dvicini (Member) commented Jun 13, 2023

The Jacobian to change between solid angle and volume sampling is indeed $r^2$. This is exactly the geometry term in a volume.

I am not sure I fully understand your question. If this is primarily about MIS, you just want to make sure that both PDFs are expressed in the same measure.

The conceptually simplest way is to integrate the volume PDF over the ray segment (e.g., using ray marching), but that is costly and potentially inaccurate. It's good to keep in mind that MIS weights don't have to use exact PDFs; they just need to be computed consistently to ensure unbiasedness.

Another option could be to compute the MIS in the volume measure; I think that would forego the need to integrate over segments (potentially at an increase in noise? but with a cheaper per-sample cost). In that case, you would just have to convert the solid-angle PDF to a volume PDF using the geometry term.
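
A rough sketch of that second option (a balance heuristic is assumed here; this is not the PR's code). The NEE strategy's PDF is already per unit volume; the unidirectional strategy's density at the same point is its solid-angle PDF times whatever distance-sampling PDF it uses along the ray, divided by the geometry term $r^2$:

```python
def balance_heuristic(pdf_a, pdf_b):
    return pdf_a / (pdf_a + pdf_b)

def mis_weight_volume_measure(pdf_nee_volume, pdf_dir_solid_angle, pdf_dist, r):
    # dV = r^2 dr dOmega, so a (direction, distance) sampling strategy has
    # density p_V(x) = p(omega) * p(r) / r^2 when expressed per unit volume.
    pdf_uni_volume = pdf_dir_solid_angle * pdf_dist / (r * r)
    return balance_heuristic(pdf_nee_volume, pdf_uni_volume)
```

Here pdf_dist is the density of whatever distance sampling the unidirectional strategy performs along the sampled direction; both PDFs then share the same (volume) measure, which is all the MIS weight needs.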

@Microno95 (Contributor, Author) commented:

Right, I think I understand better now. I was thinking I'd need to convert the volume PDF to a solid angle PDF in order to do MIS with the volume emitter, but you're saying I can do it the other way around (which makes perfect sense).

In terms of converting the solid angle PDF to a volume PDF, would it be correct to divide $p(\vec{\omega})$ by $r^2$ or does it require something else? i.e. the MIS is computed via $\mathrm{MIS}(r^{-2}p(\vec\omega), p(\vec x))$

@dvicini (Member) commented Jun 13, 2023

Yes, this sounds good to me.

You will need to make sure that the MIS is consistent with the evaluation of emission after direction sampling. I.e., the simplest method would again be to evaluate the emission along the entire ray segment.

@Microno95 (Contributor, Author) commented:

It seems that simply converting the solid-angle PDF to a volume PDF does not work, or perhaps I'm doing something wrong. I've verified that integrating the volume PDF over the ray length works, but it is much slower (around 20x-40x, due to the extra ray casts) and raises OptiX warnings in CUDA modes.

At first I thought it would be a simple change of variables using the differential element $r^{-2}\,\mathrm{dV}/\mathrm{d\Omega}$, but this gives very different results compared to integrating over the ray length and is inconsistent with the previous results obtained without NEE.

Is it perhaps to do with the fact that $r^{-2}p(\vec{\omega})dV$ is a non-normalisable PDF? i.e. the following integral does not converge

$$ \int_V r^{-2} p(\vec{\omega}) \,\mathrm{dV} = \int_V \left(r^{-2}p(\vec{\omega})\right) r^2 \sin\theta \,\mathrm{dr}\,\mathrm{d\theta}\,\mathrm{d\phi} = \int_V p(\vec{\omega}) \sin\theta \,\mathrm{dr}\,\mathrm{d\theta}\,\mathrm{d\phi} $$
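
One way to read this (an observation from the standard change of measure, not a verified fix): the unidirectional strategy samples a direction and then a distance along it, so its density expressed per unit volume is

$$ p_V(\vec{x}) = \frac{p(\vec{\omega})\, p(r \mid \vec{\omega})}{r^2} $$

and it is the distance-sampling factor $p(r \mid \vec{\omega})$ that makes the integral over the volume equal to 1; dividing $p(\vec{\omega})$ by $r^2$ alone, as in the integral above, does not by itself yield a normalised density.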

@Microno95 force-pushed the feature/emissive-media branch 2 times, most recently from a9ebdb5 to a424aed on June 18, 2023 23:06
@Microno95 (Contributor, Author) commented Jun 18, 2023

The current implementation correctly tracks the ray entry and exit points in a given shape and calculates the exact directional PDF. In llvm_* modes this is quite slow compared to not using NEE (around 5x), while in cuda_* modes the gap is much smaller (around 1.5x slower), with a noticeable reduction in noise for the same sample count.
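
As a concrete example of the segment-based directional PDF described above, here is a plain-numpy sketch under the assumption of a uniformly emitting volume, i.e. $p(x) = 1/V$ inside the shape: the PDF of a direction is the sum over ray segments of $(t_\mathrm{out}^3 - t_\mathrm{in}^3)/(3V)$, and for a sphere seen from outside it integrates to 1 over all directions, which the Monte Carlo check below confirms.

```python
import numpy as np
rng = np.random.default_rng(0)

R, c, x_ref = 1.0, np.zeros(3), np.array([0.0, 0.0, 3.0])
V = 4.0 / 3.0 * np.pi * R ** 3

def directional_pdf(omega):
    # Intersect the ray x_ref + t * omega with the sphere |x - c| = R
    oc = x_ref - c
    b = np.dot(oc, omega)
    disc = b * b - (np.dot(oc, oc) - R * R)
    if disc <= 0.0:
        return 0.0
    t_in = max(-b - np.sqrt(disc), 0.0)            # clip the part of the segment behind x_ref
    t_out = -b + np.sqrt(disc)
    if t_out <= 0.0:
        return 0.0
    return (t_out ** 3 - t_in ** 3) / (3.0 * V)    # integral of (1/V) t^2 dt over the segment

# The directional PDF should integrate to 1 over the full sphere of directions
d = rng.normal(size=(100_000, 3))
d /= np.linalg.norm(d, axis=1, keepdims=True)
print(np.mean([directional_pdf(w) for w in d]) * 4.0 * np.pi)  # ≈ 1.0
```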

# ~~~~ Was wrong about this ~~~~ #
There is one big issue: in llvm_* modes, the ray entry and exit points lead to different results compared to cuda_* modes, most likely due to the different handling of ray intersections by Embree vs. OptiX, so the current implementation only gives correct results for cuda_* modes.

I am working on improving and fixing the implementation, but I would welcome a second set of eyes over the code to see if there's anything I'm missing in how I'm accumulating the pdf.
# ~~~~ ~~~~~~~~ ~~~~ #

Okay, never mind - it seems that both are incorrect by the same amount, for an as-yet unknown reason. With OptiX I was getting the correct results in that case due to a bug, so I'll investigate down that path.

@Microno95 (Contributor, Author) commented:

Looking further into it, cuda_* modes gave the correct results only because the ray intersection routines were effectively disabled and returned a PDF of 0 - this was due to how I initialized the nested volume scene that allows me to do the ray intersections. In other words, it gave the correct result only because it effectively disabled NEE.

I've attached two renders, with and without NEE (512x512 @ 4096spp), that show the discrepancy between the NEE and non-NEE results. The scene contains a sphere of radius 1 at the origin filled with a purely absorbing medium with a radiance of 1 and an extinction coefficient of 1.

The discrepancy definitely has to do with how the PDF is computed, but I'm not sure what's wrong, as I've verified the calculations are correct by computing the analytic solution to the directional PDF and comparing it to the ray-traced solution for the sphere shape, as per our earlier discussion. From the emission gathered at the floor and back wall, it looks like the NEE version underestimates the illumination from the medium - perhaps there is a term that still has to be accounted for.

Without NEE:
[render: emitting_sphere_REF]

With NEE:
[render: emitting_sphere_NEE]

I've also attached the .exr files themselves for better numerical comparison in TEV.
[attachment: emitting_sphere_renders.zip]

@dvicini (Member) commented Jun 20, 2023

Hmm, really hard to say what's wrong just from these two images. I would say that for a homogeneous emissive sphere volume it should really be possible to get a perfect match, as it has none of the complexity of general shapes or of delta tracking for heterogeneous media. My only suggestion would be to reduce the complexity further: 1) only render direct illumination, with no scattering, and 2) completely disable absorption. I.e., can you match the total emission from an emissive sphere volume that does not absorb any light?

@Microno95 (Contributor, Author) commented Jun 20, 2023

With a homogeneous sphere where $\sigma_t=\sigma_s=0$ and $\sigma_n=1$ (in order to gather emission from the medium), the renders still diverge for any max ray depth $>1$. For a max ray depth of 1, i.e. purely illumination from emitter intersections, the two renders agree perfectly. For a max ray depth of 2 or greater, they diverge, with the NEE renders looking dimmer in the same way as before.

In the literature, the closest I've seen to this implementation of the solid-angle PDF is Equation 15 of Line Integration for Rendering Heterogeneous Emissive Volumes, where they estimate the solid-angle PDF by integrating the volume PDF along the ray direction with the appropriate Jacobian.

Perhaps there is an additional consideration in how NEE estimates of illumination are combined with unidirectional estimates for volume emitters.

I should add that a purely emissive medium is not possible with the current homogeneous medium handling in volpath/volpathmis, due to the way distance sampling in the medium is computed and how homogeneous media are optimised in those code paths. I.e., to gather emission from a homogeneous emissive medium with no absorption, there is no way to sample free-flight distances and gather samples at different points along the ray, since the ray immediately exits the medium. I therefore produce these renders using a heterogeneous medium with $\sigma_{[t,s]}=0$ and $\sigma_n=1$.
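
For reference (this is a general property of the null-scattering formulation rather than anything specific to this PR): a homogeneous medium samples free-flight distances from the extinction coefficient, whereas the null-scattering path samples against the majorant,

$$ p(t) = \sigma_t\, e^{-\sigma_t t} \qquad \text{vs.} \qquad p(t) = \bar{\sigma}\, e^{-\bar{\sigma} t}, \quad \bar{\sigma} = \sigma_t + \sigma_n $$

so with $\sigma_t = 0$ the former never produces an interaction inside the medium, while setting $\sigma_n = 1$ still yields (null) collisions at which emission can be gathered.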

@dvicini (Member) commented Jun 21, 2023

I see, yes, it makes sense that for this setting you need a non-zero null density to even sample the emission contributions. I would investigate this case further; it seems like a good minimal test case to debug.

@Microno95 (Contributor, Author) commented:

I've looked further into it, and I think there's actually a bug in how volpath handles NEE. I've created a case where a surface emitter with a diffuse reflectance of 0 gives incorrect results with volpath when emitter sampling is enabled, regardless of whether or not there are media in the scene at all.

Here's the reference render using the Path integrator:
[render: ref_path_render]

And here's the render from VolPath:
[render: incorrect_volpath_render]

I've also included the renders from volpathmis and prbvolpath.

volpathmis:
[render: incorrect_volpathmis_render]

prbvolpath:
[render: incorrect_prbvolpath_render]

I've tested it and the issue exists on the master branch, so I am sure it is unrelated to my changes. I've attached the scene here:
[attachment: minimal_example.zip]

@dvicini (Member) commented Jun 22, 2023

Oh, that's a good catch. Not sure what's wrong here, but maybe something related to the ShadowEpsilon or similar. I will take a look; it would be good to fix.

@Microno95 (Contributor, Author) commented:

I had a chance to take a look while wrapping my head around the sample_emitter implementation and, as a byproduct, figured out the issue. Essentially, sample_emitter generates a path with a null vertex on the surface of the emitter before returning; in the case of a surface emitter, this is easily alleviated by avoiding the generation of a null vertex whenever the remaining distance after a surface intersection falls below the ShadowEpsilon threshold, as in the following snippet:

```cpp
active_surface &= si.is_valid() && active && !active_medium;
if (dr::any_or<true>(active_surface)) {
    auto bsdf = si.bsdf(ray);
    Spectrum bsdf_val = bsdf->eval_null_transmission(si, active_surface);
    bsdf_val = si.to_world_mueller(bsdf_val, si.wi, si.wi);
    dr::masked(transmittance, active_surface) *= bsdf_val;
}

remaining_dist = ds.dist * (1.f - math::ShadowEpsilon<Float>) - total_dist;
ray.maxt = remaining_dist;
active &= (remaining_dist > math::ShadowEpsilon<Float>);
if (dr::none_or<false>(active))
    break;
```

@dvicini (Member) commented Jun 22, 2023

I think the problem is a bit more subtle. By adding the epsilon to the check for the remaining distance, you are essentially doubling the shadow epsilon; the epsilon is already included in the computation of remaining_dist. The better solution is to replace the spawn_ray function call with a call to spawn_ray_to, which has a better epsilon calculation. Then we can also replace the use of ds.dist with the maxt set within spawn_ray_to. This is more consistent with what the path integrator does. I will make a PR.

Commits (messages truncated):

  • …d sampling routines and added appropriate tests to check the uniformity of the 3d sampling.
  • … volume sampling. Modified every dependency accordingly and added a flag to signal when the third sample is used. Updated all tests to correctly account for the use of Point3f over Point2f.
  • …`cuda_*` modes and removed TODO as it's technically implemented
  • …ed flags to volpath for disentangling emitter and unidirectional sampling
  • …g with analytic sphere shape, needs mesh shape to be fixed
  • …phere and cylinder, and rejection-based sampling to mesh-based shapes
  • …ters and added appropriate python interfaces/documentation. TODO: Revert back to mitsuba3/mitsuba-data when merged
  • … differentiation to prbvolpath.py - probably needs further testing. Optimized volpath.cpp for better ray coherence by looping through the medium traversal for all lanes.
  • …ium looping as it is slower to render in many cases.