Choice of texture interpolation method #618

Closed
Dragoo417 opened this issue Mar 29, 2021 · 9 comments
Labels: enhancement (New feature or request)


Dragoo417 commented Mar 29, 2021

Choice of texture interpolation method

Motivation

It would be useful to be able to choose the interpolation method used to sample the texture of a mesh. I would like the texture visible on the mesh to match the source texture as closely as possible, but that is not currently possible. I believe the sampling is fixed to bilinear interpolation for now, which yields blurry patches of color for low-resolution textures.

nikhilaravi self-assigned this Mar 29, 2021
nikhilaravi added the enhancement (New feature or request) label Mar 29, 2021
nikhilaravi (Contributor) commented:

@Dragoo417 You could actually implement this yourself. For example, if you are using the TexturesUV class, you could subclass it to create a custom class and override the sample_textures method to sample the textures in any way you like. The base implementation looks like this:

def sample_textures(self, fragments, **kwargs) -> torch.Tensor:
    """
    Interpolate a 2D texture map using uv vertex texture coordinates for each
    face in the mesh. First interpolate the vertex uvs using barycentric coordinates
    for each pixel in the rasterized output. Then interpolate the texture map
    using the uv coordinate for each pixel.

    Args:
        fragments:
            The outputs of rasterization. From this we use
            - pix_to_face: LongTensor of shape (N, H, W, K) specifying the indices
              of the faces (in the packed representation) which overlap each
              pixel in the image.
            - barycentric_coords: FloatTensor of shape (N, H, W, K, 3) specifying
              the barycentric coordinates of each pixel relative to the faces
              (in the packed representation) which overlap the pixel.

    Returns:
        texels: tensor of shape (N, H, W, K, C) giving the interpolated
        texture for each pixel in the rasterized image.
    """
    if self.isempty():
        faces_verts_uvs = torch.zeros(
            (self._N, 3, 2), dtype=torch.float32, device=self.device
        )
    else:
        packing_list = [
            i[j] for i, j in zip(self.verts_uvs_list(), self.faces_uvs_list())
        ]
        faces_verts_uvs = torch.cat(packing_list)
    texture_maps = self.maps_padded()

    # pixel_uvs: (N, H, W, K, 2)
    pixel_uvs = interpolate_face_attributes(
        fragments.pix_to_face, fragments.bary_coords, faces_verts_uvs
    )

    N, H_out, W_out, K = fragments.pix_to_face.shape
    N, H_in, W_in, C = texture_maps.shape  # 3 for RGB

    # pixel_uvs: (N, H, W, K, 2) -> (N, K, H, W, 2) -> (NK, H, W, 2)
    pixel_uvs = pixel_uvs.permute(0, 3, 1, 2, 4).reshape(N * K, H_out, W_out, 2)

    # textures.map:
    #   (N, H, W, C) -> (N, C, H, W) -> (1, N, C, H, W)
    #   -> expand (K, N, C, H, W) -> reshape (N*K, C, H, W)
    texture_maps = (
        texture_maps.permute(0, 3, 1, 2)[None, ...]
        .expand(K, -1, -1, -1, -1)
        .transpose(0, 1)
        .reshape(N * K, C, H_in, W_in)
    )

    # Textures: (N*K, C, H, W), pixel_uvs: (N*K, H, W, 2)
    # Now need to format the pixel uvs and the texture map correctly!
    # From the pytorch docs, grid_sample takes `grid` and `input`:
    #   grid specifies the sampling pixel locations normalized by
    #   the input spatial dimensions. It should have most values
    #   in the range of [-1, 1]. Values x = -1, y = -1 is the
    #   left-top pixel of input, and values x = 1, y = 1 is the
    #   right-bottom pixel of input.
    pixel_uvs = pixel_uvs * 2.0 - 1.0

    texture_maps = torch.flip(texture_maps, [2])  # flip y axis of the texture map
    if texture_maps.device != pixel_uvs.device:
        texture_maps = texture_maps.to(pixel_uvs.device)
    texels = F.grid_sample(
        texture_maps,
        pixel_uvs,
        align_corners=self.align_corners,
        padding_mode=self.padding_mode,
    )
    # texels now has shape (NK, C, H_out, W_out)
    texels = texels.reshape(N, K, C, H_out, W_out).permute(0, 3, 4, 1, 2)
    return texels
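
For example, a minimal sketch (not from the thread) of a subclass that swaps bilinear filtering for nearest-neighbour sampling, which is what the original question asks for, could look like the following. The class name NearestTexturesUV is only illustrative, and the body mirrors the method above except for the mode argument passed to grid_sample:

import torch
import torch.nn.functional as F
from pytorch3d.ops import interpolate_face_attributes
from pytorch3d.renderer import TexturesUV

class NearestTexturesUV(TexturesUV):
    """Samples the texture map with nearest-neighbour lookups instead of bilinear."""

    def sample_textures(self, fragments, **kwargs) -> torch.Tensor:
        if self.isempty():
            faces_verts_uvs = torch.zeros(
                (self._N, 3, 2), dtype=torch.float32, device=self.device
            )
        else:
            packing_list = [
                i[j] for i, j in zip(self.verts_uvs_list(), self.faces_uvs_list())
            ]
            faces_verts_uvs = torch.cat(packing_list)
        texture_maps = self.maps_padded()

        # Interpolate the per-vertex uvs with barycentric coords, as in TexturesUV.
        pixel_uvs = interpolate_face_attributes(
            fragments.pix_to_face, fragments.bary_coords, faces_verts_uvs
        )

        N, H_out, W_out, K = fragments.pix_to_face.shape
        N, H_in, W_in, C = texture_maps.shape

        pixel_uvs = pixel_uvs.permute(0, 3, 1, 2, 4).reshape(N * K, H_out, W_out, 2)
        texture_maps = (
            texture_maps.permute(0, 3, 1, 2)[None, ...]
            .expand(K, -1, -1, -1, -1)
            .transpose(0, 1)
            .reshape(N * K, C, H_in, W_in)
        )
        pixel_uvs = pixel_uvs * 2.0 - 1.0
        texture_maps = torch.flip(texture_maps, [2]).to(pixel_uvs.device)

        # The only change relative to the base class: mode="nearest" turns off
        # bilinear filtering, so each pixel takes the colour of the closest texel.
        texels = F.grid_sample(
            texture_maps,
            pixel_uvs,
            mode="nearest",
            align_corners=self.align_corners,
            padding_mode=self.padding_mode,
        )
        return texels.reshape(N, K, C, H_out, W_out).permute(0, 3, 4, 1, 2)

With mode="nearest", a low-resolution texture shows hard texel boundaries instead of the blurry bilinear blend.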

Dragoo417 (Author) commented:

Oh, I didn't think of that. Thank you for pointing it out!

Dragoo417 (Author) commented May 3, 2021

I finally got around to trying this and encountered a problem: I subclassed TexturesUV with MyTexturesUV and overrode the sample_textures() method. I then used an instance of MyTexturesUV instead of TexturesUV to instantiate a Meshes object (as the textures parameter in the constructor), but when rendering, sample_textures() from MyTexturesUV does not seem to get called at all. Any hint on what might be wrong?

bottler (Contributor) commented May 3, 2021

Can you paste the relevant parts of your code?

bottler reopened this May 3, 2021
Dragoo417 (Author) commented May 3, 2021

My custom class:

class NoInterpolationTexturesUV(TexturesUV):
    def sample_textures(self, fragments, **kwargs) -> torch.Tensor:
        print("hi")
        # same code as in base class

And I use it as such:

tex = NoInterpolationTexturesUV(
    verts_uvs=[verts_uvs],    # Comes from loading an .obj file
    faces_uvs=[faces_uvs],   # Comes from loading an .obj file
    maps=self._adversarial_texture_tensor
)
myMesh = Meshes(
    verts=[verts.to(self._device)], # Comes from loading an .obj file
    faces=[faces.verts_idx.to(self._device)], # Comes from loading an .obj file
    textures=tex
)

Thank you for your swift reply. Let me know if anything more is needed.

bottler (Contributor) commented May 3, 2021

That should work. If you render myMesh you should see "hi".

If you ever call extend(), it won't work (this can be fixed). I can't think of any other reason it wouldn't work.
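
For reference, a minimal rendering sketch that would exercise the overridden method might look like this; the camera, lights, and image size here are placeholder assumptions, not values from the thread:

from pytorch3d.renderer import (
    FoVPerspectiveCameras,
    MeshRasterizer,
    MeshRenderer,
    PointLights,
    RasterizationSettings,
    SoftPhongShader,
)

device = "cuda"  # placeholder; use the same device as the mesh
cameras = FoVPerspectiveCameras(device=device)
raster_settings = RasterizationSettings(image_size=256)
renderer = MeshRenderer(
    rasterizer=MeshRasterizer(cameras=cameras, raster_settings=raster_settings),
    shader=SoftPhongShader(device=device, cameras=cameras, lights=PointLights(device=device)),
)

# Rendering samples the mesh textures internally, so the overridden
# sample_textures should print "hi" here.
images = renderer(myMesh)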

Dragoo417 (Author) commented May 3, 2021

I do actually call extend() later in the pipeline, but I didn't think it could affect this. What would be the way to fix it in this case?

Edit: I looked at the code and it looks like I would have to override the extend() method in my custom textures class.

bottler (Contributor) commented May 3, 2021

The easiest thing for now would be to define extend in your own class:

class NoInterpolationTexturesUV(TexturesUV):
    def sample_textures(self, fragments, **kwargs) -> torch.Tensor:
        ...  # your overridden sampling, as above

    def extend(self, N: int):
        new_props = self._extend(
            N,
            [
                "maps_padded",
                "verts_uvs_padded",
                "faces_uvs_padded",
                "_num_faces_per_mesh",
            ],
        )
        new_tex = NoInterpolationTexturesUV(
            maps=new_props["maps_padded"],
            faces_uvs=new_props["faces_uvs_padded"],
            verts_uvs=new_props["verts_uvs_padded"],
            padding_mode=self.padding_mode,
            align_corners=self.align_corners,
        )
        new_tex._num_faces_per_mesh = new_props["_num_faces_per_mesh"]
        return new_tex

This is the same as the extend in TexturesUV, except that TexturesUV( is replaced with NoInterpolationTexturesUV(. Ideally we would use something like self.__class__( in TexturesUV to avoid this problem.
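
For illustration, a sketch of what that self.__class__ change inside TexturesUV would look like (this is essentially what the commit referenced at the end of the thread implements, not the actual diff):

# Inside TexturesUV (sketch of the suggested change):
def extend(self, N: int):
    new_props = self._extend(
        N,
        [
            "maps_padded",
            "verts_uvs_padded",
            "faces_uvs_padded",
            "_num_faces_per_mesh",
        ],
    )
    # Using self.__class__ means a subclass such as NoInterpolationTexturesUV
    # gets an instance of itself back from extend(), without overriding it.
    new_tex = self.__class__(
        maps=new_props["maps_padded"],
        faces_uvs=new_props["faces_uvs_padded"],
        verts_uvs=new_props["verts_uvs_padded"],
        padding_mode=self.padding_mode,
        align_corners=self.align_corners,
    )
    new_tex._num_faces_per_mesh = new_props["_num_faces_per_mesh"]
    return new_tex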

bottler self-assigned this May 3, 2021
Dragoo417 (Author) commented:
This works perfectly, thanks a lot for your time, and have a nice day!

bottler closed this as completed May 3, 2021
facebook-github-bot pushed a commit that referenced this issue May 7, 2021
Summary: 3 extend functions in textures.py updated to call `self.__class__` rather than creating a new object of a fixed type. As mentioned in #618.

Reviewed By: bottler

Differential Revision: D28281218

fbshipit-source-id: b9c99ab87e46a3f28c37efa1ee2c2dceb560b491

3 participants