
Support for Custom models? #269

Open
anegi19 opened this issue Oct 27, 2022 · 1 comment
anegi19 commented Oct 27, 2022

Hi,
I can't find it in the package docs: how is extend(model) actually implemented, such that it computes the higher-order extensions for the model's parameters?
Basically, if I want to compute the generalised Gauss-Newton matrix with BackPACK for a model that doesn't use the built-in torch.nn layers, but rather a custom forward operation, is that possible?

For example -->

import torch
import torch.nn as nn

class MyModel(nn.Module):
    """Forward model of the experiment.

    Args
    ----
    X (Tensor), Y (Tensor): trainable parameters

    Returns
    -------
    Z : Tensor
    """

    def __init__(self, X, Y):
        super().__init__()
        self.X = nn.Parameter(X)
        self.Y = nn.Parameter(Y)

    def forward(self, X_0):
        # some operation on the parameters X and Y
        return (self.X - X_0) * self.Y

and then -->

from backpack import backpack, extend, extensions

x = torch.tensor([1., 2., 3.])
y = torch.tensor([11., 22., 33.])

x0 = torch.tensor([0.5])

inputs = torch.tensor([10., 18., 13.])

model = extend(MyModel(x, y))
cost_function = extend(torch.nn.MSELoss())

preds = model(x0)
cost = cost_function(preds, inputs)
with backpack(extensions.GGNMP()):
    cost.backward()

It doesn't work, and I get the following error -->

Extension saving to ggnmp does not have an extension for Module <class '__main__.MyModel'>

Should I modify extend(model) somehow?
Another question: does BackPACK support complex tensors when computing these higher-order extensions?
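For reference, here is what I expect the GGN to be for this toy model, computed by hand with plain torch.autograd (no BackPACK) as a sanity check: for MSE loss, the GGN is Jᵀ H_out J, where J is the Jacobian of the predictions w.r.t. the parameters and H_out = (2/N)·I is the Hessian of the loss w.r.t. the predictions.

```python
import torch

x = torch.tensor([1.0, 2.0, 3.0])
y = torch.tensor([11.0, 22.0, 33.0])
x0 = torch.tensor([0.5])


def predictions(X, Y):
    # forward model from above, with x0 fixed
    return (X - x0) * Y


# Jacobians of the predictions w.r.t. X and Y, each of shape (3, 3)
JX, JY = torch.autograd.functional.jacobian(predictions, (x, y))
J = torch.cat([JX, JY], dim=1)  # shape (3, 6): all 6 parameters

N = 3
H_out = (2.0 / N) * torch.eye(N)  # Hessian of MSELoss w.r.t. predictions
GGN = J.T @ H_out @ J             # shape (6, 6)
```

Since the model is elementwise, JX is diag(y) and JY is diag(x - x0), so the GGN is easy to check against whatever BackPACK would return.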

@f-dangel (Owner) commented

Hi, thanks for your question!

In principle, you can add support for your custom layer to BackPACK. We have an example in the documentation that walks you through the process.

Let me know if you run into issues.

Best,
Felix
