
Is there any way to analyze activations in flax? #4058

Open
mohamad-amin opened this issue Jul 5, 2024 · 1 comment
Labels
Priority: P2 - no schedule Best effort response and resolution. We have no plan to work on this at the moment.

Comments

@mohamad-amin

A common thing in deep learning research/engineering is to analyze the intermediate activations of a model. In PyTorch, this is fairly simple to do (though I think it can be even simpler):

import torch
import torch.nn as nn
import torch.nn.functional as F

class SimpleMLP(nn.Module):

    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(5, 10)
        self.fc2 = nn.Linear(10, 2)
        self.fc3 = nn.Linear(2, 1)

    def forward(self, x):
        x = F.relu(self.fc1(x))
        x = F.relu(self.fc2(x))
        x = self.fc3(x)
        return x

model, X = SimpleMLP(), torch.randn(3, 5)

# Re-run each layer by hand to inspect the intermediate activations
h1 = F.relu(model.fc1(X))
h2 = F.relu(model.fc2(h1))
h3 = model.fc3(h2)
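(For larger models, the standard PyTorch way to do this without re-running layers by hand is `register_forward_hook`. A minimal sketch against the `SimpleMLP` above; note the hooks record each `Linear`'s raw output, i.e. the pre-ReLU values, since the ReLUs are functional calls rather than submodules:)

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SimpleMLP(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(5, 10)
        self.fc2 = nn.Linear(10, 2)
        self.fc3 = nn.Linear(2, 1)

    def forward(self, x):
        x = F.relu(self.fc1(x))
        x = F.relu(self.fc2(x))
        return self.fc3(x)

acts = {}

def save_to(name):
    # register_forward_hook calls this with (module, inputs, output)
    def hook(module, inputs, output):
        acts[name] = output.detach()
    return hook

model = SimpleMLP()
for name, module in model.named_children():
    module.register_forward_hook(save_to(name))

_ = model(torch.randn(3, 5))
# acts now maps 'fc1'/'fc2'/'fc3' to pre-activation outputs
print({k: tuple(v.shape) for k, v in acts.items()})
```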

How should one implement this in flax? It's possible to write multiple apply functions, but that's really not something someone debugging a model wants to do.

@cgarciae
Collaborator

cgarciae commented Jul 5, 2024

Pass capture_intermediates=True to apply.

@cgarciae cgarciae added the Priority: P2 - no schedule Best effort response and resolution. We have no plan to work on this at the moment. label Jul 5, 2024