Issues: lucidrains/enformer-pytorch
Closed issues:

- #40: AssertionError: if using tf gamma, only sequence length of 1536 allowed for now (by jerome-f, closed Mar 30, 2024)
- #32: Hard-coded input sequence length to the transformer blocks when using use_tf_gamma = True (by zhhhhahahaha, closed Dec 15, 2023)
- #22: Initializing AttentionPool weights with 2 * Identity matrix, again! (by dohlee, closed Feb 12, 2023)
- #16: Fine-tuning without freezing transformer parameters leads to poor performance (by Zehui127, closed Feb 21, 2023)
- #15: Using EleutherAI/enformer-official-rough PyTorch implementation to just get the human output head (by aaronwtr, closed Dec 23, 2022)
- #6: Why do we need Residual here when we already have a residual connection inside the conv block? (by inspirit, closed Jun 12, 2022)
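Issues #40 and #32 both concern the same constraint: with use_tf_gamma = True, only one fixed sequence length is accepted, and other lengths trip the assertion quoted in the #40 title. A minimal framework-free sketch of that kind of guard follows; the constant and function names (TF_GAMMA_SEQ_LEN, check_seq_len) are illustrative assumptions, not the library's API.

```python
# Hypothetical sketch of the length guard behind the AssertionError in issue #40.
# Names are illustrative; only the error message is taken from the issue title.
TF_GAMMA_SEQ_LEN = 1536

def check_seq_len(seq_len: int, use_tf_gamma: bool) -> None:
    # The guard only applies when tf gamma is enabled, since those
    # positional values are precomputed for a single length.
    assert not use_tf_gamma or seq_len == TF_GAMMA_SEQ_LEN, \
        'if using tf gamma, only sequence length of 1536 allowed for now'

check_seq_len(1536, use_tf_gamma=True)   # passes
check_seq_len(2048, use_tf_gamma=False)  # passes: no constraint without tf gamma
```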
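Issue #6 asks why an outer Residual wrapper is needed when the conv block already carries its own internal skip connection. The pattern being asked about is a wrapper computing x + fn(x); the class below is a generic framework-free sketch of that pattern, not the library's actual torch module.

```python
# Generic sketch of a Residual wrapper: output = x + fn(x).
# Wrapping a block in Residual adds a second, outer skip connection
# on top of any skip the block may already apply internally.
class Residual:
    def __init__(self, fn):
        self.fn = fn

    def __call__(self, x):
        return x + self.fn(x)

double = Residual(lambda x: x)  # wrapping the identity yields x + x
```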