
Addition of LayerNorm layer #147

Merged · 6 commits · Jan 26, 2020
Conversation

guillaume-be (Contributor)
Addition of a layer norm layer with trainable parameters. It follows a similar design to the batch norm layer, and includes a base functionality test like the one for batch norm. Tested locally on a few examples: it normalizes the input as expected (the core logic is delegated to the Tensor `f_layer_norm` function).
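For reference, the computation that the PR delegates to `f_layer_norm` can be sketched in plain Rust (no tch dependency): normalize over the normalized dimension using the biased variance, then apply the trainable scale (gamma) and shift (beta). The function name and signature here are illustrative, not the PR's actual API.

```rust
// Illustrative sketch of the layer-norm computation: normalize the input
// to zero mean and unit variance, then apply trainable scale and shift.
fn layer_norm(input: &[f64], gamma: f64, beta: f64, eps: f64) -> Vec<f64> {
    let n = input.len() as f64;
    let mean = input.iter().sum::<f64>() / n;
    // Biased (population) variance, as in the usual layer-norm definition.
    let var = input.iter().map(|x| (x - mean).powi(2)).sum::<f64>() / n;
    let denom = (var + eps).sqrt();
    input
        .iter()
        .map(|x| gamma * (x - mean) / denom + beta)
        .collect()
}

fn main() {
    let out = layer_norm(&[1.0, 2.0, 3.0, 4.0], 1.0, 0.0, 1e-5);
    // With gamma = 1 and beta = 0, the output has (approximately)
    // zero mean and unit variance.
    let mean: f64 = out.iter().sum::<f64>() / out.len() as f64;
    println!("mean after layer norm = {:.6}", mean);
}
```

In the actual layer, gamma and beta would be per-element tensors registered with the variable store so they are trainable, mirroring the batch norm design mentioned above.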

@LaurentMazare (Owner)
Looks very nice, thanks. There are some formatting failures in Travis CI; could you try fixing these so we can merge the PR?

@guillaume-be (Contributor, Author)
Thank you. I have updated the code formatting, so Travis should now pass. The AppVeyor build still fails and I am not sure why.

@LaurentMazare (Owner)
Merged, thanks for the PR (the AppVeyor test is flaky for now).

@LaurentMazare LaurentMazare merged commit 364f4ee into LaurentMazare:master Jan 26, 2020