
Doc latex fixes #406

Merged: 3 commits merged into elixir-nx:main on Dec 6, 2022
Conversation

chadykamar (Contributor):

Spotted a couple of issues with some of the LaTeX docs:

- LaTeX formulas for group_norm and batch_norm weren't rendered correctly on hexdocs
@@ -410,7 +410,7 @@ defmodule Axon.Activations do
   @doc ~S"""
   Log-sigmoid activation.

-  $$f(x_i) = \log(\sigmoid(x))$$
+  $$f(x_i) = \log(\sigma(x))$$
Contributor:

I think the correct fix would be:

Suggested change:
-  $$f(x_i) = \log(\sigma(x))$$
+  $$f(x_i) = \log(sigmoid(x))$$

chadykamar (Contributor, author):

I can change it if you think that's clearer. The sigmoid function can be denoted as σ, so I assumed that was the intent here.

Contributor:

This module exposes a sigmoid function, so I think sigmoid is the clearer alternative with that in mind.

Contributor:

There's also Nx.sigmoid, which uses the full name as well.

chadykamar (Contributor, author):

Good point, I amended it.
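For context on the underlying error: \sigmoid is not a defined LaTeX (or KaTeX) command, which is why the math renderer on hexdocs failed on these formulas in the first place. The two notations discussed above compare as follows; the \operatorname variant is only an illustrative alternative for typesetting the full word upright, not something this PR adopts:

$$f(x_i) = \log(\sigma(x))$$
$$f(x_i) = \log(\operatorname{sigmoid}(x))$$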

@@ -599,7 +599,7 @@ defmodule Axon.Activations do
   @doc ~S"""
   Sigmoid weighted linear unit activation.

-  $$f(x_i) = x\sigmoid(x)$$
+  $$f(x_i) = x\sigma(x)$$
Contributor:

Suggested change:
-  $$f(x_i) = x\sigma(x)$$
+  $$f(x_i) = x\ sigmoid(x)$$
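For reference, the sigmoid in the SiLU formula is the standard logistic function; written out in full (the second equality is just the textbook definition, not part of the diff):

$$f(x_i) = x_i \cdot \sigma(x_i), \qquad \sigma(x) = \frac{1}{1 + e^{-x}}$$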

@@ -88,7 +88,7 @@ defmodule Axon.Schedules do
   @doc ~S"""
   Cosine decay schedule.

-  $$\gamma(t) = \gamma_0 * (1 - \alpha)*(\frac{1}{2}(1 + \cos{\pi \frac{t}{k}})) + \alpha$$
+  $$\gamma(t) = \gamma_0 * \left(\frac{1}{2}(1 - \alpha)(1 + \cos\pi \frac{t}{k}) + \alpha\right)$$
Contributor:

This one seems to correctly reflect the implemented code. @seanmor5, could you check that the code itself isn't what's wrong?

chadykamar (Contributor, author):

I compared with the paper as well as the Optax implementation and I think the code's correct.
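To make the comparison concrete, here is a minimal Elixir sketch of the decay curve the corrected formula describes. It is illustrative only, not the Axon.Schedules implementation, and the module and parameter names are made up for this example:

defmodule CosineDecaySketch do
  # gamma(t) = gamma0 * (1/2 * (1 - alpha) * (1 + cos(pi * t / k)) + alpha)
  # gamma0: initial rate, k: decay steps, alpha: floor as a fraction of gamma0
  def gamma(t, gamma0, k, alpha \\ 0.0) do
    gamma0 * (0.5 * (1 - alpha) * (1 + :math.cos(:math.pi() * t / k)) + alpha)
  end
end

# At t = 0 the rate is gamma0; at t = k it has decayed to gamma0 * alpha.
# CosineDecaySketch.gamma(0, 1.0e-2, 1000)    #=> 0.01
# CosineDecaySketch.gamma(1000, 1.0e-2, 1000) #=> ~0.0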

\sigmoid isn't a valid LaTeX symbol
Also fixed incorrect formatting which caused rendering to fail on hexdocs
seanmor5 (Contributor) commented Dec 6, 2022:

Thanks!

seanmor5 merged commit e0489d7 into elixir-nx:main on Dec 6, 2022