Commit

Added leaky relu recipe.
vahidk committed Aug 30, 2017
1 parent d377c4f commit 1a669bb
Showing 1 changed file with 8 additions and 0 deletions.
README.md
@@ -21,6 +21,7 @@ Table of Contents
- [Entropy](#entropy)
- [KL-Divergence](#kld)
- [Make parallel](#make_parallel)
- [Leaky Relu](#leaky_relu)

---

@@ -1406,3 +1407,10 @@ def make_parallel(fn, num_gpus, **kwargs):

return tf.concat(out_split, axis=0)
```

## Leaky Relu <a name="leaky_relu"></a>
```python
def leaky_relu(tensor, alpha=0.1):
    """Computes the leaky rectified linear activation."""
    return tf.maximum(tensor, alpha * tensor)
```
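
Leaky ReLU computes `max(x, alpha * x)`, so negative inputs are scaled by `alpha` rather than clipped to zero, which keeps a small gradient flowing through inactive units. As a quick sanity check, here is a minimal usage sketch (not part of this commit), assuming the TensorFlow 1.x session API used elsewhere in this guide:

```python
import tensorflow as tf

x = tf.constant([-2.0, -1.0, 0.0, 1.0, 2.0])
y = leaky_relu(x, alpha=0.1)  # negative entries are scaled by 0.1

with tf.Session() as sess:
    print(sess.run(y))  # [-0.2 -0.1  0.   1.   2. ]
```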
