
Small typo #5555

Merged
merged 2 commits into from
Feb 10, 2022
Merged
2 changes: 1 addition & 1 deletion allennlp/modules/stacked_alternating_lstm.py
@@ -16,7 +16,7 @@ class StackedAlternatingLstm(torch.nn.Module):
"""
A stacked LSTM with LSTM layers which alternate between going forwards over
the sequence and going backwards. This implementation is based on the
-    description in [Deep Semantic Role Labelling - What works and what's next][0].
+    description in [Deep Semantic Role Labeling - What works and what's next][0].

[0]: https://www.aclweb.org/anthology/P17-1044.pdf
[1]: https://arxiv.org/abs/1512.05287
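
For context on the class whose docstring is being fixed: StackedAlternatingLstm stacks unidirectional LSTM layers whose processing direction alternates layer by layer, following the SRL paper linked above. Below is a minimal sketch of that alternating-direction idea in plain PyTorch. The name AlternatingLstmSketch, its signature, and the tensor shapes are illustrative assumptions rather than AllenNLP's actual API, and the sketch sidesteps variable-length sequences by assuming unpadded inputs.

import torch

class AlternatingLstmSketch(torch.nn.Module):
    # Illustrative sketch, not AllenNLP's implementation: each layer is a
    # unidirectional torch.nn.LSTM, and odd-indexed layers read the sequence
    # in reverse, giving the forward/backward alternation described in the
    # docstring above.
    def __init__(self, input_size: int, hidden_size: int, num_layers: int) -> None:
        super().__init__()
        self.layers = torch.nn.ModuleList(
            torch.nn.LSTM(
                input_size if i == 0 else hidden_size,
                hidden_size,
                batch_first=True,
            )
            for i in range(num_layers)
        )

    def forward(self, inputs: torch.Tensor) -> torch.Tensor:
        # inputs: (batch, seq_len, input_size); assumes equal-length,
        # unpadded sequences for brevity.
        output = inputs
        for i, lstm in enumerate(self.layers):
            go_forward = i % 2 == 0
            if not go_forward:
                output = torch.flip(output, dims=[1])  # reverse the time axis
            output, _ = lstm(output)
            if not go_forward:
                output = torch.flip(output, dims=[1])  # restore original order
        return output

# Usage: four layers alternating forward/backward over a toy batch.
model = AlternatingLstmSketch(input_size=100, hidden_size=300, num_layers=4)
out = model(torch.randn(2, 10, 100))  # shape: (2, 10, 300)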