Commit

Revert "Update bahdanau-attention.md"
This reverts commit 7d7955f.
astonzhang committed Nov 11, 2022
1 parent d158a02 commit a7dbeeb
Showing 1 changed file with 1 addition and 1 deletion: chapter_attention-mechanisms/bahdanau-attention.md
```diff
@@ -370,7 +370,7 @@ output.shape, len(state), state[0].shape, len(state[1]), state[1][0].shape
 #@tab all
 embed_size, num_hiddens, num_layers, dropout = 32, 32, 2, 0.1
 batch_size, num_steps = 64, 10
-lr, num_epochs, device = 0.005, 200, d2l.try_gpu()
+lr, num_epochs, device = 0.005, 250, d2l.try_gpu()
 train_iter, src_vocab, tgt_vocab = d2l.load_data_nmt(batch_size, num_steps)
 encoder = d2l.Seq2SeqEncoder(
```
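For context, this hunk sits in the training-configuration block of bahdanau-attention.md; reading the single addition/deletion as the revert restoring `num_epochs` to 250, a minimal sketch of how that block is typically continued is shown below. It assumes the d2l PyTorch API (`d2l.load_data_nmt`, `d2l.Seq2SeqEncoder`, `d2l.EncoderDecoder`, `d2l.train_seq2seq`) and that `Seq2SeqAttentionDecoder` is the attention decoder defined earlier in that chapter; it is not part of this commit.

```python
# Sketch of the training setup around the changed line, assuming the d2l
# PyTorch API. Seq2SeqAttentionDecoder is assumed to be defined earlier in
# bahdanau-attention.md and is not shown in this diff.
from d2l import torch as d2l

# Hyperparameters as they stand after the revert (num_epochs back to 250).
embed_size, num_hiddens, num_layers, dropout = 32, 32, 2, 0.1
batch_size, num_steps = 64, 10
lr, num_epochs, device = 0.005, 250, d2l.try_gpu()

# Machine translation dataset used throughout the chapter.
train_iter, src_vocab, tgt_vocab = d2l.load_data_nmt(batch_size, num_steps)

# RNN encoder paired with the chapter's attention-based decoder.
encoder = d2l.Seq2SeqEncoder(
    len(src_vocab), embed_size, num_hiddens, num_layers, dropout)
decoder = Seq2SeqAttentionDecoder(  # defined earlier in the chapter (assumed)
    len(tgt_vocab), embed_size, num_hiddens, num_layers, dropout)
net = d2l.EncoderDecoder(encoder, decoder)

# Train; with the reverted value this runs for 250 epochs.
d2l.train_seq2seq(net, train_iter, lr, num_epochs, tgt_vocab, device)
```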
