
v1.1.0rc3

Pre-release
@epwalsh released this on 12 Aug 20:18 · 140 commits to master since this release

Changes since v1.1.0rc2

Fixed

  • Fixed how truncation is handled by PretrainedTransformerTokenizer.
    Previously, if max_length was set to None, the tokenizer would still truncate if the
    transformer model had a default max length in its config.
    Also, when max_length was set to a non-None value, certain transformer models would
    emit several warnings about the use of the truncation parameter.
    (See the first sketch after this list.)
  • Fixed evaluation of all metrics when using distributed training.
    (See the aggregation sketch after this list.)
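
A minimal sketch of the fixed truncation behaviour, assuming AllenNLP 1.1 and the
"bert-base-uncased" model (downloading it requires a network connection; the token
counts in the comments are illustrative):

    from allennlp.data.tokenizers import PretrainedTransformerTokenizer

    # max_length=None: the tokenizer no longer truncates, even though the
    # underlying transformer config defines a default maximum length.
    tokenizer = PretrainedTransformerTokenizer("bert-base-uncased", max_length=None)
    long_text = "word " * 1000
    print(len(tokenizer.tokenize(long_text)))  # well over 512: no truncation

    # An explicit max_length truncates to that length, without the spurious
    # truncation-parameter warnings some models produced before this fix.
    truncating = PretrainedTransformerTokenizer("bert-base-uncased", max_length=512)
    print(len(truncating.tokenize(long_text)))  # 512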

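The distributed-metrics fix follows the standard pattern of aggregating each
metric's local counts across workers before computing the final value. The sketch
below illustrates that pattern with plain torch.distributed calls; it is not the
library's exact implementation, and distributed_accuracy is a hypothetical helper:

    import torch
    import torch.distributed as dist

    def distributed_accuracy(correct_count: float, total_count: float) -> float:
        # Each worker holds only its local counts; summing them across the
        # process group makes every worker report the same global accuracy.
        # (Illustrative only; with the NCCL backend the tensor would need
        # to live on a CUDA device.)
        if dist.is_available() and dist.is_initialized():
            counts = torch.tensor([correct_count, total_count])
            dist.all_reduce(counts, op=dist.ReduceOp.SUM)
            correct_count, total_count = counts.tolist()
        return correct_count / total_count if total_count > 0 else 0.0
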
Commits

0ac13a4 fix CHANGELOG
3b86f58 Prepare for release v1.1.0rc3
44d2847 Metrics in distributed setting (#4525)
1d61965 Bump mkdocs-material from 5.5.3 to 5.5.5 (#4547)
5b97780 tick version for nightly releases
b32608e add gradient checkpointing for transformer token embedders (#4544)
f639336 Fix logger being created twice (#4538)
660fdaf Fix handling of max length with transformer tokenizers (#4534)
15e288f EpochCallBack for tracking epoch (#4540)
9209bc9 Bump mkdocs-material from 5.5.0 to 5.5.3 (#4533)
bfecdc3 Ensure len(self.evaluation_data_loader) is not called (#4531)
5bc3b73 Fix typo in warning in file_utils (#4527)
e80d768 pin torch >= 1.6