ADD BORT #9813

Merged
merged 15 commits on Jan 27, 2021
clean doc a bit
patrickvonplaten committed Jan 27, 2021
commit 0e1ea10a8819d1fd5ad6f3f148c0e6a1bc2ed693
14 changes: 8 additions & 6 deletions docs/source/model_doc/bort.rst
@@ -34,12 +34,14 @@ absolute, with respect to BERT-large, on multiple public natural language unders

Tips:

-- This implementation is the same as BERT. Refer to the :doc:`documentation of BERT <bert>` for usage examples as well
-  as the information relative to the inputs and outputs.
-- The RoBERTa tokenizer is used instead of BERT tokenizer.
-
-BORT's architecture is based on the BERT model, so one can refer to BERT's `docstring
-<https://huggingface.co/transformers/model_doc/bert.html>`_.
+- BORT's model architecture is based on BERT, so one can refer to BERT's `docstring
+  <https://huggingface.co/transformers/model_doc/bert.html>`_ for the model's API as well as usage examples.
+- BORT uses the RoBERTa tokenizer instead of the BERT tokenizer, so one can refer to RoBERTa's `docstring
+  <https://huggingface.co/transformers/model_doc/roberta.html>`_ for the tokenizer's API as well as usage examples.
+- BORT requires a specific fine-tuning algorithm, called `Agora
+  <https://adewynter.github.io/notes/bort_algorithms_and_applications.html#fine-tuning-with-algebraic-topology>`__,
+  that is sadly not open-sourced yet. It would be very useful for the community if someone could implement the
+  algorithm to make BORT fine-tuning work.

The original code can be found `here <https://github.com/alexa/bort/>`__.
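
As a quick illustration of the first two tips, BORT pairs BERT's model class with RoBERTa's tokenizer. Below is a minimal sketch, assuming the checkpoint is published on the Hugging Face Hub as ``amazon/bort`` (the identifier is an assumption, not confirmed by this diff):

.. code-block:: python

    from transformers import BertModel, RobertaTokenizer

    # BORT shares BERT's architecture, so BERT's model class can load it,
    # while its vocabulary comes from RoBERTa, so the RoBERTa tokenizer is
    # the one to pair with it. "amazon/bort" is an assumed checkpoint name.
    tokenizer = RobertaTokenizer.from_pretrained("amazon/bort")
    model = BertModel.from_pretrained("amazon/bort")

    inputs = tokenizer("BORT is a compressed variant of BERT.", return_tensors="pt")
    outputs = model(**inputs)
    print(outputs.last_hidden_state.shape)  # (batch_size, sequence_length, hidden_size)

Since Agora is not open-sourced, this sketch only covers feature extraction, not fine-tuning.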