
Commit 7c4f4df: docs: apply suggested fixes

yoptar and IgnatovFedor authored Feb 26, 2020
Co-Authored-By: Fedor Ignatov <[email protected]>
1 parent: 9890ebd
Showing 4 changed files with 4 additions and 5 deletions.
2 changes: 1 addition & 1 deletion deeppavlov/models/embedders/transformers_embedder.py
@@ -63,7 +63,7 @@ def __call__(self, subtoken_ids_batch: Collection[Collection[int]], startofwords
         Args:
             subtoken_ids_batch: padded indexes for every subtoken
             startofwords_batch: a mask matrix with ``1`` for every first subtoken init in a token and ``0``
-            for every other subtoken
+                for every other subtoken
             attention_batch: a mask matrix with ``1`` for every significant subtoken and ``0`` for paddings
         """
         ids_tensor = torch.tensor(subtoken_ids_batch, device=self.device)
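For context, the sketch below (not part of the commit) illustrates what the three inputs described in this docstring might look like; all token ids and word splits here are invented for illustration.

.. code:: python

    import torch

    # Invented example: a batch of two subtokenized sequences, with
    # [CLS]/[SEP]-style ids (101/102) and 0 used for padding.
    subtoken_ids_batch = [[101, 7592, 2088, 102], [101, 4937, 102, 0]]
    # 1 marks the first subtoken of each word, 0 every other subtoken
    # (including the special and padding positions).
    startofwords_batch = [[0, 1, 1, 0], [0, 1, 0, 0]]
    # 1 marks every significant subtoken, 0 the padding positions.
    attention_batch = [[1, 1, 1, 1], [1, 1, 1, 0]]

    # As in the diff above, the embedder turns these into tensors.
    ids_tensor = torch.tensor(subtoken_ids_batch)
    startofwords_tensor = torch.tensor(startofwords_batch, dtype=torch.bool)
    attention_tensor = torch.tensor(attention_batch)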
2 changes: 1 addition & 1 deletion docs/features/models/bert.rst
@@ -7,7 +7,7 @@ English.
 | BERT paper: https://arxiv.org/abs/1810.04805
 | Google Research BERT repository: https://github.com/google-research/bert
-There are several pre-trained BERT models released by Google Research, more detail about these pretrained models could be found here: https://github.com/google-research/bert#pre-trained-models
+There are several pre-trained BERT models released by Google Research, more details about these pre-trained models could be found here: https://github.com/google-research/bert#pre-trained-models

 - BERT-base, English, cased, 12-layer, 768-hidden, 12-heads, 110M parameters: download from `[google] <https://storage.googleapis.com/bert_models/2018_10_18/cased_L-12_H-768_A-12.zip>`__,
   `[deeppavlov] <http://files.deeppavlov.ai/deeppavlov_data/bert/cased_L-12_H-768_A-12.zip>`__
3 changes: 1 addition & 2 deletions docs/features/models/squad.rst
@@ -228,8 +228,7 @@ Pretrained models are available and can be downloaded:
 .. code:: bash

-    python -m deeppavlov download deeppavlov/configs/squad/squad_zh_bert.json
-    python -m deeppavlov download deeppavlov/configs/squad/squad_zh_zh_bert.json
+    python -m deeppavlov download deeppavlov/configs/squad/squad_zh_zh_bert.json

 Link to DRCD dataset: http://files.deeppavlov.ai/datasets/DRCD.tar.gz
 Link to DRCD paper: https://arxiv.org/abs/1806.00920
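For context, here is a hedged usage sketch (not part of the commit) of how the downloaded config might be loaded through DeepPavlov's Python API; the attribute ``configs.squad.squad_zh_zh_bert`` is assumed to mirror the JSON path in the diff above.

.. code:: python

    from deeppavlov import build_model, configs

    # Assumption: configs.squad.squad_zh_zh_bert mirrors the JSON path in the
    # diff above; download=True fetches the pretrained files first.
    model = build_model(configs.squad.squad_zh_zh_bert, download=True)

    # SQuAD-style models take batches of contexts and questions and return
    # the answer span, its character position, and a score.
    contexts = ["萨德是一种导弹防御系统。"]
    questions = ["萨德是什么？"]
    print(model(contexts, questions))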
2 changes: 1 addition & 1 deletion docs/features/models/syntaxparser.rst
@@ -167,4 +167,4 @@ and dependency head.
 .. _`UD Pipe Future`: https://github.com/CoNLL-UD-2018/UDPipe-Future
 .. _`UDify (multilingual BERT)`: https://github.com/hyperparticle/udify

-So our model is by a valuable margin the state-of-the-art system for Russian syntactic parsing.
+So our model is the state-of-the-art system for Russian syntactic parsing by a valuable margin.
