From e593bc1added32482a1582a3235113744f81f9e7 Mon Sep 17 00:00:00 2001
From: Andrey Churkin
Date: Wed, 17 Mar 2021 09:30:22 +0300
Subject: [PATCH] Minor improvements

---
 docs/compression_algorithms/Pruning.md | 14 +++++++-------
 docs/styleguide/PyGuide.md             | 11 ++++++++---
 2 files changed, 15 insertions(+), 10 deletions(-)

diff --git a/docs/compression_algorithms/Pruning.md b/docs/compression_algorithms/Pruning.md
index 75aa0d4f9eb..c129c980eb2 100644
--- a/docs/compression_algorithms/Pruning.md
+++ b/docs/compression_algorithms/Pruning.md
@@ -33,9 +33,9 @@ Then during pruning filters with smaller ![G(F_i)](https://latex.codecogs.com/pn
 The zeroed filters are frozen afterwards and the remaining model parameters are fine-tuned.
 
 **Parameters of the scheduler:**
- - `num_init_steps` - number of epochs for model pretraining **before** pruning.
- - `pruning_target` - pruning rate target . For example, the value `0.5` means that right after pretraining, at the epoch
- with the number of `num_init_steps`, the model will have 50% of its convolutional filters zeroed.
+- `num_init_steps` - number of epochs for model pretraining **before** pruning.
+- `pruning_target` - pruning rate target. For example, the value `0.5` means that right after pretraining, at the epoch
+with the number of `num_init_steps`, the model will have 50% of its convolutional filters zeroed.
 
 **Exponential scheduler**
 
@@ -48,10 +48,10 @@ Current pruning rate ![P_{i}](https://latex.codecogs.com/svg.latex?P_{i}) (on i-
 Where ![a, k](https://latex.codecogs.com/svg.latex?a,%20k) - parameters.
 
 **Parameters of scheduler:**
- - `num_init_steps` - number of epochs for model pretraining before pruning.
- - `pruning_steps` - the number of epochs during which the pruning rate target is increased from `pruning_init` to `pruning_target` value.
- - `pruning_init` - initial pruning rate target. For example, value `0.1` means that at the begging of training, the model is trained to have 10% of convolutional filters zeroed.
- - `pruning_target` - pruning rate target at the end of the schedule. For example, the value `0.5` means that at the epoch with the number of `num_init_steps + pruning_steps`, the model is trained to have 50% of its convolutional filters zeroed.
+- `num_init_steps` - number of epochs for model pretraining before pruning.
+- `pruning_steps` - the number of epochs during which the pruning rate target is increased from `pruning_init` to the `pruning_target` value.
+- `pruning_init` - initial pruning rate target. For example, the value `0.1` means that at the beginning of training, the model is trained to have 10% of its convolutional filters zeroed.
+- `pruning_target` - pruning rate target at the end of the schedule. For example, the value `0.5` means that at the epoch with the number of `num_init_steps + pruning_steps`, the model is trained to have 50% of its convolutional filters zeroed.
 
 **Exponential with bias scheduler**
 Similar to the `Exponential scheduler`, but current pruning rate ![P_{i}](https://latex.codecogs.com/svg.latex?P_{i}) (on i-th epoch) during training calculates by equation:
diff --git a/docs/styleguide/PyGuide.md b/docs/styleguide/PyGuide.md
index 988474cc47e..845ba84d060 100644
--- a/docs/styleguide/PyGuide.md
+++ b/docs/styleguide/PyGuide.md
@@ -273,7 +273,7 @@ the first line.
 
 # Aligned with opening delimiter
 foo = long_function_name(var_one, var_two,
-                          var_three, var_four)
+                         var_three, var_four)
 
 meal = (spam,
         beans)
@@ -521,14 +521,18 @@ def load_state(model: torch.nn.Module, state_dict_to_load: dict, is_resume: bool
 
 #### 3.5.3 Classes
 
-Classes should have a docstring below the class definition describing the class.
+Classes should have a docstring below the class definition describing the class. If your class
+has public attributes, they should be documented here, following the same formatting as a function's
+params section.
 
 ```python
 class ModelTransformer:
     """
     Applies transformations to the model.
+
+    :param public_attribute: Public attribute description
     """
-    
+
     def __init__(self, model: ModelType, transformation_layout: TransformationLayout):
         """
         Initializes Model Transformer
@@ -539,6 +543,7 @@ class ModelTransformer:
         """
         self._model = model
         self._transformations = transformation_layout.transformations
+        self.public_attribute = None
 
     def transform(self) -> ModelType:
         """
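
As a side note for reviewers, the exponential scheduler described in the Pruning.md hunk can be sketched in a few lines. The formula itself is embedded as an image in the doc, so the exponential form `P_i = a * exp(-k * i)` and the boundary conditions used below are assumptions, and `exponential_pruning_rate` is a hypothetical helper for illustration, not an NNCF API:

```python
import math


def exponential_pruning_rate(epoch: int, num_init_steps: int, pruning_steps: int,
                             pruning_init: float, pruning_target: float) -> float:
    """Pruning rate target at a given epoch, assuming the form P_i = a * exp(-k * i).

    The parameters a and k are fixed by the boundary conditions
    P_0 = pruning_init and P_{pruning_steps} = pruning_target.
    """
    if epoch < num_init_steps:
        return 0.0  # pretraining phase: no filters are zeroed yet
    # Epochs elapsed since pruning started, clamped so the rate
    # stays at pruning_target after the schedule ends.
    i = min(epoch - num_init_steps, pruning_steps)
    a = pruning_init
    k = math.log(pruning_init / pruning_target) / pruning_steps
    return a * math.exp(-k * i)
```

With `num_init_steps=2`, `pruning_steps=10`, `pruning_init=0.1`, and `pruning_target=0.5`, this yields 0.1 at epoch 2 and 0.5 at epoch 12, matching the parameter descriptions in the doc.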
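
The docstring convention that the PyGuide.md hunk documents (public attributes described in the class docstring with the same formatting as a function's params section) can also be illustrated with a second class. `EpochCounter` is invented for illustration and does not exist in the codebase:

```python
class EpochCounter:
    """
    Counts completed training epochs.

    :param current_epoch: Index of the current training epoch, starting from zero.
    """

    def __init__(self):
        """
        Initializes the counter at epoch zero.
        """
        self.current_epoch = 0

    def epoch_step(self) -> None:
        """
        Advances the counter by one epoch.
        """
        self.current_epoch += 1
```

As in the `ModelTransformer` example, the public attribute is documented in the class docstring, while the `__init__` docstring describes only initialization.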