Docs improvements #579

Merged · 1 commit · Mar 17, 2021
Minor improvements
andrey-churkin committed Mar 17, 2021
commit e593bc1added32482a1582a3235113744f81f9e7
14 changes: 7 additions & 7 deletions docs/compression_algorithms/Pruning.md
@@ -33,9 +33,9 @@
Then during pruning, filters with smaller ![G(F_i)](https://latex.codecogs.com/pn
The zeroed filters are frozen afterwards and the remaining model parameters are fine-tuned.

**Parameters of the scheduler:**
- `num_init_steps` - number of epochs for model pretraining **before** pruning.
- `pruning_target` - pruning rate target. For example, the value `0.5` means that right after pretraining, at epoch number `num_init_steps`, the model will have 50% of its convolutional filters zeroed.
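The step-schedule behaviour these two parameters describe can be sketched as follows. This is an illustrative class only; the name `BaselinePruningScheduler` and its `pruning_rate` method are assumptions for the sketch, not the actual NNCF API.

```python
class BaselinePruningScheduler:
    """
    Keeps the pruning rate at 0 for `num_init_steps` pretraining epochs,
    then jumps to `pruning_target` and holds it for the rest of training.

    :param num_init_steps: Number of pretraining epochs before pruning starts.
    :param pruning_target: Fraction of convolutional filters to zero out.
    """

    def __init__(self, num_init_steps: int, pruning_target: float):
        self.num_init_steps = num_init_steps
        self.pruning_target = pruning_target

    def pruning_rate(self, epoch: int) -> float:
        # During pretraining no filters are zeroed.
        if epoch < self.num_init_steps:
            return 0.0
        # From epoch `num_init_steps` onward the full target applies.
        return self.pruning_target
```

With `num_init_steps=5` and `pruning_target=0.5`, the rate is `0.0` for epochs 0-4 and `0.5` from epoch 5 onward.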


**Exponential scheduler**

@@ -48,10 +48,10 @@
The current pruning rate ![P_{i}](https://latex.codecogs.com/svg.latex?P_{i}) (on the i-th epoch) during training is calculated by the equation:

where ![a, k](https://latex.codecogs.com/svg.latex?a,%20k) are parameters.

**Parameters of the scheduler:**
- `num_init_steps` - number of epochs for model pretraining before pruning.
- `pruning_steps` - the number of epochs during which the pruning rate target is increased from the `pruning_init` to the `pruning_target` value.
- `pruning_init` - initial pruning rate target. For example, the value `0.1` means that at the beginning of training, the model is trained to have 10% of its convolutional filters zeroed.
- `pruning_target` - pruning rate target at the end of the schedule. For example, the value `0.5` means that at epoch number `num_init_steps + pruning_steps`, the model is trained to have 50% of its convolutional filters zeroed.
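One way these four parameters can combine into an exponential schedule is sketched below. The functional form `P_i = a * exp(k * i)` and the way `a` and `k` are fitted to the boundary values are assumptions made for illustration; the actual NNCF implementation may use a different parameterization.

```python
import math


def exponential_pruning_rate(epoch: int,
                             num_init_steps: int,
                             pruning_steps: int,
                             pruning_init: float,
                             pruning_target: float) -> float:
    """Hypothetical pruning rate target for `epoch` under an exponential schedule."""
    if epoch < num_init_steps:
        # Pretraining phase: nothing is pruned yet.
        return 0.0
    # Epochs elapsed since pruning started, capped at the end of the schedule.
    i = min(epoch - num_init_steps, pruning_steps)
    # Fit P_i = a * exp(k * i) so that P_0 = pruning_init and
    # P_{pruning_steps} = pruning_target.
    a = pruning_init
    k = math.log(pruning_target / pruning_init) / pruning_steps
    return a * math.exp(k * i)
```

Under this fit the rate starts at `pruning_init` at epoch `num_init_steps`, grows exponentially, and saturates at `pruning_target` once `num_init_steps + pruning_steps` epochs have passed.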

**Exponential with bias scheduler**
Similar to the `Exponential scheduler`, but the current pruning rate ![P_{i}](https://latex.codecogs.com/svg.latex?P_{i}) (on the i-th epoch) during training is calculated by the equation:
11 changes: 8 additions & 3 deletions docs/styleguide/PyGuide.md
@@ -273,7 +273,7 @@
the first line.

# Aligned with opening delimiter
foo = long_function_name(var_one, var_two,
                         var_three, var_four)
meal = (spam,
        beans)

@@ -521,14 +521,18 @@
def load_state(model: torch.nn.Module, state_dict_to_load: dict, is_resume: bool
<a id="classes"></a>
#### 3.5.3 Classes

Classes should have a docstring below the class definition describing the class. If your class
has public attributes, they should be documented here, following the same formatting as a
function's params section.

```python
class ModelTransformer:
    """
    Applies transformations to the model.

    :param public_attribute: Public attribute description
    """

    def __init__(self, model: ModelType, transformation_layout: TransformationLayout):
        """
        Initializes Model Transformer
        ...
        """
        self._model = model
        self._transformations = transformation_layout.transformations
        self.public_attribute = None

    def transform(self) -> ModelType:
        """
```