Fix typos in .md and .rst files (pytorch#88962)
This PR fixes occurrences of the typo `Github` in `.md` and `.rst` files:
`Github` -> `GitHub`

Pull Request resolved: pytorch#88962
Approved by: https://github.com/kit1980
kiszk authored and pytorchmergebot committed Nov 17, 2022
1 parent 573eaf1 commit a5f04e9
Showing 9 changed files with 10 additions and 10 deletions.
2 changes: 1 addition & 1 deletion .github/scripts/README.md
@@ -3,7 +3,7 @@
> NOTE: This README contains information for the `.github` directory but cannot be located there because it will overwrite the
repo README.

-This directory contains workflows and scripts to support our CI infrastructure that runs on Github Actions.
+This directory contains workflows and scripts to support our CI infrastructure that runs on GitHub Actions.

## Workflows

2 changes: 1 addition & 1 deletion RELEASE.md
@@ -281,7 +281,7 @@ need to support these particular versions of software.

In the event a submodule cannot be fast forwarded and a patch must be applied we can take two different approaches:

-* (preferred) Fork the said repository under the pytorch Github organization, apply the patches we need there, and then switch our submodule to accept our fork.
+* (preferred) Fork the said repository under the pytorch GitHub organization, apply the patches we need there, and then switch our submodule to accept our fork.
* Get the dependencies maintainers to support a release branch for us

Editing submodule remotes can be easily done with: (running from the root of the git repository)
2 changes: 1 addition & 1 deletion caffe2/contrib/tensorrt/README.md
@@ -15,4 +15,4 @@ For further information please explore `caffe2/python/trt/test_trt.py` test show

## Questions and Feedback

-Please use Github issues (https://github.com/pytorch/pytorch/issues) to ask questions, report bugs, and request new features.
+Please use GitHub issues (https://github.com/pytorch/pytorch/issues) to ask questions, report bugs, and request new features.
2 changes: 1 addition & 1 deletion docs/source/community/contribution_guide.rst
@@ -138,7 +138,7 @@ A great deal of the tutorials on `pytorch.org <https://pytorch.org/>`__
come from the community itself and we welcome additional contributions.
To learn more about how to contribute a new tutorial you can learn more
here: `PyTorch.org Tutorial Contribution Guide on
-Github <https://github.com/pytorch/tutorials/#contributing>`__
+GitHub <https://github.com/pytorch/tutorials/#contributing>`__

Improving Documentation & Tutorials
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
2 changes: 1 addition & 1 deletion docs/source/masked.rst
@@ -157,7 +157,7 @@ Binary Operators
As you may have seen in the tutorial, :class:`MaskedTensor` also has binary operations implemented with the caveat
that the masks in the two MaskedTensors must match or else an error will be raised. As noted in the error, if you
need support for a particular operator or have proposed semantics for how they should behave instead, please open
-an issue on Github. For now, we have decided to go with the most conservative implementation to ensure that users
+an issue on GitHub. For now, we have decided to go with the most conservative implementation to ensure that users
know exactly what is going on and are being intentional about their decisions with masked semantics.
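
As an illustrative aside on the matching-mask rule described above, here is a minimal sketch using the prototype `torch.masked` API (the example values and the mismatched mask are made up for illustration):

```python
import torch
from torch.masked import masked_tensor

# Made-up example values and mask.
data = torch.tensor([1.0, 2.0, 3.0])
mask = torch.tensor([True, False, True])

a = masked_tensor(data, mask)
b = masked_tensor(data * 10, mask)

# The masks match, so the binary op is allowed and the result keeps the shared mask.
print(a + b)

# An operand built with a different mask would make `a + c` raise an error,
# which is the conservative behavior the paragraph above describes.
c = masked_tensor(data, torch.tensor([True, True, False]))
```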

The available binary operators are:
2 changes: 1 addition & 1 deletion docs/source/onnx.rst
@@ -594,7 +594,7 @@ all of the unconvertible ops in one go you can::
The set is approximated because some ops may be removed during the conversion
process and don't need to be converted. Some other ops may have partial support
that will fail conversion with particular inputs, but this should give you a
-general idea of what ops are not supported. Please feel free to open Github Issues
+general idea of what ops are not supported. Please feel free to open GitHub Issues
for op support requests.
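
As an illustrative aside (the code that follows `you can::` in onnx.rst is not shown in this hunk), here is a hedged sketch of listing unconvertible ops with `torch.onnx.utils.unconvertible_ops`; the tiny model and opset version are made up:

```python
import torch

class TinyModel(torch.nn.Module):
    # Made-up toy model for illustration.
    def forward(self, x):
        return torch.relu(x)

# Returns the TorchScript graph plus an approximate list of ATen ops that have
# no ONNX symbolic registered for the requested opset.
graph, unconvertible = torch.onnx.utils.unconvertible_ops(
    TinyModel(), (torch.randn(2, 3),), opset_version=13
)
print(set(unconvertible))  # expected to be empty here, since relu converts cleanly
```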

Frequently Asked Questions
4 changes: 2 additions & 2 deletions docs/source/sparse.rst
@@ -10,7 +10,7 @@ torch.sparse
.. warning::

The PyTorch API of sparse tensors is in beta and may change in the near future.
-We highly welcome feature requests, bug reports and general suggestions as Github issues.
+We highly welcome feature requests, bug reports and general suggestions as GitHub issues.

Why and when to use sparsity
++++++++++++++++++++++++++++
@@ -40,7 +40,7 @@ Like many other performance optimization sparse storage formats are not
always advantageous. When trying sparse formats for your use case
you might find your execution time to decrease rather than increase.

-Please feel encouraged to open a Github issue if you analytically
+Please feel encouraged to open a GitHub issue if you analytically
expected to see a stark increase in performance but measured a
degradation instead. This helps us prioritize the implementation
of efficient kernels and wider performance optimizations.
2 changes: 1 addition & 1 deletion torch/csrc/jit/operator_upgraders/README.md
@@ -226,7 +226,7 @@ def foo(x, y, z=100):
return x, y, z
```

-2. To help understanding the BC/FC breakage changes, here are some FC breaking changes examples. The solution to resolve it is not there yet. If it's desired, please report it in either [PyTorch Forum](https://discuss.pytorch.org/) or [PyTorch Github](https://github.com/pytorch/pytorch). We will prioritize it accordingly.
+2. To help understanding the BC/FC breakage changes, here are some FC breaking changes examples. The solution to resolve it is not there yet. If it's desired, please report it in either [PyTorch Forum](https://discuss.pytorch.org/) or [PyTorch GitHub](https://github.com/pytorch/pytorch). We will prioritize it accordingly.

- Adding new default argument:
- Adding a new default argument not RIGHT BEFORE the out arguments which can be 0 or more.
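
As an illustrative aside (hypothetical schema, not taken from the README), this is the shape of the FC-breaking change the last bullet describes, where a new default argument is added somewhere other than right before the out arguments:

```python
# Illustrative stand-in for the old operator schema: older runtimes resolve
# calls against this signature.
def foo(x, y, z=100):
    return x, y, z

# Hypothetical new schema: a default argument `w` is inserted before `z`.
# Programs serialized against the newer schema may record the extra argument,
# which an older runtime that only knows the old schema cannot interpret.
def foo(x, y, w=0, z=100):
    return x, y, w, z
```
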
2 changes: 1 addition & 1 deletion torch/csrc/lazy/tutorial.md
@@ -283,4 +283,4 @@ This concludes our brief introduction to LT. Hopefully, you'll remember the main
* It's really tricky to produce such graphs without overburdening a user too much. Think, torch.jit.script, torch.jit.trace! Also, think ifs, fors, "Lions, and Tigers, and Bears, Oh My" We digressed.


-Please give LT a try and tell us what you think on Github! We are **eager, not lazy** (haha!) to hear from you!
+Please give LT a try and tell us what you think on GitHub! We are **eager, not lazy** (haha!) to hear from you!
