From d15d98c0026aa56232c4bac3832ea3bd3fd1dd47 Mon Sep 17 00:00:00 2001
From: liferoad
Date: Fri, 17 May 2024 12:11:35 -0400
Subject: [PATCH] Update code-change-guide.md (#31333)

---
 contributor-docs/code-change-guide.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/contributor-docs/code-change-guide.md b/contributor-docs/code-change-guide.md
index ee1944ccc658f..935a2c6276c5f 100644
--- a/contributor-docs/code-change-guide.md
+++ b/contributor-docs/code-change-guide.md
@@ -505,7 +505,7 @@ To run an integration test on the Dataflow Runner, follow these steps:
 
    ```
    cd sdks/python
-   pip install build && python -m build –sdist
+   pip install build && python -m build --sdist
    ```
 
    The tarball file is generated in the `sdks/python/sdist/` directory.
@@ -570,7 +570,7 @@ You can use the [official Beam SDK container image](https://gcr.io/apache-beam-t
 
 To run your pipeline with modified beam code, follow these steps:
 
-1. Build the Beam SDK tarball. Under `sdks/python`, run `python -m build –sdist`. For more details,
+1. Build the Beam SDK tarball. Under `sdks/python`, run `python -m build --sdist`. For more details,
    see [Run an integration test](#run-an-integration-test) on this page.
 
 2. Install the Apache Beam Python SDK in your Python virtual environment with the necessary