
Proposed design for Django testing on rewrite #22206

Closed
eleanorjboyd opened this issue Oct 12, 2023 · 12 comments · Fixed by #23935
Assignees
Labels
area-testing community ask Feature request that the community expressed interest in needs proposal Need to make some design decisions

Comments

@eleanorjboyd
Member

eleanorjboyd commented Oct 12, 2023

REVISED 11/1/2023 with updates regarding a new way to implement this in VS Code.

Background

With the amazing help from @mh-firouzjah, we are getting close to Django testing support in VS Code! Below is a proposed design for how Django testing would look, with a few outstanding questions where we are looking for input from Django developers.

One important part of this proposal is that it will be part of the new testing rewrite (see the release notes for a short explanation of what we are working on). The rewrite will be fully stable soon, but for now it is behind an experiment you can turn on by setting "python.experiments.optInto": ["pythonTestAdapter"] in your user settings. Additionally, this proposal is based on another proposal we are working on here, which will change how testing args are configured in VS Code for Python. It moves to separate JSON entries with args for run and discovery, and also allows environment variables to be set. Please review that proposal, as the design of the JSON object for Django enablement you see below is based on the same design principle (and will therefore align with all testing argument input for Python in VS Code).

Proposal Introduction

Since Django testing is based on the unittest library, we will hook Django test support into our rewritten testing framework for unittest. Under the rewrite, the extension runs a subprocess which calls a script for discovery or execution. These scripts extend unittest functionality to gather results and format them for return. (See more on how this would work with Django later.)

User Facing Design

From a user perspective, this will add just two environment variables that you configure in your setup. The JSON for test configuration will be the same as for all other testing under our new design, but with the environment variables "DJANGO_TESTING" and "DJANGO_MANAGE_PY_PATH". When "DJANGO_TESTING" is set to true, our script will run Django setup; the default value is false. "DJANGO_MANAGE_PY_PATH" is a custom path to the manage.py you want to run; the default is to look for manage.py in your cwd. Please add your feedback on these two environment variables below.

  • Are these names good?
  • Are the environment variables intuitive even for beginners?
  • Is the default for manage.py script what people would want?
    (If you have feedback on the general json args structure please see the overarching issue)
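For reference, the defaults described above could be read roughly like this (the helper name is hypothetical; the actual script may differ):

```python
import os
from pathlib import Path

def read_django_test_env() -> tuple[bool, str]:
    """Read the two proposed variables with their documented defaults:
    DJANGO_TESTING defaults to false, and DJANGO_MANAGE_PY_PATH defaults
    to manage.py in the current working directory."""
    django_testing = os.environ.get("DJANGO_TESTING", "false").lower() == "true"
    manage_py_path = os.environ.get("DJANGO_MANAGE_PY_PATH",
                                    str(Path.cwd() / "manage.py"))
    return django_testing, manage_py_path
```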

Design Examples

not using Django at all, regular proposal design of new testing args

    {
      "name": "Python: Run Unittest Tests",
      "type": "python-unittest",
      "request": "launch",
      "args": ["-v"],
      "env": {
        "PYTHONPATH": "${workspaceFolder}"
      }
    },

enable Django testing but use the default setup script

    {
      "name": "Python: Run Django Tests",
      "type": "python-unittest",
      "request": "launch",
      "args": ["-v"],
      "env": {
        "PYTHONPATH": "${workspaceFolder}",
        "DJANGO_TESTING": "true"
      }
    },

enable Django testing and specify the given setup script

    {
      "name": "Python: Run Django Tests",
      "type": "python-unittest",
      "request": "launch",
      "args": ["-v"],
      "env": {
        "PYTHONPATH": "${workspaceFolder}",
        "DJANGO_TESTING": "true",
        "DJANGO_MANAGE_PY_PATH": "/path/to/manage.py"
      }
    },

Implementation

You can see the full draft of how it would look here. We have not finished discovery yet, as this draft covers execution. Below I outline the general steps we will take, to help you follow along with the draft's design. Feedback on the draft and design is very much appreciated.

  1. The VS Code Python extension calls execution.py
  2. Within execution.py, the DJANGO_TESTING_ENABLED env var is checked
  3. If DJANGO_TESTING_ENABLED is true, django_execution_runner(start_dir) is run
  4. In django_execution_runner, the MANAGE_PY_PATH env var is read if it exists
  5. The manage.py path is set to that env var if present, otherwise to the default of cwd + "manage.py"
  6. custom_test_runner is set to "django_test_runner.CustomTestRunner"
  7. subprocess.run is called with args: python <manage_py_path> test --testrunner <custom_test_runner>
  8. In CustomTestRunner, kwargs["resultclass"] is set to UnittestTestResult
  9. UnittestTestResult, as defined in execution.py, is then used for processing results
  10. Within UnittestTestResult, see formatResult, which is the primary difference; this function also calls a send method to return data to the VS Code Python extension
  11. Cleanup happens as normal from manage.py test
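A minimal sketch of steps 4 through 7 above (the helper name is hypothetical; the env var name and default mirror the draft):

```python
import os
import sys
from pathlib import Path

def build_django_test_command() -> list[str]:
    """Build the argv for step 7: read MANAGE_PY_PATH (defaulting to
    manage.py in the cwd, per step 5) and point `manage.py test` at the
    extension's custom test runner (step 6)."""
    manage_py_path = os.environ.get("MANAGE_PY_PATH",
                                    str(Path.cwd() / "manage.py"))
    custom_test_runner = "django_test_runner.CustomTestRunner"
    return [sys.executable, manage_py_path, "test",
            "--testrunner", custom_test_runner]

# Step 7 would then run this command in a subprocess, e.g.:
# subprocess.run(build_django_test_command(), cwd=start_dir)
```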

Questions

  1. Do I need to use the custom runner / result class for discovery too? What does Django discovery do differently compared to just unittest discovery?
  2. How often are people editing to include their own runners or resultclass? Is this design going to conflict with how many people run Django testing?
  3. Any other concerns / questions?

Conclusion

Thank you for reading this issue; we appreciate all your comments and suggestions! We are always looking for contributions to our repo, so please contribute if you have any interest, or comment on this thread if you don't know where to start. One important step in this process will be writing test cases, for which we will need many suggestions on what cases to cover.

@eleanorjboyd eleanorjboyd added area-testing needs proposal Need to make some design decisions labels Oct 12, 2023
@eleanorjboyd eleanorjboyd self-assigned this Oct 12, 2023
@eleanorjboyd eleanorjboyd added the needs community feedback Awaiting community feedback label Oct 12, 2023
@github-actions

Thanks for the feature request! We are going to give the community 60 days from when this issue was created to provide 7 👍 upvotes on the opening comment to gauge general interest in this idea. If there's enough upvotes then we will consider this feature request in our future planning. If there's unfortunately not enough upvotes then we will close this issue.

@eleanorjboyd
Member Author

Update: I have created a secondary design for how Django testing could be implemented: #22222. This is only experimental at this time, but it could be an improved solution because it allows the extension to directly call manage.py test exactly like the command line, which, from talking with @mh-firouzjah, is preferable.

I put this together quickly in hopes of getting a working idea out by DjangoCon, but please send feedback on whether this would work and what changes would need to be made (for example, if there is a better way than directly editing settings.json).

Thanks!

@eleanorjboyd
Member Author

Hello everyone! As I mentioned in my last comment, we have a second design iteration for Django testing in VS Code which is drafted at #22222. I have now updated the body of this proposal to include this new design. The proposal above has new questions and information so please re-review and submit your feedback. For anyone who is just joining this thread, the proposal is up to date, as revised on 11/1/2023 with our new plan for implementation. Thanks!

@brettcannon
Member

Thank you to everyone who upvoted this issue! Since the community showed interest in this feature request we will leave this issue open as something to consider implementing at some point in the future.

We do encourage people to continue 👍 the first/opening comment as it helps us prioritize our work based on what the community seems to want the most.

@brettcannon brettcannon added community ask Feature request that the community expressed interest in and removed needs community feedback Awaiting community feedback labels Dec 14, 2023
@mh-firouzjah

While enabling Django testing and optionally specifying the setup script (DJANGO_MANAGE_PY_PATH), it seems a good idea to also (optionally) specify DJANGO_SETTINGS_MODULE (the path to settings.py or any alternative module). This could be derived from the content of manage.py, but specifying it explicitly means less coding and effort to obtain the correct value.

Path to manage.py is required for test execution if intending to use manage.py test.

Path to settings.py is required for test discovery if intending to use the built-in unittest test discovery utility. This requires importing django and invoking the django.setup() function (5 lines of code), as well as setting DJANGO_SETTINGS_MODULE as an environment variable.

    {
      "name": "Python: Run Django Tests",
      "type": "python-unittest",
      "request": "launch",
      "args": ["-v"],
      "env": {
        "PYTHONPATH": "${workspaceFolder}",
        "DJANGO_TESTING": "true",
        "DJANGO_MANAGE_PY_PATH": "/path/to/manage.py",
        "DJANGO_SETTINGS_MODULE": "/path/to/settings.py"
      }
    },
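One note on the JSON above: Django conventionally reads DJANGO_SETTINGS_MODULE as a dotted module path (e.g. myproject.settings) rather than a file path, so the extension would likely need to translate. A rough sketch of the "5 lines of code" mentioned above, with a hypothetical path-to-module helper:

```python
import os

def settings_module_from_path(settings_path: str) -> str:
    """Hypothetical helper: turn a file path like 'myproject/settings.py'
    into the dotted module path Django actually expects ('myproject.settings')."""
    no_ext = settings_path[:-3] if settings_path.endswith(".py") else settings_path
    return no_ext.strip("/").replace("/", ".")

def setup_django(settings_path: str) -> None:
    # Point Django at its settings module, then boot the app registry so
    # unittest discovery can import test modules that touch models.
    os.environ.setdefault("DJANGO_SETTINGS_MODULE",
                          settings_module_from_path(settings_path))
    import django  # imported lazily so the env var is set first
    django.setup()
```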

@kavdev

kavdev commented Feb 2, 2024

For what it's worth, we're using a custom runner that runs all tests in parallel, spins up coverage, lints, and spits out artifact files (happy to share the code if anyone wants it).

Support for custom runners isn't super necessary for our use case. We'd probably start using VS Code testing to spot-check individual tests (still very useful) and continue to run python manage.py test for the full suite to do all the extra stuff before sending up a commit.

@mh-firouzjah

Well, you're right, python manage.py test will do the job, but this is also covered as part of the implementation of the solution. I mean, when VS Code can find each and every single test, it is obvious that it found all of them and is able to run all of them 😄. (Run-test buttons will be available per test function, class, module, or for the whole project.)

@kavdev

kavdev commented Feb 3, 2024

I've had issues with the current test functionality not reporting parallel test runs properly (they run in subprocesses). Granted, I was trying to hack things together to try and make things work.

Will the new implementation be able to display test results from parallel runs?

@mh-firouzjah

mh-firouzjah commented Feb 3, 2024

Will the new implementation be able to display test results from parallel runs?

python manage.py test --parallel is an option.

@fabietto01

Hello everyone, are there any updates on the topic?

@kavdev

kavdev commented Jun 3, 2024

Will the new implementation be able to display test results from parallel runs?

python manage.py test --parallel is an option.

When I've tried doing this through hacks in the past, I found that any test result data from other processes didn't make it back to the vscode UI. Fingers crossed that a first-party solution will "just work".

@eleanorjboyd
Member Author

eleanorjboyd commented Aug 15, 2024

Hello everyone! The long-requested ask is live!! Django tests are now supported in the VS Code Python extension! Currently this is pre-release, and we would greatly appreciate Django users trying this out and giving feedback / submitting bugs! (There will likely be bugs, so I appreciate your patience as we work through them.)

Steps:

  1. Change your Python extension so you are on the pre-release version of the extension
    1. specifically version v2024.13.2024081501 or newer
  2. Add an environment variable called MANAGE_PY_PATH='path-string-to-manage.py-path' - below are steps if you aren't sure how to do this:
    1. create a .env file
    2. add MANAGE_PY_PATH='path-string-to-manage.py-path' to the file
    3. add "python.envFile": "${workspaceFolder}/.env" to your settings.json to pick up this file as part of the environment (edit the path as necessary if .env is not at the root)
  3. Now click on the “beaker” icon and pull up the test explorer panel
  4. If not already selected, select “unittest” as your testing framework (you can also do the following in settings.json "python.testing.unittestEnabled": true,)
  5. From here your tests should be discovered and populated in the tree, and run and debug should work! If this is not working, try:
    1. adding "python.testing.unittestArgs": [], in case the tests just aren't being found
    2. looking at the Python output panel - this will show error messages
    3. running from the command line first to make sure it works, then transferring that exact command to how it needs to be represented in VS Code. For example, if you run python manage.py test --arg, then in VS Code set MANAGE_PY_PATH='./manage.py' and "python.testing.unittestArgs": ["--arg"], and check the output, as it should print the final command that is actually run
    4. using an absolute path for manage.py if you started with a relative one
  6. Let me know how it went! Whether it works or not, please report back so we can get an idea of how the feature is doing
  7. To submit a bug:
    1. try it again with your logs set to verbose; turn this on via the "Developer: Set Log Level" command in the command palette
    2. use the command palette, select the Python: Report Issue... command, follow the steps there, and add the logs you copied above
    3. either describe your repo structure / tests or include your repo (or a minimal repro if your repo is private) so I can try it myself.
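As a concrete example of steps 2(a)-(c), assuming manage.py sits at the workspace root, the .env file would contain just:

```ini
# .env at the workspace root
MANAGE_PY_PATH='./manage.py'
```

paired with "python.envFile": "${workspaceFolder}/.env" (and "python.testing.unittestEnabled": true) in settings.json as described above.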

Thank you!

@github-actions github-actions bot locked as resolved and limited conversation to collaborators Sep 16, 2024