Pytest Discovery Error #21757
Comments
Thank you for your issue report. We are looking into this now! In the meantime: you are likely on the new testing rewrite, which is why you saw a change in behavior. You can opt out of the rewrite while I get this fix in by setting this in your user settings: …
Hi @eleanorjboyd, Thanks for your reply. I've actually just got test discovery and debugging working now by changing … There is one outstanding issue, which is that running tests without debugging shows …
As mentioned in my previous post, I only need to run tests in VS Code to debug, so I'm happy with the current situation, as I can still run tests from the terminal and have set up a VS Code task to do so. However, in the interest of providing more data to improve this extension, I applied your suggestion of setting …
Thanks for the extra info! From your perspective, what is the desired experience on the …? Yes, the test server connected to a client does print every time; I will work on a fix. Finally, the issue regarding …
I would like to add that I've been having a similar issue, reported here, for the past few weeks. Thank you @alimbada for finding a workaround. Installing the Little Fox Team extensions worked without changing any of the VS Code settings. @eleanorjboyd Reverting to the previous test adapter allows tests to be discovered and run correctly within the test explorer (both sidebar and inline), but switching to the new test adapter results in the "0/0 tests passed" issue mentioned by @alimbada. I am running pytest with a Django project, with a few extensions installed. The 'OUTPUT' tab for Python shows (includes redacted lines, happy to send these privately):
The 'TEST RESULTS' tab shows:
Happy to do more tests or provide more information; let me know what you need. I've reverted back to the original test adapter for the moment.
@eleanorjboyd, yes, I wasn't sure if …
I am running normal pytests. The Python Test Log output is below. It is the same output as what I get when running tests in the terminal.
I was also hit with the "no tests discovered" issue. Running …, but when configuring tests via the VS Code palette command "Python: Configure tests":
Now the testing view shows … Copying the … Anyway, adjusting the auto-configured setting seems to work around the problem for me. Tests are now discovered and run as expected.
Running in a venv created with …
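For comparison, the palette command writes its choices into the workspace `settings.json`. A sketch of what the auto-configured settings typically look like (values are illustrative, not taken from this thread):

```json
{
  "python.testing.pytestEnabled": true,
  "python.testing.unittestEnabled": false,
  "python.testing.pytestArgs": ["tests"]
}
```

Adjusting `python.testing.pytestArgs` by hand after auto-configuration is the kind of tweak being described here.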
Hi @MawKKe! It seems that your issue may be different; could you open a new issue and we can converse there? Also, can you see if there is a difference when you are on our new rewrite? You can make sure you are on it by adding this setting to your user settings: … You can confirm you have the rewrite enabled by setting … Thanks!
Hi @JackMorganNZ, sorry for the delay in responding! Could you send your settings for …? Also, to clarify: discovery is working fine, but run is not working at all? When you say …? Thank you for your assistance; I appreciate any help you can provide!
hm, very interesting. That block of code is already in a try-catch to handle any key errors, but that didn't catch it because it was a different error type. Can you let me know if you are using any plugins? Also, can you describe the machine you are using and any interesting setup there? Thinking it tried to throw a key error but instead threw an internal error because of the system. Thanks
What do you mean by the other test explorer?
There are 2 test extensions installed:
If I switch the Python extension to the pre-release version, the internal error goes away. However, the gutter decorations disappear. If I switch the Python extension to v2023.14.0, everything works as expected. I believe @JackMorganNZ and I are having the same issue.
I started hitting a problem yesterday/today that seems similar to the comments on this issue. I'm using a workspace configuration where …
However, I've also tried removing all those args and my experience doesn't really improve. I see a lot of this in the Python logs:
Followed by:
The "Python Test Logs" can barely render because there's so much content and it gets cleared/overwritten so quickly, yet what I do see seems to be …
My Test Explorer panel seems to be "hiding" … (the result of the …). I can see nothing wrong with my …
Hi, thanks for including this info. When an error occurs while we try to send the data, everything gets printed out, which is why you are seeing so much data; it also retries, which causes it to write over the data printed first. What is your setup for your machine? Are you using SSH or anything to connect to a remote machine, and what are the machine types?
Thanks for your response, @eleanorjboyd!
I am not using SSH for anything. Remote machines are not involved. I'm running on a macOS 14 Apple Silicon machine using the arm64 build of VS Code. Are there any other machine details that would be helpful? Perhaps also relevant: in my setup, each workspace folder that runs pytest is relying on a relative …
One other note: I have known about, but wasn't previously concerned about, missing packages in some of my workspace folder virtual environments. As a test, I started installing those missing dependencies, and more and more test discovery results seemed to populate the Test Explorer. This, again, makes me suspicious that there's some unintended interdependence between spawned process results for multiple workspace folders. I would expect the outcome of test discovery for one workspace folder to never affect the outcome of another. Yet the pattern of test enumeration followed by disappearance, combined with the observation that results improve as exceptions are addressed, is rather suggestive.
Seems to be similar to #22192 (comment). The UUID is shown as not defined, which means the port would likely not be defined either. What version of the Python extension are you on, @tboddyspargo?
@pnelson-bto, our team only supports the core functionality of the Python extension and not any other extensions you might install, such as the one from Little Fox Team. If you are seeing any issues with the default test explorer then I can help you there; otherwise I would reach out to the Little Fox Team specifically, as they can help with issues on their extension. Thanks!
I'm on …
Further experimentation down this path resulted in me resolving all test discovery issues (e.g. exceptions for missing imports, non-zero exit codes when no tests are discovered, etc.) in all workspace folders. That results in all tests enumerating and populating the Test Explorer page as expected, without disappearing. Again, a big contributor to the impact of the problem seems to be discovery errors in workspace folderB clearing the successfully enumerated tests from workspace folderA.
hm yes, I am seeing how this would happen. The test explorer is written so that it is cleared when the Python subprocess exits with a non-zero exit code, which happens when this other issue arises. Will put in a fix for this.
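The failure mode being described can be sketched abstractly (this is an illustration only, not the extension's actual code): a non-zero exit from one folder's discovery subprocess wipes out tests that were already discovered elsewhere.

```python
# Illustration of the described bug -- not the extension's real implementation.
def update_test_explorer(known_tests, new_tests, exit_code):
    """Merge newly discovered tests into the explorer's view.

    The bug: a non-zero exit code from ONE discovery subprocess clears
    EVERYTHING, including tests another workspace folder discovered
    successfully.
    """
    if exit_code != 0:
        return []  # explorer cleared, even for unrelated folders
    return sorted(set(known_tests) | set(new_tests))

# Folder A discovers fine, then folder B's discovery fails:
view = update_test_explorer([], ["folderA::test_ok"], 0)
view = update_test_explorer(view, [], 1)
print(view)  # → []
```

This matches the observation above that fixing discovery errors in one folder made another folder's tests stop disappearing.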
Hello! You're experiencing an issue with the pytest hooks coming from our extension implementation. Do you have any other plugins you are using? They might be conflicting with each other. Otherwise, I think #22240 will fix it. Once that is merged, if you could try it with the newest version of the Python extension tomorrow, that would be extremely helpful! Thanks
Thanks for the super-quick fix, will definitely try it out!
I was having this issue. Switching to pre-release fixed this for me. However, when the tests run I no longer get the green "tick" despite my tests all passing. The red circle shows as normal when a test fails.
mmm, I can't seem to get it working yet despite using the pre-release version (currently using …)
@MaxHorwood can you send over your logs and a description of your workspace?
@msonsona can you send over a minimal repro of what is failing? I might not have gotten your configuration right and therefore not fixed the root cause. Also, do you have any plugins in use for pytest, and is the error you're experiencing now still the same? Sorry for all the questions!
Not sure how to check the plugins for pytest, but if I run the test discovery command directly from the command line without any additional arguments, it runs properly. The issue I see in my environment is that VS Code is trying to discover the tests using the …
but it's not present in my environment:
If I try to …
I don't want to distract from any of the other repros, but I did just identify a case where we had a …
The plugin vscode_pytest is a custom plugin we wrote to collect all discovery/run results and return them to our extension via a socket. This is required for the implementation of our new rewrite. The plugin is located inside the extension; we bundle it and add its location to the Python path before running, which is why we can find the plugin when we run it and you cannot when you run it locally. You can add the path to PYTHONPATH before running from the command line, and then it should work. The path to add is: the root of your extension + … With this being said, it seems like there is not a problem with your configuration (since it works normally from the command line). The next step is for me to try to repro your behavior or understand what your workspace looks like. What version of pytest are you using, and are you able to make a minimal repro to send my way? This error is in the hook implementation I have for our plugin not aligning with what pytest expects, so I am trying to figure out where that mismatch could have occurred in your configuration. Thanks!
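A sketch of that workaround, using a hypothetical extension path (the exact directory name depends on your installed extension version; look under `~/.vscode/extensions/` for the real one):

```python
import os
import sys

# Hypothetical install location -- adjust to the actual folder name
# you find under ~/.vscode/extensions/.
ext_scripts = os.path.expanduser(
    "~/.vscode/extensions/ms-python.python-2023.14.0/pythonFiles"
)

# Equivalent to prepending this directory to PYTHONPATH before running
# `python -m pytest --collect-only tests -p vscode_pytest` in a terminal.
if ext_scripts not in sys.path:
    sys.path.insert(0, ext_scripts)

print(sys.path[0])
```

The same effect from a shell would be `export PYTHONPATH="<that path>:$PYTHONPATH"` before invoking pytest.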
Sorry for the ignorance; what would be the best way to provide this? I might be using a quite old version of pytest, to be honest. This is what I get when querying the version:
@eleanorjboyd The logs don't show much. Just as I'd expect; the example below is just one test but is the same as running all.

Pytest Logs

```
CLIENT: Server listening on port 57459...
Received JSON data in run script
============================= test session starts ==============================
platform darwin -- Python 3.10.6, pytest-7.2.1, pluggy-1.0.0
rootdir: /...., configfile: pyproject.toml
plugins: asyncio-0.20.3, flaky-3.7.0, Faker-17.0.0, cov-4.0.0
asyncio: mode=auto
collected 1 item

source/tests/unit/test_connection.py .                                   [100%]

=============================== warnings summary ===============================
-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
```
@msonsona, could you try running …? In terms of a minimal repro, this is just a small example project that replicates the bug. If you are happy to share your current project you could send that over so I can try it myself, but since some people do not like sharing their whole projects, a minimal repro could be a small project you create where you still see the bugs. For example, if you create a new workspace, create a simple pytest, and run it, does the bug still exist? If so, then send over that simple pytest so we are talking about running the same code. Thanks!
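For reference, a minimal repro in this sense can be as small as one test file; something like the following (the file name and function names are made up for illustration):

```python
# test_sample.py -- the entire "project" for a minimal repro.
def add(a, b):
    return a + b

def test_add():
    # Picked up by pytest discovery because the name starts with "test_".
    assert add(2, 3) == 5
```

Opening a workspace containing just this file and running discovery/run from the Test Explorer is enough to show whether the bug reproduces outside the original project.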
@MaxHorwood we have done many size tests (with repos of >20,000 tests) to try to guard against any size issues, so hopefully that's not it. It seems like your pytest works correctly and doesn't error, so it might be on the VS Code parsing side. Can you send the "Python" output logs? This will show what is happening once VS Code receives the data and why you would "no longer get the green 'tick' despite my tests all passing". Specifically, there will be something printing a payload, and I am curious what it says for the tests you expect to be successful.
@eleanorjboyd 👍
hm, this is not what I expected to see. I have just added some extra logging which will be on the pre-release of the extension tomorrow. Would you be able to try it again tomorrow and send the same logs? You should see more logs regarding data received.
Sorry for the delay; this is now working as expected... I don't believe I've changed anything on my end. Maybe it was sleeping the computer (which I do regularly instead of shutting down; but I did also have to reload VS Code when switching to the pre-release anyway, so that's a bit odd). Switching back to the release version is still broken.
After upgrading pytest to …, it now seems to be able to discover and run tests! Thanks @eleanorjboyd for your support! 🙌
Hi @MaxHorwood, glad it now works! That's weird it happened unexpectedly, but let me know if it resurfaces. @msonsona glad that upgrade worked as well! Are you saying you are unsure about the feasibility of the upgrade at your company, as opposed to just on your own machine? Let me know if you do have any other questions; I am closing this as resolved in the meantime. Thanks!
Type: Bug
Behaviour
Expected vs. Actual
Expected: Discover, run and debug tests
Actual: Python3 error (see output from Output panel below).
Steps to reproduce:
Everything works fine from the terminal using the same command (`python -m pytest --collect-only tests`) from the same cwd (`./backend/`). Tests also run fine using `pytest`, `pytest -v`, `python -m pytest`, and `python -m pytest tests`.
The project structure is as follows:
The diagnostic data below shows the virtual environment as `Global` despite the fact that I am using a virtual environment. I have since changed the value of `python.defaultInterpreterPath` to `backend/venv/bin/python`; the virtual environment is now being detected and I have changed the interpreter path to use it. However, this has made no difference to test discovery.

Additionally, I'd like to add that as a workaround I have installed the Python Test Explorer extension by the Little Fox Team, as this at least allows me to debug tests, which is the whole reason for wanting to run tests from VS Code. I can debug tests using the inline buttons that the aforementioned extension shows in the editor. The test explorer itself is still broken.
Diagnostic data
`python.languageServer` setting: Default

Output for `Python` in the `Output` panel (`View` → `Output`, change the drop-down in the upper-right of the `Output` panel to `Python`)

User Settings
Extension version: 2023.14.0
VS Code version: Code 1.80.2 (2ccd690cbff1569e4a83d7c43d45101f817401dc, 2023-07-27T20:57:59.134Z)
OS version: Darwin arm64 22.6.0
Modes:
System Info
canvas_oop_rasterization: disabled_off
direct_rendering_display_compositor: disabled_off_ok
gpu_compositing: enabled
metal: disabled_off
multiple_raster_threads: enabled_on
opengl: enabled_on
rasterization: enabled
raw_draw: disabled_off_ok
video_decode: enabled
video_encode: enabled
vulkan: disabled_off
webgl: enabled
webgl2: enabled
webgpu: enabled
A/B Experiments