
Pytest Discovery Error #21757

Closed
alimbada opened this issue Aug 4, 2023 · 42 comments
Assignees
Labels
area-testing info-needed Issue requires more information from poster triage-needed Needs assignment to the proper sub-team

Comments

@alimbada

alimbada commented Aug 4, 2023

Type: Bug

Behaviour

Expected vs. Actual

Expected: Discover, run and debug tests
Actual: Python3 error (see output from Output panel below).

Steps to reproduce:

  1. Try to discover/run/debug tests

Everything works fine from the terminal using the same command (python -m pytest --collect-only tests) from the same cwd (./backend/). Tests also run fine using pytest, pytest -v, python -m pytest and python -m pytest tests

The project structure is as follows:

<PROJECT_ROOT>
  |-backend
  |  |-.pytest_cache
  |  |-tests
  |  |-models
  |  |-__pycache__
  |  |-venv
  |  |-data
  |  |-services
  |-frontend
  |-.git
  |-.vscode

The diagnostic data below shows the virtual environment as Global even though I am using one. I have since set python.defaultInterpreterPath to backend/venv/bin/python, and the virtual environment is now detected and selected as the interpreter. However, this has made no difference to test discovery.

Additionally, as a workaround I have installed the Python Test Explorer extension by the Little Fox Team, as this at least allows me to debug tests, which is the whole reason I want to run tests from VS Code. I can debug tests using the inline buttons that the aforementioned extension shows in the editor. The test explorer itself is still broken.

Diagnostic data

  • Python version (& distribution if applicable, e.g. Anaconda): 3.11.4
  • Type of virtual environment used (e.g. conda, venv, virtualenv, etc.): Global
  • Value of the python.languageServer setting: Default
Output for Python in the Output panel (View > Output, change the drop-down in the upper-right of the Output panel to Python)

2023-08-04 12:46:15.980 [error] pytest test discovery error
 [Error: spawn /opt/homebrew/bin/python3 ENOENT
	at ChildProcess._handle.onexit (node:internal/child_process:285:19)
	at onErrorNT (node:internal/child_process:506:16)
	at process.processTicksAndRejections (node:internal/process/task_queues:83:21)] {
  errno: -2,
  code: 'ENOENT',
  syscall: 'spawn /opt/homebrew/bin/python3',
  path: '/opt/homebrew/bin/python3',
  spawnargs: [ '-m', 'pytest', '-p', 'vscode_pytest', '--collect-only', 'tests' ]
}

User Settings


languageServer: "Pylance"

testing
• cwd: "backend"
• pytestArgs: ["tests"]
• pytestEnabled: true

Extension version: 2023.14.0
VS Code version: Code 1.80.2 (2ccd690cbff1569e4a83d7c43d45101f817401dc, 2023-07-27T20:57:59.134Z)
OS version: Darwin arm64 22.6.0
Modes:

System Info
Item Value
CPUs Apple M1 Pro (8 x 24)
GPU Status 2d_canvas: enabled
canvas_oop_rasterization: disabled_off
direct_rendering_display_compositor: disabled_off_ok
gpu_compositing: enabled
metal: disabled_off
multiple_raster_threads: enabled_on
opengl: enabled_on
rasterization: enabled
raw_draw: disabled_off_ok
video_decode: enabled
video_encode: enabled
vulkan: disabled_off
webgl: enabled
webgl2: enabled
webgpu: enabled
Load (avg) 4, 6, 7
Memory (System) 16.00GB (0.19GB free)
Process Argv --crash-reporter-id 9dabde45-0274-4158-bbba-eb1f02f0cefb
Screen Reader no
VM 0%

@github-actions github-actions bot added the triage-needed Needs assignment to the proper sub-team label Aug 4, 2023
@eleanorjboyd
Member

Thank you for your issue report. We are looking into this now! In the meantime: you are likely on the new testing rewrite, which is why you saw a change in behavior. You can opt out of the rewrite while I get this fix in by adding this to your user settings: "python.experiments.optOutFrom": ["pythonTestAdapter"]. If this doesn't work, let me know, as that would mean it is a different issue. Thank you, and I will post updates in this thread as I get a fix in.

@eleanorjboyd eleanorjboyd self-assigned this Aug 4, 2023
@github-actions github-actions bot added the info-needed Issue requires more information from poster label Aug 4, 2023
@alimbada
Author

alimbada commented Aug 4, 2023

Hi @eleanorjboyd,

Thanks for your reply. I've actually just got test discovery and debugging working by changing python.testing.cwd to ${workspaceFolder}/backend instead of just backend. I thought I'd tried this before; maybe the combination of setting python.defaultInterpreterPath to backend/venv/bin/python along with python.testing.cwd is what fixed it. Previously I was unsure whether the ${workspaceFolder} placeholder would work, as it did not show up as an autocomplete option when changing that setting, but a suggestion from a StackOverflow comment prompted me to try again, and this time it worked.
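For anyone else hitting this, the combination that works for me looks roughly like the following settings.json fragment (the paths are specific to my project layout above):

```json
{
  "python.defaultInterpreterPath": "backend/venv/bin/python",
  "python.testing.cwd": "${workspaceFolder}/backend",
  "python.testing.pytestArgs": ["tests"],
  "python.testing.pytestEnabled": true
}
```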

There is one outstanding issue which is that running tests without debugging shows 0/0 tests passed 0.00% in the Test Explorer and shows the following in the output (with one client connection line per test):

2023-08-04 17:20:56.782 [info] > ./backend/venv/bin/python ~/.vscode/extensions/ms-python.python-2023.14.0/pythonFiles/vscode_pytest/run_pytest_script.py --rootdir .
2023-08-04 17:20:56.782 [info] cwd: ./backend
2023-08-04 17:20:57.547 [info] Test server connected to a client.
... <SNIP> ...
2023-08-04 17:20:57.962 [info] Test server connected to a client.
2023-08-04 17:20:57.996 [info] Client disconnected

As mentioned in my previous post, I only need to run tests in VS Code to debug so I'm happy with the current situation as I can still run tests from the terminal and have set up a VS Code task to do so.

However, in the interest of providing more data to improve this extension I applied your suggestion of setting python.experiments.optOutFrom but this did not change the behaviour of running without debugging.

@github-actions github-actions bot removed the info-needed Issue requires more information from poster label Aug 4, 2023
@eleanorjboyd
Member

Thanks for the extra info! From your perspective, what is the desired experience for python.testing.cwd? Did you find it confusing that it required you to provide the workspace folder? Just curious about your thoughts.

Yes, the "test server connected to a client" line does print every time; I will work on a fix.

Finally, the issue regarding 0/0 tests passing is interesting. What type of tests are you running: normal pytest tests, or any unittest / subtest cases? If you could provide the output from your test run in the "Python Test Logs", that might help me determine why the display isn't right. Thanks!

@github-actions github-actions bot added the info-needed Issue requires more information from poster label Aug 4, 2023
@JackMorganNZ

JackMorganNZ commented Aug 8, 2023

I would like to add that I've been having a similar issue, reported here, for the past few weeks. Thank you @alimbada for finding a workaround; installing the Little Fox Team extension worked without changing any of the VS Code settings.

@eleanorjboyd Reverting to the previous test adapter allows tests to be discovered and run correctly within the test explorer (both sidebar and inline), but switching to the new test adapter results in the "0/0 tests passed" issue mentioned by @alimbada. I am running pytest with a Django project, with a few extensions installed.

The 'OUTPUT' tab for Python shows (includes redacted lines, happy to send these privately):

2023-08-08 22:16:44.514 [info] Running PYTEST execution for the following test ids: X/X/X.py::XTest::test_X (more tests here, shortened for post)
2023-08-08 22:16:44.515 [info] Server listening on port 49234
2023-08-08 22:16:44.518 [info] Running pytests with arguments: /Users/X/.vscode/extensions/ms-python.python-2023.15.12191008/pythonFiles/vscode_pytest/run_pytest_script.py --rootdir /Users/X/X
2023-08-08 22:16:44.518 [info] > ./X/X/.venv/bin/python ~/.vscode/extensions/ms-python.python-2023.15.12191008/pythonFiles/vscode_pytest/run_pytest_script.py --rootdir .
2023-08-08 22:16:44.518 [info] cwd: ./X/X/
2023-08-08 22:16:52.293 [info] Test server connected to a client.
2023-08-08 22:16:52.334 [info] Test server connected to a client.
2023-08-08 22:16:52.404 [info] Test server connected to a client.
2023-08-08 22:16:52.479 [info] Test server connected to a client.
2023-08-08 22:16:52.526 [info] Test server connected to a client.
2023-08-08 22:16:52.846 [info] Test server connected to a client.
2023-08-08 22:16:52.970 [info] Test server connected to a client.
2023-08-08 22:16:53.064 [info] Client disconnected

The 'TEST RESULTS' tab shows:

Finished running tests!

Happy to do more tests or provide more information; let me know what you need. I've reverted back to the original test adapter for the moment.

@alimbada
Author

Thanks for the extra info! From your perspective what is the desired experience on the python.testing.cwd? Did you find that confusing that it required you to provide the workspace folder? Just curious about your thoughts.

@eleanorjboyd, yes, I wasn't sure if ${workspaceFolder} needed to be part of the cwd as usually the autocomplete suggests the placeholder variable but in this case it did not.

Yes, the test server connected to a client does print every time- I will work on a fix.

Finally the issue regarding 0/0 tests passing is interesting. What type of tests are you running? Like normal pytests or any unittests / subtests? If you could maybe provide the output from your test run from the "python test logs" this might help me determine why the display isn't right. Thanks

I am running normal pytest tests. The Python Test Log output is below; it is the same output as what I get when running the tests in the terminal.

CLIENT: Server listening on port 59928...
Received JSON data: [<snipped list of tests>]
============================= test session starts ==============================
platform darwin -- Python 3.11.4, pytest-7.2.2, pluggy-1.2.0
rootdir: <snipped_project_dir>, configfile: backend/pytest.ini
plugins: md-report-0.3.0, mock-3.10.0, anyio-3.7.1, cov-4.0.0
collected 18 items

tests/<snipped>.py ....                                   [ 22%]
tests/<snipped>.py .                                                       [ 27%]
tests/<snipped>.py .                                          [ 33%]
tests/<snipped>.py ...                                       [ 50%]
tests/<snipped>.py ..                                             [ 61%]
tests/<snipped>.py .                                            [ 66%]
tests/<snipped>.py .....                                 [ 94%]
tests/<snipped>.py .                                           [100%]pytest session has finished, exit status:  0 in discovery?  False


============================== 18 passed in 0.98s ==============================

@github-actions github-actions bot removed the info-needed Issue requires more information from poster label Aug 10, 2023
@MawKKe

MawKKe commented Sep 13, 2023

I was also hit by the "no tests discovered" issue.

Running pytest mymodule in terminal finds and executes tests as expected.

But when configuring tests via VScode palette command "Python: Configure tests"

  • -> select test framework: pytest
  • -> select the directory containing tests: mymodule

Now the Testing view shows mymodule but lists 0/0 tests. Pressing the run button does something ("Finished running tests!") but no test cases are actually run.

Copying the .../run_adapter.py discover ... command from the Python log and running it manually produces JSON output that does mention the test files and cases. So it seems the underlying test discovery mechanism works, but the information just doesn't reach the UI.

Anyway, adjusting the auto-configured setting

"python.testing.pytestArgs": ["mymodule"]

to

"python.testing.pytestArgs": ["${workspaceFolder}/mymodule"]

seems to work around the problem for me. Tests are now discovered and run as expected.


VS Code: 1.82
vscode-python: v2023.16.0
pytest: 7.4.2

Running in a venv created with pdm (version 2.9.2).

@eleanorjboyd
Member

Hi @MawKKe! It seems that your issue may be different; could you open a new issue so we can converse there? Also, can you see whether there is a difference when you are on our new rewrite? You can make sure you are on it by adding this setting to your user settings.json: "python.experiments.optInto": ["pythonTestAdapter"].

You can confirm the rewrite is enabled by setting "python.analysis.logLevel": "Trace" in your user settings, then checking for Experiment 'pythonTestAdapter' is active in your Python logs.
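Putting both suggestions together, the relevant user settings.json fragment would look roughly like this (use "python.experiments.optOutFrom" instead of "python.experiments.optInto" to leave the rewrite):

```json
{
  "python.experiments.optInto": ["pythonTestAdapter"],
  "python.analysis.logLevel": "Trace"
}
```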

Thanks!

@github-actions github-actions bot added the info-needed Issue requires more information from poster label Sep 13, 2023
@eleanorjboyd
Member

Hi @JackMorganNZ sorry for the delay in responding!

Could you send your settings for python.testing.pytestargs and an example of the test files you are running? If you can't share the exact tests you are running a minimal repro or a description of what these tests look like would be great.

Also, to clarify: discovery is working fine but running is not working at all? When you say "0/0 tests passed", no test data is being returned at all, right?

Thank you for your assistance I appreciate any help you can provide!

@pnelson-bto

pnelson-bto commented Oct 10, 2023

I am getting this in my Python Test Output. This was working as expected until yesterday, when it seems VS Code may have updated.
[screenshot of the error attached]

@eleanorjboyd
Member

Hm, very interesting. That block of code is already in a try/except to handle any key errors, but that didn't catch this because it was a different error type. Can you let me know whether you are using any plugins? Also, can you share what machine you are using and any unusual setup there? I'm thinking it tried to throw a KeyError but instead threw an internal error because of the system. Thanks!

@pnelson-bto

I'm on Ubuntu. The other test explorer seems to work fine, but the one by Little Fox Team does not. I prefer that one because of the tree view.

[screenshot attached]

@eleanorjboyd
Member

What do you mean by the other test explorer?

@pnelson-bto

pnelson-bto commented Oct 10, 2023

There are 2 test extensions installed:

  • Python Test Explorer for Visual Studio Code by Little Fox Team
  • Test Explorer UI by Holger Benl

If I switch the Python extension to the pre-release version, the internal error goes away; however, the gutter decorations disappear. Both v2023.16.0 and v2023.18.0 have the same issue.

If I switch the Python extension to v2023.14.0, everything works as expected.

I believe @JackMorganNZ and I are having the same issue.

@tboddyspargo

tboddyspargo commented Oct 11, 2023

I started hitting a problem yesterday/today that seems similar to the comments on this issue. I'm using a workspace configuration where python.testing.pytestArgs is defined in the code-workspace file and each folder inherits it.

"python.testing.pytestArgs": [
      "--disable-warnings",
      "-vv", // Shows full diff on test failure
      "-s", // Shows print statements in test output
    ],

However, I've also tried removing all those args and my experience doesn't really improve.

I see a lot of this in the Python logs:

2023-10-11 10:25:19.336 [error] pytest test discovery error 
  
 The python test process was terminated before it could exit on its own, the process errored with: Code: 4, Signal: null

Followed by:

Disposing data receiver for /Users/me/workspace/folder and deleting UUID; pytest discovery.

The "Python Test Logs" can barely render because there's so much content and it gets cleared/overwritten so quickly, yet what I do see seems to be "status": "success" messages that are listing all the discovered tests, followed by:

Plugin error connection error[vscode-pytest]
[vscode-pytest] data: Content-Length: 13537
Content-Type: application/json
Request-uuid: None

NOTE: On VS Code startup, multiple discovery runs actually seem to succeed before some failure triggers the removal of all preceding results...

My Test Explorer panel seems to be "hiding" the tests that I expect to be discovered (the result of the deleting UUID lines, no doubt).

I can see nothing wrong with my pytestArgs that would cause the 4 exit code...

@eleanorjboyd
Member

Hi, thanks for including this info. When an error occurs while we try to send the data, it prints everything out, which is why you are seeing so much data; it also retries, which causes it to overwrite the data printed first.

What is your setup for your machine? Are you using SSH or anything to connect to a remote machine and what are the machine types?

@tboddyspargo

Thanks for your response, @eleanorjboyd!

What is your setup for your machine? Are you using SSH or anything to connect to a remote machine and what are the machine types?

I am not using SSH for anything. Remote machines are not involved. I'm running on a MacOS 14 Apple Silicon machine using the arm64 build of VSCode. Are there any other machine details that would be helpful?

Perhaps also relevant: in my setup, each workspace folder that runs pytest relies on a relative .vscode/settings.json file that defines its python.defaultInterpreterPath as .venv/bin/python (meaning each workspace folder has its own local virtual environment that should be used for test discovery). From my brief reading of the logs, I believe this is happening as expected.

2023-10-11 12:36:10.174 [info] > ~/workspace/folder/.venv/bin/python -m pytest -p vscode_pytest --collect-only --disable-warnings -vv -s
2023-10-11 12:36:10.174 [info] cwd: ~/workspace/folder

One other note: I knew about, but wasn't previously concerned by, missing packages in some of my workspace folder virtual environments. As a test, I started installing those missing dependencies, and more and more test discovery results seemed to populate the Test Explorer.

This, again, makes me suspect there is some unintended interdependence between spawned process results for multiple workspace folders. I would expect the outcome of test discovery for one workspace folder never to affect another. Yet the pattern of tests enumerating and then disappearing, combined with the observation that results improve as exceptions are addressed, is rather suggestive.

@eleanorjboyd
Member

This seems similar to #22192 (comment). The UUID is shown as not defined, which means the port would likely not be defined either. What version of the Python extension are you on, @tboddyspargo?

@eleanorjboyd
Member

@pnelson-bto, our team only supports the core functionality of the Python extension, not other extensions you can install such as the Little Fox Team's. If you are seeing issues with the default test explorer I can help you there; otherwise I would reach out to the Little Fox Team directly, as they can help with issues in their extension. Thanks!

@tboddyspargo

What version of the python extension are you on @tboddyspargo?

I'm on ms-python.python version v2023.18.0

One other note: I have known, but wasn't previously concerned about, missing packages in some of my workspace folder virtual environments. As a test, I started installing those missing dependencies and more and more test discovery results seemed to populate the Test Explorer.

Further experimentation down this path resulted in me resolving all test discovery issues (e.g. exceptions for missing imports, non-zero exit codes when no tests are discovered, etc.) in all workspace folders. That results in all tests enumerating and populating the Test Explorer as expected, without disappearing. Again, a big contributor to the impact of the problem seems to be discovery errors in workspace folderB clearing the successfully enumerated tests from workspace folderA.

@github-actions github-actions bot removed the info-needed Issue requires more information from poster label Oct 12, 2023
@eleanorjboyd
Member

Hm, yes, I can see how this would happen. The test explorer is cleared when the Python subprocess exits with a non-zero exit code, which happens when this other issue arises. I will put in a fix for this.

@github-actions github-actions bot added the info-needed Issue requires more information from poster label Oct 12, 2023
@eleanorjboyd
Member

Hello! You're experiencing an issue with the pytest hooks in our extension's implementation. Do you have any other plugins in use? They might be conflicting with each other. Otherwise, I think #22240 will fix it. Once that is merged, if you could try the newest version of the Python extension tomorrow, that would be extremely helpful! Thanks

eleanorjboyd added a commit that referenced this issue Oct 17, 2023
@msonsona

Thanks for the super-quick fix, will definitely try it out!

@MaxHorwood

I was having this issue. Switching to the pre-release fixed it for me. However, when the tests run I no longer get the green "tick" despite all my tests passing. The red circle shows as normal when a test fails.

@msonsona

Hmm, I can't seem to get it working yet despite using the pre-release version (currently v2023.19.12901009).

@eleanorjboyd
Member

@MaxHorwood can you send over your logs and a description of your workspace?

@eleanorjboyd
Member

@msonsona can you send over a minimal repro of what is failing? I might not have gotten your configuration right and therefore not fixed the root cause. Also, do you have any pytest plugins in use, and is the error you're experiencing still the same? Sorry for all the questions!

@msonsona

I'm not sure how to check the pytest plugins, but if I run the test discovery command directly from the command line without any additional arguments, it runs properly.

The issue I see in my environment is that VS Code is trying to discover the tests using the vscode_pytest plugin:

2023-10-18 18:02:41.987 [info] > . ./env/bin/activate && echo 'e8b39361-0157-4923-80e1-22d70d46dee6' && python ~/.vscode/extensions/ms-python.python-2023.19.12901009/pythonFiles/printEnvVariables.py
2023-10-18 18:02:41.988 [info] shell: bash
2023-10-18 18:02:42.087 [info] > ./env/bin/python -m pytest -p vscode_pytest --collect-only prime_test
2023-10-18 18:02:42.087 [info] cwd: ./prime_test
2023-10-18 18:02:42.446 [error] Traceback (most recent call last):
...

but it's not present in my environment:

$ ./env/bin/python -m pytest -p vscode_pytest --collect-only prime_test
Traceback (most recent call last):
  File "/Users/me/code/repo/env/lib/python3.8/site-packages/_pytest/config/__init__.py", line 522, in import_plugin
    __import__(importspec)
ModuleNotFoundError: No module named 'vscode_pytest'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/opt/homebrew/opt/[email protected]/Frameworks/Python.framework/Versions/3.8/lib/python3.8/runpy.py", line 194, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/opt/homebrew/opt/[email protected]/Frameworks/Python.framework/Versions/3.8/lib/python3.8/runpy.py", line 87, in _run_code
    exec(code, run_globals)
...
ImportError: Error importing plugin "vscode_pytest": No module named 'vscode_pytest'

If I try to pip install it, that doesn't work, so I'm not sure whether that flag can be disabled so that test discovery runs without it?

@github-actions github-actions bot removed the info-needed Issue requires more information from poster label Oct 18, 2023
@tboddyspargo

tboddyspargo commented Oct 18, 2023

I don't want to distract from any of the other repros, but I did just identify a case where we had a pytest hook that ran os.environ.clear() (even during discovery, apparently). This was bad practice (and we've removed it), but it did cause VS Code discovery to fail until it was removed. Presumably this is because discovery success/completion depends on specific environment variables (e.g. TEST_UUID, TEST_PORT, etc.) being present even after tests run.
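As a minimal sketch of the safer pattern (the function and variable names here are illustrative, not from any real project): remove only the variables your tests own instead of wiping the whole environment, so adapter variables such as TEST_UUID and TEST_PORT survive.

```python
import os

def clear_test_env(keys):
    """Remove only the given environment variables, instead of
    os.environ.clear(), leaving adapter variables like
    TEST_UUID / TEST_PORT intact."""
    for key in keys:
        os.environ.pop(key, None)

# Example: wipe only a variable the tests themselves set.
os.environ["MY_TEST_FLAG"] = "1"
clear_test_env(["MY_TEST_FLAG"])
print("MY_TEST_FLAG" in os.environ)  # False
```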

@eleanorjboyd
Member

The vscode_pytest plugin is a custom plugin we wrote to collect all discovery/run results and return them to our extension via a socket; it is required for the implementation of our new rewrite. The plugin lives inside the bundled extension, and we add its location to the Python path before each run, which is why we can find the plugin when we run it and you cannot when you run it locally. You can add the path to PYTHONPATH before running from the command line and it should then work. The path to add is the root of the extension plus pythonFiles; there you will see a folder called vscode_pytest, which is the plugin we created, and Python will now be able to find it.

With that said, it seems there is not a problem with your configuration (since it works normally from the command line). The next step is for me to try to repro your behavior or understand what your workspace looks like. What version of pytest are you using, and are you able to make a minimal repro to send my way?

This error comes from the hook implementation in our plugin not aligning with what pytest expects, so I am trying to figure out where that mismatch could have occurred in your configuration. Thanks!
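A sketch of that PYTHONPATH workaround (the extension path below is an assumption taken from the logs earlier in this thread; substitute whatever version is actually installed):

```python
import os

# Hypothetical extension install path -- adjust the version suffix
# to match your own ~/.vscode/extensions directory.
plugin_root = os.path.expanduser(
    "~/.vscode/extensions/ms-python.python-2023.19.12901009/pythonFiles"
)

# Prepend it to PYTHONPATH so a terminal-launched
# `python -m pytest -p vscode_pytest ...` can import the plugin.
existing = os.environ.get("PYTHONPATH", "")
os.environ["PYTHONPATH"] = (
    plugin_root + os.pathsep + existing if existing else plugin_root
)
print(os.environ["PYTHONPATH"].split(os.pathsep)[0] == plugin_root)  # True
```

The same effect can be had in a shell with `export PYTHONPATH=<extension root>/pythonFiles:$PYTHONPATH` before invoking pytest.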

@github-actions github-actions bot added the info-needed Issue requires more information from poster label Oct 18, 2023
@msonsona

are you able to make a minimal repro to send my way?

Sorry for the ignorance, but what would be the best way to provide this?

I might be using a quite old version of pytest; to be honest, this is what I get when querying the version:

$ pytest --version
This is pytest version 3.10.1, imported from /Users/me/code/repo/env/lib/python3.8/site-packages/pytest.py
setuptools registered plugins:
  celery-4.4.0 at /Users/me/code/repo/env/lib/python3.8/site-packages/celery/contrib/pytest.py
  pytest-cov-2.6.1 at /Users/me/code/repo/env/lib/python3.8/site-packages/pytest_cov/plugin.py
  pytest-profiling-1.6.0 at /Users/me/code/repo/env/lib/python3.8/site-packages/pytest_profiling.py
  pytest-benchmark-3.2.2 at /Users/me/code/repo/env/lib/python3.8/site-packages/pytest_benchmark/plugin.py
  pytest-split-0.1.5 at /Users/me/code/repo/env/lib/python3.8/site-packages/pytest_split/plugin.py

@github-actions github-actions bot removed the info-needed Issue requires more information from poster label Oct 18, 2023
@MaxHorwood

MaxHorwood commented Oct 18, 2023

@MaxHorwood can you send over your logs and a description of your workspace?

@eleanorjboyd The logs don't show much, just what I'd expect; the example below is from a single test, but it is the same when running all of them.
The workspace is a docker-compose setup which spins up a database, runs some migrations, etc., and the tests connect to that. There are >500 tests; could it just be the quantity?
I have another, separate project/workspace in which everything works as expected.
(These may be the wrong logs; happy to provide any others.)

Pytest Logs:

CLIENT: Server listening on port 57459...
Received JSON data in run script
============================= test session starts ==============================
platform darwin -- Python 3.10.6, pytest-7.2.1, pluggy-1.0.0
rootdir: /...., configfile: pyproject.toml
plugins: asyncio-0.20.3, flaky-3.7.0, Faker-17.0.0, cov-4.0.0
asyncio: mode=auto
collected 1 item

source/tests/unit/test_connection.py . [100%]

=============================== warnings summary ===============================
source/.... MovedIn20Warning: Deprecated API features detected! These feature(s) are not compatible with SQLAlchemy 2.0. To prevent incompatible upgrades prior to updating applications, ensure requirements files are pinned to "sqlalchemy<2.0". Set environment variable SQLALCHEMY_WARN_20=1 to show all deprecation warnings. Set environment variable SQLALCHEMY_SILENCE_UBER_WARNING=1 to silence this message. (Background on SQLAlchemy 2.0 at: https://sqlalche.me/e/b8d9)
Base = declarative_base(

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
========================= 1 passed, 1 warning in 0.53s =========================
Finished running tests!

@eleanorjboyd
Member

@msonsona, could you try running pip install --upgrade pytest and then running again? We are on pytest 7.4.2 now; I am not sure whether there have been changes to this specific hook, but that could be an issue.

In terms of a minimal repro, this is just a small example project that replicates the bug. If you are happy to share your current project, you could send that over so I can try it myself; since some people do not like sharing their whole projects, a minimal repro could be a small project you create where you still see the bug. For example, if you create a new workspace, create a simple pytest test, and run it, does the bug still exist? If so, send over that simple test so we are talking about running the same code. Thanks!
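For instance, a minimal repro project could be as small as a single file; the names below are purely illustrative:

```python
# test_sample.py -- a minimal pytest file a repro project might contain.
def add(a, b):
    return a + b

def test_add():
    # pytest discovers any test_* function in a test_* file.
    assert add(2, 3) == 5
```

If discovery or the "0/0 tests passed" behavior still occurs with a file this small, the repro is trivial to share.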

@github-actions github-actions bot added the info-needed Issue requires more information from poster label Oct 18, 2023
@eleanorjboyd
Member

@MaxHorwood we have done many size tests (with repos of >20,000 tests) to guard against any size-related issues, so hopefully that's not it. It seems your pytest run works and doesn't error, so it might be on the VS Code parsing side. Can you send the "Python" output logs? These will show what happens once VS Code receives the data and why you would "no longer get the green 'tick'" despite your tests all passing. Specifically, there will be something printing a payload, and I am curious what it says for the tests you expect to be successful.

@MaxHorwood

@eleanorjboyd 👍
This?

2023-10-18 21:54:31.627 [info] > . ./.venv/bin/activate && echo 'e8b39361-0157-4923-80e1-22d70d46dee6' && python ~/.vscode/extensions/ms-python.python-2023.19.12901009/pythonFiles/printEnvVariables.py
2023-10-18 21:54:31.627 [info] shell: bash
2023-10-18 21:54:31.816 [info] Server listening on port 58685
2023-10-18 21:54:31.816 [info] All environment variables set for pytest execution: {"COMMAND_MODE":"unix2003","CPPFLAGS":"-I/opt/homebrew/opt/postgresql@13/include","DEFAULT_USER":"myname","DOCKER_DEFAULT_PLATFORM":"linux/amd64/v8","HOME":"/Users/myname","HOMEBREW_CELLAR":"/opt/homebrew/Cellar","HOMEBREW_PREFIX":"/opt/homebrew","HOMEBREW_REPOSITORY":"/opt/homebrew","INFOPATH":"/opt/homebrew/share/info:","LDFLAGS":"-L/opt/homebrew/opt/postgresql@13/lib","LESS":"-R","LOGNAME":"myname","LSCOLORS":"Gxfxcxdxbxegedabagacad","LS_COLORS":"di=1;36:ln=35:so=32:pi=33:ex=31:bd=34;46:cd=34;43:su=30;41:sg=30;46:tw=30;42:ow=30;43","MANPATH":"/opt/homebrew/share/man::","MallocNanoZone":"0","OLDPWD":"/","ORIGINAL_XDG_CURRENT_DESKTOP":"undefined","PAGER":"less","PIPENV_PYTHON":"/Users/myname/.pyenv/shims/python","PWD":"/","PYENV_ROOT":"/Users/myname/.pyenv","PYENV_SHELL":"zsh","PYENV_VIRTUALENV_INIT":"1","SHELL":"/bin/zsh","SHLVL":"0","SSH_AUTH_SOCK":"/private/tmp/com.apple.launchd.ESlJzGokBS/Listeners","TMPDIR":"/var/folders/rl/4h85cf_12gb2_zp3rr2lrq7c0000gn/T/","USER":"myname","VSCODE_AMD_ENTRYPOINT":"vs/workbench/api/node/extensionHostProcess","VSCODE_CODE_CACHE_PATH":"/Users/myname/Library/Application Support/Code/CachedData/fdb98833154679dbaa7af67a5a29fe19e55c2b73","VSCODE_CRASH_REPORTER_PROCESS_TYPE":"extensionHost","VSCODE_CWD":"/","VSCODE_HANDLES_UNCAUGHT_ERRORS":"true","VSCODE_IPC_HOOK":"/Users/myname/Library/Application Support/Code/1.82-main.sock","VSCODE_NLS_CONFIG":"{\"locale\":\"en-gb\",\"osLocale\":\"en-gb\",\"availableLanguages\":{},\"_languagePackSupport\":true}","VSCODE_PID":"24946","XPC_FLAGS":"0x0","XPC_SERVICE_NAME":"application.com.microsoft.VSCode.18399353.18399359","ZSH":"/Users/myname/.oh-my-zsh","_":"/Applications/Visual Studio 
Code.app/Contents/MacOS/Electron","__CFBundleIdentifier":"com.microsoft.VSCode","__CF_USER_TEXT_ENCODING":"0x1F5:0:2","ELECTRON_RUN_AS_NODE":"1","VSCODE_L10N_BUNDLE_LOCATION":"","PATH":"/opt/homebrew/opt/node@18/bin:/opt/homebrew/opt/postgresql@13/bin:/opt/homebrew/bin:/opt/homebrew/sbin:/opt/homebrew/Cellar/pyenv-virtualenv/1.1.5/shims:/Users/myname/.pyenv/shims:/usr/local/bin:/Users/myname/.pyenv/bin:/Users/myname/bin:/usr/local/bin:/usr/local/bin:/usr/bin:/bin:/usr/sbin:/sbin:~/.pyenv:/var/run/com.apple.security.cryptexd/codex.system/bootstrap/usr/local/bin:/var/run/com.apple.security.cryptexd/codex.system/bootstrap/usr/bin:/var/run/com.apple.security.cryptexd/codex.system/bootstrap/usr/appleinternal/bin:/opt/homebrew:/opt/homebrew/bin:/opt/homebrew/Cellar/rabbitmq/3.11.0/sbin","PYTHONPATH":"/Users/myname/.vscode/extensions/ms-python.python-2023.19.12901009/pythonFiles","TEST_UUID":"470acf28-643c-47e7-aabe-a90b1c0a7667","TEST_PORT":"54271","RUN_TEST_IDS_PORT":"58685"}
2023-10-18 21:54:31.816 [info] Running pytest with arguments: /Users/myname/.vscode/extensions/ms-python.python-2023.19.12901009/pythonFiles/vscode_pytest/run_pytest_script.py --rootdir <hidden>/DataModel

2023-10-18 21:54:31.816 [info] > ./.venv/bin/python ~/.vscode/extensions/ms-python.python-2023.19.12901009/pythonFiles/vscode_pytest/run_pytest_script.py --rootdir .
2023-10-18 21:54:31.816 [info] cwd: .
2023-10-18 21:54:32.829 [info] Test server connected to a client.
2023-10-18 21:54:32.832 [info] ResultResolver EOT received for execution.
2023-10-18 21:54:32.850 [info] Client disconnected
2023-10-18 21:54:32.912 [info] Disposing data receiver for <hidden>/DataModel and deleting UUID; pytest execution.
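As a side note for anyone reproducing this: the log above shows the extension driving pytest collection from the workspace cwd, which matches what `python -m pytest --collect-only tests` does in a terminal. A minimal, self-contained sketch of that terminal check, using a hypothetical throwaway project rather than the reporter's repo (all paths and names here are made up for illustration):

```python
import subprocess
import sys
import tempfile
import textwrap
from pathlib import Path

# Build a throwaway project with a tests/ folder and one passing test,
# then run pytest's collection phase against it from the project root,
# mirroring `python -m pytest --collect-only tests`.
tmp = Path(tempfile.mkdtemp())
tests = tmp / "tests"
tests.mkdir()
(tests / "test_smoke.py").write_text(textwrap.dedent("""\
    def test_smoke():
        assert 1 + 1 == 2
"""))

result = subprocess.run(
    [sys.executable, "-m", "pytest", "--collect-only", "-q", "tests"],
    cwd=tmp,
    capture_output=True,
    text=True,
)
print(result.stdout)
```

If collection succeeds here but fails in the extension, the difference is usually the cwd or environment the extension is using, which is why the logs above matter.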

@eleanorjboyd
Member

Hm, this is not what I expected to see. I have just added some extra logging, which will be in the pre-release of the extension tomorrow. Would you be able to try again tomorrow and send the same logs? You should see more log output regarding the data received.

@MaxHorwood

MaxHorwood commented Oct 25, 2023

Sorry for the delay; this is now working as expected. I don't believe I've changed anything on my end. Maybe it was related to sleeping the computer (which I do regularly instead of shutting down), though I also had to reload VS Code when switching to the pre-release anyway, so that's a bit odd.
Version: v2023.19.12980103

If I switch back to the release version, it is still broken.

@msonsona

> @msonsona, could you try running pip install --upgrade pytest and then running again? We are on pytest 7.4.2 now, so I am not sure if there have been changes to this specific hook, but that could be an issue.

After upgrading pytest to 7.4.3 (it's a work-related repo, so I'm not 100% sure about the feasibility of upgrading on my end, but let's try it for now), I also upgraded to the pre-release version of the extension (now running v2023.19.12981006) and had to remove the folder argument for pytest that was originally set in my .vscode/settings.json:

{
    // "python.testing.pytestArgs": [
    //     "repo_test"
    // ],
    "python.testing.unittestEnabled": false,
    "python.testing.pytestEnabled": true,
    "python.testing.pytestPath": "/Users/me/code/repo/env/bin/pytest",
    "git.closeDiffOnOperation": true,
    "python.testing.cwd": "/Users/me/code/repo/repo_test",
    "python.analysis.typeCheckingMode": "off",
    "python.analysis.autoImportCompletions": true
}

and now it seems to be able to discover and run tests!
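A likely explanation for why removing that argument helped (this is an assumption on my part, not confirmed in the thread): the extension runs pytest from the directory given in python.testing.cwd, and pytest resolves path arguments relative to its working directory, so with cwd already set to /Users/me/code/repo/repo_test, the extra "repo_test" in pytestArgs would make pytest look for a nonexistent repo_test/repo_test folder. Under that assumption, a minimal configuration (with the same hypothetical paths as above) would be:

```json
{
    "python.testing.unittestEnabled": false,
    "python.testing.pytestEnabled": true,
    "python.testing.pytestPath": "/Users/me/code/repo/env/bin/pytest",
    "python.testing.cwd": "/Users/me/code/repo/repo_test"
}
```

In other words, point either cwd or pytestArgs at the test folder, but not both.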

thanks @eleanorjboyd for your support! 🙌

@eleanorjboyd
Member

eleanorjboyd commented Oct 25, 2023

Hi @MaxHorwood, glad it now works! It's odd that it fixed itself unexpectedly, but let me know if it resurfaces.

@msonsona, glad the upgrade worked as well! When you mention feasibility, do you mean upgrading at your company, as opposed to just on your own machine? Let me know if you have any other questions; I am closing this as resolved in the meantime. Thanks!

@github-actions github-actions bot locked as resolved and limited conversation to collaborators Nov 25, 2023
8 participants