
Cypress crashes with "We detected that the Chromium Renderer process just crashed." #27415

Open · Lasidar opened this issue Jul 28, 2023 · 119 comments
Labels: stage: investigating, type: performance 🏃‍♀️

@Lasidar commented Jul 28, 2023

Test code to reproduce

I cannot provide the specific test code I am using to reproduce this, since it contains internal company data. What I can tell you is that these tests run for quite a long time (5-10 minutes), as they contain many steps through the E2E flow of our application.

Cypress Mode

cypress run

Cypress Version

12.13.0

Browser Version

Chrome 110.0.5481.96

Node version

v16.18.1

Operating System

Debian GNU/Linux 11 (Docker image cypress/browsers:node-16.18.1-chrome-110.0.5481.96-1-ff-109.0-edge-110.0.1587.41-1)

Memory Debug Logs

Only the end of the memory dump from the failing test has been included, due to limitations of this input form.

[
  {
    "checkMemoryPressureDuration": 7764.029118001461,
    "testTitle": "Crashing test (example #1)",
    "testOrder": 18.2,
    "garbageCollected": true,
    "timestamp": 1690553650738
  },
  ...
  {
    "getRendererMemoryUsageDuration": 2.619260013103485,
    "totalMemoryWorkingSetUsed": 6995816448,
    "getAvailableMemoryDuration": 58.69076597690582,
    "jsHeapSizeLimit": 4294705152,
    "totalMemoryLimit": 9223372036854772000,
    "rendererProcessMemRss": 5469155328,
    "rendererUsagePercentage": 127.34646813770371,
    "rendererMemoryThreshold": 2147352576,
    "currentAvailableMemory": 9223372029858955000,
    "maxAvailableRendererMemory": 4294705152,
    "shouldCollectGarbage": true,
    "timestamp": 1690553801030,
    "calculateMemoryStatsDuration": 58.72436600923538
  },
  {
    "getRendererMemoryUsageDuration": 2.208419978618622,
    "totalMemoryWorkingSetUsed": 5089853440,
    "getAvailableMemoryDuration": 61.31387501955032,
    "jsHeapSizeLimit": 4294705152,
    "totalMemoryLimit": 9223372036854772000,
    "rendererProcessMemRss": 0,
    "rendererUsagePercentage": 0,
    "rendererMemoryThreshold": 2147352576,
    "currentAvailableMemory": 9223372031764918000,
    "maxAvailableRendererMemory": 4294705152,
    "shouldCollectGarbage": false,
    "timestamp": 1690553802092,
    "calculateMemoryStatsDuration": 61.33369600772858
  },
  {
    "getRendererMemoryUsageDuration": 2.69021999835968,
    "totalMemoryWorkingSetUsed": 1682976768,
    "getAvailableMemoryDuration": 50.05962598323822,
    "jsHeapSizeLimit": 4294705152,
    "totalMemoryLimit": 9223372036854772000,
    "rendererProcessMemRss": 0,
    "rendererUsagePercentage": 0,
    "rendererMemoryThreshold": 2147352576,
    "currentAvailableMemory": 9223372035171795000,
    "maxAvailableRendererMemory": 4294705152,
    "shouldCollectGarbage": false,
    "timestamp": 1690553803143,
    "calculateMemoryStatsDuration": 50.07922601699829
  },
  {
    "getRendererMemoryUsageDuration": 2.889739990234375,
    "totalMemoryWorkingSetUsed": 1682792448,
    "getAvailableMemoryDuration": 60.31445497274399,
    "jsHeapSizeLimit": 4294705152,
    "totalMemoryLimit": 9223372036854772000,
    "rendererProcessMemRss": 0,
    "rendererUsagePercentage": 0,
    "rendererMemoryThreshold": 2147352576,
    "currentAvailableMemory": 9223372035171979000,
    "maxAvailableRendererMemory": 4294705152,
    "shouldCollectGarbage": false,
    "timestamp": 1690553804204,
    "calculateMemoryStatsDuration": 60.33361500501633
  },
  {
    "getRendererMemoryUsageDuration": 2.6974300146102905,
    "totalMemoryWorkingSetUsed": 1682558976,
    "getAvailableMemoryDuration": 225.94400304555893,
    "jsHeapSizeLimit": 4294705152,
    "totalMemoryLimit": 9223372036854772000,
    "rendererProcessMemRss": 0,
    "rendererUsagePercentage": 0,
    "rendererMemoryThreshold": 2147352576,
    "currentAvailableMemory": 9223372035172213000,
    "maxAvailableRendererMemory": 4294705152,
    "shouldCollectGarbage": false,
    "timestamp": 1690553805431,
    "calculateMemoryStatsDuration": 225.9711429476738
  }
]

Other

Our test specs that contain multiple long-running tests are prone to crashing mid-run in CI. This seems more likely when there are test retries in the run. We are running with both experimentalMemoryManagement set to true and numTestsKeptInMemory set to 0. We also have the memory and CPU allocation in our GitLab runners set quite high (see below). Despite this, we still get the crashes. Example:

Some top level test description
    (Attempt 1 of 4) A test scenario containing a scenario outline template (example #1)
    (Attempt 2 of 4) A test scenario containing a scenario outline template (example #1)
    ✓ A test scenario containing a scenario outline template (example #1) (849857ms)
    ✓ A test scenario containing a scenario outline template (example #2) (360954ms)
    ✓ A test scenario containing a scenario outline template (example #3) (556574ms)
    (Attempt 1 of 4) A test scenario containing a scenario outline template (example #4)

We detected that the Chromium Renderer process just crashed.

This can happen for a number of different reasons.

If you're running lots of tests on a memory intense application.
  - Try increasing the CPU/memory on the machine you're running on.
  - Try enabling experimentalMemoryManagement in your config file.
  - Try lowering numTestsKeptInMemory in your config file during 'cypress open'.

You can learn more here:

https://on.cypress.io/renderer-process-crashed

Here are the memory allocations we are providing in GitLab CI:

  KUBERNETES_CPU_REQUEST: "3"
  KUBERNETES_CPU_LIMIT: "4"
  KUBERNETES_MEMORY_REQUEST: "12Gi"
  KUBERNETES_MEMORY_LIMIT: "32Gi"

It should be noted that these tests run within Docker in CI, using the cypress/browsers:node-16.18.1-chrome-110.0.5481.96-1-ff-109.0-edge-110.0.1587.41-1 image.

We are utilizing the cypress-cucumber-preprocessor library, but I do not believe that is having any impact on this issue.

@Lasidar (Author) commented Jul 28, 2023

I have a theory as to what is causing this. It is based on a few logical leaps, but I think they are reasonable. I believe the issue is mainly caused by the open issues involving Cypress and K8s (see jlandure/alpine-chrome#109).

Looking at the logs in question, I noticed that jsHeapSizeLimit is being exceeded right before the crash.

    "jsHeapSizeLimit": 4294705152,
    "rendererProcessMemRss": 5469155328,
    "rendererUsagePercentage": 127.34646813770371,

This is strange, since we have 12GB allocated to the GitLab runner, with the ability to scale up to 32GB if the process calls for it. But looking at the issue linked above, I believe it prevents the Chrome renderer from scaling up its heap memory usage, even when the system has capacity.

From looking at the Cypress source code, it seems the renderer process memory budget is derived from the Chrome JS heap, and if we exceed it, we are likely exhausting the heap, which probably leads to the crash.

  // the max available memory is the minimum of the js heap size limit and
  // the current available memory plus the renderer process memory usage
  const maxAvailableRendererMemory = Math.min(jsHeapSizeLimit, currentAvailableMemory + rendererProcessMemRss)
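
Plugging the logged values from the crash into this formula shows the cap in action. A quick sketch; the percentage derivation below is my assumption, but it reproduces the logged 127.35% almost exactly:

// Values copied from the memory dump above
const jsHeapSizeLimit = 4294705152;              // ~4.0 GiB
const currentAvailableMemory = 9223372029858955000;
const rendererProcessMemRss = 5469155328;        // ~5.1 GiB

// Cypress's formula: the heap limit wins, so the renderer budget is ~4 GiB
const maxAvailableRendererMemory = Math.min(
  jsHeapSizeLimit,
  currentAvailableMemory + rendererProcessMemRss
); // = 4294705152

// Assumed derivation of the logged percentage: RSS over the budget
const rendererUsagePercentage =
  (rendererProcessMemRss / maxAvailableRendererMemory) * 100; // ≈ 127.35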

So my workaround, at least temporarily, is to increase Chrome's jsHeapSizeLimit. I was able to achieve this through the following additions to my cypress/plugins/index.js file:

module.exports = (on, config) => {
  on("before:browser:launch", (browser = {}, launchOptions) => {
    if (browser.family === "chromium") {
      // raise V8's old-space and semi-space limits for the renderer
      launchOptions.args.push("--js-flags=--max_old_space_size=1024 --max_semi_space_size=1024");
    }
    return launchOptions;
  });
};

This seems to have made my memory crashing issues go away for the time being. I believe the correct fix for this is for jlandure/alpine-chrome#109 to be resolved, since as I mentioned above, I suspect this issue prevents the process from being able to properly scale up its heap size.
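
One way to sanity-check that the flag actually took effect: performance.memory (a non-standard, Chrome-only API) exposes the renderer's heap limit, so a throwaway spec can log it. A sketch; the test name and selector-free usage are mine, not part of the workaround above:

// Hypothetical spec: confirm the --js-flags override raised the heap limit
it('logs the renderer JS heap size limit', () => {
  cy.window().then((win) => {
    const mem = win.performance.memory; // non-standard, Chrome-only
    if (mem) {
      cy.log(`jsHeapSizeLimit: ${mem.jsHeapSizeLimit} bytes`);
    }
  });
});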

@Pardoniano

I started to have the same problem this week. I already tried your solution in cypress.config.ts, with --max_old_space_size=4096, and it still crashes on a specific element. It crashes in Chrome and Edge, but works fine in Electron.

@uladzimirdev

Increasing max_old_space_size helps locally, but I can clearly see memory usage increasing on Chrome 115; it can reach 3.8GB in my case.

@Pardoniano

I tested it in the homolog (staging) environment and encountered this error. Interestingly, the button that causes the crash works perfectly outside of Cypress, and I have tested it in various scenarios with Cypress. The Cypress version used is 12.7.0.

Here are the details of the environments where the issue occurred:

  1. Environment: Windows 22621.1992
    Browser: Chrome - Version 115.0.5790.110 (Official Build) (64-bit)
    Result: Didn't work

  2. Environment: macOS Ventura version 13.4.1 (c) (22F770820d)
    Browser: Chrome - Version 115.0.5790.114 (Official Build) (arm64)
    Result: Didn't work

  3. Environment: macOS Ventura version 13.1 (22C65)
    Browser: Chrome - Version 115.0.5790.114 (Versão oficial) (arm64)
    Result: Didn't work

  4. Environment: Ubuntu
    Browser: Chrome - Version 113.0.5672.126
    Result: Worked fine

  5. Also tested it in our pipeline, and it worked without any issues. The Docker image used was cypress/browsers:node16.14.2-slim-chrome100-ff99-edge.

I suspect that Chrome version 115 might be causing the problem, possibly due to memory leaks or other issues; I am just guessing so far. Additionally, in the Electron app, after clicking the specific button, the Cypress console shows one of the following error messages:

  1. ResizeObserver loop limit exceeded;
  2. ResizeObserver loop completed with undelivered notifications.

which I already fixed a while back with Cypress.on("uncaught:exception").
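
For anyone hitting the same errors, a minimal sketch of that handler, assuming it lives in the support file (cypress/support/e2e.js); the message check covers both variants listed above:

// Ignore the benign ResizeObserver loop errors so they don't fail the test
Cypress.on('uncaught:exception', (err) => {
  if (err.message.includes('ResizeObserver loop')) {
    return false; // returning false tells Cypress not to fail the test
  }
  // all other uncaught exceptions still fail the test as usual
});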

Unfortunately, this is all the information I currently have, as I haven't had enough time to investigate further. I tried the workaround above, but sadly it didn't work. So for now my workaround is going to be running on Electron.

@nagash77 (Contributor) commented Aug 4, 2023

Hi everyone on this thread. Thank you for submitting this ticket and for all the details and investigation you have done. We continually revisit performance and resource utilization within Cypress, so these are great data points. Unfortunately, due to the nature of these problems, they can be extremely hard to reproduce and even harder to track down, so I cannot give any timeline for a resolution. If anyone seeing this ticket has ideas about culprits, I encourage you to open a PR, and we will work with you to get it merged if it shows that it can help with this problem.

@Adrian-Sana

I am having the same issue. It crashes with Chrome 115, but if I download the stable 114 from here: https://chromium.cypress.io/mac/ and then run with --browser [path to the downloaded Chrome], it works.

@Lasidar (Author) commented Aug 4, 2023

@nagash77 I understand that actually finding the root cause here will be extremely difficult. One thing I did find interesting when analyzing the memory usage leading up to the crash is that, between tests, the memory usage never returns to a baseline level; it is continually increasing. This is even with experimentalMemoryManagement set to true and numTestsKeptInMemory set to 0. I find this strange, since I am not sure what would be persisting in memory between tests.

[Screenshot: renderer memory usage climbing steadily across tests without returning to baseline]

If a true root-cause fix is not possible, I have a few suggestions:

  • Give users control over the browser heap size allocation via a parameter in cypress.config.js (see the sketch after this list). That way, instead of setting it through a before:browser:launch handler, it could be set more transparently via a test configuration option.
  • In the event of a renderer crash, print more verbose memory statistics (e.g. max rendererProcessMemRss used, jsHeapSizeLimit, etc.).
  • If possible, try to solve the K8s / Docker issue (i.e. ☸️ Can not run in Kubernetes? jlandure/alpine-chrome#109), which I believe may be contributing to this issue.
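
To make the first suggestion concrete, here is a sketch of what such an option might look like; browserHeapSizeMb is purely hypothetical and not an existing Cypress option:

// cypress.config.js (hypothetical API, for illustration only)
const { defineConfig } = require('cypress');

module.exports = defineConfig({
  e2e: {
    // Cypress could translate this into the corresponding --js-flags
    // (e.g. --max_old_space_size) when launching Chromium
    browserHeapSizeMb: 4096, // hypothetical option name
  },
});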

@ChristianKlima

Since version 115, Chrome seems to be overwhelmed by the layout of our application combined with Cypress. Our application runs unstably when the command log window is displayed. As soon as an action is performed in the application that writes a new entry to the command log, the browser crashes. This is usually not easy to reproduce, but we were eventually able to reproduce it in one test. Debugging the Chrome crash dumps didn't help us either. Everything was stable up to version 114 of Chrome. Our solution for now is to set CYPRESS_NO_COMMAND_LOG: 1. Maybe it will help someone. We hope that at some point the problem will solve itself with a Chrome update or with a simpler Cypress layout. Unfortunately, this is not easy to reproduce in a small sample application.

@olahaida commented Aug 22, 2023

I ran into the same issue. A set of tests that I was running without any problems a month ago is now crashing with:

"We detected that the Chromium Renderer process just crashed.
This can happen for a number of different reasons.
If you're running lots of tests on a memory intense application.

  • Try increasing the CPU/memory on the machine you're running on.
  • Try enabling experimentalMemoryManagement in your config file.
  • Try lowering numTestsKeptInMemory in your config file during 'cypress open'."

I've added experimentalMemoryManagement: true to my config file and also reduced numTestsKeptInMemory, but it's still crashing. It only stopped crashing when I reduced numTestsKeptInMemory to 0, but that doesn't solve my problem because I need to see those results (and a month ago it was working without any reduction).
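
For anyone wondering where these settings live, a minimal cypress.config.js sketch; both options are documented Cypress settings, everything else is project-specific:

const { defineConfig } = require('cypress');

module.exports = defineConfig({
  e2e: {
    // keep no test snapshots in memory during interactive runs
    numTestsKeptInMemory: 0,
    // opt in to Cypress's experimental between-test memory cleanup
    experimentalMemoryManagement: true,
  },
});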

I can see in the Task Manager that the process crashes whenever Cypress reaches over 4GB of RAM. I know it has always consumed a lot of resources, but never that much.

Windows 10
Cypress version: 12.17.4
Chrome v116
Electron v106

@AMEntwistle

We also ran into this issue, particularly with Chrome v116. We are currently locked to Chrome version 115.0.5790.110-1, which is stable as far as we can tell with regard to this memory issue. I don't know whether this is more an issue with Chrome or with Cypress, but being unable to test on the latest browser versions is not ideal.

@testauto05

I am getting this error consistently on a Windows machine with all browsers from the command line, whether headed or not. It only works once in a while in Electron when run from the app. Please suggest a workaround.

Chrome, Edge, Electron - all latest versions.
Cypress - 12.14 and 12.17

@Lasidar (Author) commented Aug 31, 2023

As mentioned in the original ticket, we are experiencing this in Chrome 110.0.5481.96, so at least for us, it does not appear to be a case where Chrome 116+ is causing it. However, we are going to try Chrome 115 and report back.

@testauto05 commented Sep 3, 2023

Thanks for the reply. It works fine after increasing the heap size to 4GB (with experimentalMemoryManagement enabled). But more scenarios are to be added to our suite, so even 4GB may not be sufficient later. Cypress team, please find a better option. @nagash77

@VitaliLebedzeu commented Sep 4, 2023

> I am getting error consistently on Windows machine with all browsers from command line whether headed or not. It only works once in a while in Electron when run from the App. Please suggest any workaround for this.
>
> Chrome, Edge, Electron - all latest versions. Cypress - 12.14 and 12.17

The same problem here; the browser crashes with all browsers. It started in Chrome only, and switching to Electron helped solve the issue, but now it happens in all browsers.

@akondratsky commented Sep 5, 2023

Our team faced this issue too. We use Cypress 12.17.4 with Electron, and all tests run in a Docker container (built on top of the CentOS image). We tried the suggestions from the article Fixing Cypress errors part 1: chromium out of memory crashes, but none of them helped (we did not try the Chromium option).

It is worth noting that the issue appears on only one machine. The other three work fine, without any crashes at all. The same image, run in our Jenkins pipeline, also works smoothly. We've already compared our Docker configurations; they are the same. All machines except the Jenkins agent are running macOS, so it is apparent that different processors do not play a crucial role here either.

I wonder what the difference could be. If anyone has an idea of what we can check, please ping me. We will return to investigating tomorrow.

@kevingorry

@akondratsky have you found a solution? I'm running into something very similar.

@valeraine

After recent Chrome updates (114 onwards) our Cypress e2e tests keep crashing like crazy, including tests with only 3 to 4 test cases that have never had these issues before...

@akondratsky

@kevingorry unfortunately, we did not find a solution. We tried different options, like disabling GPU / rasterization and some other things, but without success.

One idea I have come to is about logging. As I understand this topic, there may be not just a single issue hidden here but several, and one of the problems is that it is impossible to figure out the reason why Electron is crashing. This is far beyond my competence, and I cannot dive deep enough into the topic because of our deadlines.

@kevingorry

@akondratsky I'm in the same boat. I run my tests on Azure runners (Ubuntu + Electron 106) on Cypress 13.1.0 with 16GB of memory, and they still crash; that is ridiculous!
I enabled everything they suggested:
numTestsKeptInMemory: 0,
experimentalMemoryManagement: true,

@nagash77 Any news on this ongoing issue?

@irbrad commented Sep 19, 2023

I, too, am in the same boat on GHA runners. 😞 We're on Chrome 114 running on Cypress 13.2 now. Maybe 50% of the runs are crashing out. I had to override the timeout from the default six hours to ten minutes because of it.

@uladzimirdev

As a temporary solution you can use an older version of Chrome on GHA.

@irbrad commented Sep 19, 2023

> as a temp solution you can use old version of chrome at GHA

How old? I just backtracked to 108 and the same thing is happening. Going to try going back to Cypress 12.

@irbrad commented Sep 19, 2023

It doesn't seem to be crashing on Chrome 109 with Cypress 12.17.4.

@uladzimirdev

> How old? I just backtracked to 108

@irbrad we stuck with version 111; it does not crash for us.

@j0sephx commented Sep 26, 2023

We're experiencing this on Chromium 112 and Cypress 13.2.0, with numTestsKeptInMemory: 0 and experimentalMemoryManagement: true set. We will try changing the version to 111 unless there is a resolution that hasn't been posted?

@jennifer-shehane (Member)

I wanted to update here that we have identified and fixed a memory leak when Test Replay is turned on. If you previously disabled Test Replay due to noticeable perf issues when it was on, please try enabling and let us know if the issues persist.

As for the memory leak in the App itself, where memory is not being cleaned up as it should be between tests in this example, we're still actively investigating the cause.

@irbrad commented Aug 16, 2024

@jennifer-shehane I'll turn off the filter I created that blocks tons of public APIs that our application uses within Cypress tests. Test Replay was definitely crashing it out on every single run before I added the blockHosts list.

@ottopaulsen

Not sure if this is any help, but I was just troubleshooting the same error, and I got it even in a single simple test that should not use much memory. I was using the { delay: 0 } option with type, and when I removed it, the error disappeared. I hit the problem when updating Cypress from 13.6.4 to 13.13.3.
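
For context, the pattern in question looks roughly like this (the selector and text are made up):

// Typing with no per-keystroke delay; removing { delay: 0 } made the
// renderer crash disappear in my case
cy.get('[data-cy="comment-box"]').type('some reasonably long text', { delay: 0 });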

@jennifer-shehane (Member)

@ottopaulsen If you could provide a way to reproduce this, it would be very helpful. 💯 There could be multiple issues involved.

@ottopaulsen

> @ottopaulsen If you could provide a way to reproduce this, it would be very helpful. 💯 There could be multiple issues involved.

Sorry, I cannot do that. And yes, there are multiple issues; I am getting the same error in other situations too. It seems like as soon as I get rid of one, another appears somewhere else. But they appear at the same place every time, so they are at least stable.

I am using Vuetify too, and I noticed that the two I ran into today were both in a modal dialog using drop-downs (v-select). I guess I will continue tomorrow, but I have to admit I am getting more and more tempted to try Playwright, as others have mentioned.

@ottopaulsen

My problems seem to be related to this: vuetifyjs/vuetify#17922

@Narretz (Contributor) commented Aug 28, 2024

Interesting. ResizeObserver was already suspected in #27415 (comment), and the fact that it's often silenced via uncaught:exception may have contributed to it not being investigated further.

It even mentions the same (starting) Chrome version, 115

@theKashey

A reminder that there is a fix for ResizeObserver in #27415 (comment). I hope it solves the problem for you as it did for me. Well, strictly speaking, it only hides the problem.
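
I have not re-checked exactly what the linked comment contains, but a commonly shared mitigation is to wrap ResizeObserver callbacks in requestAnimationFrame so notifications are delivered outside the observer loop. A sketch, to be loaded before anything else creates observers:

// Deliver ResizeObserver callbacks on the next animation frame to avoid
// "ResizeObserver loop" errors. Note: this hides the symptom rather than
// fixing the underlying layout thrash.
const NativeResizeObserver = window.ResizeObserver;

window.ResizeObserver = class extends NativeResizeObserver {
  constructor(callback) {
    super((entries, observer) => {
      window.requestAnimationFrame(() => callback(entries, observer));
    });
  }
};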

@juliusknorr commented Aug 30, 2024

I cannot provide an isolated example for this crash, but here are at least some relatively easy steps to run our Cypress tests against a local Docker container. I hope this helps further investigation:

git clone https://github.com/nextcloud/text.git
cd text
git checkout renovate/main-cypress
# run our server code in a docker container
docker run --rm -p 8080:80 -v $PWD:/var/www/html/apps/text ghcr.io/juliushaertl/nextcloud-dev-php81:latest

# run cypress tests in a separate terminal
npm ci
# Make sure that the container is running and http://localhost:8080 shows a login interface
CYPRESS_baseUrl=http://localhost:8080/index.php npx cypress open --e2e --browser electron
# Most tests crash but one example is ImageView.spec.js

The branch is from nextcloud/text#6281, which bumps Cypress from the last known working version, 13.6.4, to 13.14.1.

@balaarunreddyv1

Same for me as well; I have too many crashes on my MacBook and can't run a test fully.

@NoodlesKamiSama

I'm experiencing the same issues. The crazy thing is that when running everything on our staging/test environment I don't get these crashes; they only happen when running the whole (same) suite on the production environment. I've managed to temporarily stop the crashes with the ResizeObserver fix in #27415 (comment), but it's a temporary solution.

@jennifer-shehane (Member)

We're looking into the ResizeObserver theory and reproduction. There's a lot of deep diving into code involved, so we'll update here once we have factual takeaways.

@mscudlik

I had the same issue and tried changing the memory parameters, the ResizeObserver polyfill, and up/downgrading a bunch of related packages back and forth. In the end, I found one thing that made all the other combinations suddenly work (locally, and especially in the CI environment): removing the code-coverage collection (@cypress/code-coverage + istanbul / nyc).
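
For anyone who wants to try the same experiment: per the plugin's README, @cypress/code-coverage is registered in two places, and both need to be removed (or commented out). A sketch of the standard wiring:

// cypress.config.js - node-side task registration
const { defineConfig } = require('cypress');

module.exports = defineConfig({
  e2e: {
    setupNodeEvents(on, config) {
      // comment this out to disable coverage collection
      require('@cypress/code-coverage/task')(on, config);
      return config;
    },
  },
});

// cypress/support/e2e.js - browser-side hook, remove this import too:
// import '@cypress/code-coverage/support';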

@Titas-OpenTrack commented Oct 3, 2024

@jennifer-shehane even with Cypress v13.15.0, we still get a lot of "Out of memory" issues when "Test Replay" is turned ON... It is completely unusable at this point 😞

When the "Test Replay" is OFF, the "Out of memory" issues still occur occasionally but we have decreased it to a minimum with all of the workarounds mentioned here (splitting into smaller tests, numTestsKeptInMemory = 1, etc.).

So, clearly, the "Test Replay" functionality is somehow contributing to the issue as it happens every time when it's ON.

I have submitted more data with URLs to the exact test runs via "Report an issue" button in the "Test Replay" itself.
Please let me know if there's anything else I could do to help 🙌

@DenisHdz commented Oct 4, 2024

We are experiencing the same issue on our team. We have a frontend project that uses Vue 3.5.3 + Vuetify 3.5.16, and we are doing UI integration testing with Cypress v13.15.0.

We have self-hosted runners, and when running the tests in our CI workflow, they randomly fail, throwing the aforementioned We detected that the Chromium Renderer process just crashed. error.

We have also tried splitting the test files into smaller tests, but no luck so far :(

@AlecksJohannes

The same for us as well; this just happened today. I cannot even downgrade to an older Chrome version because of this bug: #28243.

@vmtl-adsk commented Oct 8, 2024

My Cypress version was 12.17.4, and I had several specs that used to crash the browser. The workaround was to use Electron when running those specs. I decided it was time to upgrade and tried 13.15.0, but then I had twice as many specs crashing, and Electron had the issue as well. I even tried Firefox, but instead of crashing the browser engine, it got stuck indefinitely when running those same specs. Now I'm testing 13.6.4; the same specs fail as on 12.17.4, but I can use Electron as a workaround.

I increased container resources, adjusted max_old_space_size, set numTestsKeptInMemory=0 and experimentalMemoryManagement=true, and tried different browser parameters and resolutions. One of the failing specs is just a single test, so I tend to think it could be the size of the DOM that leads to the issue.

@Titas-OpenTrack

In our case, the crashes occurred largely due to the cypress-plugin-api plugin (details of the issue here: filiphric/cypress-plugin-api#135).

I'm sure there are possible workarounds to disable it in headless mode, but since this plugin isn't crucial for us, we decided to remove it altogether, and the performance of our tests has drastically improved!

Big kudos to @jennifer-shehane for investigating & helping to get to the bottom of this! 🙌

@amitabh2811 commented Oct 14, 2024

We started getting this memory leak issue after upgrading Cypress from 12.17.4 to 13.13.0.
We were encountering memory leaks due to a forEach loop we used; this is likely because the loop was handling a large number of elements or performing complex operations repeatedly, causing memory to build up over time. Cypress tests can be sensitive to resource management, especially when dealing with DOM manipulations, network requests, or state management.

Avoid Chaining Too Many Commands Inside Loops
If you are chaining multiple commands like cy.get or cy.click inside a loop, it can cause memory to spike:

// Instead of this:

for (let i = 0; i < elements.length; i++) {
  cy.get(elements[i]).click().type('text').should('be.visible'); // Memory intensive
}

// Use flattened chain of commands:

elements.forEach((element) => {
  cy.get(element).then(($el) => {
    cy.wrap($el).click().type('text');
  });
});

Refactoring the tests that were causing this issue to use the flattened chain of commands above worked for us, and we stopped getting the memory leak crashes.

@can-angun

Hello @jennifer-shehane

Is there any update on this issue?

We are experiencing the same memory crash problem while running our tests in CI. I have tried all the solutions suggested here (and in other sources), but I am still facing the issue. The error occurs in roughly one out of every three runs.

Since our project is open-source, I can also share the action link where the error occurs.
MemoryCrashAction

@krishhna123

Hello team,

I understand the team is looking into this issue, but I am getting errors for simple steps as well, like selecting a dropdown and choosing an option from it. Fixing this issue would help me a lot. Thanks.
