
[BUG] yarn workspaces foreach run prefix string will mess up console, when sudo. #6667

Open

loynoir opened this issue Jan 27, 2025 · 7 comments

Labels: bug (Something isn't working), waiting for feedback (Will autoclose in a while unless more data are provided)

Comments


loynoir commented Jan 27, 2025

Self-service

  • I'd be willing to implement a fix

Describe the bug

The yarn workspaces foreach run prefix string messes up the console.

To reproduce

https://github.com/loynoir/reproduce-yarn-berry-6667

$ yarn
➤ YN0000: · Yarn 4.6.0
➤ YN0000: ┌ Resolution step
➤ YN0000: └ Completed
➤ YN0000: ┌ Fetch step
➤ YN0000: └ Completed
➤ YN0000: ┌ Link step
➤ YN0000: └ Completed
➤ YN0000: · Done in 0s 44ms
$ yarn build
[hello]: Process started
[hello]: #0 building with "default" instance using docker driver
                                                                [hello]: 
                                                                         [hello]: #1 [internal] load build definition from Dockerfile
                                                                                                                                     [hello]: #1 transferring dockerfile:
                                 [hello]: #1 transferring dockerfile: 90B done
                                                                              [hello]: #1 DONE 0.6s
                                                                                                   [hello]: 
                                                                                                            [hello]: #2 [internal] load metadata for mcr.microsoft.com/devcontainers/base:alpine-3.20
                                                             [hello]: #2 DONE 1.3s
                                                                                  [hello]: 
                                                                                           [hello]: #3 [internal] load .dockerignore
                                                                                                                                    [hello]: #3 transferring context:
                             [hello]: #3 transferring context: 2B done
                                                                      [hello]: #3 DONE 0.5s
                                                                                           [hello]: 
                                                                                                    [hello]: #4 [1/1] FROM mcr.microsoft.com/devcontainers/base:alpine-3.20@sha256:5212d8ed44c89bfadad14a03104ef75b09c5de8642a58721c271f2e9155f5023
                                                                                                           [hello]: #4 CACHED
                                                                                                                             [hello]: 
                                                                                                                                      [hello]: #5 exporting to image
                            [hello]: #5 exporting layers
                                                        [hello]: #5 exporting layers done
                                                                                         [hello]: #5 writing image sha256:6609a4e82f8b1d391aec77dd37288540ce9437dfbdcb28bcf8a9a6bf9d89d493
                                                  [hello]: #5 writing image sha256:6609a4e82f8b1d391aec77dd37288540ce9437dfbdcb28bcf8a9a6bf9d89d493 0.3s done
                     [hello]: #5 DONE 0.6s
                                          [hello]: Process exited (exit code 0), completed in 6s 420ms
Done in 6s 423ms
$ 

Environment

node v23.4.0
yarn 4.6.0
npm 11.0.0

Additional context

No response

loynoir added the bug label on Jan 27, 2025
clemyan (Member) commented Jan 29, 2025

I am unable to reproduce. It looks like your terminal is not correctly processing newline characters, in which case there is nothing we can do on our end.

clemyan added the waiting for feedback label on Jan 29, 2025
loynoir (Author) commented Jan 29, 2025

Not sure what happened; I had never met this situation within vscode terminal + zsh + grml-zsh-config before.

I believe nothing is wrong with this simple combination; I may try other combinations later.

The current workaround is very simple.

Maybe your terminal is using something similar?

The bug combination is yarn + "sudo docker build ."

The workaround combination is yarn + deno_workaround_wrapper.ts

async function cp_output_workaround(cp: Deno.ChildProcess) {
  // Drain the child's piped stdout and stderr concurrently, forwarding each
  // chunk to this process's own stdout/stderr. Awaiting every write keeps the
  // chunks ordered instead of firing the writes without waiting for them.
  await Promise.all([
    (async () => {
      for await (const el of cp.stdout) {
        await Deno.stdout.write(el)
      }
    })(),
    (async () => {
      for await (const el of cp.stderr) {
        await Deno.stderr.write(el)
      }
    })(),
  ])
}
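
For reference, this is roughly how the wrapper is invoked. The exact deno_workaround_wrapper.ts in the reproduction repo may differ; treat the command and arguments below as an illustrative assumption, not the repo's actual code.

// Hypothetical caller: spawn the sudo'd build with piped stdio and forward
// its output through cp_output_workaround instead of inheriting the TTY.
const cp = new Deno.Command("sudo", {
  args: ["docker", "build", "."],
  stdout: "piped",
  stderr: "piped",
}).spawn()

await cp_output_workaround(cp)
const { code } = await cp.status
Deno.exit(code)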

loynoir (Author) commented Jan 29, 2025

Confirmed that I am able to always reproduce using:

  • vscode devcontainer + latest archlinux + vscode terminal + zsh

  • wezterm + zsh + docker + latest archlinux + bash

  • alacritty + bash + docker + latest archlinux + bash

$ img=public.ecr.aws/docker/library/archlinux:latest@sha256:8b0a7d7e22c2e78539406a37ba2a1eb431d31981be3d76e076f517d3b62204d0
$ docker run --rm -it -v /var/run/docker.sock:/var/run/docker.sock -v "$PWD":/reproduce -w /reproduce "$img"
[root@43ba36db2fc0 reproduce]# pacman -Sy --disable-sandbox && pacman --noconfirm -S sudo nodejs docker docker-buildx && corepack enable && yarn
[root@43ba36db2fc0 reproduce]# yarn build
[hello]: Process started
[hello]: #0 building with "default" instance using docker driver
                                                                [hello]:
                                                                         [hello]: #1 [internal] load build definition from Dockerfile
                                                                                                                                     [hello]: #1 transferring dockerfile:
           [hello]: #1 transferring dockerfile: 90B done
                                                        [hello]: #1 DONE 0.7s
                                                                             [hello]:
                                                                                      [hello]: #2 [internal] load metadata for mcr.microsoft.com/devcontainers/base:alpine-3.20
                 [hello]: #2 DONE 0.0s
                                      [hello]:
                                               [hello]: #3 [internal] load .dockerignore
                                                                                        [hello]: #3 transferring context:
                                                                                                                         [hello]: #3 transferring context: 2B done
    [hello]: #3 DONE 0.8s
                         [hello]:
                                  [hello]: #4 [1/1] FROM mcr.microsoft.com/devcontainers/base:alpine-3.20
                                                                                                         [hello]: #4 CACHED
                                                                                                                           [hello]:
                                                                                                                                    [hello]: #5 exporting to image
    [hello]: #5 exporting layers done
                                     [hello]: #5 writing image sha256:6609a4e82f8b1d391aec77dd37288540ce9437dfbdcb28bcf8a9a6bf9d89d493 0.0s done
                                                                                                                                                [hello]: #5 DONE 0.2s
       [hello]: Process exited (exit code 0), completed in 4s 677ms
Done in 4s 680ms

loynoir (Author) commented Jan 29, 2025

Also able to reproduce when using deno 2.1.6 to call yarn berry 4.6.0.

loynoir (Author) commented Jan 29, 2025

Summary

  • When running docker build directly, the console is OK

  • When running yarn workspaces foreach run + a non-docker build, the console is OK

  • When running yarn workspaces foreach run + docker build, the console is messed up (BUG), whether the runtime is nodejs or deno

Confirmed able to always reproduce using these combos:

  • vscode devcontainer + latest archlinux + vscode terminal + zsh

  • wezterm + zsh + docker + latest archlinux + bash

  • alacritty + bash + docker + latest archlinux + bash

Workaround

  • One exists; it is just a few lines of code, see above.

loynoir changed the title from "[Bug?]: yarn workspaces foreach run prefix string mess up console" to "[BUG] yarn workspaces foreach run prefix string will mess up console, when sudo." on Jan 31, 2025
loynoir (Author) commented Jan 31, 2025

@clemyan

Figured out that the root cause is sudo.

The reproduction repo has been updated.

https://github.com/loynoir/reproduce-yarn-berry-6667

loynoir (Author) commented Feb 1, 2025

summary

As of now, this has not been confirmed as a bug by nodejs or yarn.

Below is a summary based on my current understanding of this problem.

I think

  • This is a nodejs + deno race condition bug when running under sudo

  • This may also be called a yarn berry bug, depending on individual preference.

brief

https://github.com/loynoir/reproduce-yarn-berry-6667

  "scripts": {
    "reproduce": "yarn workspaces foreach --all --exclude . run reproduce",
    "workaround": "yarn workspaces foreach --all --exclude . run workaround",
    "simulate": "yarn workspaces foreach --all --exclude . run simulate"
  }
  "scripts": {
    "reproduce": "sudo node ./simulate.mjs",
    "workaround": "deno run -A ./workaround.ts",
    "simulate": "node ./simulate.mjs"
  }
script                                  console
yarn reproduce                          mess up
yarn workaround                         OK
yarn simulate                           OK
DISABLE_WORKAROUND=1 yarn workaround    mess up

node

Because

  • the yarn simulate console output is OK

Which means

  • yarn berry logic is fine.

But

  • the yarn reproduce console output is messed up.

  • the yarn workaround console output is OK

Which means

  • There is a way (I have not yet had time to extract the logic out of yarn berry) that makes the sudo XXX console output not isomorphic to the XXX console output.

  • There is also a way to make the sudo XXX console output isomorphic to the XXX console output.

  • So the sudo XXX console output is not always isomorphic to the XXX console output.

I think

  • This is a nodejs race condition bug when running under sudo (see the sketch below for the kind of pipeline involved).
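
To make the shape of the problem concrete, here is a minimal sketch of the kind of prefixing pipeline involved: spawn a child with piped stdio, prepend a workspace prefix to each line, and write it to our own stdout. This is only an illustration under my assumptions; it is neither the actual yarn berry code nor the exact contents of simulate.mjs in the reproduction repo, and the file name and command are hypothetical.

// prefix.mjs (hypothetical): forward a child's output with a "[hello]: " prefix.
// Run as `node prefix.mjs` vs `sudo node prefix.mjs` to compare the output.
import { spawn } from "node:child_process"

const child = spawn("docker", ["build", "."], { stdio: ["inherit", "pipe", "pipe"] })

function forward(stream, out) {
  let pending = ""
  stream.on("data", (chunk) => {
    pending += chunk.toString()
    const lines = pending.split("\n")
    pending = lines.pop()                     // keep any trailing partial line
    for (const line of lines) out.write(`[hello]: ${line}\n`)
  })
  stream.on("end", () => {
    if (pending) out.write(`[hello]: ${pending}\n`)
  })
}

forward(child.stdout, process.stdout)
forward(child.stderr, process.stderr)
child.on("exit", (code) => process.exit(code ?? 1))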

yarn

Because

  • the yarn reproduce console output is messed up.

  • the yarn workaround console output is OK

  • the yarn simulate console output is OK

I think

  • This may also be called a yarn berry bug, depending on individual preference.

deno

Because

  • the yarn workaround console output is OK

  • the DISABLE_WORKAROUND=1 yarn workaround console output is messed up

I think

  • This is a deno race condition bug when running under sudo without the async-iterator forwarding (see the sketch below).
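
For context, here is a minimal sketch of the two paths I mean. It relies on my own assumption about what workaround.ts does when DISABLE_WORKAROUND=1 (I have not confirmed that the repo pipes the streams directly for the non-workaround path), and cp_output_workaround is the helper shown in my earlier comment.

// Hypothetical workaround.ts logic: same child, two forwarding strategies.
const cp = new Deno.Command("sudo", {
  args: ["docker", "build", "."],
  stdout: "piped",
  stderr: "piped",
}).spawn()

if (Deno.env.get("DISABLE_WORKAROUND") === "1") {
  // Non-workaround path (assumed): pipe the streams straight through.
  await Promise.all([
    cp.stdout.pipeTo(Deno.stdout.writable, { preventClose: true }),
    cp.stderr.pipeTo(Deno.stderr.writable, { preventClose: true }),
  ])
} else {
  // Workaround path: drain via async iterators, awaiting every write.
  await cp_output_workaround(cp)
}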
