
Add CI on Github Actions for macOS #543

Merged · 2 commits · Jun 3, 2024

Conversation

Andarwinux (Contributor)

No description provided.

@blapie (Collaborator) commented May 29, 2024

Do we understand what is going on with the cancelled actions? I don't remember cancelling anything, so I retriggered a run, but the re-run got cancelled again. https://github.com/shibatch/sleef/actions/runs/9217334465

@blapie (Collaborator) commented May 29, 2024

Regarding your message about the cost of macOS runners: if I understand the pricing policy correctly, since SLEEF is a public repository we have access to most runners for free, including the macOS ones.

In this case I'm happy to go ahead and merge this PR, provided all checks pass.

@Andarwinux (Contributor, Author)

> Do we understand what is going on with the cancelled actions? I don't remember cancelling anything, so I retriggered a run, but the re-run got cancelled again. https://github.com/shibatch/sleef/actions/runs/9217334465

I also encountered this in my repo. I found that the ubuntu-24.04 runner may be faulty; GitHub seems to be A/B testing it for ubuntu-latest. My workaround is to hardcode the runner to 22.04 for now.
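For illustration, a minimal sketch of that workaround (not necessarily the exact change in this PR):

```yaml
jobs:
  build:
    # Pin to a known-good image instead of ubuntu-latest,
    # which GitHub appears to be migrating to ubuntu-24.04.
    runs-on: ubuntu-22.04
```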

> Regarding your message about the cost of macOS runners: if I understand the pricing policy correctly, since SLEEF is a public repository we have access to most runners for free, including the macOS ones.
>
> In this case I'm happy to go ahead and merge this PR, provided all checks pass.

In fact GHA has a subtle hidden “quota” for public repositories, which can lead to repos being banned if you go too far. In addition, I'm concerned that this may cause some private forks to accidentally use up their limited quota.

@blapie (Collaborator) commented May 29, 2024

> In fact GHA has a subtle hidden “quota” for public repositories, which can lead to repos being banned if you go too far. In addition, I'm concerned that this may cause some private forks to accidentally use up their limited quota.

I wonder if we will ever get close to these quotas in practice, but I understand the issue with forks.
Can the workflow be triggered at a given frequency instead of at pre/post-commit, say weekly? Is it safe to let the jobs be triggered manually?

@Andarwinux (Contributor, Author)

> Can the workflow be triggered at a given frequency instead of at pre/post-commit, say weekly? Is it safe to let the jobs be triggered manually?

Sure, just use `schedule`. A manual trigger would require repo members to keep an eye on GHA, which I think could be a significant burden.
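A minimal sketch of such a trigger (the weekly cron below is illustrative, not something this PR prescribes):

```yaml
on:
  schedule:
    # Run weekly, Sundays at 00:00 UTC (illustrative cadence).
    - cron: "0 0 * * 0"
  # Optional: also allow manual runs from the Actions tab.
  workflow_dispatch:
```

Note that GitHub disables scheduled workflows in forked repositories by default and pauses them after 60 days of repository inactivity, which should also limit accidental quota use by forks.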

I think the main bottleneck in CI at the moment is QEMU-based testing, so maybe consider splitting it into separate workflows: run build CI for each commit, and run testing CI periodically.

@blapie (Collaborator) commented May 30, 2024

Thank you for your insight.

> Sure, just use `schedule`. A manual trigger would require repo members to keep an eye on GHA, which I think could be a significant burden.

Agreed, that is quite a fair price to pay. That is also probably an easier way to monitor resource usage in general, since we know almost exactly what is run and how often.

> I think the main bottleneck in CI at the moment is QEMU-based testing, so maybe consider splitting it into separate workflows: run build CI for each commit, and run testing CI periodically.

Absolutely, we might simply split native and cross jobs into different workflows. Then, use conditionals to run native build+test and cross build on each commit, and run cross tests only periodically.
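Sketched under the assumption that the cross tests sit in a job of their own (the job name below is hypothetical), such a conditional could look like:

```yaml
jobs:
  test-cross:
    # Hypothetical job name: run QEMU-based cross tests only when
    # the workflow was started by the schedule, not on every push.
    if: ${{ github.event_name == 'schedule' }}
    runs-on: ubuntu-22.04
    steps:
      - uses: actions/checkout@v4
      # ... configure, build, and run the tests under QEMU here.
```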

Please leave the `on` section of your workflows as is for now; we will adjust all workflows once your new ones are merged.

Comment on lines 93 to 94:

    #- os: "macos-14"
    #arch: "aarch64"
Collaborator:

Our internal tests pass with Apple Clang 14 on Apple Silicon (M1).
Would it be possible to test aarch64 with Apple Clang 14? I believe it can be installed on macos-13, but maybe macos-14 is still fine.

Contributor (Author):

Looks like there's a problem with the AppleClang that GHA provides by default. I passed all tests after updating it to Xcode 15.4. Should I add a new matrix for different Xcode versions?
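For reference, a hedged sketch of what such a matrix could look like; GitHub's macOS runner images ship several Xcode versions under /Applications, and xcode-select switches between them (the version list below is illustrative):

```yaml
jobs:
  build-macos:
    runs-on: macos-14
    strategy:
      matrix:
        # Illustrative entry; extend with other installed versions.
        xcode: ["15.4"]
    steps:
      - name: Select Xcode
        run: sudo xcode-select -s "/Applications/Xcode_${{ matrix.xcode }}.app"
```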

Collaborator:

Brilliant, thanks for checking. Your last version looks good! If a minor compiler update fixes the issue I would not worry too much and probably won't open a new issue.

Mark macOS x86_64 and aarch64 as Tested.

Mark i686 and aarch32 as N/A because these ABIs are deprecated.

Correct "MacOS" to "macOS".

@blapie (Collaborator) commented May 31, 2024

Just re-ran the failing workflows, as the issue with ubuntu runners seems to have been resolved.
Otherwise ready to merge the new workflow files.

@blapie (Collaborator) commented May 31, 2024

Are you happy for me to merge?

@Andarwinux (Contributor, Author)

Yes, LGTM.

@blapie (Collaborator) commented Jun 3, 2024

Thanks again for an outstanding contribution, this is very much appreciated! I'll make sure we have more control over CPU runner usage by re-organising workflows in the upcoming days.

@blapie merged commit 23c1ad7 into shibatch:master on Jun 3, 2024
43 checks passed
@blapie mentioned this pull request on Jun 3, 2024