
[misc][Long Context] feat: support ulysses for long context training #121

Workflow file for this run

name: vllm

on:
  # Trigger the workflow on push or pull request,
  # but only for the main branch
  push:
    branches:
      - main
    paths:
      - "**/*.py"
      - .github/workflows/vllm.yml
  pull_request:
    branches:
      - main
    paths:
      - "**/*.py"
      - .github/workflows/vllm.yml
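
# The vllm job runs on a self-hosted 8x L20 GPU runner inside the verl container.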
jobs:
  vllm:
    runs-on: [self-hosted, l20-0]
    env:
      HTTP_PROXY: ${{ secrets.PROXY_HTTP }}
      HTTPS_PROXY: ${{ secrets.PROXY_HTTPS }}
      NO_PROXY: "localhost,127.0.0.1"
      HF_HUB_ENABLE_HF_TRANSFER: 1
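    # Prebuilt verl image: PyTorch 2.4.0, CUDA 12.4, vLLM 0.6.3, Ray 2.10, TransformerEngine 1.7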
    container:
      image: verlai/verl:vemlp-th2.4.0-cu124-vllm0.6.3-ray2.10-te1.7-v0.0.3
      options: --gpus all --shm-size=10g
    steps:
      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
        with:
          fetch-depth: 0
      - name: Install the current repository
        run: |
          pip3 install hf_transfer
          pip3 install -e .[test]
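          # Pin vLLM to 0.5.4 for these tests; this replaces the version baked into the image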
          pip3 install vllm==0.5.4
      - name: Running vllm tests on 8 L20 GPUs
        run: |
          cd tests/rollout
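          # torchrun launches one pytest process per GPU (8 ranks total) for the distributed rollout test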
          torchrun --standalone --nnodes=1 --nproc_per_node=8 $(which pytest) -s test_vllm_hf_loader.py
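
For reference, the final step runs pytest under torchrun, so each of the 8 processes executes the test file as an independent distributed rank. A minimal sketch of what a torchrun-launched test can look like is shown below; this is a hypothetical example for illustration, not the contents of test_vllm_hf_loader.py:

import os
import torch
import torch.distributed as dist

def test_distributed_smoke():
    # torchrun sets RANK, WORLD_SIZE and LOCAL_RANK for every spawned process
    if not dist.is_initialized():
        dist.init_process_group(backend="nccl")
    local_rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(local_rank)
    # each rank contributes 1; the all-reduced sum must equal the world size (8 here)
    t = torch.ones(1, device="cuda")
    dist.all_reduce(t)
    assert t.item() == dist.get_world_size()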