Commit
Merge branch 'cherry-pick-5accb3ba' into 'core_r0.8.0'
ADLR/megatron-lm!1764 - Build and publish manylinux wheel

See merge request ADLR/megatron-lm!1905
ko3n1g committed Aug 9, 2024
2 parents a3fe0c7 + f23d0e8 commit 4cdf318
Showing 4 changed files with 47 additions and 31 deletions.
22 changes: 22 additions & 0 deletions .gitlab-ci.yml
```diff
@@ -21,13 +21,20 @@ stages:
   - build
   - unit_tests
   - functional_tests
+  - publish
 
 variables:
   JET_CUSTOM_FILTER:
     description: |
       Selects what functional tests to run. For mr tests: "type == 'build' or 'mr' in spec.scope". For nightly tests: "type == 'build' or 'nightly' in spec.scope"
     value: ""
   TIME_LIMIT: "10:00" # Default time limit for all jobs
+  PUBLISH:
+    value: "no"
+    options:
+      - "yes"
+      - "no"
+    description: Build and publish a wheel to PyPi
   SLURM_CLUSTER:
     value: "dgxa100_dracooci"
     options:
```
```diff
@@ -333,3 +340,18 @@ formatting:
 
 include:
   - jet-tests.yml
+
+publish-wheel:
+  image: quay.io/pypa/manylinux_2_28_x86_64
+  stage: publish
+  rules:
+    - if: $CI_COMMIT_BRANCH =~ /^core_r/ && $PUBLISH == "yes"
+      when: manual
+    - when: never
+  before_script:
+    - pip install twine
+  script:
+    - /opt/python/cp310-cp310/bin/python -m build
+    - /opt/python/cp311-cp311/bin/python -m build
+    - auditwheel repair dist/*.whl
+    - twine upload --repository pypi wheelhouse/*
```
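The `rules` block is what gates publishing. A rough sketch of how GitLab evaluates it (first matching rule wins; `when: manual` creates a job that waits for a human click, `when: never` suppresses the job entirely), assuming nothing beyond the two rules shown above:

```python
import re


def publish_job_created(branch: str, publish_var: str) -> bool:
    """Model the publish-wheel rules: the first matching rule wins."""
    # Rule 1: release branch AND the PUBLISH pipeline variable set to "yes"
    # -> `when: manual`, so the job is created but waits for a manual start.
    if re.match(r"^core_r", branch) and publish_var == "yes":
        return True
    # Rule 2: `when: never` -> the job is not added to the pipeline at all.
    return False
```

So the wheel job only ever appears on `core_r*` release branches, and even there someone must trigger it by hand.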
1 change: 1 addition & 0 deletions MANIFEST.in
```diff
@@ -1 +1,2 @@
 include megatron/core/requirements.txt
+include megatron/core/README.md
```
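`MANIFEST.in` controls which extra files land in the source distribution; the added line matters because this commit's `setup.py` reads `megatron/core/README.md` unconditionally, so a wheel built from an sdist would otherwise fail with `FileNotFoundError`. A hypothetical sanity check of that invariant:

```python
from pathlib import Path

# Files setup.py opens at build time; each must be listed in MANIFEST.in
# so that building from an unpacked sdist does not fail.
REQUIRED_IN_SDIST = [
    "megatron/core/requirements.txt",  # read for install_requires
    "megatron/core/README.md",         # read for long_description
]


def missing_from_sdist(root):
    """Return the required files absent under an unpacked sdist root."""
    return [p for p in REQUIRED_IN_SDIST if not (Path(root) / p).is_file()]
```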
15 changes: 14 additions & 1 deletion megatron/core/README.md
```diff
@@ -1 +1,14 @@
-Megatron Core is a library for efficient and scalable training of transformer based models.
+# Megatron-Core
+
+Megatron-Core is an open-source, PyTorch-based library of GPU-optimized techniques and cutting-edge system-level optimizations. It abstracts them into composable, modular APIs, giving developers and model researchers full flexibility to train custom transformers at scale on NVIDIA accelerated computing infrastructure. The library is compatible with all NVIDIA Tensor Core GPUs, including FP8 acceleration support for [NVIDIA Hopper architectures](https://www.nvidia.com/en-us/data-center/technologies/hopper-architecture/).
+
+Megatron-Core offers core building blocks such as attention mechanisms, transformer blocks and layers, normalization layers, and embedding techniques. Additional functionality, such as activation re-computation and distributed checkpointing, is natively built into the library. All building blocks are GPU-optimized and can be combined with advanced model parallelism techniques (tensor, sequence, pipeline, context, and MoE expert parallelism) for optimal training speed and stability on NVIDIA accelerated computing infrastructure.
+
+Megatron-Core can be used with [NVIDIA NeMo](https://www.nvidia.com/en-us/ai-data-science/products/nemo/), an enterprise-grade AI platform. Alternatively, you can explore Megatron-Core with a native PyTorch training loop [here](https://github.com/NVIDIA/Megatron-LM/tree/main/examples). Visit the [Megatron-Core documentation](https://docs.nvidia.com/megatron-core/developer-guide/latest/index.html) to learn more.
+
+## Quick links
+
+- [Benchmark using NVIDIA NeMo](https://docs.nvidia.com/nemo-framework/user-guide/latest/overview.html#performance-benchmarks)
+- [Multimodal example (LLaVA training pipeline)](https://github.com/NVIDIA/Megatron-LM/tree/main/examples/multimodal)
+- [Mixture-of-Experts](https://github.com/NVIDIA/Megatron-LM/tree/main/megatron/core/transformer/moe)
+- [Training Mamba-based Language Models](https://github.com/NVIDIA/Megatron-LM/tree/main/examples/mamba)
```
40 changes: 10 additions & 30 deletions setup.py
```diff
@@ -1,13 +1,10 @@
 """Setup for pip package."""
 
 import importlib.util
 import os
 import subprocess
 import sys
 
 import setuptools
-from setuptools import Extension, setup
-from setuptools.command.build_ext import build_ext
+from setuptools import Extension
 
 spec = importlib.util.spec_from_file_location('package_info', 'megatron/core/package_info.py')
 package_info = importlib.util.module_from_spec(spec)
```
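The `importlib.util` calls above load `package_info.py` directly by file path, so version metadata is available without importing the not-yet-installed package. A minimal, self-contained sketch of the same pattern, using a throwaway file rather than Megatron's real `package_info.py`:

```python
import importlib.util


def load_module_from_path(name, path):
    """Execute a single .py file by path and return it as a module object."""
    spec = importlib.util.spec_from_file_location(name, path)
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)  # runs the file's top-level code
    return module
```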
```diff
@@ -26,37 +23,20 @@
 __version__ = package_info.__version__
 
 
-if os.path.exists('megatron/core/README.md'):
-    with open("megatron/core/README.md", "r", encoding='utf-8') as fh:
-        long_description = fh.read()
-    long_description_content_type = "text/markdown"
-
-else:
-    long_description = 'See ' + __homepage__
-    long_description_content_type = "text/plain"
-
-
 ###############################################################################
 #                            Dependency Loading                               #
 # %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%% #
 
 
 def req_file(filename, folder="megatron/core"):
     with open(os.path.join(folder, filename), encoding='utf-8') as f:
         content = f.readlines()
     # you may also want to remove whitespace characters
     # Example: `\n` at the end of each line
     return [x.strip() for x in content]
 
 
 install_requires = req_file("requirements.txt")
 
+with open("megatron/core/README.md", "r", encoding='utf-8') as fh:
+    long_description = fh.read()
+long_description_content_type = "text/markdown"
 
 ###############################################################################
 #                             Extension Making                                #
 # %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%% #
 
-extra_compile_args = subprocess.check_output(["python3", "-m", "pybind11", "--includes"]).decode("utf-8").strip().split()
+extra_compile_args = (
+    subprocess.check_output(["python3", "-m", "pybind11", "--includes"])
+    .decode("utf-8")
+    .strip()
+    .split()
+)
 
 ###############################################################################
```
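The reformatted `extra_compile_args` expression shells out to `python3 -m pybind11 --includes`, which prints whitespace-separated `-I...` flags, and tokenizes them with plain `str.split()`. A hedged alternative sketch: `shlex.split` additionally tolerates shell-quoted paths containing spaces, which bare `.split()` would break apart (the flag format itself is assumed from pybind11's documented behavior, not from this commit):

```python
import shlex


def parse_include_flags(raw: str) -> list[str]:
    """Split compiler include flags, honoring shell-style quoting."""
    return shlex.split(raw.strip())
```

For example, `parse_include_flags("-I/usr/include/python3.10 -I/opt/pybind11/include")` yields the two flags as separate list items, matching what `.split()` produces in the common unquoted case.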
