Update libcoral for Python 3.11 and modern versions of Tensorflow #36

Open: feranick wants to merge 43 commits into base: master

Commits (43)
057f505
Synced with latest commit of libedgetpu
feranick Feb 29, 2024
767c4bc
Remove outdated deb package for python 2 in docker.
feranick Feb 29, 2024
a4e6408
Update version of bazel to 6.1.0 in docker, needed for TF2.15.0
feranick Feb 29, 2024
a858b8d
Update WORKSPACE to allow compilation of current version of TF
feranick Feb 29, 2024
28226de
Fix FTFS due to change in Eigen3 API
feranick Feb 29, 2024
d88f35e
Replace signature_def_names with signature_keys
feranick Feb 29, 2024
5797c5e
Fix ambiguous call leading to FTFS, by overriding TfLiteFloatArrayCopy
feranick Feb 29, 2024
3a45b84
Update Dockerfile.windows
feranick Feb 29, 2024
f36e4f5
Updated README.md
feranick Feb 29, 2024
accbdbc
Fix additional ambiguous call leading to FTFS, by overriding TfLiteFl…
feranick Feb 29, 2024
1d5222d
Fix additional FTFS due to change in Eigen3 API
feranick Feb 29, 2024
6f46ccf
Use std=c++17 not c++14
feranick Mar 1, 2024
743e804
Fix position of comment on TF version in WORKSPACE
feranick Mar 1, 2024
aae943b
Updated rules-python to v0.31.0 and simplification of WORKSPACE
feranick Mar 1, 2024
5d59e6f
Temporarily use fork of libedgetpu with initial support for TF2.16.
feranick Mar 1, 2024
9638480
Sync with latest feranick/libedgetpu
feranick Mar 1, 2024
a676b35
Add support for TF 2.16.0-rc0
feranick Mar 1, 2024
5d42cc8
Sync with latest libedgetpu commit
feranick Mar 1, 2024
87812af
Revert commit aae943b5eca
feranick Mar 1, 2024
437af95
Sync with libedgetpu
feranick Mar 1, 2024
1c87cc1
Use more modern version of debian in docker.mk
feranick Mar 3, 2024
f778cb8
Update distributions in build script.
feranick Mar 3, 2024
cd9d98f
Sync with latest libedgetpu
feranick Mar 4, 2024
e5331a7
Build against TF2.17.0-dev to resolve visibility issue in SCHEMA
feranick Mar 5, 2024
139442b
Add arm-specific compiler flags to fix compilation due to undefined…
feranick Mar 5, 2024
7c32c20
Revert previous commit for aarch64
feranick Mar 5, 2024
98a9f2b
Revert commit 139442b also for armhf
feranick Mar 5, 2024
d92c702
Use bazel 6.5.0 for TF 2.17.0
feranick Mar 6, 2024
08d1c66
Sync with libedgetpu
feranick Mar 6, 2024
e49f8f1
Reinstate revised version of commit 139442bf1
feranick Mar 6, 2024
bc7ee28
Resync with libedgetpu
feranick Mar 8, 2024
16f6bfa
Update stable TF version required to 2.16.1
feranick Mar 8, 2024
45e73e4
Update file for dh-python, needed to build debian packages
Apr 18, 2024
7b421d0
Sync with libedgetpu for TF 2.17.0-rc0
feranick Jun 21, 2024
501030f
Updated dockerfile.windows with latest dependencies
feranick Jun 21, 2024
3bd6aa0
Added support for tensorflow 2.17.0-rc1
feranick Jul 7, 2024
576612c
Add support for TF 2.17.0 stable
feranick Jul 12, 2024
4e096ec
Added support for Ubuntu:24.04
feranick Aug 8, 2024
139a39c
Change su user to root to allow compilation in Ubuntu 24.04
feranick Aug 9, 2024
b74ede2
Pass correct TF_PYTHON_VERSION during build.
feranick Aug 10, 2024
d7054de
Updated third party libraries in Docker.windows
feranick Aug 11, 2024
f5645e0
Merge pull request #1 from tranzmatt/master
feranick Oct 29, 2024
54ea594
Added support for TF 2.17.1
feranick Oct 29, 2024
3 changes: 2 additions & 1 deletion .bazelrc
@@ -7,8 +7,9 @@ build --enable_platform_specific_config

build:linux --crosstool_top=@crosstool//:toolchains
build:linux --compiler=gcc
build:linux --cxxopt=-std=c++17

build:macos --cxxopt=-std=c++14
build:macos --cxxopt=-std=c++17

build:windows --incompatible_restrict_string_escapes=false
build:windows --cxxopt=/std:c++latest
2 changes: 1 addition & 1 deletion .gitmodules
@@ -1,6 +1,6 @@
[submodule "libedgetpu"]
path = libedgetpu
url = https://github.com/google-coral/libedgetpu
url = https://github.com/feranick/libedgetpu
[submodule "test_data"]
path = test_data
url = https://github.com/google-coral/test_data
2 changes: 1 addition & 1 deletion Makefile
@@ -43,7 +43,7 @@ BAZEL_BUILD_FLAGS := --compilation_mode=$(COMPILATION_MODE) \
ifeq ($(CPU),aarch64)
BAZEL_BUILD_FLAGS += --copt=-ffp-contract=off
else ifeq ($(CPU),armv7a)
BAZEL_BUILD_FLAGS += --copt=-ffp-contract=off
BAZEL_BUILD_FLAGS += --copt=-ffp-contract=off --copt=-mfp16-format=ieee
endif

# $(1): pattern, $(2) destination directory
6 changes: 3 additions & 3 deletions README.md
@@ -76,7 +76,7 @@ You need to install the following software:
1. Bazel for macOS from https://github.com/bazelbuild/bazel/releases
1. MacPorts from https://www.macports.org/install.php
1. Ports of `python` interpreter and `numpy` library: `sudo port install
python35 python36 python37 py35-numpy py36-numpy py37-numpy`
python38 python39 python310 python311 py38-numpy py39-numpy py310-numpy py311-numpy`
1. Port of `libusb` library: `sudo port install libusb`

Right after that all normal `make` commands should work as usual. You can run
@@ -88,6 +88,6 @@ Docker allows to avoid complicated environment setup and build binaries for
Linux on other operating systems without complicated setup, e.g.,

```
make DOCKER_IMAGE=debian:buster DOCKER_CPUS="k8 armv7a aarch64" DOCKER_TARGETS=tests docker-build
make DOCKER_IMAGE=ubuntu:18.04 DOCKER_CPUS="k8 armv7a aarch64" DOCKER_TARGETS=tests docker-build
make DOCKER_IMAGE=debian:bookworm DOCKER_CPUS="k8 armv7a aarch64" DOCKER_TARGETS=tests docker-build
make DOCKER_IMAGE=ubuntu:22.04 DOCKER_CPUS="k8 armv7a aarch64" DOCKER_TARGETS=tests docker-build
```
88 changes: 84 additions & 4 deletions WORKSPACE
@@ -21,6 +21,86 @@ local_repository(
path = "libedgetpu",
)

# ==================================================================

# Add definition of tensorflow version 2.17.1 stable.
http_archive(
name = "org_tensorflow",
urls = [
"https://github.com/tensorflow/tensorflow/archive/3c92ac03cab816044f7b18a86eb86aa01a294d95.tar.gz",
],
sha256 = "317dd95c4830a408b14f3e802698eb68d70d81c7c7cfcd3d28b0ba023fe84a68",
strip_prefix = "tensorflow-" + "3c92ac03cab816044f7b18a86eb86aa01a294d95",
)

http_archive(
name = "bazel_skylib",
sha256 = "74d544d96f4a5bb630d465ca8bbcfe231e3594e5aae57e1edbf17a6eb3ca2506",
urls = [
"https://storage.googleapis.com/mirror.tensorflow.org/github.com/bazelbuild/bazel-skylib/releases/download/1.3.0/bazel-skylib-1.3.0.tar.gz",
"https://github.com/bazelbuild/bazel-skylib/releases/download/1.3.0/bazel-skylib-1.3.0.tar.gz",
],
)

http_archive(
name = "rules_python",
sha256 = "9d04041ac92a0985e344235f5d946f71ac543f1b1565f2cdbc9a2aaee8adf55b",
strip_prefix = "rules_python-0.26.0",
url = "https://github.com/bazelbuild/rules_python/releases/download/0.26.0/rules_python-0.26.0.tar.gz",
)

load("@rules_python//python:repositories.bzl", "py_repositories")

py_repositories()

load("@rules_python//python:repositories.bzl", "python_register_toolchains")
load(
"@org_tensorflow//tensorflow/tools/toolchains/python:python_repo.bzl",
"python_repository",
)

python_repository(name = "python_version_repo")

load("@python_version_repo//:py_version.bzl", "HERMETIC_PYTHON_VERSION")

python_register_toolchains(
name = "python",
ignore_root_user_error = True,
python_version = HERMETIC_PYTHON_VERSION,
)

load("@python//:defs.bzl", "interpreter")
load("@rules_python//python:pip.bzl", "package_annotation", "pip_parse")

NUMPY_ANNOTATIONS = {
"numpy": package_annotation(
additive_build_content = """\
filegroup(
name = "includes",
srcs = glob(["site-packages/numpy/core/include/**/*.h"]),
)
cc_library(
name = "numpy_headers",
hdrs = [":includes"],
strip_include_prefix="site-packages/numpy/core/include/",
)
""",
),
}

pip_parse(
name = "pypi",
annotations = NUMPY_ANNOTATIONS,
python_interpreter_target = interpreter,
requirements = "@org_tensorflow//:requirements_lock_" + HERMETIC_PYTHON_VERSION.replace(".", "_") + ".txt",
)

load("@pypi//:requirements.bzl", "install_deps")

install_deps()

# ==================================================================

load("@libedgetpu//:workspace.bzl", "libedgetpu_dependencies")
libedgetpu_dependencies()

@@ -37,7 +117,7 @@ load("@org_tensorflow//tensorflow:workspace0.bzl", "tf_workspace0")
tf_workspace0()

load("@coral_crosstool//:configure.bzl", "cc_crosstool")
cc_crosstool(name = "crosstool", cpp_version = "c++14")
cc_crosstool(name = "crosstool", cpp_version = "c++17")

# External Dependencies
http_archive(
@@ -57,10 +137,10 @@ glog_library(with_gflags=0)

http_archive(
name = "com_github_google_benchmark",
sha256 = "6e40ccab16a91a7beff4b5b640b84846867e125ebce6ac0fe3a70c5bae39675f",
strip_prefix = "benchmark-16703ff83c1ae6d53e5155df3bb3ab0bc96083be",
sha256 = "8e7b955f04bc6984e4f14074d0d191474f76a6c8e849e04a9dced49bc975f2d4",
strip_prefix = "benchmark-344117638c8ff7e239044fd0fa7085839fc03021",
urls = [
"https://github.com/google/benchmark/archive/16703ff83c1ae6d53e5155df3bb3ab0bc96083be.tar.gz"
"https://github.com/google/benchmark/archive/344117638c8ff7e239044fd0fa7085839fc03021.tar.gz"
],
)

8 changes: 4 additions & 4 deletions coral/detection/adapter.cc
@@ -78,11 +78,11 @@ std::vector<Object> GetDetectionResults(const tflite::Interpreter& interpreter,
// If a model has signature, we use the signature output tensor names to parse
// the results. Otherwise, we parse the results based on some assumption of
// the output tensor order and size.
if (!interpreter.signature_def_names().empty()) {
CHECK_EQ(interpreter.signature_def_names().size(), 1);
VLOG(1) << "Signature name: " << *interpreter.signature_def_names()[0];
if (!interpreter.signature_keys().empty()) {
CHECK_EQ(interpreter.signature_keys().size(), 1);
VLOG(1) << "Signature name: " << *interpreter.signature_keys()[0];
const auto& signature_output_map = interpreter.signature_outputs(
interpreter.signature_def_names()[0]->c_str());
interpreter.signature_keys()[0]->c_str());
CHECK_EQ(signature_output_map.size(), 4);
count = TensorData<float>(
*interpreter.tensor(signature_output_map.at("output_0")));
2 changes: 1 addition & 1 deletion coral/learn/backprop/layers.cc
@@ -40,7 +40,7 @@ MatrixXf CrossEntropyGradient(const MatrixXf& c, const MatrixXf& p) {
MatrixXf FullyConnected(const MatrixXf& mat_x, const MatrixXf& mat_w,
const MatrixXf& mat_b) {
MatrixXf mat_y = mat_x * mat_w;
mat_y.array().rowwise() += mat_b.array()(0, Eigen::all);
mat_y.array().rowwise() += mat_b.array()(0, Eigen::indexing::all);
return mat_y;
}

2 changes: 1 addition & 1 deletion coral/learn/backprop/softmax_regression_model.cc
@@ -129,7 +129,7 @@ void SoftmaxRegressionModel::Train(const TrainingData& data,
const auto& batch_indices =
GetBatchIndices(data.training_data, train_config.batch_size);
MatrixXf train_batch, labels_batch;
train_batch = data.training_data(batch_indices, Eigen::all);
train_batch = data.training_data(batch_indices, Eigen::indexing::all);

// Create one-hot label vectors
labels_batch = MatrixXf::Zero(train_config.batch_size, num_classes_);
10 changes: 5 additions & 5 deletions coral/learn/backprop/test_utils.cc
@@ -41,7 +41,7 @@ TrainingData ShuffleAndSplitData(const MatrixXf& data_matrix,
std::mt19937(rd()));
MatrixXf shuffled_data =
MatrixXf::Zero(data_matrix.rows(), data_matrix.cols());
shuffled_data = data_matrix(shuffled_indices, Eigen::all);
shuffled_data = data_matrix(shuffled_indices, Eigen::indexing::all);
std::vector<int> shuffled_labels(total_rows, -1);
for (int i = 0; i < total_rows; ++i) {
shuffled_labels[i] = labels_vector[shuffled_indices[i]];
@@ -50,9 +50,9 @@ TrainingData ShuffleAndSplitData(const MatrixXf& data_matrix,
// Eigen::seq boundaries are inclusive on both sides.
TrainingData fake_data;
fake_data.training_data =
shuffled_data(Eigen::seq(0, num_train - 1), Eigen::all);
shuffled_data(Eigen::seq(0, num_train - 1), Eigen::indexing::all);
fake_data.validation_data =
shuffled_data(Eigen::seq(num_train, Eigen::last), Eigen::all);
shuffled_data(Eigen::seq(num_train, Eigen::placeholders::last), Eigen::indexing::all);

fake_data.training_labels.assign(shuffled_labels.begin(),
shuffled_labels.begin() + num_train);
@@ -105,7 +105,7 @@ TrainingData GenerateMvnRandomData(const std::vector<int>& class_sizes,
MultiVariateNormalDistribution dist(means[i], cov_mats[i]);
MatrixXf samples = dist.Sample(n);
// Eigen::seq boundaries are inclusive on both sides.
data_matrix(Eigen::seq(start_index, start_index + n - 1), Eigen::all) =
data_matrix(Eigen::seq(start_index, start_index + n - 1), Eigen::indexing::all) =
samples.transpose();
labels_vector.insert(labels_vector.end(), n, i);
start_index += n;
@@ -127,7 +127,7 @@ TrainingData GenerateUniformRandomData(const std::vector<int>& class_sizes,
int n = class_sizes[i];
MatrixXf samples = MatrixXf::Random(total_cols, n);
// Eigen::seq boundaries are inclusive on both sides.
data_matrix(Eigen::seq(start_index, start_index + n - 1), Eigen::all) =
data_matrix(Eigen::seq(start_index, start_index + n - 1), Eigen::indexing::all) =
samples.transpose();
labels_vector.insert(labels_vector.end(), n, i);
start_index += n;
2 changes: 1 addition & 1 deletion coral/pipeline/internal/segment_runner.cc
@@ -82,7 +82,7 @@ absl::Status SegmentRunner::SetExternalTensorBuffer(const char* buffer,
// its memory here.
auto* quant_params_clone = reinterpret_cast<TfLiteAffineQuantization*>(
malloc(sizeof(TfLiteAffineQuantization)));
quant_params_clone->scale = TfLiteFloatArrayCopy(quant_params->scale);
quant_params_clone->scale = coral::internal::TfLiteFloatArrayCopy(quant_params->scale);
CHECK(quant_params_clone->scale);
quant_params_clone->zero_point =
TfLiteIntArrayCopy(quant_params->zero_point);
2 changes: 1 addition & 1 deletion coral/tflite_utils.cc
@@ -46,7 +46,7 @@ TfLiteAffineQuantization* TfLiteAffineQuantizationCopy(
auto* copy = static_cast<TfLiteAffineQuantization*>(
malloc(sizeof(TfLiteAffineQuantization)));
CHECK(copy);
copy->scale = TfLiteFloatArrayCopy(src->scale);
copy->scale = coral::TfLiteFloatArrayCopy(src->scale);
copy->zero_point = TfLiteIntArrayCopy(src->zero_point);
copy->quantized_dimension = src->quantized_dimension;
return copy;
22 changes: 16 additions & 6 deletions docker/Dockerfile
@@ -15,7 +15,7 @@ RUN apt-get update \
&& DEBIAN_FRONTEND=noninteractive apt-get install -y \
sudo \
debhelper \
python \
dh-python \
python3-all \
python3-numpy \
python3-setuptools \
@@ -52,10 +52,20 @@ RUN if grep 'Bionic Beaver' /etc/os-release > /dev/null; then \

# On older Ubuntu these packages can't be installed in a multi-arch fashion.
# Instead we download the debs and extract them for build time linking.
RUN mkdir /debs && chmod a=rwx /debs && cd /debs && apt-get update && apt-get download \
libglib2.0-0 \
libglib2.0-0:armhf \
libglib2.0-0:arm64 \

RUN if grep 'Noble Numbat' /etc/os-release > /dev/null; then \
mkdir /debs && chmod a=rwx /debs && cd /debs && apt-get update && apt-get download \
libglib2.0-0t64 \
libglib2.0-0t64:armhf \
libglib2.0-0t64:arm64; \
else \
mkdir /debs && chmod a=rwx /debs && cd /debs && apt-get update && apt-get download \
libglib2.0-0 \
libglib2.0-0:armhf \
libglib2.0-0:arm64; \
fi

RUN cd /debs && apt-get update && apt-get download --ignore-missing \
libglib2.0-dev \
libglib2.0-dev:armhf \
libglib2.0-dev:arm64 \
@@ -78,7 +88,7 @@ RUN git clone https://github.com/raspberrypi/tools.git && \
cd tools && \
git reset --hard 4a335520900ce55e251ac4f420f52bf0b2ab6b1f

ARG BAZEL_VERSION=4.0.0
ARG BAZEL_VERSION=6.5.0
RUN wget -O /bazel https://github.com/bazelbuild/bazel/releases/download/${BAZEL_VERSION}/bazel-${BAZEL_VERSION}-installer-linux-x86_64.sh && \
bash /bazel && \
rm -f /bazel
83 changes: 83 additions & 0 deletions docker/Dockerfile.orig
@@ -0,0 +1,83 @@
ARG IMAGE
FROM ${IMAGE}

COPY update_sources.sh /
RUN /update_sources.sh

RUN dpkg --add-architecture armhf
RUN dpkg --add-architecture arm64
RUN echo 'APT::Immediate-Configure false;' >> /etc/apt/apt.conf

RUN apt-get update \
&& DEBIAN_FRONTEND=noninteractive apt-get install -y \
libc6-dev:arm64 \
libc6-dev:armhf \
&& DEBIAN_FRONTEND=noninteractive apt-get install -y \
sudo \
debhelper \
python3-all \
python3-numpy \
python3-setuptools \
python3-six \
python3-wheel \
libpython3-dev \
libpython3-dev:armhf \
libpython3-dev:arm64 \
build-essential \
crossbuild-essential-armhf \
crossbuild-essential-arm64 \
libusb-1.0-0-dev \
libusb-1.0-0-dev:arm64 \
libusb-1.0-0-dev:armhf \
zlib1g-dev \
zlib1g-dev:armhf \
zlib1g-dev:arm64 \
pkg-config \
p7zip-full \
zip \
unzip \
curl \
wget \
git \
vim \
mc \
software-properties-common

# Bionic Beaver == Ubuntu 18.04
RUN if grep 'Bionic Beaver' /etc/os-release > /dev/null; then \
add-apt-repository ppa:ubuntu-toolchain-r/test \
&& DEBIAN_FRONTEND=noninteractive apt-get install -y gcc-9 g++-9; \
fi

# On older Ubuntu these packages can't be installed in a multi-arch fashion.
# Instead we download the debs and extract them for build time linking.
RUN mkdir /debs && chmod a=rwx /debs && cd /debs && apt-get update && apt-get download \
libglib2.0-0 \
libglib2.0-0:armhf \
libglib2.0-0:arm64 \
libglib2.0-dev \
libglib2.0-dev:armhf \
libglib2.0-dev:arm64 \
libgstreamer1.0-0 \
libgstreamer1.0-0:armhf \
libgstreamer1.0-0:arm64 \
libgstreamer1.0-dev \
libgstreamer1.0-dev:armhf \
libgstreamer1.0-dev:arm64 \
libgstreamer-plugins-base1.0-0 \
libgstreamer-plugins-base1.0-0:armhf \
libgstreamer-plugins-base1.0-0:arm64 \
libgstreamer-plugins-base1.0-dev \
libgstreamer-plugins-base1.0-dev:armhf \
libgstreamer-plugins-base1.0-dev:arm64

RUN for d in /debs/*.deb; do dpkg -x $d /usr/system_libs; done

RUN git clone https://github.com/raspberrypi/tools.git && \
cd tools && \
git reset --hard 4a335520900ce55e251ac4f420f52bf0b2ab6b1f

ARG BAZEL_VERSION=6.5.0
RUN wget -O /bazel https://github.com/bazelbuild/bazel/releases/download/${BAZEL_VERSION}/bazel-${BAZEL_VERSION}-installer-linux-x86_64.sh && \
bash /bazel && \
rm -f /bazel
22 changes: 15 additions & 7 deletions docker/Dockerfile.windows
@@ -1,7 +1,7 @@
FROM mcr.microsoft.com/windows/servercore:1903
SHELL ["powershell", "-command"]

ARG BAZEL_VERSION=4.0.0
ARG BAZEL_VERSION=6.5.0
# Install Bazel
ADD https://github.com/bazelbuild/bazel/releases/download/${BAZEL_VERSION}/bazel-${BAZEL_VERSION}-windows-x86_64.exe c:/windows/system32/bazel.exe

@@ -21,7 +21,7 @@ RUN setx /M PATH $($Env:PATH + ';C:\Program Files\7-Zip')
ARG MSYS_BASE_REPO=http://repo.msys2.org/distrib/x86_64

# Install msys2
ARG MSYS_VERSION=20200602
ARG MSYS_VERSION=20240727
ADD ${MSYS_BASE_REPO}/msys2-base-x86_64-${MSYS_VERSION}.tar.xz c:/windows/temp
RUN 7z.exe x c:\windows\temp\msys2-base-x86_64-$env:MSYS_VERSION.tar.xz
RUN 7z.exe x c:\msys2-base-x86_64-$env:MSYS_VERSION.tar -o"c:\\"
@@ -30,22 +30,24 @@ RUN setx /M PATH $($Env:PATH + ';C:\msys64\usr\bin')
ARG MSYS_PACKAGE_REPO=https://repo.msys2.org/msys/x86_64

# Install patch
ARG PATCH_VERSION=2.7.6-1
ARG PATCH_VERSION=2.7.6-2
ADD ${MSYS_PACKAGE_REPO}/patch-${PATCH_VERSION}-x86_64.pkg.tar.xz c:/windows/temp
RUN 7z.exe x c:\windows\temp\patch-$env:PATCH_VERSION-x86_64.pkg.tar.xz
RUN 7z.exe x c:\patch-$env:PATCH_VERSION-x86_64.pkg.tar -o"c:\\msys64"

# Install vim (for xxd)
ARG VIM_VERSION=8.2.0592-1
ARG VIM_VERSION=9.1.0643-1
ADD ${MSYS_PACKAGE_REPO}/vim-${VIM_VERSION}-x86_64.pkg.tar.xz c:/windows/temp
RUN 7z.exe x -y c:\windows\temp\vim-$env:VIM_VERSION-x86_64.pkg.tar.xz
RUN 7z.exe x -y c:\windows\temp\vim-$env:VIM_VERSION-x86_64.pkg.tar.zst
RUN 7z.exe x -y c:\vim-$env:VIM_VERSION-x86_64.pkg.tar -o"c:\\msys64"

RUN choco install -m -y python3 --version=3.5.4
RUN choco install -m -y python3 --version=3.6.8
RUN choco install -m -y python3 --version=3.7.9
RUN choco install -m -y python3 --version=3.8.10
RUN choco install -m -y python3 --version=3.9.5
RUN choco install -m -y python3 --version=3.9.13
RUN choco install -m -y python3 --version=3.10.14
RUN choco install -m -y python3 --version=3.11.9

RUN c:\python35\python.exe -m pip install --upgrade pip
RUN c:\python35\python.exe -m pip install numpy six pillow wheel
@@ -62,7 +64,13 @@ RUN c:\python38\python.exe -m pip install numpy six pillow wheel
RUN c:\python39\python.exe -m pip install --upgrade pip
RUN c:\python39\python.exe -m pip install numpy six pillow wheel

RUN c:\python310\python.exe -m pip install --upgrade pip
RUN c:\python310\python.exe -m pip install numpy six pillow wheel

RUN c:\python311\python.exe -m pip install --upgrade pip
RUN c:\python311\python.exe -m pip install numpy six pillow wheel

# Install libusb release package
ARG LIBUSB_VERSION=1.0.24
ARG LIBUSB_VERSION=1.0.27
ADD https://github.com/libusb/libusb/releases/download/v${LIBUSB_VERSION}/libusb-${LIBUSB_VERSION}.7z c:/windows/temp
RUN 7z x -oc:\libusb c:\windows\temp\libusb-$env:LIBUSB_VERSION.7z
6 changes: 3 additions & 3 deletions docker/docker.mk
@@ -3,7 +3,7 @@ DOCKER_MK_DIR := $(realpath $(dir $(lastword $(MAKEFILE_LIST))))
# Docker
DOCKER_CPUS ?= k8 armv7a aarch64
DOCKER_TARGETS ?=
DOCKER_IMAGE ?= debian:stretch
DOCKER_IMAGE ?= debian:bookworm
DOCKER_TAG_BASE ?= "bazel-cross"
DOCKER_TAG := "$(DOCKER_TAG_BASE)-$(subst :,-,$(DOCKER_IMAGE))"
DOCKER_SHELL_COMMAND ?=
@@ -16,15 +16,15 @@ endif
WORKSPACE := /workspace
MAKE_COMMAND := \
for cpu in $(DOCKER_CPUS); do \
make CPU=\$${cpu} COMPILATION_MODE=$(COMPILATION_MODE) -C $(WORKSPACE)/$(DOCKER_WORKSPACE_CD) $(DOCKER_TARGETS) || exit 1; \
TF_PYTHON_VERSION=$(DOCKER_TF_PYTHON_VERSION) make CPU=\$${cpu} COMPILATION_MODE=$(COMPILATION_MODE) -C $(WORKSPACE)/$(DOCKER_WORKSPACE_CD) $(DOCKER_TARGETS) || exit 1; \
done

define run_command
chmod a+w /; \
groupadd --gid $(shell id -g) $(shell id -g -n); \
useradd -m -e '' -s /bin/bash --gid $(shell id -g) --uid $(shell id -u) $(shell id -u -n); \
echo '$(shell id -u -n) ALL=(ALL) NOPASSWD:ALL' >> /etc/sudoers; \
su $(shell id -u -n) $(if $(1),-c '$(1)',)
su $(if $(1),-c '$(1)',)
endef

docker-image:
42 changes: 42 additions & 0 deletions docker/docker.mk.orig
@@ -0,0 +1,42 @@
DOCKER_MK_DIR := $(realpath $(dir $(lastword $(MAKEFILE_LIST))))

# Docker
DOCKER_CPUS ?= k8 armv7a aarch64
DOCKER_TARGETS ?=
DOCKER_IMAGE ?= debian:bookworm
DOCKER_TAG_BASE ?= "bazel-cross"
DOCKER_TAG := "$(DOCKER_TAG_BASE)-$(subst :,-,$(DOCKER_IMAGE))"
DOCKER_SHELL_COMMAND ?=
DOCKER_WORKSPACE_CD ?=

ifndef DOCKER_WORKSPACE
$(error DOCKER_WORKSPACE is not defined)
endif

WORKSPACE := /workspace
MAKE_COMMAND := \
for cpu in $(DOCKER_CPUS); do \
make CPU=\$${cpu} COMPILATION_MODE=$(COMPILATION_MODE) -C $(WORKSPACE)/$(DOCKER_WORKSPACE_CD) $(DOCKER_TARGETS) || exit 1; \
done

define run_command
chmod a+w /; \
groupadd --gid $(shell id -g) $(shell id -g -n); \
useradd -m -e '' -s /bin/bash --gid $(shell id -g) --uid $(shell id -u) $(shell id -u -n); \
echo '$(shell id -u -n) ALL=(ALL) NOPASSWD:ALL' >> /etc/sudoers; \
su $(if $(1),-c '$(1)',)
endef

docker-image:
docker build $(DOCKER_IMAGE_OPTIONS) -t $(DOCKER_TAG) \
--build-arg IMAGE=$(DOCKER_IMAGE) $(DOCKER_MK_DIR)

docker-shell: docker-image
docker run --rm -i --tty -v $(DOCKER_WORKSPACE):$(WORKSPACE) \
--workdir $(WORKSPACE)/$(DOCKER_WORKSPACE_CD) \
$(DOCKER_TAG) /bin/bash -c "$(call run_command,$(DOCKER_SHELL_COMMAND))"

docker-build: docker-image
docker run --rm -i $(shell tty -s && echo --tty) -v $(DOCKER_WORKSPACE):$(WORKSPACE) \
$(DOCKER_TAG) /bin/bash -c "$(call run_command,$(MAKE_COMMAND))"

26 changes: 26 additions & 0 deletions docker/update_sources.orig.sh
@@ -0,0 +1,26 @@
#!/bin/bash
#
# Copyright 2019 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# https://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
. /etc/os-release

[[ "${NAME}" == "Ubuntu" ]] || exit 0

sed -i "s/deb\ /deb \[arch=amd64\]\ /g" /etc/apt/sources.list

cat <<EOT >> /etc/apt/sources.list
deb [arch=arm64,armhf] http://ports.ubuntu.com/ubuntu-ports ${UBUNTU_CODENAME} main universe
deb [arch=arm64,armhf] http://ports.ubuntu.com/ubuntu-ports ${UBUNTU_CODENAME}-updates main universe
deb [arch=arm64,armhf] http://ports.ubuntu.com/ubuntu-ports ${UBUNTU_CODENAME}-security main universe
EOT
36 changes: 36 additions & 0 deletions docker/update_sources.sh
@@ -19,8 +19,44 @@

sed -i "s/deb\ /deb \[arch=amd64\]\ /g" /etc/apt/sources.list

if [ "${UBUNTU_CODENAME}" == "noble" ]; then

echo "NOBLE"

rm /etc/apt/sources.list.d/ubuntu.sources
cat <<EOT >> /etc/apt/sources.list.d/ubuntu.sources
Types: deb
URIs: http://archive.ubuntu.com/ubuntu/
Suites: ${UBUNTU_CODENAME} ${UBUNTU_CODENAME}-updates ${UBUNTU_CODENAME}-backports
Components: main universe restricted multiverse
Signed-By: /usr/share/keyrings/ubuntu-archive-keyring.gpg
Architectures: amd64
## Ubuntu security updates. Aside from URIs and Suites,
## this should mirror your choices in the previous section.
Types: deb
URIs: http://security.ubuntu.com/ubuntu/
Suites: ${UBUNTU_CODENAME}-security
Components: main universe restricted multiverse
Signed-By: /usr/share/keyrings/ubuntu-archive-keyring.gpg
Architectures: amd64
Types: deb
URIs: http://ports.ubuntu.com/ubuntu-ports
Suites: ${UBUNTU_CODENAME} ${UBUNTU_CODENAME}-updates ${UBUNTU_CODENAME}-security
Components: main universe
Signed-By: /usr/share/keyrings/ubuntu-archive-keyring.gpg
Architectures: arm64 armhf
EOT

else

cat <<EOT >> /etc/apt/sources.list
deb [arch=arm64,armhf] http://ports.ubuntu.com/ubuntu-ports ${UBUNTU_CODENAME} main universe
deb [arch=arm64,armhf] http://ports.ubuntu.com/ubuntu-ports ${UBUNTU_CODENAME}-updates main universe
deb [arch=arm64,armhf] http://ports.ubuntu.com/ubuntu-ports ${UBUNTU_CODENAME}-security main universe
EOT

fi
8 changes: 4 additions & 4 deletions scripts/build.sh
@@ -45,11 +45,11 @@ for i in "$@"; do
fi
done

# Build for k8 (use Ubuntu 18.04 for compatibility with most platforms).
docker_build "k8" "ubuntu:18.04"
# Build for k8 (use Ubuntu 22.04 for compatibility with most platforms).
docker_build "k8" "ubuntu:22.04"

# Build for armv7a.
docker_build "armv7a" "debian:stretch"
docker_build "armv7a" "debian:bookworm"

# Build for aarch64.
docker_build "aarch64" "debian:stretch"
docker_build "aarch64" "debian:bookworm"