ci: Consolidate workflows and optimise build pipeline performance

Merge rust-checks.yml and release-image.yml into unified ci-build.yml
workflow that runs faster and more efficiently. The previous setup ran
4+ parallel jobs immediately (format, clippy, test, builds), causing
resource contention. The new pipeline runs max 2 jobs in parallel at
each stage, catching lint/format issues quickly before attempting
expensive compilation.
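
The staging is expressed with ordinary needs: dependencies in the new
workflow; a rough sketch of the job graph (steps omitted):

  jobs:
    fast-checks:         # pre-commit hooks + rustfmt
      runs-on: ubuntu-latest
    clippy-and-tests:    # clippy lints and cargo tests
      runs-on: ubuntu-latest
    build:               # expensive compilation and image builds
      runs-on: ubuntu-latest
      needs: [fast-checks, clippy-and-tests]
    publish:             # multi-platform manifest
      runs-on: ubuntu-latest
      needs: build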

Extract all Rust setup logic from both workflows into a reusable
rust-with-cache composite action. This replaces six separate pieces of
setup (the rust-toolchain, sccache and timelord actions, plus inline
APT and cache steps) with a single action that handles:
- Rust toolchain installation with component selection
- Cross-compilation configuration (previously scattered across
  release-image.yml)
- System dependency installation with proper error handling
- Comprehensive caching (sccache, cargo registry, cargo target, uv
  tools)
- Timeline tracking and performance monitoring

The previous release-image.yml supported cross-compilation, but the
configuration lived inline in the workflow as a collection of
environment variables. The new rust-with-cache action centralises this
behind explicit inputs for pkg-config paths, foreign architecture
setup, and toolchain selection.
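
The native amd64 job can call the action with defaults, while the
ARM64 job passes everything through inputs. Abridged from the new
build job, with the arm64 matrix values inlined:

  - name: Set up Rust with caching
    uses: ./.forgejo/actions/rust-with-cache
    with:
      rust-target: aarch64-unknown-linux-gnu
      dpkg-arch: arm64
      gcc-package: gcc-aarch64-linux-gnu
      gxx-package: g++-aarch64-linux-gnu
      liburing-package: liburing-dev:arm64
      is-cross-compile: 'true'
      cc: aarch64-linux-gnu-gcc
      cxx: aarch64-linux-gnu-g++
      linker: aarch64-linux-gnu-gcc
      cargo-linker-env: CARGO_TARGET_AARCH64_UNKNOWN_LINUX_GNU_LINKER
      pkg-config-path: /usr/lib/aarch64-linux-gnu/pkgconfig
      pkg-config-libdir: /usr/lib/aarch64-linux-gnu/pkgconfig
      pkg-config-sysroot: /usr/aarch64-linux-gnu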

Performance improvements make the pipeline fast enough to consolidate:
- Warmed sccache cache shared between check and build stages
- Optimised cargo target cache to exclude incremental/ and binaries
  (was caching entire target/ directory via buildkit-cache-dance)
- Add restore-keys fallback for better cache hit rates
- Parallel background tasks for Rust setup while APT runs
- Fail-fast on format/lint errors before expensive compilation
- Enable Haswell CPU optimisations for x86_64 builds (AVX2, FMA, etc.)
- Add cross-language LTO (Link-Time Optimisation) for better performance
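
The LTO and CPU tuning amount to a couple of RUSTFLAGS exported in the
build step; roughly what the amd64 build sets:

  export RUSTFLAGS="-C linker-plugin-lto -C link-arg=-flto=thin"   # cross-language thin LTO
  export RUSTFLAGS="$RUSTFLAGS -C target-cpu=haswell"              # x86_64 builds only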

Fix ARM64 cross-compilation reliability issues:
- Move APT installations from background to foreground (background
  processes would hang during package downloads despite
  DEBIAN_FRONTEND=noninteractive)
- Set proper pkg-config environment for cross-compilation
- Configure APT sources to ports.ubuntu.com for foreign architectures
- Replace hardened_malloc with jemalloc (ARM64 unsupported)
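
Concretely, the setup action now writes a ports.ubuntu.com source for
the foreign architecture and exports the cross pkg-config variables;
abridged, with the arm64 values shown:

  # /etc/apt/sources.list.d/arm64.list
  deb [arch=arm64] http://ports.ubuntu.com/ubuntu-ports/ jammy main restricted universe multiverse

  # Exported via $GITHUB_ENV for the cross build
  PKG_CONFIG_ALLOW_CROSS=1
  PKG_CONFIG_PATH=/usr/lib/aarch64-linux-gnu/pkgconfig
  PKG_CONFIG_LIBDIR=/usr/lib/aarch64-linux-gnu/pkgconfig
  PKG_CONFIG_SYSROOT_DIR=/usr/aarch64-linux-gnu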

Modernisation from the previous commit (b0ebdb59):
- prefligit renamed to prek (to avoid typosquatting)
- Direct uvx rustup replacing custom rust-toolchain action
- Workflow renames: deploy-element, deploy-docs, docker-mirror
- Renovate configuration for .forgejo/ workflows
- fix-byte-order-marker replacing check-byte-order-marker
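
Both the pre-commit hooks and the toolchain are now driven directly
through uv, e.g. in the fast-checks job:

  uvx prek run --show-diff-on-failure --color=always -v --all-files --hook-stage manual
  uvx rustup override set nightly
  uvx rustup component add rustfmt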

Docker improvements:
- Remove buildkit-cache-dance injection (now handled natively)
- Align tag naming between arch-specific and multi-platform builds
- Add branch- prefix for non-default branches
- Reserve latest-{arch} tags for version releases only
- Remove dynamic library extraction logic (ldd doesn't work for
  cross-compiled binaries; Rust --release produces mostly-static binaries)
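
The runtime image is now a minimal scratch image defined inline in the
build job; abridged from the generated Dockerfile:

  FROM docker.io/library/debian:bookworm-slim AS certs
  RUN apt-get update && apt-get install -y ca-certificates && update-ca-certificates

  FROM scratch
  COPY --from=certs /etc/ssl/certs /etc/ssl/certs
  COPY conduwuit /sbin/conduwuit
  EXPOSE 8008
  CMD ["/sbin/conduwuit"]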

Additional improvements based on maintainer feedback:
- Generate SBOM (Software Bill of Materials) for security compliance
- Include SBOM in uploaded build artefacts alongside binary
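
SBOM generation is a single cargo-cyclonedx invocation per target in
the build step, e.g.:

  cargo cyclonedx --format json --target x86_64-unknown-linux-gnu \
    > target/x86_64-unknown-linux-gnu/release/conduwuit.sbom.json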

The consolidated pipeline completes in ~10 minutes with better
resource utilisation and clearer failure diagnostics. Both x86_64 and
ARM64 builds now work reliably with the centralised cross-compilation
configuration.
Tom Foster 2025-08-15 21:06:06 +01:00
commit b481ff31c0
16 changed files with 1423 additions and 631 deletions


@ -0,0 +1,81 @@
name: display-log-group
description: |
Display a log file in a collapsible group with timing and error handling.
Expects log files to have ELAPSED_TIME and EXIT_CODE metadata lines.
inputs:
name:
description: 'Name of the task (for display)'
required: true
log-file:
description: 'Path to the log file'
required: true
done-file:
description: 'Path to the done marker file to wait for'
required: true
group-icon:
description: 'Icon to show in group title'
required: false
default: '📦'
filter-pattern:
description: 'Optional grep pattern to filter output'
required: false
default: ''
max-lines:
description: 'Maximum lines to show (0 for unlimited)'
required: false
default: '0'
runs:
using: composite
steps:
- name: Display ${{ inputs.name }} results
shell: bash
run: |
# Wait for task completion
while [ ! -f "${{ inputs.done-file }}" ]; do
sleep 0.1
done
echo "::group::${{ inputs.group-icon }} ${{ inputs.name }}"
if [ -f "${{ inputs.log-file }}" ]; then
# Extract metadata
ELAPSED=$(grep "^ELAPSED_TIME=" "${{ inputs.log-file }}" | cut -d= -f2 || echo "unknown")
TOTAL=$(grep "^TOTAL_TIME=" "${{ inputs.log-file }}" | cut -d= -f2 || echo "")
EXIT_CODE=$(grep "^EXIT_CODE=" "${{ inputs.log-file }}" | cut -d= -f2 || echo "1")
# Show output (excluding metadata lines)
if [[ -n "${{ inputs.filter-pattern }}" ]]; then
# Filter output by pattern
grep -E "${{ inputs.filter-pattern }}" "${{ inputs.log-file }}" | grep -v "^ELAPSED_TIME=\|^EXIT_CODE=\|^TOTAL_TIME=" || true
else
# Show full output
if [[ "${{ inputs.max-lines }}" != "0" ]]; then
grep -v "^ELAPSED_TIME=\|^EXIT_CODE=\|^TOTAL_TIME=" "${{ inputs.log-file }}" | head -${{ inputs.max-lines }}
else
grep -v "^ELAPSED_TIME=\|^EXIT_CODE=\|^TOTAL_TIME=" "${{ inputs.log-file }}"
fi
fi
# Show result with timing info
if [ "$EXIT_CODE" = "0" ]; then
if [ -n "$TOTAL" ]; then
echo "✅ ${{ inputs.name }} completed (${ELAPSED}s task, ${TOTAL}s total)"
else
echo "✅ ${{ inputs.name }} completed (${ELAPSED}s)"
fi
else
if [ -n "$TOTAL" ]; then
echo "⚠️ ${{ inputs.name }} had issues (exit code: $EXIT_CODE) (${ELAPSED}s task, ${TOTAL}s total)"
else
echo "⚠️ ${{ inputs.name }} had issues (exit code: $EXIT_CODE) (${ELAPSED}s)"
fi
# Don't exit with failure - let the main job continue so we can see what went wrong
# The actual build steps will fail if packages are missing
fi
else
echo "⚠️ No log file found at ${{ inputs.log-file }}"
fi
echo "::endgroup::"


@ -1,27 +0,0 @@
name: prefligit
description: |
Runs prefligit, pre-commit reimplemented in Rust.
inputs:
extra_args:
description: options to pass to pre-commit run
required: false
default: '--all-files'
runs:
using: composite
steps:
- name: Install uv
uses: https://github.com/astral-sh/setup-uv@v6
with:
enable-cache: true
ignore-nothing-to-cache: true
- name: Install Prefligit
shell: bash
run: |
curl --proto '=https' --tlsv1.2 -LsSf https://github.com/j178/prefligit/releases/download/v0.0.10/prefligit-installer.sh | sh
- uses: actions/cache@v3
with:
path: ~/.cache/prefligit
key: prefligit-0|${{ hashFiles('.pre-commit-config.yaml') }}
- run: prefligit run --show-diff-on-failure --color=always -v ${{ inputs.extra_args }}
shell: bash


@ -1,63 +0,0 @@
name: rust-toolchain
description: |
Install a Rust toolchain using rustup.
See https://rust-lang.github.io/rustup/concepts/toolchains.html#toolchain-specification
for more information about toolchains.
inputs:
toolchain:
description: |
Rust toolchain name.
See https://rust-lang.github.io/rustup/concepts/toolchains.html#toolchain-specification
required: false
target:
description: Target triple to install for this toolchain
required: false
components:
description: Space-separated list of components to be additionally installed for a new toolchain
required: false
outputs:
rustc_version:
description: The rustc version installed
value: ${{ steps.rustc-version.outputs.version }}
rustup_version:
description: The rustup version installed
value: ${{ steps.rustup-version.outputs.version }}
runs:
using: composite
steps:
- name: Check if rustup is already installed
shell: bash
id: rustup-version
run: |
echo "version=$(rustup --version)" >> $GITHUB_OUTPUT
- name: Cache rustup toolchains
if: steps.rustup-version.outputs.version == ''
uses: actions/cache@v3
with:
path: |
~/.rustup
!~/.rustup/tmp
!~/.rustup/downloads
# Requires repo to be cloned if toolchain is not specified
key: ${{ runner.os }}-rustup-${{ inputs.toolchain || hashFiles('**/rust-toolchain.toml') }}
- name: Install Rust toolchain
if: steps.rustup-version.outputs.version == ''
shell: bash
run: |
if ! command -v rustup &> /dev/null ; then
curl --proto '=https' --tlsv1.2 --retry 10 --retry-connrefused -fsSL "https://sh.rustup.rs" | sh -s -- --default-toolchain none -y
echo "${CARGO_HOME:-$HOME/.cargo}/bin" >> $GITHUB_PATH
fi
- shell: bash
run: |
set -x
${{ inputs.toolchain && format('rustup override set {0}', inputs.toolchain) }}
${{ inputs.target && format('rustup target add {0}', inputs.target) }}
${{ inputs.components && format('rustup component add {0}', inputs.components) }}
cargo --version
rustc --version
- id: rustc-version
shell: bash
run: |
echo "version=$(rustc --version)" >> $GITHUB_OUTPUT


@ -0,0 +1,846 @@
name: setup-rust-with-cache
description: |
Complete Rust setup with comprehensive caching for Continuwuity CI.
Installs Rust toolchain and combines sccache, timelord, cargo registry, and system package caching.
inputs:
cache-key-suffix:
description: 'Optional suffix for cache keys (e.g. platform identifier)'
required: false
default: ''
rust-target:
description: 'Rust compilation target (e.g. aarch64-unknown-linux-gnu)'
required: false
default: ''
toolchain:
description: 'Rust toolchain override (e.g. nightly, 1.87.0). Uses rust-toolchain.toml if not specified'
required: false
default: ''
components:
description: 'Additional Rust components to install (e.g. rustfmt for nightly)'
required: false
default: ''
dpkg-arch:
description: 'Debian architecture to add for cross-compilation (e.g. arm64)'
required: false
default: ''
gcc-package:
description: 'GCC package to install (e.g. gcc or gcc-aarch64-linux-gnu)'
required: false
default: 'gcc'
gxx-package:
description: 'G++ package to install (e.g. g++ or g++-aarch64-linux-gnu)'
required: false
default: 'g++'
liburing-package:
description: 'liburing package to install (e.g. liburing-dev or liburing-dev:arm64)'
required: false
default: 'liburing-dev'
extra-packages:
description: 'Additional APT packages to install (space-separated)'
required: false
default: ''
is-cross-compile:
description: 'Whether this is a cross-compilation build'
required: false
default: 'false'
cc:
description: 'C compiler to use (e.g. gcc or aarch64-linux-gnu-gcc)'
required: false
default: 'gcc'
cxx:
description: 'C++ compiler to use (e.g. g++ or aarch64-linux-gnu-g++)'
required: false
default: 'g++'
linker:
description: 'Linker to use (e.g. gcc or aarch64-linux-gnu-gcc)'
required: false
default: 'gcc'
march:
description: 'Architecture to compile for (e.g. x86-64 or armv8-a)'
required: false
default: ''
cargo-linker-env:
description: 'Cargo linker environment variable name (e.g. CARGO_TARGET_X86_64_UNKNOWN_LINUX_GNU_LINKER)'
required: false
default: ''
pkg-config-path:
description: 'PKG_CONFIG_PATH for cross-compilation'
required: false
default: ''
pkg-config-libdir:
description: 'PKG_CONFIG_LIBDIR for cross-compilation'
required: false
default: ''
pkg-config-sysroot:
description: 'PKG_CONFIG_SYSROOT_DIR for cross-compilation'
required: false
default: ''
outputs:
sccache-cache-key:
description: 'The cache key to use for saving sccache cache'
value: ${{ steps.sccache-key.outputs.cache-key }}
runs:
using: composite
steps:
# Record action start time for timeline tracking
- name: Initialize timeline tracking
shell: bash
run: |
ACTION_START=$(date +%s)
echo "ACTION_START_TIME=$ACTION_START" >> $GITHUB_ENV
# Set non-interactive mode for all APT operations to prevent hangs
echo "DEBIAN_FRONTEND=noninteractive" >> $GITHUB_ENV
echo "::group::🎭 Rust build environment setup"
echo "🕐 Started at $(date '+%Y-%m-%d %H:%M:%S')"
# Export the sccache cache key early for the parent workflow to use
- name: Export sccache cache key
id: sccache-key
shell: bash
run: |
echo "cache-key=sccache-v1${{ inputs.cache-key-suffix && format('-{0}', inputs.cache-key-suffix) || '' }}-${{ hashFiles('**/Cargo.lock') }}-${{ github.run_id }}-${{ github.run_attempt }}" >> $GITHUB_OUTPUT
# Install uv for Python/Rust toolchain management
- name: Install uv
shell: bash
run: |
echo "::group::🔧 Phase 1: Core dependencies"
echo "::group::📦 Installing uv package manager"
- uses: https://github.com/astral-sh/setup-uv@v6
with:
enable-cache: true
ignore-nothing-to-cache: true
cache-dependency-glob: '' # Disable Python dependency tracking for Rust project
- shell: bash
if: always()
run: echo "::endgroup::"
# Set up PATH for uv tools installed by this action
- name: Configure PATH for uv tools
shell: bash
run: |
echo "✓ Configuring PATH for uv tools"
# uv tools get installed to ~/.local/share/uv/tools/*/bin
# Add sccache's actual location to PATH
echo "$HOME/.local/share/uv/tools/sccache/bin" >> $GITHUB_PATH
# Cache uv tools to avoid redownloading sccache
- name: Cache uv tools
shell: bash
run: |
echo "::group::💾 Phase 2: Cache restoration"
echo "::group::├─ UV tools cache"
- uses: actions/cache@v4
with:
path: ~/.local/share/uv/tools
key: uv-tools-sccache-v1
- shell: bash
if: always()
run: echo "::endgroup::"
# Cache cargo bin directory for timelord and other installed tools
- name: Cache cargo bin
shell: bash
run: |
echo "::group::├─ Cargo binaries cache"
- uses: actions/cache@v4
id: cargo-bin-cache
with:
path: ~/.cargo/bin
key: cargo-bin-v1-${{ hashFiles('**/Cargo.lock') }}
restore-keys: |
cargo-bin-v1-
- shell: bash
if: always()
run: echo "::endgroup::"
# Configure architecture if needed (before starting parallel tasks)
- name: Configure architecture for cross-compilation
if: inputs.dpkg-arch != ''
shell: bash
run: |
echo "::group::🔧 Adding ${{ inputs.dpkg-arch }} architecture"
sudo dpkg --add-architecture ${{ inputs.dpkg-arch }}
# First, restrict default sources to amd64 only to prevent 404s for foreign architectures
echo "📝 Restricting default APT sources to amd64 only..."
sudo sed -i 's/^deb http/deb [arch=amd64] http/g' /etc/apt/sources.list
sudo sed -i 's/^deb https/deb [arch=amd64] https/g' /etc/apt/sources.list
# Also handle any existing sources in sources.list.d
if ls /etc/apt/sources.list.d/*.list 2>/dev/null; then
for file in /etc/apt/sources.list.d/*.list; do
# Skip our own files we're about to create
if [[ "$file" != *"${{ inputs.dpkg-arch }}.list" ]]; then
sudo sed -i 's/^deb http/deb [arch=amd64] http/g' "$file" 2>/dev/null || true
sudo sed -i 's/^deb https/deb [arch=amd64] https/g' "$file" 2>/dev/null || true
fi
done
fi
# Configure APT sources for the foreign architecture using ports.ubuntu.com
echo "📝 Configuring APT sources for ${{ inputs.dpkg-arch }} architecture..."
sudo tee /etc/apt/sources.list.d/${{ inputs.dpkg-arch }}.list > /dev/null <<EOF
deb [arch=${{ inputs.dpkg-arch }}] http://ports.ubuntu.com/ubuntu-ports/ jammy main restricted universe multiverse
deb [arch=${{ inputs.dpkg-arch }}] http://ports.ubuntu.com/ubuntu-ports/ jammy-updates main restricted universe multiverse
deb [arch=${{ inputs.dpkg-arch }}] http://ports.ubuntu.com/ubuntu-ports/ jammy-backports main restricted universe multiverse
deb [arch=${{ inputs.dpkg-arch }}] http://ports.ubuntu.com/ubuntu-ports/ jammy-security main restricted universe multiverse
EOF
echo "✅ Architecture ${{ inputs.dpkg-arch }} added with APT sources configured"
echo "::endgroup::"
# Setup Rust toolchain and APT in parallel
- name: Start parallel setup tasks
shell: bash
run: |
echo "::endgroup::"
echo "::endgroup::"
echo "::group::🚀 Phase 3: Parallel installations"
START_TIME=$(date +%s)
ELAPSED=$((START_TIME - $ACTION_START_TIME))
echo "📍 Starting parallel tasks at ${ELAPSED}s"
# Clean up any previous markers and logs
rm -f /tmp/*.done /tmp/*.log
# Start Rust toolchain setup in background
(
RUST_START=$(date +%s)
{
set -e # Exit on error
if [[ -n "${{ inputs.toolchain }}" ]]; then
echo "Installing custom Rust toolchain: ${{ inputs.toolchain }}"
uvx rustup override set ${{ inputs.toolchain }}
else
echo "Installing Rust toolchain from rust-toolchain.toml"
uvx rustup show # This will auto-install from rust-toolchain.toml
fi
# Add any additional components
if [[ -n "${{ inputs.components }}" ]]; then
echo "Installing additional components: ${{ inputs.components }}"
uvx rustup component add ${{ inputs.components }}
fi
} > /tmp/rust_setup.log 2>&1
EXIT_CODE=$?
RUST_END=$(date +%s)
echo "ELAPSED_TIME=$(($RUST_END - $RUST_START))" >> /tmp/rust_setup.log
echo "TOTAL_TIME=$(($RUST_END - $ACTION_START_TIME))" >> /tmp/rust_setup.log
echo "EXIT_CODE=$EXIT_CODE" >> /tmp/rust_setup.log
# Only create done marker if successful
if [ $EXIT_CODE -eq 0 ]; then
touch /tmp/rust_setup.done
else
echo "ERROR: Rust setup failed with exit code $EXIT_CODE" >> /tmp/rust_setup.log
fi
) &
RUST_PID=$!
echo " ⏳ Rust toolchain setup started (PID: $RUST_PID)"
# Start APT update in background
(
APT_START=$(date +%s)
{
set -e # Exit on error
sudo apt-get update
} > /tmp/apt_update.log 2>&1
EXIT_CODE=$?
APT_END=$(date +%s)
echo "ELAPSED_TIME=$(($APT_END - $APT_START))" >> /tmp/apt_update.log
echo "TOTAL_TIME=$(($APT_END - $ACTION_START_TIME))" >> /tmp/apt_update.log
echo "EXIT_CODE=$EXIT_CODE" >> /tmp/apt_update.log
# Only create done marker if successful
if [ $EXIT_CODE -eq 0 ]; then
touch /tmp/apt_update.done
else
echo "ERROR: APT update failed with exit code $EXIT_CODE" >> /tmp/apt_update.log
fi
) &
APT_PID=$!
echo " ⏳ APT update started (PID: $APT_PID)"
echo "::endgroup::"
# Determine packages to install
- name: Determine packages
id: packages
shell: bash
run: |
# Base packages
PACKAGES="clang ${{ inputs.gcc-package }} ${{ inputs.gxx-package }} ${{ inputs.liburing-package }}"
# Add any extra packages
if [[ -n "${{ inputs.extra-packages }}" ]]; then
PACKAGES="$PACKAGES ${{ inputs.extra-packages }}"
fi
echo "📦 Packages to install: $PACKAGES"
echo "packages=$PACKAGES" >> $GITHUB_OUTPUT
# Install APT packages synchronously
- name: Install APT packages
shell: bash
run: |
echo "::group::🔨 Installing APT packages"
# Wait for APT update to complete first
WAIT_START=$(date +%s)
MAX_WAIT=120 # 2 minutes timeout for APT update
while [ ! -f /tmp/apt_update.done ]; do
sleep 0.5
CURRENT_TIME=$(date +%s)
ELAPSED=$((CURRENT_TIME - WAIT_START))
if [ $ELAPSED -eq 1 ]; then
echo "⏳ Waiting for APT update to complete..."
fi
if [ $ELAPSED -gt $MAX_WAIT ]; then
echo "❌ Error: APT update timed out after 2 minutes"
if [ -f /tmp/apt_update.log ]; then
echo "📋 APT update log:"
cat /tmp/apt_update.log
fi
exit 1
fi
done
echo "✓ APT update completed"
PACKAGES="${{ steps.packages.outputs.packages }}"
echo "📦 Installing packages: $PACKAGES"
# Install packages synchronously
sudo DEBIAN_FRONTEND=noninteractive apt-get install -y $PACKAGES
EXIT_CODE=$?
if [ $EXIT_CODE -ne 0 ]; then
echo "⚠️ Package installation failed with code $EXIT_CODE, attempting to fix..."
sudo DEBIAN_FRONTEND=noninteractive apt-get install -f -y
EXIT_CODE=$?
fi
if [ $EXIT_CODE -eq 0 ]; then
echo "✅ APT packages installed successfully"
else
echo "❌ APT package installation failed"
exit $EXIT_CODE
fi
echo "::endgroup::"
# Configure compilation environment
- name: Configure compilation environment
shell: bash
run: |
echo "::group::🔧 Configuring compilation environment"
# Set compiler and linker
echo "CC=${{ inputs.cc }}" >> $GITHUB_ENV
echo "CXX=${{ inputs.cxx }}" >> $GITHUB_ENV
# Set architecture-specific flags if provided
if [[ -n "${{ inputs.march }}" ]]; then
echo "CFLAGS=-march=${{ inputs.march }}" >> $GITHUB_ENV
echo "CXXFLAGS=-march=${{ inputs.march }}" >> $GITHUB_ENV
fi
# Set Rust target-specific linker if provided
if [[ -n "${{ inputs.cargo-linker-env }}" ]]; then
echo "${{ inputs.cargo-linker-env }}=${{ inputs.linker }}" >> $GITHUB_ENV
fi
# Configure pkg-config for cross-compilation if needed
if [[ "${{ inputs.is-cross-compile }}" == "true" ]]; then
echo "PKG_CONFIG_ALLOW_CROSS=1" >> $GITHUB_ENV
if [[ -n "${{ inputs.pkg-config-path }}" ]]; then
echo "PKG_CONFIG_PATH=${{ inputs.pkg-config-path }}" >> $GITHUB_ENV
fi
if [[ -n "${{ inputs.pkg-config-libdir }}" ]]; then
echo "PKG_CONFIG_LIBDIR=${{ inputs.pkg-config-libdir }}" >> $GITHUB_ENV
fi
if [[ -n "${{ inputs.pkg-config-sysroot }}" ]]; then
echo "PKG_CONFIG_SYSROOT_DIR=${{ inputs.pkg-config-sysroot }}" >> $GITHUB_ENV
fi
fi
echo "✅ Compilation environment configured"
echo "::endgroup::"
# Install tools in parallel
- name: Start tool installations in parallel
shell: bash
run: |
echo "::group::🛠️ Tool installation"
# Create sccache directory early at default location
mkdir -p /root/.cache/sccache
# Start sccache setup in background (using uv)
(
SCCACHE_START=$(date +%s)
uv tool install sccache > /tmp/sccache_install.log 2>&1
EXIT_CODE=$?
SCCACHE_END=$(date +%s)
echo "ELAPSED_TIME=$(($SCCACHE_END - $SCCACHE_START))" >> /tmp/sccache_install.log
echo "TOTAL_TIME=$(($SCCACHE_END - $ACTION_START_TIME))" >> /tmp/sccache_install.log
echo "EXIT_CODE=$EXIT_CODE" >> /tmp/sccache_install.log
touch /tmp/sccache_install.done
) &
SCCACHE_PID=$!
echo " ⏳ sccache installation started (PID: $SCCACHE_PID)"
# Start timelord installation in background
(
# Wait for Rust to be ready since cargo install needs it
while [ ! -f /tmp/rust_setup.done ]; do
sleep 0.5
done
TIMELORD_START=$(date +%s)
# Check if timelord is already available from cache
if command -v timelord &> /dev/null; then
echo "Timelord already available from cache" > /tmp/timelord_install.log
TIMELORD_VERSION=$(timelord --version 2>&1 || echo "unknown version")
echo "Version: $TIMELORD_VERSION" >> /tmp/timelord_install.log
EXIT_CODE=0
else
echo "Installing timelord-cli..." > /tmp/timelord_install.log
cargo install --locked timelord-cli >> /tmp/timelord_install.log 2>&1
EXIT_CODE=$?
fi
TIMELORD_END=$(date +%s)
echo "ELAPSED_TIME=$(($TIMELORD_END - $TIMELORD_START))" >> /tmp/timelord_install.log
echo "TOTAL_TIME=$(($TIMELORD_END - $ACTION_START_TIME))" >> /tmp/timelord_install.log
echo "EXIT_CODE=$EXIT_CODE" >> /tmp/timelord_install.log
touch /tmp/timelord_install.done
) &
TIMELORD_PID=$!
echo " ⏳ timelord installation started (PID: $TIMELORD_PID)"
echo "::endgroup::"
# Configure and start sccache
- name: Configure and start sccache
shell: bash
run: |
echo "::group::⚙️ Configuring sccache"
# Wait for sccache setup to complete
WAIT_START=$(date +%s)
MAX_WAIT=60 # 60 seconds timeout
while [ ! -f /tmp/sccache_install.done ]; do
sleep 0.1
CURRENT_TIME=$(date +%s)
ELAPSED=$((CURRENT_TIME - WAIT_START))
if [ $ELAPSED -eq 1 ]; then
echo "⏳ Waiting for sccache installation..."
fi
if [ $ELAPSED -gt $MAX_WAIT ]; then
echo "❌ Error: sccache setup timed out after 60 seconds"
exit 1
fi
done
# Ensure PATH includes uv tools for this shell
export PATH="$HOME/.local/share/uv/tools/sccache/bin:$PATH"
# Verify sccache is available
if ! command -v sccache &> /dev/null; then
echo "❌ Error: sccache not found in PATH after installation"
echo "PATH=$PATH"
echo "Checking installation:"
ls -la $HOME/.local/share/uv/tools/sccache/bin/ 2>/dev/null || echo " UV tools directory not found"
exit 1
fi
echo "✅ Found sccache at: $(command -v sccache)"
# Configure sccache environment
echo "📝 Configuring sccache environment variables"
echo "SCCACHE_DIR=/root/.cache/sccache" >> $GITHUB_ENV
echo "SCCACHE_CACHE_SIZE=10G" >> $GITHUB_ENV
echo "RUSTC_WRAPPER=sccache" >> $GITHUB_ENV
echo "CMAKE_C_COMPILER_LAUNCHER=sccache" >> $GITHUB_ENV
echo "CMAKE_CXX_COMPILER_LAUNCHER=sccache" >> $GITHUB_ENV
# Start sccache daemon in background immediately
# It will be ready by the time we need it for compilation
sccache --start-server &>/dev/null &
SCCACHE_PID=$!
# Continue with other setup while sccache starts
SCCACHE_READY=$(date +%s)
TOTAL_ELAPSED=$(($SCCACHE_READY - $ACTION_START_TIME))
echo " ✅ sccache server started (PID: $SCCACHE_PID) at ${TOTAL_ELAPSED}s total"
touch /tmp/sccache_ready.done
echo "::endgroup::"
# Load timelord cache for timestamps
- name: Load timelord files
shell: bash
run: |
echo "::group::├─ Timelord timestamp cache"
- uses: actions/cache/restore@v4
with:
path: /timelord/
key: timelord-v0${{ inputs.cache-key-suffix && format('-{0}', inputs.cache-key-suffix) || '' }}
- shell: bash
if: always()
run: echo "::endgroup::"
- name: Run timelord sync in background
shell: bash
run: |
# Start timelord sync in background after timelord is installed
(
# Wait for timelord installation (silently since we're already tracking it elsewhere)
while [ ! -f /tmp/timelord_install.done ]; do
sleep 0.5
done
# Check if timelord binary actually exists before trying to run it
if command -v timelord &> /dev/null; then
SYNC_START=$(date +%s)
timelord sync --source-dir . --cache-dir /timelord/ > /tmp/timelord_sync.log 2>&1
EXIT_CODE=$?
SYNC_END=$(date +%s)
echo "ELAPSED_TIME=$(($SYNC_END - $SYNC_START))" >> /tmp/timelord_sync.log
echo "TOTAL_TIME=$(($SYNC_END - $ACTION_START_TIME))" >> /tmp/timelord_sync.log
echo "EXIT_CODE=$EXIT_CODE" >> /tmp/timelord_sync.log
else
echo "Timelord binary not found, skipping sync" > /tmp/timelord_sync.log
echo "ELAPSED_TIME=0" >> /tmp/timelord_sync.log
echo "EXIT_CODE=0" >> /tmp/timelord_sync.log
fi
touch /tmp/timelord_sync.done
) &
echo "⏳ Timestamp synchronisation started in background"
- name: Save timelord
if: always()
shell: bash
run: |
echo "::group::💾 Saving timestamp cache"
# Ensure the timelord directory exists before trying to cache it
if [ ! -d /timelord ]; then
echo "Creating /timelord directory for cache"
sudo mkdir -p /timelord
sudo chmod 777 /timelord
fi
- uses: actions/cache/save@v4
if: always()
with:
path: /timelord/
key: timelord-v0${{ inputs.cache-key-suffix && format('-{0}', inputs.cache-key-suffix) || '' }}
- shell: bash
if: always()
run: echo "::endgroup::"
# Cache sccache directory (auto-saves at end of job)
- name: Cache sccache compilation artifacts
shell: bash
run: |
echo "::group::├─ Build artefact cache (sccache)"
- uses: actions/cache@v4
id: sccache-cache
with:
path: /root/.cache/sccache
# Use a unique key with timestamp to force saving updated cache
key: sccache-v1${{ inputs.cache-key-suffix && format('-{0}', inputs.cache-key-suffix) || '' }}-${{ hashFiles('**/Cargo.lock') }}-${{ github.run_id }}-${{ github.run_attempt }}
restore-keys: |
sccache-v1${{ inputs.cache-key-suffix && format('-{0}', inputs.cache-key-suffix) || '' }}-${{ hashFiles('**/Cargo.lock') }}-
sccache-v1${{ inputs.cache-key-suffix && format('-{0}', inputs.cache-key-suffix) || '' }}-
sccache-v1-
- shell: bash
if: always()
run: echo "::endgroup::"
# Cache Rust registry
- name: Cache Rust registry
shell: bash
run: |
echo "::group::├─ Rust registry cache"
- uses: actions/cache@v4
with:
path: |
~/.cargo/git
!~/.cargo/git/checkouts
~/.cargo/registry
!~/.cargo/registry/src
key: rust-registry${{ inputs.cache-key-suffix && format('-{0}', inputs.cache-key-suffix) || '' }}-${{ hashFiles('**/Cargo.lock') }}
restore-keys: |
rust-registry${{ inputs.cache-key-suffix && format('-{0}', inputs.cache-key-suffix) || '' }}-
rust-registry-
- shell: bash
if: always()
run: echo "::endgroup::"
# Cache cargo target directory (optimised for size and performance)
# Only caches compiled dependencies and build artefacts, not incremental compilation or binaries
- name: Cache cargo target
shell: bash
run: |
echo "::group::└─ Compiled dependencies cache"
- uses: actions/cache@v4
with:
path: |
# Include compiled dependencies, build scripts and fingerprints for all profiles
target/**/deps
target/**/build
target/**/.fingerprint
# Exclude incremental compilation cache (changes frequently, poor cache hits)
!target/**/incremental
# Exclude final binaries (rebuilt anyway when code changes)
!target/**/conduwuit
!target/**/conduwuit.exe
# Exclude dependency tracking files (regenerated quickly)
!target/**/*.d
key: cargo-target-v2${{ inputs.cache-key-suffix && format('-{0}', inputs.cache-key-suffix) || '' }}-${{ hashFiles('**/Cargo.lock') }}-${{ hashFiles('**/*.rs') }}
restore-keys: |
cargo-target-v2${{ inputs.cache-key-suffix && format('-{0}', inputs.cache-key-suffix) || '' }}-${{ hashFiles('**/Cargo.lock') }}-
cargo-target-v2${{ inputs.cache-key-suffix && format('-{0}', inputs.cache-key-suffix) || '' }}-
- shell: bash
if: always()
run: echo "::endgroup::"
# Cache cross-compilation toolchain
- name: Cache cross-compilation toolchain
if: inputs.is-cross-compile == 'true'
shell: bash
run: |
echo "::endgroup::"
echo "::endgroup::"
# Install cross-compilation target (no caching - it's fast enough)
- name: Install cross-compilation target
if: inputs.is-cross-compile == 'true'
shell: bash
run: |
echo "::group::🎯 Cross-compilation target setup"
# Wait for basic Rust setup to complete first
WAIT_START=$(date +%s)
while [ ! -f /tmp/rust_setup.done ]; do
sleep 0.5
CURRENT_TIME=$(date +%s)
ELAPSED=$((CURRENT_TIME - WAIT_START))
if [ $ELAPSED -eq 2 ]; then
echo "⏳ Waiting for Rust toolchain to be ready..."
fi
if [ $ELAPSED -gt 300 ]; then
echo "❌ Error: Rust setup timed out after 5 minutes"
exit 1
fi
done
source /tmp/env_vars.sh 2>/dev/null || true
ACTION_START_TIME=${ACTION_START_TIME:-$(date +%s)}
TARGET_START=$(date +%s)
# Check if target is already installed
echo "Checking if target ${{ inputs.rust-target }} is already installed..."
if uvx rustup target list --installed 2>/dev/null | grep -q "^${{ inputs.rust-target }}$"; then
echo "✅ Target ${{ inputs.rust-target }} is already installed"
else
echo "Installing cross-compilation target: ${{ inputs.rust-target }}"
uvx rustup target add ${{ inputs.rust-target }}
if [ $? -eq 0 ]; then
echo "✅ Target installed successfully"
else
echo "❌ Error: Failed to install target ${{ inputs.rust-target }}"
exit 1
fi
fi
TARGET_END=$(date +%s)
TOTAL_ELAPSED=$(($TARGET_END - $ACTION_START_TIME))
echo "✅ Cross-compilation target ready ($(($TARGET_END - $TARGET_START))s task, ${TOTAL_ELAPSED}s total)"
# Create done marker for compatibility with sync step
touch /tmp/rust_target.done
echo "::endgroup::"
# Wait for all background tasks to complete and display results
- name: Ensure all setup tasks are ready
shell: bash
run: |
echo "::group::⏳ Phase 4: Synchronisation"
SYNC_START=$(date +%s)
# List of marker files to wait for (only background tasks now)
MARKERS=(
"/tmp/rust_setup.done"
"/tmp/apt_update.done"
"/tmp/sccache_ready.done"
"/tmp/timelord_sync.done"
)
# Pretty names for tasks
declare -A TASK_NAMES
TASK_NAMES["rust_setup"]="Rust toolchain"
TASK_NAMES["apt_update"]="APT repository update"
TASK_NAMES["sccache_ready"]="sccache server"
TASK_NAMES["timelord_sync"]="Timelord (install/sync)"
echo "📊 Awaiting parallel task completion:"
for MARKER in "${MARKERS[@]}"; do
WAIT_START=$(date +%s)
# Set timeout for tasks
MAX_WAIT=300 # 5 minutes timeout for background tasks
TASK_KEY=$(basename "$MARKER" .done)
TASK_NAME="${TASK_NAMES[$TASK_KEY]}"
while [ ! -f "$MARKER" ]; do
sleep 0.5
CURRENT_TIME=$(date +%s)
ELAPSED=$((CURRENT_TIME - WAIT_START))
if [ $ELAPSED -eq 2 ]; then
echo " ⏳ Waiting for $TASK_NAME..."
fi
# Show periodic status updates for long-running tasks
if [ $(($ELAPSED % 60)) -eq 0 ] && [ $ELAPSED -gt 0 ]; then
echo " Still waiting for $TASK_NAME (${ELAPSED}s elapsed)..."
fi
if [ $ELAPSED -gt $MAX_WAIT ]; then
TIMEOUT_MINS=$(($MAX_WAIT / 60))
echo " ❌ Error: $TASK_NAME timed out after ${TIMEOUT_MINS} minutes"
# Try to show log file content for debugging
LOG_FILE="/tmp/${TASK_KEY}.log"
if [ -f "$LOG_FILE" ]; then
echo " 📋 Last 20 lines of $LOG_FILE:"
tail -20 "$LOG_FILE" | sed 's/^/ /'
fi
exit 1
fi
done
FINAL_TIME=$(date +%s)
WAIT_TIME=$((FINAL_TIME - WAIT_START))
TOTAL_TIME=$((FINAL_TIME - $ACTION_START_TIME))
echo " ✅ $TASK_NAME ready (${WAIT_TIME}s wait, ${TOTAL_TIME}s total)"
done
SYNC_END=$(date +%s)
TOTAL_ACTION_TIME=$(($SYNC_END - $ACTION_START_TIME))
echo ""
echo "🎉 All setup tasks completed in $(($SYNC_END - $SYNC_START))s (${TOTAL_ACTION_TIME}s total)"
echo "::endgroup::"
# Close phase 3 group before starting task results
- name: Close parallel phase
shell: bash
run: |
echo "::endgroup::"
echo "::group::📋 Task results"
# Display all task results now that everything is complete
- name: Display Rust setup results
uses: ./.forgejo/actions/display-log-group
with:
name: '├─ Rust toolchain setup'
log-file: '/tmp/rust_setup.log'
done-file: '/tmp/rust_setup.done'
group-icon: '🦀'
- name: Display APT update results
uses: ./.forgejo/actions/display-log-group
with:
name: '├─ APT repository update'
log-file: '/tmp/apt_update.log'
done-file: '/tmp/apt_update.done'
group-icon: '📦'
max-lines: '20'
- name: Display sccache installation results
uses: ./.forgejo/actions/display-log-group
with:
name: '├─ sccache installation'
log-file: '/tmp/sccache_install.log'
done-file: '/tmp/sccache_install.done'
group-icon: '⚡'
- name: Display timelord installation results
uses: ./.forgejo/actions/display-log-group
with:
name: '├─ Timelord installation'
log-file: '/tmp/timelord_install.log'
done-file: '/tmp/timelord_install.done'
filter-pattern: 'Compiling|Finished|Installing|already available|Version|error'
group-icon: '⏰'
- name: Display timelord sync results
uses: ./.forgejo/actions/display-log-group
with:
name: '├─ Timelord sync'
log-file: '/tmp/timelord_sync.log'
done-file: '/tmp/timelord_sync.done'
group-icon: '🔄'
- name: Display cross-compilation target results
if: inputs.is-cross-compile == 'true'
uses: ./.forgejo/actions/display-log-group
with:
name: '└─ Cross-compilation target'
log-file: '/tmp/rust_target_install.log'
done-file: '/tmp/rust_target.done'
group-icon: '🎯'
max-lines: '5'
# Print summary
- name: Print setup summary
shell: bash
run: |
echo "::endgroup::"
echo "::group::✅ Setup summary"
echo "✅ Build environment ready"
echo " • Rust toolchain: $(rustc --version 2>/dev/null | head -1 || echo 'installed')"
echo " • sccache: $(sccache --version 2>/dev/null | head -1 || echo 'installed')"
echo " • Timelord: $(timelord --version 2>/dev/null || echo 'installed')"
echo " • APT packages: ${{ steps.packages.outputs.packages }}"
echo " • Compiler: CC=${{ inputs.cc }}, CXX=${{ inputs.cxx }}"
if [[ -n "${{ inputs.march }}" ]]; then
echo " • Architecture flags: -march=${{ inputs.march }}"
fi
if [[ "${{ inputs.is-cross-compile }}" == "true" ]]; then
echo " • Cross-compilation target: ${{ inputs.rust-target }}"
echo " • Linker: ${{ inputs.linker }}"
fi
echo "::endgroup::"
echo "::endgroup::"
# Save cargo bin cache
- name: Save cargo bin cache
if: always() && steps.cargo-bin-cache.outputs.cache-hit != 'true'
shell: bash
run: |
echo "::group::💾 Saving binary tools cache"
- uses: actions/cache/save@v4
if: always() && steps.cargo-bin-cache.outputs.cache-hit != 'true'
with:
path: ~/.cargo/bin
key: cargo-bin-v1-${{ hashFiles('**/Cargo.lock') }}
- shell: bash
if: always() && steps.cargo-bin-cache.outputs.cache-hit != 'true'
run: echo "::endgroup::"
# Close the main action group
- name: Close main group
shell: bash
run: |
echo "🎆 Build environment setup complete"
echo "::endgroup::"


@ -1,29 +0,0 @@
name: sccache
description: |
Install sccache for caching builds in GitHub Actions.
inputs:
token:
description: 'A Github PAT'
required: false
runs:
using: composite
steps:
- name: Install sccache
uses: https://github.com/mozilla-actions/sccache-action@v0.0.9
with:
token: ${{ inputs.token }}
- name: Configure sccache
uses: https://github.com/actions/github-script@v7
with:
script: |
core.exportVariable('ACTIONS_RESULTS_URL', process.env.ACTIONS_RESULTS_URL || '');
core.exportVariable('ACTIONS_RUNTIME_TOKEN', process.env.ACTIONS_RUNTIME_TOKEN || '');
- shell: bash
run: |
echo "SCCACHE_GHA_ENABLED=true" >> $GITHUB_ENV
echo "RUSTC_WRAPPER=sccache" >> $GITHUB_ENV
echo "CMAKE_C_COMPILER_LAUNCHER=sccache" >> $GITHUB_ENV
echo "CMAKE_CXX_COMPILER_LAUNCHER=sccache" >> $GITHUB_ENV
echo "CMAKE_CUDA_COMPILER_LAUNCHER=sccache" >> $GITHUB_ENV


@ -1,46 +0,0 @@
name: timelord
description: |
Use timelord to set file timestamps
inputs:
key:
description: |
The key to use for caching the timelord data.
This should be unique to the repository and the runner.
required: true
default: timelord-v0
path:
description: |
The path to the directory to be timestamped.
This should be the root of the repository.
required: true
default: .
runs:
using: composite
steps:
- name: Cache timelord-cli installation
id: cache-timelord-bin
uses: actions/cache@v3
with:
path: ~/.cargo/bin/timelord
key: timelord-cli-v3.0.1
- name: Install timelord-cli
uses: https://github.com/cargo-bins/cargo-binstall@main
if: steps.cache-timelord-bin.outputs.cache-hit != 'true'
- run: cargo binstall timelord-cli@3.0.1
shell: bash
if: steps.cache-timelord-bin.outputs.cache-hit != 'true'
- name: Load timelord files
uses: actions/cache/restore@v3
with:
path: /timelord/
key: ${{ inputs.key }}
- name: Run timelord to set timestamps
shell: bash
run: timelord sync --source-dir ${{ inputs.path }} --cache-dir /timelord/
- name: Save timelord
uses: actions/cache/save@v3
with:
path: /timelord/
key: ${{ inputs.key }}


@ -0,0 +1,413 @@
name: Checks / Build / Publish
on:
push:
paths-ignore:
- "*.md"
- "**/*.md"
- ".gitlab-ci.yml"
- ".gitignore"
- "renovate.json"
- "debian/**"
- "docker/**"
- "docs/**"
pull_request:
paths-ignore:
- "*.md"
- "**/*.md"
- ".gitlab-ci.yml"
- ".gitignore"
- "renovate.json"
- "debian/**"
- "docker/**"
- "docs/**"
workflow_dispatch:
# Cancel in-progress runs when a new push is made to the same branch
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: ${{ github.event_name == 'pull_request' }} # Cancel PRs but not main branch builds
env:
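# Push to the builtin registry when it is configured and either registry credentials are set or the event is not a fork pull request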
BUILTIN_REGISTRY_ENABLED: "${{ vars.BUILTIN_REGISTRY != '' && ((vars.BUILTIN_REGISTRY_USER && secrets.BUILTIN_REGISTRY_PASSWORD) || (github.event_name != 'pull_request' || github.event.pull_request.head.repo.fork == false)) && 'true' || 'false' }}"
jobs:
# Phase 1: Fast checks (formatting, linting)
fast-checks:
name: Pre-commit & Formatting
runs-on: ubuntu-latest
steps:
- name: Checkout repository
uses: actions/checkout@v4
with:
persist-credentials: false
- name: Install uv
uses: https://github.com/astral-sh/setup-uv@v6
with:
enable-cache: true
ignore-nothing-to-cache: true
cache-dependency-glob: ''
- name: Run prek (formerly prefligit)
run: uvx prek run --show-diff-on-failure --color=always -v --all-files --hook-stage manual
- name: Install Rust nightly with rustfmt
run: |
uvx rustup override set nightly
uvx rustup component add rustfmt
- name: Check formatting
run: |
cargo +nightly fmt --all -- --check && echo "✅ Formatting check passed - all code is properly formatted" || exit 1
# Phase 2: Clippy and tests
clippy-and-tests:
name: Clippy and Cargo Tests
runs-on: ubuntu-latest
steps:
- name: Checkout repository
uses: actions/checkout@v4
with:
persist-credentials: false
- name: Set up Rust with caching
uses: ./.forgejo/actions/rust-with-cache
- name: Run Clippy lints
run: |
cargo clippy \
--workspace \
--features full \
--locked \
--no-deps \
--profile test \
-- \
-D warnings
- name: Run Cargo tests
run: |
cargo test \
--workspace \
--features full \
--locked \
--profile test \
--all-targets \
--no-fail-fast
# Phase 3: Build binaries (depends on both test phases)
build:
name: Build ${{ matrix.platform }}
runs-on: ubuntu-latest
needs: [fast-checks, clippy-and-tests] # Wait for both test jobs to complete
env:
# Define image variables once for reuse
IMAGE_REPOSITORY: ${{ github.repository }}
IMAGE_REGISTRY: ${{ vars.BUILTIN_REGISTRY }}
IMAGE_NAME: ${{ vars.BUILTIN_REGISTRY && format('{0}/{1}', vars.BUILTIN_REGISTRY, github.repository) || '' }}
permissions:
contents: read
packages: write
attestations: write
id-token: write
strategy:
matrix:
include:
- platform: linux/amd64
slug: linux-amd64
rust_target: x86_64-unknown-linux-gnu
march: haswell
cc: gcc
cxx: g++
linker: gcc
dpkg_arch: ""
gcc_package: gcc
gxx_package: g++
liburing_package: liburing-dev
is_cross_compile: false
cargo_linker_env: CARGO_TARGET_X86_64_UNKNOWN_LINUX_GNU_LINKER
pkg_config_path: ""
pkg_config_libdir: ""
pkg_config_sysroot: ""
target_cpu: haswell
profile: release
- platform: linux/arm64
slug: linux-arm64
rust_target: aarch64-unknown-linux-gnu
march: armv8-a
cc: aarch64-linux-gnu-gcc
cxx: aarch64-linux-gnu-g++
linker: aarch64-linux-gnu-gcc
dpkg_arch: arm64
gcc_package: gcc-aarch64-linux-gnu
gxx_package: g++-aarch64-linux-gnu
liburing_package: liburing-dev:arm64
is_cross_compile: true
cargo_linker_env: CARGO_TARGET_AARCH64_UNKNOWN_LINUX_GNU_LINKER
pkg_config_path: /usr/lib/aarch64-linux-gnu/pkgconfig
pkg_config_libdir: /usr/lib/aarch64-linux-gnu/pkgconfig
pkg_config_sysroot: /usr/aarch64-linux-gnu
target_cpu: base
profile: release
steps:
- name: Checkout repository
uses: actions/checkout@v4
with:
persist-credentials: false
- name: Set up Rust with caching
id: rust-setup
uses: ./.forgejo/actions/rust-with-cache
with:
cache-key-suffix: ${{ matrix.slug }}
rust-target: ${{ matrix.rust_target }}
dpkg-arch: ${{ matrix.dpkg_arch }}
gcc-package: ${{ matrix.gcc_package }}
gxx-package: ${{ matrix.gxx_package }}
liburing-package: ${{ matrix.liburing_package }}
is-cross-compile: ${{ matrix.is_cross_compile }}
cc: ${{ matrix.cc }}
cxx: ${{ matrix.cxx }}
linker: ${{ matrix.linker }}
march: ${{ matrix.march }}
cargo-linker-env: ${{ matrix.cargo_linker_env }}
pkg-config-path: ${{ matrix.pkg_config_path }}
pkg-config-libdir: ${{ matrix.pkg_config_libdir }}
pkg-config-sysroot: ${{ matrix.pkg_config_sysroot }}
- name: Build binary
run: |
# Set up cross-language LTO for better optimization
export RUSTFLAGS="-C linker-plugin-lto -C link-arg=-flto=thin"
# Configure target CPU optimization if specified
if [[ "${{ matrix.target_cpu }}" != "base" ]]; then
export RUSTFLAGS="$RUSTFLAGS -C target-cpu=${{ matrix.target_cpu }}"
echo "Building with CPU optimizations for: ${{ matrix.target_cpu }}"
fi
# Build with standard features plus jemalloc (excluding hardened_malloc as it conflicts)
cargo build \
--release \
--locked \
--features "standard,jemalloc_prof,perf_measurements,tokio_console" \
--target ${{ matrix.rust_target }}
# Generate SBOM for the binary artifact
cargo cyclonedx --format json --target ${{ matrix.rust_target }} > target/${{ matrix.rust_target }}/release/conduwuit.sbom.json || echo "SBOM generation failed (cargo-cyclonedx may not be installed)"
- name: Upload binary artefact
uses: forgejo/upload-artifact@v4
with:
name: conduwuit-${{ matrix.target_cpu }}-${{ matrix.slug }}-${{ matrix.profile }}
path: |
target/${{ matrix.rust_target }}/release/conduwuit
target/${{ matrix.rust_target }}/release/conduwuit.sbom.json
if-no-files-found: error
- name: Prepare Docker context
run: |
# Create Docker build context
mkdir -p docker-context
# Copy binary
cp target/${{ matrix.rust_target }}/release/conduwuit docker-context/
# Note: We rely on Rust producing mostly-static binaries with --release
# Core system libraries (libc, libm, etc.) will be dynamically linked but
# these are handled by the base image. Application libraries like jemalloc
# are statically linked into the binary.
# Create minimal Dockerfile inline with proper multi-stage build
cat > docker-context/Dockerfile << 'EOF'
# Stage 1: Get SSL certificates from Debian
FROM docker.io/library/debian:bookworm-slim AS certs
RUN apt-get update && apt-get install -y ca-certificates && update-ca-certificates
# Stage 2: Final scratch image
FROM scratch
ARG GIT_COMMIT_HASH
ARG GIT_COMMIT_HASH_SHORT
ARG GIT_REMOTE_COMMIT_URL
ARG GIT_REMOTE_URL
ARG SOURCE_DATE_EPOCH
ENV GIT_COMMIT_HASH=${GIT_COMMIT_HASH}
ENV GIT_COMMIT_HASH_SHORT=${GIT_COMMIT_HASH_SHORT}
ENV GIT_REMOTE_COMMIT_URL=${GIT_REMOTE_COMMIT_URL}
ENV GIT_REMOTE_URL=${GIT_REMOTE_URL}
ENV SOURCE_DATE_EPOCH=${SOURCE_DATE_EPOCH}
COPY --from=certs /etc/ssl/certs /etc/ssl/certs
COPY conduwuit /sbin/conduwuit
EXPOSE 8008
CMD ["/sbin/conduwuit"]
EOF
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
- name: Log in to container registry
if: vars.BUILTIN_REGISTRY != ''
uses: docker/login-action@v3
with:
registry: ${{ vars.BUILTIN_REGISTRY }}
username: ${{ vars.BUILTIN_REGISTRY_USER || gitea.actor }}
password: ${{ secrets.BUILTIN_REGISTRY_PASSWORD || secrets.FORGEJO_TOKEN }}
- name: Extract Docker metadata
id: meta
uses: docker/metadata-action@v5
with:
images: ${{ env.IMAGE_NAME }}
env:
DOCKER_METADATA_ANNOTATIONS_LEVELS: manifest,index
- name: Get build metadata
id: build-meta
run: |
echo "epoch=$(git log -1 --pretty=%ct)" >> $GITHUB_OUTPUT
echo "sha_short=$(git rev-parse --short ${{ github.sha }})" >> $GITHUB_OUTPUT
- name: Build and push Docker image
id: build
uses: docker/build-push-action@v6
with:
context: docker-context
platforms: ${{ matrix.platform }}
labels: ${{ steps.meta.outputs.labels }}
annotations: ${{ steps.meta.outputs.annotations }}
# Enable registry-based layer caching
cache-from: type=registry,ref=${{ env.IMAGE_NAME }}:buildcache-${{ matrix.slug }}
cache-to: type=registry,ref=${{ env.IMAGE_NAME }}:buildcache-${{ matrix.slug }},mode=max
sbom: true
outputs: type=image,"name=${{ env.IMAGE_NAME }}",push-by-digest=true,name-canonical=true,push=true
build-args: |
GIT_COMMIT_HASH=${{ github.sha }}
GIT_COMMIT_HASH_SHORT=${{ steps.build-meta.outputs.sha_short }}
GIT_REMOTE_COMMIT_URL=${{ github.event.head_commit.url }}
GIT_REMOTE_URL=${{ github.event.repository.html_url }}
SOURCE_DATE_EPOCH=${{ steps.build-meta.outputs.epoch }}
- name: Push architecture-specific tags
if: vars.BUILTIN_REGISTRY != ''
run: |
# Determine if we need a branch- prefix (for non-default branches)
DEFAULT_BRANCH="${{ github.event.repository.default_branch || 'main' }}"
REF_NAME="${{ github.ref_name }}"
# Add branch- prefix for non-default branches, matching multi-platform logic
if [[ "${{ github.ref }}" == "refs/heads/${DEFAULT_BRANCH}" ]]; then
# Default branch: use name as-is
TAG_PREFIX="${REF_NAME//\//-}"
elif [[ "${{ github.ref }}" == refs/heads/* ]]; then
# Non-default branch: add branch- prefix
TAG_PREFIX="branch-${REF_NAME//\//-}"
elif [[ "${{ github.ref }}" == refs/tags/* ]]; then
# Tag: use as-is
TAG_PREFIX="${REF_NAME}"
elif [[ "${{ github.ref }}" == refs/pull/* ]]; then
# Pull request: use pr-NUMBER format
TAG_PREFIX="pr-${{ github.event.pull_request.number }}"
else
# Fallback
TAG_PREFIX="${REF_NAME//\//-}"
fi
# Create architecture-specific tag
docker buildx imagetools create \
--tag "${{ env.IMAGE_NAME }}:${TAG_PREFIX}-${{ matrix.slug }}" \
"${{ env.IMAGE_NAME }}@${{ steps.build.outputs.digest }}"
# Also create a latest-{arch} tag if this is a version tag (matching multi-platform logic)
if [[ "${{ github.ref }}" == refs/tags/v* ]]; then
docker buildx imagetools create \
--tag "${{ env.IMAGE_NAME }}:latest-${{ matrix.slug }}" \
"${{ env.IMAGE_NAME }}@${{ steps.build.outputs.digest }}"
fi
- name: Export and upload digest
run: |
mkdir -p /tmp/digests
digest="${{ steps.build.outputs.digest }}"
touch "/tmp/digests/${digest#sha256:}"
- uses: forgejo/upload-artifact@v4
with:
name: digests-${{ matrix.slug }}
path: /tmp/digests/*
if-no-files-found: error
retention-days: 5
# Phase 4: Publish multi-platform manifest
publish:
name: Publish Multi-Platform
runs-on: ubuntu-latest
needs: build
if: vars.BUILTIN_REGISTRY != '' # Only run if we have a registry configured
env:
# Define image variables once for reuse
IMAGE_REPOSITORY: ${{ github.repository }}
IMAGE_REGISTRY: ${{ vars.BUILTIN_REGISTRY }}
IMAGE_NAME: ${{ vars.BUILTIN_REGISTRY && format('{0}/{1}', vars.BUILTIN_REGISTRY, github.repository) || '' }}
steps:
- name: Download digests
uses: forgejo/download-artifact@v4
with:
path: /tmp/digests
pattern: digests-*
merge-multiple: true
- name: Log in to container registry
if: vars.BUILTIN_REGISTRY != ''
uses: docker/login-action@v3
with:
registry: ${{ vars.BUILTIN_REGISTRY }}
username: ${{ vars.BUILTIN_REGISTRY_USER || gitea.actor }}
password: ${{ secrets.BUILTIN_REGISTRY_PASSWORD || secrets.FORGEJO_TOKEN }}
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
- name: Extract Docker tags
id: meta
uses: docker/metadata-action@v5
with:
tags: |
type=semver,pattern={{version}},prefix=v
type=semver,pattern={{major}}.{{minor}},enable=${{ !startsWith(github.ref, 'refs/tags/v0.0.') }},prefix=v
type=semver,pattern={{major}},enable=${{ !startsWith(github.ref, 'refs/tags/v0.') }},prefix=v
type=ref,event=branch,prefix=${{ format('refs/heads/{0}', github.event.repository.default_branch) != github.ref && 'branch-' || '' }}
type=ref,event=pr
type=raw,value=latest,enable=${{ startsWith(github.ref, 'refs/tags/v') }}
images: ${{ env.IMAGE_NAME }}
# default labels & annotations: https://github.com/docker/metadata-action/blob/master/src/meta.ts#L509
env:
DOCKER_METADATA_ANNOTATIONS_LEVELS: index
- name: Create and push manifest
working-directory: /tmp/digests
env:
IMAGES: ${{ env.IMAGE_NAME }}
shell: bash
run: |
IFS=$'\n'
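# docker/metadata-action emits newline-separated lists; split them into bash arrays before looping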
IMAGES_LIST=($IMAGES)
ANNOTATIONS_LIST=($DOCKER_METADATA_OUTPUT_ANNOTATIONS)
TAGS_LIST=($DOCKER_METADATA_OUTPUT_TAGS)
for REPO in "${IMAGES_LIST[@]}"; do
docker buildx imagetools create \
$(for tag in "${TAGS_LIST[@]}"; do echo "--tag"; echo "$tag"; done) \
$(for annotation in "${ANNOTATIONS_LIST[@]}"; do echo "--annotation"; echo "$annotation"; done) \
$(for reference in *; do printf "$REPO@sha256:%s\n" $reference; done)
done
- name: Inspect image
env:
IMAGES: ${{ env.IMAGE_NAME }}
shell: bash
run: |
IMAGES_LIST=($IMAGES)
for REPO in "${IMAGES_LIST[@]}"; do
docker buildx imagetools inspect $REPO:${{ steps.meta.outputs.version }}
done


@ -1,4 +1,4 @@
-name: Documentation
+name: Deploy / Documentation
on:
pull_request:


@ -1,4 +1,4 @@
-name: Mirror Container Images
+name: Deploy / Mirror Images
on:
schedule:


@ -1,22 +0,0 @@
name: Checks / Prefligit
on:
push:
pull_request:
permissions:
contents: read
jobs:
prefligit:
runs-on: ubuntu-latest
env:
FROM_REF: ${{ github.event.pull_request.base.sha || (!github.event.forced && ( github.event.before != '0000000000000000000000000000000000000000' && github.event.before || github.sha )) || format('{0}~', github.sha) }}
TO_REF: ${{ github.sha }}
steps:
- name: Checkout repository
uses: actions/checkout@v4
with:
persist-credentials: false
- uses: ./.forgejo/actions/prefligit
with:
extra_args: --all-files --hook-stage manual


@ -1,296 +0,0 @@
name: Release Docker Image
concurrency:
group: "release-image-${{ github.ref }}"
on:
push:
paths-ignore:
- "*.md"
- "**/*.md"
- ".gitlab-ci.yml"
- ".gitignore"
- "renovate.json"
- "debian/**"
- "docker/**"
- "docs/**"
# Allows you to run this workflow manually from the Actions tab
workflow_dispatch:
env:
BUILTIN_REGISTRY: forgejo.ellis.link
BUILTIN_REGISTRY_ENABLED: "${{ ((vars.BUILTIN_REGISTRY_USER && secrets.BUILTIN_REGISTRY_PASSWORD) || (github.event_name != 'pull_request' || github.event.pull_request.head.repo.fork == false)) && 'true' || 'false' }}"
jobs:
define-variables:
runs-on: ubuntu-latest
outputs:
images: ${{ steps.var.outputs.images }}
images_list: ${{ steps.var.outputs.images_list }}
build_matrix: ${{ steps.var.outputs.build_matrix }}
steps:
- name: Setting variables
uses: https://github.com/actions/github-script@v7
id: var
with:
script: |
const githubRepo = '${{ github.repository }}'.toLowerCase()
const repoId = githubRepo.split('/')[1]
core.setOutput('github_repository', githubRepo)
const builtinImage = '${{ env.BUILTIN_REGISTRY }}/' + githubRepo
let images = []
if (process.env.BUILTIN_REGISTRY_ENABLED === "true") {
images.push(builtinImage)
}
core.setOutput('images', images.join("\n"))
core.setOutput('images_list', images.join(","))
const platforms = ['linux/amd64', 'linux/arm64']
core.setOutput('build_matrix', JSON.stringify({
platform: platforms,
target_cpu: ['base'],
include: platforms.map(platform => { return {
platform,
slug: platform.replace('/', '-')
}})
}))
build-image:
runs-on: dind
needs: define-variables
permissions:
contents: read
packages: write
attestations: write
id-token: write
strategy:
matrix:
{
"target_cpu": ["base"],
"profile": ["release"],
"include":
[
{ "platform": "linux/amd64", "slug": "linux-amd64" },
{ "platform": "linux/arm64", "slug": "linux-arm64" },
],
"platform": ["linux/amd64", "linux/arm64"],
}
steps:
- name: Echo strategy
run: echo '${{ toJSON(fromJSON(needs.define-variables.outputs.build_matrix)) }}'
- name: Echo matrix
run: echo '${{ toJSON(matrix) }}'
- name: Checkout repository
uses: actions/checkout@v4
with:
persist-credentials: false
- name: Install rust
id: rust-toolchain
uses: ./.forgejo/actions/rust-toolchain
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
- name: Set up QEMU
uses: docker/setup-qemu-action@v3
# Uses the `docker/login-action` action to log in to the Container registry registry using the account and password that will publish the packages. Once published, the packages are scoped to the account defined here.
- name: Login to builtin registry
uses: docker/login-action@v3
with:
registry: ${{ env.BUILTIN_REGISTRY }}
username: ${{ vars.BUILTIN_REGISTRY_USER || github.actor }}
password: ${{ secrets.BUILTIN_REGISTRY_PASSWORD || secrets.GITHUB_TOKEN }}
# This step uses [docker/metadata-action](https://github.com/docker/metadata-action#about) to extract tags and labels that will be applied to the specified image. The `id` "meta" allows the output of this step to be referenced in a subsequent step. The `images` value provides the base name for the tags and labels.
- name: Extract metadata (labels, annotations) for Docker
id: meta
uses: docker/metadata-action@v5
with:
images: ${{needs.define-variables.outputs.images}}
# default labels & annotations: https://github.com/docker/metadata-action/blob/master/src/meta.ts#L509
env:
DOCKER_METADATA_ANNOTATIONS_LEVELS: manifest,index
# This step uses the `docker/build-push-action` action to build the image, based on your repository's `Dockerfile`. If the build succeeds, it pushes the image to GitHub Packages.
# It uses the `context` parameter to define the build's context as the set of files located in the specified path. For more information, see "[Usage](https://github.com/docker/build-push-action#usage)" in the README of the `docker/build-push-action` repository.
# It uses the `tags` and `labels` parameters to tag and label the image with the output from the "meta" step.
# It will not push images generated from a pull request
- name: Get short git commit SHA
id: sha
run: |
calculatedSha=$(git rev-parse --short ${{ github.sha }})
echo "COMMIT_SHORT_SHA=$calculatedSha" >> $GITHUB_ENV
- name: Get Git commit timestamps
run: echo "TIMESTAMP=$(git log -1 --pretty=%ct)" >> $GITHUB_ENV
- uses: ./.forgejo/actions/timelord
with:
key: timelord-v0
path: .
- name: Cache Rust registry
uses: actions/cache@v3
with:
path: |
.cargo/git
.cargo/git/checkouts
.cargo/registry
.cargo/registry/src
key: rust-registry-image-${{hashFiles('**/Cargo.lock') }}
- name: Cache cargo target
id: cache-cargo-target
uses: actions/cache@v3
with:
path: |
cargo-target-${{ matrix.target_cpu }}-${{ matrix.slug }}-${{ matrix.profile }}
key: cargo-target-${{ matrix.target_cpu }}-${{ matrix.slug }}-${{ matrix.profile }}-${{hashFiles('**/Cargo.lock') }}-${{steps.rust-toolchain.outputs.rustc_version}}
- name: Cache apt cache
id: cache-apt
uses: actions/cache@v3
with:
path: |
var-cache-apt-${{ matrix.slug }}
key: var-cache-apt-${{ matrix.slug }}
- name: Cache apt lib
id: cache-apt-lib
uses: actions/cache@v3
with:
path: |
var-lib-apt-${{ matrix.slug }}
key: var-lib-apt-${{ matrix.slug }}
- name: inject cache into docker
uses: https://github.com/reproducible-containers/buildkit-cache-dance@v3.1.0
with:
cache-map: |
{
".cargo/registry": "/usr/local/cargo/registry",
".cargo/git/db": "/usr/local/cargo/git/db",
"cargo-target-${{ matrix.target_cpu }}-${{ matrix.slug }}-${{ matrix.profile }}": {
"target": "/app/target",
"id": "cargo-target-${{ matrix.target_cpu }}-${{ matrix.slug }}-${{ matrix.profile }}"
},
"var-cache-apt-${{ matrix.slug }}": "/var/cache/apt",
"var-lib-apt-${{ matrix.slug }}": "/var/lib/apt"
}
skip-extraction: ${{ steps.cache.outputs.cache-hit }}
- name: Build and push Docker image by digest
id: build
uses: docker/build-push-action@v6
with:
context: .
file: "docker/Dockerfile"
build-args: |
GIT_COMMIT_HASH=${{ github.sha }})
GIT_COMMIT_HASH_SHORT=${{ env.COMMIT_SHORT_SHA }}
GIT_REMOTE_URL=${{github.event.repository.html_url }}
GIT_REMOTE_COMMIT_URL=${{github.event.head_commit.url }}
platforms: ${{ matrix.platform }}
labels: ${{ steps.meta.outputs.labels }}
annotations: ${{ steps.meta.outputs.annotations }}
cache-from: type=gha
# cache-to: type=gha,mode=max
sbom: true
outputs: type=image,"name=${{ needs.define-variables.outputs.images_list }}",push-by-digest=true,name-canonical=true,push=true
env:
SOURCE_DATE_EPOCH: ${{ env.TIMESTAMP }}
# For publishing multi-platform manifests
- name: Export digest
run: |
mkdir -p /tmp/digests
digest="${{ steps.build.outputs.digest }}"
touch "/tmp/digests/${digest#sha256:}"
- name: Extract binary from container (image)
id: extract-binary-image
run: |
mkdir -p /tmp/binaries
digest="${{ steps.build.outputs.digest }}"
echo "container_id=$(docker create --platform ${{ matrix.platform }} ${{ needs.define-variables.outputs.images_list }}@$digest)" >> $GITHUB_OUTPUT
- name: Extract binary from container (copy)
run: docker cp ${{ steps.extract-binary-image.outputs.container_id }}:/sbin/conduwuit /tmp/binaries/conduwuit-${{ matrix.target_cpu }}-${{ matrix.slug }}-${{ matrix.profile }}
- name: Extract binary from container (cleanup)
run: docker rm ${{ steps.extract-binary-image.outputs.container_id }}
- name: Upload binary artifact
uses: forgejo/upload-artifact@v4
with:
name: conduwuit-${{ matrix.target_cpu }}-${{ matrix.slug }}-${{ matrix.profile }}
path: /tmp/binaries/conduwuit-${{ matrix.target_cpu }}-${{ matrix.slug }}-${{ matrix.profile }}
if-no-files-found: error
- name: Upload digest
uses: forgejo/upload-artifact@v4
with:
name: digests-${{ matrix.slug }}
path: /tmp/digests/*
if-no-files-found: error
retention-days: 5
merge:
runs-on: dind
needs: [define-variables, build-image]
steps:
- name: Download digests
uses: forgejo/download-artifact@v4
with:
path: /tmp/digests
pattern: digests-*
merge-multiple: true
# Uses the `docker/login-action` action to log in to the container registry using the account and password that will publish the packages. Once published, the packages are scoped to the account defined here.
- name: Login to builtin registry
uses: docker/login-action@v3
with:
registry: ${{ env.BUILTIN_REGISTRY }}
username: ${{ vars.BUILTIN_REGISTRY_USER || github.actor }}
password: ${{ secrets.BUILTIN_REGISTRY_PASSWORD || secrets.GITHUB_TOKEN }}
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
- name: Extract metadata (tags) for Docker
id: meta
uses: docker/metadata-action@v5
with:
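# Tag scheme: semver tags (with the {major}.{minor} and {major} rollups suppressed for v0.0.x and v0.x releases respectively), branch refs prefixed with 'branch-' when not on the default branch, PR refs, full-length SHAs, and 'latest' only for version tag pushes.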
tags: |
type=semver,pattern={{version}},prefix=v
type=semver,pattern={{major}}.{{minor}},enable=${{ !startsWith(github.ref, 'refs/tags/v0.0.') }},prefix=v
type=semver,pattern={{major}},enable=${{ !startsWith(github.ref, 'refs/tags/v0.') }},prefix=v
type=ref,event=branch,prefix=${{ format('refs/heads/{0}', github.event.repository.default_branch) != github.ref && 'branch-' || '' }}
type=ref,event=pr
type=sha,format=long
type=raw,value=latest,enable=${{ startsWith(github.ref, 'refs/tags/v') }}
images: ${{ needs.define-variables.outputs.images }}
# default labels & annotations: https://github.com/docker/metadata-action/blob/master/src/meta.ts#L509
env:
DOCKER_METADATA_ANNOTATIONS_LEVELS: index
- name: Create manifest list and push
working-directory: /tmp/digests
env:
IMAGES: ${{ needs.define-variables.outputs.images }}
shell: bash
run: |
IFS=$'\n'
IMAGES_LIST=($IMAGES)
ANNOTATIONS_LIST=($DOCKER_METADATA_OUTPUT_ANNOTATIONS)
TAGS_LIST=($DOCKER_METADATA_OUTPUT_TAGS)
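# For each image repository, create one manifest list carrying every tag and annotation and referencing each per-architecture digest collected in /tmp/digests.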
for REPO in "${IMAGES_LIST[@]}"; do
docker buildx imagetools create \
$(for tag in "${TAGS_LIST[@]}"; do echo "--tag"; echo "$tag"; done) \
$(for annotation in "${ANNOTATIONS_LIST[@]}"; do echo "--annotation"; echo "$annotation"; done) \
$(for reference in *; do printf "%s@sha256:%s\n" "$REPO" "$reference"; done)
done
- name: Inspect image
env:
IMAGES: ${{ needs.define-variables.outputs.images }}
shell: bash
run: |
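# Sanity check: confirm the freshly pushed manifest list resolves for every repository.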
IMAGES_LIST=($IMAGES)
for REPO in "${IMAGES_LIST[@]}"; do
docker buildx imagetools inspect $REPO:${{ steps.meta.outputs.version }}
done

View file

@ -0,0 +1,60 @@
name: Maintenance / Renovate
on:
schedule:
# Run at 2am UTC daily
- cron: '0 2 * * *'
workflow_dispatch:
inputs:
dryRun:
description: 'Dry run mode'
required: false
default: 'false'
type: choice
options:
- 'true'
- 'false'
logLevel:
description: 'Log level'
required: false
default: 'info'
type: choice
options:
- 'debug'
- 'info'
- 'warn'
- 'error'
push:
branches:
- main
paths:
- '.forgejo/workflows/renovate.yml'
- 'renovate.json'
jobs:
renovate:
name: Renovate
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: actions/checkout@v4
- name: Run Renovate
uses: renovatebot/github-action@v40.1.0
with:
token: ${{ secrets.RENOVATE_TOKEN }}
configurationFile: renovate.json
env:
# Platform settings
RENOVATE_PLATFORM: gitea
RENOVATE_ENDPOINT: ${{ github.server_url }}/api/v1
RENOVATE_TOKEN: ${{ secrets.RENOVATE_TOKEN }}
# Repository settings
RENOVATE_REPOSITORIES: '["${{ github.repository }}"]'
# Behaviour settings
RENOVATE_DRY_RUN: ${{ inputs.dryRun || 'false' }}
LOG_LEVEL: ${{ inputs.logLevel || 'info' }}
# Forgejo/Gitea specific
RENOVATE_GIT_AUTHOR: '${{ vars.RENOVATE_AUTHOR }}'

View file

@ -1,144 +0,0 @@
name: Checks / Rust
on:
push:
jobs:
format:
name: Format
runs-on: ubuntu-latest
steps:
- name: Checkout repository
uses: actions/checkout@v4
with:
persist-credentials: false
- name: Install rust
uses: ./.forgejo/actions/rust-toolchain
with:
toolchain: "nightly"
components: "rustfmt"
- name: Check formatting
run: |
cargo +nightly fmt --all -- --check
clippy:
name: Clippy
runs-on: ubuntu-latest
steps:
- name: Checkout repository
uses: actions/checkout@v4
with:
persist-credentials: false
- name: Install rust
uses: ./.forgejo/actions/rust-toolchain
- uses: https://github.com/actions/create-github-app-token@v2
id: app-token
with:
app-id: ${{ vars.GH_APP_ID }}
private-key: ${{ secrets.GH_APP_PRIVATE_KEY }}
github-api-url: https://api.github.com
owner: ${{ vars.GH_APP_OWNER }}
repositories: ""
- name: Install sccache
uses: ./.forgejo/actions/sccache
with:
token: ${{ steps.app-token.outputs.token }}
- run: sudo apt-get update
- name: Install system dependencies
uses: https://github.com/awalsh128/cache-apt-pkgs-action@v1
with:
packages: clang liburing-dev
version: 1
- name: Cache Rust registry
uses: actions/cache@v3
with:
path: |
~/.cargo/git
!~/.cargo/git/checkouts
~/.cargo/registry
!~/.cargo/registry/src
key: rust-registry-${{hashFiles('**/Cargo.lock') }}
- name: Timelord
uses: ./.forgejo/actions/timelord
with:
key: sccache-v0
path: .
- name: Clippy
run: |
cargo clippy \
--workspace \
--features full \
--locked \
--no-deps \
--profile test \
-- \
-D warnings
- name: Show sccache stats
if: always()
run: sccache --show-stats
cargo-test:
name: Cargo Test
runs-on: ubuntu-latest
steps:
- name: Checkout repository
uses: actions/checkout@v4
with:
persist-credentials: false
- name: Install rust
uses: ./.forgejo/actions/rust-toolchain
- uses: https://github.com/actions/create-github-app-token@v2
id: app-token
with:
app-id: ${{ vars.GH_APP_ID }}
private-key: ${{ secrets.GH_APP_PRIVATE_KEY }}
github-api-url: https://api.github.com
owner: ${{ vars.GH_APP_OWNER }}
repositories: ""
- name: Install sccache
uses: ./.forgejo/actions/sccache
with:
token: ${{ steps.app-token.outputs.token }}
- run: sudo apt-get update
- name: Install system dependencies
uses: https://github.com/awalsh128/cache-apt-pkgs-action@v1
with:
packages: clang liburing-dev
version: 1
- name: Cache Rust registry
uses: actions/cache@v3
with:
path: |
~/.cargo/git
!~/.cargo/git/checkouts
~/.cargo/registry
!~/.cargo/registry/src
key: rust-registry-${{hashFiles('**/Cargo.lock') }}
- name: Timelord
uses: ./.forgejo/actions/timelord
with:
key: sccache-v0
path: .
- name: Cargo Test
run: |
cargo test \
--workspace \
--features full \
--locked \
--profile test \
--all-targets \
--no-fail-fast
- name: Show sccache stats
if: always()
run: sccache --show-stats

View file

@ -9,7 +9,7 @@ repos:
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v5.0.0
hooks:
- id: check-byte-order-marker
- id: fix-byte-order-marker
- id: check-case-conflict
- id: check-symlinks
- id: destroyed-symlinks

View file

@ -22,5 +22,24 @@
"tikv-jemalloc-ctl",
"opentelemetry-rust",
"tracing-opentelemetry"
]
],
"github-actions": {
"enabled": true,
"fileMatch": [
"(^|/)\\.forgejo/workflows/[^/]+\\.ya?ml$",
"(^|/)\\.forgejo/actions/[^/]+/action\\.ya?ml$",
"(^|/)\\.github/workflows/[^/]+\\.ya?ml$",
"(^|/)\\.github/actions/[^/]+/action\\.ya?ml$"
]
},
"packageRules": [
{
"description": "Group all non-major GitHub Actions updates",
"matchManagers": ["github-actions"],
"matchUpdateTypes": ["minor", "patch"],
"groupName": "github-actions-non-major"
}
],
"prConcurrentLimit": 3,
"prHourlyLimit": 2
}