88 changes: 0 additions & 88 deletions .circleci/config.yml

This file was deleted.

126 changes: 126 additions & 0 deletions .github/workflows/_regression-job.yml
@@ -0,0 +1,126 @@
name: Regression Job (Reusable)

on:
  workflow_call:
    inputs:
      platform:
        required: true
        type: string
      language:
        required: true
        type: string
      version:
        required: true
        type: string

jobs:
  test:
    runs-on: ubuntu-latest

    env:
      RESOURCE_PREFIX: sebs-ci

    steps:
      - name: Checkout code
        uses: actions/checkout@v4

      - name: Restore SeBS cache
        uses: actions/cache/restore@v4
        with:
          path: regression-cache/
          key: sebs-cache-${{ github.ref_name }}-${{ inputs.platform }}-${{ inputs.language }}-${{ inputs.version }}
          restore-keys: |
            sebs-cache-${{ github.ref_name }}-

      - name: Setup GCP credentials
        if: inputs.platform == 'gcp'
        run: |
          echo "${{ secrets.GCP_SERVICE_ACCOUNT_JSON }}" > /tmp/gcp-credentials.json
          echo "GOOGLE_APPLICATION_CREDENTIALS=/tmp/gcp-credentials.json" >> $GITHUB_ENV

      - name: Setup Azure credentials
        if: inputs.platform == 'azure'
        run: |
          echo "AZURE_SUBSCRIPTION_ID=${{ secrets.AZURE_SUBSCRIPTION_ID }}" >> $GITHUB_ENV
          echo "AZURE_TENANT_ID=${{ secrets.AZURE_TENANT_ID }}" >> $GITHUB_ENV
          echo "AZURE_CLIENT_ID=${{ secrets.AZURE_CLIENT_ID }}" >> $GITHUB_ENV
          echo "AZURE_CLIENT_SECRET=${{ secrets.AZURE_CLIENT_SECRET }}" >> $GITHUB_ENV

      - name: Setup AWS credentials
        if: inputs.platform == 'aws'
        run: |
          echo "AWS_ACCESS_KEY_ID=${{ secrets.AWS_ACCESS_KEY_ID }}" >> $GITHUB_ENV
          echo "AWS_SECRET_ACCESS_KEY=${{ secrets.AWS_SECRET_ACCESS_KEY }}" >> $GITHUB_ENV
          echo "AWS_DEFAULT_REGION=${{ secrets.AWS_DEFAULT_REGION || 'us-east-1' }}" >> $GITHUB_ENV

      - name: Install uv
        uses: astral-sh/setup-uv@v4

      - name: Install SeBS
        run: uv pip install --system .

      - name: Run regression tests
        timeout-minutes: 5
        run: |
          sebs benchmark regression test \
            --config configs/example.json \
            --deployment ${{ inputs.platform }} \
            --language ${{ inputs.language }} \
            --language-version ${{ inputs.version }} \
            --architecture x64 --selected-architecture \
            --resource-prefix sebs-ci

      - name: Generate test summary
        if: always()
        run: |
          echo "Regression Test Summary" > test-summary.txt
          echo "======================" >> test-summary.txt
          echo "Platform: ${{ inputs.platform }}" >> test-summary.txt
          echo "Language: ${{ inputs.language }}" >> test-summary.txt
          echo "Version: ${{ inputs.version }}" >> test-summary.txt
          echo "" >> test-summary.txt
          if ls regression_*.json 1> /dev/null 2>&1; then
            ls -1 regression_*.json | wc -l | xargs echo "Benchmarks tested:" >> test-summary.txt
            echo "" >> test-summary.txt
            echo "Results saved to artifacts/results/" >> test-summary.txt
          else
            echo "No benchmark results found" >> test-summary.txt
          fi

      - name: Upload test summary
        if: always()
        uses: actions/upload-artifact@v4
        with:
          name: test-summary-${{ inputs.platform }}-${{ inputs.language }}-${{ inputs.version }}
          path: test-summary.txt

      - name: Collect and upload regression results
        if: always()
        run: |
          mkdir -p results
          if ls regression_*.json 1> /dev/null 2>&1; then
            mv regression_*.json results/ || true
          fi

      - name: Upload regression results
        if: always()
        uses: actions/upload-artifact@v4
        with:
          name: results-${{ inputs.platform }}-${{ inputs.language }}-${{ inputs.version }}
          path: results/
          if-no-files-found: ignore

      - name: Upload cache snapshot
        if: always()
        uses: actions/upload-artifact@v4
        with:
          name: cache-snapshot-${{ inputs.platform }}-${{ inputs.language }}-${{ inputs.version }}
          path: cache/
          if-no-files-found: ignore
Comment on lines +113 to +119
⚠️ Potential issue | 🟡 Minor

Upload the correct cache directory in the snapshot artifact.

This workflow restores and saves regression-cache/, but the snapshot upload points to cache/, so the artifact will miss the cache actually used by the regression runs.

🛠️ Proposed fix
       - name: Upload cache snapshot
         if: always()
         uses: actions/upload-artifact@v4
         with:
           name: cache-snapshot-${{ inputs.platform }}-${{ inputs.language }}-${{ inputs.version }}
-          path: cache/
+          path: regression-cache/
           if-no-files-found: ignore
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In .github/workflows/_regression-job.yml around lines 113-119, the "Upload
cache snapshot" step uploads the wrong directory (path: cache/) while the
workflow restores/saves regression-cache/; update the step to upload the actual
regression cache by changing the path from "cache/" to "regression-cache/". Keep
the step name "Upload cache snapshot" and the artifact name template (name:
cache-snapshot-${{ inputs.platform }}-${{ inputs.language }}-${{ inputs.version
}}) unchanged so the snapshot contains the real regression-cache contents, and
keep "if-no-files-found: ignore" so the step does not fail when the directory
is absent.


      - name: Save SeBS cache
        if: success()
        uses: actions/cache/save@v4
        with:
          path: regression-cache/
          key: sebs-cache-${{ github.ref_name }}-${{ inputs.platform }}-${{ inputs.language }}-${{ inputs.version }}
55 changes: 55 additions & 0 deletions .github/workflows/lint.yml
@@ -0,0 +1,55 @@
name: Lint

on:
  push:
  pull_request:
  workflow_dispatch:

jobs:
  linting:
    runs-on: ubuntu-latest

    steps:
      - name: Checkout code
        uses: actions/checkout@v4

      - name: Set up Python 3.10
        uses: actions/setup-python@v5
        with:
          python-version: '3.10'

      - name: Install uv
        uses: astral-sh/setup-uv@v4

      - name: Install system dependencies
        run: sudo apt update && sudo apt install -y libcurl4-openssl-dev

      - name: Cache uv dependencies
        uses: actions/cache@v4
        with:
          path: ~/.cache/uv
          key: uv-${{ runner.os }}-${{ hashFiles('requirements.txt', 'pyproject.toml') }}
          restore-keys: |
            uv-${{ runner.os }}-

      - name: Install SeBS with dev dependencies
        run: uv sync --extra dev

      - name: Python code formatting with black
        run: uv run black sebs --check --config .black.toml

      - name: Python code lint with flake8
        run: uv run flake8 sebs --config=.flake8.cfg --tee --output-file flake-reports

      - name: Python static code verification with mypy
        run: uv run mypy sebs --config-file=.mypy.ini

      - name: Check for Python documentation coverage
        run: uv run interrogate -v --fail-under 100 sebs

      - name: Upload flake8 reports
        if: always()
        uses: actions/upload-artifact@v4
        with:
          name: flake-reports
          path: flake-reports
25 changes: 25 additions & 0 deletions .github/workflows/regression.yml
@@ -0,0 +1,25 @@
name: Regression Tests

on:
  push:
    branches:
      - master
      - 'feature/**'
  workflow_dispatch:

jobs:
  regression:
    strategy:
      matrix:
        include:
          - platform: aws
            language: python
            version: "3.11"
      fail-fast: false

    uses: ./.github/workflows/_regression-job.yml
    with:
      platform: ${{ matrix.platform }}
      language: ${{ matrix.language }}
      version: ${{ matrix.version }}
    secrets: inherit
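The matrix currently pins a single AWS/Python combination, and `fail-fast: false` keeps the remaining entries running if one fails. Broader coverage could later be expressed as extra `include` entries; for example (values here are hypothetical, not part of this PR):

```yaml
strategy:
  matrix:
    include:
      - platform: aws
        language: python
        version: "3.11"
      # hypothetical future entries:
      - platform: gcp
        language: nodejs
        version: "20"
  fail-fast: false
```

Each entry spawns one call of the reusable `_regression-job.yml` workflow with the corresponding inputs.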
5 changes: 0 additions & 5 deletions benchmarks/wrappers/aws/python/setup.py
@@ -1,14 +1,9 @@
 # Copyright 2020-2025 ETH Zurich and the SeBS authors. All rights reserved.
 from distutils.core import setup
 from glob import glob
-from pkg_resources import parse_requirements
-
-with open('requirements.txt') as f:
-    requirements = [str(r) for r in parse_requirements(f)]
 
 setup(
     name='function',
-    install_requires=requirements,
     packages=['function'],
     package_dir={'function': '.'},
     package_data={'function': glob('**', recursive=True)},
2 changes: 2 additions & 0 deletions sebs/azure/cli.py
@@ -129,6 +129,8 @@ def execute(self, cmd: str) -> bytes:
             RuntimeError: If command execution fails.
         """
         exit_code, out = self.docker_instance.exec_run(cmd, user="docker_user")
+        # exec_run without stream=True always returns bytes
+        assert isinstance(out, bytes)
         if exit_code != 0:
             raise RuntimeError(
                 "Command {} failed at Azure CLI docker!\n Output {}".format(
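The new assert narrows `exec_run`'s return type for mypy: docker-py's `exec_run` returns the whole output as bytes by default, but a generator of chunks when called with `stream=True`. A minimal stub illustrating that contract (the stub is ours for illustration, not docker-py code):

```python
from typing import Iterator, Tuple, Union


def exec_run(cmd: str, stream: bool = False) -> Tuple[int, Union[bytes, Iterator[bytes]]]:
    """Stub mimicking docker-py Container.exec_run's return contract."""
    output = b"hello\n"
    if stream:
        # stream=True: output arrives lazily as chunks
        # (in real docker-py the exit code is None in this mode)
        return 0, iter([output])
    return 0, output  # default: the whole output as one bytes object


exit_code, out = exec_run("echo hello")
assert isinstance(out, bytes)  # same narrowing as in cli.py's execute()
```

The assert both documents the non-streaming call site and satisfies mypy, which otherwise sees the union type and rejects bytes-only operations on `out`.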
4 changes: 2 additions & 2 deletions sebs/benchmark.py
@@ -1338,14 +1338,14 @@ def ensure_image(name: str) -> None:
         with open(tar_archive, "rb") as data:
             container.put_archive("/mnt/function", data.read())
         # do the build step
-        exit_code, stdout = container.exec_run(
+        exit_code, stdout = container.exec_run(  # type: ignore[assignment]
             cmd="/bin/bash /sebs/installer.sh",
             user="docker_user",
             stdout=True,
             stderr=True,
         )
         # copy updated code with package
-        data, stat = container.get_archive("/mnt/function")
+        data, stat = container.get_archive("/mnt/function")  # type: ignore[assignment]
         with open(tar_archive, "wb") as output_filef:
             for chunk in data:
                 output_filef.write(chunk)