GitHub Actions CI/CD (8/10): Artifacts — Keep Evidence From a Run
Summary: Generate a JUnit XML test report during CI and upload it as an artifact. When a test fails at 2 AM, you can download the report instead of re-running the workflow to see what happened.
| Key | Value |
|---|---|
| Package name | helloci |
| Working directory | ~/projects/helloci |
| Report format | JUnit XML |
| Report file | reports/junit.xml |
| Artifact name | test-results |
| Retention | 14 days |
0. Prerequisites
- The helloci project with the three-job CI workflow from Part 6
- Branch protection enabled from Part 7
1. What Are Artifacts
Every CI run executes on a fresh virtual machine that is destroyed after the job finishes. Any files created during the run — logs, test reports, build outputs — vanish with it.
Artifacts let you save files from a CI run and download them later. Common uses:
- Test reports (JUnit XML, HTML coverage reports)
- Build outputs (wheels, binaries)
- Log files from integration tests
- Screenshots from browser tests
Artifacts are not git commits. They live on GitHub’s servers, attached to a specific workflow run, and expire after a retention period.
2. Generate a JUnit XML Report Locally
pytest can produce a JUnit XML report with a single flag.
```shell
mkdir -p reports
pytest -v --junitxml=reports/junit.xml
```

```
11 passed
```
Inspect the report.
```shell
ls -la reports/junit.xml
```

```
-rw-r--r-- 1 user user 2847 ... reports/junit.xml
```
The JUnit XML format is a standard that most CI tools understand. GitHub can even parse it to show test results directly in the workflow summary.
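Because the report is plain XML, you can inspect it with the Python standard library instead of reading the raw file. A minimal sketch, assuming the pytest report layout (the `summarize` helper is illustrative, not part of the project):

```python
# Sketch: summarize a JUnit XML report without re-running the tests.
# Assumes pytest's layout: testsuite elements carry tests/failures/errors/
# skipped counts, and each testcase nests a <failure> or <error> on failure.
import xml.etree.ElementTree as ET


def summarize(path: str) -> dict:
    """Return total counts plus the names of failed tests."""
    root = ET.parse(path).getroot()
    # Recent pytest wraps suites in <testsuites>; older versions emit
    # a bare <testsuite> root, so handle both.
    suites = [root] if root.tag == "testsuite" else root.findall("testsuite")
    totals = {"tests": 0, "failures": 0, "errors": 0, "skipped": 0}
    failed = []
    for suite in suites:
        for key in totals:
            totals[key] += int(suite.get(key, 0))
        for case in suite.iter("testcase"):
            if case.find("failure") is not None or case.find("error") is not None:
                failed.append(f'{case.get("classname")}.{case.get("name")}')
    totals["failed_names"] = failed
    return totals
```

Run it against reports/junit.xml locally, or against a report downloaded from CI, to get the failure list without opening the XML by hand.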
Tip: Add reports/ to your .gitignore so generated reports do not get committed to the repository.
3. Update .gitignore
Add the reports directory to .gitignore.
```
__pycache__/
*.egg-info/
dist/
build/
.venv/
*.pyc
reports/
```
4. Add Artifact Upload to the Workflow
Update .github/workflows/ci.yml. Modify the unit-tests and integration-tests jobs to generate reports and upload them as artifacts.
Here is the updated unit-tests job.
```yaml
unit-tests:
  runs-on: ubuntu-latest
  strategy:
    matrix:
      python-version: ["3.10", "3.11", "3.12"]
    fail-fast: false
  steps:
    - name: Check out code
      uses: actions/checkout@v4

    - name: Set up Python ${{ matrix.python-version }}
      uses: actions/setup-python@v5
      with:
        python-version: ${{ matrix.python-version }}

    - name: Install dependencies
      run: pip install -e ".[test]"

    - name: Run unit tests
      run: pytest -v -m "not integration" --junitxml=reports/junit.xml

    - name: Upload test results
      if: always()
      uses: actions/upload-artifact@v4
      with:
        name: unit-test-results-${{ matrix.python-version }}
        path: reports/junit.xml
        retention-days: 14
```
Here is the updated integration-tests job.
```yaml
integration-tests:
  runs-on: ubuntu-latest
  services:
    postgres:
      image: postgres:16
      env:
        POSTGRES_USER: helloci_user
        POSTGRES_PASSWORD: helloci_pass
        POSTGRES_DB: helloci_db
      ports:
        - 5432:5432
      options: >-
        --health-cmd="pg_isready -U helloci_user -d helloci_db"
        --health-interval=10s
        --health-timeout=5s
        --health-retries=5
  steps:
    - name: Check out code
      uses: actions/checkout@v4

    - name: Set up Python
      uses: actions/setup-python@v5
      with:
        python-version: "3.12"

    - name: Install dependencies
      run: pip install -e ".[test]"

    - name: Run integration tests
      env:
        DATABASE_URL: ${{ secrets.DATABASE_URL }}
      run: pytest -v -m integration --junitxml=reports/junit.xml

    - name: Upload test results
      if: always()
      uses: actions/upload-artifact@v4
      with:
        name: integration-test-results
        path: reports/junit.xml
        retention-days: 14
```
5. Understand the Upload Step
```yaml
- name: Upload test results
  if: always()
  uses: actions/upload-artifact@v4
  with:
    name: unit-test-results-${{ matrix.python-version }}
    path: reports/junit.xml
    retention-days: 14
```
| Key | What it does |
|---|---|
| if: always() | Runs this step even if previous steps failed |
| actions/upload-artifact@v4 | GitHub’s official artifact upload action |
| name | A label for the artifact — must be unique per job |
| path | The file or directory to upload |
| retention-days | How long GitHub keeps the artifact (default: 90 days) |
The if: always() condition is critical. Without it, the upload step is skipped when tests fail — which is exactly when you need the report most.
The artifact name includes ${{ matrix.python-version }} so each matrix job produces a uniquely named artifact. Without this, the three matrix jobs would overwrite each other’s artifacts.
6. Push and View Artifacts
Commit and push.
```shell
git add .github/workflows/ci.yml .gitignore
git commit -m "Add JUnit XML artifact upload to CI"
git push
```
Open the Actions tab. Click the workflow run. Scroll to the bottom of the run summary page. You see an Artifacts section listing:

- unit-test-results-3.10 (12 KB)
- unit-test-results-3.11 (12 KB)
- unit-test-results-3.12 (12 KB)
- integration-test-results (8 KB)

Click any artifact to download a ZIP file containing the junit.xml report.
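Since the download arrives as a ZIP, a few lines of Python can read the report straight out of it without extracting anything. A sketch, assuming the artifact contains a junit.xml at the top level (the helper name and paths are illustrative):

```python
# Sketch: parse junit.xml directly from a downloaded artifact ZIP.
import xml.etree.ElementTree as ET
import zipfile


def report_from_zip(zip_path: str, member: str = "junit.xml") -> ET.Element:
    """Return the parsed XML root of one member inside an artifact ZIP."""
    with zipfile.ZipFile(zip_path) as zf:
        with zf.open(member) as fh:
            return ET.parse(fh).getroot()
```

Handy when you have several artifact ZIPs from a matrix run and want to scan them all in one loop.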
7. Download Artifacts From the Command Line
You can also download artifacts using the GitHub CLI.
```shell
gh run download <RUN_ID> -n unit-test-results-3.12
```

Replace <RUN_ID> with the workflow run ID (visible in the run URL or from gh run list).

```shell
gh run list --limit 5
```
This downloads the artifact to your current directory. Useful for scripting or when you want to compare reports across runs.
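One such comparison: which tests started failing between two runs. A sketch that diffs the failing test names from two downloaded reports, assuming pytest's JUnit layout (the `failed_tests` helper and the paths are illustrative):

```python
# Sketch: diff failing tests between two downloaded JUnit reports,
# e.g. the current run vs. an older one fetched with `gh run download`.
import xml.etree.ElementTree as ET


def failed_tests(path: str) -> set[str]:
    """Collect 'classname.name' for every testcase with a failure or error."""
    root = ET.parse(path).getroot()
    return {
        f'{case.get("classname")}.{case.get("name")}'
        for case in root.iter("testcase")
        if case.find("failure") is not None or case.find("error") is not None
    }

# Example (paths are illustrative):
# new_failures = failed_tests("current/junit.xml") - failed_tests("previous/junit.xml")
```

Set difference on the two reports gives you the newly broken tests, which is usually the question at 2 AM.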
8. What Else Can Be an Artifact
Artifacts are not limited to test reports. Here are common examples.
| Artifact | When to use it |
|---|---|
| JUnit XML report | Always — standard test output format |
| HTML coverage report | When you want visual coverage data |
| Built wheel/sdist | Release workflows (Part 9) |
| Log files | Debugging integration test failures |
| Screenshots | Browser/UI test failures |
| Database dumps | Reproducing integration test state |
Warning: Artifacts count toward your GitHub storage quota. Set retention-days to a reasonable value (7-30 days for test reports) to avoid accumulating storage costs.
Summary
You added JUnit XML test report generation and artifact uploads to the CI workflow. Every test run now produces downloadable evidence.
- --junitxml=reports/junit.xml — pytest produces a standard test report
- actions/upload-artifact@v4 — uploads files to the workflow run
- if: always() — ensures artifacts are uploaded even when tests fail
- retention-days: 14 — artifacts expire after 14 days
- Unique names — matrix jobs use ${{ matrix.python-version }} in the artifact name
Artifacts turn CI from “pass/fail” to “pass/fail with evidence.” In Part 9 you will build a release workflow that creates GitHub Releases and publishes to PyPI when you push a version tag.
GitHub Actions CI/CD — All Parts
- 1 GitHub Actions CI/CD (1/10): Make It a Package You Can Test
- 2 GitHub Actions CI/CD (2/10): Unit Tests — Fast Feedback
- 3 GitHub Actions CI/CD (3/10): Quality Gate Before Tests — Lint and Formatting
- 4 GitHub Actions CI/CD (4/10): Your First CI Workflow — Run on Every PR
- 5 GitHub Actions CI/CD (5/10): Matrix Testing — Multiple Python Versions
- 6 GitHub Actions CI/CD (6/10): Integration Tests — Real Dependencies in CI
- 7 GitHub Actions CI/CD (7/10): Branch Protection — Make CI a Merge Gate
- 8 GitHub Actions CI/CD (8/10): Artifacts — Keep Evidence From a Run (you are here)
- 9 GitHub Actions CI/CD (9/10): Release Workflow — Tag, Build, Publish
- 10 GitHub Actions CI/CD (10/10): Going Professional — Split Jobs, Caching, and Nightly Builds
