
GitHub Actions CI/CD (8/10): Artifacts — Keep Evidence From a Run


Summary: Generate a JUnit XML test report during CI and upload it as an artifact. When a test fails at 2 AM, you can download the report instead of re-running the workflow to see what happened.

Key                  Value
Package name         helloci
Working directory    ~/projects/helloci
Report format        JUnit XML
Report file          reports/junit.xml
Artifact name        test-results
Retention            14 days

0. Prerequisites

  • The helloci project with the three-job CI workflow from Part 6
  • Branch protection enabled from Part 7

1. What Are Artifacts

Every CI run executes on a fresh virtual machine that is destroyed after the job finishes. Any files created during the run — logs, test reports, build outputs — vanish with it.

Artifacts let you save files from a CI run and download them later. Common uses:

  • Test reports (JUnit XML, HTML coverage reports)
  • Build outputs (wheels, binaries)
  • Log files from integration tests
  • Screenshots from browser tests

Artifacts are not git commits. They live on GitHub’s servers, attached to a specific workflow run, and expire after a retention period.


2. Generate a JUnit XML Report Locally

pytest can produce a JUnit XML report with a single flag.

mkdir -p reports
pytest -v --junitxml=reports/junit.xml
11 passed

Inspect the report.

ls -la reports/junit.xml
-rw-r--r-- 1 user user 2847 ... reports/junit.xml

The JUnit XML format is a standard that most CI tools understand. GitHub can even parse it to show test results directly in the workflow summary.
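To see what the format actually contains, you can parse a report with Python's standard library. The sketch below uses an inline sample in the shape pytest's --junitxml output takes (exact attributes can vary across pytest versions, so treat the sample as illustrative):

```python
import xml.etree.ElementTree as ET

# A minimal JUnit XML sample resembling pytest's --junitxml output.
# Real reports from pytest may carry extra attributes (time, hostname, ...).
SAMPLE = """\
<testsuites>
  <testsuite name="pytest" tests="3" failures="1" errors="0" skipped="0">
    <testcase classname="tests.test_math" name="test_add" time="0.001"/>
    <testcase classname="tests.test_math" name="test_sub" time="0.001"/>
    <testcase classname="tests.test_math" name="test_div" time="0.002">
      <failure message="ZeroDivisionError">division by zero</failure>
    </testcase>
  </testsuite>
</testsuites>
"""

def summarize(xml_text: str) -> dict:
    """Return test counts and the IDs of failing test cases."""
    root = ET.fromstring(xml_text)
    # pytest wraps a single <testsuite> in a <testsuites> root element.
    suite = root.find("testsuite") if root.tag == "testsuites" else root
    failed = [
        f'{tc.get("classname")}::{tc.get("name")}'
        for tc in suite.iter("testcase")
        if tc.find("failure") is not None or tc.find("error") is not None
    ]
    return {
        "tests": int(suite.get("tests", 0)),
        "failures": int(suite.get("failures", 0)),
        "failed": failed,
    }

print(summarize(SAMPLE))
# {'tests': 3, 'failures': 1, 'failed': ['tests.test_math::test_div']}
```

The same function works on a downloaded report: pass it the contents of reports/junit.xml.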

Tip: Add reports/ to your .gitignore so generated reports do not get committed to the repository.


3. Update .gitignore

Add the reports directory to .gitignore.

__pycache__/
*.egg-info/
dist/
build/
.venv/
*.pyc
reports/

4. Add Artifact Upload to the Workflow

Update .github/workflows/ci.yml. Modify the unit-tests and integration-tests jobs to generate reports and upload them as artifacts.

Here is the updated unit-tests job.

  unit-tests:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        python-version: ["3.10", "3.11", "3.12"]
      fail-fast: false

    steps:
      - name: Check out code
        uses: actions/checkout@v4

      - name: Set up Python ${{ matrix.python-version }}
        uses: actions/setup-python@v5
        with:
          python-version: ${{ matrix.python-version }}

      - name: Install dependencies
        run: pip install -e ".[test]"

      - name: Run unit tests
        run: pytest -v -m "not integration" --junitxml=reports/junit.xml

      - name: Upload test results
        if: always()
        uses: actions/upload-artifact@v4
        with:
          name: unit-test-results-${{ matrix.python-version }}
          path: reports/junit.xml
          retention-days: 14

Here is the updated integration-tests job.

  integration-tests:
    runs-on: ubuntu-latest

    services:
      postgres:
        image: postgres:16
        env:
          POSTGRES_USER: helloci_user
          POSTGRES_PASSWORD: helloci_pass
          POSTGRES_DB: helloci_db
        ports:
          - 5432:5432
        options: >-
          --health-cmd="pg_isready -U helloci_user -d helloci_db"
          --health-interval=10s
          --health-timeout=5s
          --health-retries=5

    steps:
      - name: Check out code
        uses: actions/checkout@v4

      - name: Set up Python
        uses: actions/setup-python@v5
        with:
          python-version: "3.12"

      - name: Install dependencies
        run: pip install -e ".[test]"

      - name: Run integration tests
        env:
          DATABASE_URL: ${{ secrets.DATABASE_URL }}
        run: pytest -v -m integration --junitxml=reports/junit.xml

      - name: Upload test results
        if: always()
        uses: actions/upload-artifact@v4
        with:
          name: integration-test-results
          path: reports/junit.xml
          retention-days: 14

5. Understand the Upload Step

      - name: Upload test results
        if: always()
        uses: actions/upload-artifact@v4
        with:
          name: unit-test-results-${{ matrix.python-version }}
          path: reports/junit.xml
          retention-days: 14
Key                          What it does
if: always()                 Runs this step even if previous steps failed
actions/upload-artifact@v4   GitHub's official artifact upload action
name                         A label for the artifact; must be unique within the run
path                         The file or directory to upload
retention-days               How long GitHub keeps the artifact (default: 90 days)

The if: always() condition is critical. Without it, the upload step is skipped when tests fail — which is exactly when you need the report most.

The artifact name includes ${{ matrix.python-version }} so each matrix job produces a uniquely named artifact. Without it, the three matrix jobs would collide: as of upload-artifact@v4, artifact names are immutable within a run, so a second upload with the same name fails instead of overwriting.


6. Push and View Artifacts

Commit and push.

git add .github/workflows/ci.yml .gitignore
git commit -m "Add JUnit XML artifact upload to CI"
git push

Open the Actions tab. Click the workflow run. Scroll to the bottom of the run summary page. You see an Artifacts section listing:

unit-test-results-3.10 (12 KB)
unit-test-results-3.11 (12 KB)
unit-test-results-3.12 (12 KB)
integration-test-results (8 KB)

Click any artifact to download a ZIP file containing the junit.xml report.


7. Download Artifacts From the Command Line

You can also download artifacts using the GitHub CLI.

gh run download <RUN_ID> -n unit-test-results-3.12

Replace <RUN_ID> with the workflow run ID (visible in the run's URL, or from gh run list).

gh run list --limit 5

This downloads the artifact to your current directory. Useful for scripting or when you want to compare reports across runs.
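One scripting use is diffing failures between two downloaded runs. A minimal sketch, using inline JUnit XML samples so it is self-contained (in practice you would read the two downloaded junit.xml files instead; the run directory names below are hypothetical):

```python
import xml.etree.ElementTree as ET

def failing_tests(xml_text: str) -> set:
    """IDs of test cases that failed or errored in a JUnit XML report."""
    root = ET.fromstring(xml_text)
    return {
        f'{tc.get("classname")}::{tc.get("name")}'
        for tc in root.iter("testcase")
        if tc.find("failure") is not None or tc.find("error") is not None
    }

# In practice, read the reports you downloaded with `gh run download`, e.g.:
#   old = failing_tests(Path("run-old/junit.xml").read_text())
#   new = failing_tests(Path("run-new/junit.xml").read_text())
OLD = (
    '<testsuite tests="2">'
    '<testcase classname="t" name="a"><failure/></testcase>'
    '<testcase classname="t" name="b"/>'
    "</testsuite>"
)
NEW = (
    '<testsuite tests="2">'
    '<testcase classname="t" name="a"/>'
    '<testcase classname="t" name="b"><failure/></testcase>'
    "</testsuite>"
)

old, new = failing_tests(OLD), failing_tests(NEW)
print("newly failing:", sorted(new - old))  # broke since the old run
print("fixed:", sorted(old - new))          # pass now, failed before
```

Set arithmetic on the test IDs makes regressions and fixes obvious at a glance, which is hard to see from two raw XML files.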


8. What Else Can Be an Artifact

Artifacts are not limited to test reports. Here are common examples.

Artifact                 When to use it
JUnit XML report         Always; standard test output format
HTML coverage report     When you want visual coverage data
Built wheel/sdist        Release workflows (Part 9)
Log files                Debugging integration test failures
Screenshots              Browser/UI test failures
Database dumps           Reproducing integration test state

Warning: Artifacts count toward your GitHub storage quota. Set retention-days to a reasonable value (7-30 days for test reports) to avoid accumulating storage costs.


Summary

You added JUnit XML test report generation and artifact uploads to the CI workflow. Every test run now produces downloadable evidence.

  • --junitxml=reports/junit.xml — pytest produces a standard test report
  • actions/upload-artifact@v4 — uploads files to the workflow run
  • if: always() — ensures artifacts are uploaded even when tests fail
  • retention-days: 14 — artifacts expire after 14 days
  • Unique names — matrix jobs use ${{ matrix.python-version }} in the artifact name

Artifacts turn CI from “pass/fail” to “pass/fail with evidence.” In Part 9 you will build a release workflow that creates GitHub Releases and publishes to PyPI when you push a version tag.
