
GitHub Actions Self-Hosted GPU Runner

CI pipelines that test GPU code or fine-tune models need GPU access. Self-hosted runners on a dedicated server give you that cheaply.

GitHub's standard hosted runners do not include GPUs. If your CI tests GPU code, runs small fine-tunes, or validates inference changes, you need a self-hosted runner. On our dedicated GPU hosting, standing one up takes about 15 minutes.

Setup

In your GitHub repo settings, go to Actions -> Runners -> New self-hosted runner, then run the displayed commands on your GPU server:

mkdir actions-runner && cd actions-runner
curl -o actions-runner-linux-x64-2.321.0.tar.gz -L \
  https://github.com/actions/runner/releases/download/v2.321.0/actions-runner-linux-x64-2.321.0.tar.gz
tar xzf actions-runner-linux-x64-2.321.0.tar.gz
./config.sh --url https://github.com/yourorg/yourrepo --token <token>
./run.sh

Install the runner as a systemd service so it starts on boot and survives reboots:

sudo ./svc.sh install
sudo ./svc.sh start
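
To confirm the runner registered and is running, the service helper can report its status (assuming the default systemd-based install on Linux):

```shell
# Check the runner service status; should report "active (running)"
sudo ./svc.sh status
```

The runner should also now appear as "Idle" under Actions -> Runners in your repo settings.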

Workflow

name: GPU Tests
on: [push]
jobs:
  test:
    runs-on: self-hosted
    steps:
      - uses: actions/checkout@v4
      - run: nvidia-smi
      - run: pip install -r requirements.txt
      - run: pytest tests/gpu/

Target the runner via runs-on: self-hosted or use custom labels if you have multiple runners with different GPUs.
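For example, if a runner was registered with extra labels (config.sh accepts a --labels flag at registration time), a job can require all of them; the label names gpu and rtx4090 here are illustrative, not built-in:

```yaml
jobs:
  test:
    # Job is scheduled only on a runner carrying ALL of these labels
    runs-on: [self-hosted, gpu, rtx4090]
    steps:
      - run: nvidia-smi
```

This keeps CPU-only jobs off your GPU boxes when a repo has a mixed runner pool.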

Isolation

A self-hosted runner executes arbitrary code from your CI pipeline. Do not share a runner host with your production inference workloads. Best practices:

  • Dedicated user account with minimal permissions
  • Ephemeral runners for each job (--ephemeral flag)
  • Run in a container with GPU passthrough to isolate filesystem
  • No production secrets on the runner host
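
The ephemeral option from the list above is a registration-time flag: the runner accepts exactly one job, then deregisters itself, so each job starts from a clean runner process. A minimal sketch, reusing the registration URL and token from the setup section:

```shell
# Register an ephemeral runner: it runs one job, then deregisters.
# Re-register (typically via a wrapper script or service) between jobs.
./config.sh --url https://github.com/yourorg/yourrepo \
  --token <token> --ephemeral
./run.sh
```

Ephemeral mode pairs well with container isolation, since both limit how much state one job can leak into the next.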

Security

Do not enable self-hosted runners on public repos – any pull request from a stranger could run arbitrary code on your GPU. For public repos, use GitHub's cloud runners or require manual approval before workflows run.
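
One way to gate runs is to route the GPU job through a GitHub environment configured with required reviewers, so a maintainer must approve each run before it touches the runner. The environment name gpu-ci here is an example; the reviewer requirement itself is configured under Settings -> Environments:

```yaml
jobs:
  test:
    runs-on: self-hosted
    # Jobs referencing this environment wait for manual approval
    # if required reviewers are set on it in repo settings.
    environment: gpu-ci
    steps:
      - uses: actions/checkout@v4
      - run: pytest tests/gpu/
```
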


gigagpu

We benchmark, deploy, and optimise GPU infrastructure for AI workloads. All data in our guides comes from real-world testing on our UK-based dedicated GPU servers.
