Benchmarks

Flux.1 on RTX 3090: Images/sec & VRAM Usage

Flux.1 from Black Forest Labs demands serious VRAM. Its 12 GB model footprint disqualifies most consumer GPUs outright. But the RTX 3090 and its 24 GB frame buffer? That is a different story entirely. We loaded Flux.1 onto a dedicated 3090 server via GigaGPU and measured exactly what you can expect in production.

Measured Throughput at 1024×1024

| Metric | Value |
|---|---|
| Iterations/sec | 0.82 it/s |
| Seconds per image | 24.39 sec (20 steps) |
| Images per minute | 2.46 |
| Resolution | 1024×1024 |
| Sampler | Euler a / DPM++ 2M Karras |
| Performance rating | Good |

Benchmark conditions: 20-step generation at 1024×1024, batch size 1, FP16 precision, running the A1111 WebUI or ComfyUI backend.
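The throughput figures in the table all follow from the measured iteration rate. A minimal sketch of that arithmetic, assuming the 20-step, 0.82 it/s conditions above:

```python
# Derive seconds-per-image and images-per-minute from the measured
# iteration rate. Assumes 20 sampling steps per image at 0.82 it/s.
STEPS_PER_IMAGE = 20
ITERATIONS_PER_SEC = 0.82

sec_per_image = STEPS_PER_IMAGE / ITERATIONS_PER_SEC  # ~24.39 s
images_per_min = 60 / sec_per_image                   # ~2.46

print(f"{sec_per_image:.2f} s/image, {images_per_min:.2f} images/min")
```

Note that fewer steps cut generation time proportionally: at 12 steps the same card would land near 4 images/min, at some cost in quality.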

How VRAM Breaks Down

| Component | VRAM |
|---|---|
| Model weights | 12.0 GB |
| Sampling buffer | ~2.4 GB |
| Total RTX 3090 VRAM | 24 GB |
| Free headroom | ~12.0 GB |

After Flux.1 loads, you still have roughly 12 GB free. That is enough to run a ControlNet stack, bump resolution to 2048×2048 for select outputs, or even co-host a lightweight Whisper instance for multi-modal pipelines. Few other cards at this price point give you that kind of flexibility with a 12 GB model already resident.
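A quick budget sketch using the table's figures (approximations only; actual usage shifts with resolution and batch size). The ~12 GB headroom is what remains after the weights load; during sampling the buffer temporarily claims roughly 2.4 GB of it:

```python
# Rough VRAM budget for Flux.1 on a 24 GB RTX 3090,
# using the approximate figures from the table above.
TOTAL_VRAM_GB = 24.0
MODEL_WEIGHTS_GB = 12.0
SAMPLING_BUFFER_GB = 2.4

free_after_load = TOTAL_VRAM_GB - MODEL_WEIGHTS_GB        # ~12.0 GB headroom
free_at_peak = free_after_load - SAMPLING_BUFFER_GB       # ~9.6 GB during sampling

print(f"{free_after_load:.1f} GB after load, {free_at_peak:.1f} GB at generation peak")
```

Anything you co-host (ControlNet, Whisper, a hires pass) has to fit in the peak figure, not the idle one.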

What It Costs per Image

| Cost Metric | Value |
|---|---|
| Server cost | £0.75/hr (£149/mo) |
| Cost per 1K images | £5.08 |
| Images per £1 | 197 |

At £5.08 per thousand images, the RTX 3090 undercuts most commercial image APIs by a wide margin. For studios generating a few hundred images a day, this is the sweet spot between cost and capability. Compare other GPUs on our benchmark dashboard.
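The cost figures fall straight out of the hourly rate and the measured throughput. A sketch of that calculation, assuming the quoted £0.75/hr and 2.46 images/min:

```python
# Cost-per-image arithmetic behind the table, assuming the quoted
# £0.75/hr rate and the measured 2.46 images/min throughput.
RATE_GBP_PER_HOUR = 0.75
IMAGES_PER_MIN = 2.46

images_per_hour = IMAGES_PER_MIN * 60                       # 147.6
cost_per_1k = RATE_GBP_PER_HOUR / images_per_hour * 1000    # ~£5.08
images_per_pound = images_per_hour / RATE_GBP_PER_HOUR      # ~197

print(f"£{cost_per_1k:.2f} per 1,000 images, {images_per_pound:.0f} images per £1")
```

Swap in your own hourly rate to compare against per-image API pricing for your expected volume.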

Who Should Use This Configuration

The 3090 is not the fastest Flux.1 card we have tested, but its 24 GB VRAM buffer makes it the most versatile in the mid-range. If your workflow involves ControlNet, inpainting, or batching multiple resolutions, the headroom matters more than raw speed. For pure throughput at higher volume, take a look at the best GPU for image generation comparison.

Quick deploy:

```shell
docker run --gpus all -p 8188:8188 ghcr.io/ai-dock/comfyui:latest
```

Further reading: Flux.1 hosting guide, all benchmark results, and Stable Diffusion hosting for comparing older architectures.

Deploy Flux.1 on RTX 3090

Order this exact configuration. UK datacenter, full root access.

Order RTX 3090 Server

Need a Dedicated GPU Server?

Deploy from RTX 3050 to RTX 5090. Full root access, NVMe storage, 1Gbps — UK datacenter.

Browse GPU Servers


We benchmark, deploy, and optimise GPU infrastructure for AI workloads. All data in our guides comes from real-world testing on our UK-based dedicated GPU servers.
