RunPod
Affordable GPU cloud for AI inference and training
Commercial · Paid
About
RunPod is a cloud computing platform that provides affordable GPU instances for AI and machine learning workloads. It offers both on-demand and spot GPU pods with pre-built templates for popular frameworks, along with serverless GPU endpoints for production inference. RunPod is popular among indie developers and startups for its competitive pricing and ease of use.
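Below is a minimal sketch of what calling one of those serverless endpoints over HTTP can look like. It assumes an already-deployed endpoint reachable at RunPod's serverless API (`api.runpod.ai/v2/<endpoint_id>/runsync`); the endpoint ID, API key variable, and the `input` payload shape are placeholders, since the actual input schema depends on the handler deployed to the endpoint.

```python
# Minimal sketch: invoking a RunPod serverless endpoint synchronously.
# ENDPOINT_ID and the "input" payload below are placeholders; the real
# payload fields depend on the handler code running on the endpoint.
import os
import requests

ENDPOINT_ID = "your-endpoint-id"           # hypothetical endpoint ID
API_KEY = os.environ["RUNPOD_API_KEY"]     # API key from the RunPod console

# /runsync blocks until the worker returns (or a timeout is hit);
# /run instead queues the job and returns an ID to poll.
url = f"https://api.runpod.ai/v2/{ENDPOINT_ID}/runsync"

response = requests.post(
    url,
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={"input": {"prompt": "A photo of a red panda"}},  # handler-specific payload
    timeout=120,
)
response.raise_for_status()
print(response.json())  # typically contains job status and the handler's "output"
```

For batch or long-running jobs, the asynchronous `/run` route with status polling is usually the better fit than the blocking call shown here.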
Details
| Field | Value |
| --- | --- |
| Type | gpu-cloud |
| GPU Types | NVIDIA H200, NVIDIA H100, NVIDIA A100, NVIDIA A40, NVIDIA RTX 4090, NVIDIA RTX 3090 |
| Regions | US-East, US-West, EU-West, EU-North |
| Starting Price | $0.44/hr |
Tags
gpu-cloud · serverless · inference · training · affordable · spot-instances · templates
Quick Info
- Organization: RunPod
- Pricing: $0.44/hr (RTX 4090) / $1.99/hr (H100 PCIe) / $3.59/hr (H200)
- Popularity: 0/100
- MAU: 200K+
- Updated: Feb 19, 2026
Also in Infrastructure
Lambda Labs
GPU cloud and workstations purpose-built for AI training
Commercial · Paid
Lambda · $1.85/hr (H100/H200) / $2.99/hr (B200)
Modal
Serverless cloud for AI and data-intensive applications
Commercial · Paid
Modal Labs · $30/month free credits / $0.000356/sec (A100 40GB)
Together AI
Fast and affordable inference platform for open-source AI models
Commercial · Paid
Together AI · $0.20/1M tokens (Llama 3.1 8B)