Introduction
Deploying a web application used to mean one thing: renting a server, installing software, and managing the machine. Today, there are multiple deployment paradigms -- static hosting, serverless functions, edge workers, containers, and traditional servers -- each optimized for different workloads. Understanding these options prevents both over-engineering (using Kubernetes for a landing page) and under-engineering (using static hosting for an application that needs a database).
Static Hosting (Pages)
Static hosting serves pre-built HTML, CSS, and JavaScript files from a CDN without any server-side processing. Services like Cloudflare Pages, Netlify, and Vercel offer static hosting with global distribution, automatic SSL, and instant deployments.
This is the simplest and fastest deployment option. Single-page applications (SPAs) built with React or Vue, marketing sites, portfolios, and documentation sites are all excellent candidates for static hosting. The files are built once during deployment and served to every user identically.
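One practical detail for SPAs: client-side routes like /about have no matching file on the host, so the platform must rewrite unknown paths to index.html. On Netlify, for example, this is a one-line `_redirects` file in the publish directory (a sketch; Cloudflare Pages and Vercel have equivalent rewrite rules):

```
# Netlify _redirects: rewrite every path to index.html with a 200
# (a rewrite, not a redirect, so the SPA router sees the original URL)
/*    /index.html    200
```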
Limitations: static hosting cannot run server-side code. If your application needs to query a database, process form submissions, or perform authenticated operations, you need either a separate API server or a serverless function layer.
Serverless Functions
Serverless functions (AWS Lambda, Vercel Functions, Netlify Functions) execute server-side code on demand without managing a server. Each function handles a single API endpoint. The platform scales automatically -- zero instances when there is no traffic, hundreds of instances during a spike.
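As a sketch, a single endpoint deployed as an AWS Lambda behind an API Gateway proxy is just an exported handler function. The event shape below follows the API Gateway proxy format; the greeting logic is purely illustrative:

```typescript
// Minimal Lambda-style handler: one function per endpoint, no server to manage.
// The platform invokes handler() once per request and scales instances itself.
export async function handler(event: {
  queryStringParameters?: { name?: string };
}) {
  const name = event.queryStringParameters?.name ?? "world";
  return {
    statusCode: 200,
    headers: { "content-type": "application/json" },
    body: JSON.stringify({ message: `hello, ${name}` }),
  };
}
```

Because the handler is a plain function, it can be unit-tested locally without any cloud infrastructure.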
The pricing model is pay-per-invocation, which is extremely cost-effective for applications with variable traffic. A side project that receives ten requests per day and a viral application that receives ten thousand requests per minute both work on serverless, with costs scaling linearly.
Limitations: cold starts can add 100-500ms of latency to the first request after a period of inactivity. Functions have execution time limits (typically 10-60 seconds). And the stateless nature means you cannot keep data in memory between requests -- every invocation starts fresh.
Edge Workers
Edge workers (Cloudflare Workers, Deno Deploy) run code at CDN edge locations, placing your server-side logic geographically close to users worldwide. A user in Sydney hits an edge worker in Sydney, not a server in Virginia. This reduces latency dramatically for global applications.
Edge workers start faster than traditional serverless functions (typically under 5ms cold start) and have lower latency for geographically distributed users. They are well-suited to API proxies, authentication checks, A/B testing, and lightweight data transformations.
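The typical shape of an edge worker is a module that exports a fetch handler; the runtime calls it at the edge location nearest the user. A minimal Cloudflare Workers-style sketch (the /ping route and response body are illustrative, not part of any real API):

```typescript
// Workers-style module: the runtime calls fetch() for each incoming request
// at the nearest edge location, so there is no origin round-trip.
const worker = {
  async fetch(request: Request): Promise<Response> {
    const url = new URL(request.url);
    if (url.pathname === "/ping") {
      return new Response(JSON.stringify({ pong: true }), {
        headers: { "content-type": "application/json" },
      });
    }
    return new Response("not found", { status: 404 });
  },
};

export default worker;
```

The handler uses only the standard Request/Response types, which is part of why edge runtimes can start in milliseconds.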
Limitations: edge workers have stricter resource limits than traditional serverless functions. They typically cannot access traditional databases directly (though Cloudflare D1 and other edge databases are addressing this). Complex computation-heavy tasks are better suited to traditional serverless or containers.
Containers and Traditional Servers
Docker containers running on services like AWS ECS, Google Cloud Run, or self-managed servers on Hetzner or DigitalOcean provide maximum flexibility. You control the operating system, runtime, installed packages, and configuration. This is the right choice for applications with specific infrastructure requirements, long-running processes, or heavy computational workloads.
Containers are also the standard for applications that need persistent connections (WebSockets, server-sent events), background job processing, or complex deployment configurations (multiple services, message queues, caching layers).
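The contrast with serverless is that a container runs one long-lived process, so it can hold open connections and in-memory state across requests. A minimal sketch using Node's built-in HTTP server (the hit counter is illustrative; durable state still belongs in a database):

```typescript
import { createServer } from "node:http";

// One long-lived process: module-level state survives across requests,
// which is exactly what a stateless serverless function cannot rely on.
let hits = 0;

export const server = createServer((_req, res) => {
  hits += 1;
  res.setHeader("content-type", "application/json");
  res.end(JSON.stringify({ hits }));
});

// In a container you would call: server.listen(Number(process.env.PORT ?? 3000));
```

The same process model is what makes WebSockets and background jobs straightforward here and awkward on serverless.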
Limitations: containers require more operational expertise. You manage scaling, health checks, logging, and updates. The cost model is based on time (instances running) rather than invocations, which means you pay for idle capacity.
Choosing the Right Approach
The decision tree is simpler than the options suggest:
- Is your application purely frontend? Use static hosting (Pages)
- Does it need a few API endpoints? Add serverless functions
- Does it need global low-latency? Use edge workers
- Does it need WebSockets, background jobs, or other long-running processes? Use containers (a managed database can pair with any option, including serverless)
- Do you need full control? Use a traditional server (VPS)
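The list above can be sketched as a function, checked top-down with the most constraining requirement first (the field names and tier labels are illustrative, not a real API):

```typescript
interface AppNeeds {
  fullControl: boolean;      // OS access, custom runtimes, special hardware
  longRunning: boolean;      // WebSockets, background jobs, heavy compute
  globalLowLatency: boolean; // users spread across many regions
  serverSideCode: boolean;   // any API endpoints at all
}

// Returns the simplest deployment tier that satisfies the stated needs.
function chooseDeployment(needs: AppNeeds): string {
  if (needs.fullControl) return "vps";
  if (needs.longRunning) return "containers";
  if (needs.globalLowLatency) return "edge workers";
  if (needs.serverSideCode) return "static hosting + serverless functions";
  return "static hosting";
}
```

Ordering matters: a requirement for full control or long-running processes overrides the cheaper options below it.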
Many applications combine approaches: static hosting for the frontend, serverless functions for the API, and a managed database service (like AWS RDS) for data persistence.

Cost Comparison
- Static hosting: Often free for moderate traffic (Cloudflare Pages, Netlify free tier)
- Serverless functions: Pay-per-invocation, typically around $0.20 per million requests plus a per-millisecond compute charge
- Edge workers: Similar to serverless pricing, with lower latency
- Containers: $5-50/month for small applications, scaling with resource usage
- Dedicated servers: $20-100+/month, depending on specifications
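To make the serverless line concrete: at roughly $0.20 per million requests, even heavy traffic stays cheap on the request charge alone. This sketch deliberately ignores the per-duration compute charge that real bills include:

```typescript
// Request-charge cost for a month of traffic at a given requests-per-minute rate.
function monthlyRequestCost(requestsPerMinute: number, pricePerMillion = 0.2): number {
  const requestsPerMonth = requestsPerMinute * 60 * 24 * 30;
  return (requestsPerMonth / 1_000_000) * pricePerMillion;
}

// Ten requests/day rounds to effectively $0; ten thousand requests/minute
// is about 432M requests/month, still well under $100 in request charges.
```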
Conclusion
The right deployment approach depends on your application's requirements, not on what is trending. Simple applications should use simple deployment. Complex applications may need multiple layers. The most common mistake is over-engineering the deployment for a project that would be perfectly served by static hosting with a few serverless functions.
Start simple, and add infrastructure complexity only when your application's actual needs demand it.