Definition
Serverless computing is a cloud execution model where the cloud provider dynamically allocates compute resources to run your code, scales automatically based on demand, and charges only for the milliseconds of compute time consumed. You do not provision servers, configure auto-scaling groups, or manage operating system patches. You write a function, define a trigger (an HTTP request, a file upload, a database change, a scheduled timer), and the cloud provider handles everything else.
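The "write a function, define a trigger" model can be made concrete with a minimal sketch. The handler below follows AWS Lambda's Python convention (an event payload plus a context object), with field names mirroring an API Gateway HTTP trigger; it is an illustration, not production code.

```python
import json

def handler(event, context):
    """Minimal Lambda-style handler. The platform invokes this function
    with the trigger's payload (here, an HTTP request routed through an
    API gateway); there is no server or framework code to manage."""
    params = event.get("queryStringParameters") or {}
    name = params.get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"Hello, {name}"}),
    }
```

The same function body could be wired to a file-upload or queue trigger instead; only the shape of `event` changes, which is why serverless teams spend their time on the handler logic rather than the hosting.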
AWS Lambda, launched in 2014, popularized the model. Google Cloud Functions, Azure Functions, Cloudflare Workers, and Vercel Functions are other major platforms. Beyond functions, "serverless" has expanded to include managed databases (DynamoDB, PlanetScale), message queues (SQS, EventBridge), and storage (S3) where the operational model is the same: you use the service, and the provider manages the infrastructure.
The economic model differs fundamentally from traditional cloud hosting. With an EC2 instance, you pay for the server whether it is handling one request or ten thousand. With Lambda, you pay per invocation and per millisecond of execution time, so a function that handles 100 requests per day costs fractions of a cent per month. This makes serverless extremely cost-effective for low-traffic and spiky workloads, but potentially more expensive than dedicated instances for sustained high-throughput workloads.
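The per-invocation billing model lends itself to back-of-the-envelope estimates. The rates below are illustrative placeholders in the style of Lambda's pricing dimensions (a per-request fee plus a per-GB-second compute fee), not current quotes:

```python
# Illustrative rates, not current AWS pricing:
REQUEST_PRICE = 0.20 / 1_000_000   # dollars per invocation
GB_SECOND_PRICE = 0.0000166667     # dollars per GB-second of compute

def monthly_serverless_cost(invocations, duration_ms, memory_gb):
    """Estimate monthly cost under per-invocation, per-millisecond billing."""
    compute_gb_seconds = invocations * (duration_ms / 1000) * memory_gb
    return invocations * REQUEST_PRICE + compute_gb_seconds * GB_SECOND_PRICE

# 100 requests/day at 200 ms and 128 MB: fractions of a cent per month.
low_traffic = monthly_serverless_cost(100 * 30, 200, 0.125)

# 50M requests/month at 200 ms and 512 MB: now dedicated instances
# running at steady utilization start to look cheaper.
high_traffic = monthly_serverless_cost(50_000_000, 200, 0.5)
```

The same function makes the crossover visible: cost scales linearly with traffic under serverless billing, while a dedicated instance is a flat monthly fee regardless of load.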
Why It Matters for Product Managers
Serverless changes the cost structure and speed of product development. For early-stage products, it removes the need for a dedicated DevOps engineer or infrastructure planning. Your engineering team can focus entirely on building features rather than managing servers. This can shave weeks off the timeline for an MVP or proof-of-concept. Many startups use serverless to validate a product idea at near-zero infrastructure cost, then migrate to dedicated infrastructure after reaching scale.
For PMs at established companies, serverless enables rapid experimentation. Need to add a webhook processor, a PDF generator, or a data pipeline? Deploy a function. No infrastructure ticket, no capacity planning, no waiting for server provisioning. This maps directly to faster iteration cycles and lower cost of experimentation. Understanding your team's DevOps capabilities helps you evaluate whether serverless is the right approach for a given feature.
How to Apply It
When evaluating serverless for a new feature or product, PMs should ask three questions. First, what is the expected traffic pattern? Serverless excels at variable or unpredictable load; if traffic is steady and high, dedicated compute is usually cheaper. Second, what are the latency requirements? Cold starts add roughly 100 ms to 2 s of latency to the first request after an idle period, so if your feature requires consistently sub-100 ms response times, serverless may not be the right fit. Third, what is the vendor lock-in risk? Migrating from one serverless provider to another requires rewriting deployment configurations and sometimes function signatures; factor this into your build vs. buy analysis. Finally, model the cost of each infrastructure option at your projected scale before committing, not just at launch traffic.
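One way to ground the traffic-pattern question is a simple break-even estimate: the monthly request volume at which a fixed-price instance undercuts per-invocation billing. The dollar figures below are hypothetical inputs a PM would replace with real quotes:

```python
def breakeven_requests(instance_monthly_cost, cost_per_invocation):
    """Monthly request volume above which a fixed-price instance
    becomes cheaper than paying per invocation."""
    return instance_monthly_cost / cost_per_invocation

# Hypothetical: a $60/month instance vs. an all-in serverless cost
# of $0.000004 per call (request fee plus compute).
threshold = breakeven_requests(60.0, 0.000004)  # 15 million requests/month
```

Below the threshold, serverless wins on cost as well as operational simplicity; above it, the decision shifts to weighing the instance's savings against the DevOps effort it reintroduces.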