Serverless computing lets you run code without managing servers. You write functions, deploy them, and the platform handles scaling, availability, and infrastructure. In 2026, three platforms dominate the serverless landscape: **AWS Lambda**, **Vercel Serverless Functions**, and **Cloudflare Workers**. Each has different strengths and trade-offs.

In this article, we compare them with real examples, performance considerations, and pricing breakdowns.

## How Serverless Works

```mermaid
graph LR
    User -- "HTTP Request" --> Gateway[API Gateway / Edge]
    Gateway -- "Invoke" --> Function[Serverless Function]
    Function -- "Response" --> User
    Function -- "Read/Write" --> DB[(Database)]
```

You deploy a function. When a request arrives, the platform:

1. Spins up an execution environment (or reuses a warm one)
2. Runs your function
3. Returns the response
4. Scales to zero when idle (you don't pay for idle time)

## Platform Overview

### AWS Lambda

The original serverless platform, launched in 2014. It is the most mature and feature-rich of the three, with deep integration into the AWS ecosystem.

```javascript
// AWS Lambda handler (Node.js runtime)
// processData stands in for your application logic
export const handler = async (event) => {
  const body = JSON.parse(event.body);

  const result = await processData(body);

  return {
    statusCode: 200,
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(result),
  };
};
```

### Vercel Serverless Functions

Tightly integrated with Next.js and the frontend deployment workflow.
Functions are deployed alongside your frontend with zero configuration.

```typescript
// app/api/hello/route.ts (Next.js App Router)
import { NextResponse } from 'next/server';

export async function GET(request: Request) {
  const { searchParams } = new URL(request.url);
  const name = searchParams.get('name') || 'World';

  return NextResponse.json({ message: `Hello, ${name}!` });
}
```

### Cloudflare Workers

Runs on Cloudflare's edge network across 300+ cities worldwide. Uses the V8 engine (the same engine as Chrome) instead of a full Node.js runtime, which means extremely fast cold starts.

```javascript
// Cloudflare Worker
export default {
  async fetch(request, env) {
    const url = new URL(request.url);
    const name = url.searchParams.get('name') || 'World';

    return new Response(
      JSON.stringify({ message: `Hello, ${name}!` }),
      { headers: { 'Content-Type': 'application/json' } }
    );
  },
};
```

## Feature Comparison

| Feature | AWS Lambda | Vercel Functions | Cloudflare Workers |
|---------|-----------|-----------------|-------------------|
| **Runtime** | Node.js, Python, Go, Rust, Java, .NET | Node.js, Python, Go, Ruby | V8 isolates (JS/TS, Rust via WASM) |
| **Max execution time** | 15 minutes | 60s (Hobby), 300s (Pro) | 30s (free), 15min (paid) |
| **Memory** | 128MB - 10GB | 1024MB - 3008MB | 128MB |
| **Cold start** | 100-500ms | 100-300ms | < 5ms |
| **Deploy location** | Single region (or multi-region with effort) | Multiple regions | 300+ edge locations |
| **Max payload** | 6MB (sync), 256KB (async) | 4.5MB | 100MB |
| **Bundled storage** | No (use DynamoDB, S3) | No (use an external DB) | KV, D1 (SQLite), R2 (S3-compatible) |
| **Pricing model** | Per request + duration | Included with plan | Per request + duration |
| **Free tier** | 1M requests/month | 100K/month (Hobby) | 100K requests/day |

## Cold Starts

Cold start is the time it takes to initialize a new function instance.
This is the biggest performance concern with serverless.

```mermaid
graph LR
    subgraph "Cold Start"
        A[Request] --> B[Provision Environment]
        B --> C[Load Code]
        C --> D[Initialize Runtime]
        D --> E[Execute Function]
    end

    subgraph "Warm Invocation"
        F[Request] --> G[Execute Function]
    end
```

| Platform | Typical Cold Start | Why |
|----------|-------------------|-----|
| **Cloudflare Workers** | < 5ms | V8 isolates, no full runtime to boot |
| **Vercel Functions** | 100-300ms | Node.js runtime on edge or regional infrastructure |
| **AWS Lambda** | 100-500ms | Full container initialization |
| **AWS Lambda (Java)** | 1-5 seconds | JVM startup overhead |

Cloudflare Workers wins cold starts by a huge margin because it uses V8 isolates instead of containers.

## Pricing Comparison

### Free Tier

| Platform | Free Requests | Free Compute |
|----------|--------------|--------------|
| **AWS Lambda** | 1M/month | 400,000 GB-seconds |
| **Vercel** | 100K/month | Included with Hobby plan |
| **Cloudflare Workers** | 100K/day (~3M/month) | 10ms CPU per invocation |

### At Scale (10M requests/month, 50ms avg duration, 512MB memory)

| Platform | Estimated Monthly Cost |
|----------|----------------------|
| **AWS Lambda** | ~$2.00 (requests) + ~$4.17 (compute) = **~$6.17** |
| **Vercel** | **$20/month** (Pro plan, includes functions) |
| **Cloudflare Workers** | **$5/month** (Paid plan, includes 10M requests) |

For most use cases, Cloudflare Workers is the cheapest. Vercel's pricing is simple but includes the entire platform (hosting, CDN, analytics).
AWS Lambda has the most granular pricing.

## Real-World Use Cases

### AWS Lambda: Best For

- **Complex backend workflows** - Step Functions, event-driven architectures
- **Integration with AWS services** - S3 triggers, DynamoDB streams, SQS queues
- **Long-running tasks** - up to 15 minutes of execution time
- **Multi-language teams** - supports the widest range of runtimes

```mermaid
graph TD
    S3[S3 Upload] --> Lambda1[Process Image]
    Lambda1 --> SQS[SQS Queue]
    SQS --> Lambda2[Generate Thumbnails]
    Lambda2 --> DDB[DynamoDB]
    DDB --> Lambda3[Send Notification]
    Lambda3 --> SNS[SNS / Email]
```

### Vercel Functions: Best For

- **Next.js applications** - zero-config API routes
- **Frontend-first teams** - deploy frontend and backend together
- **Rapid prototyping** - git push to deploy
- **Jamstack architectures** - static frontend + serverless API

```typescript
// app/api/subscribe/route.ts
import { NextResponse } from 'next/server';
// `db` is assumed to be your database client, imported from elsewhere

export async function POST(request: Request) {
  const { email } = await request.json();

  // Validate
  if (!email || !email.includes('@')) {
    return NextResponse.json(
      { error: 'Invalid email' },
      { status: 400 }
    );
  }

  // Save to database
  await db.subscribers.create({ email });

  return NextResponse.json({ success: true });
}
```

### Cloudflare Workers: Best For

- **Low-latency APIs** - code runs in 300+ locations worldwide
- **Edge computing** - transform responses, A/B testing, personalization
- **High-volume APIs** - cheapest at scale with a generous free tier
- **Global applications** - data close to users with KV and D1

```javascript
// Edge-side A/B test
export default {
  async fetch(request, env) {
    const url = new URL(request.url);

    // Assign user to variant
    const cookie = request.headers.get('Cookie') || '';
    let variant = cookie.includes('ab=b') ?
      'b' : 'a';

    if (!cookie.includes('ab=')) {
      variant = Math.random() < 0.5 ? 'a' : 'b';
    }

    // Fetch the appropriate version
    const response = await fetch(`${url.origin}/variants/${variant}`);
    const newResponse = new Response(response.body, response);

    // Set cookie for a consistent experience
    newResponse.headers.set('Set-Cookie', `ab=${variant}; Path=/; Max-Age=86400`);

    return newResponse;
  },
};
```

## When to Choose Which

### Choose AWS Lambda if:

- You're already invested in the AWS ecosystem
- You need long-running functions (up to 15 minutes)
- You need complex event-driven architectures
- You need runtimes beyond JavaScript (Python, Go, Rust, Java)

### Choose Vercel Functions if:

- You're building with Next.js or a frontend framework
- You want the simplest deploy experience (git push)
- Your team is frontend-focused
- You want hosting + functions + CDN in one platform

### Choose Cloudflare Workers if:

- You need the lowest possible latency globally
- You want the cheapest option at scale
- You need edge computing capabilities
- Cold start time is critical for your use case

## Can You Combine Them?

Absolutely. A common architecture:

```mermaid
graph TD
    User --> CF["Cloudflare Workers<br/>Edge caching, routing, A/B tests"]
    CF --> Vercel["Vercel<br/>Next.js frontend + API routes"]
    Vercel --> Lambda["AWS Lambda<br/>Heavy processing, background jobs"]
    Lambda --> S3[S3 Storage]
    Lambda --> DB[(Database)]
```

- **Cloudflare Workers**: edge routing, caching, security
- **Vercel Functions**: frontend API routes, SSR
- **AWS Lambda**: heavy backend processing, scheduled tasks, event pipelines

## Conclusion

Serverless has matured significantly.
In 2026, the choice between AWS Lambda, Vercel Functions, and Cloudflare Workers comes down to your stack and priorities:

- **Simplest developer experience**: Vercel
- **Most powerful and flexible**: AWS Lambda
- **Best performance and pricing**: Cloudflare Workers

All three are production-ready and battle-tested. Start with the one that fits your current stack, and expand as your needs grow.