
What 3.26 Billion Requests Taught Us About Black Friday

Lucas Ribeiro
February 18, 2026

Black Friday 2025 was the largest event we've handled: over 100 stores running on deco, including major enterprise brands going through their first Black Friday on our infrastructure. Months of preparation and a full year of hard-earned lessons delivered 99.92% availability across 3.26 billion requests.

This is what we built, what we learned, and where we're going.


The scale

3.26B total requests
99.92% availability
437.6M peak-day requests
67.67 TB data transferred
87,907 peak simultaneous connections
2.1B cache hits

Peak simultaneous connections reached 87,907 on Thursday evening, not Friday. BF traffic now spreads across the entire week as consumers browse, compare, and build carts days before they buy. If your infrastructure plan is built around a Friday spike, it's built around the wrong day.


What it takes to prepare

Preparing for Black Friday isn't a November problem. Throughout the year, seasonal campaigns, flash sales, and live shopping events create smaller spikes that reveal where infrastructure will break under real pressure. Those events are the actual preparation. They surface problems before the stakes are highest.

Multi-cloud as insurance

Running across AWS and GCP with pre-scaled failover means a provider issue doesn't become a brand's problem. During BF, 42 Spot Instance interruptions were auto-recovered in ~106ms each. Zero reached the storefront.
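
The post doesn't detail the recovery mechanism, but the standard building block is the EC2 instance metadata service, which announces a spot interruption roughly two minutes before the instance is reclaimed. The sketch below polls that endpoint and is purely illustrative wiring, not deco's actual implementation:

```typescript
// Illustrative only: watch the EC2 instance metadata service (IMDSv2) for a
// spot interruption notice and react before the instance is reclaimed.
const IMDS = "http://169.254.169.254/latest";

async function imdsToken(): Promise<string> {
  const res = await fetch(`${IMDS}/api/token`, {
    method: "PUT",
    headers: { "X-aws-ec2-metadata-token-ttl-seconds": "60" },
  });
  return res.text();
}

async function checkInterruption(): Promise<void> {
  const token = await imdsToken();
  // Returns 404 until AWS schedules an interruption, then a JSON payload
  // with the action ("terminate"/"stop") and when it will happen.
  const res = await fetch(`${IMDS}/meta-data/spot/instance-action`, {
    headers: { "X-aws-ec2-metadata-token": token },
  });
  if (res.status === 200) {
    const notice = await res.json();
    console.log("Spot interruption scheduled:", notice);
    // From here: cordon and drain the node so pods reschedule onto healthy capacity.
  }
}

setInterval(checkInterruption, 5_000); // AWS gives ~2 minutes of warning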

Multi-CDN for resilience

A CDN incident weeks before Black Friday exposed gaps in our failover strategy. That pressure forced us to build better tooling. By BF, switching between CDN providers was a single command. 98.25% cache hit rate on static assets.
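
The post doesn't say how that single-command switch works under the hood. One common pattern is weighted DNS in front of the CDN providers; the sketch below shifts weight between two CNAME records with the AWS Route 53 SDK. The zone, hostnames, and the choice of Route 53 itself are assumptions for illustration:

```typescript
import {
  Route53Client,
  ChangeResourceRecordSetsCommand,
} from "@aws-sdk/client-route-53";

const client = new Route53Client({});

// Shift traffic weight between two CDN endpoints behind the same hostname.
async function setCdnWeights(primaryWeight: number, secondaryWeight: number) {
  const record = (setId: string, target: string, weight: number) => ({
    Action: "UPSERT" as const,
    ResourceRecordSet: {
      Name: "www.example-store.com", // hypothetical
      Type: "CNAME" as const,
      SetIdentifier: setId,
      Weight: weight,
      TTL: 60,
      ResourceRecords: [{ Value: target }],
    },
  });

  await client.send(
    new ChangeResourceRecordSetsCommand({
      HostedZoneId: "ZEXAMPLE123", // hypothetical
      ChangeBatch: {
        Changes: [
          record("cdn-a", "store.cdn-a.example.net", primaryWeight),
          record("cdn-b", "store.cdn-b.example.net", secondaryWeight),
        ],
      },
    }),
  );
}

// "Single command" failover: send everything to the secondary provider.
await setCdnWeights(0, 255);
```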

Load testing with real patterns

Synthetic benchmarks don't capture real user behavior. We use K6 on Kubernetes with auto-generated test data based on recorded navigation flows from actual stores. The gap between synthetic and real-world tests is where surprises hide.
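
We don't publish the actual test suites, but the shape is roughly this: a k6 script that replays a recorded shopper journey (home, category, product, cart) with ramping virtual users. The paths, stages, and thresholds below are placeholders, not our real test plan:

```typescript
// k6 sketch: replay a recorded navigation flow with ramping virtual users.
import http from "k6/http";
import { sleep, check } from "k6";

export const options = {
  stages: [
    { duration: "5m", target: 2000 },  // ramp to expected steady state
    { duration: "10m", target: 2000 }, // hold
    { duration: "2m", target: 8000 },  // simulate a flash-sale spike
    { duration: "5m", target: 0 },     // ramp down
  ],
  thresholds: {
    http_req_duration: ["p(95)<800"], // placeholder SLO
    http_req_failed: ["rate<0.01"],
  },
};

// One iteration = one recorded shopper journey.
export default function () {
  const pages = [
    "/",
    "/category/electronics",
    "/product/wireless-headphones",
    "/cart",
  ];
  for (const path of pages) {
    const res = http.get(`https://store.example.com${path}`);
    check(res, { "status is 200": (r) => r.status === 200 });
    sleep(Math.random() * 3 + 1); // think time between pages
  }
}
```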

Kubernetes handles our horizontal scaling at two levels: adding machines to the cluster for more application capacity, and adding replicas of each store's application for higher throughput. We pre-scaled the base and tuned auto-scaling sensitivity for faster response during spikes. In 2024, we over-provisioned significantly and burned money. In 2025, we found the right balance between cost and headroom.
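
As a back-of-the-envelope illustration of what "the right balance" means, a pre-scaled baseline can be derived from expected peak throughput and measured per-replica capacity, with autoscaling covering the remaining headroom. Every number below is invented; our actual sizing model isn't published:

```typescript
// Rough sketch of deriving a pre-scaled replica baseline per store.
interface StoreProfile {
  name: string;
  expectedPeakRps: number; // last year's peak plus a growth estimate
  rpsPerReplica: number;   // measured under load tests
}

// Pre-scale to cover a fraction of the peak; the autoscaler covers the rest,
// so you only pay for full headroom while the spike actually lasts.
function prescaledReplicas(store: StoreProfile, prescaleFraction = 0.6): number {
  const atPeak = Math.ceil(store.expectedPeakRps / store.rpsPerReplica);
  return Math.max(2, Math.ceil(atPeak * prescaleFraction));
}

const stores: StoreProfile[] = [
  { name: "store-a", expectedPeakRps: 4500, rpsPerReplica: 150 },
  { name: "store-b", expectedPeakRps: 900, rpsPerReplica: 120 },
];

for (const s of stores) {
  console.log(s.name, "baseline replicas:", prescaledReplicas(s));
}
```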

Some of our clients have seasonal dates as intense as Black Friday itself. Monthly flash sales that drive 10x to 50x traffic spikes. Campaigns tied to TV appearances where traffic goes from normal to a wall in seconds. These events gave us real production-scale validation throughout the year. By November, our infrastructure strategy had been tested and refined dozens of times with real traffic.

Cache configuration matters more than you'd think

Across our top 10 clients, cache hit rates ranged from 14% to 98%. The difference between a well-configured and poorly-configured cache setup can mean 20x the origin server costs for the same traffic volume. This is often the single biggest performance lever for BF.
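
Most of that lever is simply getting Cache-Control right at the edge: long, immutable lifetimes for fingerprinted assets, short shared-cache TTLs with stale-while-revalidate for HTML. A minimal sketch, assuming a standard fetch-style edge handler rather than deco's actual middleware:

```typescript
// Minimal cache-header sketch; not deco's actual middleware.
function withCacheHeaders(req: Request, res: Response): Response {
  const headers = new Headers(res.headers);
  const url = new URL(req.url);

  if (/\.(js|css|png|jpg|webp|woff2)$/.test(url.pathname)) {
    // Fingerprinted static assets can be cached "forever" by browsers and CDNs.
    headers.set("Cache-Control", "public, max-age=31536000, immutable");
  } else if (res.headers.get("content-type")?.includes("text/html")) {
    // Short shared-cache TTL; serve stale while the CDN revalidates in the background.
    headers.set(
      "Cache-Control",
      "public, max-age=0, s-maxage=60, stale-while-revalidate=600",
    );
  }

  return new Response(res.body, { status: res.status, headers });
}
```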


What we got wrong


We spent much of 2025 building tools that could have made this Black Friday easier. AI agents for monitoring, automated performance diagnostics, systems that could respond faster than any on-call engineer. We knew what was possible. We didn't ship them in time.

So Black Friday was manual. The team monitored dashboards live, adjusted configurations in real time, and stayed up through the night. The infrastructure held. The automation didn't exist yet. That gap between what we knew was possible and what we actually had is the honest story of this Black Friday, and the roadmap for the next one.

On the ground

In November, we were hands-on with every client. We recorded a one-hour optimization guide for stores, helped brands navigate cache and deployment trade-offs during code freezes, and fielded calls from agencies around the clock. The week of BF itself, engineers watched dashboards live during every peak hour because monitoring tools have inherent delay. When something needed adjusting, it was done by hand, in real time.

What the data showed us after

We learned from the numbers after the event. Cache hit rates came in lower than expected for some stores. Static assets could be served more efficiently. Edge cases in third-party integrations only surfaced under real BF volumes. Every one of those findings went straight into our Q1 2026 backlog.

The infrastructure performed. But we fought for every hour of it.


What's next

BF 2025 gave us something more valuable than good numbers. It gave us a precise map of where to invest. When you operate at this scale, every inefficiency becomes visible and the path forward gets concrete.

Cache strategy overhaul: pushing all clients toward higher hit rates, including HTML-level caching
AI-powered monitoring: the agents we didn't ship in time are being built now, for automated anomaly detection and incident response
Expanded SRE team with more capacity for infrastructure innovation on our Kubernetes clusters
Deeper observability tooling so peak hours don't require engineers watching dashboards manually

The AI platform we spent 2025 building wasn't wasted effort. It's the foundation for everything on this list. The agents, the automation, the smarter monitoring, all of it runs on what we already built. We just didn't have time to close the loop before November. Now we do.

Every tool we build for BF 2026 also makes every normal Tuesday better for every client running on deco. The investment compounds.


3.26 billion requests. 99.92% availability. That's what Black Friday 2025 looked like for our clients on deco.

Black Friday 2026 will be agentic.
