The Speed of Trust: Cutting Latency for Inbound API Calls
The Fastest Truly Win. I Can Start Talking About Lions and Gazelles If You Want...
At church this morning, I noticed something I should have seen years ago. The pews were packed. An usher went row by row, making sure only a handful of people were in line at a time for our weekly moment with the Savior. She was preventing a thundering herd, which had the side benefit of lowering the latency of our spiritual transaction.
Latency matters - in everything, including being in communion with Christ.
So let’s get back to our profession. How do we lower the latency of the most common activity in application development: handling an inbound API call? Everyone immediately steers toward business logic, database connections, and third-party services. But just like the church usher, there are some basic, foundational tasks you can perform first to reduce latency in your service. Let’s talk about them!
When your REST API is called over HTTP or HTTPS, you’re not just handling a request. You’re establishing trust, performance, and reliability in a handful of milliseconds. And if you're not paying attention to those milliseconds? You're losing users. You're burning money. You're leaving performance on the table. So let’s talk, briefly and technically, and in full knowledge that the next coffee shop over is selling the same API as you, only faster, about how to make your HTTP/S REST APIs not slow: with CDN edge negotiation, pooled connections, real client-side measurements, and a smarter way of tracking user actions.
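To make two of those ideas concrete before we dig in, here is a minimal Python sketch (not from the sections below) of pooled connections and client-side latency measurement using the requests library. The endpoint URL and pool sizes are placeholder assumptions, not values from this post.

```python
# Minimal sketch: reuse pooled HTTPS connections and measure latency
# from the client's side rather than trusting server-side numbers alone.
import time
import requests
from requests.adapters import HTTPAdapter

# One Session reuses TCP/TLS connections across calls, so only the first
# request pays the full connection + handshake cost.
session = requests.Session()
session.mount("https://", HTTPAdapter(pool_connections=10, pool_maxsize=10))

API_URL = "https://api.example.com/v1/orders"  # hypothetical endpoint

def timed_get(url: str) -> float:
    """Issue a GET and return the client-observed latency in milliseconds."""
    start = time.perf_counter()
    response = session.get(url, timeout=5)
    response.raise_for_status()
    return (time.perf_counter() - start) * 1000

if __name__ == "__main__":
    # The first call includes DNS + TCP + TLS setup; later calls ride the
    # pooled connection and should come back noticeably faster.
    for i in range(3):
        print(f"call {i + 1}: {timed_get(API_URL):.1f} ms")
```

If the server supports keep-alive, the second and third calls skip the TCP and TLS handshakes entirely, which is exactly the kind of foundational win the rest of this post is about.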
1. TLS Termination at the Edge: Send Your Lawyers to the Border