Latency is the delay between when something is initiated and when its effect begins, like the time between clicking a link and the page starting to load. In software systems, latency often refers to how long it takes for data to travel from sender to receiver, and it can be affected by factors such as physical distance, network congestion, or processing overhead.
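To make the definition concrete, here is a minimal sketch that times how long an operation takes using Python's `time.perf_counter`. The `measure_latency` helper is hypothetical (not from any library), and the `time.sleep` call stands in for a real network round trip:

```python
import time

def measure_latency(operation):
    """Run `operation` and return (result, elapsed time in milliseconds)."""
    start = time.perf_counter()
    result = operation()
    elapsed_ms = (time.perf_counter() - start) * 1000
    return result, elapsed_ms

# Simulate a ~50 ms round trip (a stand-in for an actual request).
_, latency_ms = measure_latency(lambda: time.sleep(0.05))
print(f"latency: {latency_ms:.1f} ms")
```

In practice you would wrap a real call (an HTTP request, a database query) instead of a sleep, and collect many samples, since tail latencies often matter more than the average.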
Did you see the new social media app that Meta just launched? I heard it has crazy high latency - my friend's posts take forever to show up in my feed. I bet their engineers are scrambling to fix it before everyone rage quits to TikTok.
The PM keeps pushing us to add more features to the app, but if we keep cramming in bloated libraries, the cold start latency on our serverless functions is going to be worse than dial-up internet in the 90s.
Latency Numbers Every Programmer Should Know: This interactive article provides a visual representation of latency at different scales, from L1 cache to packet round trips around the globe. It's a great way to build intuition about the relative magnitudes of different sources of delay.
Serverless Cold Starts: This in-depth analysis compares the cold start latency of serverless functions across AWS, Google Cloud, and Azure. It breaks down the factors that contribute to slower cold starts and offers tips for minimizing their impact.
It's the latency, stupid: In this classic rant, Stuart Cheshire argues that latency, not bandwidth, is often the key limiting factor for internet performance. He explains why latency matters more than most people think and offers some (slightly dated) examples.
Note: the Developer Dictionary is in Beta. Please direct feedback to skye@statsig.com.