Latency is determined by several factors, including these:
- Distance
- Transmission medium
- Number of network devices
- Network congestion
Every internet connection has one fixed time cost: distance. Even data traveling at the speed of light takes longer to reach a server based in Australia than one a few blocks away. No matter how much you optimize your network, you can’t escape these physical limits. This is why satellite internet, which has to send its signal into space and back, will always carry more latency than comparable terrestrial connections.
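To put rough numbers on that fixed cost, you can divide distance by the speed of light. The sketch below does exactly that; the distances are assumptions for illustration, and real network routes are longer than the straight-line path, so actual delays are higher.

```python
# Rough lower bound on one-way delay: distance / speed of light in a vacuum.
# The distances are illustrative assumptions, not measurements.
SPEED_OF_LIGHT_KM_S = 300_000  # approximate

routes_km = {
    "server a few blocks away": 1,
    "server across the country (~4,000 km)": 4_000,
    "server in Australia (~15,000 km)": 15_000,
    "geostationary satellite hop, up and back (~72,000 km)": 72_000,
}

for name, distance_km in routes_km.items():
    delay_ms = distance_km / SPEED_OF_LIGHT_KM_S * 1000
    print(f"{name}: at least {delay_ms:.2f} ms one way")
```

Even before any network equipment touches the traffic, the Australia route costs roughly 50 ms each way, and a geostationary satellite hop costs more than 200 ms.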
Data also travels at different speeds along different transmission media. Fiber optic cable is among the fastest, but light traveling through a fiber optic cable still moves at only about two-thirds of its speed in a vacuum. Copper wire, unsurprisingly, transmits information much more slowly than fiber optic cable.
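Here is a minimal sketch of how the medium changes that lower bound, assuming the same 15,000 km route as above and approximate propagation speeds:

```python
# Same distance, different media: the delay depends on how fast the signal
# propagates through that medium. Speeds are approximate, for illustration only.
DISTANCE_KM = 15_000  # assumed long-haul route

speeds_km_s = {
    "vacuum (theoretical best)": 300_000,
    "fiber optic cable (roughly 2/3 of c)": 200_000,
}

for medium, speed in speeds_km_s.items():
    print(f"{medium}: {DISTANCE_KM / speed * 1000:.1f} ms one way")
```

The slower medium turns a 50 ms leg into a 75 ms leg before anything else in the network adds its own overhead.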
In addition to the delay from distance, the more devices your information passes through, the longer it takes to arrive. Your computer doesn’t have a direct line connecting it to the server you’re trying to access. Instead, your data must cross multiple networks to reach its destination.
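If you want to see how many devices sit between you and a server, one quick sketch is to wrap the traceroute utility; this assumes a Unix-like system with traceroute installed, and example.com is just a placeholder host:

```python
# Count the routers (hops) between this machine and a server by running traceroute.
# Assumes a Unix-like system with the traceroute utility installed.
import subprocess

result = subprocess.run(
    ["traceroute", "-n", "example.com"],  # -n skips DNS lookups for each hop
    capture_output=True,
    text=True,
)

# Each numbered line of traceroute output is one device the packets passed through.
hops = [line for line in result.stdout.splitlines() if line.strip()[:1].isdigit()]
print(f"{len(hops)} hops to example.com")
```

Every hop in that list is a device that had to receive, examine, and forward your packets.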
These networks are connected at internet exchange points, which function like giant routers. Every packet these devices receive must be examined and then sent in the right direction, which adds processing delay at each stop. Signals also must be boosted by repeaters the farther they travel, and each repeater can introduce a little more delay.
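A minimal way to see how that per-device handling stacks on top of the distance cost is to add an assumed per-hop processing cost to the propagation figure from earlier; both numbers here are assumptions for illustration, not measurements:

```python
# Illustrative model: total one-way delay grows with every device in the path.
PROPAGATION_MS = 50.0  # distance-based delay from the earlier sketch
PER_HOP_MS = 0.5       # assumed processing cost at each router or exchange point

for hop_count in (5, 15, 30):
    total_ms = PROPAGATION_MS + hop_count * PER_HOP_MS
    print(f"{hop_count} hops: ~{total_ms:.1f} ms one way")
```

The per-hop cost is small, but a path with 30 hops pays it 30 times.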
Network congestion can also cause delays. If a device on a network has more data passing through it than it can handle, it becomes a bottleneck, and your data packets are delayed while they wait their turn in its queue.
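To get a feel for how sharply that waiting time grows, here is a rough sketch using the textbook M/M/1 queuing formula; the link capacity is assumed purely for illustration, and real traffic is burstier, so actual delays can be worse:

```python
# Sketch of how queuing delay grows as a device approaches full utilization,
# modeled with the classic M/M/1 queue. Capacity is an assumed figure.
SERVICE_RATE = 1_000  # assumed packets per second the device can forward

for utilization in (0.5, 0.8, 0.9, 0.99):
    arrival_rate = utilization * SERVICE_RATE
    # Mean time a packet waits in the queue before being forwarded (seconds).
    wait_s = utilization / (SERVICE_RATE - arrival_rate)
    print(f"{utilization:.0%} utilized: ~{wait_s * 1000:.2f} ms average queuing delay")
```

The shape is the point: delay stays small until the device nears its capacity, then climbs steeply.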