There are many factors that affect the speed of a web resource. One of them is network latency. Let's take a closer look at what latency is, how it affects application performance, and how it can be reduced.
Broadly speaking, latency is any delay in the execution of an operation. There are different types of latency: network latency, audio latency, latency when broadcasting live video, latency at the storage level, etc.
Basically, any type of latency results from the limitations of the speed at which any signal can be transmitted.
Most, but not all, types of latency are measured in milliseconds. For example, latency in communication between the CPU and an SSD is measured in microseconds.
This article will focus on network latency, hereinafter referred to as "latency".
Network latency (response time) is the delay that occurs when information is transferred across the network from point A to point B.
Imagine a web application deployed in a data center in Paris. This application is accessed by a user from Rome. The browser sends a request to the server at 9:22:03.000 CET (UTC+1), and the server receives it at 9:22:03.174 CET. The delay in delivering this request is 174 ms.
This is a somewhat simplified example. Note that data volume is not taken into account when measuring latency: transferring 1,000 MB takes longer than transferring 1 KB, but if the network conditions are the same, the latency is the same in both cases.
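In code, the delay in this example is simply the difference between the two timestamps. A minimal sketch (the date is arbitrary, added only to make the timestamps parseable):

```python
from datetime import datetime

# Arbitrary date for illustration; only the times from the example matter.
sent = datetime.fromisoformat("2024-01-01 09:22:03.000")
received = datetime.fromisoformat("2024-01-01 09:22:03.174")

latency_ms = (received - sent).total_seconds() * 1000
print(f"{latency_ms:.0f} ms")  # 174 ms
```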
The concept of network latency is mainly used when discussing interactions between user devices and a data center. The lower the latency, the faster users will get access to the application that is hosted in the data center.
It is impossible to transmit data with no delays since nothing can travel faster than the speed of light.
The main factor that affects latency is distance. The closer the information source is to users, the faster the data will be transferred.
For example, a request from Rome to Naples (a little less than 200 km) takes about 10 ms. And the same request sent under the same conditions from Rome to Miami (a little over 8,000 km) will take about 120 ms.
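As a rough sanity check, we can estimate the theoretical minimum latency over these distances. The sketch below assumes a signal travels through optical fiber at about two-thirds of the vacuum speed of light; the real figures above (about 10 ms and 120 ms) are several times higher because routes are longer than straight lines and every hop adds processing time.

```python
# Theoretical minimum one-way latency, ignoring routing and processing.
SPEED_OF_LIGHT_KM_S = 299_792            # speed of light in a vacuum, km/s
FIBER_SPEED_KM_S = SPEED_OF_LIGHT_KM_S * 2 / 3  # ~2/3 c in optical fiber

def min_latency_ms(distance_km: float) -> float:
    """Lower bound on one-way latency in milliseconds."""
    return distance_km / FIBER_SPEED_KM_S * 1000

print(f"Rome -> Naples (~200 km):  {min_latency_ms(200):.1f} ms")   # ~1.0 ms
print(f"Rome -> Miami (~8,000 km): {min_latency_ms(8000):.1f} ms")  # ~40 ms
```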
There are other factors that affect network latency.
Network quality. At speeds above 10 Gbps, copper cables and connectors suffer too much signal attenuation even over short distances of just a few meters. As interface speeds increase, fiber-optic network cables are therefore used instead.
Route. Data on the Internet is usually transmitted over more than one network. Information passes through several networks, known as autonomous systems. At the points of transition from one autonomous system to another, routers process the data and forward it toward its destination, and this processing also takes time. Therefore, the more networks and internet exchange points (IXs) there are on a packet's path, the longer the transfer will take.
Router performance. The faster the routers process data, the faster the information will reach its destination.
In some sources, the concept of network latency also includes the time the server needs to process a request and send a response. In this case, the server configuration, its capacity, and operation speed will also affect the latency. However, we will stick to the above definition, which includes only the time it takes to send the signal to its destination.
Latency affects other parameters of web resource performance, for example, the RTT and TTFB.
RTT (Round-Trip Time) is the time it takes for sent data to reach its destination, plus the time to confirm that the data has been received. Roughly speaking, this is the time it takes for data to travel back and forth.
TTFB (Time to First Byte) is the time from the moment the request is sent to the server until the first byte of information is received from it. Unlike the RTT, this indicator includes not only the time spent on delivering data but also the time the server takes to process it.
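As an illustration, a minimal TTFB measurement can be sketched with Python's standard library (the URL and timeout below are arbitrary examples; dedicated tools such as browser developer tools report TTFB more precisely):

```python
import time
import urllib.request

def time_to_first_byte(url: str) -> float:
    """Milliseconds from sending the request until the first byte of the
    response body is read. Includes both the network delivery time and
    the server's processing time, matching the TTFB definition above."""
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=10) as response:
        response.read(1)  # read only the first byte of the body
    return (time.perf_counter() - start) * 1000

# Example (result depends on your network):
# print(f"{time_to_first_byte('https://www.google.com'):.0f} ms")
```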
These indicators, in turn, affect the perception of speed and the user experience as a whole. The faster a web resource works, the more actively users will use it. Conversely, a slow application can negatively affect your online business.
The easiest way to estimate your resource's latency is to measure a related speed indicator, such as the RTT, which is the closest to latency. In many cases, the RTT is roughly twice the latency (when the outbound and return trips take the same time).
It is very easy to measure with the ping command. Open a command prompt and type "ping" followed by the resource's IP address or domain name.
Let's try to ping www.google.com as an example.
C:\Users\username>ping www.google.com

Pinging www.google.com [216.58.207.228] with 32 bytes of data:
Reply from 216.58.207.228: bytes=32 time=24ms TTL=16
Reply from 216.58.207.228: bytes=32 time=24ms TTL=16
Reply from 216.58.207.228: bytes=32 time=24ms TTL=16
Reply from 216.58.207.228: bytes=32 time=24ms TTL=16
The time parameter is the RTT. In our example, it is 24 ms.
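If ICMP ping is blocked on the path, a rough RTT estimate can also be taken by timing a TCP handshake, which completes in about one round trip. A minimal sketch in Python (the host and port in the commented example are illustrative):

```python
import socket
import time

def tcp_rtt_ms(host: str, port: int = 443) -> float:
    """Estimate RTT by timing a TCP three-way handshake,
    which takes roughly one round trip to complete."""
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=5):
        pass  # connection established; close it immediately
    return (time.perf_counter() - start) * 1000

# Example (result depends on your network):
# print(f"{tcp_rtt_ms('www.google.com'):.0f} ms")
```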
The optimal RTT value depends on the specifics of your project. On average, most specialists consider less than 100 ms to be a good indicator.
| RTT value | What it means |
| --- | --- |
| <100 ms | Very good, no improvements required |
| 100–200 ms | Acceptable, but can be improved |
| >200 ms | Unsatisfactory, improvements are required |
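The thresholds in the table translate directly into code. A trivial helper (the function name and wording are illustrative):

```python
def rtt_rating(rtt_ms: float) -> str:
    """Classify an RTT measurement using the thresholds above."""
    if rtt_ms < 100:
        return "very good, no improvements required"
    if rtt_ms <= 200:
        return "acceptable, but can be improved"
    return "unsatisfactory, improvements are required"

print(rtt_rating(24))  # very good, no improvements required
```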
The basic guidelines follow from the factors above: bring the content closer to users and optimize the route the data takes. A CDN (a content delivery network with many connected servers that collect information from the origin, cache it, and deliver it using the shortest route) helps with both. A global network with good connectivity will help you significantly reduce latency.
However, keep in mind that latency is only one factor affecting users' perception of application performance. In some cases, the latency is very low, but the website still loads slowly. This happens, for example, when the server is slow in processing requests.
As a rule, comprehensive optimization is required to significantly speed up an application. You can find the main acceleration tips in the article "How to increase your web resource speed".
Gcore CDN provides excellent data transfer speed. We deliver heavy files with minimal delays anywhere in the world.
We have a free plan. Test our network and see how your resource will speed up.