What is latency and how can you reduce it? 😳

Published on 31 March 2022 at 15:50

What is the definition of latency?

Latency is the time it takes for data to travel from one point on a network to another. Say Server A in New York sends a data packet to Server B in London. Server A transmits the packet at 04:38:00.000 GMT and Server B receives it at 04:38:00.145 GMT. The latency of this path is the difference between the two timestamps: 0.145 seconds, or 145 milliseconds.
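To make that arithmetic concrete, here is a minimal Python sketch of the same calculation. The date is arbitrary; only the two timestamps from the example above matter.

from datetime import datetime, timezone

# Hypothetical timestamps from the New York -> London example (date is arbitrary)
sent = datetime(2022, 3, 31, 4, 38, 0, 0, tzinfo=timezone.utc)           # Server A transmits
received = datetime(2022, 3, 31, 4, 38, 0, 145000, tzinfo=timezone.utc)  # Server B receives

latency_ms = (received - sent).total_seconds() * 1000
print(latency_ms)  # 145.0 milliseconds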


Most of the time, latency is measured between a user's device (the "client" device) and the data center. This measurement helps developers understand how quickly a web page or application will load for users.
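If you want to see this number for your own setup, one rough approach is to time a small HTTP request from the client side. Note that this measures the full request/response cycle (including DNS lookup and server processing), not pure network latency, and the URL below is only a placeholder.

import time
import urllib.request

start = time.perf_counter()
urllib.request.urlopen("https://example.com", timeout=5).read()
elapsed_ms = (time.perf_counter() - start) * 1000
print(f"Request completed in {elapsed_ms:.1f} ms")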


Although data on the Internet travels at close to the speed of light, the effects of distance and of delays introduced by Internet infrastructure equipment mean that latency can never be eliminated entirely. It can, and should, be minimized. High latency results in poor website performance, negatively affects SEO, and can drive users to abandon the site or application altogether.

What is the cause of Internet latency?

One of the primary causes of network latency is distance, specifically the distance between the client devices making requests and the servers responding to them. If a website is hosted in a data center in Columbus, Ohio, it will respond fairly quickly to requests from users in Cincinnati (about 100 miles away), most likely within 5 to 10 milliseconds. Requests from users in Los Angeles (about 2,200 miles away) take longer to arrive, averaging 40 to 50 milliseconds.
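Geography alone puts a floor under those numbers. Light in optical fiber travels at roughly 200,000 km/s (about two thirds of its speed in a vacuum), so distance sets a minimum latency that no amount of tuning can remove. A back-of-the-envelope sketch using the distances above:

FIBER_KM_PER_S = 200_000  # approximate speed of light in optical fiber
MILES_TO_KM = 1.609

for city, miles in (("Cincinnati", 100), ("Los Angeles", 2200)):
    km = miles * MILES_TO_KM
    one_way_ms = km / FIBER_KM_PER_S * 1000
    print(f"{city}: at least {one_way_ms:.1f} ms one way, {2 * one_way_ms:.1f} ms round trip")

# Cincinnati: at least 0.8 ms one way, 1.6 ms round trip
# Los Angeles: at least 17.7 ms one way, 35.4 ms round trip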


A few milliseconds may not seem significant, but the delay is compounded by the back-and-forth communication required for client and server to establish a connection, by the total size and load time of the page, and by any problems with the network equipment the data passes through along the way. The time it takes for a response to reach the client device after it makes a request is known as the round-trip time (RTT). RTT is roughly double the one-way latency, because the data has to travel in both directions: there and back.
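A quick way to approximate RTT yourself, without loading an entire page, is to time how long a bare TCP connection takes to be established, since the handshake completes after roughly one round trip. This is only a sketch: the host and port are placeholders, and the measurement also includes DNS resolution time.

import socket
import time

host, port = "example.com", 443
start = time.perf_counter()
with socket.create_connection((host, port), timeout=5):
    pass  # connection established; we only wanted the timing
rtt_ms = (time.perf_counter() - start) * 1000
print(f"Approximate RTT to {host}: {rtt_ms:.1f} ms")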


Data that traverses the Internet usually has to cross not one but many networks. The more networks an HTTP response has to cross, the more opportunities there are for delay. For example, as data packets move between networks, they pass through Internet Exchange Points (IXPs), where routers must process and forward them. Routers sometimes also have to split packets into smaller fragments, which adds a few more milliseconds to the RTT.
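To see how many routers and networks actually sit between you and a server, you can run traceroute (tracert on Windows). The sketch below simply shells out to that tool, so it assumes traceroute is installed, and the hostname is a placeholder.

import subprocess

result = subprocess.run(["traceroute", "example.com"], capture_output=True, text=True)
hops = [line for line in result.stdout.splitlines()[1:] if line.strip()]  # skip the header line
print(f"{len(hops)} hops to example.com")
print(result.stdout)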

Throughput, latency, and bandwidth

Latency, bandwidth, and throughput are related, but they measure different things. Bandwidth is the maximum amount of data that can pass through a network at any one moment. Throughput is the amount of data that actually flows through the network over a given period of time; it is not necessarily equal to bandwidth, because it is affected by latency and other factors. Latency is a measure of time, not of how much data is transferred over time.
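A simple model shows why high latency caps throughput even on a fast link. If a sender can only push one window of data per round trip (a simplified request/response or fixed-window assumption, with made-up numbers), throughput is bounded by window size divided by RTT:

def max_throughput_mbps(window_bytes: float, rtt_ms: float) -> float:
    """Upper bound: one window of data per round trip."""
    return (window_bytes * 8) / (rtt_ms / 1000) / 1_000_000

window = 64 * 1024  # a 64 KB window
for rtt in (10, 50, 145):
    print(f"RTT {rtt:3d} ms -> at most {max_throughput_mbps(window, rtt):6.1f} Mbps")

# Even on a 1 Gbps link, a 145 ms RTT limits this simple model to about 3.6 Mbps.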

Resolving issues with latency

Latency, again, is the time it takes for data to travel from a client to a server and back over whatever connections are available. Low bandwidth and/or high latency result in poor throughput, which leads to connection problems and delays.


Latency can be introduced anywhere between your local machine and the server, which is why it's important to address both ends. In most cases, the strategies you apply on your local machine are also effective on servers.


Latency can never be eliminated completely, but there are simple ways to reduce it or, at the very least, to pinpoint its cause.

1. Reboot

Brace yourself, we're about to use one of the most dreaded phrases in IT support... have you tried turning it off and on again? It's not a joke: rebooting matters both locally and on the server side.


A network can slow down over time if its equipment is never restarted. Locally, the memory of your router or modem gradually fills up and becomes cluttered. Servers, too, occasionally need a reboot.

2. Shut down bandwidth-hogging applications

As mentioned above, bandwidth and latency are closely connected. If you're using close to, or more than, the bandwidth your connection provides, latency will rise.


The connection simply takes longer to move that much data. Reducing the amount of bandwidth in use at any given time can therefore have a real impact on latency. You may also need to upgrade your hosting or switch to a VPS that matches your usage.
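To find out whether something on your machine is already saturating the link, you can sample the system's network counters twice and compute the rate. This sketch assumes the third-party psutil package is installed (pip install psutil).

import time
import psutil

before = psutil.net_io_counters()
time.sleep(1)
after = psutil.net_io_counters()

upload_mbps = (after.bytes_sent - before.bytes_sent) * 8 / 1_000_000
download_mbps = (after.bytes_recv - before.bytes_recv) * 8 / 1_000_000
print(f"Upload: {upload_mbps:.2f} Mbps, Download: {download_mbps:.2f} Mbps")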

3. Consider a wired connection

Technology has been moving toward wireless options for quite a while, but connecting to your network with an Ethernet cable can do wonders for connection speed when latency is occurring locally. This matters most when you're uploading or downloading large amounts of data on your local device, and it has the added benefit of being a simple, affordable fix. It's no coincidence that wireless equipment is rarely found in server facilities.

4. Reconsider data center locations

Depending on where your data is currently stored, you may be able to move it or add extra locations. Placing data closer to the end user or point of retrieval can reduce access times significantly.

5. Add a CDN

Content Delivery Networks (CDNs) keep copies of your data in multiple places. By creating multiple points of entry, frequently requested content can be cached close to the users who need it. CDNs can reduce both downtime and load times. There are plenty of CDNs on the market; with a little research you'll find the one best suited to your latency problems.
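Once a CDN is in place, a quick sanity check is to look at the response headers: many CDNs add headers such as age, x-cache or cf-cache-status that indicate whether a response was served from cache. Header names differ per provider, and the URL below is a placeholder.

import urllib.request

response = urllib.request.urlopen("https://example.com", timeout=5)
for name in ("age", "x-cache", "cf-cache-status", "server"):
    value = response.headers.get(name)  # None if the CDN doesn't set this header
    if value:
        print(f"{name}: {value}")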
