Studies have shown that every additional second users have to wait for a page to load increases the probability that they will switch to a competing website. But not everything is under web developers’ control: many users have slower network connections, which can degrade their experience. High-resolution images and videos impose significant performance costs, so users on slow connections may prefer low-resolution images that make pages load much faster. Users with high-speed Internet connections, on the other hand, would probably prefer high-quality visuals, since the performance cost is small for them. Addressing both requirements at the same time can be quite tricky for web designers, since they need to serve web elements as a function of network speed.
When Internet access providers publish their data speeds, the promised raw throughput is only achievable under ideal conditions. The numbers shown in marketing materials have little relevance to users and web developers. What they need to care about is the “real” network speed, which is affected by a number of factors such as the protocol in use, the packet loss rate and the delay. While a specific website element is being transferred, the data speed may change abruptly for a number of reasons:
• Mobile users move into areas with slower data coverage, for example from 4G/3G-capable cells to EDGE-only networks
• More users come online, causing peak traffic load on the provider’s side
• Bad weather, which can affect both mobile users and ADSL subscribers
When considering network speed on the user’s side, we should also take the connection quality between the web server and the Internet into account. Users’ proximity to the web server can also have a significant impact on the actual data speed. Consequently, determining the actual bottleneck can be rather time-consuming.
The dominant protocol for transmitting data over the Internet is TCP, and it comes with a specific overhead cost, since some of the data sent is used to organize the actual information required by users. TCP cannot immediately use all of the available bandwidth, because it does not know the actual capacity of the path. Network congestion occurs if more data is sent than the path can carry, and this degrades the network for everyone. To prevent this, TCP applies two congestion control mechanisms: slow start and congestion avoidance. A TCP connection has a “congestion window”, a limited allowance of unacknowledged data in flight. Over time, the congestion window grows until the pipeline is filled with enough data. This approach is called “slow start”, and it may take a while before the bandwidth is used optimally. Congestion avoidance then probes the network by sending just a little more data each round trip, to see whether it causes congestion. Armed with this knowledge, web developers should understand that data connection speed varies over time.
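The interaction between slow start and congestion avoidance can be sketched with a toy model. This is purely illustrative: real TCP measures the window in bytes, reacts to loss and timeouts, and modern stacks use algorithms such as CUBIC or BBR.

```typescript
// Toy model of TCP congestion-window growth (illustrative only).
// cwnd and ssthresh are counted in segments here; real TCP uses
// bytes and far more elaborate logic (loss recovery, CUBIC, BBR).
function cwndAfterRoundTrips(
  roundTrips: number,
  initialCwnd = 1,
  ssthresh = 16,
): number {
  let cwnd = initialCwnd;
  for (let i = 0; i < roundTrips; i++) {
    if (cwnd < ssthresh) {
      cwnd *= 2; // slow start: exponential growth each round trip
    } else {
      cwnd += 1; // congestion avoidance: additive increase
    }
  }
  return cwnd;
}
```

Starting from one segment, the window doubles each round trip (1, 2, 4, 8, 16) until it reaches the threshold, then grows by one segment per round trip — which is why a fresh connection needs several round trips before it uses the available bandwidth well.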
Some experts advise a flexible approach that transfers low-resolution images when users have a slow connection and high-resolution images when the speed exceeds a specific value. If network conditions change, web browsers can adjust by switching to different image resolutions in response. Web developers can declare image resources at multiple resolutions using declarative syntax, which hints to web browsers which resource to use based on criteria such as screen size and pixel density.
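The declarative syntax in question is the HTML `srcset`/`sizes` attribute pair. A small helper can generate the attribute value from a list of available widths; the `name-640w.jpg` naming scheme below is an assumption for illustration, not a standard.

```typescript
// Build the value of an <img srcset> attribute from a base name and
// a list of available image widths. The "name-640w.jpg" file-naming
// scheme is an assumption for this sketch; adapt it to your assets.
function buildSrcset(baseName: string, widths: number[]): string {
  return widths
    .map((w) => `${baseName}-${w}w.jpg ${w}w`)
    .join(", ");
}

// The browser picks a candidate based on viewport size and device
// pixel ratio, e.g.:
// <img src="hero-640w.jpg"
//      srcset="hero-640w.jpg 640w, hero-1280w.jpg 1280w"
//      sizes="100vw" alt="...">
```

Note that `srcset` lets the browser, not the page author, make the final choice, so it can factor in things the developer cannot see, such as the user’s data-saver setting.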
Dynamic network measurement for website operations is still a new technology, and APIs are being developed to inform web developers about current network conditions. It would be convenient if web browsers could accurately measure the available bandwidth during the initial load of an HTML page. This would allow web developers to deliver resources responsively, based on each user’s connection characteristics.
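One such API is the Network Information API, which exposes a coarse connection classification (`effectiveType`) on `navigator.connection`. Browser support is uneven, so the sketch below keeps the decision logic in a pure function and treats the API itself as optional; the three quality tiers are an assumption for illustration.

```typescript
// Map the Network Information API's effectiveType values to an
// image-quality tier. The tier names are assumptions for this
// sketch; navigator.connection is not available in all browsers,
// so callers should feature-detect and fall back.
type EffectiveType = "slow-2g" | "2g" | "3g" | "4g";

function imageQualityFor(
  effectiveType: EffectiveType | undefined,
): "low" | "medium" | "high" {
  switch (effectiveType) {
    case "slow-2g":
    case "2g":
      return "low";
    case "3g":
      return "medium";
    case "4g":
      return "high";
    default:
      return "high"; // API unsupported: assume a fast connection
  }
}

// In a browser:
// const conn = (navigator as any).connection;
// const tier = imageQualityFor(conn?.effectiveType);
```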
Web developers are turning to specific technologies, such as media queries, to serve the proper resources for each user’s situation. Professionals working on complex systems typically want a say in striking the balance between slow page loading and high-quality resources. Unfortunately, we still cannot implement this approach with full accuracy, and for now the pragmatic fallback is to let users optimize their own network usage.