
Measuring the mean Web page size and its compression to limit latency and improve download time



Internet Research , Volume 11 (1): 8 – Mar 1, 2001

Publisher
Emerald Publishing
Copyright
Copyright © 2001 MCB UP Ltd. All rights reserved.
ISSN
1066-2243
DOI
10.1108/10662240110365661

Abstract

Web traffic is doubling every year, according to recent global studies. Users need more information from Web sites and want to spend as little time as possible downloading it. Simultaneously, more Internet bandwidth is needed, and all ISPs are trying to build high-bandwidth networks. This paper presents a case study that calculates the reduction in the time needed for a Web page to be fully downloaded and delivered to the user. It presents a way to calculate the reduction in data transfer, bandwidth consumption and response time when HTTP/1.1's compression feature is enabled (whether for plain hypertext files, the text output of CGI programs, or dynamically generated pages). Measurements are taken from five popular Web sites in order to validate our claim of reduced transfer time. Defining the mean size of the pages served by commercial Web sites is also within the scope of this paper.
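The size reduction the abstract refers to can be illustrated with a short sketch (not taken from the paper): gzip-compressing a sample HTML document, as an HTTP/1.1 server would do when a client advertises `Accept-Encoding: gzip`, and comparing the byte counts. The sample markup below is a made-up stand-in for a typical repetitive hypertext page.

```python
import gzip

# Hypothetical sample page: repetitive markup, as typical HTML tends to be.
html = ("<html><head><title>Example</title></head><body>"
        + "<p>Sample paragraph of repetitive hypertext content.</p>" * 200
        + "</body></html>").encode("utf-8")

# Compress the page as an HTTP/1.1 server would for "Content-Encoding: gzip".
compressed = gzip.compress(html)

# Report the transfer-size reduction; fewer bytes on the wire means a
# shorter download time on a bandwidth-limited link.
ratio = len(compressed) / len(html)
print(f"original: {len(html)} bytes, gzipped: {len(compressed)} bytes "
      f"({1 - ratio:.0%} smaller)")
```

Because text compresses well, savings of this kind are what motivate enabling compression for hypertext and for the text output of CGI programs or dynamically generated pages.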

Journal

Internet Research, Emerald Publishing

Published: Mar 1, 2001

Keywords: World Wide Web; Hypertext; Web sites; Design
