Too Little Bandwidth to Start
Most companies start out in a hole by provisioning too little bandwidth on their Internet connections. Whether the cause is budget limits or poor planning (not understanding users' real Internet habits and needs), the result is the same: not enough bandwidth to go around. It's common for users to have less than 50K bps each, a far cry from the 10M-bps broadband speeds they enjoy at home from their cable company.
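The gap between those two figures is starker than it sounds. A quick back-of-envelope calculation (the 2MB page weight here is an illustrative assumption, not a measured figure) shows what the same page load feels like at each speed:

```python
# Rough arithmetic behind the slide: how long a hypothetical 2MB Web page
# takes to transfer at office vs. home speeds. Latency and protocol
# overhead are ignored, so real-world times would be even worse.

PAGE_BYTES = 2 * 1024 * 1024          # assumed page weight: 2MB

def transfer_seconds(bits_per_second: float) -> float:
    """Idealized transfer time for the page at a given link speed."""
    return (PAGE_BYTES * 8) / bits_per_second

office = transfer_seconds(50_000)      # ~50K bps per user at the office
home = transfer_seconds(10_000_000)    # ~10M bps cable broadband at home

print(f"office: {office:.0f} s, home: {home:.1f} s")
```

At 50K bps the same page takes minutes rather than seconds, which is exactly the disparity users complain about.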
Software as a Service Stresses the WAN
SaaS Adds Latency
Surge in HTTP
The shift from client-server to Web-based applications, whether on-premises or SaaS, has meant a shift to much greater volumes of HTTP traffic. HTTP is a "chatty" protocol that requires many round trips to move data from the server to the browser, which reinforces employees' impression of a painfully slow network. Today, 57 percent of corporate network traffic is HTTP/S.
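The round-trip count matters because each one pays the full WAN latency. A minimal model (the round-trip count and RTT values are illustrative assumptions) shows why the same page feels so much slower against a distant SaaS server:

```python
# Back-of-envelope model of HTTP "chattiness": time spent waiting on the
# wire grows linearly with round trips, so high WAN latency multiplies
# the pain. All numbers here are illustrative, not measurements.

def fetch_time_ms(round_trips: int, rtt_ms: float) -> float:
    """Wire-wait time for a page load that needs many request/response trips."""
    return round_trips * rtt_ms

lan = fetch_time_ms(round_trips=40, rtt_ms=2)     # server on the local LAN
wan = fetch_time_ms(round_trips=40, rtt_ms=80)    # SaaS server across the WAN

print(f"LAN: {lan:.0f} ms, WAN: {wan:.0f} ms")
```

Same page, same 40 round trips, yet the wait is 40 times longer purely because of latency, not bandwidth.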
Bigger Web Pages
Needless Redelivery
The real killer is the needless redelivery of the same Web content over and over again. Browser caches are comically small (typically 250MB) compared with both the volume of Web application and Website traffic and the colossal hard drives of today. Because the cache is so small, content is forced out by newer content too soon and must be redelivered.
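The eviction effect is easy to demonstrate. In this toy least-recently-used cache (the site names and object sizes are hypothetical), a handful of heavy Web apps churn through a 250MB cache so fast that nothing survives until the next visit:

```python
# Toy LRU cache showing why a small browser cache forces redelivery:
# once capacity is exceeded, the oldest objects are evicted and must be
# fetched again over the WAN on the next visit. Sizes are hypothetical.

from collections import OrderedDict

class TinyCache:
    def __init__(self, capacity_mb: int):
        self.capacity = capacity_mb
        self.used = 0
        self.items = OrderedDict()   # url -> size_mb, oldest entry first

    def fetch(self, url: str, size_mb: int) -> str:
        if url in self.items:
            self.items.move_to_end(url)          # refresh recency on a hit
            return "HIT"
        while self.used + size_mb > self.capacity and self.items:
            _, evicted_size = self.items.popitem(last=False)  # evict oldest
            self.used -= evicted_size
        self.items[url] = size_mb
        self.used += size_mb
        return "MISS"

cache = TinyCache(capacity_mb=250)
results = []
for day in range(2):                             # same three apps, two days
    for site in range(3):
        results.append(cache.fetch(f"app{site}.example.com/bundle.js", 100))

print(results.count("HIT"))                      # the cache never helps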
The Web Is Cacheable
The redelivery of content is unnecessary because most of it is static and cacheable. It's hard to believe, but 88 percent of the average Web page is static and can (and should) be cached and reused. Much of that content is invisible, such as Cascading Style Sheets and JavaScript files, so many users don't even realize it's there.
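Whether a response is reusable is signaled by standard HTTP caching headers. A minimal sketch of the decision a cache makes (real caches, per RFC 9111, handle many more directives and freshness rules than this) looks like:

```python
# Minimal sketch of how a shared cache decides whether a response may be
# stored and reused, based on common Cache-Control directives. This is a
# simplification of the rules in RFC 9111, not a complete implementation.

def is_cacheable(headers: dict) -> bool:
    cc = headers.get("Cache-Control", "").lower()
    if "no-store" in cc or "private" in cc:
        return False                              # explicitly uncacheable
    return "max-age" in cc or "public" in cc or "Expires" in headers

# Static assets like CSS and JavaScript typically ship headers like these:
assert is_cacheable({"Cache-Control": "public, max-age=31536000"})
assert not is_cacheable({"Cache-Control": "no-store"})
```

Static assets are usually served with long `max-age` lifetimes precisely so that the 88 percent of page content that never changes never has to cross the WAN twice.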
Voice and Video
Alternative Solutions
A Faster, Better Web
The impact of significantly improved Web caching would be profound. Caches that are large, per-user and per-site would provide the greatest possible benefit in reducing the needless redelivery of content, eliminating large amounts of WAN traffic. Latency would drop as well, by a factor of four to 15 depending on conditions, because delivering content from a local cache is much faster than fetching it over the network.
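A simple blended-latency model illustrates where a multiple in that four-to-15 range can come from. The hit ratio and latency figures below are assumptions chosen for illustration (the 88 percent hit ratio echoes the static-content figure above), not measurements:

```python
# Illustrative model of the claimed latency improvement: average response
# time is a blend of fast local cache hits and slow WAN fetches. The hit
# ratio and latency values are assumptions, not measured data.

def avg_latency_ms(hit_ratio: float, cache_ms: float, wan_ms: float) -> float:
    """Expected response time given the fraction of requests served from cache."""
    return hit_ratio * cache_ms + (1 - hit_ratio) * wan_ms

wan_only = avg_latency_ms(0.0, 5, 300)       # no cache: every request hits the WAN
with_cache = avg_latency_ms(0.88, 5, 300)    # 88% of content served locally

print(f"speedup: {wan_only / with_cache:.1f}x")
```

Under these assumptions the average response is roughly seven times faster; more aggressive hit ratios or slower WAN links push the multiple toward the top of the quoted range.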