Tiger Woods Makes a Comeback
Tiger's Return Expected to Create a Rush on the Masters.com Site
With Tiger back in the fold, the maintainers of the Masters.com Website expect a substantial increase in traffic. DynaTrace has analyzed some of the ways the Masters can prepare for the onslaught. Based on a series of analyses performed with its free AJAX Edition, there appear to be a number of issues that slow the overall performance of Masters.com.
What Are the Major Issues with Masters.com?
Excessive redirects delay the initial page load of Masters.com to more than 10 seconds. In addition, while many JavaScript files have been merged, their ~300KB payload is downloaded on each page request, further delaying load times. A bug in Microsoft's IE7 causes nearly 2,500 requests to the browser cache when the site checks the browser version. Cache controls without far-future expires headers force resources to be re-downloaded unnecessarily. Finally, the JavaScript is not optimized: the code that processes the XHR response with player data makes hundreds of redundant jQuery calls, and expensive CSS selectors add 1.3 seconds for the first three calls alone.
What Can Be Done? Step 1: Reduce Redirects to Speed the Initial Loading of Masters.com
DynaTrace observed a chain of three requests when loading Masters.com. The initial request takes 4.2 seconds (including DNS lookup, connect and server time) and redirects to www.masters.com. The second request, to www.masters.com, redirects to www.masters.com/index.html and takes 0.57 seconds. This redirect could be avoided by pointing the first redirect at that page directly. The third request finally reaches the actual index.html page and takes 2.4 seconds.
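As a rough illustration only (Masters.com's actual server stack is not known), a hypothetical Node.js/Express front end could send visitors of the bare domain straight to the final page in a single hop:

// Hypothetical Node.js/Express sketch, not Masters.com's real setup:
// redirect the bare domain straight to the final page in one hop instead of
// chaining masters.com -> www.masters.com -> www.masters.com/index.html.
var express = require('express');
var app = express();

app.get('/', function (req, res) {
  res.redirect(301, 'http://www.masters.com/index.html');
});

app.listen(80);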
Step 2: Avoid Merging JavaScript into the HTML Document
A general best practice is to merge JavaScript files into fewer files. That practice was only partly followed here. Instead of merging the code into fewer external JavaScript files, the code of all required JavaScript libraries was embedded in the HTML document of each page. This approach limits the number of roundtrips needed to download the code, but it has one major drawback: the files cannot be cached, because they are embedded in dynamically generated HTML. The big problem, even with fewer total roundtrips, is that every user must download ~300KB of HTML for every page request. In an era of high-speed Internet connections this might not seem like a problem, but not everybody has that kind of bandwidth. Moreover, if the servers are pounded with additional requests, network bandwidth to the servers could itself become a bottleneck.
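As a sketch of the alternative, again assuming a hypothetical Node.js/Express setup and a made-up bundle path, the merged libraries could be served as one external, cacheable file that every page references instead of being embedded in each HTML response:

// Hypothetical sketch: serve the merged JavaScript libraries as a single
// static, cacheable bundle rather than embedding ~300KB of script in every
// dynamically generated HTML page.
var express = require('express');
var app = express();

// The bundle lives at a fixed URL and may be cached by the browser for a year.
app.use('/static', express.static('public', { maxAge: '365d' }));

// Each generated page then needs only one line of markup, for example:
//   <script src="/static/js/masters-bundle.js"></script>

app.listen(80);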
Step 3: Use Far-Future Expires Headers for Better Caching
Masters.com does use cache controls. However, they are set to expire a mere 13 minutes after first caching. Since these images are unlikely to change frequently, many redundant roundtrips could be avoided, saving almost 7 seconds. To solve this problem, Masters.com should use far-future expires headers set well beyond 13 minutes. Unless there is a reason to re-download these files constantly (for example, because they change frequently), it is better to leverage the local browser cache and reduce download times.
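In the same hypothetical Express setup (the /images path is a made-up example), a far-future policy for rarely changing images could look like this sketch:

// Hypothetical sketch: replace the roughly 13-minute cache lifetime with a
// far-future policy for images that rarely change.
var express = require('express');
var app = express();

var ONE_YEAR_SECONDS = 60 * 60 * 24 * 365;

app.use('/images', function (req, res, next) {
  // Let the browser reuse its cached copy for a year instead of
  // re-requesting the image after 13 minutes.
  res.set('Cache-Control', 'public, max-age=' + ONE_YEAR_SECONDS);
  res.set('Expires', new Date(Date.now() + ONE_YEAR_SECONDS * 1000).toUTCString());
  next();
});
app.use('/images', express.static('images'));

app.listen(80);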
Step 4: Avoid Microsoft's IE7 Bug
Looking more closely at the Summary View of the session, the resource chart shows that 2,468 text resources were accessed from the tester's local browser cache, which is curious. Masters.com does a browser check by querying the User-Agent property of the Navigator object, a common practice for identifying the current browser version. Unfortunately, a bug in Microsoft's Internet Explorer 7 returns a wrong User-Agent when the User-Agent string is longer than 256 characters. As a result, the workaround code executes for every .png file on every single page, causing the HTC file to be requested each time.
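One way to limit the damage, sketched here with a hypothetical helper function and a made-up .htc path, is to evaluate the User-Agent once and reuse the cached result instead of re-checking it for every .png on the page:

// Hypothetical sketch: read and test the User-Agent a single time, cache the
// result, and reuse it for every .png fix-up instead of re-checking per resource.
var isIE7 = (function () {
  var ua = navigator.userAgent; // read the (possibly buggy) string once
  return /MSIE 7\./.test(ua);
})();

// Hypothetical helper and .htc path, shown only to illustrate the pattern.
function fixPngTransparency(img) {
  if (isIE7) {
    // Apply the IE7-only workaround; all other browsers skip this entirely.
    img.style.behavior = 'url(/css/iepngfix.htc)';
  }
}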
Step 5: Optimize JavaScript and DOM Access
Data on the players is downloaded via an XHR call that returns an XML document containing a record for each player, including first name, last name and country. This XML is then processed, and a JavaScript object is created for each player. For each of the 99 players returned by the XHR call, the code in the each loop calls $(this) repeatedly to get a jQuery object for the current XML record. One call per record would be enough: simply store the result in a variable and reuse that variable, saving several hundred calls to this method.
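A minimal sketch of the pattern (the variable names and XML element names below are made up for illustration, and jQuery is assumed to be loaded) wraps the current record once per iteration and reuses the variable instead of calling $(this) for every field:

// Hypothetical sketch: build the jQuery wrapper for the current XML record
// once per player and reuse it, instead of calling $(this) for every field.
// xmlDoc stands for the XML document returned by the XHR call.
var players = [];

$(xmlDoc).find('player').each(function () {
  var $player = $(this); // one call per record instead of several

  players.push({
    firstName: $player.find('firstName').text(),
    lastName:  $player.find('lastName').text(),
    country:   $player.find('country').text()
  });
});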
A Better User Experience for Golf Fans
We'll all be watching as Tiger Woods makes his return to the links at Augusta. With the potential for a large spike in online traffic, Masters.com could suffer serious performance problems as its servers are bombarded with more requests than necessary. Following best practices such as minifying content, leveraging the browser's cache and optimizing AJAX/JavaScript execution will greatly improve Website performance and lead to happier users.