Sunday, March 14, 2010

Ajax and jQuery in the Enterprise, is it such a good idea?

For the past six months I’ve been working on a web application for a business with around 800 employees. The employees are geographically spread out and access most of their applications by logging onto a Citrix server.

Now, tracking back to the start of the project: I didn’t know there was a Citrix environment and started out developing the application screens in a traditional way. It’s a read-only application and all actions can be performed with URL parameters. Therefore every action on the page navigates to a new URL, using GET.

In a previous project I had been playing around with jQuery and Ajax, and figured I could beef up the user experience by tapping into behind-the-scenes calls and DOM manipulation. Less refresh and flicker, and updating only the relevant parts of the UI, generally gives a better user experience.

Since all action points were links, it was fairly easy to Ajax’ify them with jQuery.

$("a").each(function() {
    $(this).click(function(event) {
        event.preventDefault();
        navigate($(this).attr("href"));
    });
});


The Ajax function to execute the calls is also simple. It retrieves the #container element of the retrieved page (“code”) and inserts it into the page viewed by the user. I would define this as “poor man’s Ajax”, but it required very little work.


function navigate(navurl) {
    $.ajax({
        url: navurl,
        dataType: "html",
        success: function(code) {
            $("#container").html($('div #container', code).html());
        }
    });
    return false;
}


In my development and test environments this worked like a charm, and even in production - when I accessed it from my laptop.

Then came the problems: test users were accessing it from a browser within the company Citrix environment.

A sub-second action on my machine suddenly took anywhere from 5 to 40 seconds on Citrix. Investigating the matter showed that the Citrix server was running at around 80%+ CPU most of the time. The Ajax call itself executes fairly fast, but the line

$("#container").html($('div #container', code).html())

performed really slowly. What it does is load the returned HTML into the DOM, traverse it to fetch the HTML from the #container element, and then find the #container element on the current page and replace its HTML. This uses a fair amount of CPU. On a stand-alone machine this is not an issue, but on a loaded Citrix server it is.

So there were two options: scrap the Ajax calls, or try to fix it. Being stubborn by nature, I went for the fix.

The fix was fairly easy. I cached a reference to the current page’s #container element in a global variable, and replaced the DOM search of the returned page with placeholder comments and good old-fashioned substring.


var contentContainer = $("#container");
function navigate(navurl) {
    $.ajax({
        url: navurl,
        dataType: "html",
        success: function(code) {
            var start = code.indexOf("<!-- cStart -->");
            var end = code.indexOf("<!-- cEnd -->");
            var html = code.substring(start, end);
            contentContainer.html(html);
        }
    });
    return false;
}


In effect I removed the two DOM traversals, and as expected indexOf and substring perform fast.
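The extraction step can be sketched as a stand-alone function. The sample page and marker names below are illustrative, and I’ve made one small adjustment: skipping past the start marker itself so the comment is not re-inserted into the page.

var startMarker = "<!-- cStart -->";
var endMarker = "<!-- cEnd -->";

function extractContent(code) {
    // Skip past the start marker so the comment itself is excluded.
    var start = code.indexOf(startMarker) + startMarker.length;
    var end = code.indexOf(endMarker);
    return code.substring(start, end);
}

// Hypothetical server response wrapping the content in marker comments.
var page = "<html><body><div id=\"container\">" +
           "<!-- cStart --><p>Report data</p><!-- cEnd -->" +
           "</div></body></html>";

console.log(extractContent(page)); // → <p>Report data</p>

Two indexOf calls and one substring are cheap string operations, regardless of how big the rest of the returned page is, which is why this holds up even on a starved CPU.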

This shows that an application might behave very differently in an enterprise environment, since there are many factors to consider. Doing initial research to see how many resources are available to your application is a must for choosing the right strategy. This is equally true for desktop applications: how many colors are available, and can the graphics card handle WPF transitions?

3 comments:

  1. Doing initial research is useful for every project. Your way of using "poor man's Ajax" is probably the fastest way to handle this on the client side, much faster than sending a minimal amount of bytes over the wire in JSON or XML and parsing that on the client side. It doesn't take a lot of time to change all the Ajax calls in the codebase with your fix.

    Sometimes it's cheaper for the company that hires you to upgrade their hardware/software than to pay you to write a lot of performance optimizations. When the difference between an average machine and a Citrix machine is a factor of 50-100 as you describe, you might have more client-side performance problems in the future. It might be useful to investigate the costs of an upgrade.

    I'm currently working on an intranet application with tens of thousands of lines of JavaScript. The company was using IE6 on most of their machines. I would have had to write a lot of code to make it behave the same in IE6 and in other browsers. I estimated the amount of time to write the IE6-specific code, and asked them to calculate the cost of upgrading IE on all of their machines. Upgrading was estimated to be much cheaper.

  2. @Anonymous: I could most likely speed it up further by doing proper Ajax with JSON and retrieving only the content I actually need. But as you say, the cost of upgrading the hardware might be cheaper in the long run.

    I'm currently inquiring on the upgrade path they have, and I know they will switch to IE8 over the summer. Hopefully the IE8 switch is related to a Citrix farm upgrade.
