Wednesday, March 31, 2010

SharePoint 2010: Search-Driven Portals


On April 7th, 2010 (1pm Pacific Time) I will be presenting a Microsoft TechNet webcast together with a colleague. The topic at hand: how can we use search engines as a source of content for internet-facing portals?

TechNet Webcast: SharePoint 2010: Search-Driven Portals (Level 200)

Websites today are often statically authored, meaning the components on the page have been put there by a person, and they show the same content to all visitors.

What if you could better leverage the user's context and intent together with the features of an enterprise search engine, and then package this in reusable business components to deliver better content to the end user?

In this webcast we will propose FAST together with SharePoint as a starting platform to achieve this. By encapsulating your business logic in workflows you can reuse logic and content across pages in a more dynamic way. Think of search as a more intelligent CMS tool, and define search as lookups against whatever source is necessary to achieve a task, not just queries against the search engine.

Sunday, March 14, 2010

Ajax and jQuery in the Enterprise, is it such a good idea?

For the past six months I've been working on a web application for a business with around 800 employees. The employees are geographically spread out and access most of their applications by logging onto a Citrix server.

Tracking back to the start of the project: I didn't know there was a Citrix environment and started out developing the application screens in a traditional way. It's a read-only application and all actions can be performed with URL parameters, so every action on a page navigates to a new URL using GET, along the lines of the example below.
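
The URLs here are hypothetical (the real application used different pages and parameters), but the action links were plain anchors carrying all their state in the query string:

<a href="orders.aspx?page=2">Next page</a>
<a href="orders.aspx?sort=date&amp;dir=desc">Sort by date</a>
<a href="orders.aspx?filter=open">Show open orders</a>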

In a previous project I had been playing around with jQuery and ajax, and figured I could beef up the user experience by tapping into behind-the-scenes calls and DOM manipulation. Less refresh and flicker, and updating only the relevant parts of the UI, generally gives a better user experience.

Since all action points were links, it was fairly easy to ajax'ify them with jQuery.

$("A").each(function() {
$(this).click(
function(event) {
event.preventDefault();
navigate($(this).attr("href"));
});
}
);


The ajax function that executes the calls is also simple. It retrieves the #container element of the fetched page ("code") and inserts its content into the page viewed by the user. I would define this as "poor man's ajax", but it required very little work.


function navigate(navurl) {
    $.ajax({
        url: navurl,
        dataType: "html",
        success: function(code) {
            // Parse the returned html, grab the content of its
            // #container element and swap it into the current page
            $("#container").html($('div #container', code).html());
        }
    });
    return false;
}
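
One caveat, assuming some of the action links live inside #container: replacing the container's html discards their bound click handlers, so links loaded via ajax would fall back to full page loads. A sketch of a workaround using jQuery's live() (available since 1.3), which matches elements as events bubble up and therefore also covers links inserted later:

// live() binds at the document level, so links added by
// later ajax calls get the same behavior without re-binding
$("a").live("click", function(event) {
    event.preventDefault();
    navigate($(this).attr("href"));
});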


In my development and test environments this worked like a charm, and even in production when I accessed it from my laptop.

Then came the problems: test users were accessing it from a browser within the company Citrix environment.

A sub-second action on my machine suddenly took anywhere from 5 to 40 seconds on Citrix. Investigating the matter showed that the Citrix server was running at 80% CPU or more most of the time. The ajax call itself executes fairly fast, but the line

$("#container").html($('div #container', code).html())

performed really slowly. What it does is parse the returned html into a DOM, traverse it to fetch the html of the #container element, then find the #container element on the current page and replace its html. This uses a fair amount of CPU. On a stand-alone machine that is not an issue, but on a loaded Citrix server it is.
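
A quick way to verify where the time goes is to time the network call and the DOM work separately. This is only a sketch of such a measurement, not instrumentation from the actual project:

// Rough timing: separate the request latency from the DOM manipulation
function navigateTimed(navurl) {
    var sent = new Date().getTime();
    $.ajax({
        url: navurl,
        dataType: "html",
        success: function(code) {
            var received = new Date().getTime();
            $("#container").html($('div #container', code).html());
            var done = new Date().getTime();
            alert("network: " + (received - sent) + " ms, DOM work: " + (done - received) + " ms");
        }
    });
}

In this case it would have shown the DOM work, not the network, dominating.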

So there were two options: scrap the ajax calls, or try to fix it. Being stubborn by nature, I went for the fix.

The fix was fairly easy. I cached a reference to the current page's #container element in a global variable, and replaced the DOM search of the returned page with placeholders and good old-fashioned substring.
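
For this to work the server has to emit marker comments around the swappable content. A minimal sketch of the markup, assuming the markers sit just inside the container element on every page:

<div id="container">
<!-- cStart -->
    ...the content that gets swapped on each navigation...
<!-- cEnd -->
</div>

Since the markers are html comments they render nothing, so they cost nothing to leave in the page.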


// Cache the container element once; the element itself survives
// html() replacements, so the reference stays valid
var contentContainer = $("#container");

function navigate(navurl) {
    $.ajax({
        url: navurl,
        dataType: "html",
        success: function(code) {
            // Locate the markers with plain string search instead
            // of parsing the whole response into a DOM
            var start = code.indexOf("<!-- cStart -->");
            var end = code.indexOf("<!-- cEnd -->");
            // The slice includes the cStart comment itself, which
            // is harmless since comments render nothing
            var html = code.substring(start, end);
            contentContainer.html(html);
        }
    });
    return false;
}


In effect I removed the two DOM traversals, and as expected indexOf and substring perform fast.

This shows that an application might behave very differently in an enterprise environment, since there are many factors to consider. Doing initial research into how many resources are available to your application is a must for choosing the right strategy. This is equally true for desktop applications: How many colors are available? Can the graphics card handle WPF transitions?