
Wednesday, April 16, 2014

Explaining SQL alias to a dummy

“If I wear a Donald Duck mask you think you are talking to Donald Duck. Then I give the mask to someone else, and you still think you are talking to Donald Duck”

[image]

“As long as you’re talking to Donald Duck, you get Donald Duck, which is all that matters”
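In SQL terms, the mask is the alias: whatever table wears it answers to that name, and the rest of the query only ever talks to the mask. A minimal sketch (table and column names are made up for illustration):

```sql
-- The alias "d" is the Donald Duck mask: here Employees wears it.
SELECT d.Name
FROM Employees AS d
WHERE d.Name = 'Donald';

-- Hand the mask to another table, and the rest of the query
-- still thinks it's talking to "d" -- which is all that matters.
SELECT d.Name
FROM Contractors AS d
WHERE d.Name = 'Donald';
```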

Tuesday, December 3, 2013

How to fix search schema import with query rules which use dictionary lookups

Note: This post proposes a solution which will leave your farm in an unsupported state, and it should only be applied under Microsoft Support approved guidance. A bug has been filed for the issue, and I expect a fix will be provided in a later CU. Another fix is to re-create your Service Application and get a fresh database.

I’m doing a search project where we have a bunch of query rules defined. Some are promoted results and some change what is displayed and the sort order. For some of the rules we use trigger terms from a term store, which works just fine.

The solution is created on a dev farm and then the search configuration from the search site is exported and moved to the production farm. So far so good.

Importing the search configuration in production works just fine, but when you try to access the query rule page (http://intranet/sites/search/_layouts/15/listqueryrules.aspx?level=sitecol) you get the following error:

[image: error screenshot]

Tuesday, April 6, 2010

Compiling Linq to SQL the Lazy Way

In the March issue of MSDN Magazine there was an article about precompiling Linq queries in order to optimize query speed for queries that are executed numerous times.

This was perfect for the current project I’m working with, and I set out to change my code which originally looked like this:

string Original(int refId)
{
    var query = DbContext.Notes
        .Where(note => note.CaseId == refId)
        .Select(note => note.Text);
    return string.Join(";", query);
}

Creating a static compiled query along the lines of the article changed the code to this:

private static Func<DataContext, int, IEnumerable<string>> _compiledQuery;

private Func<DataContext, int, IEnumerable<string>> GetQuery()
{
    if (_compiledQuery == null)
    {
        _compiledQuery = CompiledQuery.Compile((DataContext db, int refId) =>
            db.Notes
                .Where(note => note.CaseId == refId)
                .Select(note => note.Text));
    }
    return _compiledQuery;
}

string Compiled(int refId)
{
    var query = GetQuery().Invoke(DbContext, refId);
    return string.Join(";", query);
}

This is the usual pattern: check whether the query has been compiled, and compile it if it hasn't. What I don't like about this approach, now that I'm a .Net 4.0 guy, is that it isn't thread safe: if two threads access it at the same time, the query might get compiled twice. Putting double-checked locking in there would also cloud readability.
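To see what I mean by clouded readability, here is a sketch of what the standard double-checked locking version would look like (not code I actually shipped, just the textbook pattern applied to the query above):

```csharp
private static Func<DataContext, int, IEnumerable<string>> _compiledQuery;
private static readonly object _queryLock = new object();

private Func<DataContext, int, IEnumerable<string>> GetQuerySafe()
{
    // First check avoids taking the lock once the query exists.
    if (_compiledQuery == null)
    {
        lock (_queryLock)
        {
            // Second check: another thread may have compiled the
            // query while we were waiting for the lock.
            if (_compiledQuery == null)
            {
                _compiledQuery = CompiledQuery.Compile((DataContext db, int refId) =>
                    db.Notes
                        .Where(note => note.CaseId == refId)
                        .Select(note => note.Text));
            }
        }
    }
    return _compiledQuery;
}
```

Correct, but the locking ceremony drowns out the one line that actually does something.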

Certainly no big issue, but since we now have the wonderful Lazy&lt;T&gt; type, we can write the code like this instead:

private static readonly Lazy<Func<DataContext, int, IEnumerable<string>>> NotesQuery =
    new Lazy<Func<DataContext, int, IEnumerable<string>>>(
        () => CompiledQuery.Compile((DataContext db, int refId) =>
            db.Notes
                .Where(note => note.CaseId == refId)
                .Select(note => note.Text)));

string Lazy(int refId)
{
    var query = NotesQuery.Value.Invoke(DbContext, refId);
    return string.Join(";", query);
}

Not as clean as the first version, but certainly less messy than the intermediate one. Using Lazy&lt;T&gt; for shared instances is a good way to ensure they are created exactly once and to avoid threading issues. And if you never call it, which could be the case for a function in a general business layer, you never pay for compiling it.

If we could hide some of the signature it would look and read even better.

Sunday, October 11, 2009

My first Azure project in the cloud

I finally got around to testing a small project in the cloud and it went much smoother than I anticipated.

As the Unix server we used to run pornolize.com on is currently down, I decided to port the Perl code to .Net. It always helps to have a concrete project when learning something new. For those unfamiliar with The Pornolizer: it's basically a web page translation service like Google Translate, except it substitutes words with dirty ones. And yes, I know it's childish :)

The project consists of a web role which serves up the start page (I decided on a new layout as well while I was at it).

[image: pornolize.com screenshot]

When you click the “Translate” button, the request is picked up by a protocol handler. The handler downloads the page you want translated, runs the translation and serves the result to the user. Before the handler ends its response, it puts a log message in a queue. The queue entry is picked up by a worker role, which inserts it into SQL Azure. I could have used table storage, but since I had a token for SQL Azure I decided to give it a go (and it made it very simple to use Linq to SQL). Just a change of the connection string and it was up and running.
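The handler-to-worker hop boils down to a few lines against the storage queue API. A rough sketch using the StorageClient library of that era (queue name, message format and the SaveLogEntry helper are all made up for illustration):

```csharp
// Handler side: fire-and-forget a log message onto the queue
// so the response isn't held up by the database write.
var account = CloudStorageAccount.FromConfigurationSetting("DataConnectionString");
var queue = account.CreateCloudQueueClient().GetQueueReference("translationlog");
queue.CreateIfNotExist();
queue.AddMessage(new CloudQueueMessage(url + "|" + DateTime.UtcNow));

// Worker role side: poll the queue and persist each entry to SQL Azure.
var msg = queue.GetMessage();
if (msg != null)
{
    SaveLogEntry(msg.AsString);  // hypothetical Linq to SQL insert
    queue.DeleteMessage(msg);
}
```

The nice property is that the handler only pays for one queue write; everything slow happens in the worker role on its own schedule.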

Initially I thought about having the worker role do the downloading and parsing, but since I wanted low latency I dropped that idea and put in the logging instead, in order to explore using a queue and SQL Azure.

One of my better weekend projects for a long time – some code cleanup and refactoring and I can move on to something else.