Wednesday, December 23, 2009

Customizing the disabled look of a RichTextBox in WinForms

When you set a control's .Enabled property to false, it is rendered with the operating system's disabled color. Sometimes you might just want a different look.

To accomplish this in WinForms you need a small workaround; otherwise you end up with a blinking cursor in the control. Here's an example in the form of an extension method.

public static class MyExtensions
{
    public static void Disable(this Control control, Control focusTarget)
    {
        control.TabStop = false;
        control.BackColor = Color.DimGray;
        control.Cursor = Cursors.Arrow;
        control.Enter += delegate { focusTarget.Focus(); };
    }
}

In order for this to work you need to pass in another control which can receive focus. This could be a label or some other control on the form.

You would also need to create a matching Enable method with logic to remember the previous background color and restore the correct Cursor state.
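A matching pair could look something like the sketch below; the saved-color dictionary, the Enable method, and the cursor choice are my own additions, not part of the original workaround:

```csharp
using System.Collections.Generic;
using System.Drawing;
using System.Windows.Forms;

public static class MyExtensions
{
    // Remembers each control's original background so Enable can restore it.
    private static readonly Dictionary<Control, Color> _savedColors =
        new Dictionary<Control, Color>();

    public static void Disable(this Control control, Control focusTarget)
    {
        _savedColors[control] = control.BackColor;
        control.TabStop = false;
        control.BackColor = Color.DimGray;
        control.Cursor = Cursors.Arrow;
        control.Enter += delegate { focusTarget.Focus(); };
    }

    public static void Enable(this Control control)
    {
        Color original;
        if (_savedColors.TryGetValue(control, out original))
        {
            control.BackColor = original;
            _savedColors.Remove(control);
        }
        control.TabStop = true;
        control.Cursor = Cursors.IBeam; // assumes a text-editing control
    }
}
```

Note that this sketch never detaches the Enter handler; a complete implementation would store the delegate so Enable could remove it again.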

Monday, December 21, 2009

Going the SSD route

(Before and after benchmarks are at the bottom of the post)

[Edit] I’m no longer using the Intel AHCI drivers as they don’t support TRIM

After installing new VMs on my Latitude E6400, where the programs to be installed were ISOs mounted on the same physical drive, I decided to go the SSD route. That way I would not only end up with a blazing fast drive, but with two drives. My mind was set and the journey began.

After some research I decided on the OCZ Vertex 256 GB drive. I debated with myself for a long time about getting an Intel X25-M 160 GB instead, but the size of the OCZ won me over. I should still be able to experience the wow factor of an SSD.

Together with the drive I ordered an Icy Box SATA dock in order to clone the drive over eSATA, which I can afterwards use for the old drive (a 250 GB WD Scorpio).

Before I go into all the issues I encountered, I recommend that non-geeks do a clean install. It's most likely faster (unless you know all the steps beforehand) and will spare you a lot of grief. Personally I prefer a challenge, as it involves learning.

These are the steps I did in order for it to work.

  1. Disabled BitLocker on the drive.
  2. Followed this post on how to change the SATA mode from IRRT to AHCI, as IRRT didn’t seem to work for the OCZ (I ended up with a blinking cursor and a freeze during boot). The guide works for Windows 7 as well, but I downloaded the latest Intel drivers instead, v8.9. No reason to run old drivers.
  3. Downloaded g4l to clone the drive (Acronis True Image 2010 failed to convert my Windows 7 partition), which I put on a USB stick with UNetbootin.
  4. Plugged the OCZ into the Icy dock and booted up g4l. (F12 brings up a boot menu on the Dell which lets you choose the boot device.)
  5. At the shell prompt I ran:
    dd if=/dev/sda of=/dev/sdb bs=1M
    This ran for approx. 74 minutes with an average transfer rate of 56.5 MB/s.
  6. Next I switched out the old drive with the new one and booted where I left off.
  7. Once it had booted I followed the “SSD Windows 7 Tweaks” guide on how to optimize Windows 7 for SSDs.

I started down the path of partition alignment, but the only misaligned partition in the Dell layout is the Dell recovery partition, which I never use. The Recovery and Data partitions are both aligned just fine with a 64-sector offset.

Benchmarks

Action                Before (seconds)   After (seconds)
Startup (login box)          33                 25
Startup (Live Mesh)         130                 60
Shutdown                     25                 18
Hibernate                    41                 29
Resume                       27                 27

“Live Mesh” marks the point when everything is loaded and the Live Mesh login box pops up.

I gained the most on startup, as it involves a lot of random reads/writes when loading all the programs. But the general snappiness of the OS is much, much better. All in all, absolutely worth the upgrade.

My Windows 7 disk score went up from 5.6 to 7.2 (where 7.9 is the max).

The disk came with firmware version 1.4.

Tuesday, December 15, 2009

Filling an array with a default value

After following the discussion on Stack Overflow about how to initialize a byte array with a value, I decided to do some benchmarking for fun.

[Update 2014-09-13]
As the question on SO has evolved, I have now included both PInvoke and Memset methods in my tests. The most interesting observation is that the Memset method performs excellently on 64-bit, but poorly on 32-bit. If you are compiling for 32-bit, go with unsafe or PInvoke; if you are running 64-bit, the Memset delegate is the way to go.

Here are the results, and the code follows below (run on Windows 7 64bit dual core).

Array length: 1048576
Iterations: 1000
Enumerable: 00:00:12.3817249
Parallel Enumerable: 00:00:17.6506353
For loop: 00:00:01.7095341
Parallel for loop: 00:00:06.9824434
Unsafe: 00:00:00.7028914


Here are the results running on Windows 8.1, 64-bit i5 processor (Lenovo T420s) with .NET 4.5.1.

Array Length: 1048576
32bit execution - 5000 iterations

EnumerableFill: 00:00:50.1071043
ParallelEnumerableFill: 00:01:12.2980480
ForLoopFill: 00:00:05.3504656
ParallellForLoopFill: 00:00:45.5518340
UnsafeFill: 00:00:02.2804084
MemsetFill: 00:00:03.9383964
PInvokeFill: 00:00:02.4391258

32bit execution - 10000 iterations
UnsafeFill: 00:00:04.1653674
MemsetFill: 00:00:07.2561020
PInvokeFill: 00:00:04.2709875

64bit execution - 10000 iterations

UnsafeFill: 00:00:03.9618905
MemsetFill: 00:00:03.5594970
PInvokeFill: 00:00:03.8012791

using System;
using System.Diagnostics;
using System.Linq;
using System.Reflection;
using System.Reflection.Emit;
using System.Runtime.InteropServices;
using System.Threading.Tasks;

namespace FillArrayBenchmark
{
    internal class Program
    {
        private static readonly Action<IntPtr, byte, int> MemsetDelegate;
        private static int _arrLength = 1048576;

        static Program()
        {
            var dynamicMethod = new DynamicMethod("Memset", MethodAttributes.Public | MethodAttributes.Static,
                CallingConventions.Standard,
                null, new[] {typeof (IntPtr), typeof (byte), typeof (int)}, typeof (Util), true);

            ILGenerator generator = dynamicMethod.GetILGenerator();
            generator.Emit(OpCodes.Ldarg_0);
            generator.Emit(OpCodes.Ldarg_1);
            generator.Emit(OpCodes.Ldarg_2);
            generator.Emit(OpCodes.Initblk);
            generator.Emit(OpCodes.Ret);

            MemsetDelegate =
                (Action<IntPtr, byte, int>) dynamicMethod.CreateDelegate(typeof (Action<IntPtr, byte, int>));
        }

        private static void Main(string[] args)
        {
            EnumerableFill(12);
            ParallelEnumerableFill(12);
            ForLoopFill(12);
            ParallellForLoopFill(12);
            UnsafeFill(12);
            MemsetFill(12);
            PInvokeFill(12);

            int iteration = 10000;
            Stopwatch sw;
            byte b = 129;
            sw = Stopwatch.StartNew();
            for (int i = 0; i < iteration; i++)
            {
                EnumerableFill(b);
            }
            sw.Stop();
            Console.WriteLine("EnumerableFill: " + sw.Elapsed);
            sw = Stopwatch.StartNew();
            for (int i = 0; i < iteration; i++)
            {
                ParallelEnumerableFill(b);
            }
            sw.Stop();
            Console.WriteLine("ParallelEnumerableFill: " + sw.Elapsed);
            sw = Stopwatch.StartNew();
            for (int i = 0; i < iteration; i++)
            {
                ForLoopFill(b);
            }
            sw.Stop();
            Console.WriteLine("ForLoopFill: " + sw.Elapsed);
            sw = Stopwatch.StartNew();
            for (int i = 0; i < iteration; i++)
            {
                ParallellForLoopFill(b);
            }
            sw.Stop();
            Console.WriteLine("ParallellForLoopFill: " + sw.Elapsed);
            sw = Stopwatch.StartNew();
            for (int i = 0; i < iteration; i++)
            {
                UnsafeFill(b);
            }
            sw.Stop();
            Console.WriteLine("UnsafeFill: " + sw.Elapsed);
            sw = Stopwatch.StartNew();
            for (int i = 0; i < iteration; i++)
            {
                MemsetFill(b);
            }
            sw.Stop();
            Console.WriteLine("MemsetFill: " + sw.Elapsed);
            sw = Stopwatch.StartNew();
            for (int i = 0; i < iteration; i++)
            {
                PInvokeFill(b);
            }
            sw.Stop();
            Console.WriteLine("PInvokeFill: " + sw.Elapsed);
        }

        private static void EnumerableFill(byte value)
        {
            byte[] a = Enumerable.Repeat(value, _arrLength).ToArray();
        }

        private static void ParallelEnumerableFill(byte value)
        {
            byte[] a = ParallelEnumerable.Repeat(value, _arrLength).ToArray();
        }

        private static byte[] ForLoopFill(byte value)
        {
            var a = new byte[_arrLength];
            for (int i = 0; i < _arrLength; i++)
            {
                a[i] = value;
            }
            return a;
        }

        private static byte[] ParallellForLoopFill(byte value)
        {
            var a = new byte[_arrLength];
            Parallel.For(0, _arrLength, i => { a[i] = value; });
            return a;
        }

        private static unsafe byte[] UnsafeFill(byte value)
        {
            Int64 fillValue = BitConverter.ToInt64(new[] {value, value, value, value, value, value, value, value}, 0);

            var a = new byte[_arrLength];
            Int64* src = &fillValue;
            fixed (byte* ptr = &a[0])
            {
                var dest = (Int64*) ptr;
                int length = _arrLength;
                while (length >= 8)
                {
                    *dest = *src;
                    dest++;
                    length -= 8;
                }
                var bDest = (byte*) dest;
                for (byte i = 0; i < length; i++)
                {
                    *bDest = value;
                    bDest++;
                }
            }
            return a;
        }

        public static byte[] MemsetFill(byte value)
        {
            var a = new byte[_arrLength];
            GCHandle gcHandle = GCHandle.Alloc(a, GCHandleType.Pinned);
            MemsetDelegate(gcHandle.AddrOfPinnedObject(), value, _arrLength);
            gcHandle.Free();
            return a;
        }

        private static byte[] PInvokeFill(byte value)
        {
            var arr = new byte[_arrLength];
            GCHandle gch = GCHandle.Alloc(arr, GCHandleType.Pinned);
            MemSet(gch.AddrOfPinnedObject(), value, _arrLength);
            gch.Free();
            return arr;
        }

        [DllImport("msvcrt.dll",
            EntryPoint = "memset",
            CallingConvention = CallingConvention.Cdecl,
            SetLastError = false)]
        public static extern IntPtr MemSet(IntPtr dest, int value, int count);
    }

    public static class Util
    {
        private static readonly Action<IntPtr, byte, int> MemsetDelegate;

        static Util()
        {
            var dynamicMethod = new DynamicMethod("Memset", MethodAttributes.Public | MethodAttributes.Static,
                CallingConventions.Standard,
                null, new[] {typeof (IntPtr), typeof (byte), typeof (int)}, typeof (Util), true);

            ILGenerator generator = dynamicMethod.GetILGenerator();
            generator.Emit(OpCodes.Ldarg_0);
            generator.Emit(OpCodes.Ldarg_1);
            generator.Emit(OpCodes.Ldarg_2);
            generator.Emit(OpCodes.Initblk);
            generator.Emit(OpCodes.Ret);

            MemsetDelegate =
                (Action<IntPtr, byte, int>) dynamicMethod.CreateDelegate(typeof (Action<IntPtr, byte, int>));
        }

        public static void Memset(byte[] array, byte what, int length)
        {
            GCHandle gcHandle = GCHandle.Alloc(array, GCHandleType.Pinned);
            MemsetDelegate(gcHandle.AddrOfPinnedObject(), what, length);
            gcHandle.Free();
        }
    }
}

Thursday, November 26, 2009

Code dojo

I attended a code dojo in Oslo today. This was my second dojo and a fun experience. We worked in four groups: two Python, one Java, and one C#.

The task at hand was to create a scoring program for tennis. Simple enough to do in two hours, yet complex enough to spark good discussions. And it was interesting to see how some did top-down design and others bottom-up.

Coding in groups TDD-style is a great experience which I recommend to anyone who hasn't tried it. The group dynamic is very different from coding by yourself, and it broadens your problem-solving skills.

Looking forward to the next dojo!

Wednesday, November 18, 2009

Upgrade from 32bit Office 2007 to 64bit Office 2010

I downloaded the beta of Office 2010, but wanted to go with the 64-bit version, since one exists. 64-bit Windows 7 deserves a 64-bit Office.

My main concern, since an upgrade from 32-bit to 64-bit is not possible, was that my Outlook account wouldn't be kept. To make sure I had a backup of it, I exported the profile from the registry:

HKEY_CURRENT_USER\Software\Microsoft\Windows NT\CurrentVersion\Windows Messaging Subsystem\Profiles\Outlook

I didn’t bother backing up any files, since everything is on the Exchange server.

My next step was to uninstall Office 2007, reboot, and start the Office 2010 installation. After another reboot I started Outlook and it all worked. The registry key had not been removed when uninstalling 2007, so Outlook 2010 detected the profile and used it without any work on my part.

The conclusion is that you can just uninstall 2007, install 2010, and keep your Outlook settings. But it never hurts to back up your Outlook registry key just to be sure.

I had trouble activating but found a solution at http://blog.hznet.nl/2009/11/having-troubles-activating-office-2010-beta/.

Thursday, November 12, 2009

Disk based data structures

Last year I created a project where I used memory-mapped files as storage for a large array. I've now polished the project a bit and included generic List and Dictionary implementations as well. The project can be found at Disk Based Data Structures - CodePlex.

I've also created a serializer project which benchmarks the available serialization methods and picks the fastest one for your type. This serializer is used to persist the data to disk. The classes are also implemented thread-safe.

Background for the project

A disk-based version of an array would require a lot of caching logic to perform fast enough compared to a pure memory implementation. A couple of years ago I stumbled across memory-mapped files, which have long existed in operating systems and are typically used for the swap space.

The first time I worked with memory-mapped files I used a library from MetalWrench, but this time around I got hold of Winterdom's much nicer implementation of the Win32 API. I've included the patch from Steve Simpson, but removed the dynamic paging since it slows things down and isn't necessary on 64-bit systems. (If you want arrays holding over 2 GB of data on 32-bit systems, I recommend reverting to Steve's original version and setting a view size of 200-500 MB.) Future releases will use .NET 4.0's System.IO.MemoryMappedFiles namespace.
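As a taste of what that migration looks like, here is a minimal sketch of my own (file name, map name, and sizes are invented examples) using the .NET 4.0 namespace:

```csharp
using System;
using System.IO;
using System.IO.MemoryMappedFiles;

class MmfDemo
{
    static void Main()
    {
        const long count = 1000000; // one million doubles, an 8 MB backing file

        // Create a file-backed mapping large enough for 'count' doubles.
        using (MemoryMappedFile mmf = MemoryMappedFile.CreateFromFile(
            "large-array.bin", FileMode.Create, "bigArray", count * sizeof(double)))
        using (MemoryMappedViewAccessor accessor = mmf.CreateViewAccessor())
        {
            accessor.Write(42 * sizeof(double), 3.14);               // arr[42] = 3.14
            double value = accessor.ReadDouble(42 * sizeof(double)); // read it back
            Console.WriteLine(value);
        }
    }
}
```

Each CreateViewAccessor call gives an independent view of the mapping, which is what makes a one-view-per-thread design possible.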

The beauty of 64-bit is that you have a virtually unlimited address space, so each thread can get its own view of the mapped file without running out of address space. 32-bit Windows can only address 4 GB.

As for performance, my theory is that Microsoft has implemented a fairly good caching algorithm for its swap file, so it should prove good enough for me. A few tests show much better disk IO with the memory-mapped API than with .NET's file IO library. I haven't tested the performance with the SEC_LARGE_PAGES flag, but it might help some.

Hope this library is useful for someone out there :)

Sunday, October 11, 2009

My first Azure project in the cloud

I finally got around to testing a small project in the cloud and it went much smoother than I anticipated.

As the Unix server we used to run pornolize.com on is currently down, I decided to port the Perl code to .NET. It always helps to have a concrete project when learning something new. For those unfamiliar with The Pornolizer: it's basically a web page translation service like Google Translate, except it substitutes words with dirty ones. And yes, I know it's childish :)

The project consists of a web role which serves up the start page (I decided on a new layout as well while I was at it).


When the “Translate” button is clicked, the request is picked up by a protocol handler. The handler downloads the page you want to translate, runs the translation, and serves it to the user. Before the handler ends its Response, it puts a log message in a queue. This queue entry is picked up by a worker role, which inserts it into SQL Azure. I could have used table storage, but since I had a token for SQL Azure I decided to give it a go (and it made it very simple to use LINQ to SQL). Just a change of the connection string and it was up and running.
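The queue hand-off between the handler and the worker role can be sketched roughly like this with the StorageClient sample library of the time; the queue name, helper class, and message format are invented examples of my own:

```csharp
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

public static class LogQueue
{
    // In the protocol handler: enqueue a log entry before ending the Response.
    public static void Enqueue(string logLine)
    {
        CloudStorageAccount account = CloudStorageAccount.DevelopmentStorageAccount;
        CloudQueue queue = account.CreateCloudQueueClient().GetQueueReference("logqueue");
        queue.CreateIfNotExist();
        queue.AddMessage(new CloudQueueMessage(logLine));
    }

    // In the worker role loop: drain the queue and persist each entry to SQL Azure.
    public static void DrainOne()
    {
        CloudStorageAccount account = CloudStorageAccount.DevelopmentStorageAccount;
        CloudQueue queue = account.CreateCloudQueueClient().GetQueueReference("logqueue");
        CloudQueueMessage msg = queue.GetMessage();
        if (msg != null)
        {
            // ... insert msg.AsString into the database via LINQ to SQL ...
            queue.DeleteMessage(msg);
        }
    }
}
```

In production you would swap DevelopmentStorageAccount for the account read from the role configuration.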

Initially I thought about having the worker role do the downloading and parsing, but since I wanted low latency I dropped that idea, and chose to put in the logging instead in order to explore using a queue and SQL Azure.

One of my better weekend projects in a long time. Some code cleanup and refactoring and I can move on to something else.

Wednesday, September 16, 2009

How I did resource files with WSP builder and SharePoint in a DataFormWebPart

Resource files can be in either a .resources format or a .resx format. Visual Studio works great with .resx files, where you can access the resource directly via a static class.

Resx files can be put in 12/Resources like core.resx, but I couldn't get this to work out of the box. After some googling and trying out different things, this is my recipe for how to develop a SharePoint web part and use resources with XSLT.

First, create a folder called Resources in the root of your project, and add your resx files there.


Then I added a class to handle the lookup of resources

public class ResourceHandler
{
    private static readonly ResourceManager _resourceManager;
    public static readonly ResourceHandler Instance = new ResourceHandler();

    static ResourceHandler()
    {
        if (_resourceManager == null)
        {
            Type type = typeof(FrontResources);
            _resourceManager = new ResourceManager("webparts.front.Resources.FrontResources", type.Assembly);
        }
    }

    public string GetLabel(string key)
    {
        return GetLabel(key, HttpContext.Current);
    }

    public string GetLabel(string key, HttpContext context)
    {
        if (!HasKey(key, context)) return key;
        return _resourceManager.GetString(key, GetCulture(context));
    }

    public bool HasKey(string key, HttpContext context)
    {
        ResourceSet rs = _resourceManager.GetResourceSet(GetCulture(context), true, true);
        IDictionaryEnumerator ide = rs.GetEnumerator();
        while (ide.MoveNext())
        {
            if (ide.Key.Equals(key)) return true;
        }
        return false;
    }

    private CultureInfo GetCulture(HttpContext context)
    {
        if (context.Request.UserLanguages.Length > 0)
        {
            try
            {
                return new CultureInfo(context.Request.UserLanguages[0]);
            }
            catch (ArgumentException)
            {
                return CultureInfo.CurrentUICulture;
            }
        }
        return CultureInfo.CurrentUICulture;
    }
}

The HasKey function was added because the XSLT fails when accessing a key which isn't present, even with a try/catch around _resourceManager.GetString.

Inside the DataFormWebPart I override ModifyXsltArgumentList in order to add a namespace for the ResourceHandler in the XSLT.

protected override void ModifyXsltArgumentList(ArgumentClassWrapper argList)
{
    argList.AddExtensionObject("http://schemas.company.no/SharePoint/Label", ResourceHandler.Instance);
    base.ModifyXsltArgumentList(argList);
}

And in the xslt you access a resource in the following way:
<?xml version="1.0" encoding="utf-8"?>
<xsl:stylesheet version="1.0"
                xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
                xmlns:front="http://schemas.company.no/SharePoint/Label">
  <xsl:output method="xml" indent="yes"/>

  <xsl:template match="/">
    <xsl:value-of select="front:GetLabel('My_label')" />:
  </xsl:template>
</xsl:stylesheet>

When you run WSPBuilder it will automatically pick up the resx files and add them as embedded .resources files in the compiled DLL.



Resource Assembly added: nb-NO\webparts.front.resources.dll

Monday, September 14, 2009

Early creation of public properties

I'm battling with myself a bit about a public API I'm writing. Should I initialize all my public properties in the default constructor, or should I initialize them when accessed?

Having a public property return null is not an option in my opinion, as it leads to extra checking by the consumer of the API. Grunt work which the API should do for you.

Consider the two following classes:

public class MyClass
{
    public List<string> List { get; set; }

    public MyClass()
    {
        List = new List<string>();
    }
}

and

public class MyClass
{
    private List<string> _list;

    public List<string> List
    {
        get
        {
            if (_list == null)
            {
                _list = new List<string>();
            }
            return _list;
        }
        set { _list = value; }
    }

    public MyClass()
    {
    }
}


The pro of the first class is that it's short and very easy to read. The pro of the second is that it's more memory-efficient when the property isn't always used.

My API has an object structure of about 20 classes, which may or may not be set. Some might be used frequently, favoring the first class, while others are infrequent and would favor the second.

Having both implementations seems a bit inconsistent, so the big question is: should I favor the easy read or the optimized version? If the object structure is created often, will creating all these extra objects be bad for the CLR, or doesn't it matter?

It might be that benchmarking is the way to go for a final answer, but any comments on the matter are appreciated.
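For what it's worth, .NET 4.0 (in beta as I write this) adds a middle ground: Lazy&lt;T&gt; gives the deferred allocation of the second class with nearly the brevity of the first. A sketch of my own, not part of the API discussed above:

```csharp
using System;
using System.Collections.Generic;

public class MyClass
{
    // Lazy<T> defers creating the list until the property is first read.
    private readonly Lazy<List<string>> _list =
        new Lazy<List<string>>(() => new List<string>());

    public List<string> List
    {
        get { return _list.Value; }
    }
}
```

The trade-off is that Lazy&lt;T&gt; itself allocates a small wrapper object up front, so it only wins when the wrapped object is comparatively expensive.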

Tuesday, July 28, 2009

Outlook 2007 RPC/HTTP workaround

I've had major issues setting up RPC/HTTP against my new e-mail hosting service. First of all, Outlook 2007 with SP2 won't let you add an account over RPC/HTTP; you have to be local on the domain. That's not possible with a hosting service.

After a lot of back and forth I came up with the following solution. I had the hosting provider set up my account on a machine at their location, then export the HKEY_CURRENT_USER\Software\Microsoft\Windows NT\CurrentVersion\Windows Messaging Subsystem\Profiles\<profile name> key from the registry. I then edited the .reg file to point to an .ost file on my local machine, fired up Outlook, and it all worked :)

It requires a bit of registry knowledge to do this, but it shouldn't be very hard to make a program that creates a correct MAPI profile for any user with the correct data.

Thursday, July 23, 2009

Removing Exif data – continued

It seems the underlying JPEG library is a bit broken on some OSes when using the JpegEncoder/JpegDecoder.

Since I only wanted to remove the Exif data, not modify it, I ended up with a byte patcher instead. A matter of taking control :)

using System.IO;

namespace ExifRemover
{
    public class JpegPatcher
    {
        public Stream PatchAwayExif(Stream inStream, Stream outStream)
        {
            byte[] jpegHeader = new byte[2];
            jpegHeader[0] = (byte)inStream.ReadByte();
            jpegHeader[1] = (byte)inStream.ReadByte();
            if (jpegHeader[0] == 0xff && jpegHeader[1] == 0xd8) // check if it's a jpeg file
            {
                SkipAppHeaderSection(inStream);
            }
            // write back the original header bytes (0xff 0xd8 for a jpeg)
            outStream.WriteByte(jpegHeader[0]);
            outStream.WriteByte(jpegHeader[1]);

            int readCount;
            byte[] readBuffer = new byte[4096];
            while ((readCount = inStream.Read(readBuffer, 0, readBuffer.Length)) > 0)
                outStream.Write(readBuffer, 0, readCount);

            return outStream;
        }

        private void SkipAppHeaderSection(Stream inStream)
        {
            byte[] header = new byte[2];
            header[0] = (byte)inStream.ReadByte();
            header[1] = (byte)inStream.ReadByte();

            // APPn markers are 0xffe0 - 0xffef; skip each one including its payload
            while (header[0] == 0xff && (header[1] >= 0xe0 && header[1] <= 0xef))
            {
                // the two length bytes include themselves, hence the -2 below
                int exifLength = inStream.ReadByte();
                exifLength = exifLength << 8;
                exifLength |= inStream.ReadByte();

                for (int i = 0; i < exifLength - 2; i++)
                {
                    inStream.ReadByte();
                }
                header[0] = (byte)inStream.ReadByte();
                header[1] = (byte)inStream.ReadByte();
            }
            inStream.Position -= 2; // rewind so the copy loop re-reads the last marker
        }
    }
}



Tuesday, July 21, 2009

Remove Exif data from image files with C# and WPF libraries


(For my final solution check out Exif continued..)


A colleague of mine e-mailed me with a problem he had. He was developing a solution where the customer wanted all Exif data to be removed from the images they provide on the web. He had tried a bit with no luck.

Since the image libraries in WPF are far superior to the ones in WinForms, I gave it a shot. I googled around, read the Exif spec, and came up with the code below. The image is read, then I loop over all Exif properties and blank them out. It might work just as well to remove them, but by blanking them the file doesn't change header-wise. Properties pertaining to image characteristics such as width and height are skipped; you can check them against the Exif spec. I have only tried the code on JPEG images, and I didn't have one with GPS coordinates in it, but in theory it should remove GPS coordinates as well.

I skipped metadata.TrySave() altogether since it didn't work when I used the SetQuery method; if I just changed the metadata properties it worked. It's easy to put this back in, and you'll find a discussion about it in one of the links at the bottom.

using System;
using System.IO;
using System.Windows.Media.Imaging;

namespace ExifRemover
{
    public class ExifReader
    {
        public void SetUpMetadataOnImage(string filename)
        {
            string tempName = Path.Combine(Path.GetDirectoryName(filename), Guid.NewGuid().ToString());
            // open image file to read
            using (Stream file = File.Open(filename, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
            {
                // create the decoder for the original file. The BitmapCreateOptions and BitmapCacheOption denote
                // a lossless transcode. We want to preserve the pixels and cache it on load. Otherwise, we will lose
                // quality or even not have the file ready when we save, resulting in 0 bytes of data written
                BitmapDecoder original = BitmapDecoder.Create(file, BitmapCreateOptions.PreservePixelFormat, BitmapCacheOption.None);
                // create an encoder for the output file
                BitmapEncoder output = null;
                string ext = Path.GetExtension(filename);
                switch (ext)
                {
                    case ".png":
                        output = new PngBitmapEncoder();
                        break;
                    case ".jpg":
                        output = new JpegBitmapEncoder();
                        break;
                    case ".tif":
                        output = new TiffBitmapEncoder();
                        break;
                }

                if (original.Frames[0] != null && original.Frames[0].Metadata != null)
                {
                    // clone the frame and metadata, since the originals are frozen
                    BitmapFrame frameCopy = (BitmapFrame)original.Frames[0].Clone();
                    BitmapMetadata metadata = original.Frames[0].Metadata.Clone() as BitmapMetadata;

                    StripMeta(metadata);

                    // create a new frame with the stripped metadata, along with the pixel data from the original
                    output.Frames.Add(BitmapFrame.Create(frameCopy, frameCopy.Thumbnail, metadata, frameCopy.ColorContexts));
                }
                // finally, save the new file over the old file
                using (Stream outputFile = File.Open(tempName, FileMode.Create, FileAccess.Write, FileShare.ReadWrite))
                {
                    output.Save(outputFile);
                }
            }
            File.Delete(filename);
            File.Move(tempName, filename);
        }

        public void StripMeta(BitmapMetadata metaData)
        {
            for (int i = 270; i < 42016; i++)
            {
                // skip tags describing image characteristics (orientation, resolution, etc.)
                if (i == 274 || i == 277 || i == 284 || i == 530 || i == 531 || i == 282 || i == 283 || i == 296) continue;

                string query = "/app1/ifd/exif:{uint=" + i + "}";
                BlankMetaInfo(query, metaData);

                query = "/app1/ifd/exif/subifd:{uint=" + i + "}";
                BlankMetaInfo(query, metaData);

                query = "/ifd/exif:{uint=" + i + "}";
                BlankMetaInfo(query, metaData);

                query = "/ifd/exif/subifd:{uint=" + i + "}";
                BlankMetaInfo(query, metaData);
            }

            for (int i = 0; i < 4; i++)
            {
                string query = "/app1/ifd/gps/{ulong=" + i + "}";
                BlankMetaInfo(query, metaData);
                query = "/ifd/gps/{ulong=" + i + "}";
                BlankMetaInfo(query, metaData);
            }
        }

        private void BlankMetaInfo(string query, BitmapMetadata metaData)
        {
            object obj = metaData.GetQuery(query);
            if (obj != null)
            {
                if (obj is string)
                    metaData.SetQuery(query, string.Empty);
                else
                {
                    ulong dummy;
                    if (ulong.TryParse(obj.ToString(), out dummy))
                    {
                        metaData.SetQuery(query, 0);
                    }
                }
            }
        }
    }
}



Monday, June 15, 2009

The waves

Finally I've discovered what should have been so clear from the beginning. The question I've pondered for years is who makes the waves we surf on the web, and recently it was all revealed.

From right under our noses emerged Google Wave :)

How could I not have guessed this. At least one of life's existential questions can be checked off my list.

Thursday, March 19, 2009

A better world with jQuery

I know I'm far behind the rest of the world, but that's because I usually code on the web server and not on the web client. But yesterday I experienced the beauty of jQuery first-hand :)

I was first exposed to jQuery unknowingly some months ago when I lent some “free” consulting time to a web project. Since I happen to have done my share of JavaScript over the years, I got the task of breaking up the static design into dynamic pages, meaning constructing the script calls depending on the data. I didn't really have a deep understanding of the $('#id') syntax, but I grasped the idea and made it work.

Since then I've read some articles about jQuery, and yesterday I got another task on the web project: check whether a Flash movie had loaded, and if not, show some HTML instead. With a bit more knowledge of what I was looking at, and after figuring out that the Flash was loaded with a jQuery plugin, it was a breeze to implement.

$(document).ready(function() {
    if (sIFR.isActive == false) {
        $('#non_flash_content').show();
    }
});

Days like this make it all worthwhile, and the fact that jQuery will be natively supported in Visual Studio is great news!

Sunday, February 22, 2009

FastForward’09

This year was my first visit to the FastForward conference. The theme of the conference was “Engage your user”, something which has been on my mind for the last couple of years.

The conference was held at the Mirage Hotel in Las Vegas, a place where they certainly try their best to engage their users. A well-chosen spot indeed (I only lost $40; slot machine sounds are not what make me fuzzy and warm).

Travelling 17 hours each way is a strain, but it was worth it. Talking to FAST customers and others working with search technology was very inspiring, and we all agree that things are moving in the right direction. More and more information is accessible for search, and people are going to great lengths to create search-driven applications and interfaces which are intuitive for the end user. It seems Enterprise 2.0 might finally be around the corner.

2009 might just be the best year for search yet!

Sunday, February 1, 2009

Going unsafe in managed code – give me speed!

After doing the array comparison article, my mind has been working subconsciously on another matter I've thought about for several years: what is the fastest possible way to serialize/deserialize an object in .NET?

One way is using the built-in serialization in .NET with a BinaryFormatter or a SoapFormatter. This is the most general way and works for “all” cases. If you know a bit more about the data you want to serialize, you can improve speed quite a lot.

In my article Using memory mapped files to conserve physical memory for large arrays I handled serialization of structs and value types with Marshal.StructureToPtr and Marshal.Copy in order to get a byte array I could write to disk afterwards (because I didn't know better at the time). This works for any struct containing only value types. My weekend testing showed that with explicit layout on a struct or class, the Marshal.StructureToPtr step can be omitted in favor of Marshal.Copy alone.

Now over to the unsafe bit. By using pointers directly and skipping the marshalling altogether, we improve speed even more. This fueled me to continue on my Disk Based Dictionary project, which will benefit both from memory-mapped files and fast serialization. My approach will be to analyze the type of object being used: if it's an object with explicit layout or a base value type, I will use fast pointer-to-pointer copying; if it's an object with only value types but implicit layout, I'll go with StructureToPtr; for an object with reference types I will use normal serialization, or check whether it implements a BinaryWriter/BinaryReader interface for writing out the values manually.
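The StructureToPtr path for a value-type-only struct can be illustrated like this; the Point3D struct and the helper class are invented examples of my own, not code from the project:

```csharp
using System;
using System.Runtime.InteropServices;

[StructLayout(LayoutKind.Sequential)]
struct Point3D
{
    public double X, Y, Z;
}

static class StructSerializer
{
    // Copies any value-type-only struct into a byte array via unmanaged memory.
    public static byte[] ToBytes<T>(T value) where T : struct
    {
        int size = Marshal.SizeOf(typeof(T));
        byte[] buffer = new byte[size];
        IntPtr ptr = Marshal.AllocHGlobal(size);
        try
        {
            Marshal.StructureToPtr(value, ptr, false);
            Marshal.Copy(ptr, buffer, 0, size);
        }
        finally
        {
            Marshal.FreeHGlobal(ptr);
        }
        return buffer;
    }

    public static T FromBytes<T>(byte[] buffer) where T : struct
    {
        IntPtr ptr = Marshal.AllocHGlobal(buffer.Length);
        try
        {
            Marshal.Copy(buffer, 0, ptr, buffer.Length);
            return (T)Marshal.PtrToStructure(ptr, typeof(T));
        }
        finally
        {
            Marshal.FreeHGlobal(ptr);
        }
    }
}
```

With explicit layout you can skip the StructureToPtr round-trip and pin the struct directly, which is where the pointer-to-pointer copying comes in.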

The library will then work for the lazy coder who doesn’t need killer performance, but also for the conscientious ones who care about speed.

If I’m lucky with inspiration I’ll have it done this week before I go to Vegas.

If you’re wondering why I bother with these things it’s because I used to work with search engines where speed vs. memory is a big issue. In my current job doing SharePoint consulting it’s all a waste of time since the SQL server will always be the bottleneck :)

Thursday, January 29, 2009

WTF is “Sticky Keys”?

When I grew up with computers, sticky keys were what happened when I accidentally spilled Coke over the keyboard. Not so much anymore.

When running my laptop on battery the screen dims after 60 seconds or so to save power, which works great when doing work but is annoying when watching YouTube videos. My solution is to tap the shift key every now and then to prevent the dimming without sending an unwanted keystroke which could stop the playback.

Then it happens… in the middle of watching the video, “Sticky Keys” appears. I hit ‘ESC’ and the window disappears. During lunch today we started talking about this, so I’ve checked it out.

If you press SHIFT 5 times in a row you turn on Sticky Keys. The feature lets a user apply SHIFT, CTRL, ALT or the Windows key by pressing one key at a time, thus allowing one-handed use. Superb for people smoking or texting on their mobile with one hand while doing computer work with the other. Personally I’m not able to multitask and have now turned off “Sticky Keys” (Control Panel –> Ease of Access Center –> Set up Sticky Keys).

I stopped drinking soda with sugar several years ago, so sticky keys are now ancient history – until I install a new computer, that is.

Monday, January 26, 2009

C# Javascript library

Came across this piece in the January edition of MSDN Magazine. It talks about creating JavaScript for AJAX apps using C# and Visual Studio. Sounds too good to be true :) The library can be fetched at projects.nikhilk.net/ScriptSharp

I know I will take a look at it the next time a project requires some nifty scripting.

Wednesday, January 14, 2009

Fast byte array comparison in C#

I got into a discussion with a colleague the other day about string comparison in .Net and whether to use

variable.Equals("mystring")

or

variable == "mystring"

both in terms of speed (though it wouldn’t matter in most cases) and in terms of readability. As for speed, .Equals is faster, as you save one method call: == is implemented as an operator which in turn calls Equals. Our good friend Reflector is always there when you need him.
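A quick way to check this yourself is a throwaway timing harness along these lines. The class and method names are made up for illustration, and the numbers will of course vary with hardware and JIT behavior:

```csharp
using System;
using System.Diagnostics;

public static class StringCompareBenchmark
{
    // Times variable.Equals(other) over the given number of iterations.
    public static TimeSpan TimeEquals(string variable, string other, int iterations)
    {
        bool result = false;
        Stopwatch sw = Stopwatch.StartNew();
        for (int i = 0; i < iterations; i++)
        {
            result = variable.Equals(other);
        }
        sw.Stop();
        GC.KeepAlive(result); // discourage the JIT from dropping the loop body
        return sw.Elapsed;
    }

    // Times variable == other over the given number of iterations.
    public static TimeSpan TimeOperator(string variable, string other, int iterations)
    {
        bool result = false;
        Stopwatch sw = Stopwatch.StartNew();
        for (int i = 0; i < iterations; i++)
        {
            result = variable == other;
        }
        sw.Stop();
        GC.KeepAlive(result);
        return sw.Elapsed;
    }
}
```

As with the benchmark further down, compile in release mode and run without a debugger attached, or the difference drowns in noise.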


The interesting part came when reflecting into this: I stumbled upon EqualsHelper and CompareOrdinalHelper. Here .Net casts the strings to pointers and compares an int at a time. This led me to create a byte[] comparison function modelled on the code .Net uses internally, and to benchmark it.


For an equal array with 12 elements the unsafe version is 3 times as fast. For unequal arrays the managed implementation is quicker if the first or second byte differs; from the third byte on, the unsafe version gains speed. Below is some sample code you can experiment with yourself. The longer the array, the more you gain with the unsafe version. Be sure to test the code compiled in release mode.


Microsoft doesn’t recommend using unsafe code unless it’s performance critical, but since they use it internally we can as well ;) (You should still have a good reason, given the added complexity, imo.) Why the internal string helper compares 10 characters (five ints) at a time is beyond me, and I haven’t tested whether this is some magic number which yields good results for general cases.


class Program
{
    static void Main(string[] args)
    {
        byte[] a = new byte[] { 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11 };
        byte[] b = new byte[] { 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11 };

        Stopwatch sw = new Stopwatch();
        sw.Start();
        for (int i = 0; i < 30000000; i++)
        {
            SafeEquals(a, b);
        }
        sw.Stop();
        Console.WriteLine(sw.Elapsed);

        sw = new Stopwatch();
        sw.Start();
        for (int i = 0; i < 30000000; i++)
        {
            UnSafeEquals(a, b);
        }
        sw.Stop();
        Console.WriteLine(sw.Elapsed);
    }

    private static bool SafeEquals(byte[] strA, byte[] strB)
    {
        int length = strA.Length;
        if (length != strB.Length)
        {
            return false;
        }
        for (int i = 0; i < length; i++)
        {
            if (strA[i] != strB[i]) return false;
        }
        return true;
    }

    [ReliabilityContract(Consistency.WillNotCorruptState, Cer.MayFail)]
    private static unsafe bool UnSafeEquals(byte[] strA, byte[] strB)
    {
        int length = strA.Length;
        if (length != strB.Length)
        {
            return false;
        }
        fixed (byte* ptrA = strA)
        fixed (byte* ptrB = strB)
        {
            byte* pA = ptrA;
            byte* pB = ptrB;
            // Compare five ints (20 bytes) per iteration, mirroring the
            // pattern .Net uses internally for strings (10 chars at a time).
            while (length >= 20)
            {
                if ((*(int*)pA != *(int*)pB)
                    || (*(int*)(pA + 4) != *(int*)(pB + 4))
                    || (*(int*)(pA + 8) != *(int*)(pB + 8))
                    || (*(int*)(pA + 12) != *(int*)(pB + 12))
                    || (*(int*)(pA + 16) != *(int*)(pB + 16)))
                {
                    return false;
                }
                pA += 20;
                pB += 20;
                length -= 20;
            }
            // Compare any remaining bytes one at a time.
            while (length > 0)
            {
                if (*pA != *pB)
                {
                    return false;
                }
                pA++;
                pB++;
                length--;
            }
            return true;
        }
    }
}

Windows 7 beta - issues

Here's my first comments and experiences with Windows 7.

I downloaded Windows 7 this weekend and did an upgrade of a 32-bit Vista SP1 laptop. The upgrade itself took 4 hours, which is OK since I have a lot of stuff installed. The system behaves much more snappily than Vista: startup and shutdown are quicker, hibernation is quicker, and basically the OS response time is much better.

Issues which I had after the upgrade:

  • Only one CPU core was working, due to an incorrect ACPI driver being used. The resolution was to disable multi-core in the BIOS and let Windows redetect the CPU, then shut down and enable the second core again. This made Windows use both cores.
  • Daemon Tools is not working (SPTD driver).
  • Had to do a repair install of Acrobat Reader 9 and Java in order for them to work in IE 8.
  • My built-in Broadcom network card (BCM5906M) is not working properly. It won't access the LAN at work or get an IP address. I have to debug a bit more to find out what the issue is.
  • The Toshiba Bluetooth device is not working properly. The Vista drivers won't work, and the Windows built-in driver is just as crap as Vista's, meaning it basically doesn't support much at all.

I'll report more issues when they appear, but for now I give two thumbs up for this release!