Thursday, December 31, 2015

A quick look back at 2015 – and what lies ahead in 2016

I thought I’d for once take a look at which of my posts from the past year have been viewed the most. Here’s a pick of five out of the 70 I managed to write – and thanks to all who appreciate me doing this!
In addition to writing posts I also spoke at a few events, most notably Microsoft Ignite, which was also the most fun event in a way. I’m pretty sure the “Battle of the Graph” session was the only one at Ignite without a single slide, and the only one where a dude on stage was wearing a black hooded cape (me). And I did manage to get the CollaboGraph app out of it.

So what’s next for 2016? Blogging-wise we’ll see, but search relevance tuning seems like a topic. Speaking-wise my only plan is to say something at the Arctic SharePoint Challenge in February, which I’m part of organizing. But I’m sure something will pop up.

As for Office 365 and SharePoint, I think it’s moving in the right direction, and the end of the year yielded some good discussions with fellow MVPs at the MVP summit. Mix that with recent posts by Microsoft, and I’m sure 2016 will be awesome.

At the end of the year Puzzlepart was bought by Crayon, certainly an opportunity for creating something interesting in the Office 365 space. Hopefully Puzzlepart’s and my own enthusiasm will rub off on some customers out there :-)

Wednesday, December 9, 2015

My stance on modifying the out of the box display templates

image
My colleague Petter had a request from a customer to open search results in a new window/tab instead of the same one. So he conferred with the braintrust at Puzzlepart on which approach he should go for.

One answer was to modify the control template and loop through all links using JavaScript, adding the target=”_blank” attribute. Another was to hook in a custom action which registered a JavaScript handler that filtered on click events and did much the same modification.
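A minimal sketch of that loop-through-the-links approach could look like the following. The helper name and the selector are hypothetical; the actual OOB result markup classes may differ between versions, so treat this as an illustration rather than a drop-in fix.

```javascript
// Hypothetical helper: force all passed-in result links to open in a new tab.
function openResultsInNewTab(links) {
  links.forEach(function (link) {
    link.setAttribute('target', '_blank');
  });
}

// In the browser this would run after the results have rendered, e.g.:
// openResultsInNewTab(Array.prototype.slice.call(
//     document.querySelectorAll('.ms-srch-item a')));
```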

Personally, I went against what I always preach, which is to never modify the OOB display templates. The customer in this case was using the OOB templates, and all the OOB item templates ultimately reference Item_CommonItem_Body.html/js. We’re talking around 30 result types, each referencing some OOB template which again references Item_CommonItem_Body.

To me, modifying one line in that file to open the item in a new tab/window had the least complexity. Creating or modifying a custom control template would possibly have required reconfiguration of X pages in the search center. The custom action would have solved it all, but would be a solution of sorts hidden away.

When is it not ok to modify the OOB templates then?

Most modifications I come across involve adding a new managed property to be displayed in the template. Try this with an OOB template and you fail, pure and simple. It just doesn’t work, as you won’t be able to update the result type binding to include your custom managed properties. For this scenario you must create a copy and tie it to your own result type.

And this is where we usually start at Puzzlepart. We have custom templates from the get-go and ready configurations for the 30 result types to be replaced. Petter’s customer already had a search solution up and running when he came in to make it rock, but rolling new display templates was not in scope.

So… should you or should you not modify the OOB display templates? As a general rule, don’t; but if you know what you are doing and why, there are scenarios which justify it. Opening items in a new tab is one; editing the default refiner template to turn on refiner counts is another.

And as my colleague Tarjei pointed out, if you do change the OOB ones, remember that you might miss out on some awesome changes coming in a CU near you – the reason being that you have customized (unghosted) the files, so SharePoint won’t use the possible awesomeness deployed in the 15 hive... or not ;-)

Monday, December 7, 2015

FindTime – A really awesome scheduling plugin for Outlook

Microsoft has an internal initiative named Microsoft Garage, where employees from all over Microsoft turn great ideas into real projects. One of the most recent projects to come out of this initiative, announced on the Office blog, is called FindTime: an Outlook plugin which lets you schedule meetings with colleagues – but with polling functionality like Doodle, or much like the Sunrise app on the iPhone. The FindTime add-in works with Outlook 2013, Outlook 2016 and Outlook Web Access.

It’s really simple. Start a new e-mail, add people to the To or Cc fields, and you will see their availability in the plugin. In the image below you can see I have chosen three possible time slots for a 30-minute meeting with my colleagues Mads, Elsa and Thomas.

image

Thursday, November 26, 2015

Gotchas when you move a FAST Search based solution to SharePoint 2013, running in 2010 mode

We’re running around like rabbits in my project these days, migrating 2010 solutions en masse over to SharePoint 2013. Some are upgraded to the 2013 look and feel, while some heavy WSP-based ones are kept in 2010 mode for the time being to keep costs down.

The 2010 farm is using FAST Search, and here’s a small list of things which will not necessarily work when you restore 2010 sites using Core Search Results web parts as part of the solution.

  1. The search index location for the web parts should be hard-coded to FASTSearch; if it’s set to LocalSearchIndex or Default, refiners will look and act weird. (See this post)
  2. If you have multiple web parts on a page, the ones set to use a non-default query id will pick up the refiners from the default web part.
    As an example, imagine a result page which shows documents and people. You want to search for people and documents with the same query terms. When you click a refiner for documents, you do not want this to affect the people web part, as those refiners might not exist there. Setting a core result web part to use another query id just works in 2010, but when moving to a 2013 farm it does not.

    It’s solvable, but you need to use some reflection like my FQL-enabled web part (code at http://spsearchparts.codeplex.com/) which I wrote about in 2011. If you need some pointers, ping me :)

  3. If you have copied refiner URLs like ?k=%2A&r=servicearea%3D%22ARIBVGVjaG5pY2FsIEFkdmlzb3J5C3NlcnZpY2VhcmVhAQJeIgIiJA%3D%3D%22, which have the FAST values base64 encoded in them, they will continue to work, but you might see a doubling-up of refiner values in the web part.
    image
    The reason for this is that the refinement value has changed format in 2013 and wraps the value itself in a hex encoded format, so we actually end up with a double encoding, and the web part will fail to highlight the correct entry.
  4. Metadata like O&G, which in FAST was stored just like that without spaces, is tokenized into O & G with spaces in 2013. So if you have code filtering on properties with & in them, you need to change what the search filter looks like.
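A hedged sketch of such a filter rewrite (the helper is hypothetical; adapt it to wherever your code builds the filter string):

```javascript
// Hypothetical helper: rewrite a FAST-era filter value ("O&G") to the
// 2013 tokenized form ("O & G"), where & is split out with spaces.
function toTokenizedFilterValue(value) {
  return value.replace(/\s*&\s*/g, ' & ');
}
```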

Summary

Migration is not necessarily easy, but fortunately for Microsoft not too many companies ever installed FAST (my guess), so the caveats of a migration to 2013 were no big concern. Also, if customers spend the money to upgrade the UIs to 2013, you would re-write everything anyway and would not have to be concerned with all of this.

For this specific solution my initial migration estimate took a huge hit for sure, but even with all the digging into weirdness the TCO was still lower, and I got to use some of my experimental code written for the FAST book.

That said, I hope this is the last time I ever ever ever ever ever have to touch FS4SP in a migration project again. I’m not a masochist by choice.

#FTW

Monday, November 23, 2015

People Search ranking for dummies

image

I’ve gotten a few questions about how people results are ranked in SharePoint/SharePoint Online, so I thought I would give a brief overview of two of the people ranking models available and how they work. I might be off on some points, as decoding the models is not all that trivial: they have been created using machine learning, have multiple levels, and use neural networks. If that sounded Greek or geek to you, then join the club :)

I’ll present my simplified interpretation and you can take it at face value.

For each model I will list the managed properties and associated user profile properties used in the rank profile, and list them in ranking importance from highest to lowest with my relative weights to put it into perspective.

User profile properties not listed in the tables will not influence how results are ranked.

Friday, November 20, 2015

Why Hybrid Crawl in SharePoint is a cold hot potato

image

[This rant is based on the preview of the Cloud Search Service Application]

Let me start by saying that I DO think hybrid crawl (or the Cloud Search Service Application) is a cool thing. It’s just the implementation of it which makes me go: Meh…..

When the Cloud SSA was first mentioned and further put into preview everyone was all YEAH!!!! WOW!!! AWESOMEPANTS!!!…..while I was quickly going… meh. What’s wrong with me?!?

To clue everyone in, hybrid crawl or the Cloud Search Service Application is a function where your on-premises SharePoint 2013 or 2016 farm can index local content (any source) and store it in the SharePoint Online search index instead of storing it in the local search index.

Monday, November 16, 2015

Inconvenient SharePoint 2010 search center in 2013 (Hey Waldek :)

I’m currently working on moving some 2010 farms over to a 2013 farm, and when doing so we decided to keep the 2010 UI for some of the custom solutions for the time being. The 2010 solutions used FAST search, and this is where the inconvenience hit us.
When using FAST in 2010, clicking a refiner gives you a weird encoded URL parameter similar to
r=knowledgeportalsegment%3D%22ARMBU3Vic2VhIGFuZCBGbG9hdGVycxYBa25vd2xlZGdlcG9ydGFsc2VnbWVudAAcAiLHgseCNTM3NTYyNzM2NTYxMjA2MTZlNjQyMDQ2NmM2ZjYxNzQ2NTcyNzMi%22

which after migration looked like the plain readable non-FAST 2010 version:

r=knowledgeportalsegment%3D%22Subsea%20and%20Floaters%22

In addition to the change of URL format for a refiner, multi-term taxonomy fields were being rendered together with a semicolon separator instead of as multiple refiner entries.
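For reference, the migrated 2013-mode refiner value above is plain URL encoding, so it decodes directly, with none of the FAST base64 wrapping:

```javascript
// The 2013-mode (non-FAST) refiner value is just URL-encoded text.
var encoded = 'knowledgeportalsegment%3D%22Subsea%20and%20Floaters%22';
var decoded = decodeURIComponent(encoded);
// decoded === 'knowledgeportalsegment="Subsea and Floaters"'
```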

Monday, October 19, 2015

Office Graph, Is it your cup of tea? - Presentation from SPSOslo

Saturday October 17th 2015 sported the 3rd SharePoint Saturday in Oslo, and probably the best so far. This time they had it in the fall instead of spring, allowing more people to pick SPS over beautiful weather.

This time I did a business session about the Office Graph, and how it may, or may not help you be more productive on a day to day basis. I covered both your personal insights as well as organizational insights, discussing a bit what it will help you with, and what the future may bring.

If you want some narration with the slides, or you are curious whether you should or should not start using the Office Graph and Delve, feel free to ping me. And I am a firm believer that the Office Graph is useful to everyone, but I think we’re still waiting for the killer application using it.

Monday, October 12, 2015

My presentation from #SPSMUC

Here’s my presentation from SharePoint Saturday Munich

And I have to say, this is probably one of the best events I have ever attended. As speakers we were well taken care of, with dinner on both Friday and Saturday as well as breakfast on Sunday for those who were still in town. The venue at the Sofitel was awesome, with good rooms, excellent food and service. And the speaker line-up, wow!! So many amazingly good speakers, and I managed to attend 3 sessions myself.

Check out the cool speaker shirt and nametag and the inflatable pretzel we got!

WP_20151010_08_44_29_Pro

WP_20151010_08_51_23_Pro__highres

Monday, October 5, 2015

O365 Dev Challenges - Part 8 – Troubleshooting and tips

The most important tip I can give is that you should not test the app against a tenant which you don’t have admin access to until you know that it works the way you want. The reason is that if you have to change the permissions needed, or are changing URLs of the web app itself, then you won’t be able to remove the app registration. Removing app registrations requires admin permissions in AAD.

Cleaning up App registrations

When creating a lot of sample applications, you might end up with multiple App registrations and Azure resources. App registrations can easily be cleaned up using PowerShell.
If you have admin access you can download the Azure Active Directory Module for Windows PowerShell (64-bit version) package. Once opened, you first connect to your tenant with

Connect-MsolService

You can list all add-in registrations with

Get-MsolServicePrincipal

and you remove entries with

Remove-MsolServicePrincipal -AppPrincipalId <your app id>

Every time you remove an add-in registration you will be prompted to consent to the add-in’s permission request the next time you access it. Very useful for testing.

Cleaning up Azure Resources

Azure resources can be cleaned up from the old and new Azure management portals.


By removing a resource group, you will also automatically remove all artifacts associated with that group.

image

O365 Dev Challenges - Part 7 - Publishing the App to Azure

I now have a fully functional application running from localhost. If I access the localhost URL from another tenant (not the one where I registered the app) it works just fine. But I don’t want it sitting locally, as I decided upon project creation to host it as an Azure web site.
Fortunately, this last piece of the puzzle is the easiest one!

Publishing the application

Right click the project node in Visual Studio and pick Publish. The Connection screen should list the URL picked in the wizard upon project creation. Feel free to hit Validate Connection to make sure it’s working.

clip_image002

Before you hit the Publish button, copy the value in the Destination URL field, in my case http://pzlsamplemultitenancy.azurewebsites.net.

Fix the application manifest to reflect the updated application URL

Go back to the application manifest in the Azure management portal and replace https://localhost:44300/ with the value copied for your Azure web site – and be sure to add a trailing slash. Replace the URL in both the SIGN-ON URL field and the REPLY URL field and click SAVE.

clip_image004

Testing the application

As the application is now stored in Azure and the manifest is updated all that remains is testing it.
Open up a browser window where you are logged into either the publishing or test tenant – or an anonymous one – and navigate to your Azure web URL, http://pzlsamplemultitenancy.azurewebsites.net/UserInfo. Consent to the application, click Send Mail, and you should receive an e-mail from yourself.

Summary

Starting out with something pretty simple took me on a journey where I learned a bit more about how the OWIN libraries work and how resources are accessed in code with matching delegate permissions in Azure AD. It also took me into refactoring and patching the scaffolding injected into my project, which all in all was a good path. At least I now have enough tools to keep on creating O365 add-ins.

The Unified API has also been launched with a beta of v2 where you can use dynamic app consent, which is quite cool: don’t prompt the user to consent to an action before you actually need it. I’ll probably wait until there are wizards and NuGet packages to take me there, but the future is looking bright indeed!

My hope for the future is that the wizards and scaffolding get even better and become close to production-ready. The easier it gets, the more add-ins people will make (I hope).

Hope this tutorial and hoop jumping helps someone else out there - and any bugs or mistakes in my code can only be attributed to my lack of understanding how it all fits together :-)

O365 Dev Challenges - Part 6 - Switching to the Office 365 unified API (preview)

I now have a working solution which can read a user’s profile and also send e-mail on that user’s behalf. With my newly acquired understanding of how O365 resources and authorization work, why not simplify how the APIs are called a bit and use the Unified API? Effectively replacing ActiveDirectoryClient and OutlookServicesClient with just one client.

Before reading on I’ll tell you that I reverted this route myself, as the Unified API client was unable to let me send an e-mail, bombing out with an OData class mapping error. This seems to be a bug on the Microsoft side of things, but if you do pure REST it will work. Using multiple clients is not that much of a hassle when you have the authentication part figured out anyway – and what are a few more server-side calls behind the scenes between friends? :-)

Still up for it? Continue reading.

Add a nuget package for the Office 365 unified API (preview)

Open up the nuget package manager, check the Include prerelease box and search for Microsoft.Graph.

clip_image002

Click Install to bring down the code.

Add application manifest settings for the Unified API

Once the nuget package is installed the application manifest in AAD has to be updated with settings for the Unified API. This is not yet available via the connected service wizard in Visual Studio, so instead head over to the Azure management portal, pick your AAD, then the application registration and configure it there.

image

Click Add application, pick Office 365 unified API (preview), and configure the delegated permissions to be able to Send mail as signed-in user and Sign in and read user profile. As we’re using OWIN for login it seems you have to keep the Windows Azure Active Directory delegated permissions for login. If not, your application will fail on sign-in.

When a user consents to the application, the dual permissions look a bit weird, as allow sign-in and reading the profile appears twice.
clip_image002[8]

Authorization code

Next up is adding the authorization call to StartupAuth.cs for the Unified API endpoint - https://graph.microsoft.com. Replace the two existing ones with:

public const string UnifiedResourceId = "https://graph.microsoft.com";

AuthorizationCodeReceived = (context) =>
{
    var code = context.Code;

    ClientCredential credential = new ClientCredential(clientId, appKey);
    string tenantID = context.AuthenticationTicket.Identity.FindFirst("http://schemas.microsoft.com/identity/claims/tenantid").Value;
    string signedInUserID = context.AuthenticationTicket.Identity.FindFirst(ClaimTypes.NameIdentifier).Value;

    AuthenticationContext authContext = new AuthenticationContext(aadInstance + tenantID, new ADALTokenCache(signedInUserID));
    AuthenticationResult result = authContext.AcquireTokenByAuthorizationCode(code, new Uri(HttpContext.Current.Request.Url.GetLeftPart(UriPartial.Path)), credential, UnifiedResourceId);

    return Task.FromResult(0);
}

Switch out ActiveDirectoryClient and OutlookServicesClient with GraphService

Open up UserInfo.aspx.cs and you can now load the user’s profile with this code

var servicePointUri = new Uri(UnifiedResourceId);
var serviceRoot = new Uri(servicePointUri, "/beta/" + tenantID);

var unifiedClient = new GraphService(serviceRoot,
    async () => await TokenHelper.GetTokenForApplicationSilent(TokenHelper.UnifiedResourceId));

IUser user = unifiedClient.Me.ExecuteAsync().Result;

You also need to change the aspx page to the correct type, as the IUser interface is different between the Unified API classes and the Azure AD Graph client classes.

The send email code now looks like this (except it bombs at the time of writing October 2015):

private Task SendEmailTaskUnified()
{
    return Task.Run(async () =>
    {
        var tenantId =
            ClaimsPrincipal.Current.FindFirst("http://schemas.microsoft.com/identity/claims/tenantid").Value;
        var servicePointUri = new Uri(TokenHelper.UnifiedResourceId);
        var serviceRoot = new Uri(servicePointUri, "/beta/" + tenantId);

        var unifiedClient = new GraphService(serviceRoot,
            async () => await TokenHelper.GetTokenForApplicationSilent(TokenHelper.UnifiedResourceId));

        // Load your profile and retrieve e-mail address - could have been cached on initial page load
        var user = unifiedClient.Me.ExecuteAsync().Result;

        var body = new Microsoft.Graph.ItemBody
        {
            Content = "<h1>YOU DID IT!!</h1>",
            ContentType = Microsoft.Graph.BodyType.HTML
        };

        var toRecipients = new List<Microsoft.Graph.Recipient>
        {
            new Microsoft.Graph.Recipient
            {
                EmailAddress = new Microsoft.Graph.EmailAddress {Address = user.mail}
            }
        };

        var newMessage = new Microsoft.Graph.Message
        {
            Subject = "O365 Mail by Mikael",
            Body = body,
            ToRecipients = toRecipients,
            Importance = Microsoft.Graph.Importance.High
        };
        await unifiedClient.Me.SendMailAsync(newMessage, true);
    });
}

In my case the above code bombs on the last line when sending the e-mail. Instead you can keep using the OutlookServicesClient without having to register the resource endpoint in StartupAuth.cs – it seems the authorization token from the Unified API handles this just fine. Or you can copy the SendMessageAsync snippet from https://github.com/OfficeDev/O365-UWP-Unified-API-Snippets/blob/master/O365-UWP-Unified-API-Snippets/Users/UserSnippets.cs which uses the Unified API endpoint with pure REST. This actually works, compared to the GraphService client which uses OData. Replace the code to retrieve the token in the snippet with the one you already have and pass it in to the function.

string token = await TokenHelper.GetTokenForApplicationSilent(TokenHelper.UnifiedResourceId);
await SendMessageAsync("O365 Mail by Mikael", "<h1>YOU DID IT!!</h1>", user.mail, token); 

O365 Dev Challenges - Part 5 - Adding code to send e-mails

Once the delegate permission for sending e-mail has been set up, and the NuGet package for OutlookServicesClient is in place, all that’s missing is writing the code to send an e-mail.

Add code to authenticate against Exchange and patch AdalTokenCache.cs

The original code added to StartupAuth.cs takes care of acquiring the access token for the https://graph.windows.net resource to query AAD. When performing e-mail actions, an access token for https://outlook.office365.com/ is needed as well.

I make sure I have constants for both resources at the top of the file.

private ApplicationDbContext db = new ApplicationDbContext();

// This is the resource ID of the AAD Graph API.  We'll need this to request a token to call the Graph API.
private static string graphResourceId = "https://graph.windows.net";
// This is the resource ID of the Outlook API.
private static string outlookResourceId = "https://outlook.office365.com/";

and in the AuthorizationCodeReceived delegate I add a registration to get a token for the Outlook resource as well as the Azure AD Graph one.

AuthenticationContext authContext = new AuthenticationContext(aadInstance + tenantID, new ADALTokenCache(signedInUserID));
AuthenticationResult result = authContext.AcquireTokenByAuthorizationCode(
    code, new Uri(HttpContext.Current.Request.Url.GetLeftPart(UriPartial.Path)), credential, graphResourceId);
AuthenticationResult result2 = authContext.AcquireTokenByAuthorizationCode(
    code, new Uri(HttpContext.Current.Request.Url.GetLeftPart(UriPartial.Path)), credential, outlookResourceId);

This means that for every resource endpoint you want to query, you add a new token registration in StartupAuth.cs. It took me a few tries to figure that out and how it all fits together.

Refactor UserInfo.cs to get tokens from multiple resources

If you look at the end of UserInfo.cs, there is a function named GetTokenForApplication which uses your token cache to silently get you a valid authorization token for the operation you are performing. The flow for Azure AD Graph is the same as for Exchange.

I add a new file named TokenHelper.cs to my project with a generic class to be re-used when acquiring tokens.

using System;
using System.Configuration;
using System.Security.Claims;
using System.Threading.Tasks;
using Microsoft.IdentityModel.Clients.ActiveDirectory;
using Pzl.SampleMultiTenancy.Models;

namespace Pzl.SampleMultiTenancy
{
  public static class TokenHelper
  {
    public const string GraphResourceId = "https://graph.windows.net";
    public const string UnifiedResourceId = "https://graph.microsoft.com";
    public const string OutlookResourceId = "https://outlook.office365.com/";
    private static readonly string aadInstance = ConfigurationManager.AppSettings["ida:AADInstance"];
    private static readonly string clientId = ConfigurationManager.AppSettings["ida:ClientId"];
    private static readonly string appKey = ConfigurationManager.AppSettings["ida:ClientSecret"];

    public static async Task<string> GetTokenForApplicationSilent(string resourceId)
    {
      var signedInUserID = ClaimsPrincipal.Current.FindFirst(ClaimTypes.NameIdentifier).Value;
      var tenantID = ClaimsPrincipal.Current.FindFirst("http://schemas.microsoft.com/identity/claims/tenantid").Value;
      var userObjectID = ClaimsPrincipal.Current.FindFirst("http://schemas.microsoft.com/identity/claims/objectidentifier").Value;
      var authenticationContext = new AuthenticationContext(aadInstance + tenantID, new ADALTokenCache(signedInUserID));
      try
      {
        // get a token for the Graph without triggering any user interaction (from the cache, via multi-resource refresh token, etc)
        var clientcred = new ClientCredential(clientId, appKey);
        // initialize AuthenticationContext with the token cache of the currently signed in user, as kept in the app's EF DB

        var authenticationResult =
          await
            authenticationContext.AcquireTokenSilentAsync(resourceId, clientcred,
              new UserIdentifier(userObjectID, UserIdentifierType.UniqueId));
        return authenticationResult.AccessToken;
      }
      catch (AggregateException e)
      {
        foreach (Exception inner in e.InnerExceptions)
        {
          if (!(inner is AdalException)) continue;
          if (((AdalException)inner).ErrorCode == AdalError.FailedToAcquireTokenSilently)
          {
            authenticationContext.TokenCache.Clear();
          }
        }
        throw e.InnerException;
      }
      catch (AdalException exception)
      {
        if (exception.ErrorCode == AdalError.FailedToAcquireTokenSilently)
        {
          authenticationContext.TokenCache.Clear();
          throw;
        }
        return null;
      }
    }
  }
}

In UserInfo.cs switch out

ActiveDirectoryClient activeDirectoryClient =
   new ActiveDirectoryClient(serviceRoot, async () => await GetTokenForApplication());

with

ActiveDirectoryClient activeDirectoryClient =
  new ActiveDirectoryClient(serviceRoot, async () => 
    await TokenHelper.GetTokenForApplicationSilent(TokenHelper.GraphResourceId));

and remove the GetTokenForApplication method as it’s no longer needed.

Patching faulty logic in AdalTokenCache.cs

Each individual user accessing the application will have a cache entry, where each cache entry stores a token for all resources registered. In my case for the AAD Graph API and Outlook API.

The “bug” in the cache logic resides in the AfterAccessNotification method.

void AfterAccessNotification(TokenCacheNotificationArgs args)
{
    // if state changed
    if (this.HasStateChanged)
    {
        Cache = new UserTokenCache
        {
            webUserUniqueId = userId,
            cacheBits = MachineKey.Protect(this.Serialize(), "ADALCache"),
            LastWrite = DateTime.Now
        };
        // update the DB and the lastwrite 
        db.Entry(Cache).State = Cache.UserTokenCacheId == 0 ? EntityState.Added : EntityState.Modified;
        db.SaveChanges();
        this.HasStateChanged = false;
    }
}

Every time the cache is updated with new access tokens, a new cache entry is saved in the database, as Cache.UserTokenCacheId will always be 0. Had the original code author used SingleOrDefault instead of FirstOrDefault in the different code parts, this bug would have been caught pretty quickly.

As I don’t like to store more data than needed, and in theory could get random cache tokens back, I replace the above code with:

void AfterAccessNotification(TokenCacheNotificationArgs args)
{
    // if state changed
    if (this.HasStateChanged)
    {
        Cache = Cache ?? new UserTokenCache();
        Cache.webUserUniqueId = userId;
        Cache.cacheBits = MachineKey.Protect(this.Serialize(), "ADALCache");
        Cache.LastWrite = DateTime.Now;
        // update the DB and the lastwrite 
        db.Entry(Cache).State = Cache.UserTokenCacheId == 0 ? EntityState.Added : EntityState.Modified;                
        db.SaveChanges();
        this.HasStateChanged = false;
    }
}

My updated code checks if the cache is already loaded from the database and if so, updates the row. If it is indeed a new user, then a new entry is added.

I also replace all instances of FirstOrDefault with SingleOrDefault. This will force a clean-up of duplicate entries in the cache database. In Visual Studio, open up the .mdf file located in the App_Data folder and clear out the entries from the UserTokenCaches table. Fixing this before the final deploy to Azure, when millions of users crowd in, seems like a pretty smart move.


image

Send an e-mail

Now it’s time to use the Outlook API and see if I can actually send an e-mail. On UserInfo.aspx I drop a button with a click event.

                    <tr>
                        <td>Last Name</td>
                        <td><%#: Item.Surname %></td>
                    </tr>
                </table>
            </ItemTemplate>
        </asp:FormView>
        <asp:Button ID="SubmitBtn" runat="server" Text="Send Mail" OnClick="SubmitBtn_OnClick"></asp:Button>
    </asp:Panel>
</asp:Content>

In the code-behind I add the good old check for postbacks so as not to re-bind the user information.

protected void Page_Load(object sender, EventArgs e)
{
    if (!this.IsPostBack)
    {
        RegisterAsyncTask(new PageAsyncTask(GetUserData));
    }
}

and the code to send the e-mail goes as follows.

private Task SendEmailTask()
{
    return Task.Run(async () =>
    {
        var client =
            new OutlookServicesClient(new Uri("https://outlook.office365.com/api/v1.0"),
                async () => await TokenHelper.GetTokenForApplicationSilent(TokenHelper.OutlookResourceId));
        // Prepare the outlook client with an access token
        await client.Me.ExecuteAsync();

        var servicePointUri = new Uri(TokenHelper.GraphResourceId);
        var tenantId = ClaimsPrincipal.Current.FindFirst("http://schemas.microsoft.com/identity/claims/tenantid").Value;

        // Load your profile and retrieve e-mail address - could have been cached on initial page load
        var serviceRoot = new Uri(servicePointUri, tenantId);
        var activeDirectoryClient = new ActiveDirectoryClient(serviceRoot,
            async () => await TokenHelper.GetTokenForApplicationSilent(TokenHelper.GraphResourceId));
        var user = activeDirectoryClient.Me.ExecuteAsync().Result;

        var body = new ItemBody
        {
            Content = "<h1>YOU DID IT!!</h1>",
            ContentType = BodyType.HTML
        };
        var toRecipients = new List<Recipient>
        {
            new Recipient
            {
                EmailAddress = new EmailAddress {Address = user.Mail}
            }
        };
        var newMessage = new Message
        {
            Subject = "O365 Mail by Mikael",
            Body = body,
            ToRecipients = toRecipients,
            Importance = Importance.High
        };
        await client.Me.SendMailAsync(newMessage, true); // true = save a copy in the Sent folder
    });
}

Testing the application and clicking the Send Email button on the UserInfo page should now send an e-mail to yourself. As there is no navigation link to the UserInfo page, simply append /UserInfo to your URL to test it out: https://localhost:44300/UserInfo. If all goes to plan you should see an e-mail in the test user’s mailbox.

clip_image002

O365 Dev Challenges - Part 4 - Connecting Office 365 services to the project

At the moment I have an application which lets me sign in and read data from Azure AD. As I also want to send e-mail, I need to add that capability to the application. This is achieved by adding a connected service to the project.

Adding Office 365 resources to your project

Right click on the web project and add a Connected Service.

clip_image002

I want to change the Azure AD settings and add e-mail capabilities, so I pick Office 365 APIs and click Configure.

clip_image004

Pick the domain of the Azure AD which will host the application, typically the Azure instance mentioned in the Prerequisites in Part 1.

image

On the next screen pick Use settings from an existing Azure AD application to access Office 365 API services. The client id should match the id you have in your web.config file.

image

Click Next until you get to the Mail settings, where you select Send mail as you. This allows the application to later send e-mails on the user’s behalf using the Office 365 APIs.

clip_image002[7]

Move on further to Users and Groups.

Remember that I originally checked the Read directory data permission when creating the ASP.NET application in order to get the AdalTokenCache code inserted into the project. Now I can remove it as I’m not going to read multiple items from Azure AD.

clip_image004[6]

When you click Finish, VS2015 starts adding the service, pulls in the needed NuGet packages, and reconfigures your app registration with the chosen delegated permissions.

clip_image005

Taking a look at web.config I see that two new lines have been added: the tenant id and the domain where you are hosting the application. This is all good, but not needed as we are doing a multi-tenant application, so you can safely remove them again.

Check the web.config file.

image
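The two added keys will look something like the fragment below. The exact key names may vary between template and wizard versions, so treat this as a sketch rather than a verbatim copy of the generated file:

```xml
<appSettings>
  <!-- Added by the connected service wizard; safe to remove for a multi-tenant app -->
  <add key="ida:Domain" value="techmikael.onmicrosoft.com" />
  <add key="ida:TenantId" value="00000000-0000-0000-0000-000000000000" />
</appSettings>
```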

Back in the Azure management portal, the app registration page reflects the changes you made in the wizard: one AAD permission for logging in and reading the user’s profile, and one Exchange permission for sending e-mails.

clip_image002

Which NuGet packages have been added?

Open up the NuGet package manager and you’ll see that a reference to OutlookServices has been added. This library contains the OutlookServicesClient, which is used to perform e-mail actions.

clip_image004[8]

Now is also a perfect opportunity to upgrade some of the default packages which have been added to the project.

I upgraded Microsoft.Azure.ActiveDirectory.GraphClient from v2.0.2 to 2.1.0 which allows you to replace

IUser user = activeDirectoryClient.Users
.Where(u => u.ObjectId.Equals(userObjectID))
.ExecuteAsync().Result.CurrentPage.ToList().First()

with

IUser user = activeDirectoryClient.Me.ExecuteAsync().Result

to get the current user’s object from AAD.

NOTE: The above code replacement is a necessity as I removed the application permission to Read directory data. The original code does a query on all items, while the latter loads your profile only. If you don’t replace it, you will get a permission error when loading the user.

Upgrading the other packages should also be fine, to make sure you are current.

O365 Dev Challenges - Part 3 - What did the wizard leave behind?

Azure Artefacts

To see all that has happened after the VS2015 wizard has cast all its magical spells, let’s log into the Azure Management portal at https://portal.azure.com/, hit Browse All, select Resource groups, and click the new resource group you created in the wizard step.

image

First there is a Visual Studio component (which I actually didn’t check on the ASP.NET creation page, but no biggie), then there is the database server with the application database. Next is the service plan entry and the Azure Web Application I will publish to in the end.

Next navigate to the old Azure Management portal at https://manage.windowsazure.com/, pick Active Directory, pick the AD instance for your demo tenant, and click the Applications heading.

The application page should show the entry for the newly created application.

image

Click the application, and click the Configure heading.

image

You might have noticed the auto-generated name on the consent page, and this screen is the place to change it to something more readable. I’m replacing Pzl.SampleMultiTenancy_20150918180222 with My Cool Sample Multi-tenant App.

Scroll a bit down, and you may want to change the APP ID URI as well. I’m changing mine to https://techmikael.onmicrosoft.com/PzlSampleMultiTenancy.

image

The Client ID listed is the App Principal Id, and inside web.config you’ll see it sitting there with the general Azure login URL and client secret. Also note the SIGN-ON URL and REPLY URL fields, which right now are pointing to your local dev environment. I’ll change these later when I’m ready for production in Part 7! At the bottom of the above screenshot you see two delegated permissions, which are the ones checked off in the creation wizard and displayed on the consent screen shown earlier.

image
Now I have a basic application up and running which can operate against the https://graph.windows.net/ service end-point in Office 365.

Authorization code

When I checked the Read directory data box during the authorization part of the project creation, a couple of things happened. In the file StartupAuth.cs, located in the App_Start folder, a piece of code got injected to make sure you get authenticated and to retrieve an authorization token from the graph.windows.net resource for the resources the application is allowed to use.

image

In the Models folder two files got added, AdalTokenCache.cs and ApplicationDbContext.cs. These two files and the injection into StartupAuth.cs are the automagic I was looking for to make my adventure a smooth ride into the sunset. Let someone else write the plumbing!

A reference to AdalTokenCache can be found in line 65 of StartupAuth.cs.

AuthenticationContext authContext = new AuthenticationContext(aadInstance + tenantID, new ADALTokenCache(signedInUserID));

What happens in this line of code is that on application start the application retrieves an authorization code and caches the token for the resource specified, in this case https://graph.windows.net, which contains operations against Azure AD. See Azure AD Graph API Common Queries for a full list of available operations.
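Conceptually, the cache behaves like a per-user dictionary from (user, resource) to a token with an expiry. The toy sketch below is purely my own illustration of that idea, not the actual ADAL implementation:

```csharp
using System;
using System.Collections.Generic;

// Toy illustration (mine, not the real ADAL code) of what a token cache
// does conceptually: keep an access token per (user, resource) pair and
// hand it back while it is still valid. The generated AdalTokenCache
// persists its entries to the SQL database instead of a dictionary.
public class ToyTokenCache
{
    private readonly Dictionary<(string User, string Resource), (string Token, DateTimeOffset Expires)> _entries =
        new Dictionary<(string, string), (string, DateTimeOffset)>();

    public void Store(string user, string resource, string token, DateTimeOffset expires) =>
        _entries[(user, resource)] = (token, expires);

    // Returns the cached token, or null when missing or expired
    public string TryGet(string user, string resource, DateTimeOffset now) =>
        _entries.TryGetValue((user, resource), out var entry) && entry.Expires > now
            ? entry.Token
            : null;
}
```

This also hints at why a cache keyed per user and resource matters later on, when the application talks to more than one resource endpoint.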

Take a look at ActiveDirectoryClient

The home page does not take advantage of or need the consented rights, but another page has been added to the project, UserInfo.aspx. This page uses the ActiveDirectoryClient to load the user’s display name, first name and last name from the Azure Active Directory user profile.


image

Uri servicePointUri = new Uri(graphResourceId);
Uri serviceRoot = new Uri(servicePointUri, tenantID);
ActiveDirectoryClient activeDirectoryClient = new ActiveDirectoryClient(serviceRoot,
      async () => await GetTokenForApplication());

// use the token for querying the graph to get the user details
IUser user = activeDirectoryClient.Users
    .Where(u => u.ObjectId.Equals(userObjectID))
    .ExecuteAsync().Result.CurrentPage.ToList().First();

You can tell this is demo code, as the same values can actually be retrieved from the user’s claims, saving you an extra HTTP call.

var displayName = ClaimsPrincipal.Current.FindFirst("name").Value;
var givenName = ClaimsPrincipal.Current.FindFirst(ClaimTypes.GivenName).Value;
var surname = ClaimsPrincipal.Current.FindFirst(ClaimTypes.Surname).Value;

O365 Dev Challenges - Part 2 – Creating the initial project and add-in registrations in Azure AD

Prerequisites before starting the adventure

  • Visual Studio 2015
  • An Azure account – I’m using my MSDN account
  • An Office 365 tenant to host the application – I’m using a dev tenant tied to my MSDN subscription. If you don’t have MSDN you can sign up for a developer account for $99/year.
  • Make sure the O365 tenant is tied to your Azure account so you can handle the app registration later.
  • A second Office 365 tenant with e-mail capabilities to test the multi-tenancy – a trial tenant works just fine.

Create a new ASP.NET Web Application

I’ll do an old skool ASP.NET Web Forms application for this demo, but the patterns should be analogous for MVC.
Fire up VS2015 and kick off a new ASP.NET Web Application. I’ll name mine Pzl.SampleMultiTenancy.

image

Pick Web Forms as the template type, check Host in the cloud as the app will be hosted in Azure, and click Change Authentication to set how authentication should work for this add-in.

clip_image002

As I’m interested in Office 365 users only I check Work And School Accounts, and choose the Azure AD domain of the tenant where the add-in will be hosted. The drop-down populates according to the account you have logged into using Visual Studio – refer to the prerequisites part.

In the authorization drop-down pick Cloud - Multiple Organizations, as the application should target more than just one tenant making it a multi-tenant application. If you pick Single Organization the app will only work at the tenant where the add-in is registered originally.

clip_image002[7]


Important: The most important part of the above wizard screen before hitting the OK button is to make sure the Read directory data checkbox is checked. This ensures that the AdalTokenCache helper class is provisioned and hooked into your code. If you miss this step, then I’ve found no easy way to get it in except manual wire-up – which to me is too much hassle. Did I say I'm a lazy coder?


Once you click OK the web project setup screen should look similar to the image below.

image

Next up is configuring the Azure Web App settings. The values here will be used in Part 7 where the final application is published to Azure.

image
  • Give the web application a unique available name
  • Pick the Azure subscription to host the web application
  • Pick an existing service plan or create a new one
  • Pick an existing resource group or create a new one
  • Pick the Azure region where you want the web application provisioned
  • Pick an existing database server, or create a new one. This will be used as the storage for the AdalTokenCache.
    • Make sure the password is at least 8 characters and includes characters from three of the following categories: Uppercase letters, lowercase letters, numbers and special characters. The password must also not contain the username.
Note: If you create your resources and database using the Azure management portal before you start the project, you remove the risk of typing names already in use or having the password validation fail on you. If something fails you cannot easily recover, and you might want to start over. Check Part 7 about troubleshooting.
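As a sanity check before running the wizard, the password rules above can be expressed in a few lines of code. This is a hypothetical helper of my own, not part of the wizard output, and it simply encodes the rules as stated:

```csharp
using System;
using System.Linq;

// Illustrative helper (not generated by the wizard) encoding the Azure SQL
// password rules described above: at least 8 characters, characters from
// three of the four categories, and must not contain the username.
public static class PasswordRules
{
    public static bool IsValid(string password, string userName)
    {
        // At least 8 characters
        if (password == null || password.Length < 8) return false;

        // Must not contain the username
        if (!string.IsNullOrEmpty(userName) &&
            password.IndexOf(userName, StringComparison.OrdinalIgnoreCase) >= 0)
            return false;

        // Characters from at least three of the four categories
        int categories = 0;
        if (password.Any(char.IsUpper)) categories++;
        if (password.Any(char.IsLower)) categories++;
        if (password.Any(char.IsDigit)) categories++;
        if (password.Any(c => !char.IsLetterOrDigit(c))) categories++;
        return categories >= 3;
    }
}
```

Running a candidate password through a check like this up front beats watching the Azure provisioning fail halfway through.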

Click OK, and Visual Studio should start setting up all the pieces for you.

clip_image008

Test the application

Hit CTRL+F5 or your favorite key combo to start the project, and once logged in you should be redirected to the user consent page. The default wire-up is Read directory data and Sign in as you and read your profile. I will change this later on.

image


Note: Make sure the page opens either in a browser window where you are logged in with your dev tenant, or in a private window where you can log in. If it opens in some other window with a tenant where you don’t have admin rights, you will have a harder time changing the app registration later on. See Part 8 for details.


Once consented you should see the default ASP.NET page with your user principal name in the upper right corner.

clip_image011

O365 Dev Challenges - Part 1 - Introduction to creating a multi-tenant Office 365 add-in using VS2015

image

Background

The background for my O365 add-in coding adventure was that for Puzzlepart’s Office 365 customers we wanted a better way to integrate how they log support tickets with Office 365 directly. Today a user has to either send us an e-mail to the support account or log into our JIRA service desk in order to log a ticket.

The ticket logging action takes the user one step away from where he or she works, so why not instead create a multi-tenant Office 365 app/add-in which can be listed in the App launcher? This first implementation provides a simple add-in in Office 365 where they can file their tickets. Very similar to the existing ticket form, except we get single sign-on and tenant info directly. No need for a separate login or figuring out which information to write in an unstructured e-mail.

The next iteration of our support app will remove the need for sending e-mail and instead use the JIRA REST API to log the cases – requiring even less consent from the user. A more advanced version could rely on Application permissions granted by a tenant admin, but that’s a story for the future. I'm not ready to dive into the specifics of certificates.

Technical approach

I do have a basic understanding of how the user authorization flow works in Office 365 (more or less), but I wanted the easiest experience possible, where the plumbing code and scaffolding was created for me when I started to program in Visual Studio.

And utopia is almost within reach as long as you check the right boxes in the wizards to ensure the AdalTokenCache class is added and hooked into your project’s authentication flow. I’m sure there are GitHub samples out there which could have worked, but again, I wanted the VS-only approach.

Delegated permissions

The add-in I’m creating will use a minimal set of Office 365 features: allow login, read the user’s profile, and send e-mail on the user’s behalf. I could have started off with the Office 365 unified API, which is in preview, but as this was my first go I wanted to go with the wizards and the easiest approach possible. Switching to the Unified API is covered in Part 6 – and the switch was not without challenges.

These are some of the pain points experienced on my journey:
  • hooking up the OWIN authentication flow for multiple resource end-points
  • making sure the token cache worked with multiple resources and cleared correctly when the cached tokens were no longer valid
  • patching the injected code to keep only one database entry per user for caching
  • figuring out that the connected service wizard and the publish web to Azure wizard have overlapping functionality which only works together when done right

Shout-outs

I had some help from reading Tobias Zimmergren’s series on Getting started with Office 365 development and Chaks post on Making your Office 365 App Available to Multiple Organizations, but the wizards have since changed and I’m now on VS2015.

To make the experience as easy as possible I chose not to require any tenant admin setup, but instead let each user consent to the app by themselves. On login I check if the tenant is a registered customer, and if not you won’t be able to log a ticket. This is implemented with a simple database keeping track of valid tenant ids and domains.

The demo app will not do the tenant check part, but covers logging in a user and sending an e-mail on his/her behalf.
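For completeness, here’s a minimal sketch of what such a tenant check could look like. Everything below is hypothetical illustration: the real implementation looks the values up in a database table, and the tenant id comes from the tenantid claim on sign-in (the same claim used in the SendEmailTask code in Part 5):

```csharp
using System;
using System.Collections.Generic;

// Hypothetical sketch of the customer tenant check. A HashSet stands in
// for the database table of registered tenant ids; the tenant id itself
// would come from the "tenantid" claim of the signed-in user.
public class TenantRegistry
{
    private readonly HashSet<string> _registeredTenantIds;

    public TenantRegistry(IEnumerable<string> registeredTenantIds) =>
        _registeredTenantIds = new HashSet<string>(registeredTenantIds, StringComparer.OrdinalIgnoreCase);

    // Deny by default: only known tenants get to log tickets
    public bool IsRegisteredCustomer(string tenantId) =>
        !string.IsNullOrEmpty(tenantId) && _registeredTenantIds.Contains(tenantId);
}
```

A check like this runs once per login, so a database lookup (or a cached copy of the table) is cheap enough.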

Read on!

Friday, September 25, 2015

Getting past the CSWP item limit of 50 results–in one page load

I wrote about a solution to this back in 2013, which today seems a bit hacky. Elio Struyf also recently wrote about this (and I stole the post title from him :-), where he does subsequent 50-item loading calls to fill up the item list.

Here I’ll show a solution which can load up to whatever limit is set at the SSA or tenant level (500 by default) on the initial load of the CSWP. I will use the same approach as I did when configuring the OSSSearchResults.aspx page in How to do a light weight search center using osssearchresults.aspx (part 2/3).

The limit of 50 is hard coded in the CSWP server side – see my original post for details.

Wednesday, September 16, 2015

Export and import property bag values from SharePoint using PowerShell

This one is for us oldies still doing on-premises work with SharePoint.

image

I’m currently working on a 2010 to 2013 migration and we need to move some values from the farm and web application property bags as we’re not doing a 1:1 migration, but merging multiple 2010 farms solutions into one new 2013 farm.

So I googelized a bit and came up with this script which can export and import property bag values from SPFarm, SPWebApplication, SPSite and SPWeb. It should be pretty self-explanatory to use, and an invocation could look like this:

./Export-Import-Props.ps1 –Url http://myserver -CsvFile ./test.csv -Level SPWebApplication -Mode Export

Tuesday, September 15, 2015

Adding icons to Visio 2013 files in search results

clip_image001

It’s been a while since Microsoft launched Office 2013, and with Office 2016 soon coming out they still have not thought about giving Visio 2013 files a file icon in the search results.

The issue is two-fold. The Result Type used to pick the default Office display template and the JavaScript function Srch.U.getFriendlyNameForFileExtension (Search.ClientControls.js) only match on pre-2013 file extensions.

Wednesday, September 9, 2015

Inconvenient renaming of a Group

I recently created a new group named “Blog posts” at our company, getting the URL https://tenant.sharepoint.com/sites/blogposts. Then a colleague said this was not a good name as it should be named “Authoring like a Pro” instead.

Ok… I could have deleted the group and created a new one, but I opted to rename it instead.

image

Friday, September 4, 2015

Creating another Delve Clone–A real one this time!

Ok… once again it’s Friday, and that means my mind starts to spin. This time around a question came in about using Delve for external users from the Delve group of the Office 365 Yammer network.

image

The Delve application itself is not available for external users, but the signals from external users are. I have previously created a Delve clone using the Content Search Web Part, which also works for external users. It should be polished a bit though :)

Wednesday, August 26, 2015

“Failed to start the flow” error when importing dictionaries in SharePoint 2013 search

image
During a production deployment today we encountered the following error when running the Import-SPEnterpriseSearchCustomExtractionDictionary commandlet for a custom dictionary.

Importing \\dropserver\deployment$\Internal\dictionaries\site_types.csv to Microsoft.UserDictionaries.EntityExtraction.Custom.WordPart.3
Import-SPEnterpriseSearchCustomExtractionDictionary : Failed to start the flow: Microsoft.CXDDeploymentCaseInSensitive

There was no apparent reason for this, and if it had been a test environment we would have just restarted the Search Host Controller service on all machines and hoped it fixed the issue. As we were in production, that was not an option.

Tuesday, August 25, 2015

Delve blog articles dissected

image
Kanwal Khipple had a question on the Office 365 Yammer network the other day: how can you surface Delve blog articles on a SharePoint site?

To figure out if it’s doable, and how, you need to understand how the Delve article solution is built. I won’t go into the actual digging, but rather lay out the building blocks from a technical perspective.
If you want the short story: surfacing metadata other than Title and Author is not trivial from a search perspective – which is also what you see on a Delve card.

image

Wednesday, August 19, 2015

How to do case-insensitive sorting in SharePoint search

Short answer, you can’t.

image

If you’re working on-premises, read on for the longer answer. If you’re working in SharePoint Online, take a deep breath, sigh, and log a support ticket so we can get this into the product.