Sunday, February 23, 2014

How to trigger a full re-index in SharePoint Online

[Update 2015-01-05]
Latest version of the script can be found at

In SharePoint Online you cannot trigger a full re-index of all your data as you can on-premises via the Content Sources on your Search Service Application.

And this might be something you would like to do, especially if you start mapping crawled properties to new managed properties. A full list of scenarios which require a full re-index is available at TechNet.

The only option provided in SPO is to trigger a re-index via the settings pages of a site or a list/library, as outlined in the SPO documentation.

But fortunately this is not entirely true, as you can use CSOM to manipulate a site's property bag and set or change the value of vti_searchversion, which is what triggers a re-index of that site. The same property applies to a list/library.

The script below uses the SharePoint Online Management cmdlets to iterate over all the site collections, and SharePoint CSOM to iterate over the sites within each site collection. For each site it will update vti_searchversion, which will cause the site to be picked up for re-indexing on the next crawl cycle.

Note: This will not improve crawl time, merely ensure items are re-indexed.
The script may be downloaded from

# Re-index SPO tenant script
# Author: Mikael Svenson - @mikaelsvenson
# Blog:

Import-Module Microsoft.Online.SharePoint.PowerShell -DisableNameChecking
# replace these details or use Get-Credential to enter password securely as script runs
$username = "" 
$password = "password" 
$url = ""
$adminUrl = ""
$securePassword = ConvertTo-SecureString $password -AsPlainText -Force 

# change to the path of your CSOM dll's
$csomPath = "c:\Program Files\Common Files\microsoft shared\Web Server Extensions\15\ISAPI"

Add-Type -Path "$csomPath\Microsoft.SharePoint.Client.dll" 
Add-Type -Path "$csomPath\Microsoft.SharePoint.Client.Runtime.dll" 

$credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($username, $securePassword) 

function Reset-Webs( $siteUrl ) {
    $clientContext = New-Object Microsoft.SharePoint.Client.ClientContext($siteUrl)
    $clientContext.Credentials = $credentials

    if (!$clientContext.ServerObjectIsNull.Value) {
        Write-Host "Connected to SharePoint Online site: '$siteUrl'" -ForegroundColor Green
    }

    function processWeb($web) {
        $subWebs = $web.Webs
        $clientContext.Load($web)
        $clientContext.Load($web.AllProperties)
        $clientContext.Load($subWebs)
        $clientContext.ExecuteQuery()

        [int]$version = 0
        $allProperties = $web.AllProperties
        Write-Host "Web URL:" $web.Url -ForegroundColor White

        # read the current search version, if one has been set before
        if( $allProperties.FieldValues.ContainsKey("vti_searchversion") -eq $true ) {
            $version = $allProperties["vti_searchversion"]
        }
        Write-Host "Current search version: " $version -ForegroundColor White

        # bump the version to mark this web for re-indexing on the next crawl
        $version++
        $allProperties["vti_searchversion"] = $version
        $web.Update()
        $clientContext.ExecuteQuery()
        Write-Host "Updated search version: " $version -ForegroundColor White

        # recurse into all sub-webs
        foreach ($subWeb in $subWebs) {
            processWeb($subWeb)
        }
    }

    $rootWeb = $clientContext.Web
    processWeb($rootWeb)
}

$spoCredentials = New-Object System.Management.Automation.PSCredential($username, $securePassword)

Connect-SPOService -Url $adminUrl -Credential $spoCredentials
Get-SPOSite | foreach { Reset-Webs -siteUrl $_.Url } 
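As mentioned above, the same property applies to a list/library. A minimal sketch of the list-level equivalent is shown below. It assumes the CSOM assemblies and $credentials from the script above are already loaded; the site URL and the list title "Documents" are only placeholders, and the approach of bumping vti_searchversion on the list's root folder property bag mirrors what the "re-index list" button does in the UI.

```powershell
# Hedged sketch: re-index a single list/library by bumping vti_searchversion
# on its root folder property bag. $siteUrl and $listTitle are example values.
$siteUrl = "https://tenant.sharepoint.com/sites/somesite"
$listTitle = "Documents"

$ctx = New-Object Microsoft.SharePoint.Client.ClientContext($siteUrl)
$ctx.Credentials = $credentials

$list = $ctx.Web.Lists.GetByTitle($listTitle)
$props = $list.RootFolder.Properties
$ctx.Load($props)
$ctx.ExecuteQuery()

# read the current value (0 if never set), then bump it
[int]$version = 0
if ($props.FieldValues.ContainsKey("vti_searchversion")) {
    $version = $props["vti_searchversion"]
}
$props["vti_searchversion"] = $version + 1
$list.RootFolder.Update()
$ctx.ExecuteQuery()
```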


  1. Perfect, just what I was looking for thank you.

  2. Great post Mikael.

    Do you have any solutions for full crawl of user profiles. This is an even bigger problem.

    1. My guess is that if you do a save on a profile, it will be re-crawled, but haven't tested this yet.

    2. Yes that is absolutely the case. The problem is that when you create new refiners that need to be populated, you will need to ask all your users to make a change on their profile in order to get them crawled, OR ask MS to run a full crawl, which never seems to run unless you ask them. So a solution similar to what you describe, but for user profiles, would be awesome

    3. You're right, I'm facing the same problem. Have you found the solution yet? Or do I have to populate each profile somehow? Thanks in advance

    4. This comment has been removed by a blog administrator.

    5. I've added script for profiles as well.

  3. This comment has been removed by a blog administrator.

  4. I have tested this a couple of days ago. Doesn't seem to work (anymore)? Have you tested this recently?

    1. Used it last week, with the latest version of the script. And did you try via the UI for your site instead?

  5. My experience is it never works, and perhaps it slows down the site too. I got the latest script from:

    But it never worked. Can someone please guide me if I am missing anything?


    1. Hi, is the script working without errors, or what are you seeing? I have used it a lot of times, and it does basically the same as clicking the "re-index this site" button: it ensures the site is picked up on the next crawl. If you create the CrawlTime managed property (which is another post) then you can see when the item was last crawled.

  6. I needed to add -Limit All to your script due to the number of site collections. Where do I add this parameter in the script?
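Regarding the -Limit All question above: Get-SPOSite returns only the first 200 site collections by default, so it goes on the Get-SPOSite call in the last line of the script:

```powershell
# Retrieve all site collections, not just the default first 200
Get-SPOSite -Limit All | foreach { Reset-Webs -siteUrl $_.Url }
```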