There are numerous posts out there on how to programmatically create a cookie-based crawl rule which first logs in at a specific URL, and then uses the cookies received when crawling the content source. This is, however, the first one on how to create a persistent one.
Here’s a brief sample:
$ssa = Get-SPEnterpriseSearchServiceApplication

# Create rule
$rule = New-SPEnterpriseSearchCrawlRule -SearchApplication $ssa -Path 'http://*.contoso.com/*' -Type InclusionRule -FollowComplexUrls $true -AuthenticationType CookieRuleAccess

# Add cookie information
$authUrl = "http://www.contoso.com/auth?username=foo&password=bar"
$rule.SetCredentials([Microsoft.Office.Server.Search.Administration.CrawlRuleAuthenticationType]::CookieRuleAccess, "pzl", $authUrl)
Before each crawl, the code above will call the URL in $authUrl to get the cookie. The cookie value "pzl" added above is just a dummy value to make the command work.
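If you want to double-check what was created, a quick sketch (assuming the same $ssa and rule path as above):

# List the crawl rules on the search application and inspect the new one
Get-SPEnterpriseSearchCrawlRule -SearchApplication $ssa |
    Where-Object { $_.Path -eq 'http://*.contoso.com/*' } |
    Format-List Path, Type, AuthenticationType, FollowComplexUrls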
Sometimes, however, you might have a persistent cookie you want to add, just like you can from the SharePoint UI.
The crux is to change the auth URL to file://local, and you are set to go. Just be sure to add real cookie data instead of the dummy data!
$rule.SetCredentials([Microsoft.Office.Server.Search.Administration.CrawlRuleAuthenticationType]::CookieRuleAccess,"myUser=crawlUser; myPwd=crawlPassword","file://local")
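Putting it all together, a full sketch for a persistent-cookie rule might look like this (the cookie names and values myUser/myPwd are placeholders; substitute your real ones):

# Create the rule and attach a fixed cookie string instead of a login URL
$ssa = Get-SPEnterpriseSearchServiceApplication
$rule = New-SPEnterpriseSearchCrawlRule -SearchApplication $ssa -Path 'http://*.contoso.com/*' -Type InclusionRule -FollowComplexUrls $true -AuthenticationType CookieRuleAccess
# Multiple cookies go in one string, separated by "; "
$rule.SetCredentials([Microsoft.Office.Server.Search.Administration.CrawlRuleAuthenticationType]::CookieRuleAccess, "myUser=crawlUser; myPwd=crawlPassword", "file://local")

If you need to start over, Remove-SPEnterpriseSearchCrawlRule -Identity 'http://*.contoso.com/*' -SearchApplication $ssa removes the rule so you can recreate it.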
How did I find this out? Again, by reflecting over the code used for the UI page. I hope it might help someone else.