Friday, May 22, 2015

Get Sharepoint current datetime in javascript for Sharepoint Online

First of all, let’s clarify what I mean by Sharepoint current datetime in the title. This is the datetime which you can see in the Created column when you create a new item in a list or upload a new document to a document library. As you probably know, internally Sharepoint stores all datetimes in the content database in UTC. In Site settings > Regional settings it is possible to specify the timezone for the current site. Depending on the selected timezone Sharepoint will show datetimes to end users (if you change the timezone and refresh a list view, the values in the Created and Modified fields will change). I.e. Sharepoint current datetime is not always the same as the server datetime which is set in the OS.

In practice we often need to get the Sharepoint current datetime and the server datetime, e.g. when we want to display news items which are not older than N days. In order to do that we get the server’s current datetime and compare it with the Created/Modified fields of the news items. The question is how to get this datetime in javascript? We are talking about javascript because this is currently the basic way to create custom components for Sharepoint Online.

One of the ways is to use the _spPageContextInfo.clientServerTimeDelta variable, which is defined like this:

clientServerTimeDelta: new Date("2015-05-21T16:54:00.0000000Z") - new Date()

If you check my other article How to get URL of current site collection and other server side properties on client site in Sharepoint, you will find that the date defined in the string in the example above is the current UTC datetime:

sb.Append("\", clientServerTimeDelta: new Date(\"");
sb.Append(DateTime.UtcNow.ToString("o", CultureInfo.InvariantCulture));
sb.Append("\") - new Date()");

In javascript “new Date()” creates a datetime object in the local client’s timezone. I.e. _spPageContextInfo.clientServerTimeDelta contains the bias in milliseconds between the client’s current datetime and the server’s current UTC datetime. When we need the server’s current UTC datetime for comparing it with the Created/Modified date, we may get it like this:

var d = new Date();
// clientServerTimeDelta is a number (milliseconds), so add it to the timestamp:
// Date + number in javascript would perform string concatenation instead
var serverDateTimeNow = new Date(d.getTime() + _spPageContextInfo.clientServerTimeDelta);

There will be a small difference from the actual server time depending on how late the above code is executed, but it is not crucial. The problem is that _spPageContextInfo.clientServerTimeDelta may not always be available on your page, so let’s consider another approach as well.
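The idea can be wrapped into a small helper (a minimal sketch; the getServerUtcNow name and the extra nowMs parameter are mine, the latter only exists to make the function testable):

```javascript
// Returns the server's current UTC datetime as a Date object.
// clientServerTimeDelta: milliseconds (server UTC "now" minus client "now"),
// i.e. the value of _spPageContextInfo.clientServerTimeDelta.
// nowMs: client timestamp in milliseconds; defaults to the current client clock.
function getServerUtcNow(clientServerTimeDelta, nowMs) {
    if (typeof nowMs === "undefined") {
        nowMs = new Date().getTime();
    }
    // add the delta to the numeric timestamp, then construct a Date from it
    return new Date(nowMs + clientServerTimeDelta);
}
```

On a Sharepoint page it would be called as getServerUtcNow(_spPageContextInfo.clientServerTimeDelta).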

An interesting solution was posted by my colleague Vadim Gremyachev here: Get current server datetime through javascript REST. However, you may face a cross-domain request problem when you try to use this approach in a Sharepoint app (apps have their own subdomains). Also, when the offset is calculated:

var offset = data.d.Information.Bias / 60.0;

it doesn’t take the daylight bias into consideration. I.e. a more correct approach is the following:

var offset = (data.d.Information.Bias + data.d.Information.DaylightBias) / 60.0;

(There is also a StandardBias property, but it was set to 0 in the examples I saw. For safety you may add it as well.) In order to be able to use it in a Sharepoint app, the javascript object model should be used:

var context = new SP.ClientContext(appWebUrl);
var factory = new SP.ProxyWebRequestExecutorFactory(appWebUrl);
context.set_webRequestExecutorFactory(factory);

var hostContext = new SP.AppContextSite(context, postsArchiveURL);
var web = hostContext.get_web();
context.load(web);

context.executeQueryAsync(function () {
    var regionalSettings = web.get_regionalSettings();
    context.load(regionalSettings);
    context.executeQueryAsync(
        function () {
            var timeZone = regionalSettings.get_timeZone();
            context.load(timeZone);
            context.executeQueryAsync(
                function () {
                    var info = timeZone.get_information();
                    var offset = (info.get_bias() + info.get_daylightBias()) / 60.0;
                    var serverDateTimeNow =
                        new Date(new Date().getTime() - offset * 3600 * 1000).toISOString();
                    console.log("serverDateTimeNow: " + serverDateTimeNow);
                },
                function (sender, args) {
                    console.log(args.get_message());
                }
            );
        },
        function (sender, args) {
            console.log(args.get_message());
        }
    );
}, function (sender, args) {
    console.log(args.get_message());
});

I intentionally converted the serverDateTimeNow variable to a string using the toISOString() function, because in this format it may be added to a CAML query for comparing with the Created or Modified date.
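For example, a CAML condition which returns items created on or after that datetime could be built like this (a minimal sketch; the buildCreatedAfterQuery name is my own illustration, not from the original post):

```javascript
// Builds a CAML Where clause which matches items whose Created date is
// greater than or equal to the given datetime string.
// isoDateTime is expected in the format produced by toISOString().
function buildCreatedAfterQuery(isoDateTime) {
    return "<Where>" +
               "<Geq>" +
                   "<FieldRef Name='Created' />" +
                   "<Value Type='DateTime' IncludeTimeValue='TRUE'>" +
                       isoDateTime +
                   "</Value>" +
               "</Geq>" +
           "</Where>";
}
```

To get e.g. “news not older than N days”, subtract N days from serverDateTimeNow before passing it to this function.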

In order to get the Sharepoint current datetime we need to combine the 2 approaches: to the server’s current UTC datetime

var serverDateTimeNow = new Date(d.getTime() + _spPageContextInfo.clientServerTimeDelta);

add the offset as shown in the example above. As a result we will get a time which is almost the same as shown in the Created/Modified fields. Hope that this information will help you in your work.
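Putting both pieces together, the combined calculation could look like this (a minimal sketch; the function and parameter names are mine, and the bias values are the minute values returned by the time zone information object):

```javascript
// Returns the Sharepoint "current datetime" (the site's regional time)
// as an ISO 8601 string.
// deltaMs: _spPageContextInfo.clientServerTimeDelta (milliseconds).
// biasMin, daylightBiasMin: time zone information values (minutes).
// nowMs: client timestamp in milliseconds; defaults to the current client clock.
function getSharepointNow(deltaMs, biasMin, daylightBiasMin, nowMs) {
    if (typeof nowMs === "undefined") {
        nowMs = new Date().getTime();
    }
    var serverUtcMs = nowMs + deltaMs;                    // server's UTC "now"
    var offsetHours = (biasMin + daylightBiasMin) / 60.0; // regional offset
    return new Date(serverUtcMs - offsetHours * 3600 * 1000).toISOString();
}
```

With a zero delta and zero bias the function simply returns the client’s current UTC time; a negative bias (east of UTC) moves the result forward.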

Thursday, May 21, 2015

Enumerate all tenant’s site collections in Sharepoint Online via PowerShell

It is quite easy to enumerate all site collections via PowerShell for on-premise Sharepoint (see e.g. Set search settings in all site collections of Sharepoint web application via PowerShell), but for Sharepoint Online it is more tricky. A C# solution which uses the client object model was posted in the following article: Get list of site collections using CSOM in Office365. Let’s try to do the same in PowerShell. We will need the Microsoft.Online.SharePoint.Client.Tenant.dll library, which can be obtained from NuGet.

Here is the script:

   1: param(
   2:     [string]$adminWebAppUrl,
   3:     [string]$login,
   4:     [string]$password
   5: )
   6:  
   7: $currentDir = Convert-Path(Get-Location)
   8: $dllsDir = resolve-path($currentDir + "\dlls")
   9:  
  10: [System.Reflection.Assembly]::LoadFile([System.IO.Path]::Combine($dllsDir,
  11: "Microsoft.SharePoint.Client.dll"))
  12: [System.Reflection.Assembly]::LoadFile([System.IO.Path]::Combine($dllsDir,
  13: "Microsoft.SharePoint.Client.Runtime.dll"))
  14: [System.Reflection.Assembly]::LoadFile([System.IO.Path]::Combine($dllsDir,
  15: "Microsoft.SharePoint.Client.Taxonomy.dll"))
  16: [System.Reflection.Assembly]::LoadFile([System.IO.Path]::Combine($dllsDir,
  17: "Microsoft.Online.SharePoint.Client.Tenant.dll"))
  18:  
  19: if (-not $adminWebAppUrl)
  20: {
  21:     Write-Host "Specify admin web app url in adminWebAppUrl parameter"
  22: -foregroundcolor red
  23:     return
  24: }
  25:  
  26: if (-not $login)
  27: {
  28:     Write-Host "Specify user name in login parameter" -foregroundcolor red
  29:     return
  30: }
  31:  
  32: if (-not $password)
  33: {
  34:     Write-Host "Specify user password in password parameter" -foregroundcolor red
  35:     return
  36: }
  37:  
  38: function Do-Something($url)
  39: {
  40:     Write-Host "Working with $url" -foregroundColor green
  41:     # ... add your logic here
  42: }
  43:  
  44: # initialize client context
  45: $clientContext = New-Object Microsoft.SharePoint.Client.ClientContext($adminWebAppUrl)
  46: $clientContext.RequestTimeOut = 1000 * 60 * 10;
  47: $clientContext.AuthenticationMode =
  48: [Microsoft.SharePoint.Client.ClientAuthenticationMode]::Default
  49: $securePassword = ConvertTo-SecureString $password -AsPlainText -Force
  50: $credentials =
  51: New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($login,
  52: $securePassword)
  53: $clientContext.Credentials = $credentials
  54: $web = $clientContext.Web
  55: $site = $clientContext.Site
  56: $clientContext.Load($web)
  57: $clientContext.Load($site)
  58: $clientContext.ExecuteQuery()
  59:  
  60: # enumerate all site collections
  61: $web = $clientContext.Web
  62: $tenant = New-Object "Microsoft.Online.SharePoint.TenantAdministration.Tenant"
  63: -ArgumentList $clientContext
  64: $props = $tenant.GetSiteProperties(0, $true)
  65: $clientContext.Load($props)
  66: $clientContext.ExecuteQuery()
  67:  
  68: foreach($sp in $props)
  69: {
  70:     Do-Something $sp.Url
  71: }

The important point is that we need to provide the Sharepoint admin center URL as the parameter for this script, not the URL of any real site collection. In the script we initialize the client context (lines 45-58), enumerate site collections using the Tenant class from Microsoft.Online.SharePoint.Client.Tenant.dll (lines 61-71) and for each site collection call our custom function. Hope that it will be helpful.

Sunday, April 19, 2015

Perform search requests in Sharepoint via javascript object model

In one of my previous posts I showed how to get Sharepoint user profile properties via the javascript object model: here. In this post I will show how to do another common task via javascript: perform search requests. It will be useful e.g. if you develop a search-driven solution for Sharepoint Online. Here is the javascript code:

   1: var newsItems = [];
   2:  
   3: var NewsItem = function (title, url, ingress, newsDate) {
   4:     this.Title = title;
   5:     this.Url = url;
   6:     this.Ingress = ingress;
   7:     this.NewsDate = newsDate;
   8: }
   9: SP.SOD.executeFunc('sp.search.js',
  10: 'Microsoft.SharePoint.Client.Search.Query.KeywordQuery', function () {
  11:     var Search = Microsoft.SharePoint.Client.Search.Query;
  12:     var ctx = SP.ClientContext.get_current();
  13:     var site = ctx.get_site();
  14:     ctx.load(site);
  15:  
  16:     var query = new Search.KeywordQuery(ctx);
  17:     query.set_queryText("..."); // search query
  18:     query.set_enableSorting(true);
  19:  
  20:     var sortproperties = query.get_sortList();
  21:     sortproperties.add("NewsDate", 1);
  22:     query.set_rowLimit(100);
  23:     query.get_selectProperties().add("NewsIngress");
  24:     query.get_selectProperties().add("Path");
  25:     query.get_selectProperties().add("NewsDate");
  26:     query.set_trimDuplicates(false);
  27:  
  28:     var executor = new Search.SearchExecutor(ctx);
  29:     var result = executor.executeQuery(query);
  30:  
  31:     ctx.executeQueryAsync(function () {
  32:  
  33:         var tableCollection = new Search.ResultTableCollection();
  34:         tableCollection.initPropertiesFromJson(result.get_value());
  35:         var rows = tableCollection.get_item(0).get_resultRows();
  36:         var enumItems = rows;
  37:         var currentRow = 0;
  38:         var rowCount = rows.length;
  39:  
  40:         while (currentRow < rowCount) {
  41:             var row = rows[currentRow];
  42:             newsItems.push(new NewsItem(row["Title"], row["Path"], row["NewsIngress"],
  43:                 row["NewsDate"]));
  44:             currentRow++;
  45:         }
  46:     },
  47:     function (sender, args) {
  48:         console.log(args.get_message());
  49:     });
  50: });

At first we prepare the query object (lines 16-26). Here we set the actual query string (line 17) and various properties, including the managed properties which should be retrieved from the search index. After that we perform the actual query to the search index asynchronously using the SearchExecutor object (lines 28-44). Search results are saved to the array of news items which may then be used e.g. for binding to a UI component. Having this example you will be able to easily adapt it to your scenario.
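For example, once the success callback has filled the array, the items could be sorted newest first before binding (a minimal sketch; the topNewestNews name is my own illustration, not part of the original code):

```javascript
// Sorts news items newest first by their NewsDate property and
// returns the top 'count' items; the input array is not modified.
function topNewestNews(items, count) {
    return items
        .slice() // copy, so the original newsItems array stays intact
        .sort(function (a, b) {
            // Date - Date yields the difference in milliseconds
            return new Date(b.NewsDate) - new Date(a.NewsDate);
        })
        .slice(0, count);
}
```

This complements the sortList set on the query: sorting again on the client is a cheap safeguard if results from several queries are merged into one array.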

Sunday, April 12, 2015

Create search crawl rules for Sharepoint search service application via PowerShell

In one of my previous articles I showed how we may exclude system pages like AllItems.aspx from search results: Exclude AllItems.aspx from search results in Sharepoint 2013. In this post I will show how to create search crawl rules via PowerShell. It may be useful when you need to exclude a lot of content from search crawling and doing it manually would mean a lot of work (e.g. when you restored a large content database from production, but don’t need to crawl all sites). Here is the script:

# Ensure SharePoint PowerShell Snapin
if ((Get-PSSnapin "Microsoft.SharePoint.PowerShell" -ErrorAction SilentlyContinue) -eq $null)
{
    Add-PSSnapin "Microsoft.SharePoint.PowerShell"
}

[xml]$xmlinput=(Get-Content "CrawlRules.xml")

foreach($WebApplication in $xmlinput.SelectNodes("Build/WebApplication"))
{
    foreach($SearchService in $WebApplication.SelectNodes("SearchService"))
    {
        #Get search service
        $strServiceName=$SearchService.Name;
        $spService=Get-SPEnterpriseSearchServiceApplication -Identity $strServiceName;

        #Clear rules if needed
        $Rules=$SearchService.SelectNodes("Rules");
        $strClearRules=$Rules.ItemOf(0).Clear;
        if ($strClearRules -eq "True")
        {
            $spRules=Get-SPEnterpriseSearchCrawlRule -SearchApplication $spService;
            foreach ($spRule in $spRules)
            {
                if ($spRule -ne $null)
                {
                    Write-Host "Deleting rule:" $spRule.Path -ForegroundColor Yellow
                    $spRule.Delete();
                }
            }
        }

        #Add new rules
        foreach($CrawlRule in $SearchService.SelectNodes("Rules/Rule"))
        {
            $FollowComplexUrls=$false;
            if($CrawlRule.FollowComplexUrls -eq "True")
            {
                $FollowComplexUrls=$true;
            }

            if ($CrawlRule.Type -eq "ExclusionRule")
            {
                #In exclusion rules FollowComplexUrls actually means "Exclude complex URLs"
                $FollowComplexUrls=!$FollowComplexUrls;
                New-SPEnterpriseSearchCrawlRule -Path $CrawlRule.URL -SearchApplication $spService -Type $CrawlRule.Type -FollowComplexUrls $FollowComplexUrls
            }
            else
            {
                $CrawlAsHttp=$false;
                if($CrawlRule.CrawlAsHttp -eq "True")
                {
                    $CrawlAsHttp=$true;
                }

                $SuppressIndexing=$false;
                if($CrawlRule.SuppressIndexing -eq "True")
                {
                    $SuppressIndexing=$true;
                }

                New-SPEnterpriseSearchCrawlRule -Path $CrawlRule.URL -SearchApplication $spService -Type $CrawlRule.Type -FollowComplexUrls $FollowComplexUrls -CrawlAsHttp $CrawlAsHttp -SuppressIndexing $SuppressIndexing
            }
        }
    }
}

Rules are defined in CrawlRules.xml file which has the following structure:

<?xml version="1.0" encoding="utf-8"?>
<Build>
  <WebApplication>
    <SearchService Name="Search Service Application">
      <Rules Clear="True">
        <Rule URL="*://*/_layouts/*" Type="ExclusionRule" FollowComplexUrls="False" />
        <Rule URL="*://*/_catalogs/*" Type="ExclusionRule" />
        <Rule URL="*://*/_vti_bin/*" Type="ExclusionRule" />
        <Rule URL="*://*/forms/AllItems.aspx*" Type="ExclusionRule" />
        <Rule URL="*://*/forms/DispForm.aspx*" Type="ExclusionRule" />
        <Rule URL="*://*/forms/EditForm.aspx*" Type="ExclusionRule" />
        <Rule URL="*://*/forms/NewForm.aspx*" Type="ExclusionRule" />
      </Rules>
    </SearchService>
  </WebApplication>
</Build>

As a result it will create exclusion rules for layouts pages, for pages from _catalogs and _vti_bin, and for the list forms AllItems.aspx, DispForm.aspx, EditForm.aspx and NewForm.aspx. You may generate this xml file programmatically if you have a lot of sites which should be excluded and then pass it to the script above. It will simplify administrative work which otherwise would have to be done manually.