Monday, December 14, 2020

How to read Sharepoint Online sites sharing capabilities via CSOM

Sharepoint Online allows you to share your sites with external users. First, an administrator should enable external sharing on the tenant (organization) level (see Manage sharing settings). After that you may set external sharing capabilities for each individual site (see Set sharing capabilities of Sharepoint site collection via client object model for how to do that). In that example SharingCapabilities is an enum which has the following values (they are quite self-descriptive so I won't describe them here):

  • Disabled
  • ExternalUserSharingOnly
  • ExternalUserAndGuestSharing
  • ExistingExternalUserSharingOnly

In order to read sharing capabilities of existing Sharepoint Online sites via CSOM you may use the following example:

string username = ...;
string password = ...;
// context should be created for the tenant admin site
var adminContext = new ClientContext("https://{tenant}");
var secure = new SecureString();
foreach (char c in password)
{
    secure.AppendChar(c);
}
var credentials = new SharePointOnlineCredentials(username, secure);
adminContext.Credentials = credentials;

var tenant = new Tenant(adminContext);
var properties = tenant.GetSiteProperties(0, true);
adminContext.Load(properties);
adminContext.ExecuteQuery();
foreach (SiteProperties p in properties)
{
    Console.WriteLine(p.Url + ": " + p.SharingCapability);
}

This example outputs all sites with their SharingCapability properties. If you need to get SharingCapability for a single specific site, add a condition on the site url in the last loop. Hope it will help you.

Wednesday, December 9, 2020

How to check what sensitivity label is applied to O365 group via Graph API

In order to check what sensitivity label is applied to an O365 group you may go to Azure portal > Azure Active Directory > Groups > select the group. The sensitivity label will be displayed on the Overview tab of the group:

In order to get the sensitivity label applied to an O365 group programmatically via Graph API you may use the groups endpoint with the $select=assignedLabels query string parameter:

GET https://graph.microsoft.com/v1.0/groups/{group-id}?$select=assignedLabels

It will return the applied label id and display name like this:

{
    "@odata.context": "https://graph.microsoft.com/v1.0/$metadata#groups(assignedLabels)/$entity",
    "assignedLabels": [
        {
            "labelId": "...",
            "displayName": "Private"
        }
    ]
}

You may test this endpoint e.g. in Graph Explorer:
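If you consume this response in code, extracting the label names is straightforward. Below is a small sketch (TypeScript; the interface and function names are my own, not from any Graph SDK) that pulls the display names out of such a response:

```typescript
// Shape of the relevant part of the Graph response shown above
interface AssignedLabel {
  labelId: string;
  displayName: string;
}

interface GroupLabelsResponse {
  assignedLabels?: AssignedLabel[];
}

// Returns display names of all sensitivity labels applied to the group
// (an empty array if no label is applied)
function getLabelNames(response: GroupLabelsResponse): string[] {
  return (response.assignedLabels || []).map(l => l.displayName);
}
```

E.g. for the response above getLabelNames would return ["Private"].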

Friday, December 4, 2020

How to add and remove user from site collection admins in Sharepoint Online using CSOM

Using the following code you may add user to site collection admins in Sharepoint Online via CSOM:

var adminContext = ...; // ClientContext for the tenant admin site https://{tenant}
string siteUrl = "https://{tenant}"; // url of the target site collection
string loginName = ...; // user name which should be added to site collection admins
var tenant = new Tenant(adminContext);
tenant.SetSiteAdmin(siteUrl, loginName, true);
adminContext.ExecuteQuery();

If you need to remove user from site collection admins use the following code:

var adminContext = ...; // ClientContext for the tenant admin site https://{tenant}
string siteUrl = "https://{tenant}"; // url of the target site collection
string loginName = ...; // user name which should be removed from site collection admins
var tenant = new Tenant(adminContext);
tenant.SetSiteAdmin(siteUrl, loginName, false);
adminContext.ExecuteQuery();

(the only difference is that you need to pass false to the tenant.SetSiteAdmin() method). Hope it will help in your work.

Tuesday, December 1, 2020

App.config vs App service configuration settings for storing app settings for Azure web jobs

As you probably know, when you upload a zip package with an exe application to Azure App service > Web jobs (to run it as an Azure web job by scheduler), you may override app settings in the App service > Configuration > Application settings section. In this case settings defined in Azure will have priority over settings defined in app.config which is deployed as part of the zip package:

However it is important to note that even if you use app settings from Azure, it is still necessary to include app.config into the zip package. If you exclude app.config from the zip package, ConfigurationManager.AppSettings will return an empty collection. So a good practice is to add app.config with empty values to the zip package and then manage app settings in Azure. Don't forget to remove the values from app.config – otherwise sensitive information (client secrets, passwords, keys, etc) may be accidentally leaked with the zip package of the web job.

Tuesday, November 24, 2020

Enable ReSharper unit tests runner with TypeMock isolator

If you use TypeMock isolator for writing unit tests (which is a great mocking library from my point of view) and ReSharper to make the Visual Studio experience better (another great tool :) ) then you may face the following problem when trying to run unit tests in the ReSharper test runner: tests will fail with the following error

TypeMock.TypeMockException :
*** Typemock Isolator is currently disabled. Enable using the following:

* Within Visual Studio:
    - Use Typemock Smart Runner 
    - For other runners, Choose Typemock Menu and click "Integrate with Other Runners"
  * To run Typemock Isolator as part of an automated process you can:
     - run tests via TMockRunner.exe command line tool
     - use 'TypeMockStart' tasks for MSBuild or NAnt
For more information consult the documentation (see 'Running Unit Tests in an Automated Build')
    at TypeMock.InterceptorsWrapper.VerifyInterceptorsIsLoaded()

This error will still be there even if the "Integrate with other runners" option is checked in the Typemock menu:

In order to fix this problem do the following:

1. Go to TypeMock installation folder (C:\Program Files (x86)\Typemock\Isolator\x.x) and create or edit knownRunners.dat file there

2. In this file add the process name of the ReSharper unit tests runner on a separate line: for ReSharper 2020 it will be "ReSharperTestRunner64c.exe".
In order to find the process name of the runner, run unit tests with it, find it in Process Explorer (if the runner was started from Visual Studio it will be shown under the Visual Studio process tree) > right click > Properties > copy the exe file name.

After that run unit tests with TypeMock in ReSharper test runner – they should work now.

Monday, November 16, 2020

Remote debugging of Azure functions in Visual Studio

When developing for Azure you should definitely take a look at Visual Studio's Cloud Explorer (View > Cloud Explorer) as it has features which are missing in the basic Azure portal. E.g. when working with Notification hubs it allows you to manage existing subscriptions for push notifications (view and delete them if needed). There is one more useful feature if you develop Azure functions which I explored recently: remote debugging. It allows you to attach the debugger to the remote process of your Azure function app running in Azure and debug it in your local Visual Studio.

In order to do that open Cloud explorer > choose subscription > App services > right click on Azure function app > Attach debugger:

If it shows the error:

System.Runtime.InteropServices.COMException (0x89710023): Unable to connect to the Microsoft Visual Studio Remote Debugger named '{tenant}'.  The Visual Studio 2017 Remote Debugger (MSVSMON.EXE) does not appear to be running on the remote computer. This may be because a firewall is preventing communication to the remote computer. Please see Help for assistance on configuring remote debugging.

ensure that port 4022, which is used by the remote debugger, is opened in the firewall. After that you should be able to debug remote Azure functions running in Azure.

Friday, November 13, 2020

Create new Team UX when using sensitivity labels

In this article I will show how different sensitivity labels affect UX on the standard Create new Team form. For testing I created 6 different sensitivity labels with different Privacy and external sharing settings:



(the table listed each label's Privacy setting and whether External user access is allowed)

  • Private
  • Public
  • Private or public
  • Private with external users
  • Public with external users
  • Private or public with external users
Let’s see how teams creation form looks like for each sensitivity label from table above.

1. Private

2. Public

3. Private or public

4. Private with external users

5. Public with external users

6. Private or public with external users

Thursday, November 12, 2020

Applying O365 sensitivity labels for different containers

In my previous post I showed how to enable sensitivity labels for different containers (SPO sites, Teams, O365 groups) in the tenant. Let's see in more detail how exactly applying sensitivity labels looks and to which exact containers they can be applied.

First of all let's go to Sharepoint Online and try to create a new site there. We will see that sensitivity labels are available for Modern Sharepoint Online sites: both Modern Team sites

and Modern Communication sites

But if we try to create a Classic site (e.g. using the Publishing Portal site template) we will see that sensitivity labels are not available:

So for Classic Sharepoint sites sensitivity labels are not available.

Next let's go to Teams and try to create a new Team there: we will see that the Sensitivity field appeared on the Team creation form:

Finally if we go to the Azure portal and try to create a new O365/M365 group, we will also see that the Sensitivity field is available:

In future posts we will check other details of sensitivity labels functionality in O365.

Monday, November 9, 2020

Enable sensitivity labels for Sharepoint sites, Teams and O365 groups

Sensitivity labels help to manage content in your organization. In contrast to classification labels, which are more like additional metadata for O365 groups/SP sites where custom policies have to be enforced by internal tools or custom PowerShell scripts (i.e. they don't have O365 policies assigned to them), sensitivity labels have policies behind them and allow you to use the O365 infrastructure to maintain sensitive data in your organization.

Sensitivity labels may be enabled from several places:

By default they can be used for files and emails, but in order to enable them for "containers" (SP online sites, Teams and O365 groups) several additional steps should be done:

1. First of all enable sensitivity labels from PowerShell using the following script:

Import-Module AzureADPreview
Connect-AzureAD
$Setting = Get-AzureADDirectorySetting -Id (Get-AzureADDirectorySetting | where -Property DisplayName -Value "Group.Unified" -EQ).Id
$Setting["EnableMIPLabels"] = "True"
Set-AzureADDirectorySetting -Id $Setting.Id -DirectorySetting $Setting

2. After that we need to connect to Security & Compliance PowerShell and sync the labels to AzureAD using the following script:

Install-Module -Name ExchangeOnlineManagement
Import-Module ExchangeOnlineManagement
$UserCredential = Get-Credential
Connect-IPPSSession -Credential $UserCredential
Execute-AzureAdLabelSync

If you get an error like "Unable to resolve package source", start a new PowerShell session as administrator and run the following command as the 1st command in the session:

[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12

and then run above script again.

After these steps you will be able to create sensitivity labels for SP sites, Teams and O365 groups. Let's see how it looks in Security and compliance center > Classification > Sensitivity labels. Pay attention that there is now the following note:

You can now create sensitivity labels with privacy and access control settings for Teams, SharePoint sites, and Microsoft 365 Groups.

Click Create a label – after that the New label wizard will be opened. On the first step we need to specify a name and description, and on the 2nd step it will be possible to choose both Files & emails and Groups & sites:

Here we are interested in Groups and sites so let's keep only this option checked. Skip the next step for Files and emails and open the next step "Define protection settings for groups and sites". On this step we may set "Privacy and external user access settings" and "Device access and external sharing settings":

E.g. if we check "Privacy and external user access settings" then on the next step we will be able to set privacy and external users settings for the sites/teams/groups to which this label will be applied:

On the final step we will need to publish our new label (this will open its own wizard).

After the label has been published it may take up to 10 minutes before it appears in the O365 UI:

This is how you may enable sensitivity labels for SP sites, Teams and O365 groups. Hope this information will help someone.

Thursday, November 5, 2020

Call WebExtensions.WebExistsFullUrl method from OfficeDevPnP library for different site collection

A few years ago I wrote about one problem in the OfficeDevPnP method WebExtensions.WebExistsFullUrl: shortly, when it was called with a url which belongs to a non-existent site collection from a different managed path, it returned true. Full description of the problem is available here: How to check does site collection exist by absolute url using CSOM in Sharepoint. Also I posted an issue to PnP-Sites-Core on github: WebExtensions.WebExistsFullUrl returns true for non-existent sites from different managed path in Sharepoint 2013 on-premise. I checked this issue with the latest PnP version (which is 3.26.2010 at the moment of writing this article) and found that this problem is fixed now by MS. I.e. the code of the WebExtensions.WebExistsFullUrl method is still the same:

public static bool WebExistsFullUrl(ClientRuntimeContext context, string webFullUrl)
{
    bool exists = false;
    try
    {
        using (ClientContext testContext = context.Clone(webFullUrl))
        {
            testContext.Load(testContext.Web, w => w.Title);
            testContext.ExecuteQueryRetry();
            exists = true;
        }
    }
    catch (Exception ex)
    {
        if (IsUnableToAccessSiteException(ex) || IsCannotGetSiteException(ex))
        {
            // Site exists, but you don't have access .. not sure if this is really valid
            // (I guess if checking if URL is already taken, e.g. want to create a new site
            // then this makes sense).
            exists = true;
        }
    }
    return exists;
}

but now when we try to clone the ClientContext in this line:

ClientContext testContext = context.Clone(webFullUrl)

and webFullUrl corresponds to a non-existent site collection from another managed path (e.g. the context was created from the root site collection and webFullUrl points to a site under a different managed path which doesn't exist), it throws System.Exception (System.Net.WebException) now:

The remote server returned an error: (404) Not Found

while in earlier CSOM versions it created a ClientContext for the root site in this case. As a result WebExtensions.WebExistsFullUrl returns false as expected. So fortunately the issue was fixed and we can call this method for checking the existence of different site collections.

Friday, October 16, 2020

Problem in OTB groupstatus.aspx Sharepoint Online page and mobile browsers

In SPO there is one very useful OTB application layouts page: groupstatus.aspx. Using this page we may redirect the user to the Notebook, Planner plan or Sharepoint site associated with an O365 group, e.g.

{siteUrl}/_layouts/15/groupstatus.aspx?target=site

where {siteUrl} is the url of the site associated with the O365 group. And target may be "site", "notebook", "planner", etc. The full list may be found here: Generic URLs for all Office 365 Group connected workloads.
If we don't know the url of the site associated with the O365 group we may still use this page in the context of the root tenant site and specify the id of the O365 group in an additional query string parameter:

https://{tenant}.sharepoint.com/_layouts/15/groupstatus.aspx?id={groupId}&target=site

In this case it will still redirect you to the requested target. And this is exactly how we use this page in TW – we open it in the context of the root tenant site and provide target=site and the id of the group in the query string.
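The two addressing variants above can be expressed as a small helper, e.g. in TypeScript (the function name is my own; the url format follows the examples above):

```typescript
// Builds a groupstatus.aspx url either directly from the group's site url,
// or from the tenant root url plus the O365 group id.
function buildGroupStatusUrl(
  target: string,
  opts: { siteUrl?: string; tenantRootUrl?: string; groupId?: string }
): string {
  if (opts.siteUrl) {
    // url of the group's site is known: address the page in its context
    return `${opts.siteUrl}/_layouts/15/groupstatus.aspx?target=${target}`;
  }
  if (opts.tenantRootUrl && opts.groupId) {
    // site url unknown: use the root tenant site and pass the group id
    return `${opts.tenantRootUrl}/_layouts/15/groupstatus.aspx?id=${opts.groupId}&target=${target}`;
  }
  throw new Error("either siteUrl or tenantRootUrl + groupId is required");
}
```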
However at the same time Sharepoint has its own redirection system for mobile devices: when you visit some Sharepoint site from a mobile device it redirects to a special application layouts page optimized for mobile view \_layouts\Mobile\mblwpa.aspx (see Overview of Mobile Pages and the Redirection System). According to the documentation this redirection for mobile devices is performed via an http module (SPRequestModule). It means that in the ASP.Net pipeline it happens before the request comes to the groupstatus.aspx page. So when we redirect the user to the root tenant site in a mobile browser:

Sharepoint will just redirect the user to the mblwpa.aspx page in this root site

and instead of seeing the site associated with the specified O365 group, the user will see the root tenant site. I will try to report this problem to MS using available communication channels. For now I posted this problem on StackOverflow here.

Tuesday, October 13, 2020

How to free space on OS Windows drive by moving SoftwareDistribution folder to another disk via junction folder

If you are a lucky owner of a PC with a 100Gb OS SSD disk, or when the OS disk on your server is not big enough, then you may face a situation when there is not enough space anymore for installing updates, new tools, etc. In this post I will share a solution which will help to free some space on the OS drive.

Probably the first thing which you will try when facing this problem is the WinDirStat tool which allows you to gather a report of disk space usage per folder. That report will most probably show that the C:\Windows\SoftwareDistribution folder takes a lot of space. This folder is used by Windows Update for storing temporary files needed for installing new updates. As it is an important Windows component you can't just delete or move it to free space on the OS drive. However what we can do is create a folder on another drive, move the files there and create a junction between C:\Windows\SoftwareDistribution and this folder. Here are the needed steps:

1. Stop the Windows Update service (Control Panel > Administrative Tools > Services)
2. Create a new folder on another drive with enough space, e.g. D:\WinSoftwareDistribution. This is where the files will be stored after all
3. Rename C:\Windows\SoftwareDistribution e.g. to C:\Windows\SoftwareDistribution.old
4. Create a junction between the folders:

mklink /j C:\Windows\SoftwareDistribution D:\WinSoftwareDistribution

5. Move the files from C:\Windows\SoftwareDistribution.old to D:\WinSoftwareDistribution. After that you may go to the C:\Windows\SoftwareDistribution junction and ensure that the files appear there again
6. Start the Windows Update service

After that your SoftwareDistribution folder will be located on another drive and you will have more space on your OS drive. Hope that this trick will help someone.

Thursday, October 1, 2020

How to enable Android App links in web-facing Sharepoint on-prem site

In a previous article I showed how to enable universal links for iOS on a Sharepoint on-prem site – see How to enable iOS universal links in web-facing Sharepoint on-prem site. In this post I will describe how to enable Android App links for a web-facing Sharepoint on-prem site.

For Android App links the process is different. First of all we need to prepare a special assetlinks.json file with predefined content:

[{
  "relation": ["delegate_permission/common.handle_all_urls"],
  "target": {
    "namespace": "...",
    "package_name": "...",
    "sha256_cert_fingerprints": ["..."]
  }
}]
I won't describe details about the content of the assetlinks.json file here – you may find this information on Android documentation sites. The next step is to place the assetlinks.json file into the .well-known subfolder of the root folder of our site. In Sharepoint we can create this folder in the site's IIS folder C:\inetpub\wwwroot\wss\VirtualDirectories\{SiteName}.

After that we need to go to IIS manager > choose the Sharepoint site > right click the .well-known subfolder and click "Convert to Application":

When the .well-known subfolder is ready and configured in IIS manager we may copy the assetlinks.json file there.

Now we need to make the assetlinks.json file available for anonymous access (otherwise it won't be available for the Android infrastructure) and ensure that the server adds the Content-Type: "application/json" http header to the response. In order to do that add the following web.config file to the .well-known subfolder:

<?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <system.web>
    <authorization><allow users="*" /></authorization>
  </system.web>
  <system.webServer>
    <staticContent>
      <clear />
      <mimeMap fileExtension=".json" mimeType="application/json" />
    </staticContent>
    <handlers><remove name="JSONHandlerFactory" /></handlers>
  </system.webServer>
</configuration>

After that Android App links will be available on your web-facing Sharepoint on-prem site.

Tuesday, September 29, 2020

How to fix the could not load type 'Microsoft.SharePoint.Administration.DesignPackageType' error when importing the Microsoft.Online.SharePoint.PowerShell module

When you try to import Microsoft.Online.SharePoint.PowerShell module:

Import-Module -Name Microsoft.Online.SharePoint.PowerShell

you may get the following error:

Import-Module : Could not load type 'Microsoft.SharePoint.Administration.DesignPackageType' from assembly 'Microsoft.SharePoint.Client, Version=, Culture=neutral, PublicKeyToken=71e9bce111e9429c'.

The error can also be the following:

Import-Module : Could not load type 'Microsoft.SharePoint.Client.Publishing.PortalLaunch' from assembly 'Microsoft.SharePoint.Client.Publishing, Version=, Culture=neutral, PublicKeyToken=71e9bce111e9429c'.

The main problem is that PowerShell tries to use on-prem CSOM assemblies instead of the online ones (with version 16.1.x.x). There are some articles and posts which recommend manually removing the Microsoft.SharePoint.Client folders from the GAC. It should not be done! Especially if you have an on-prem Sharepoint installation on the same server – this installation may be broken because of that.

The reason for this error is conflicts between different versions of Sharepoint Online SDKs (SharePoint Client Components, SharePoint Online Management Shell). So the correct way to solve this problem is the following:

1. Go to Control Panel > Programs and features and uninstall all instances of SharePoint Online Management Shell and SharePoint Client Components

2. Open PowerShell console and uninstall existing versions of Microsoft.Online.SharePoint.PowerShell module:

Uninstall-Module -Name Microsoft.Online.SharePoint.PowerShell -AllVersions

3. After that close PowerShell, open a new session, install the latest version of the Microsoft.Online.SharePoint.PowerShell module and import it:

Install-Module -Name Microsoft.Online.SharePoint.PowerShell
Import-Module -Name Microsoft.Online.SharePoint.PowerShell

This time it should go without error.

Tuesday, September 22, 2020

How to enable iOS universal links in web-facing Sharepoint on-prem site

Universal links allow a mobile app to be opened when the user clicks a link inside the site. I.e. if the user has the appropriate mobile app installed, it will be opened instead of a regular browser page, which gives a better user experience on mobile devices. In this post I will show how to configure universal links for iOS on a web-facing Sharepoint on-premise site. At the same time I won't write about steps like building the mobile app itself, getting its app id, etc – please refer to appropriate resources related to mobile apps development for that.

In order to enable universal links for iOS on your web-facing Sharepoint on-prem site the following steps have to be done:

  1. create an AASA (Apple app site association) file. The file name should be exactly "apple-app-site-association" without extension. It should contain data in json format with a predefined structure (see below)
  2. put it into the root folder of the web-facing site
  3. ensure that the AASA file is available for anonymous users (robots) with Content-Type = "application/json"

This is how the AASA file may look:

{
  "applinks": {
    "apps": [],
    "details": [ { "appID": "{appId}", "paths": ["*"] } ]
  }
}

(use your app id instead of the {appId} placeholder). When the file is ready we need to copy it to the IIS folder of our Sharepoint site. The next step is to provide anonymous access and set the correct content type. It can be done by modifying the web.config of the Sharepoint site – we need to add a new location element there under the configuration tag (usually in the web.config of a Sharepoint site there are many other location elements – so just put them together):

 <location path="apple-app-site-association">
   <system.web>
     <authorization>
       <allow users="*" />
     </authorization>
   </system.web>
   <system.webServer>
     <staticContent>
       <mimeMap fileExtension="." mimeType="application/json" />
     </staticContent>
   </system.webServer>
 </location>

It will make the AASA file available for anonymous users and will set the content type to "application/json".

The last step is to verify that everything is configured properly. It can be done with an AASA validator. The result should look like this:

At the end you should have universal links enabled for your web-facing Sharepoint on-prem site.

Update 2020-10-01: guide for enabling Android App links on Sharepoint site is available here: How to enable Android App links in web-facing Sharepoint on-prem site.

Monday, September 21, 2020

Call AAD secured Azure functions from Postman

It is often needed to test AAD secured Azure functions from client applications like Postman. In one of my previous articles I showed how to call secured Azure functions from C# and PowerShell (see Call Azure AD secured Azure functions from C#). Let's see now how to call them from the Postman app – in this case we won't need to write any code for testing our functions.

First of all we need to obtain an access token. If app permissions are used we may use the same approach as in the article mentioned above. For delegated user permissions you may go to a SPFx page which calls the Azure function, open the browser Network tab, find a call to some Azure function there and copy the access token from the Authorization header (the value after the "Bearer" part).

After that go to Postman and create a new request using the url of the Azure function (it can be copied from Azure portal > Function app) and the correct http verb. On the Authorization tab set Type = "Bearer Token" and copy the access token to the Token field:

After that on the Headers tab set the correct content type if needed (e.g. application/json):

Then press Send. After that we should get results from the AAD secured Azure function in Postman.

Wednesday, September 9, 2020

How to identify whether SPFx web part is running in web browser or in Teams client

As you probably know it is possible to add a Sharepoint Online page as a tab to a Team's channel so it will be shown inside Teams: both when you access it from a web browser and from the native client (desktop or mobile). It may be needed to identify from where exactly Teams is accessed in order to provide a better user experience for this particular client (e.g. add extra css, use different caching mechanisms, etc). In order to do that we may inspect the User-Agent header (navigator.userAgent in Typescript) for different clients. In the following table I summarized values of the User-Agent header for the mentioned scenarios:

Accessed from / User agent

Desktop browser (Chrome):
Mozilla/5.0 (Windows NT 6.3; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/85.0.4183.83 Safari/537.36

Mobile browser (Chrome on Android):
Mozilla/5.0 (Linux; Android 10; …) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/85.0.4183.81 Mobile Safari/537.36

Teams web client in desktop browser (Chrome):
Mozilla/5.0 (Windows NT 6.3; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/85.0.4183.83 Safari/537.36 SPTeamsWeb

Teams web client in mobile browser (Safari on iPad; Chrome mobile browser on Android and Safari on iPhone are not supported browsers for the Teams web client):
Mozilla/5.0 (Macintosh; Intel Mac OS X …) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/13.1.1 Safari/605.1.15 SPTeamsWeb

Teams native desktop client:
Mozilla/5.0 (Windows NT 6.3; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Teams/… Chrome/69.0.3497.128 Electron/4.2.12 Safari/537.36

Teams native mobile client (Android):
Mozilla/5.0 (Linux; Android 10; …) AppleWebKit/537.36 (KHTML, like Gecko) Version/4.0 Chrome/85.0.4183.81 Mobile Safari/537.36 TeamsMobile-Android

Teams native mobile client (iPhone):
Mozilla/5.0 (iPhone; CPU iPhone OS … like Mac OS X) AppleWebKit/605.1.15 (KHTML, like Gecko) Mobile/15E148 TeamsMobile-iOS

Teams native mobile client (iPad):
Mozilla/5.0 (iPad; CPU OS … like Mac OS X) AppleWebKit/605.1.15 (KHTML, like Gecko) Mobile/15E148 TeamsMobile-iOS

The parts which allow identifying the current client type are the trailing tokens. I.e. if we have TeamsMobile-Android or TeamsMobile-iOS in the User-Agent it means that SPFx is running in the Teams native mobile client. If we have SPTeamsWeb, the web part is running in the Teams web client.
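Based on the tokens above, client detection in a SPFx web part may be sketched like this (TypeScript; the function name and the return values are my own naming):

```typescript
type TeamsClient = "teams-mobile" | "teams-web" | "teams-desktop" | "browser";

// Detects where the SPFx web part is running based on the User-Agent string
function detectClient(userAgent: string): TeamsClient {
  if (userAgent.indexOf("TeamsMobile-Android") >= 0 ||
      userAgent.indexOf("TeamsMobile-iOS") >= 0) {
    return "teams-mobile";     // Teams native mobile client
  }
  if (userAgent.indexOf("SPTeamsWeb") >= 0) {
    return "teams-web";        // Teams web client in a browser
  }
  if (userAgent.indexOf("Teams/") >= 0 && userAgent.indexOf("Electron") >= 0) {
    return "teams-desktop";    // Teams native desktop client (Electron)
  }
  return "browser";            // regular web browser
}

// usage: const client = detectClient(navigator.userAgent);
```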

Wednesday, September 2, 2020

Redirect traffic from single site collection in Sharepoint on-prem via URL rewrite IIS module

Suppose that you have a Sharepoint on-prem web application with multiple site collections which have their own urls and you need to perform maintenance work on one of these site collections while the others remain working. In order to do that you may temporarily redirect traffic from this site collection to an external page (e.g. to a maintenance page hosted outside of this site collection). Technically it can be done by adding a URL rewrite rule to the Sharepoint web app in IIS. The rule will look like this:

This rule will redirect traffic from the site collection with url /test to an external site. This is the same rule in xml, as added to the web.config of the Sharepoint web app:

	<rewrite>
	  <rules>
	    <rule name="MyRule" enabled="true" stopProcessing="true">
	      <match url="test/*" />
	      <action type="Redirect" url="https://{external site url}" appendQueryString="false" redirectType="Found" />
	    </rule>
	  </rules>
	</rewrite>

Note that if you perform temporary maintenance work on a site collection then the Redirect type should be set to Found (302) like shown above and not to Permanent (301). In case of a permanent redirect the browser will cache the response and will keep redirecting users from this site collection even after the maintenance is completed and the rule is removed from the site.

Friday, August 14, 2020

Camlex 5.2 and Camlex.Client 4.1: support for Membership and OffsetDays

Good news for Sharepoint developers who use the Camlex library: new versions are released both for server and client object models. For the client object model separate packages are available for Sharepoint online and on-premise:

Package Version Description
Camlex.NET.dll 5.2.0 Server object model (on-prem)
Camlex.Client.dll 4.1.0 Client object model (SP online)
Camlex.Client.2013 4.1.0 Client object model (SP 2013 on-prem)
Camlex.Client.2016 4.1.0 Client object model (SP 2016 on-prem)
Camlex.Client.2019 4.1.0 Client object model (SP 2019 on-prem)

In this release 2 new features were added: support for the Membership element (many developers requested it so finally it is out) and OffsetDays. Main credits for this release go to Ivan Russo who did the actual work with very good quality, so I only needed to validate the PR, accept it and add a few more unit tests. Then I merged the new features to the client branch which keeps the source code for the client object model version (nowadays it is done manually as there are quite many changes between server and client object model versions).

1. Membership element support

The Membership element allows creating CAML queries with different membership conditions. The following examples show how it can be done now with Camlex:

// SPWeb.AllUsers
var caml = Camlex.Query().Where(x => Camlex.Membership(
	x["Field"], new Camlex.SPWebAllUsers())).ToString();
var expected =
	"<Where>" +
	"  <Membership Type=\"SPWeb.AllUsers\">" +
	"    <FieldRef Name=\"Field\" />" +
	"  </Membership>" +
	"</Where>";

// SPGroup
var caml = Camlex.Query().Where(x => Camlex.Membership(
	x["Field"], new Camlex.SPGroup(3))).ToString();
var expected =
	"<Where>" +
	"  <Membership Type=\"SPGroup\" ID=\"3\">" +
	"    <FieldRef Name=\"Field\" />" +
	"  </Membership>" +
	"</Where>";

// SPWeb.Groups
var caml = Camlex.Query().Where(x => Camlex.Membership(
	x["Field"], new Camlex.SPWebGroups())).ToString();
var expected =
	"<Where>" +
	"  <Membership Type=\"SPWeb.Groups\">" +
	"    <FieldRef Name=\"Field\" />" +
	"  </Membership>" +
	"</Where>";

// CurrentUserGroups
var caml = Camlex.Query().Where(x => Camlex.Membership(
			x["Field"], new Camlex.CurrentUserGroups())).ToString();
var expected =
	"<Where>" +
	"  <Membership Type=\"CurrentUserGroups\">" +
	"    <FieldRef Name=\"Field\" />" +
	"  </Membership>" +
	"</Where>";

// SPWeb.Users
var caml = Camlex.Query().Where(x => Camlex.Membership(
	x["Field"], new Camlex.SPWebUsers())).ToString();
var expected =
	"<Where>" +
	"  <Membership Type=\"SPWeb.Users\">" +
	"    <FieldRef Name=\"Field\" />" +
	"  </Membership>" +
	"</Where>";

2. OffsetDays support

The OffsetDays attribute is used with the Today element and allows adding or subtracting a number of days from today for constructing dynamic CAML queries based on a current datetime offset. The following example shows how it can be done with Camlex:

string caml = Camlex.Query().Where(x =>
	x["Created"] > ((DataTypes.DateTime)Camlex.Today).OffsetDays(-1)).ToString();
const string expected =
	"<Where>" +
	"  <Gt>" +
	"    <FieldRef Name=\"Created\" />" +
	"    <Value Type=\"DateTime\">" +
	"      <Today OffsetDays=\"-1\" />" +
	"    </Value>" +
	"  </Gt>" +
	"</Where>";

Thank you for using Camlex and stay tuned for further improvements.

Thursday, July 16, 2020

One problem with caching old web job binaries after Azure web job update with WAWSDeploy

WAWSDeploy is a convenient console utility which allows deploying a folder or a zip file to an Azure Website using WebDeploy (see its GitHub page). Using this tool it is possible to automate tasks in your Azure deployment, e.g. you may upload Azure web job binaries (they should be packaged into a zip file for that). However you should be aware of one potential problem.

Triggered web job binaries are uploaded to the "app_data/jobs/triggered/{JobName}" path (the full path will be /site/wwwroot/app_data/jobs/triggered/{JobName}). You may connect to your Azure App Service via FTP (connection settings can be copied from Azure portal > App service > Deployment center > FTP) and check that your web job folders are there.

Note that you need to use the FTP protocol (not FTPS) if you connect e.g. via the WinSCP tool.

The problem is the following: you may create the zip package with web job binaries just by selecting all the files of the web job and adding them to the archive – in this case there won't be a root folder inside the archive and all files will be located in the archive root. Alternatively you may select their parent folder (e.g. bin/Release) and add the whole folder to the archive – in this case the archive will contain a Release root folder.
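The difference between the two archive layouts can be illustrated with a short Python sketch (hypothetical file names, just to show the resulting zip structure):

```python
import os
import zipfile

# Pretend bin/Release contains the web job executable.
os.makedirs("bin/Release", exist_ok=True)
with open("bin/Release/MyJob.exe", "wb") as f:
    f.write(b"binary")

# Method 1: archive the files themselves - no root folder inside the zip.
with zipfile.ZipFile("files.zip", "w") as z:
    z.write("bin/Release/MyJob.exe", arcname="MyJob.exe")

# Method 2: archive the parent folder - the zip gets a Release/ root folder.
with zipfile.ZipFile("folder.zip", "w") as z:
    z.write("bin/Release/MyJob.exe", arcname="Release/MyJob.exe")

print(zipfile.ZipFile("files.zip").namelist())   # ['MyJob.exe']
print(zipfile.ZipFile("folder.zip").namelist())  # ['Release/MyJob.exe']
```

When the second zip is deployed on top of the first layout, its content lands in a Release subfolder instead of replacing the old files, which leads exactly to the duplication described below.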

So if the 1st time you created the archive using the 1st method (by archiving files), then created another archive using the 2nd method (by archiving the folder) and uploaded this 2nd zip archive via WAWSDeploy – it will just create a Release subfolder inside app_data/jobs/triggered/{JobName}. I.e. there will be 2 versions of the web job: one in app_data/jobs/triggered/{JobName} and another in app_data/jobs/triggered/{JobName}/Release. The Azure runtime will use the 1st found exe file, which will be the exe file from the root (not from the Release subfolder), and will run the web job with it.

One way to avoid this problem is to always delete the existing web job before updating it (you may check how to delete a web job via PowerShell here: How to remove Azure web job via Azure RM PowerShell). In this case old binaries will be deleted, so the selected archiving method won't cause problems during the update.

Wednesday, July 15, 2020

How to remove Azure web job via Azure RM PowerShell

Azure web jobs allow you to run your code on a schedule which is configured via a CRON expression. This is a convenient tool for performing repeated actions. Sometimes we need to remove web jobs via PowerShell as part of an automation process. In this article I will show how to remove an Azure web job using Azure RM PowerShell cmdlets.

If you try to google this topic you will most probably find the Remove-AzureWebsiteJob cmdlet. The problem however is that this cmdlet is used for classic resources and you will most probably have problems using it (e.g. errors from Select-AzureSubscription -Default). So we need some other way to remove Azure web jobs – preferably using Azure RM cmdlets. Fortunately it is possible – we may remove a web job using another cmdlet, Remove-AzureRmResource. Here is the syntax for removing a triggered web job:

$resourceName = $appServiceName + "/" + $webJobName
Remove-AzureRmResource -ResourceGroupName $resourceGroupName -ResourceName $resourceName -ResourceType microsoft.web/sites/triggeredwebjobs

The important things to note here are how the ResourceName param is specified for the web job (app service name/web job name) and the ResourceType which is set to microsoft.web/sites/triggeredwebjobs. If you need to delete a continuous web job, use the microsoft.web/sites/continuouswebjobs resource type instead.

Using this command you will be able to delete Azure web jobs via Azure RM PowerShell.

Monday, July 6, 2020

Run Azure function as Web job without code change

Sometimes it may be needed to test an existing Azure function as a web job (e.g. if you decide to move some long-running process from AF to a web job). In order to do that you need to create an App service first. After that create a new console app project in Visual Studio, reference your Azure functions project as if it were a regular class library and call the necessary Azure function from Program.cs (this is possible since AFs are created as static methods of a class).

The problem however is that if you read app settings in the Azure function code, you most probably do it like this:

string foo = Environment.GetEnvironmentVariable("Foo", EnvironmentVariableTarget.Process);

But if you just copy the AF app settings to the appSettings section of app.config of the new console project, the above code will return an empty string for any app setting. So in order to call our Azure function from a console application and run it as a web job we need to perform an additional step: iterate through all app settings and fill the environment variables of the process:

var appSettings = ConfigurationManager.AppSettings;
foreach (var key in appSettings.AllKeys)
{
    Environment.SetEnvironmentVariable(key, appSettings[key], EnvironmentVariableTarget.Process);
}
After that app settings will be successfully read and Azure function may run as web job without changing its code.
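The same pattern (mirroring configuration values into process environment variables so that code reading the environment keeps working unchanged) can be sketched in Python with hypothetical setting names:

```python
import os

# Hypothetical app settings as they would appear in a config file.
app_settings = {"Foo": "foo-value", "Bar": "bar-value"}

# Copy every setting into the current process environment.
for key, value in app_settings.items():
    os.environ[key] = value

# Code that reads environment variables now sees the settings.
print(os.environ["Foo"])  # foo-value
```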

Thursday, July 2, 2020

Sharepoint MVP 2020

Yesterday I got an email from MS about my MVP status renewal. This is the 10th award for me in a row and I would like to say thanks to MS and to the whole community which gives me motivation and energy to continue sharing my experience and solutions from everyday work:

Sharepoint and the MS ecosystem in general evolve: we switched to Modern Sharepoint with tight integration with Azure, which becomes more and more powerful. New technologies require new skills and experience, so I think it is crucial that we continue sharing our knowledge with colleagues all over the world. The global pandemic which we faced this year introduced new challenges for the IT world: the tools we build became as crucial as never before since people switched to remote work. Performance and stability came to the forefront. From this perspective it is very important to take part in IT community life by answering forum questions, contributing to open source software, speaking at events, writing technical articles and so on. I hope that my contribution helped someone and I would like to say thanks one more time to MS for recognizing it.

Monday, June 29, 2020

Problem with ReSharper unit test runner with NUnit 3.12: unit test is inconclusive

Some time ago we switched to NUnit 3.12 and NUnit3TestAdapter in the Camlex library and noticed the following problem with the ReSharper unit test runner: all unit tests were marked as "inconclusive" and didn't run. At the same time the built-in Visual Studio test runner worked. I used the 2019 version of ReSharper when I found this problem.

By searching the web I found similar problems from other people on the JetBrains support site. In order to fix it I uninstalled the 2019 version and installed the latest available version of ReSharper (JetBrains.ReSharperUltimate.2020.1.3). After that the tests started working again:

So if you face the same problem, try installing the latest ReSharper version.

Friday, June 12, 2020

Problem with threads count grow when use OfficeDevPnP AuthenticationManager

Recently we faced the following problem in our Azure function app: after some time of running, the Azure functions got stuck and the only way to make them work again was to restart them manually from the Azure portal UI. Research showed that the problem was related to the threads count: for some reason the threads count permanently grew up to 6K threads, after which the Azure function app became unresponsive:

We reviewed our code, made some optimizations, changed some async calls to sync equivalents, but it didn't fix the problem. Then we continued troubleshooting and found the following: in the code we used the OfficeDevPnP AuthenticationManager in order to get an access token for communicating with Sharepoint Online like this:

using (var ctx = new OfficeDevPnP.Core.AuthenticationManager().GetAppOnlyAuthenticatedContext(url, clientId, clientSecret))

Note that in this example a new instance of AuthenticationManager is created each time the above code is executed. In our case it was inside a queue-triggered Azure function which was called quite frequently. Elio Struyf (we worked on this problem together, so credits for this finding go to him) tried to make the AuthenticationManager static – i.e. only one instance is created for the whole Azure function worker process:

public static class Helper
{
    private static OfficeDevPnP.Core.AuthenticationManager authMngr =
        new OfficeDevPnP.Core.AuthenticationManager();

    public static void QueueTriggeredFunc()
    {
        using (var ctx = authMngr.GetAppOnlyAuthenticatedContext(url, clientId, clientSecret))
        {
            // ...
        }
    }
}

After that the threads count stabilized at about 60 threads on average:

We reported this problem to the OfficeDevPnP team and they confirmed that there is a problem in AuthenticationManager: internally it creates a thread for monitoring the access token's lifetime, but this thread is not released when the AuthenticationManager gets collected by the garbage collector (also worth mentioning is that AuthenticationManager itself was not disposable at the moment when this problem was found; we used OfficeDevPnP 3.20.2004.0). For now the workaround of making AuthenticationManager static was OK, and as far as I know the OfficeDevPnP team is currently working on a proper fix for this problem, so hopefully it will be available globally soon.

Update 2020-06-15: from the latest news I've got (thanks to Yannick Plenevaux :) ) the OfficeDevPnP team addressed this issue in version 202006.2 and made AuthenticationManager disposable. So you may need to review your code and add using() around it.

Monday, June 8, 2020

Fix AccessToKeyVaultDenied error in Azure App service app settings which use KeyVault reference

Azure KeyVault is a convenient and safe storage for secrets (passwords, keys, etc.) which can be used in your apps instead of storing them in plain text in app settings. It adds a number of advantages like access control, expiration policies, versioning, access history and others. However it has some gotchas which you should be aware of. In this post I will describe one such gotcha.

First of all you need to configure access to the key vault secret so your App service or Azure function will be able to read values from there. You may check how to do that e.g. here: Provide Key Vault authentication with an access control policy. After that go to App service > Configuration, click the Edit icon for the app setting which uses a reference to the KeyVault secret and check that there are no errors. Even if you did everything correctly you may see Status = AccessToKeyVaultDenied and the following error description:

Key Vault reference was not able to be resolved because site was denied access to Key Vault reference's vault

In order to fix it try the following workaround:

  1. Delete app setting from UI
  2. Save changes
  3. Add the same app setting with KeyVault reference (i.e. with @Microsoft.KeyVault(SecretUri=…))
  4. Save changes again

After that if permissions are configured properly Status should be changed to Resolved:

and your app should be able to successfully resolve secret from KeyVault reference.
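For reference, a Key Vault reference app setting can be written in two documented shapes (vault and secret names below are hypothetical placeholders):

```
@Microsoft.KeyVault(SecretUri=https://myvault.vault.azure.net/secrets/mysecret/)
@Microsoft.KeyVault(VaultName=myvault;SecretName=mysecret)
```

The first form pins the setting to a secret URI (optionally including a version), while the second resolves the latest version by vault and secret name.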

Monday, June 1, 2020

Upload WebJob to Azure App service with predefined schedule

As you probably know, in the Azure Portal it is possible to create Azure web jobs with Type = Triggered and specify a CRON expression which defines the schedule of the job. In this post I will describe how to create an Azure web job with a predefined CRON expression.

Before we start we need to create an Azure app service in the tenant and prepare a zip file with the executable file for the web job itself. Usually it can be created by archiving the bin/Release (or bin/Debug) folder of the console project with the web job logic. However there is one important detail: in order to set the schedule of the web job we need to add a special file called settings.job to the project and define a CRON expression inside this file. Azure web jobs use a 6-field CRON format ({second} {minute} {hour} {day} {month} {day of week}), so if we want our job to run every 15 minutes we need to define it like this:

{"schedule": "0 */15 * * * *"}

After that also go to the file properties in Visual Studio and set the "Copy to Output Directory" value to "Copy always". Then compile the project and create a zip file from the bin/Release (or bin/Debug) folder.
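As a quick sanity check, the settings.job content can be validated with a short Python sketch (hypothetical, it only verifies the JSON shape and the six-field CRON format Azure web jobs expect):

```python
import json

settings_job = '{"schedule": "0 */15 * * * *"}'

# settings.job must be valid JSON with a "schedule" key.
settings = json.loads(settings_job)
fields = settings["schedule"].split()

# Azure web job CRON expressions have six fields:
# second, minute, hour, day, month, day of week.
assert len(fields) == 6
print(fields)  # ['0', '*/15', '*', '*', '*', '*']
```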

The next step is to create the web job from the zip file. It can be done with the following convenient command line util available on github: WAWSDeploy. With this tool the web job can be created e.g. from PowerShell by running the following command:

WAWSDeploy.exe publishSettings.txt /t "app_data\jobs\triggered\MyJob"

In this command publishSettings.txt is the file with publishing settings for your App service; it can be created with the Get-AzureRmWebAppPublishingProfile cmdlet. This command will create a new web job called MyJob based on the provided zip file with the predefined schedule.

Tuesday, May 19, 2020

Fix javascript heap out of memory error when running local SPFx web server via gulp

If you develop an SPFx web part for Sharepoint and run the local web server on the dev environment:

gulp serve --nobrowser

then you may face the famous "JavaScript heap out of memory" error:

In order to avoid this error we need to increase the nodejs heap size. On Windows it can be done the following way:

  1. Run the following command in your terminal client:
    setx -m NODE_OPTIONS "--max-old-space-size=8192"

    (after that NODE_OPTIONS should be added to PC > Advanced properties > Environment variables)
  2. Restart terminal session which is used for running gulp to ensure that new env variables will be applied.
  3. Run web server:
         gulp serve --nobrowser

After that your solution should be built without heap out of memory error.

Monday, May 18, 2020

Query limits in Azure Table storage API

As you probably know, the Table API used by Azure table storage supports only the following comparison operators in queries:

  • =
  • <>
  • >
  • <
  • <=
  • >=

We needed to run an analogue of the SQL NOT IN operator against the RowKey column:

RowKey NOT IN (id1, id2, …, id_n)

where these ids were guids represented as strings. In order to do that with the mentioned operators we needed to use the not-equal operator (<>) and construct a query like this:

RowKey <> id1 AND RowKey <> id2 AND … RowKey <> id_n

The problem is that this list of guids didn't have a fixed size and could potentially be big. So it was interesting whether there is a limit on the maximum query length and, if yes, what the exact value of this maximum is.

For testing I created a test app and generated conditions over guids like this (in addition to the mentioned conditions on RowKey we needed to get rows from a particular partition only, so this extra condition was added as well; the exact partition name is not important – we only need to know that its length was 8 symbols, as it will be used for the overall query length calculation below):

var tableClient = storageAccount.CreateCloudTableClient(new TableClientConfiguration());

var table = tableClient.GetTableReference("MyTable");
var queryConditions = new List<string>();
queryConditions.Add(TableQuery.GenerateFilterCondition("PartitionKey", QueryComparisons.Equal, "Test1234"));
int numOfConditions = 50;
for (int i = 0; i < numOfConditions; i++)
{
    queryConditions.Add(TableQuery.GenerateFilterCondition("RowKey", QueryComparisons.NotEqual, Guid.NewGuid().ToString().ToLower()));
}
string filter = CombineUsingAnd(queryConditions);
var query = new TableQuery().Where(filter);

var watch = System.Diagnostics.Stopwatch.StartNew();
var result = table.ExecuteQuery(query).ToList();

public static string CombineUsingAnd(List<string> queryConditions)
{
    if (queryConditions == null || queryConditions.Count == 0)
    {
        return string.Empty;
    }
    string filter = string.Empty;
    foreach (string queryCondition in queryConditions)
    {
        if (string.IsNullOrEmpty(queryCondition))
        {
            continue;
        }
        if (string.IsNullOrEmpty(filter))
        {
            filter = queryCondition;
        }
        else
        {
            filter = TableQuery.CombineFilters(filter, TableOperators.And, queryCondition);
        }
    }
    return filter;
}

E.g. here is how the query looks with conditions on 50 guids:

((((((((((((((((((((((((((((((((((((((((((((((((((PartitionKey eq 'Teamwork') and (RowKey ne '60712ffe-061b-4a46-8402-01381177c4da')) and (RowKey ne '85694313-2776-485c-833d-ae78c185adba')) and (RowKey ne 'dac6f7fd-1198-486c-b3ea-e90337ee9711')) and (RowKey ne '5faf85cb-4662-4843-87f6-ac6dc20b04e1')) and (RowKey ne '8f524940-9822-46e1-b961-59a2e8fa51b4')) and (RowKey ne 'bacd765e-64d3-464a-a57e-bb30eeca0c2f')) and (RowKey ne '75ba6126-f529-4b30-ae41-882f2b223bff')) and (RowKey ne '5e9950ac-e611-4d58-8c91-8fcf16f2886a')) and (RowKey ne '8f08bd7b-7d6e-4c21-8b94-d5ca2a15f849')) and (RowKey ne 'a965f6f6-f03c-4b4a-91be-98481a35ba11')) and (RowKey ne '5858de75-addb-4f5a-acc1-6f9f55078b4f')) and (RowKey ne '9ada1d04-dd03-4b1f-ae26-bc51b7f35ec3')) and (RowKey ne 'cbb976e7-87c9-4b83-8007-355f3ce1061d')) and (RowKey ne '59e06da8-f343-4a98-9382-0bbf1ce21d3f')) and (RowKey ne 'ea0763d9-cb64-4750-a7e0-c1869ca82a15')) and (RowKey ne '926106f9-bf64-42db-b27f-be9bcb45907e')) and (RowKey ne '3cdcaf1e-ff36-4243-b865-9424e1f46427')) and (RowKey ne '8fcfc4ca-7510-4fa6-95d7-3ff2999de509')) and (RowKey ne '21b04d3a-32e0-4a59-900a-30babfa8fbd2')) and (RowKey ne 'df181ee8-4dc0-4232-b2bc-7e13dc8f2abf')) and (RowKey ne '6a0038d0-40a4-4a11-b626-9850b8178caa')) and (RowKey ne 'ffb78a96-85c9-4648-aba7-3baa56be6c1f')) and (RowKey ne '0c56afd8-164e-4574-9054-8824e6fc1969')) and (RowKey ne '51d95602-cfee-40bf-b9e3-b0f935083742')) and (RowKey ne 'f286d1fb-b9bb-4850-a911-58a77ffc7991')) and (RowKey ne '50280089-e61b-4874-8f99-8dc437f74181')) and (RowKey ne '05f72847-3936-4efc-9902-540a9d9c5ca8')) and (RowKey ne '066fcbe4-afee-46ff-b6d2-18e072f850a0')) and (RowKey ne '602f7cd4-7697-49ca-af51-e96c70eefed5')) and (RowKey ne '6858f832-a58e-4bcb-b57b-7c40d32cbcb6')) and (RowKey ne 'f9e2cc4f-a579-46f4-820b-e7d8ab9711c8')) and (RowKey ne 'cbb68c38-7e73-463b-8839-bef040803c37')) and (RowKey ne '1bf0d7ff-ece6-4ef1-9d48-ae424feea108')) and (RowKey ne 'de1dbdbf-8a75-40b4-bb9b-81e95ff4a542')) and (RowKey ne 
'31f5c94e-3eea-4c5a-8e60-8899eddc7829')) and (RowKey ne '94bc24a8-28e4-4b6d-965e-9360104bff21')) and (RowKey ne '93d7f84e-eb4b-4c7d-826b-196e8b46255e')) and (RowKey ne 'bd0d0989-7256-4ffc-8904-6135955b8aae')) and (RowKey ne '9c38865a-913c-468c-bd26-25e606f9b73a')) and (RowKey ne 'c2f9b899-76c1-4b04-aa68-fb3f903d81d4')) and (RowKey ne '9b478051-78b2-4a67-a25f-6cc09251ec33')) and (RowKey ne '908f80ea-e31f-4a85-9ff8-8cb80b601434')) and (RowKey ne '7775f8d4-2b9c-4903-a191-b54d214e380a')) and (RowKey ne '55b24a87-3259-4ca4-8869-f6d68f9c95c3')) and (RowKey ne '0476263d-cf8a-49b3-89f1-0bc8df9fd906')) and (RowKey ne 'f7a45c30-eba7-4944-b08b-a875d4b871e0')) and (RowKey ne '4136dc3c-10d3-4d58-afb9-23d8b422822e')) and (RowKey ne 'b6a40f8e-5284-4cbd-9356-2296df415650')) and (RowKey ne 'd63ecb62-5f48-4314-8a15-08c0b62ee835')) and (RowKey ne 'e3acdc4f-70a5-4a73-b023-1845ccf9ea51')

Testing results are summarized in the following table:

Num of conditions | Filter length (symbols) | Query elapsed time (ms)
50                | 2876                    | 12075
100               | 5726                    | 10550
112               | 6410                    | 10361
113               | 6467                    | 10967
114               | 6524                    | Bad request
> 114             |                         | Bad request

So the Table storage maximum query length is somewhere between 6467 and 6524 symbols. For us this knowledge was enough to proceed. Hope this information will help someone.
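As a rough cross-check of these numbers, the growth of the filter length can be reproduced with a short Python sketch that builds the same kind of filter string (hypothetical guids; the OData syntax and nesting mirror the query shown above):

```python
import uuid

def build_filter(num_conditions):
    """Build a combined filter: one PartitionKey condition plus
    num_conditions RowKey <> conditions, nested the same way
    TableQuery.CombineFilters nests them."""
    flt = "PartitionKey eq 'Test1234'"
    for _ in range(num_conditions):
        cond = "RowKey ne '%s'" % uuid.uuid4()
        flt = "(%s) and (%s)" % (flt, cond)
    return flt

# Each extra condition adds a fixed 57 symbols (a guid is 36 chars),
# so the filter length grows linearly: 26 + 57 * n.
print(len(build_filter(113)))  # 6467 - still accepted
print(len(build_filter(114)))  # 6524 - Bad request
```

The computed lengths match the measured table above, so the limit is crossed between 113 and 114 conditions for this partition key.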

Tuesday, May 12, 2020

How to run handler on ribbon open event and access ribbon elements via javascript in Sharepoint

Sometimes we need to run code when the Sharepoint ribbon is opened. In many cases the ribbon is closed by default and you can't access its elements in a regular document.ready handler because the ribbon elements are not created in the DOM yet. In this case you need to subscribe to an event and access the elements there. This is how it can be done:

ExecuteOrDelayUntilScriptLoaded(function() {
  console.log("Ribbon is opened");
}, "sp.ribbon.js");

Interestingly, even inside this handler you can't access the elements via a jQuery selector because it doesn't find these dynamically added elements. So e.g. the following code which tries to access the New document tab in a doclib will return 0:

$("#Ribbon.Documents.New").length // returns 0
In order to access these dynamically added ribbon elements we need to use pure javascript call document.getElementById. Here is the full code:

ExecuteOrDelayUntilScriptLoaded(function() {
  var el = document.getElementById("Ribbon.Documents.New");
  if (el) {
    console.log("New document tab is found");
  }
}, "sp.ribbon.js");

Using this approach you will be able to access and manipulate ribbon elements via javascript.

Monday, April 20, 2020

Create Azure Notification Hub with configured Apple APNS and Google FCM using PowerShell and ARM templates

If you have an Azure notification hub with configured Apple APNS and Google FCM and try to export the ARM template of this notification hub, you will find that the APNS and FCM configuration is not included in the template. In order to add APNS and FCM you need to add the following properties to the ARM template:

  • GcmCredential for FCM
  • ApnsCredential for APNS

If you use Token authentication type and Production endpoint for Apple APNS, the template will look like this:

{
    "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
    "contentVersion": "1.0.0.0",
    "parameters": {
        "hubName": {
            "type": "string"
        },
        "namespaceName": {
            "type": "string"
        },
        "googleApiKey": {
            "type": "string"
        },
        "appId": {
            "type": "string"
        },
        "appName": {
            "type": "string"
        },
        "keyId": {
            "type": "string"
        },
        "token": {
            "type": "string"
        }
    },
    "variables": {},
    "resources": [
        {
            "type": "Microsoft.NotificationHubs/namespaces/notificationHubs",
            "apiVersion": "2017-04-01",
            "name": "[concat(parameters('namespaceName'), '/', parameters('hubName'))]",
            "location": "North Europe",
            "properties": {
                "authorizationRules": [],
                "GcmCredential": {
                    "properties": {
                        "googleApiKey": "[parameters('googleApiKey')]",
                        "gcmEndpoint": "https://fcm.googleapis.com/fcm/send"
                    }
                },
                "ApnsCredential": {
                    "properties": {
                        "appId": "[parameters('appId')]",
                        "appName": "[parameters('appName')]",
                        "keyId": "[parameters('keyId')]",
                        "token": "[parameters('token')]",
                        "endpoint": "https://api.push.apple.com:443/3/device"
                    }
                }
            }
        },
        {
            "type": "Microsoft.NotificationHubs/namespaces/notificationHubs/authorizationRules",
            "apiVersion": "2017-04-01",
            "name": "[concat(parameters('namespaceName'), '/', parameters('hubName'), '/DefaultFullSharedAccessSignature')]",
            "dependsOn": [
                "[resourceId('Microsoft.NotificationHubs/namespaces/notificationHubs', parameters('namespaceName'), parameters('hubName'))]"
            ],
            "properties": {
                "rights": [
                    "Listen",
                    "Manage",
                    "Send"
                ]
            }
        },
        {
            "type": "Microsoft.NotificationHubs/namespaces/notificationHubs/authorizationRules",
            "apiVersion": "2017-04-01",
            "name": "[concat(parameters('namespaceName'), '/', parameters('hubName'), '/DefaultListenSharedAccessSignature')]",
            "dependsOn": [
                "[resourceId('Microsoft.NotificationHubs/namespaces/notificationHubs', parameters('namespaceName'), parameters('hubName'))]"
            ],
            "properties": {
                "rights": [
                    "Listen"
                ]
            }
        }
    ]
}

In order to create a new notification hub using this ARM template we need to call the New-AzResourceGroupDeployment cmdlet and provide the params specified in the parameters section of the template:

$resourceGroupName = ...
$hubNamespaceName = ...
$hubName = ...
$googleApiKey = ...
$appId = ...
$appName = ...
$keyId = ...
$token = ...

New-AzResourceGroupDeployment -ResourceGroupName $resourceGroupName -TemplateFile ./template.json -hubName $hubName -namespaceName $hubNamespaceName -googleApiKey $googleApiKey -appId $appId -appName $appName -keyId $keyId -token $token

It will create a new Azure notification hub with the specified name in the specified namespace with configured Apple APNS (Token authentication type) and Google FCM.