Friday, December 27, 2019

Several problems when using Set-AzureADApplication cmdlet with AzureAD app with allowPublicClient = true

In order to use the username/password credentials authentication flow with an AzureAD app, the app should have the allowPublicClient property set to true. As described in the documentation, this property won't affect those OAuth flows which use reply urls:

Specifies the fallback application type. Azure AD infers the application type from the replyUrlsWithType by default. There are certain scenarios where Azure AD cannot determine the client app type (e.g. ROPC flow where HTTP request happens without a URL redirection). In those cases Azure AD will interpret the application type based on the value of this property. If this value is set to true the fallback application type is set as public client, such as an installed app running on a mobile device. The default value is false which means the fallback application type is confidential client such as web app.

However setting it to true has several side effects. One of them is that you can't set this property via the Set-AzureADApplication cmdlet as there is no such parameter (Update 2021-04-22: it is possible to change it using other cmdlets - see Change "Allow public client flows" property of Azure AD apps via PowerShell). There is another property "PublicClient", and based on this discussion on github, setting it to true/false also sets "allowPublicClient" to true/false. But if you run Set-AzureADApplication and try to specify API permissions via RequiredResourceAccess

Set-AzureADApplication -ObjectId … -RequiredResourceAccess $permissions -PublicClient $true

you will get an error:

Property requiredResourceAccess.resourceAccess is invalid.

It is possible to set allowPublicClient to true from the UI in Azure portal > Azure Active Directory > App registrations > app > Authentication > Default client type > Treat application as a public client = Yes:

But after that it won't be possible to change API permissions via the Set-AzureADApplication cmdlet. The following code:

$tokensApp = Get-AzureADApplication | Where-Object { $_.AppId -eq "..." }
$requiredResourceAccess = $tokensApp.RequiredResourceAccess
$graphPermissions = $requiredResourceAccess | Where-Object { $_.ResourceAppId -eq "00000003-0000-0000-c000-000000000000" }
# add Sites.Read.All delegated permission
$perm = New-Object -TypeName "Microsoft.Open.AzureAD.Model.ResourceAccess" -ArgumentList "205e70e5-aba6-4c52-a976-6d2d46c48043","Scope"
$graphPermissions.ResourceAccess.Add($perm)

Set-AzureADApplication -ObjectId $tokensApp.ObjectId -RequiredResourceAccess $requiredResourceAccess

will throw the same exception:

Set-AzureADApplication : Error occurred while executing SetApplication
Code: Request_BadRequest
Message: Property requiredResourceAccess.resourceAccess is invalid.
Details: PropertyName - requiredResourceAccess.resourceAccess, PropertyErrorCode - GenericError
HttpStatusCode: BadRequest
HttpStatusDescription: Bad Request
HttpResponseStatus: Completed

Note that the same code works properly if the allowPublicClient property is set to false. For now I posted this problem on github here. If you know any workarounds for this problem, please share them in comments.

Friday, December 20, 2019

Configurable schedule of timer triggered Azure functions

As you probably know, it is possible to create Azure functions which are triggered automatically on a schedule specified by a CRON expression. In order to do that, create a timer-triggered Azure function and specify the CRON expression like this:

public static class MyFunc
{
 [FunctionName("MyFunc")]
 public static void Run([TimerTrigger("0 */15 * * * *")] TimerInfo myTimer, TraceWriter log)
 {
  ...
 }
} 

Here we defined a function which runs every 15 minutes. But with this approach, if you want to change the schedule in a production environment, you will need to recompile the package and re-upload it to Azure. It is more convenient to make the schedule configurable. In order to do that, use the following syntax:

public static class MyFunc
{
 [FunctionName("MyFunc")]
 public static void Run([TimerTrigger("%MyFuncSchedule%")] TimerInfo myTimer, TraceWriter log)
 {
  ...
 }
}

At the same time, your Function app settings need to contain the MyFuncSchedule app setting:

{
  "IsEncrypted": false,
  "Values": {
    "MyFuncSchedule": "0 */15 * * * *",
  },
  "Host": {
    "CORS": "*"
  }
}

In this case it will be possible to change the timer schedule of your timer-triggered Azure function without recompiling and re-uploading the package.

Note 1. The app setting with the schedule should be added before uploading the package with compiled Azure functions. Otherwise the function won't run.

Note 2. When you change Azure function app settings, currently running instances will be terminated. After that, new instances will use the updated app settings.

Monday, December 9, 2019

Configure permissions for Azure Function app to access secrets from Azure Key vault

In order to use Azure Key vault from Azure functions, you first need to grant the Azure Function app permissions to read data from Azure Key vault (in our example we will use Secrets, i.e. passwords, app secrets, etc., but you may use the same technique to access keys and certificates which may also be stored there). First you need to create a system-assigned identity for your Azure function from Platform features > Identity:

On this page, under the System assigned tab, set the status to On:

After that go to Azure Key vault (if you don't have one yet, create it first) and select Access policies > Add Access Policy. On the opened page select Secret permissions > Get:

(if you store keys or certificates in Key vault, you have to select the appropriate Key or Certificate permissions).

In Select principal, choose the name of your Azure Function app. The principal will be available in this field only after creation of the Function app identity which we made above.
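If you prefer to grant this permission via script rather than via UI, a minimal sketch with the Az PowerShell module may look like this (vault name and app name below are placeholders for your own values):

# grant the Function app's system-assigned identity read access to secrets
$principal = Get-AzADServicePrincipal -DisplayName "my-function-app"
Set-AzKeyVaultAccessPolicy -VaultName "my-key-vault" -ObjectId $principal.Id -PermissionsToSecrets Get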

After that your Azure functions will be able to read values from Azure Key vault. Note that you have to store the secret reference in app settings in the following format:

@Microsoft.KeyVault(SecretUri=https://{key-vault-name}.vault.azure.net/secrets/{secret-name}/{id})

Then you may just read this parameter from app settings and it will be automatically expanded to the actual secret value stored in Key vault.
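For example, such an app setting may be added via the Az.Functions PowerShell module like this (app, resource group and secret names are hypothetical); after that the function code reads MySecret as a regular app setting and receives the expanded secret value:

# add a Key Vault reference as a regular app setting
Update-AzFunctionAppSetting -Name "my-function-app" -ResourceGroupName "my-rg" `
    -AppSetting @{ "MySecret" = "@Microsoft.KeyVault(SecretUri=https://my-key-vault.vault.azure.net/secrets/my-secret/<id>)" }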

Friday, December 6, 2019

Provision language specific embedded resources with Sharepoint project type in Visual Studio

In my previous posts I showed how to provision embedded resources automatically with wsp provisioning:

Provision and automatic update of embedded resources to Resources folder in Sharepoint hive

Provision and automatic update of embedded resources to App_LocalResources folder under Sharepoint Template/Layouts or Template/ControlTemplates sub folders

In this post I will describe how to include language specific embedded resources into a wsp using the standard Sharepoint project type in Visual Studio.

Let's create a new empty Sharepoint project, add the Resources mapped folder, add a Test.resx file there and set it as Embedded resource:

After that let's use the solution mentioned in this post in order to include Test.resx into the wsp package (by default Visual Studio adds only those resx files which are set as Content). If we publish the wsp package now it will look like this:

Now let’s add language specific resource Test.ru-RU.resx:

and also add it to the wsp using the same technique as for the default resource file Test.resx. Now our wsp will contain both resx files which will be provisioned to the {hive}/Resources folder:

But this is not enough. When we add a language specific embedded resource to the solution, Visual Studio produces an additional assembly xx-XX\{AssemblyName}.resources.dll in the output folder (in our example ru-RU\SharePointProject5.resources.dll). This additional assembly should be installed to the GAC together with the base assembly – otherwise code will use only the default resources from Test.resx. In order to do that we need to add this assembly to the Additional assemblies list in Package > Advanced > Additional assemblies > Add existing assembly:

(Don't forget to add the language identifier in the Location field before the assembly name – i.e. ru-RU\SharePointProject5.resources.dll, not just SharePointProject5.resources.dll. If you have several additional languages, there will be a separate *.resources.dll assembly for each of them. By adding the language identifier we instruct Visual Studio to put them into appropriate subfolders inside the wsp package.)

After that both the language specific resx file and the additional resources.dll assembly will be added to the wsp and installed automatically during wsp provisioning:

Thursday, December 5, 2019

One reason for Graph API call failure when it is done under delegated permissions

As you probably know, you may call Graph API under app-only permissions and under delegated permissions. Here is an example of an authentication provider which can be used for calling Graph API under delegated permissions (using username and password):

public class AzureAuthenticationProviderDelegatedPermissions : IAuthenticationProvider
{
 public async Task AuthenticateRequestAsync(HttpRequestMessage request)
 {
  var delegatedAccessToken = await GetGraphAccessTokenForDelegatedPermissionsAsync();
  request.Headers.Add("Authorization", "Bearer " + delegatedAccessToken);
 }

 public async Task<string> GetGraphAccessTokenForDelegatedPermissionsAsync()
 {
  string clientId = ...;
  string userName = ...;
  string password = ...;
  string tenant = ...;
  
  var creds = new UserPasswordCredential(userName, password);
  var authContext = new AuthenticationContext(string.Format("https://login.microsoftonline.com/{0}", tenant));
  var authResult = await authContext.AcquireTokenAsync("https://graph.microsoft.com", clientId, creds);
  return authResult.AccessToken;
 }
}

However when you call Graph API with delegated permissions you may get the following error:

AADSTS7000218: The request body must contain the following parameter: 'client_assertion' or 'client_secret'.

The reason may be that for the app whose app id is used for authentication, the Default client type is set to private, i.e. "Treat application as a public client" is set to No:

In order to fix it, set Default client type to Public (set "Treat application as a public client" to Yes).
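If you prefer to change this setting via script instead of the portal, a possible sketch with the Microsoft Graph PowerShell module is shown below ($appObjectId is the object id of your app registration; treat the exact parameter name as an assumption to verify against your module version):

# set the fallback application type to public client
Connect-MgGraph -Scopes "Application.ReadWrite.All"
Update-MgApplication -ApplicationId $appObjectId -IsFallbackPublicClient:$true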

Wednesday, November 27, 2019

How to specify BindingRedirects in Azure functions app settings via ARM template in automatic provisioning scenarios

You may face the famous problem with Newtonsoft.Json version mismatch in Azure functions (when the Azure functions SDK requires one specific version and e.g. OfficeDevPnP requires another – you get a runtime error that Newtonsoft.Json.dll of version x.x.x.x is not found). There is a solution for this issue posted here: Performing a binding redirect in Azure Functions. With this solution we have to add a new app setting for the Azure function which is a string representation of a JSON array:

{
    "BindingRedirects": "[ { \"ShortName\": \"Newtonsoft.Json\", \"RedirectToVersion\": \"11.0.0.0\", \"PublicKeyToken\": \"30ad4fe6b2a6aeed\" } ]"
}

It works, but if you use automatic provisioning of Azure functions with a template json file (see Azure Resource Manager template functions) and want to specify this setting there like this:

{
    "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
    "contentVersion": "1.0.0.0",
    "parameters": {
        "BindingRedirects": {
            "type": "string",
            "defaultValue": "[{ \"ShortName\": \"Newtonsoft.Json\", \"RedirectToVersion\": \"12.0.0.0\", \"PublicKeyToken\": \"30ad4fe6b2a6aeed\" }]"
        }
    }
}

you will see the following error:

Error occured when try to create Function app (error code 1):
Error: Code=InvalidTemplate; Message=Deployment template language expression evaluation failed: 'The language expression '{ "ShortName": "Newtonsoft.Json", "RedirectToVersion": "12.0.0.0", "PublicKeyToken": "30ad4fe6b2a6aeed" }
' is not valid: the string character '{' at position '0' is not expected.'. Please see https://aka.ms/arm-template-expressions for usage details.

The problem here is that the BindingRedirects app setting uses square brackets, which at the same time are used by the ARM template engine for specifying calculated values. The solution for this problem is to escape the square brackets by doubling them, i.e. instead of [ use [[ and instead of ] use ]]. The final version will look like this:

{
    "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
    "contentVersion": "1.0.0.0",
    "parameters": {
        "BindingRedirects": {
            "type": "string",
            "defaultValue": "[[{ \"ShortName\": \"Newtonsoft.Json\", \"RedirectToVersion\": \"12.0.0.0\", \"PublicKeyToken\": \"30ad4fe6b2a6aeed\" }]]"
        }
    }
}

After that, provisioning of the ARM template for Azure functions will go well.

Monday, November 25, 2019

Authentication when publish nuget package to internal Azure packages feed

If you want to use an Azure packages feed for storing internal packages of your organization (https://{tenant}.pkgs.visualstudio.com), you will have to use the nuget client as usual and authenticate yourself against the Azure packages feed using your Azure AD credentials. However, by default nuget.exe doesn't support Azure authentication: it will ask you to enter username and password inside the console window, which will fail (it will just ask you to enter username and password again and again).

In order to enable Azure AD authentication with the nuget client, you need to install an authentication provider first. This page contains instructions on how to do that: Azure Artifacts Credential Provider. Basically you need to download the installcredprovider.ps1 PowerShell script to the local machine and run it with the "-AddNetfx" param:

installcredprovider.ps1 -AddNetfx

Note that the "-AddNetfx" switch param is mandatory in most cases – otherwise the credential provider will be added only for .Net Core and won't work in a regular Windows console client or PowerShell window.

After that is done, when you run the command to publish a nuget package to Azure:

nuget push -Source "MySource" -ApiKey az PackageName.nupkg

it should show the standard authentication popup window where you will be able to enter your Azure AD credentials. After that, publishing to the Azure nuget feed should succeed (provided, of course, that you have permission to push packages to this feed).
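Note: if the feed is not registered on the machine yet, it can be added first via nuget sources (feed name and URL below are placeholders for your organization's values):

nuget sources Add -Name "MySource" -Source "https://{tenant}.pkgs.visualstudio.com/_packaging/{feed}/nuget/v3/index.json"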

Tuesday, November 12, 2019

Why you should be careful with /groups/{id}/photo and /users/{id}/photo endpoints in MS Graph, or unintentionally getting big photos in Graph

As you probably know, we may retrieve groups' and users' photos via Graph API using the following endpoints (see Get photo):

GET /users/{id | userPrincipalName}/photo/$value
GET /groups/{id}/photo/$value

Note however that Graph stores photos of different sizes:

The supported sizes of HD photos on Office 365 are as follows: 48x48, 64x64, 96x96, 120x120, 240x240, 360x360, 432x432, 504x504, and 648x648.

and if we use the default /photo endpoint it will return the biggest available photo, i.e. 648x648. Depending on your scenario this may not be what you actually need: if you just need to display a group logo or group member photo, smaller images, e.g. of 64x64 pixels, will be enough. And if you work with big 648x648 photos and reduce their size by css, the page size may become very big and browser memory consumption will be much higher:

So instead of using the default /photo endpoint in MS Graph, you may consider using endpoints which return a photo of a specified smaller size:

GET /groups/{id}/photos/64x64/$value

or with the Graph client library in a .Net app:

// synchronously fetch the group photo of a specific size
// (Constants.GRAPH_IMAGES_CACHE_PHOTO_ID holds the photo size id, e.g. "64x64")
var stream = Task.Run(async () =>
{
 var photo = await graphClient.Groups[groupId].Photos[Constants.GRAPH_IMAGES_CACHE_PHOTO_ID].Content.Request().GetAsync();
 return photo;
}).GetAwaiter().GetResult();

This code will return a small image which may be more appropriate for your scenario:

Compare also the sizes of the big 648x648 image and the small 64x64 image: 10Kb vs 2Kb.

Thursday, November 7, 2019

Quick way to upload user photo for making it available through Graph API /user/{id}/photo endpoint

If you want to test fetching users' photos via Graph API, you may need to upload these photos first so they become available through the Graph endpoints (see Get photo):

GET /me/photo/$value
GET /users/{id | userPrincipalName}/photo/$value
GET /groups/{id}/photo/$value

Of course you may do that programmatically using PUT requests as shown here (see also the sketch after this list):

PUT /me/photo/$value
PUT /users/{id | userPrincipalName}/photo/$value
PUT /groups/{id}/photo/$value
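As a sketch, here is how such a PUT request may be sent from PowerShell (token acquisition is out of scope here; the account or app must have permissions to update the photo):

# upload photo.jpg as the photo of the specified user
$token = "..." # access token with appropriate Graph permissions
$upn = "user@tenant.onmicrosoft.com"
Invoke-RestMethod -Method Put `
    -Uri "https://graph.microsoft.com/v1.0/users/$upn/photo/`$value" `
    -Headers @{ Authorization = "Bearer $token" } `
    -ContentType "image/jpeg" `
    -InFile "photo.jpg"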

but if you need to do it quickly without developing a tool for that, you may use the standard UI and upload the photo from the user's Delve profile page:

1. Click the icon with your account in the top right corner and select My Office profile:

2. You will be redirected to your account's Delve profile page:

3. On this page click the "Upload a new photo" icon and upload the photo of the user.

After that it will be possible to get the user's photos via the Graph endpoints, including photos of different sizes.

Friday, October 11, 2019

Problem with creating new content database for web application in Sharepoint and Full recovery mode

If you want to create a new content database for a web application using the New-SPContentDatabase cmdlet, you should be aware of one side effect of this action: the new content database will be created with the Full recovery model by default, even if your original content database uses the Simple recovery model:

With the Full recovery model the transaction log grows quite fast until you do a full backup. This may be a significant consideration for large content databases. So before you start using the new content database, you may want to change its recovery model to Simple. If you already moved an existing site collection there using Move-SPSite, the transaction log may already be quite big, so after changing the recovery model you may also need to shrink the log file, as in the sketch below.
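If you need to script this, a minimal sketch with the SqlServer PowerShell module could look like the following (server, database and log file names are hypothetical):

# switch the new content database to the Simple recovery model
Invoke-Sqlcmd -ServerInstance "SQLSERVER" -Query "ALTER DATABASE [WSS_Content_New] SET RECOVERY SIMPLE"
# shrink the transaction log file (the logical log file name is also hypothetical)
Invoke-Sqlcmd -ServerInstance "SQLSERVER" -Database "WSS_Content_New" -Query "DBCC SHRINKFILE (N'WSS_Content_New_log', 1)"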

Friday, October 4, 2019

How to test and debug Timer triggered Azure functions running locally on localhost

With the Azure function project type we may create many types of Azure functions. One of them is the Timer triggered Azure function, which is triggered by a scheduler (the CRON expression is specified during function creation and may be modified later in the Azure function attribute):

When you run this function locally (on localhost) you will notice that it won't be listed under

http://localhost:7071/api/*

address together with Http triggered Azure functions. The runtime will call your function automatically when the scheduled time comes. But this is not very convenient, since during development you would probably like to have the possibility to call it synchronously on demand.

In order to call a Timer triggered Azure function you may use the Postman tool. You have to construct the URL a little differently from Http triggered functions: instead of /api you should send a POST request to the http://localhost:7071/admin/functions/* base url with the following request body:

{
    "input": ""
}
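For example, the same call may be done from PowerShell (the function name MyFunc below is just for illustration):

Invoke-RestMethod -Method Post `
    -Uri "http://localhost:7071/admin/functions/MyFunc" `
    -ContentType "application/json" `
    -Body '{ "input": "" }'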

Using this technique you may debug your Timer triggered Azure functions synchronously.

Wednesday, October 2, 2019

Different nuget packages of Camlex.Client library for different Sharepoint versions

There is exciting news for those Sharepoint developers who use Camlex in their work: the Camlex.Client library (the Camlex version for CSOM) is now available for each Sharepoint version as a separate nuget package:

Sharepoint version    Nuget package
2013 on-prem          Camlex.Client.2013
2016 on-prem          Camlex.Client.2016
2019 on-prem          Camlex.Client.2019
SPO                   Camlex.Client.dll

Before that there was only the Sharepoint Online version, Camlex.Client.dll (see here). Maybe it would be more logical to call the SPO package Camlex.Client.Online, but since this package has existed for quite a long time, I left the name as is to keep backward compatibility.

So now, for targeting the correct Sharepoint version, you may just add the appropriate nuget package to your project and work with it. Each package contains a Camlex.Client.dll assembly which is compiled using the appropriate versions of CSOM (Microsoft.SharePoint.Client.dll and Microsoft.SharePoint.Client.Runtime.dll). This approach is similar to the one used e.g. in OfficeDevPnP, which uses different nuget packages for each Sharepoint version.

This feature was wished for and requested many times, so probably many developers will find it useful. The only reason which prevented me from doing it earlier is that it increases maintenance time, and as I'm the only developer who maintains the project, this is quite significant. But now the time has come, and with each new release all 4 nuget packages for Camlex.Client will be updated.

Friday, September 20, 2019

Fix The EntityContainer name must be unique EF error in Sharepoint

If you use Entity Framework in Sharepoint (I described one useful technique which simplifies usage of EF in Sharepoint in this article: Build connection string for Entity Framework programmatically via code or use Entity Framework in Sharepoint Timer jobs) you may face the following error:

The EntityContainer name must be unique. An EntityContainer with the name '…' is already defined

This happens even though in your project (a class library which you install to the GAC for using it in Sharepoint) there is a single entity container with the specified name.

The problem is that the entity container name must be unique not only within project boundaries but across the whole app domain. I.e. suppose you have another project which is compiled, installed to the GAC and loaded into the same app domain, and which uses the same metadata for the Entity Framework data model, like:

res://*/DataModel.csdl|res://*/DataModel.ssdl|res://*/DataModel.msl

and you then install another assembly to the GAC with the same EF data model metadata – you will get the above error when the second project is loaded into the Sharepoint site's app domain. It is not that obvious that you have to use unique names for EF entity containers across all assemblies used in Sharepoint. The solution for this problem is to rename the EF data model in the second project so it has a unique name:

res://*/Project2DataModel.csdl|res://*/Project2DataModel.ssdl|res://*/Project2DataModel.msl

After that your code will start working in Sharepoint.

Thursday, September 12, 2019

Use Office 365 connected groups in Yammer

The Yammer platform allows you to integrate your Yammer groups with Office 365. Technically it means that each time a new Yammer group is created, it will also be created in the Azure AD of the tenant of this Yammer network. More information about Yammer and O365 groups can be found in this article: Yammer and Office 365 Groups. This article also contains information on how to enable Office 365 connected groups in Yammer and which conditions should be met before you will be able to enable it:

  • You must enforce Office 365 identity for Yammer users. When you first enforce Office 365 identity there is a seven-day trial period, after which the Status of your Office 365 Identity Enforcement changes to Committed.
  • Your Yammer network must be in a 1:1 network configuration. This means you have one Yammer network that is associated with one Office 365 tenant.

Note that currently it is possible to enforce Office 365 identity for Yammer users without the seven-day trial, i.e. right away.

In order to enable Office 365 connected groups in Yammer, you need to log in to your Yammer network with an administrator account and go to Network settings (gear icon in top left) > Network Admin > Security settings. On this page we first need to enforce Office 365 identity for Yammer users:

Once its status is changed to Committed, after some time the "Office 365 Connected Yammer Groups" setting will change to Enabled:

The documentation above says that it takes up to 24h for the status to change to Enabled. But in my case it changed quite fast: almost immediately after enforcing Office 365 identity for Yammer users.

Let's see what happens when the "Office 365 Connected Yammer Groups" setting is disabled for your Yammer network. When you create a new Yammer group:

this group won't be found in the Azure AD of the Yammer network's tenant:

After Office 365 Connected Yammer Groups have been enabled, new Yammer groups will also appear in Azure AD:

So you will be able to use all the advantages of O365 connected groups. Note that for those Yammer groups which already existed when O365 connected groups were enabled, appropriate O365 groups will also be created after some time. The documentation says that it may take up to 1 week:

After about 1 week, existing eligible groups will be converted to Office 365 groups.

but for me it also happened quite fast, within 1 hour or so.

Wednesday, September 11, 2019

Camlex 5.1.1 and Camlex.Client 3.2.1 are released

Today I've released new versions of the Camlex library: Camlex 5.1.1 and Camlex.Client 3.2.1 (for the Sharepoint client object model). In these versions reverse engineering support has been added for the Includes/NotIncludes operations (support for the Includes/NotIncludes operations themselves was added in Camlex 5.1 and Camlex.Client 3.2). It is common practice currently that at first the basic feature is released and then reverse engineering support is added in the next minor release. Also the Camlex Online service was updated to the new version.

The primary goal of reverse engineering is to power the Camlex Online web site where developers may convert a plain CAML query to C# code with Camlex. I.e. it simplifies usage of Camlex for developers who are not familiar with its syntax yet (though there should not be many of those nowadays :) ). E.g. you may enter a CAML query like:

<Query>
  <Where>
      <Includes>
        <FieldRef Name="Title" />
        <Value Type="Text">Hello</Value>
      </Includes>
  </Where>
</Query>

and it will be converted to the following C# code:

Camlex.Query().Where(x => ((object)(string)x["Title"]).Includes((object)"Hello")) 

Tuesday, September 3, 2019

Problem with not unique web ids for Sharepoint Online web sites created with Fast Site Collection creation

When you create modern Team or Communication sites which use Fast Site Collection creation (not only from the UI but also e.g. using the New-PnPUnifiedGroup cmdlet), you may face an unexpected surprise: the web ids (SPWeb.ID) of the root sites of different site collections created this way may be equal!

Connect-PnPOnline https://{mytenant}.sharepoint.com/sites/site1
Get-PnPWeb

Title             ServerRelativeUrl        Id
-----             -----------------        --
site1             /sites/site1             {guid1}

Connect-PnPOnline https://{mytenant}.sharepoint.com/sites/site2
Get-PnPWeb

Title             ServerRelativeUrl        Id
-----             -----------------        --
site2             /sites/site2             {guid1}

This may be crucial if you have custom features which depend on the web id. At the same time, the site collection ids (SPSite.ID) appear to be different. This means that if you have a custom feature which identifies sites by SPWeb.ID only, it will be safer to modify it so that it uses the pair (SPSite.ID, SPWeb.ID) to identify sites, as in the sketch below.
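For example, with PnP PowerShell such a pair may be retrieved like this (a quick sketch assuming a connection is already opened with Connect-PnPOnline):

# identify the site by the pair (site collection id, web id)
$site = Get-PnPSite -Includes Id
$web = Get-PnPWeb
$key = "{0}_{1}" -f $site.Id, $web.Id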

Friday, August 30, 2019

How to force ItemUpdating/ItemUpdated events in event receivers for all list items in Sharepoint list via PowerShell

If you need to force ItemUpdating/ItemUpdated events in all event receivers attached to a Sharepoint list, you may go through the items one by one, click Edit item and then click Save. However, if there are many list items it is better to use a PowerShell script which forces these events for all list items one by one:

param(
    [string]$url,
    [string]$listTitle
)

$web = Get-SPWeb $url
$list = $web.Lists[$listTitle]
foreach ($item in $list.Items)
{
    $item.Update()
    Start-Sleep -Seconds 5
}

In this script we iterate through all list items in the list and call the SPListItem.Update() method, which in turn forces attached event receivers to fire ItemUpdating and ItemUpdated events. After each update we wait 5 seconds to make sure event receivers finish handling the previous event before the next one. If this is not needed for your scenario, or if your event receivers work faster, you may comment out the wait or decrease the number of seconds.

Thursday, August 29, 2019

Enumerate all event receivers attached to Sharepoint list via PowerShell

Sometimes we need to enumerate all event receivers attached to a specific Sharepoint list. The simplest way to do that is to use PowerShell. The following PowerShell script shows all event receivers attached to the specified list:

param( 
    [string]$url,
    [string]$listName
)

$web = Get-SPWeb $url
$list = $web.Lists[$listName]

foreach($eventReceiverDef in $list.EventReceivers)
{
    $eventInfo = $eventReceiverDef.Class + ", " + $eventReceiverDef.Assembly + " – " + $eventReceiverDef.Type
    Write-Host $eventInfo -ForegroundColor green
}

In order to run it, specify the url of the web site which contains the list and the list title:

check.ps1 http://example.com MyList

Here is an example of running this script for a standard Discussion board list:

Microsoft.SharePoint.Portal.CommunityEventReceiver, Microsoft.SharePoint.Portal, Version=15.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c - ItemAdding
Microsoft.SharePoint.Portal.CommunityEventReceiver, Microsoft.SharePoint.Portal, Version=15.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c - ItemUpdating
Microsoft.SharePoint.Portal.CommunityEventReceiver, Microsoft.SharePoint.Portal, Version=15.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c - ItemDeleting
Microsoft.SharePoint.DiscussionListEventReceiver, Microsoft.SharePoint,Version=15.0.0.0,Culture=neutral,PublicKeyToken=71e9bce111e9429c - ItemAdded
Microsoft.SharePoint.Portal.CommunityEventReceiver, Microsoft.SharePoint.Portal, Version=15.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c - ItemAdded
Microsoft.SharePoint.DiscussionListEventReceiver, Microsoft.SharePoint,Version=15.0.0.0,Culture=neutral,PublicKeyToken=71e9bce111e9429c - ItemUpdated

Sunday, August 25, 2019

Internal mechanism of reply emails in Sharepoint discussion board

In Sharepoint you may create discussion board lists where users can create new discussion threads. When somebody writes a reply in a particular discussion, the author of this discussion receives an email notification. In this post I will write about the internal mechanism of these reply emails, i.e. how they are implemented internally.

We might expect that reply notification emails in discussion boards are implemented via standard Sharepoint email alerts. However this is not the case. If you check the alerts list of the parent web of the discussion board list, you will see that it is empty (or it may contain alerts created in a different list; it may also contain alerts for the discussion board, but this is a different story – I will write more about it below):

$web = Get-SPWeb http://example.com
$web.Alerts

I.e. discussion authors will still get email notifications on replies even when there are no alerts in the web. It means that these reply emails are implemented via some different mechanism. It also means that it is not possible to edit the template of these emails by modifying Sharepoint alert templates. Users may still use OTB Sharepoint alerts and subscribe themselves to events in the discussion board list by clicking the three dots near the discussion thread subject and selecting the Alert me link:

In this case a real alert will be created and the web.Alerts collection will contain it. This alert will be customizable, i.e. it will be possible to modify its template by editing the discussion board alert template. However, it is still a different alert from the reply notification email: if the discussion author subscribes to discussion board events this way, the author will get 2 emails – one as a reply notification and another as an OTB alert.

So how are reply email notifications implemented then? If it is not an OTB alert, it may be implemented via a workflow or via an event receiver. If we check the list of workflows for the discussion board, we will see that it is empty. So only event receivers remain. Let's execute the following PowerShell script which lists all event receivers for the specified list:

param( 
    [string]$url,
    [string]$listName
)

$web = Get-SPWeb $url
$list = $web.Lists[$listName]

foreach($eventReceiverDef in $list.EventReceivers)
{
    $eventInfo = $eventReceiverDef.Class + ", " + $eventReceiverDef.Assembly + " – " + $eventReceiverDef.Type
    Write-Host $eventInfo -ForegroundColor green
}

For the discussion board it will show the following event receivers:

Microsoft.SharePoint.Portal.CommunityEventReceiver, Microsoft.SharePoint.Portal, Version=15.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c - ItemAdding
Microsoft.SharePoint.Portal.CommunityEventReceiver, Microsoft.SharePoint.Portal, Version=15.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c - ItemUpdating
Microsoft.SharePoint.Portal.CommunityEventReceiver, Microsoft.SharePoint.Portal, Version=15.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c - ItemDeleting
Microsoft.SharePoint.DiscussionListEventReceiver, Microsoft.SharePoint,Version=15.0.0.0,Culture=neutral,PublicKeyToken=71e9bce111e9429c - ItemAdded
Microsoft.SharePoint.Portal.CommunityEventReceiver, Microsoft.SharePoint.Portal, Version=15.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c - ItemAdded
Microsoft.SharePoint.DiscussionListEventReceiver, Microsoft.SharePoint,Version=15.0.0.0,Culture=neutral,PublicKeyToken=71e9bce111e9429c - ItemUpdated

DiscussionListEventReceiver basically updates the LastReplyBy field of the parent discussion board and doesn't do anything else. So let's check CommunityEventReceiver. If we check its code via a decompiler, we will see that in all post event handler methods (ItemAdded, ItemUpdated, ItemDeleted) it calls the internal method HandleEvent, which in turn calls the EventCache.Instance.HandleChange() method:

public sealed class CommunityEventReceiver : SPItemEventReceiver
{
 public CommunityEventReceiver()
 {
 }

 private void HandleEvent(SPItemEventProperties properties)
 {
  bool eventFiringEnabled = base.EventFiringEnabled;
  try
  {
   base.EventFiringEnabled = false;
   using (SPMonitoredScope sPMonitoredScope = new SPMonitoredScope("CommunityEventReceiver::HandleEvent"))
   {
    EventChangeRecord eventChangeRecord = null;
    SPSecurity.RunWithElevatedPrivileges(() => {
     using (SPSite sPSite = new SPSite(properties.Web.Site.ID))
     {
      using (SPWeb sPWeb = sPSite.OpenWeb(properties.Web.ID))
      {
       eventChangeRecord = EventCache.Instance.HandleChange(sPWeb, properties);
      }
     }
    });
    if (eventChangeRecord != null && eventChangeRecord.SocialPostCreationData != null)
    {
     FeedNotificationUtils.AddSocialPostNotification(properties.Web, eventChangeRecord.SocialPostCreationData);
    }
   }
  }
  finally
  {
   base.EventFiringEnabled = eventFiringEnabled;
  }
 }

 public override void ItemAdded(SPItemEventProperties properties)
 {
  this.HandleEvent(properties);
 }

 public override void ItemDeleted(SPItemEventProperties properties)
 {
  this.HandleEvent(properties);
 }

 public override void ItemUpdated(SPItemEventProperties properties)
 {
  this.HandleEvent(properties);
 }

 ...
}

(There are also pre events ItemAdding, ItemUpdating, ItemDeleting, but they are not relevant to this post.) Let's now see what happens inside EventCache.Instance.HandleChange. Among other actions, it iterates through the internal handlers collection and calls the HandleEvent method for each handler in this collection:

public EventChangeRecord HandleChange(SPWeb web, SPItemEventProperties properties)
{
 ...
  BaseCommunityEventHandler[] baseCommunityEventHandlerArray = this.handlers;
  for (int i = 0; i < (int)baseCommunityEventHandlerArray.Length; i++)
  {
   BaseCommunityEventHandler baseCommunityEventHandler = baseCommunityEventHandlerArray[i];
   if (baseCommunityEventHandler.HandledTemplateType == (int)list.BaseTemplate || baseCommunityEventHandler.HandledTemplateType == BaseCommunityEventHandler.HandleAllTemplateTypes)
   {
    baseCommunityEventHandler.HandleEvent(properties, eventChangeRecord);
   }
  }
  ...
 }
 ...
}

Now let's see which exact handlers are added to this collection:

private void InitializeHandlers()
{
 BaseCommunityEventHandler[] discussionListCommunityEventHandler = new BaseCommunityEventHandler[] { new DiscussionListCommunityEventHandler(), new CategoriesListCommunityEventHandler(), new ReputationCommunityEventHandler(), new MembersListCommunityEventHandler(), new BadgesListCommunityEventHandler(), new CommunityNotificationsEventHandler() };
 this.handlers = discussionListCommunityEventHandler;
}

So there are quite a few handlers which implement different features of community sites (reputation, membership, badges, etc). One of them is CommunityNotificationsEventHandler – this is the exact handler which sends the email notification on a reply in the discussion board:

internal class CommunityNotificationsEventHandler : BaseCommunityEventHandler
{
 internal override void HandleEvent(SPItemEventProperties properties, EventChangeRecord record)
 {
  ...
  SPList list = record.GetList("properties");
  if (record.EventType == SPEventReceiverType.ItemAdded)
  {
    FeedNotificationUtils.SendEmailNotificationOnReply(record.Web, sPListItem, listItem);
  }
  ...
 }
}

It calls the internal FeedNotificationUtils.SendEmailNotificationOnReply() method which sends the actual email:

internal static class FeedNotificationUtils
{
 ...
 public static bool SendEmailNotificationOnReply(SPWeb communityWeb, SPListItem topic, SPListItem reply)
 {
  bool flag = false;
  try
  {
   SPUser author = FeedNotificationUtils.GetAuthor(communityWeb, topic);
   if (FeedNotificationUtils.ShouldSendReplyNotification(communityWeb, reply, author))
   {
    UserProfile userProfile = CommonFeedNotificationUtils.GetUserProfile(communityWeb, author);
    if (userProfile != null && (userProfile.get_EmailOptin() & 64) == 0)
    {
     UserProfileApplicationProxy proxy = UserProfileApplicationProxy.GetProxy(CommonFeedNotificationUtils.GetServiceContext(communityWeb));
     string mySitePortalUrl = proxy.GetMySitePortalUrl(ServerApplication.get_CurrentUrlZone(), userProfile.get_PartitionID());
     using (SPSite sPSite = new SPSite(mySitePortalUrl))
     {
      MailMessage mailMessage = null;
      try
      {
       string mySiteEmailSenderName = proxy.GetMySiteEmailSenderName(userProfile.get_PartitionID());
       mailMessage = FeedNotificationUtils.CreateReplyNotificationMailMessage(sPSite.RootWeb, mySiteEmailSenderName, communityWeb, author, topic, reply, mySitePortalUrl);
       using (SPSmtpClient sPSmtpClient = new SPSmtpClient(communityWeb.Site))
       {
        flag = SPMailMessageHelper.TrySendMailMessage(sPSmtpClient, mailMessage);
        if (!flag)
        {
         ...
        }
       }
      }
      finally
      {
       if (mailMessage != null)
       {
        SPMailMessageHelper.DisposeAttachmentStreams(mailMessage);
        mailMessage.Dispose();
       }
      }
     }
    }
   }
  }
  catch (Exception exception1)
  {
   ...
  }
  return flag;
 }
}

So as you can see, reply emails in discussion boards are implemented via event handlers. At the end, let's also mention that it is possible to disable reply emails in a discussion board – but it will also disable other community features like ratings (i.e. users won't be able to like replies). In order to do that, go to discussion board list settings > Rating settings and set "Allow items in this list to be rated" to No:

It will call the internal method ReputationHelper.DisableReputation() which removes CommunityEventReceiver from the discussion board:

internal static class ReputationHelper
{
 ...
 internal static void DisableReputation(SPList list)
 {
  ReputationHelper.HideAllReputationFields(list);
  ReputationHelper.SetExperience(list, string.Empty, false);
  if (list.BaseTemplate == SPListTemplateType.DiscussionBoard)
  {
   List<SPView> sPViews = new List<SPView>();
   foreach (SPView view in list.Views)
   {
    sPViews.Add(view);
   }
   Guid[] contentReputationPopularityFieldId = new Guid[] { CommunitiesConstants.ContentReputation_Popularity_FieldId, CommunitiesConstants.ContentReputation_DescendantLikesCount_FieldId, CommunitiesConstants.ContentReputation_DescendantRatingsCount_FieldId, CommunitiesConstants.ContentReputation_LastRatedOrLikedBy_FieldId };
   FunctionalityEnablers.RemoveFieldsFromViews(contentReputationPopularityFieldId, list, sPViews);
   CommunityUtils.RemoveEventReceiver(list, typeof(CommunityEventReceiver).FullName);
   foreach (SPView sPView in sPViews)
   {
    if (sPView.JSLink != null)
    {
     sPView.JSLink = sPView.JSLink.Replace("|sp.ui.communities.js", "");
    }
    sPView.Update();
   }
  }
 }
}

This is how email notifications work in Sharepoint discussion boards. Hope this information will help you in your work.

Thursday, July 18, 2019

Problem with Copy-PnPFile cmdlet and File Not Found error

The Sharepoint PnP PowerShell library has many useful commands which simplify scripting against Sharepoint Online. One of them is Copy-PnPFile, which allows copying a file from one document library to another. Source and target doclibs may even be located in different site collections of the same tenant.

One of the examples of Copy-PnPFile (the 1st example, in fact) says that it is possible to copy a single file like this:

Copy-PnPFile -SourceUrl Documents/company.docx -TargetUrl /sites/otherproject/Documents/company.docx

Unfortunately, currently it gives a File Not Found error. It looks like there is a bug in Copy-PnPFile which prevents it from working correctly – it is also described in this StackOverflow thread which was created a couple of days ago: Copy-PnPFile returns File Not Found.

The only working way I've found so far is to copy the whole root folder:

Copy-PnPFile -SourceUrl Documents -TargetUrl /sites/otherproject/Documents -SkipSourceFolderName

It will copy all files from the Documents doclib to the Documents doclib in another site collection /sites/otherproject. However, it will also try to copy the OTB list view AllItems.aspx as the last file, and this will give an error that AllItems.aspx already exists in the target doclib. In order to ignore this error I used the following solution:

$err = $null
Copy-PnPFile -SourceUrl Documents -TargetUrl /sites/otherproject/Documents -SkipSourceFolderName -ErrorAction SilentlyContinue -ErrorVariable err
# rethrow only if the error is not about the already existing AllItems.aspx view
# (the variable is named $err to avoid clashing with the automatic $Error variable)
if ($err -and !$err.Exception.Message.ToLower().Contains("allitems.aspx")) {
    throw $err
}

I.e. it will rethrow the error only if the message doesn't contain an allitems.aspx occurrence. Hope it will help someone.

Monday, July 15, 2019

Problem with not editable host.json file for Azure function app

Sometimes we need to manually modify the content of the host.json file of an Azure function app (e.g. to change logging settings). In order to do that, you need to go to Azure function app > Platform features > Function app settings > host.json. However, you may face a situation where the textarea which shows the content of host.json is readonly:

In this case, change the Function app edit mode setting on the same page from "Read Only" to "Read/Write". After that you will be able to edit the content of the host.json file for your Azure function app:

Friday, July 12, 2019

Get all Azure AD groups where current user is a member transitively via Graph API


As you probably know, we may get all groups where a user is a member using the memberOf endpoint:

GET /users/{id | userPrincipalName}/memberOf

This endpoint returns only those groups where the user was added as a direct member. I.e. if the user was added to GroupA and this GroupA was then added to GroupB, it will return only GroupA but not GroupB. However, we often need to get all groups where the user is a member transitively. Fortunately, it is also possible with another endpoint, getMemberGroups:

POST /users/{id | userPrincipalName}/getMemberGroups

Until recently it was available only in Graph API itself – not in the .Net Graph client library. Fortunately, starting with version 1.16, the Microsoft.Graph.User class got a new TransitiveMemberOf property:

Using this property we may get all groups where the user is a member transitively. It supports paging, so in order to get all groups we also need to iterate through pages. Here is a code example which does that:

private static List<Guid> GetUserGroupsTransitively(string userPrincipalName)
{
 try
 {
  var graph = new GraphServiceClient(new AzureAuthenticationProvider());
  var groups = graph.Users[userPrincipalName].TransitiveMemberOf.Request().GetAsync().Result;
  if (groups == null)
  {
   return new List<Guid>();
  }

  var result = new List<Guid>();
  while (groups.Count > 0)
  {
   foreach (var group in groups)
   {
    result.Add(new Guid(group.Id));
   }

   if (groups.NextPageRequest != null)
   {
    groups = groups.NextPageRequest.GetAsync().Result;
   }
   else
   {
    break;
   }
  }

  return result;
 }
 catch (Exception x)
 {
  // error handling; return an empty list so the method compiles and callers get a safe default
  return new List<Guid>();
 }
}

Monday, July 1, 2019

Sharepoint MVP 2019

I got a very exciting email from MS today: I've got the MVP award in the Office Apps & Services category. Although it is the 9th award for me, it definitely has its own place in my professional life and I'm very glad that MS recognizes my contribution to community life with this award. Last year a lot of work was done related to MS Graph API and OfficeDevPnP. Many issues, workarounds and solutions were discussed during the last year on forums, in blog post comments, github issues, etc. Also I continue maintenance of the Camlex library which simplifies creation of dynamic CAML queries for developers. Nowadays new technologies appear very often and they allow us to do things which were not possible before. This is great, but developers' lives don't become easier because of that – we get new challenges in our work every day. From this perspective the community's role is crucial – I can't count how many times I myself found solutions for technical challenges in blog posts, forums, code samples, etc. Knowing from practice how important this work is, I also try to share my findings, ideas and solutions with the community. Thank you MS and thank you dear readers of my blog for being with me this year. Looking forward to the new year with its own interesting challenges and inventive solutions.

Tuesday, June 25, 2019

How to get localized field titles via CSOM in Sharepoint

Some time ago I wrote an article which shows how to localize web part titles via CSOM. If you are not familiar with it, I recommend reading it before continuing, as it has useful information about the Sharepoint MUI feature in general: Localize web part titles via client object model in Sharepoint. In the current post I will show how to get localized field titles via CSOM in Sharepoint. This technique can be used both in on-prem and online versions.

Currently CSOM has the Field.TitleResource property and it looks suitable when you want to get the localized title of some field. However this is not the case. In order to get localized field titles you still have to use the Field.Title property, but with a little tuning of the ClientContext – similar to the one mentioned in the article above. More specifically, you need to specify the target language via the "Accept-Language" HTTP header associated with the ClientContext and then request field titles (of course, assuming that the fields have these localized titles provisioned; see e.g. Provision multilingual sites with PnP templates to see how to provision multilingual field titles using PnP templates). Here is the code which shows this concept:

string siteUrl = "...";
string clientId = "...";
string clientSecret = "...";
int lcid = ...;
using (var ctx = new OfficeDevPnP.Core.AuthenticationManager().GetAppOnlyAuthenticatedContext(siteUrl, clientId, clientSecret))
{
 ctx.PendingRequest.RequestExecutor.WebRequest.Headers["Accept-Language"] = new CultureInfo(lcid).Name;
 ctx.Load(ctx.Web);
 ctx.Load(ctx.Web.Fields, f => f.Include(c => c.Id, c => c.Title));
 ctx.ExecuteQueryRetry();
 ...
}

As a result, when you iterate through site columns retrieved this way, they will contain titles localized for the language specified by the lcid parameter.

Thursday, June 13, 2019

Use pagination with Sharepoint search API

Often search requests return a lot of data. It may be impractical to show all this data at once on the page – performance may suffer, the page may be overloaded with data, etc. In order to address these issues we may use pagination, i.e. get data in chunks. But as often happens with Sharepoint, there are special considerations related to pagination in the search API.

If we check the documentation of the SharePoint Search REST API, we will find several properties which affect pagination logic:

  • StartRow: The first row that is included in the search results that are returned. You use this parameter when you want to implement paging for search results.
  • RowLimit: The maximum number of rows overall that are returned in the search results. Compared to RowsPerPage, RowLimit is the maximum number of rows returned overall.
  • RowsPerPage: The maximum number of rows to return per page. Compared to RowLimit, RowsPerPage refers to the maximum number of rows to return per page, and is used primarily when you want to implement paging for search results.

So based on this description, we may assume that the most obvious way to get paginated data is to use StartRow and RowsPerPage:

var searchUrl = "https://{tenant}.sharepoint.com/_api/search/query?querytext='" + query + "'&selectproperties='Title'&startRow=" + startRow + "&rowsPerPage=" + pageSize;

It will work, but with one condition: the page size must be less than the default page size, which is 10 items per page. If the page size is greater than the default page size it won't work: in this case you have to use RowLimit, which will work as the page size even though the documentation says otherwise:

var searchUrl = "https://{tenant}.sharepoint.com/_api/search/query?querytext='" + query + "'&selectproperties='Title'&startRow=" + startRow + "&rowLimit=" + pageSize;

This approach allows implementing pagination with a page size bigger than the default search page size (10 items per page); see the sketch below.
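As an illustration, a minimal paging loop over this REST endpoint may look like the following sketch ($headers is assumed to be prepared elsewhere and to contain a valid Authorization header plus Accept: application/json;odata=verbose):

$pageSize = 50
$startRow = 0
do {
    $url = "https://{tenant}.sharepoint.com/_api/search/query?querytext='test'" +
        "&selectproperties='Title'&startRow=$startRow&rowLimit=$pageSize"
    $response = Invoke-RestMethod -Uri $url -Headers $headers
    $rows = $response.d.query.PrimaryQueryResult.RelevantResults.Table.Rows.results
    # process $rows here...
    $startRow += $pageSize
} while ($rows.Count -eq $pageSize)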

Thursday, June 6, 2019

Grant permissions and trust SharePoint app automatically via PowerShell

In the Sharepoint app model we may need to grant permissions to a Sharepoint app on the AppInv.aspx page by providing appropriate permissions request xml. If permissions are granted on the Tenant level, you need to open AppInv.aspx in the context of Central admin, i.e. https://{tenant}-admin.sharepoint.com:

It was historically quite painful to automate this process, as automatic permissions grant is not currently possible. There were attempts to automate O365 login and the trust process using COM automation in PowerShell (using New-Object -com internetexplorer.application): https://github.com/wulfland/ScriptRepository/blob/master/Apps/Apps/Deploy-SPApp.ps1. With this approach the script opens the AppInv.aspx page and simulates user input.

However, the O365 login experience has changed since this script was implemented and there is no guarantee that it won't change further. Also there may be several login scenarios:

  • the user may already be logged in if they chose Remember credentials during a previous login
  • the user may use MFA with SMS, an authenticator app or something else, which makes login automation even more complicated

Keeping that in mind, I implemented the following semi-automatic way of granting app permissions and trusting the app:

1. The app is registered in Azure AD via PowerShell (in Sharepoint Online it is not necessary to register the app which will be used for communicating with Sharepoint via AppRegNew.aspx; you may also register it in Azure Portal > App Registrations). See e.g. Create an Azure Active Directory Application and Key using PowerShell for an example.

2. Then the script opens the AppInv.aspx page in IE and asks the user to authenticate him/herself manually. After that the user returns to the script and presses Enter – all other steps (granting permissions and trusting the app) are performed by the following PowerShell script:

function Trust-SPAddIn {
    [CmdletBinding(SupportsShouldProcess=$true)]
    [OutputType([int])]
    Param
    (
        [Parameter(Mandatory=$true, Position=0)]
        [string]$AppInstanceId,

        [Parameter(Mandatory=$true, Position=1)]
        [string]$WebUrl,

        [parameter(Mandatory=$true, Position=2)] 
        [string]$UserName, 

        [parameter(Mandatory=$true, Position=3)] 
        [string]$Password
    )

    $ie = New-Object -com internetexplorer.application
    try {
  Log-Warn ("Script will now open $WebUrl. Please authenticate yourself and wait until Admin Center home page will be loaded.")
  Log-Warn ("After that leave Admin Center window opened (don't close it), return to the script and follow provided instructions.")
  Log-Warn ("In case you are already signed in Admin Center window will be opened without asking to login. In this case wait until Admin Center window will be loaded, leave it opened and return to the script.")
  if (-not $silently) {
   Log-Warn ("Press Enter to open $WebUrl...")
   Read-Host
  }
 
        $ie.Visible = $true
        $ie.Navigate2($WebUrl)
  
  if (-not $silently) {
   Log-Warn ("Wait until Admin Center window will be fully loaded and press Enter to continue installation")
   Log-Warn ("Don't close Admin Center window - script will close it automatically")
   Read-Host
  }
  
  $authorizeURL = "$($WebUrl.TrimEnd('/'))/_layouts/15/appinv.aspx"
  Log-Info ("Open $authorizeURL...")
  $ie.Visible = $false
  $ie.Navigate2($authorizeURL)
  WaitFor-IEReady $ie -initialWaitInSeconds 3

  Log-Info ("Grant permissions to the app...")
  $appIdInput = $ie.Document.getElementById("ctl00_ctl00_PlaceHolderContentArea_PlaceHolderMain_IdTitleEditableInputFormSection_ctl01_TxtAppId")
  $appIdInput.value = $AppInstanceId
  $lookupBtn = $ie.Document.getElementById("ctl00_ctl00_PlaceHolderContentArea_PlaceHolderMain_IdTitleEditableInputFormSection_ctl01_BtnLookup")
  $lookupBtn.Click()
  WaitFor-IEReady $ie -initialWaitInSeconds 3
  Log-Info ("Step 1 of 2 done")
  $appIdInput = $ie.Document.getElementById("ctl00_ctl00_PlaceHolderContentArea_PlaceHolderMain_TitleDescSection_ctl01_TxtPerm")
  $appIdInput.value = '<AppPermissionRequests AllowAppOnlyPolicy="true"><AppPermissionRequest Scope="http://sharepoint/content/tenant" Right="FullControl" /></AppPermissionRequests>'
  $createBtn = $ie.Document.getElementById("ctl00_ctl00_PlaceHolderContentArea_PlaceHolderMain_ctl01_RptControls_BtnCreate")
  $createBtn.Click()
  WaitFor-IEReady $ie -initialWaitInSeconds 3
  Log-Info ("Step 2 of 2 done")

  Log-Info ("Trust the app...")
  $trustBtn = $ie.Document.getElementById("ctl00_ctl00_PlaceHolderContentArea_PlaceHolderMain_BtnAllow")
  $trustBtn.Click()
  WaitFor-IEReady $ie -initialWaitInSeconds 3

  Log-Info ("All steps are done")
    }
    finally {
        $ie.Quit()
    } 
}

The WaitFor-IEReady helper method is taken from the original script mentioned above, so credits go to its author:

function WaitFor-IEReady {
    [CmdletBinding()]
    Param
    (
        [Parameter(Mandatory=$true, ValueFromPipelineByPropertyName=$true, Position=0)]
        $ie,

        [Parameter(Mandatory=$false, Position=1)]
        $initialWaitInSeconds = 1
    )

    sleep -Seconds $initialWaitInSeconds

    while ($ie.Busy) {

        sleep -milliseconds 50
    }
}

Log-Info and Log-Warn are basic logger methods; you may implement them as needed for your scenario. Since we delegated login to the end user, we don't need to handle different O365 login scenarios and the script is greatly simplified, e.g. there is no need to perform javascript interactions which work not very stably via COM automation.

Monday, May 20, 2019

One reason why SPFarm.CurrentUserIsAdministrator may return false when current user is farm administrator

Sharepoint farm administrators are powerful users who may perform administrative actions in the Sharepoint farm (see SharePoint Farm Administrator account for more details). Sometimes you may need to check whether or not the current user is a farm admin in order to allow or disallow specific actions. You may do that using the SPFarm.CurrentUserIsAdministrator method. The problem, however, is that it works as expected only in the context of the Central administration web application (i.e. if the page which calls this method is running in Central administration). If you try to call this method from a regular Sharepoint web application, it will always return false, even if the current user is added to the Farm administrators group on your farm. You may add users to Farm administrators using Central administration > Manage the farm administrators group:

In this case (when you need to call SPFarm.CurrentUserIsAdministrator from a regular Sharepoint web application) you have to use the overload of this method with a boolean parameter:

allowContentApplicationAccess
true to make the check work in the content port of a Web application; otherwise, false.

i.e. like this:

bool isFarmAdmin = SPFarm.Local.CurrentUserIsAdministrator(true);

In this case the method will work as expected, i.e. return true if the user is a member of the Farm administrators group and false otherwise. A quick way to verify this is shown below.
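The same behavior may be quickly verified e.g. from a PowerShell console on the farm server (a simple sketch):

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue
# true enables the check outside of the Central administration context
[Microsoft.SharePoint.Administration.SPFarm]::Local.CurrentUserIsAdministrator($true)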