Wednesday, October 29, 2014

Speaking at the Finland Sharepoint user group

Today I spoke at the Finland Sharepoint UG meeting about using WatiN to work around the limitations of sandbox solutions and the client object model in Sharepoint. It was the first virtual event of the Finland SP UG which I participated in, and it went very well. Thanks to all attendees and to the other speakers (the famous Gunnar Peipman and Spencer Harbar) for their interesting talks and this meeting. And special thanks to Jussi Roine for organizing the event. Update: the presentation from my talk is available on Slideshare: Using WatiN in Sharepoint.

Tuesday, October 21, 2014

One way to localize fields and web part titles in Sharepoint Online

If you have tried to localize fields, content types or web part titles in Sharepoint Online, you already know that it is not a trivial process. The main problem is that we can't deploy farm solutions, which may install files on the file system, to Sharepoint Online. As a result we can't put custom resx files into the 15/Resources folder and use them in the declarations of our artifacts with the special syntax "$Resources:filename,key".

However, it is still possible to localize some frequently used strings like Documents, Meetings, Events, Calendar and others with minimal effort. Just search for the needed string across the OTB Sharepoint resx files which are installed into the 15/Resources folder by the basic Sharepoint or language pack installation. If you find such a string, you can still use it in your artifacts in the regular way. For example, the localized Documents string may look like "$Resources:cmscore,ListNameDocuments", because cmscore.resx and its language-specific versions contain translations of the Documents string.
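For example, a custom site column provisioned from a sandbox solution could reference such an OTB resource directly in its schema. Here is a hypothetical sketch (the field ID, name and group below are made up for illustration):

```xml
<Field ID="{D6E47A44-0C24-4E2B-9F5C-1A2B3C4D5E6F}"
       Name="ProjectDocuments"
       Type="Text"
       Group="Custom Columns"
       DisplayName="$Resources:cmscore,ListNameDocuments" />
```

The DisplayName is then resolved from the language-specific version of cmscore.resx which matches the site's UI language.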

Of course, with this approach you are limited to strings which are present in the OTB Sharepoint resource files, but it is better than nothing. In order to use custom resource files for localizing fields and content types in Sharepoint Online, a much more complicated technique is needed. I will write about one such technique in a future post.

Saturday, October 18, 2014

Fix managed metadata fields after updating a sandbox solution using the client object model in Sharepoint

In one of my previous posts (see Reactivate Web-scoped features from PowerShell using client object model in Sharepoint Online) I wrote about a problem which you will face when you update a sandbox solution: all web-scoped features on existing sites are deactivated, so you need to activate them back after the wsp update. In this post I will write about one more issue related to updating sandbox solutions: after the update, the bindings of all managed metadata fields to their term sets become broken. I.e. the fields become greyed out and you can't select any value for them using the managed metadata picker.

In another post, Provision managed metadata term sets and fields for Sharepoint Online using client object model, I showed how to provision managed metadata fields and term sets and bind them to each other via the client object model. I also wrote that it is important to assign IDs to the term sets explicitly because they are used for binding. These ids will also help to fix the fields' bindings after a wsp update.

The main idea is that after each wsp update you need to call again the Bind-Managed-Metadata-Field function which was shown in the previous article:

function Bind-Managed-Metadata-Field($ctx, $termStoreId, $fieldId, $termSetId)
{
    $rootWeb = $ctx.Web
    $fields = $rootWeb.Fields
    $ctx.Load($fields)
    $ctx.ExecuteQuery()

    try
    {
        $field = $fields.GetById($fieldId)
        # in CSOM a "field not found" error is only raised when the request
        # is executed, so load the field inside the try block
        $ctx.Load($field)
        $ctx.ExecuteQuery()
    }
    catch
    {
        Write-Host "Field" $fieldId "not found in site columns collection" -foregroundcolor red
        return
    }

    $taxField = [Microsoft.SharePoint.Client.ClientContext].GetMethod("CastTo").
        MakeGenericMethod([Microsoft.SharePoint.Client.Taxonomy.TaxonomyField]).
        Invoke($ctx, $field)
    $taxField.SspId = $termStoreId
    $taxField.TermSetId = $termSetId
    $taxField.TargetTemplate = ""
    $taxField.AnchorId = [System.Guid]::Empty
    $taxField.UpdateAndPushChanges($true)
    $ctx.ExecuteQuery()
    Write-Host "Field" $fieldId "was successfully bound to termset" $termSetId -foregroundcolor green
}

# Get-TermStore is defined in the previous article
function Bind-Managed-Metadata-Fields($ctx, $xmlFilePath)
{
    Write-Host "Binding managed metadata fields to term sets" -foregroundcolor green
    [xml]$xmlContent = (Get-Content $xmlFilePath)
    if (-not $xmlContent)
    {
        Write-Host "Xml was not loaded successfully. Fields won't be bound to term sets" -foregroundcolor red
        return
    }
    $termStore = Get-TermStore $ctx
    $groups = $termStore.Groups
    $ctx.Load($groups)
    $ctx.ExecuteQuery()
    $group = $groups | Where-Object {$_.Name -eq $xmlContent.Group.Name}
    if (-not $group)
    {
        Write-Host "Group" $xmlContent.Group.Name "not found" -foregroundcolor red
        return
    }

    Bind-Managed-Metadata-Field $ctx $termStore.Id "{field id 1}" "{term set id 1}"
    Bind-Managed-Metadata-Field $ctx $termStore.Id "{field id 2}" "{term set id 2}"
    ...
    Bind-Managed-Metadata-Field $ctx $termStore.Id "{field id n}" "{term set id n}"
}

$context = New-Object Microsoft.SharePoint.Client.ClientContext($siteURL)
$context.AuthenticationMode = [Microsoft.SharePoint.Client.ClientAuthenticationMode]::Default
$securePassword = ConvertTo-SecureString $password -AsPlainText -Force
$credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($username, $securePassword)
$context.Credentials = $credentials

Bind-Managed-Metadata-Fields $context "termsets.xml"

This example uses the same xml file structure as the previous article. After that, your managed metadata fields will work again after an update of the sandbox solution.
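For reference, a minimal termsets.xml compatible with the script above might look like the sketch below. The exact structure is defined in the previous article; apart from the Group root element and its Name (which the script actually reads), the element and attribute names here are assumptions:

```xml
<Group Name="MyTermGroup">
  <TermSet Name="Departments" Id="{term set id 1}" />
  <TermSet Name="Locations" Id="{term set id 2}" />
</Group>
```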

Monday, October 13, 2014

Camlex 4.0 and Camlex.Client 2.0 are released: list joins and field projections

I have good news for Sharepoint developers who use the Camlex library in their projects: today Camlex 4.0 and Camlex.Client 2.0, with support for list joins and field projections, were released. Let's check the msdn example List Joins and Projections and see how it looks with Camlex syntax. Suppose that we have 4 lists:

  • Orders
  • Members
  • Cities
  • States

with the following left joins:

  • Orders to Members
  • Members to Cities
  • Cities to States

In order to create a CAML query with list joins we need to specify values for the SPQuery.Joins and SPQuery.ProjectedFields properties. Suppose that we need to create a query for all orders made by customers from London, UK. Here is how it is done using Camlex syntax:

var query = new SPQuery();

query.Query = Camlex.Query().Where(x => (string)x["CustomerCity"] == "London" &&
    (string)x["CustomerCityState"] == "UK").ToString();

query.Joins = Camlex.Query().Joins()
    .Left(x => x["CustomerName"].ForeignList("Customers"))
    .Left(x => x["CityName"].PrimaryList("Customers").ForeignList("CustomerCities"))
    .Left(x => x["StateName"].PrimaryList("CustomerCities").ForeignList("CustomerCityStates"))
    .ToString();

query.ProjectedFields = Camlex.Query().ProjectedFields()
    .Field(x => x["CustomerCity"].List("CustomerCities").ShowField("Title"))
    .Field(x => x["CustomerCityState"].List("CustomerCityStates").ShowField("Title"))
    .ToString();

query.ViewFields = Camlex.Query().ViewFields(x => new[] {x["CustomerCity"],
    x["CustomerCityState"]});

Here we have 4 Camlex calls for the Query, Joins, ProjectedFields and ViewFields properties. Let's see what CAML is generated for each property:


Query:

<Where>
  <And>
    <Eq>
      <FieldRef Name="CustomerCity" />
      <Value Type="Text">London</Value>
    </Eq>
    <Eq>
      <FieldRef Name="CustomerCityState" />
      <Value Type="Text">UK</Value>
    </Eq>
  </And>
</Where>


Joins:

<Join Type="LEFT" ListAlias="Customers">
  <Eq>
    <FieldRef Name="CustomerName" RefType="Id" />
    <FieldRef List="Customers" Name="Id" />
  </Eq>
</Join>
<Join Type="LEFT" ListAlias="CustomerCities">
  <Eq>
    <FieldRef List="Customers" Name="CityName" RefType="Id" />
    <FieldRef List="CustomerCities" Name="Id" />
  </Eq>
</Join>
<Join Type="LEFT" ListAlias="CustomerCityStates">
  <Eq>
    <FieldRef List="CustomerCities" Name="StateName" RefType="Id" />
    <FieldRef List="CustomerCityStates" Name="Id" />
  </Eq>
</Join>


ProjectedFields:

<Field Name="CustomerCity" Type="Lookup" List="CustomerCities" ShowField="Title" />
<Field Name="CustomerCityState" Type="Lookup" List="CustomerCityStates" ShowField="Title" />


ViewFields:

<FieldRef Name="CustomerCity" />
<FieldRef Name="CustomerCityState" />

As you can see, the joins syntax is quite straightforward:

Joins().Left(x => x[...].ForeignList(...))
// or
Joins().Left(x => x[...].PrimaryList(...).ForeignList(...))

The first syntax (without specifying the primary list) is used when the primary list is the same list against which the query is made; the second when it is a different list. The syntax for projected fields is also self-descriptive:

.Field(x => x[...].List(...).ShowField(...))

i.e. you need to specify values for the attributes of the resulting Field element.

For joins and projected fields only fluent interfaces are used to support multiple elements (.Join().Join()… or .Field().Field()…), i.e. there are no methods which receive a collection of parameters in order to get multiple Join or Field elements in the resulting CAML. Such methods simplify dynamic creation of the parameters at runtime, when their number is not known at compile time. This approach is used in Camlex for Where and ViewFields, but I can hardly imagine scenarios where dynamic creation of join chains will be needed. If someone needs it, I will think about adding it to the library; meanwhile only fluent interfaces for joins and projected fields will be used.
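For comparison, this is what the dynamic approach looks like for Where, where conditions are collected at runtime. This is a sketch based on Camlex's WhereAll method, which combines all passed conditions with And; the field names here are hypothetical:

```csharp
using System;
using System.Collections.Generic;
using System.Linq.Expressions;
using CamlexNET;
using Microsoft.SharePoint;

// conditions built at runtime, e.g. from user-selected filters;
// their number is not known at compile time
var conditions = new List<Expression<Func<SPListItem, bool>>>();
conditions.Add(x => (string)x["CustomerCity"] == "London");
conditions.Add(x => (string)x["CustomerCityState"] == "UK");

// WhereAll joins all conditions with <And> in the resulting CAML
string caml = Camlex.Query().WhereAll(conditions).ToString();
```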

The reverse version of the new features (which allows conversion in the opposite direction, from CAML to C#) is not implemented yet. It will take some time to implement, so I decided to release it later. Practically this means that if you try, for example, to convert Joins or ProjectedFields from CAML to C# Camlex syntax on the online service (this service was launched in order to help Sharepoint developers who are not familiar with Camlex start using it), it won't work currently. But I'm going to implement it in the next release, so it will work after that.

If you have any comments or feature requests, please post them to the discussions section of the Camlex project site.

Wednesday, October 8, 2014

T-SQL script for deleting a big amount of records from a table in a SQL Server database

Sometimes we need to delete a big amount of records from a table in a SQL Server database. If the number of records is quite big (several millions), a simple delete command will hang:

delete from Foo

In this case we can use the following approach: delete records in chunks (e.g. 100000 rows per operation; the right chunk size depends on how many columns your table has and how much data it contains, so you may need to adjust this number for your scenario) until all records are deleted. For example, imagine that we have a [Log] table used for logging some operations:

create table [Log](
    [Id] [int] identity(1,1) not null,
    [ThreadId] [int] not null,
    [EventType] [nvarchar](32) null,
    [Message] [nvarchar](max) null,
    [Created] [datetime] not null
)

And suppose that this table grows quite fast and we need to periodically delete old log records from it (e.g. older than 2 weeks). It can be done by creating a job in SQL Server Agent. For this job, create a step which runs the following code:

declare @s datetime
set @s = GETDATE() - 14
declare @cnt int

set @cnt = (select count(*) from [Log] where Created < @s)
while @cnt > 0
begin
    delete top (100000)
    from [Log]
    where Created < @s

    set @cnt = (select count(*) from [Log] where Created < @s)
end

This script deletes records from the Log table in chunks of 100000 rows until all records older than 2 weeks are deleted, and it won't hang because of the big amount of data.
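As a possible variation (an untested sketch), the repeated count(*) scans can be avoided by checking @@ROWCOUNT after each chunked delete instead:

```sql
declare @s datetime
set @s = GETDATE() - 14

-- first chunk; @@ROWCOUNT will hold the number of deleted rows
delete top (100000) from [Log] where Created < @s

-- keep deleting chunks until a delete removes no rows
while @@ROWCOUNT > 0
begin
    delete top (100000) from [Log] where Created < @s
end
```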

Saturday, October 4, 2014

Provision custom web parts from sandbox solution and add them on page declaratively in Sharepoint

We all know how to add custom web parts to a publishing page declaratively in a farm solution. In order to do it, first of all we need to create a module and provision the page itself. In the <File> element for the page we may add web parts (OTB or custom) using the <AllUsersWebPart> element:

<Module Name="Home" Url="$Resources:osrvcore,List_Pages_UrlName;" Path="">
  <File Path="default.aspx" Url="update.aspx" Type="GhostableInLibrary">
    <Property Name="Title" Value="Update" />
    <Property Name="PublishingPageLayout"
      Value="~SiteCollection/_catalogs/masterpage/Foo.aspx, Foo;" />
    <Property Name="ContentType"
      Value="$Resources:cmscore,contenttype_welcomepage_name;" />
    <Property Name="PublishingPageContent" Value="" />
    <AllUsersWebPart WebPartZoneID="BottomLeftZone" WebPartOrder="1">
      <![CDATA[
<webParts>
  <webPart xmlns="http://schemas.microsoft.com/WebPart/v3">
    <metaData>
      <type name="Foo.MyWebPart, ..." />
      <importErrorMessage>Cannot import this Web Part.</importErrorMessage>
    </metaData>
    <data>
      <properties>
        <property name="Title" type="string">Update helper web part</property>
      </properties>
    </data>
  </webPart>
</webParts>
      ]]>
    </AllUsersWebPart>
  </File>
</Module>

However, the same approach won't work in sandbox solutions, because in this case the assembly is not installed to the GAC, and instead of the web part you will see an error message saying that the type of your web part is not found. In order to provision custom web parts declaratively from a sandbox solution we need to add a <Solution SolutionId="…" xmlns="http://schemas.microsoft.com/sharepoint/" /> element under the <metaData> element. The SolutionId attribute should contain the id of your sandbox solution, which you may find in Visual Studio with the solution opened: Package > Package.package > Manifest > copy the Solution ID from there. Here is how the module should look in order to work with a sandbox solution:

<Module Name="Home" Url="$Resources:osrvcore,List_Pages_UrlName;" Path="">
  <File Path="default.aspx" Url="update.aspx" Type="GhostableInLibrary">
    <Property Name="Title" Value="Update" />
    <Property Name="PublishingPageLayout"
      Value="~SiteCollection/_catalogs/masterpage/Foo.aspx, Foo;" />
    <Property Name="ContentType"
      Value="$Resources:cmscore,contenttype_welcomepage_name;" />
    <Property Name="PublishingPageContent" Value="" />
    <AllUsersWebPart WebPartZoneID="BottomLeftZone" WebPartOrder="1">
      <![CDATA[
<webParts>
  <webPart xmlns="http://schemas.microsoft.com/WebPart/v3">
    <metaData>
      <type name="Foo.MyWebPart, ..." />
      <importErrorMessage>Cannot import this Web Part.</importErrorMessage>
      <Solution SolutionId="..."
        xmlns="http://schemas.microsoft.com/sharepoint/" />
    </metaData>
    <data>
      <properties>
        <property name="Title" type="string">Update helper web part</property>
      </properties>
    </data>
  </webPart>
</webParts>
      ]]>
    </AllUsersWebPart>
  </File>
</Module>

After that your web part will be shown on the page.