Saturday, March 29, 2014

Problem with SEO properties for pages with friendly URLs in Sharepoint 2013

Sharepoint 2013 has out-of-the-box (OTB) SEO features which allow you to optimize your sites for search engines. In order to use them you need to activate the SearchEngineOptimization feature. After that the “Edit SEO properties” command will appear in the ribbon on your publishing pages:

image

The problem is that if your friendly URL is equal to the real site URL (which is not bad, because it means you can switch to structural URLs and your links will still work: http://example.com/test will be automatically resolved to http://example.com/test/pages/default.aspx), then the “Edit SEO properties” command will be disabled:

image

However it is still possible to edit SEO properties using a direct URL. E.g. suppose we are on the http://example.com/test site. In order to edit its SEO properties, open the following URL in the browser:

http://example.com/test/_layouts/15/SEOProperties.aspx?Source=/test/pages/default.aspx&page=1

There are several important points:

  1. The URL should be relative to the current site (http://example.com/test), not e.g. to the root site of the site collection;
  2. The Source query string parameter should contain the server-relative real URL of the page (/test/pages/default.aspx).
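The two points above can be combined into a tiny PowerShell sketch which composes such a URL (the helper name and the sample values are mine, not part of Sharepoint):

```powershell
# Hypothetical helper: composes the direct URL of the SEO properties page
# from the current site URL and the server-relative real URL of the page.
function Get-SeoPropertiesUrl([string]$siteUrl, [string]$pageUrl) {
    "{0}/_layouts/15/SEOProperties.aspx?Source={1}&page=1" -f $siteUrl.TrimEnd('/'), $pageUrl
}

Get-SeoPropertiesUrl "http://example.com/test" "/test/pages/default.aspx"
# → http://example.com/test/_layouts/15/SEOProperties.aspx?Source=/test/pages/default.aspx&page=1
```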

If you did everything correctly, the SEO properties page will open, and after saving, the changes will be applied to the page regardless of whether the real or friendly URL is used in the browser:

image

Hope that this workaround will help you.

Startup code in Orchard module

When you work with a CMS it is better to play by its rules. E.g. in Sharepoint the basic way to add customizations is a feature, which allows you to add custom functionality to the site. In Orchard the basic way to apply customizations is a module. There may be many custom modules in your project, and the more independent each module is from the others, the easier maintenance will be.

In a module you may also implement custom functionality like controllers, views and models. Some of this functionality may require initialization code (like adding a custom model binder), which in a regular ASP.Net MVC application goes to the Application_Start() method in global.asax.cs. Orchard has its own global.asax, but it is located in the common Orchard.Web project, which is not related to your module. Is there a way to add initialization code which will be executed on startup for the module? Yes, and it is quite simple. We need to add a class which inherits from the Autofac.Module class to the module’s project (Autofac is the IoC library used in Orchard) and override its Load method:

public class ContactFormModule : Module
{
    private bool isInitialized = false;

    protected override void Load(ContainerBuilder builder)
    {
        // Load may be called more than once, so guard the
        // initialization code with a flag to run it only once
        if (!this.isInitialized)
        {
            ModelBinders.Binders.Add(typeof(FooModel), new FooModelBinder());
            this.isInitialized = true;
        }
    }
}

And that’s basically all. Orchard will automatically wire it up and call it during application initialization.

Saturday, March 22, 2014

Problem with non-latest publishing version being used in cross-site publishing in Sharepoint 2013

In one of our previous projects with cross-site publishing we faced strange behavior: Sharepoint showed a non-latest publishing version from the search index in a Content by search web part. Here is the scenario: we used the classic architecture with a single authoring web application and multiple publishing sites:

image

On publishing sites managed metadata was used in order to show different authoring content on different publishing sites (e.g. a Country term set and different publishing sites for different countries). In order to simplify content creation on the authoring site, a custom ribbon action was implemented for the Pages doclib which copied publishing pages within the same doclib on the authoring site. I.e. a page created for all languages was copied, then translators translated this copy into other languages and changed the Country managed metadata to the appropriate language. They also removed this language from the original page’s metadata:

image

It worked quite well, i.e. English content was shown on the English publishing site, Russian on the Russian one. But some time after the site went to production we found a problem: on a publishing site, if the user changed the browser language, the Content by search web part showed the original English version of the page, which was targeted at both languages. But as I wrote above, after translators translated the page they removed their language from the original page (Page 1 in the picture above) and published it with a major version.

We used continuous crawl, which worked quite well in other scenarios, but the problem still reproduced several hours after it was found, i.e. continuous crawl didn’t fix it. We tried incremental and full crawls after that, also without result. This behavior means the following:

1. Browser language is used for filtering content in Content by search web parts. This is not a new finding; I wrote about how to avoid it here: Configure content by search web parts in Sharepoint 2013 to not use client language.

2. Sharepoint may show content from the search index which is not the last published version. I.e. even if greater major versions exist for the page, a Content by search web part may still show a previous version. This looks like a bug, because only the latest published content should be used in the search index.

In order to fix the problem we changed our Content by search web parts to not use the client language, as described in the article above. After that the Content by search web parts started to use the specified language instead of the Accept-Language http header of the client’s browser, but the original problem with the outdated version was still not resolved.

Hope that this information will be useful if you face the same problem when working with cross-site publishing in Sharepoint.

Thursday, March 20, 2014

Provision Sharepoint 2013 site collection with cross-site publishing

If you have worked with Sharepoint 2007 or 2010, you know that in most cases it is possible to fully automate provisioning with scripts. However, when you provision a site collection in Sharepoint 2013 with cross-site publishing, there are a few additional steps which require full crawls and are harder to automate. In this post I will describe the installation order for site collections with cross-site publishing.

Here are the basic steps:

1. Create new web applications in Central administration: authoring, publishing and possibly assets.

2. Deploy custom wsps if needed. They may contain custom web templates used for creating sites.

3. Create site collections in the following order:

3.1. Assets site first, as in most cases it is the simplest. It is used as storage for images, attachments, documents and other assets. Often standard site templates may be used for it;

3.2. Authoring site – it should go second because we need to crawl it in order to be able to connect to the catalog from the publishing site (see below). After the site collection is created we need to configure the Search service application:

3.2.1. Create a new content source in Central administration > Manage service applications > Search service application > Content sources. Also ensure that the authoring site is not in the Local SharePoint sites content source. This makes the crawl configuration more flexible and independent for authoring and publishing sites: e.g. you can use incremental crawl for the publishing site and continuous crawl for authoring.

3.2.2. If you need to exclude some parts of the authoring site from crawling, add appropriate crawl rules on the Search service application page.

3.2.3. Important step: for each document library which is published as a catalog we need to create some content and specify values in all metadata fields for which we need managed properties in the search index (and which can then be used in search queries). It is important to have non-empty values in all such fields, otherwise managed properties won’t be created.

3.2.4. Run full crawl for the authoring site content source.
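Steps 3.2.1 and 3.2.4 can also be scripted. A minimal sketch for the SharePoint 2013 Management Shell, assuming a single Search service application in the farm; the content source name and URL are placeholders:

```powershell
$ssa = Get-SPEnterpriseSearchServiceApplication

# 3.2.1: separate content source for the authoring web application
$cs = New-SPEnterpriseSearchCrawlContentSource -SearchApplication $ssa `
    -Name "Authoring" -Type SharePoint `
    -StartAddresses "http://authoring.example.com" `
    -SharePointCrawlBehavior CrawlVirtualServers

# 3.2.4: start the full crawl for this content source
$cs.StartFullCrawl()
```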

3.2.5. After the full crawl go to Search service application > Search index > Crawled properties and ensure that for each field in your custom content types (from document libraries which are published as catalogs) there are 2 crawled properties (see the following article, which explains why 2 crawled properties are created for each metadata property: Problem with not crawled managed metadata fields in Sharepoint 2013).

3.2.6. For each crawled property configure a managed property. It is possible to do this with a PowerShell script: PowerShell script for creating and mapping managed properties in Search service application in Sharepoint.
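For completeness, here is a hedged sketch of step 3.2.6; the property names ("CountryManagedProperty", "ows_Country") are examples, not taken from a real project, and the linked post contains a full script:

```powershell
$ssa = Get-SPEnterpriseSearchServiceApplication

# Create a text managed property (Type 1 = Text)
$mp = New-SPEnterpriseSearchMetadataManagedProperty -SearchApplication $ssa `
    -Name "CountryManagedProperty" -Type 1

# Map it to the corresponding crawled property from the SharePoint category
$cat = Get-SPEnterpriseSearchMetadataCategory -SearchApplication $ssa -Identity SharePoint
$cp = Get-SPEnterpriseSearchMetadataCrawledProperty -SearchApplication $ssa `
    -Name "ows_Country" -Category $cat
New-SPEnterpriseSearchMetadataMapping -SearchApplication $ssa `
    -ManagedProperty $mp -CrawledProperty $cp
```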

3.2.7. Run full crawl again.

3.2.8. Enable continuous crawl if needed.

3.3. Publishing site. At this phase we have already made a full crawl and catalog connections should be available (in Site settings > Manage catalog connections; see also Restore Sharepoint site with configured cross-site publishing on different environment for details), so we may connect to them during provisioning of the publishing site. It can be done programmatically, e.g. by using the CatalogSubscriber from this msdn sample: http://msdn.microsoft.com/en-us/library/microsoft.sharepoint.publishing.catalogconnectionmanager.aspx.
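If you prefer scripting over the C# sample, the same connection can be sketched in PowerShell through the publishing object model. This is an assumption-laden sketch: the URLs are placeholders, and I assume PublishingCatalogUtility.GetPublishingCatalog returns the connection settings for an already-crawled catalog:

```powershell
$site = Get-SPSite "http://publishing.example.com"

# Read the connection settings of the (already crawled) authoring catalog
$settings = [Microsoft.SharePoint.Publishing.PublishingCatalogUtility]::GetPublishingCatalog(
    $site, "http://authoring.example.com/test/Pages")

# Add and persist the catalog connection on the publishing site collection
$manager = New-Object Microsoft.SharePoint.Publishing.CatalogConnectionManager($site)
$manager.AddCatalogConnection($settings)
$manager.Update()
$site.Dispose()
```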

Also, if you need to configure managed properties for the publishing site as well, make steps similar to those described for the authoring site.

This installation order will help you when you need to provision a site collection in Sharepoint 2013 with cross-site publishing. Hope that it will be useful.

Sunday, March 9, 2014

Problem with DotNetOpenAuth logging when log4net is located in GAC

DotNetOpenAuth is a popular open source library which simplifies implementation of your own OAuth server (it also has other functionality, like OpenID). The first thing which you will need when using it is a logger. DotNetOpenAuth may work with log4net – a popular logging library for .Net. If it doesn’t find the log4net assembly, it uses a simple trace logger which writes records to the trace output (which you can read e.g. with the DebugView utility). By default the trace logger is switched off. In order to enable it you need to add the following section to your web.config file:

<system.diagnostics>
   <switches>
      <!-- "1" gives error messages, "2" gives errors
         and warnings, "3" gives more detailed error information,
         and "4" gives verbose trace information -->
      <add name="DotNetOpenAuth.Messaging" value="4" />
   </switches>
</system.diagnostics>

The trace logger may be good for local development environments, but it is not very useful in production, where you don’t always monitor logs with external utilities. For production environments log4net is a more suitable choice.

The problem, however, is that when the log4net assembly is installed to the GAC, DotNetOpenAuth can’t find it and thus uses the trace logger (the issue was found in the all-in-one DotNetOpenAuth assembly of version 4.3.0.0). First of all, in this case we need to specify fully qualified assembly names in the web.config sections which configure log4net:

<configSections>
  <section name="log4net"
        type="log4net.Config.Log4NetConfigurationSectionHandler, log4net,
        Version=1.2.11.0, Culture=neutral, PublicKeyToken=669e0ddf0bb1aa2a"
        requirePermission="false" />
</configSections>

<log4net>
  <appender name="FileAppender" type="log4net.Appender.FileAppender">
    <file value="log-file.txt" />
    <appendToFile value="true" />
    <layout type="log4net.Layout.PatternLayout, log4net,
        Version=1.2.11.0, Culture=neutral, PublicKeyToken=669e0ddf0bb1aa2a">
        <conversionPattern value="%d [%t] %-5level %l [%p{NDC}] - %m%n" />
    </layout>
  </appender>
  <root>
    <level value="ALL" />
    <appender-ref ref="FileAppender" />
  </root>
  <logger name="DotNetOpenAuth">
    <level value="ALL" />
  </logger>
</log4net>

But this is not enough. Another problem comes from the DotNetOpenAuth.Loggers.Log4NetLogger.IsLog4NetPresent property:

private static bool IsLog4NetPresent
{
    get
    {
        try
        {
            Assembly.Load("log4net");
            return true;
        }
        catch (FileNotFoundException)
        {
            return false;
        }
    }
}

The Assembly.Load("log4net") call throws FileNotFoundException. In order to fix the problem, attach a custom handler to the AppDomain.AssemblyResolve event (e.g. in Application_Start in Global.asax.cs or in an http module; in the latter case ensure that it is done only once) and specify the fully qualified assembly name:

private void Application_Start(object sender, EventArgs e)
{
    AppDomain.CurrentDomain.AssemblyResolve += CurrentDomain_AssemblyResolve;
}

private static Assembly CurrentDomain_AssemblyResolve(object sender, ResolveEventArgs args)
{
    if (args.Name == "log4net")
    {
        return Assembly.Load("log4net, Version=1.2.11.0, " +
            "Culture=neutral, PublicKeyToken=669e0ddf0bb1aa2a");
    }
    return null;
}

After that DotNetOpenAuth logging should work.

Saturday, March 8, 2014

Multilingual site on Orchard CMS with single codebase and content database depending on domain name

When we need to create a multilingual site on Orchard we have several options:

  1. Create the site in the first (basic) language, then copy the content database and translate all content. In this case we will have 2 separate databases and will need to maintain them separately, although the codebase can be the same (however, you will still need to create separate sites in IIS in order to specify different connection strings in \App_Data\Sites\Default\Settings.txt). You will also need a new content database for each new language;
  2. After creating the site in the basic language, go to Admin panel > Settings and add the necessary languages to the list of available languages (via the “Add or remove supported cultures for the site” link). After that, in Modules, install and enable the Localization module, which allows creating language variations for each content item, like pages. There will be a “New translation” link under each page:

image

Using another module called “Culture layer” you may create new layers in Widgets for separate languages. E.g. this is how you may create layers for the front page for the Russian and English languages:

Name                 Rule
TheHomepage ru-RU    url("~/") and lang("ru-RU")
TheHomepage en-US    url("~/") and lang("en-US")

(the lang rule is added by the “Culture layer” module). After that you may add different widgets and define localized content in different layers.

Now when you change the language in Settings > Default site culture, you will see localized content. But in this case you still need a different site in IIS and a different copy of the content database for each language, differing only in Default site culture, which isn’t good. A better way would be to select the site language automatically based on the domain name of the site: when we visit the site via http://example.ru it uses Russian, and via http://example.com – English, while both host headers point to a single IIS site. Is this possible with Orchard? Yes, it is.

In order to do it we will need to create a new module for our site. Into the Content folder of our module we will put a text file with a list of top-level domains and the appropriate locale names:

ru:ru-RU;com:en-US

Then we create a new class which implements the ICultureSelector interface from the Orchard framework. It will parse the text file with locales and, depending on the current host name, choose the correct one:

public class CultureSelector : ICultureSelector
{
    private const string DEFAULT_CULTURE = "ru-RU";
    private const int CULTURE_PRIORITY = 1000;

    public CultureSelectorResult GetCulture(HttpContextBase context)
    {
        var defaultCulture = new CultureSelectorResult
        {
            Priority = CULTURE_PRIORITY,
            CultureName = DEFAULT_CULTURE
        };
        try
        {
            var locales = this.getLocales();
            if (!locales.Any())
            {
                return defaultCulture;
            }

            // take the top-level domain of the current host, e.g. "ru" or "com"
            string host = context.Request.Url.Host;
            int idx = host.LastIndexOf(".");
            if (idx < 0)
            {
                return defaultCulture;
            }
            var currentDomain = host.Substring(idx + 1).ToLower();
            var currentLocale = locales.FirstOrDefault(l => l.Domain == currentDomain);
            if (currentLocale == null)
            {
                return defaultCulture;
            }
            return new CultureSelectorResult
            {
                Priority = CULTURE_PRIORITY,
                CultureName = currentLocale.Locale
            };
        }
        catch
        {
            return defaultCulture;
        }
    }

    // parses locales.txt: "domain:locale" pairs separated by ";"
    private List<dynamic> getLocales()
    {
        string path = HttpContext.Current.Server.MapPath(
            "~/Modules/Example.Localization/Content/locales.txt");
        string locales = File.ReadAllText(path);
        if (string.IsNullOrEmpty(locales))
        {
            return new List<dynamic>();
        }

        string[] pairs = locales.Split(new[] { ';' },
            StringSplitOptions.RemoveEmptyEntries);
        if (!pairs.Any())
        {
            return new List<dynamic>();
        }

        var result = new List<dynamic>();
        foreach (string pair in pairs)
        {
            int idx = pair.IndexOf(":");
            if (idx < 0)
            {
                continue;
            }

            result.Add(new
            {
                Domain = pair.Substring(0, idx),
                Locale = pair.Substring(idx + 1)
            });
        }
        return result;
    }
}

Now if we open our site via the ru domain, we will see Russian content, and via the com domain – English.