Wednesday, September 27, 2023

Problem with the datetime format used in the %date% environment variable in Windows scheduled tasks

Recently I faced an interesting problem. Suppose you have the Russian date format set in Windows (Control Panel > Region > Formats) and create a new Windows scheduled task which runs the following cmd script:

echo %date%

It will use the Russian date format, as it should. Now we change the OS date format to English. If we run the same script manually, it immediately uses the new format. But if we run our existing scheduled task, it still uses the old Russian date format. Even if we restart the OS and run the scheduled task again, the result will be the same: it will still use the old Russian date format.

The solution I have found so far is to recreate the scheduled task (export it, delete it and then import it back). After that it will use the current (English) date format.
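The export/delete/import roundtrip can also be done from the command line with schtasks (a sketch; "MyTask" is a placeholder for the actual task name):

```
schtasks /query /tn "MyTask" /xml > MyTask.xml
schtasks /delete /tn "MyTask" /f
schtasks /create /tn "MyTask" /xml MyTask.xml
```

After the import the recreated task picks up the current regional settings for %date%.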

Wednesday, September 20, 2023

Use GitHub secrets to restore NuGet packages from a private package source with authentication in a Dockerfile via GitHub Actions

If you use a private NuGet package source with authentication together with Docker in your project, you may need to restore packages from this custom source inside the Dockerfile. In this post I will show how to use GitHub secrets for that when you build a Docker image via the docker/build-push-action GitHub action.

First of all, in the yaml file of our GitHub action we need to pass references to the necessary secrets to the build action using the following syntax:

name: Build
id: docker_build
uses: docker/build-push-action@v5
with:
  ...
  secrets: |
    "NUGET_USERNAME=${{ secrets.NUGET_USERNAME }}"
    "NUGET_PWD=${{ secrets.NUGET_PWD }}"

After that, in the Dockerfile we fetch the passed secrets (during the image build they are available as files under the /run/secrets/... path) and store them in environment variables using the export command. Then we add our private package source with username and password (using dotnet nuget add source). Once that is done we can run the "dotnet restore" command, which restores project dependencies including those coming from the private NuGet source:

COPY Foo.csproj src/

RUN --mount=type=secret,id=NUGET_USERNAME \
	--mount=type=secret,id=NUGET_PWD \
	export NUGET_USERNAME=$(cat /run/secrets/NUGET_USERNAME) && \
	export NUGET_PWD=$(cat /run/secrets/NUGET_PWD) && \
	dotnet nuget add source https://my-private-packages-source/index.json --name FooPackages --username "${NUGET_USERNAME}" --password "${NUGET_PWD}" --store-password-in-clear-text

RUN dotnet restore "src/Foo.csproj" /p:IsDockerBuild=true

Note that it is important to chain the commands which export the environment variables and the commands which use them into one single RUN instruction: each RUN instruction is executed in its own shell, so exported variables do not survive between instructions. If you try to use these variables in a separate RUN instruction, "nuget add source" will report "Package source with Name: ... added successfully", but then "dotnet restore" will fail with a confusing error:

Error NU1301: Unable to load the service index for source

But if everything is done as described above, your project dependencies should be restored successfully during the Docker image build.
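For local testing the same Dockerfile can be built outside of GitHub Actions by passing the secrets to BuildKit directly (a sketch; assumes a recent Docker version with BuildKit enabled and that the variables are set in your shell):

```
export NUGET_USERNAME=myuser
export NUGET_PWD=mypassword
docker build \
  --secret id=NUGET_USERNAME,env=NUGET_USERNAME \
  --secret id=NUGET_PWD,env=NUGET_PWD \
  -t foo .
```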

Tuesday, August 15, 2023

Problem with MissingManifestResourceException for embedded resx files in subfolders

If you have worked with .NET applications you most probably know what embedded resources are:

Such resources are embedded directly into the output assembly. We may check them e.g. with a decompiler:


Sometimes we may need to change the namespace of the generated resources: e.g. in the example above the Foo.resx file is located in the Resources subfolder, so by default the strongly typed class will be generated in the TestApp.Resources namespace:

Note also the line where the ResourceManager object is created - it also uses the TestApp.Resources namespace. Now in the properties of the resx file we explicitly specify Custom tool namespace = TestApp:

After that the namespace of the automatically generated class will change from TestApp.Resources to TestApp. However, the ResourceManager will still be created with the same old namespace TestApp.Resources:

We may modify the .Designer.cs file manually and change the namespace of the ResourceManager to TestApp (note that if the custom tool runs again it will overwrite the manual changes in the cs file, and they will have to be made again):

But if we then try to get a string from the generated class Foo.Bar we will get a MissingManifestResourceException:

Unhandled exception. System.Resources.MissingManifestResourceException: Could not find the resource "TestApp.Foo.resources" among the resources "TestApp.Resources.Foo.resources" embedded in the assembly "TestApp", nor among the resources in any satellite assemblies for the specified culture. Perhaps the resources were embedded with an incorrect name.

The problem is that after our changes the resx file is still embedded into the assembly as TestApp.Resources.Foo.resources:

In order to fix this error we need to edit the csproj file and add a LogicalName element with the correct name under the EmbeddedResource element for our resx file:

<EmbeddedResource Update="Resources\Foo.resx">
  <Generator>ResXFileCodeGenerator</Generator>
  <LastGenOutput>Foo.Designer.cs</LastGenOutput>
  <CustomToolNamespace>TestApp</CustomToolNamespace>
  <LogicalName>TestApp.Foo.resources</LogicalName> 
</EmbeddedResource>

After that the resource will be embedded into the assembly with the correct name TestApp.Foo.resources:

and the exception will be gone.

Tuesday, August 8, 2023

Camlex and Camlex.Client 5.4.2 released

A new version 5.4.2 of the Camlex library has been released. Starting with this version it is possible to generate CAML queries with the string operators BeginsWith and Contains for the ContentTypeId field type. E.g. the following C# code:

Camlex.Query().Where(x => ((DataTypes.ContentTypeId)x["ContentTypeId"]).StartsWith("0x123")).ToString(true);

will generate the following CAML query:

<Query>
  <Where>
    <BeginsWith>
      <FieldRef Name="ContentTypeId" />
      <Value Type="ContentTypeId">0x123</Value>
    </BeginsWith>
  </Where>
</Query>

This is useful because when you add a site content type to a SharePoint list, under the hood SharePoint creates an inherited content type (its ContentTypeId is the ContentTypeId of the parent site content type, followed by "00" and a guid without dashes, so e.g. a child of content type 0x0101 gets an id starting with 0x010100), and it is this inherited content type which is then used for list items created in this list. In order to fetch all items created with the original site content type we may use a CAML query with the BeginsWith operator and the ContentTypeId of the parent site content type.

As usual, the new version is available via NuGet.

Tuesday, July 4, 2023

Generate database schema for MySQL using NHibernate hbm2ddl tool

NHibernate has the hbm2ddl tool which can automatically export the database tables schema based on provided mappings. I.e. we may take some POCO class:

public class User
{
    public virtual int Id { get; set; }
    public virtual string FirstName { get; set; }
    public virtual string LastName { get; set; }
}

and define a mapping for it like this:

public class UserMap : ClassMap<User>
{
    public UserMap()
    {
        Table("[User]");
        Id(x => x.Id, "UserId");
        Map(x => x.FirstName);
        Map(x => x.LastName);
     }
}

and then, based on this mapping, hbm2ddl will generate the following SQL code for creating the User table:

create table `User` (
    UserId INTEGER NOT NULL AUTO_INCREMENT,
    FirstName TEXT,
    LastName TEXT,
    primary key (UserId)
);

That is convenient because we don't need to maintain the database schema separately - any change in the C# models and mappings will be automatically reflected in the db schema. However, in order to use hbm2ddl with MySQL we need to add a few tweaks to the export code:

  • MySQL uses backticks instead of square brackets
  • MySQL syntax requires a semicolon after each statement in the generated SQL

First of all we need to create a Fluent NHibernate configuration for MySQL:

var config = Fluently.Configure()
    .Database(
        MySQLConfiguration.Standard
            .ConnectionString(connectionString)
            .AdoNetBatchSize(100)
            .DoNot.ShowSql()
    )
    .Mappings(cfg =>
    {
        cfg.FluentMappings.AddFromAssemblyOf<UserMap>()
            .Conventions.Setup(mappings =>
            {
                mappings.AddAssembly(typeof(UserMap).Assembly);
            });
    });

With this Fluent NHibernate config, the export code will look like this:

var nhConfig = ... // see above
var export = new SchemaExport(nhConfig);
var sb = new StringBuilder();
export.Create(schema =>
{
    schema = schema.Replace("[", "`").Replace("]", "`");
    if (schema.EndsWith("\r\n"))
    {
        schema = schema.Substring(0, schema.Length - 2) + ";\r\n";
    }
    else
    {
        schema += ";";
    }
    sb.Append(schema);
}, false);

Console.WriteLine(sb.ToString());

Here we replace square brackets with backticks and add a semicolon to the end of each generated statement. After that we will have valid SQL code with the database schema in MySQL syntax.

Thursday, June 22, 2023

Implement TAP/multithread friendly logging scopes for Microsoft.Extensions.Logging.ILogger

Some time ago I wrote a post about how to implement a custom logger which writes logs to an Azure storage blob container: Custom logger for .Net Core for writing logs to Azure BLOB storage. This logger implements the ILogger interface from the Microsoft.Extensions.Logging namespace. It works quite well but doesn't support logging scopes:

public IDisposable BeginScope<TState>(TState state) => default!;

Logging scopes are quite useful - they allow you to attach additional information to log records, e.g. from which method each record was added:

public void Foo()
{
    using (logger.BeginScope("Outer scope"))
    {
        ...
        using (logger.BeginScope("Inner scope"))
        {
        }
    }
}

An important requirement for logging scopes is that they should work properly with the Task-based asynchronous pattern (TAP) and in multithreaded code (which is widely used nowadays). For that we will use the AsyncLocal<T> class from .NET. The scopes themselves will form a linked list (child-parent relation).

To implement it we will create a LogScopeProvider class which implements the Microsoft.Extensions.Logging.IExternalScopeProvider interface (as a base example for this custom LogScopeProvider I used code from the Microsoft.Extensions.Logging.Console namespace):

public class LogScopeProvider : IExternalScopeProvider
{
    private readonly AsyncLocal<LogScope> currentScope = new AsyncLocal<LogScope>();

    public object Current => this.currentScope.Value?.State;

    public LogScopeProvider() {}

    public void ForEachScope<TState>(Action<object, TState> callback, TState state)
    {
        void Report(LogScope current)
        {
            if (current == null)
            {
                return;
            }
            Report(current.Parent);
            callback(current.State, state);
        }

        Report(this.currentScope.Value);
    }

    public IDisposable Push(object state)
    {
        LogScope parent = this.currentScope.Value;
        var newScope = new LogScope(this, state, parent);
        this.currentScope.Value = newScope;

        return newScope;
    }

    private class LogScope : IDisposable
    {
        private readonly LogScopeProvider provider;
        private bool isDisposed;

        internal LogScope(LogScopeProvider provider, object state, LogScope parent)
        {
            this.provider = provider;
            State = state;
            Parent = parent;
        }

        public LogScope Parent { get; }

        public object State { get; }

        public override string ToString()
        {
            return State?.ToString();
        }

        public void Dispose()
        {
            if (!this.isDisposed)
            {
                this.provider.currentScope.Value = Parent;
                this.isDisposed = true;
            }
        }
    }
}

Note that LogScopeProvider stores the current scope in an AsyncLocal<LogScope>, which makes it safe to use in TAP code: the value flows with the async execution context. So e.g. if we have an await inside using(scope) it will be handled correctly:

public async Task Foo()
{
    using (logger.BeginScope("Outer scope"))
    {
        var result = await Bar();
        ...
    }
}

Now back to our BlobLogger: all we have to do is pass a LogScopeProvider to its constructor, add the current scope to the log record and return a new scope when one is requested:

public class BlobLogger : ILogger
{
    private const string CONTAINER_NAME = "custom-logs";
    private string connStr;
    private string categoryName;
    private LogScopeProvider scopeProvider;
 
    public BlobLogger(string categoryName, string connStr, LogScopeProvider scopeProvider)
    {
        this.connStr = connStr;
        this.categoryName = categoryName;
        this.scopeProvider = scopeProvider;
    }
 
    public void Log<TState>(LogLevel logLevel, EventId eventId, TState state, Exception? exception,
        Func<TState, Exception?, string> formatter)
    {
        if (!IsEnabled(logLevel))
        {
            return;
        }
 
        string scope = this.scopeProvider.Current as string;
        using (var ms = new MemoryStream(Encoding.UTF8.GetBytes($"[{this.categoryName}: {logLevel,-12}] {scope} {formatter(state, exception)}{Environment.NewLine}")))
        {
            var container = this.ensureContainer();
            var now = DateTime.UtcNow;
            var blob = container.GetAppendBlobClient($"{now:yyyyMMdd}/log.txt");
            blob.CreateIfNotExists();
            blob.AppendBlock(ms);
        }
    }
 
    private BlobContainerClient ensureContainer()
    {
        var container = new BlobContainerClient(this.connStr, CONTAINER_NAME);
        container.CreateIfNotExists();
        return container;
    }
 
    public bool IsEnabled(LogLevel logLevel) => true;
 
    public IDisposable BeginScope<TState>(TState state) => this.scopeProvider.Push(state);
}
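The BlobLogger above logs only the innermost scope via Current. If the full scope chain is needed in each record, the provider's ForEachScope method (which walks scopes from the outermost to the innermost) can be used instead - a sketch of a hypothetical helper that could be added to BlobLogger:

```
// Renders all active scopes as e.g. "Outer scope => Inner scope"
private string GetScopeChain()
{
    var sb = new StringBuilder();
    this.scopeProvider.ForEachScope((scope, builder) =>
    {
        if (builder.Length > 0)
        {
            builder.Append(" => ");
        }
        builder.Append(scope);
    }, sb);
    return sb.ToString();
}
```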

That's it: now our logger also supports logging scopes.

Saturday, June 17, 2023

Profile MySQL db with Neor Profile SQL

If you need to profile a MySQL db you may use the built-in MySQL shell profiler (CLI). There is also a free GUI alternative - Neor Profile SQL. However, there are a few things which should be done before using it.

When you launch Profile SQL it asks you to establish a connection using default parameters for localhost:

If you click the Test button you may get a "Test is failed" error even with correct credentials for the root user. In order to avoid it you need to enable the native MySQL password for the root user:

ALTER USER 'root'@'localhost' IDENTIFIED WITH mysql_native_password BY '...';

After that the connection should be successful.

But that is not all. If you then run an application which connects to MySQL, you won't see its sessions and queries in the profiler, because the profiler works as a proxy with its own port (4040 by default):

and in order to collect data your application should connect to the profiler (not to MySQL directly). I.e. we need to change the port in the connection string from the MySQL port (3306 by default) to the profiler port (4040):

{
  "ConnectionStrings": {
    "Default": "Server=localhost;Port=4040;Database=Test;User ID=test;Password=..."
  }
}

If after that the connection fails with a System.IO.IOException (thrown from MySql.Data.Common.Ssl methods), add ";SSL Mode=None" to the connection string:

{
  "ConnectionStrings": {
    "Default": "Server=localhost;Port=4040;Database=Test;User ID=test;Password=...;SSL Mode=None"
  }
}

After that the application should connect to the profiler successfully and you should see the profiled data.