The LibLog library, which I described before, uses reflection magic to let libraries log without introducing a dependency on any particular logging framework. The problem with magic tricks is that they fail from time to time.

For me this happened when I tried to use ILMerge on the Dropcraft CLI tool and merge all the Dropcraft libraries, which use LibLog, and Serilog into one executable. As a result, the merged executable did not produce any logs. No exceptions, no warnings – just an empty screen.

After a short review of the LibLog code, I found the root cause: the Type.GetType() calls. LibLog uses GetType calls to probe the availability of the different logging frameworks, and it uses assembly-qualified type names, like Type.GetType("NLog.LogManager, NLog").

Here is the issue: in the ILMerged executable there is no NLog assembly. LibLog is not able to detect any logging framework and silently ignores all logging calls. The solution is easy – if the GetType call for an assembly-qualified type returns null, call GetType for the type name only.

Type.GetType("NLog.LogManager, NLog") ?? Type.GetType("NLog.LogManager");

After the change, the assembly works perfectly with and without merging. An example of the fully modified LibLog file is available in the Dropcraft repository.

Introducing Dropcraft, a NuGet-based app deployment and composition framework

The NuGet ecosystem has already outgrown the original idea of NuGet as a design-time package management tool. NuGet powers installers, helps to create auto-updateable applications and pluggable solutions. What the ecosystem has missed so far is a general-purpose library which abstracts the complex, not yet documented NuGet API and simplifies the development of NuGet-based solutions.

Welcome Dropcraft. Based on version 4 of NuGet, it provides a high-level package management API and enables various scenarios of using NuGet packages in applications. Going beyond package installation, update and uninstallation, Dropcraft includes a runtime API for discovering the installed packages, and a simple extensibility framework based on NuGet packages.

The main features of Dropcraft

Scenarios where Dropcraft may be useful

Get started

The easiest way to try Dropcraft is to use the dropcraft.exe command line tool. It is built using the public Dropcraft APIs and can serve as a framework usage example by itself.

dropcraft.exe install "bootstrap/3.0.0" --path "c:\DemoApp" -s "https://api.nuget.org/v3/index.json" --framework net461

This command instructs Dropcraft to install the bootstrap v3.0.0 package from NuGet.org to the c:\DemoApp folder. It automatically resolves all the dependencies, downloads the packages and installs them:

Installing packages...
0 product package(s) found and 1 new package(s) requested
Versions are confirmed
2 package(s) are resolved
        2 package(s) to install
        0 package(s) to update
bootstrap/3.0.0 downloaded
jQuery/1.9.0 downloaded
bootstrap/3.0.0 installed
jQuery/1.9.0 installed
Installation complete.

As a result, C:\DemoApp contains the Content, Scripts and fonts folders from the bootstrap and jQuery packages. Dropcraft followed the instructions and installed Bootstrap 3.0.0, which is pretty old. So, the following command will update it:

dropcraft.exe install "bootstrap" --path "c:\DemoApp" -s "https://api.nuget.org/v3/index.json" --framework net461
Installing packages...
2 product package(s) found and 1 new package(s) requested
Versions are confirmed
2 package(s) are resolved
        0 package(s) to install
        2 package(s) to update
bootstrap/3.3.7 downloaded
jQuery/1.9.1 downloaded
bootstrap/3.0.0 uninstalled
jQuery/1.9.0 uninstalled
bootstrap/3.3.7 installed
jQuery/1.9.1 installed
Installation complete.

Dropcraft automatically resolved the latest versions of the packages and upgraded them. Similarly, Dropcraft can install additional packages, and downgrade or uninstall existing ones.

Advanced scenarios

The previous example demonstrated the use of unmodified NuGet packages with Dropcraft. To enable more advanced scenarios, Dropcraft introduces an additional package manifest file, which can be included in the package by its author.

The package manifest notifies Dropcraft about the package’s initialization method, and allows packages to intercept various Dropcraft events and participate in the application composition.

Dropcraft defines a lightweight application composition model based on the concept of an extensibility point. Any package is able to define one or many extensibility points, which will be linked with the corresponding extensions exported by other packages. This allows chaining extensibility points and extensions, and building an application from packages like from Lego blocks.
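As an illustration of the concept only (this is not Dropcraft’s actual API – the interfaces below are hypothetical), an extensibility point and its extensions can be pictured as:

// Hypothetical illustration of the concept - not Dropcraft's actual API.
public interface IEditorCommand              // the extension contract
{
    string Caption { get; }
    void Execute();
}

public interface ICommandRegistry            // the extensibility point
{
    void RegisterCommand(IEditorCommand command);
}

// A package exports an extension by implementing the contract; the framework
// discovers it and hands it over to the matching extensibility point.
public class SaveCommand : IEditorCommand
{
    public string Caption => "Save";
    public void Execute() { /* save the current document */ }
}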

The Dropcraft WPF demo app demonstrates this concept. It consists of several NuGet packages which can be installed separately or all together. The first command installs a package with the common interfaces, an executable and the application’s main window. It uses two package sources – NuGet.org and MyGet.org:

dropcraft.exe install "Dropcraft.WpfExample.App" "Dropcraft.WpfExample.Shell" "Dropcraft.WpfExample.Common" --path "c:\DemoWPF" -s "https://www.myget.org/F/dropcraft/api/v3/index.json" -s "https://api.nuget.org/v3/index.json" --framework net461

Dropcraft Demo App

The resulting application is just an empty window. The next command adds some functionality by installing an extension – a text editor.

dropcraft.exe install "Dropcraft.WpfExample.Editor" --path "c:\DemoWPF" -s "https://www.myget.org/F/dropcraft/api/v3/index.json" -s "https://api.nuget.org/v3/index.json" --framework net461

Dropcraft Demo App

The text editor defines a new extensibility point – an editor command – and the Dropcraft.WpfExample.Commands package exports two commands. So the next step is to install it:

dropcraft.exe install "Dropcraft.WpfExample.Commands" --path "c:\DemoWPF" -s "https://www.myget.org/F/dropcraft/api/v3/index.json" -s "https://api.nuget.org/v3/index.json" --framework net461

Dropcraft Demo App

The final result is an application composed from packages, where all the packages are loosely coupled through interfaces and the composition is facilitated by Dropcraft. The framework takes care of the package initialization order, runtime extension registration and other scenarios common in pluggable applications.

Conclusion

Dropcraft provides APIs which applications can use to incorporate NuGet functionality. It enables a wide range of scenarios, from direct manipulation of NuGet packages, to package-based plugins and runtime application composition.

While release 0.2.1 is compiled for .NET 4.6.1, the Dropcraft libraries target .NET Standard and are going to support .NET Core in future releases. Similarly, future releases will support ASP.NET Core in addition to desktop applications.

Links for my Tech Summit talk about Desktop Bridge

Desktop Bridge, previously known as Project Centennial, is a technology which helps to bring existing desktop applications to the Universal Windows Platform. It allows you to manually create an .appx package, or to take an existing installer and convert it into an .appx. It also allows integrating parts of UWP into the desktop application, facilitating a gradual move from a classic desktop application to a UWP app.

This post contains links to the additional resources for my Tech Summit Toronto 2016 presentation:

Using the NuGet v3 libraries in your projects, part 2

The previous post demonstrated the use of RemoteDependencyWalker to discover all the dependencies of the target libraries. The provided sample will work as expected if the versions of all requested packages are perfectly aligned. However, this is rarely the case in real life. A user’s request may require installation of conflicting packages, and the application should be able to recognize and handle this situation.

With the provided code, the application will fail when any conflict appears. For the problematic packages, the meta information will not be resolved, and the resolve result for the associated GraphNode will be null.

The simplest approach to solving the conflicts is to use the Analyze() method of the GraphNode class. This method returns an analysis result which contains information about the issues. There are three types of issues – dependency cycles, version downgrades and versioning conflicts – and all of them are detected by GraphNode.Analyze().

While dependency cycles and versioning conflicts will most likely lead to an application failure, version downgrades can be handled. A downgrade means that a package is required with a version lower than the version of one of the resolved packages. In this situation, the dependency resolver adds both packages and marks the package with the lower version as downgraded. The application can use this information to decide on the next action: allow the downgrade or fail.
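A sketch of that check, assuming the NuGet 3.5-era API surface (an AnalyzeResult with Cycles, VersionConflicts and Downgrades collections), might look like this:

// Sketch: inspecting the analysis result after WalkAsync (NuGet 3.5-era API).
var analysis = graphNode.Analyze();

if (analysis.Cycles.Any() || analysis.VersionConflicts.Any())
{
    // Cycles and version conflicts cannot be resolved automatically.
    throw new InvalidOperationException("The package graph cannot be resolved");
}

foreach (var downgrade in analysis.Downgrades)
{
    // A downgrade pairs the two conflicting nodes; the application can
    // decide whether to accept the lower version or to fail.
    Console.WriteLine($"Downgrade detected: {downgrade.DowngradedFrom.Key} -> {downgrade.DowngradedTo.Key}");
}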

Slides for my MVP MIX Toronto 2016 talks

Talk: Using NuGet libraries in your application

NuGet is not only a Visual Studio extension or a command line application. It is also a set of libraries which can be used to manipulate NuGet packages programmatically. Do you have a unique CI process, beyond the expected NuGet workflow? Do you need your own way to propagate dependencies between subsystems? Or maybe you want to create a NuGet-based deployment process for end users? During this session you will learn about the main NuGet library concepts, see examples of embedded NuGet usage and hear some guidance to help you integrate NuGet with your own application.

The code demonstrated during the presentation is described in the Using the NuGet v3 libraries in your projects post and used in the Dropcraft project.

Slides: http://www.slideshare.net/LunarFrog/using-nuget-libraries-in-your-application

Talk: C# code sharing across the platforms

Portable, shared, .NET Standard libraries – there are so many options to choose from when you need to share some code between platforms. During this talk we will explore all the options and the differences between the library types. After the session you will have a solid understanding of the modern .NET library types and code sharing strategies, which you can apply to your next .NET Core, desktop or Xamarin project.

Slides: http://www.slideshare.net/LunarFrog/c-code-sharing-across-the-platforms

Using the NuGet v3 libraries in your projects

The NuGet 3 tool, as is expected from a package manager, is itself built using packages. These packages are published on the NuGet.org gallery and can be used by any application requiring NuGet-like features. Usage scenarios include plugins packaged as .nupkg, application content, package-based installers and others. Several projects, like Chocolatey or Wyam, already integrate NuGet for different purposes; however, for really wide adoption of the NuGet libraries, better API documentation is required.

This post demonstrates one way of incorporating the NuGet libraries in an application. Dave Glick, the author of Wyam, has a great introduction to the NuGet v3 APIs, and I recommend reading his posts before continuing; however, it is not required. The NuGet usage approach described in this post is different from the approach reviewed in the mentioned articles. When applied, it allows creating .NET Standard compatible libraries and incorporating the NuGet tooling not only in .NET Framework applications, but also in .NET Core solutions.

NuGet 3 uses a zillion libraries. Unlike NuGet 2, composed of just a few libraries, the NuGet 3 design is based on multiple small libraries. For example, the post’s sample code uses nine libraries. Another note about the API – it is still in development. This post is based on version 3.5.0-rc1-final of NuGet, and some APIs may change before the release.

The top-level NuGet libraries used by the solution are NuGet.DependencyResolver and NuGet.Protocol.Core.v3.

Workflow

The logical workflow is similar to the NuGet restore command, and from the developer’s perspective it includes the following phases:

Main concepts

Prepare package sources

The following code adds the official NuGet feed as a package source and registers the sources in the RemoteDependencyWalker’s context.

var resourceProviders = new List<Lazy<INuGetResourceProvider>>();
resourceProviders.AddRange(Repository.Provider.GetCoreV3());

var repositories = new List<SourceRepository>
{
    new SourceRepository(new PackageSource("https://api.nuget.org/v3/index.json"), resourceProviders)
};

var cache = new SourceCacheContext();
var walkerContext = new RemoteWalkContext();

foreach (var sourceRepository in repositories)
{
    var provider = new SourceRepositoryDependencyProvider(sourceRepository, _logger, cache, true);
    walkerContext.RemoteLibraryProviders.Add(provider);
}

Identify a list of packages to install

RemoteDependencyWalker accepts only one root library for which to calculate the dependencies. When there are multiple root target libraries, they should be wrapped inside a fake library, and IProjectDependencyProvider allows including the fake library in the dependency resolution process.

IProjectDependencyProvider defines the SupportsType method, which controls which library types are handled by the class, and the GetLibrary method, which is expected to return the library object.

The trick is to define the fake library as a LibraryDependencyTarget.Project and to accept only this type of library in the sample’s ProjectLibraryProvider class. So, when RemoteDependencyWalker asks for the instance of the fake library, it can be constructed with the list of targeted libraries as dependencies. For example, the following code assumes that two NuGet libs are the targeted libraries to install.

public Library GetLibrary(LibraryRange libraryRange, NuGetFramework targetFramework, string rootPath)
{
    var dependencies = new List<LibraryDependency>();

    dependencies.AddRange(new[]
    {
        new LibraryDependency
        {
            LibraryRange = new LibraryRange("NuGet.Protocol.Core.v3", VersionRange.Parse("3.0.0"), LibraryDependencyTarget.Package)
        },
        new LibraryDependency
        {
            LibraryRange = new LibraryRange("NuGet.DependencyResolver", VersionRange.Parse("3.0.0"), LibraryDependencyTarget.Package)
        },
    });

    return new Library
    {
        LibraryRange = libraryRange,
        Identity = new LibraryIdentity
        {
            Name = libraryRange.Name,
            Version = NuGetVersion.Parse("1.0.0"),
            Type = LibraryType.Project,
        },
        Dependencies = dependencies,
        Resolved = true
    };
}

Dependency discovery

When all the preparations are done, RemoteDependencyWalker can start discovering the dependencies:

walkerContext.ProjectLibraryProviders.Add(new ProjectLibraryProvider());

var fakeLib = new LibraryRange("FakeLib", VersionRange.Parse("1.0.0"), LibraryDependencyTarget.Project);
var frameworkVersion = FrameworkConstants.CommonFrameworks.Net461;
var walker = new RemoteDependencyWalker(walkerContext);

GraphNode<RemoteResolveResult> result = await walker.WalkAsync(
    fakeLib,
    frameworkVersion,
    frameworkVersion.GetShortFolderName(), RuntimeGraph.Empty, true);

foreach (var node in result.InnerNodes)
{
    await InstallPackageDependencies(node);
}

The provided code does more than just dependency discovery: it defines the supported .NET Framework version, and it iterates through the result to install the packages.

Installing the packages

And now the application is ready to install the discovered packages:

private readonly HashSet<LibraryRange> _installedPackages = new HashSet<LibraryRange>();

private async Task InstallPackageDependencies(GraphNode<RemoteResolveResult> node)
{
    foreach (var innerNode in node.InnerNodes)
    {
        if (!_installedPackages.Contains(innerNode.Key))
        {
            _installedPackages.Add(innerNode.Key);
            await InstallPackage(innerNode.Item.Data.Match);
        }

        await InstallPackageDependencies(innerNode);
    }
}

private async Task InstallPackage(RemoteMatch match)
{
    var packageIdentity = new PackageIdentity(match.Library.Name, match.Library.Version);

    var versionFolderPathContext = new VersionFolderPathContext(
        packageIdentity,
        @"D:\Temp\MyApp\",
        _logger,
        PackageSaveMode.Defaultv3,
        XmlDocFileSaveMode.None);

    await PackageExtractor.InstallFromSourceAsync(
        stream => match.Provider.CopyToAsync(
            match.Library,
            stream,
            CancellationToken.None),
        versionFolderPathContext,
        CancellationToken.None);
}

As a result of the execution, all resolved packages are de-duplicated and installed in the D:\Temp\MyApp\[package-name] subfolders. Each package subfolder includes the .nupkg, the .nuspec and the libraries for all supported frameworks.

And that’s it – the provided code demonstrates the whole workflow. There are tons of small details hidden behind this simple demo, but it should be enough for starting your own experiments. Feel free to comment if you have any questions.

Exploring .NET Open Source ecosystem: handling database schema versioning with FluentMigrator

One of the most common mistakes a junior database architect can make is to omit schema versioning. It is so easy to design a schema and release the corresponding application, only to realize later how difficult it is to maintain the schema, support compatibility between versions and migrate users to new versions.

However, even when the new schema includes a concept of a version, work is required to keep the schema in a healthy state, to have a migration procedure and to have some tooling to automate the maintenance tasks.

The FluentMigrator C# library provides all the tools needed to solve these problems: a syntax to define the versioned schema, a way to migrate databases from version to version, and tools to automate these tasks during development, deployment and in the field.

Schema

The core concept of FluentMigrator is a migration. A migration is a class which has a version and two methods, Up() and Down(). Up() is responsible for migrating the target database from the previous version to the version defined by the migration. Down() is responsible for the opposite operation – downgrading the database to the previous version.

[Migration(10)]
public class AddNotesTable : Migration
{
    public override void Up()
    {
        Create.Table("Notes")
            .WithIdColumn()
            .WithColumn("Body").AsString(4000).NotNullable()
            .WithTimeStamps()
            .WithColumn("UserId").AsInt32();
    }

    public override void Down()
    {
        Delete.Table("Notes");
    }
}

Instead of using SQL, migrations are defined using a fluent C# syntax. This approach makes the migrations almost independent from the concrete database, hiding the differences in SQL between them.

A migration’s version is defined using the MigrationAttribute. The attribute accepts a number, which is used by the migration runner to sort all the defined migrations and execute them one by one.

In addition to the schema definition, migrations can also include data seeding.

[Profile("Development")]
public class CreateDevData: Migration
{
      public override Up()
      {
            Insert.IntoTable("User").Row( new
                  {
                        Username = "devuser1",
                        DisplayName = "Dev User1"
                  });
      }

      public override Down()
      {
            // empty, not using
      }
}

This example also demonstrates the idea of profiles – the ability to selectively execute some migrations in order to have, for example, a seeded database for development or testing.

Execution

All migrations are usually grouped in one assembly and can be executed using the various provided tools. FluentMigrator provides CLI, NAnt, MSBuild and Rake migration runners.

Migrate.exe /connection "Data Source=db\db.sqlite;Version=3;" /db sqlite /target migrations.dll

This command demonstrates the usage of the CLI tool to execute the migrations from migrations.dll against the database defined via the connection string, using the sqlite driver. The runner automatically detects the current database version and applies only the required migrations.

FluentMigrator is published under Apache 2.0 license and available at GitHub and NuGet.

Exploring .NET Open Source ecosystem: logging from netstandard libraries using LibLog

Continuing the logging discussion, let’s talk about LibLog. Unlike other logging libraries, LibLog targets just one specific scenario – logging within reusable libraries.

Why does this scenario need special treatment? There are two general logging approaches for library developers: one is to choose a logging library and force all library consumers to use the same library; the other is to create a logging façade and ask the consuming application to create an adapter for the preferred logging library, like NLog or Serilog. Obviously, neither approach is very elegant, and LibLog provides a solution based on an optimized version of the second approach.

LibLog consists of just one file, which can be added to the project directly from the GitHub repository or via the corresponding NuGet package. In the case of the manual approach, LibLog.cs will require namespace editing (detailed instructions are provided in the file’s header comments).

After that, LibLog’s entry point, the LogProvider class, becomes available for use. By design, it is only visible to the assembly which contains LibLog.cs. If the library includes more than one assembly, InternalsVisibleToAttribute can be used to share the logging infrastructure across all the components of the library without exposing it to the consumers.
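For example, assuming the logging infrastructure lives in a hypothetical MyLibrary.Core assembly and needs to be shared with MyLibrary.Extensions:

// In MyLibrary.Core (e.g. AssemblyInfo.cs); the assembly names are hypothetical.
// Requires the System.Runtime.CompilerServices namespace.
[assembly: InternalsVisibleTo("MyLibrary.Extensions")]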

A simple usage scenario looks like this:

public class MyClass
{
    private static readonly ILog Logger = LogProvider.For<MyClass>();

    public void DoSomething()
    {
        Logger.Trace("Method 'DoSomething' in progress");
    }
}

That’s it – now the library is ready to automatically pick up the logger used by the consuming application. For example, if Serilog is the selected library, assigning Serilog’s Log.Logger will automatically connect all the moving parts together:

Log.Logger = new LoggerConfiguration()
    .MinimumLevel.Verbose()
    .WriteTo.LiterateConsole()
    .CreateLogger();

Log.Logger.Verbose("Starting...");

var myClass = new MyClass();
myClass.DoSomething();

Log.Logger.Verbose("Finishing...");
Console.ReadKey();

The result is

[21:09:18 APP] Starting...
[21:09:18 APP] Method 'DoSomething' in progress
[21:09:18 APP] Finishing... 

Everything described above just works when the library targets the .NET Framework. In the case of a .NET Standard library (at least netstandard1.3), things are a bit more complicated: when LibLog is added to such a library, the compilation fails, and two modifications are required to fix it.

First, the LIBLOG_PORTABLE conditional compilation symbol must be defined in the project settings (don’t forget to define it in all used build configurations). Second, two missing NuGet packages must be added – Microsoft.CSharp and System.Dynamic.Runtime. These modifications will fix the build and enable LibLog usage.

Exploring .NET Open Source ecosystem: logging with Serilog

Logging is an important part of every software project. Ideally, developers would prefer to attach a debugger to the failing system and investigate it in situ. In real life, however, systems are rarely available to the developers at the moment the issue is happening. This is the reason we rely so heavily on logs, which can be analyzed later.

One of the most popular logging libraries for .NET applications is log4net. It is a powerful, flexible library, and the usual workflow looks like this: a developer needs to log an object’s state and some action, so he creates a human-readable string which represents the state and the action. The string is passed to log4net and stored in a text file. With the amount of logging data generated by modern software, it is very desirable to be able to process the logs automatically, by some software tool. In this case the stored text file has to be parsed to extract the object state and analyze it.
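In code, the first half of that workflow is a classic log4net-style call which flattens the state to text up front (a sketch of typical usage):

// Typical log4net-style logging (sketch): the object state is flattened
// to a string before it reaches the log store.
var log = log4net.LogManager.GetLogger(typeof(Program));
log.InfoFormat("User {0} ({1}) logged in", user.Name, user.Ip);
// A log analysis tool later has to parse
// "User Guest (127.0.0.1) logged in" to recover the structure.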

When we take the second part of the workflow (parsing and analyzing) into account, it sounds crazy: nicely structured data, available before logging, is transformed to unstructured text only to be parsed and transformed back to a structured state a bit later. What a waste of CPU cycles and time!

Serilog, a structured logging library, aims to stop this craziness. With Serilog, data can be logged in its original form and passed to structured storage (a database, the Windows Event Log, a third-party service) without additional overhead.

var user = new User {Name="Guest", Ip="127.0.0.1"};
logger.Information("User {@User} logged", user);

If Serilog is configured to output data to the console, the logged information will be presented as:

08:43:12 [Information] User {Name: Guest, Ip: 127.0.0.1} logged

And for other sinks (like the Event Log) it will be captured as JSON:

{ "User": {"Name": "Guest", "Ip": "127.0.0.1"}}

Serilog supports more than 50 sinks, including console, text files, email, Elasticsearch, RethinkDB and others. Logging sinks are configured for each logger, and more than one sink can be used:

var logger = new LoggerConfiguration()
    .MinimumLevel.Debug()
    .WriteTo.RollingFile("log-{Date}.txt")
    .WriteTo.LiterateConsole()
    .CreateLogger();

Serilog also allows enriching the logging data with static information (like a thread id), and defining custom serializers and filters for selective logging.
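For example, enrichment and filtering hang off the same fluent configuration (a sketch – the property name and filter predicate are mine, and the Serilog.Events namespace is assumed for LogEventLevel):

var logger = new LoggerConfiguration()
    .Enrich.WithProperty("Application", "DemoApp")             // attached to every event
    .Filter.ByExcluding(e => e.Level == LogEventLevel.Verbose) // selective logging
    .WriteTo.LiterateConsole()
    .CreateLogger();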

Serilog is published under the Apache 2.0 license at https://github.com/serilog

Exploring .NET Open Source ecosystem: communicate with NetMQ

A proper introduction to NetMQ requires more than one post and is well beyond the format of this series. Instead, in the spirit of the series, this is an awareness-raising post, meant to bring additional visibility to a great library.

NetMQ is a managed .NET port of the ZeroMQ library. The idea of both libraries is not to compete directly with messaging solutions like RabbitMQ, ActiveMQ or NServiceBus, which provide a rich, high-level feature set out of the box, but to provide a relatively low-level API which allows building complex solutions tailored to specific business and functional needs.

Unlike the mentioned solutions, NetMQ does not include a central server or broker. NetMQ is based on the idea of a socket: each side of the communication is required to open a socket and use it for communication. NetMQ, following ZeroMQ, supports many different types of sockets, and each socket type has a unique behavior.

The simplest pair of sockets is RequestSocket/ResponseSocket. These two sockets, when connected to each other, allow building synchronous client-server communication.

Server code example

using (var responseSocket = new ResponseSocket("@tcp://*:5555"))
{
    while (true)
    {
        // receive a request message
        var msg = responseSocket.ReceiveFrameString();

        Console.WriteLine("Request received: " + msg);

        // send a canned response
        responseSocket.SendFrame("Response for " + msg);
    }
}

Client code example

using (var requestSocket = new RequestSocket(">tcp://localhost:5555"))
{
    requestSocket.SendFrame("Hello");
    var message = requestSocket.ReceiveFrameString();
    Console.WriteLine("requestSocket : Received '{0}'", message);

    Console.ReadKey();
}

The samples demonstrate creating the sockets for the server and the client, and sending a request/response between the applications.

The real power of NetMQ comes from the ability to craft your own protocol using multi-frame messages, and from the variety of available sockets: async sockets, pub/sub sockets, in-proc sockets. These features differentiate ZeroMQ and NetMQ and allow building very complex solutions; however, they bring a lot of complexity and require some learning. The ZeroMQ guide may be a good start, even if you end up using NetMQ – it helps to understand the main principles and patterns.
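As a taste of the pub/sub sockets, here is a minimal sketch (the topic name is mine, and a real application would synchronize the subscription instead of sleeping; assumes the NetMQ, NetMQ.Sockets and System.Threading namespaces):

using (var pub = new PublisherSocket("@tcp://*:5556"))
using (var sub = new SubscriberSocket(">tcp://localhost:5556"))
{
    // subscribe to a topic: only messages whose first frame starts
    // with "status" are delivered to this socket
    sub.Subscribe("status");
    Thread.Sleep(100); // crude wait for the subscription to propagate

    // multi-frame message: frame 1 is the topic, frame 2 is the payload
    pub.SendMoreFrame("status").SendFrame("All systems go");

    var topic = sub.ReceiveFrameString();
    var body = sub.ReceiveFrameString();
    Console.WriteLine("{0}: {1}", topic, body);
}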

Previous posts