I am really excited about the opportunity to talk about .NET Standard 2.0 and .NET Core 2.0 with the Toronto and Toronto-area .NET community. While .NET Standard 2.0 is a big step ahead in streamlining the compatibility story between the different implementations of the .NET Framework, .NET Core 2.0 brings better performance and simplifications for ASP.NET Core developers.

If you are interested in the topic, here are the details about the upcoming meetups:

Additional resources for the presentation:

Configuring TeamCity to run in Docker on Linux and build .NET Core projects

Recently, I needed to set up a build pipeline for a medium-sized .NET Core project and, having had good previous experience with JetBrains TeamCity, I decided to use it in this case as well. The Professional Edition is free, and its limitations are acceptable for the project: up to three build agents and up to twenty build configurations.

This post provides a step-by-step guide on installing and configuring TeamCity. The starting point is a clean Ubuntu 16.04 LTS server, and the goal is to run the TeamCity server, build agents and PostgreSQL on this system using Docker containers. Additionally, the server and the agents are configured to support .NET Core project builds. This solution can be just as easily deployed on a local system or in the cloud, such as Azure or AWS.

For using TeamCity in a production environment, it is recommended to use an external database to store the configuration data. In the described case, I use PostgreSQL, running in a Docker container as well. So the full stack includes five Docker containers: one for PostgreSQL, one for the TeamCity server and three for the build agents. The PostgreSQL database and all the data generated by TeamCity are persisted on a local drive using Docker mounted volumes.

Installing Docker

If you are starting from a clean system, you will need to install Docker and Docker Compose first. The detailed instructions for installing Docker on Ubuntu are available at https://docs.docker.com/engine/installation/linux/ubuntu/, and to install Docker Compose use apt-get:

sudo apt-get install docker-compose

Folder structure

I use the /srv folder as a root for all the data related to TeamCity builds, and here is the full hierarchy of folders you will need to create inside /srv:
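The original folder listing did not survive in this copy of the post. Here is one possible layout, inferred from the paths used later in the guide (/srv/docker and /srv/teamcity/data), with assumed folders for the PostgreSQL data and the agent configurations:

```shell
# One possible /srv layout (assumed; adjust to your needs).
# /srv/docker           - docker-compose.yaml lives here
# /srv/postgres/data    - PostgreSQL data files
# /srv/teamcity/data    - TeamCity data directory (includes lib/jdbc)
# /srv/teamcity/logs    - TeamCity server logs
# /srv/teamcity/agentN  - per-agent configuration
mkdir -p /srv/docker \
         /srv/postgres/data \
         /srv/teamcity/data /srv/teamcity/logs \
         /srv/teamcity/agent1 /srv/teamcity/agent2 /srv/teamcity/agent3
```

On a real server you will likely need to run this as root (prefix the mkdir with sudo).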


When the folders are created, we are ready to define the stack.

Docker containers

Create a docker-compose.yaml file in the /srv/docker folder and paste the following content
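The compose file itself is missing from this copy of the post. The following is a minimal sketch of such a five-container stack; the image names and container paths follow the conventions of the official jetbrains/teamcity-server, jetbrains/teamcity-agent and postgres images, but the versions, credentials and port mapping are assumptions you should adjust:

```yaml
version: '2'

services:
  db:
    image: postgres:9.6
    environment:
      - POSTGRES_USER=teamcity
      - POSTGRES_PASSWORD=CHANGE_ME   # the one value you must change
      - POSTGRES_DB=teamcity
    volumes:
      - /srv/postgres/data:/var/lib/postgresql/data

  teamcity-server:
    image: jetbrains/teamcity-server
    ports:
      - "8111:8111"
    volumes:
      - /srv/teamcity/data:/data/teamcity_server/datadir
      - /srv/teamcity/logs:/opt/teamcity/logs
    depends_on:
      - db

  agent1:
    image: jetbrains/teamcity-agent
    environment:
      - SERVER_URL=http://teamcity-server:8111
    volumes:
      - /srv/teamcity/agent1:/data/teamcity_agent/conf
    depends_on:
      - teamcity-server

  agent2:
    image: jetbrains/teamcity-agent
    environment:
      - SERVER_URL=http://teamcity-server:8111
    volumes:
      - /srv/teamcity/agent2:/data/teamcity_agent/conf
    depends_on:
      - teamcity-server

  agent3:
    image: jetbrains/teamcity-agent
    environment:
      - SERVER_URL=http://teamcity-server:8111
    volumes:
      - /srv/teamcity/agent3:/data/teamcity_agent/conf
    depends_on:
      - teamcity-server
```

Note that the stock agent image does not ship with the .NET Core SDK, so to build .NET Core projects you will either need a custom agent image with the SDK installed or to install it on the agents afterwards.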

It configures the stack of the required containers. The configuration is self-explanatory and I’d like to highlight just a couple of things:

The only change the file requires is a correct PostgreSQL password. Once it is updated, save the file, close it and start the configured stack by running

docker-compose up -d

This will download all the required images and start the containers. Now we are ready to open and configure TeamCity.

TeamCity first start

In a browser, open the TeamCity site. There is nothing special about configuring TeamCity running in Docker compared with a conventional deployment, so these instructions are provided just for the completeness of the guide and are based on version 2017 of TeamCity.

TeamCity First Start

On the first page, just click Proceed – the data directory is properly configured already.

JDBC drivers needed

Now you need to connect TeamCity to the running instance of PostgreSQL. But first, you need the JDBC drivers – they are not shipped with TeamCity. In a terminal, open /srv/teamcity/data/lib/jdbc and put the downloaded drivers there, for example by executing

sudo wget https://jdbc.postgresql.org/download/postgresql-42.1.1.jar 

Back in the browser, click Refresh JDBC drivers – TeamCity should detect the newly installed drivers and allow you to connect to the database.

Enter database connection information

Provide the required information (use the database name, user name and password defined in the docker-compose file) and click Proceed. If you receive a connection error, verify that the database host name is entered without ‘http’ and that the host allows access to port 5432 for PostgreSQL (it will most likely be blocked if the instance is hosted in Azure or AWS).

On the next page, accept the agreement, create an administrative account, and you are ready to use TeamCity.

Using TeamCity for building a .NET Core project

After the start, the three build agents should be detected by TeamCity automatically, but they will be marked as Unauthorized and need to be authorized manually.


So far, we have managed to configure and launch TeamCity and connect the build agents. The last step before creating a new build project is to install the .NET Core plugin. This step is optional, as you can run .NET Core tasks from the command line runner, but the plugin simplifies step definitions by adding a dedicated .NET Core runner.

The plugin can be downloaded from plugins.jetbrains.com and installed via the TeamCity UI – just open the Administration\Plugins List page and upload the plugin. To enable the plugin, TeamCity requires a restart and, unfortunately, there is no way to do it from the UI, so you need to use the console again: go to /srv/docker and run

docker-compose stop
docker-compose up -d 

After that, the plugin is installed and the agents are able to use it (see the agent’s properties).

Agent properties

That’s it – now you are ready to create a TeamCity project and configure the first build.

Configured .NET Core build steps


This guide demonstrated an approach to deploying a Docker-based TeamCity setup for running .NET Core builds. It is based on the free version of TeamCity and allows an easy cloud deployment.

First impressions from using .NET Core for AWS Lambdas and deployment tips

Earlier this year, Amazon announced the availability of C# as a first-class citizen for AWS Lambda. This solution is based on .NET Core and allows you to use the .NET ecosystem for building serverless services.


Tooling and integration with Visual Studio are surprisingly good. VS2017 support is in preview, but even the preview extension is pretty stable. C#-based Lambdas can be deployed and tested directly from Visual Studio with a couple of clicks. Lambda tools can also be used from the .NET Core CLI, which enables command line-based Continuous Integration and Continuous Deployment scenarios.


C# Lambdas support two main workflows: a simple Lambda function, and a serverless application hosted by Lambda. While the first case is simple, the serverless model is more interesting, as it allows hosting a whole ASP.NET Core Web API site in Lambda. In this case, the AWS tools, with the help of CloudFormation, deploy and configure an API Gateway endpoint and a Lambda which routes all requests to the corresponding Web API controllers/actions. At the same time, the Web API site can still be run locally for testing purposes. This model provides a huge productivity boost compared with the traditional model, where a Lambda must be deployed to AWS first and only then can be tested.

The AWS Developer blog provides tutorials on creating a new C# Lambda and on converting an existing ASP.NET Web API site into a serverless application. While the tutorials are well written, they miss a couple of important points:


The performance of the solution is yet to be profiled, but quick measurements show it is sufficient to proceed with the spikes. Latency for a cold Lambda (which is almost irrelevant, as you may use some techniques to keep Lambdas warm) is about 5s for a “Hello World” type of API, and the average latency for warm Lambdas is about 80ms.

Xamarin experience event

If you are interested in Xamarin technologies, the Xamarin team is going to be in Toronto to speak about the new features delivered as part of the Visual Studio 2017 release, cloud connectivity for mobile apps and the mobile DevOps story. You will also have a chance to participate in a panel discussion and ask your questions.

The event will take place on Thursday, April 13th at Microsoft Toronto Office (suite 1201, 222 Bay Street) from 10am to 1pm, lunch provided.

For registration, email Catherine Kerr with your name and contact details. Spaces are limited.


How to use new features of MSBuild 15 with .NET Framework projects

One of the components updated for the Visual Studio 2017 release is the MSBuild build system. With the move of .NET Core’s project system from project.json to the csproj format, MSBuild was updated to support the new lightweight csproj files. It also provides better NuGet support: NuGet packages can now be referenced directly from csproj, and new restore and pack build tasks are introduced. These tasks can be used to restore a project’s NuGet dependencies and to pack libraries without using NuGet, just with MSBuild.

While these features are primarily advertised (and work out of the box) for .NET Core projects, they can be used with .NET Framework 4.x projects too, assuming you have Visual Studio 2017 installed.


The restore task invocation is simple and is equivalent to the NuGet restore command:

msbuild "path" /t:restore

If you try to execute the command for an existing .NET Framework project with some NuGet packages referenced, it will do nothing. The exact message is “Nothing to do. None of the projects specified contain packages to restore”. MSBuild restore does not recognize packages.config; instead, it expects to see all the NuGet references inside the .csproj files.

This feature can be enabled via the NuGet Package Manager options. You may need to remove and re-add the referenced packages to switch csproj to the new way of referencing packages.
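After the switch, the package references live directly in the csproj file as PackageReference items. As an illustrative sketch (Newtonsoft.Json stands in here for whatever packages your project actually references):

```xml
<ItemGroup>
  <PackageReference Include="Newtonsoft.Json" Version="10.0.3" />
</ItemGroup>
```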

NuGet options

As a result, MSBuild should be able to perform the restore command successfully.


The MSBuild pack command is a direct replacement for NuGet pack. While MSBuild still supports .nuspec files, it can also build the package based on the project properties or the command line parameters, without a spec.

Similar to the restore command, pack does not work out of the box for .NET Framework projects. To support this command, some MSBuild tasks need to be imported into the project. This is done automatically for .NET Core projects, but requires some modifications to csproj for .NET Framework projects.

The first change is to import the NuGet-related build tasks.

<Import Project="$(MSBuildSDKsPath)\NuGet.Build.Tasks.Pack\build\NuGet.Build.Tasks.Pack.targets" />

The MSBuildSDKsPath variable points to the C:\Program Files (x86)\Microsoft Visual Studio\2017\Enterprise\MSBuild\Sdks folder and is a part of the new MSBuild concept – SDK – which allows dynamically delivered build process extensions.

With this change, MSBuild is ready to do the pack. However, if you try, it will complain about missing fields – ID and Authors. These fields can be defined using MSBuild properties or from the command line. For example, the command line may look like

msbuild /t:pack /p:PackageId=MSBuildTestLib /p:Authors="Andrei Marukovich"

And when defined in csproj:

    <PropertyGroup>
        <PackageId>MSBuildTestLib</PackageId>
        <Authors>Andrei Marukovich</Authors>
    </PropertyGroup>

BuildOutputTargetFolder is the property that should be used in this case to instruct MSBuild which framework folder to use for the build output. Without it, the project assemblies will go into the root of the .nupkg \lib folder. For additional information about using MSBuild properties to control the packing process and define the package metadata, see Additions to the csproj format for .NET Core.
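As a sketch – assuming the project targets .NET Framework 4.6.1 – the property can be set alongside the other package properties; the exact value depends on your target framework:

```xml
<PropertyGroup>
  <BuildOutputTargetFolder>lib\net461</BuildOutputTargetFolder>
</PropertyGroup>
```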

After these final changes, the modified .NET Framework project can restore its packages and be packed as a NuGet package using MSBuild, without involving NuGet.exe, and will continue to work in Visual Studio 2017.

Fixing LibLog for using with ILMerge

The LibLog library, which I described before, uses reflection magic to allow libraries to do logging without introducing dependencies on any particular logging framework. The problem with magic tricks is that they fail from time to time.

For me, this happened when I tried to use ILMerge for the Dropcraft CLI tool to merge into one executable all the Dropcraft libraries, which use LibLog, and Serilog. As a result, the merged executable did not produce any logs. No exceptions, no warnings – just an empty screen.

After a short review of the LibLog code, I found the root cause – the Type.GetType() calls. LibLog uses GetType calls to probe the availability of the different logging frameworks, and it uses assembly-qualified type names, like Type.GetType("NLog.LogManager, NLog").

Here is the issue: in the ILMerged executable there is no NLog assembly. LibLog is not able to detect any logging framework and silently ignores all logging calls. The solution is easy – if the GetType call for an assembly-qualified type returns null, call GetType for the type name only.

Type.GetType("NLog.LogManager, NLog") ?? Type.GetType("NLog.LogManager");

After the change, the assembly works perfectly with and without merging. An example of the fully modified LibLog file is available in the Dropcraft repository.

Introducing Dropcraft, a NuGet-based app deployment and composition framework

The NuGet ecosystem has already outgrown the original idea of NuGet as a design-time package management tool. NuGet powers installers, helps to create auto-updateable applications and pluggable solutions. What the ecosystem has missed so far is a general-purpose library which abstracts the complex, not yet documented NuGet API and simplifies the development of NuGet-based solutions.

Welcome Dropcraft. Based on version 4 of NuGet, it provides a high-level package management API and enables various scenarios of using NuGet packages in applications. Going beyond package installation, update and uninstallation, Dropcraft includes a runtime API for discovering the installed packages and a simple extensibility framework based on NuGet packages.

The main features of Dropcraft:

Scenarios where Dropcraft may be useful:

Get started

The easiest way to try Dropcraft is to use the dropcraft.exe command line tool. It is built using the public Dropcraft APIs and can itself be used as a framework usage example.

dropcraft.exe install "bootstrap/3.0.0" --path "c:\DemoApp" -s "https://api.nuget.org/v3/index.json" --framework net461

This command instructs Dropcraft to install the bootstrap v3.0.0 package from NuGet.org to the c:\DemoApp folder. It automatically resolves all the dependencies, downloads the packages and installs them:

Installing packages...
0 product package(s) found and 1 new package(s) requested
Versions are confirmed
2 package(s) are resolved
        2 package(s) to install
        0 package(s) to update
bootstrap/3.0.0 downloaded
jQuery/1.9.0 downloaded
bootstrap/3.0.0 installed
jQuery/1.9.0 installed
Installation complete.

As a result, C:\DemoApp contains the Content, Scripts and fonts folders from the bootstrap and jQuery packages. Dropcraft followed the instructions and installed Bootstrap 3.0.0, which is pretty old. So, the following command will update it:

dropcraft.exe install "bootstrap" --path "c:\DemoApp" -s "https://api.nuget.org/v3/index.json" --framework net461
Installing packages...
2 product package(s) found and 1 new package(s) requested
Versions are confirmed
2 package(s) are resolved
        0 package(s) to install
        2 package(s) to update
bootstrap/3.3.7 downloaded
jQuery/1.9.1 downloaded
bootstrap/3.0.0 uninstalled
jQuery/1.9.0 uninstalled
bootstrap/3.3.7 installed
jQuery/1.9.1 installed
Installation complete.

Dropcraft automatically resolved the latest versions of the packages and upgraded them. Similarly, Dropcraft can install additional packages, or downgrade or uninstall existing ones.

Advanced scenarios

The previous example demonstrated the use of unmodified NuGet packages with Dropcraft. To enable more advanced scenarios, Dropcraft introduces an additional package manifest file, which can be included in the package by its author.

The package manifest notifies Dropcraft about the package’s initialization method, and allows packages to intercept various Dropcraft events and participate in the application composition.

Dropcraft defines a lightweight application composition model based on the concept of extensibility points. Any package is able to define one or many extensibility points, which will be linked with the corresponding extensions exported by other packages. This allows creating a chain of extensibility points/extensions and building an application from packages like from Lego blocks.

The Dropcraft WPF demo app demonstrates this concept. It consists of several NuGet packages which can be installed separately or all together. The first command installs a package with the common interfaces, an executable and the application’s main window. It uses two package sources – NuGet.org and MyGet.org:

dropcraft.exe install "Dropcraft.WpfExample.App" "Dropcraft.WpfExample.Shell" "Dropcraft.WpfExample.Common" --path "c:\DemoWPF" -s "https://www.myget.org/F/dropcraft/api/v3/index.json" -s "https://api.nuget.org/v3/index.json" --framework net461

Dropcraft Demo App

The resulting application is just an empty window. The next command adds some functionality by installing an extension – a text editor.

dropcraft.exe install "Dropcraft.WpfExample.Editor" --path "c:\DemoWPF" -s "https://www.myget.org/F/dropcraft/api/v3/index.json" -s "https://api.nuget.org/v3/index.json" --framework net461

Dropcraft Demo App

The text editor defines a new extensibility point – editor commands – and there is a Dropcraft.WpfExample.Commands package which exports two commands. So the next step is to install it:

dropcraft.exe install "Dropcraft.WpfExample.Commands" --path "c:\DemoWPF" -s "https://www.myget.org/F/dropcraft/api/v3/index.json" -s "https://api.nuget.org/v3/index.json" --framework net461

Dropcraft Demo App

The final result is an application composed of packages, where all the packages are loosely coupled through interfaces and the composition is facilitated by Dropcraft. The framework takes care of the package initialization order, runtime extension registration and other scenarios common in pluggable applications.


Dropcraft provides APIs which can be used by applications to incorporate NuGet functionality. It enables a wide range of scenarios, from direct manipulation of NuGet packages to package-based plugins and runtime application composition.

While release 0.2.1 is compiled for .NET 4.6.1, the Dropcraft libraries target .NET Standard and are going to support .NET Core in future releases. Similarly, future releases will support ASP.NET Core in addition to desktop applications.

Links for my Tech Summit talk about Desktop Bridge

Desktop Bridge, previously known as Project Centennial, is a technology which helps to bring existing desktop applications to the Universal Windows Platform. It allows you to manually create an .appx package, or to use an existing installer and convert it into an .appx. It also allows you to integrate parts of UWP into a desktop application, and facilitates a gradual move from a classic desktop application to a UWP app.

This post contains links to the additional resources for my Tech Summit Toronto 2016 presentation:

Using the NuGet v3 libraries in your projects, part 2

The previous post demonstrated the use of RemoteDependencyWalker to discover all the dependencies of the target libraries. The provided sample will work as expected if the versions of all requested packages are perfectly aligned. However, this is not the case in real life. A user’s request may require the installation of conflicting packages, and the application should be able to recognize and handle this situation.

With the provided code, the application will fail when any conflict appears. For the problematic packages, the metadata will not be resolved, and the resolve result for the associated GraphNode will be null.

The simplest approach to solving the conflicts is to use the Analyze() method of the GraphNode class. This method returns an analysis result which contains information about the issues. There are three types of issues – cyclic dependencies, version downgrades and versioning conflicts – and all of them are detected by GraphNode.Analyze().

While cyclic dependencies and versioning conflicts will most likely lead to an application failure, version downgrades can be handled. A downgrade means the presence of a required package with a version lower than the version of one of the resolved packages. In this situation, the dependency resolver adds both packages and marks the package with the lower version as downgraded. The application can use this to decide on the next action: allow the downgrade or fail.

Slides for my MVP MIX Toronto 2016 talks

Talk: Using NuGet libraries in your application

NuGet is not only a Visual Studio extension or a command line application. It is also a set of libraries which can be used to manipulate NuGet packages programmatically. Do you have a unique CI process, beyond the expected NuGet workflow? Do you need your own way to propagate dependencies between subsystems? Or maybe you want to create a NuGet-based deployment process for the end users? During this session you will learn about the main NuGet library concepts, see examples of embedded NuGet usage and hear some guidance to help you integrate NuGet with your own application.

The code demonstrated during the presentation is described in Using the NuGet v3 libraries in your projects post and used in Dropcraft project.

Slides: http://www.slideshare.net/LunarFrog/using-nuget-libraries-in-your-application

Talk: C# code sharing across the platforms

Portable, shared, .NET Standard libraries – there are so many options to choose from when you need to share code between platforms. During this talk we will explore all the options and the differences between the library types. After the session you will have a solid understanding of the modern .NET library types and code sharing strategies, which you can apply to your next .NET Core, desktop or Xamarin project.

Slides: http://www.slideshare.net/LunarFrog/c-code-sharing-across-the-platforms
