
.NET Developer Blogs

Material from BASTA! Spring

26.02.2017 08:44:00 | Jörg Neumann

Here is the material from my sessions at BASTA! Spring 2017:

The materials for my workshop "Cross-Platform App Development with Xamarin" are available on request.

Tools: NDepend v2017.1 Smart Technical Debt Estimation

25.02.2017 16:36:02 | Steffen Steinbrecher

Every developer probably knows the following scenario from day-to-day work: a new feature/requirement has to be integrated into the existing software and (as almost always) there are several ways to do it: the quick & dirty solution - it is done quickly, even though you are well aware that sooner or later you will "stumble" over this piece of code again. […]

Creating a Word document OutputFormatter in ASP.NET Core

22.02.2017 18:00:00 | Jürgen Gutsch

In one of the ASP.NET Core projects we did in the last year, we created an OutputFormatter to provide Word documents as printable reports via ASP.NET Core Web API. Well, this formatter wasn't written by me, but by a fellow software developer, Jakob Wolf, at yooapps.com. I told him to write about it, but he hasn't had enough time to do it yet, so I'm going to do it for him. Maybe you know him from Twitter. Maybe not, but he is one of the best ASP.NET and Angular developers I ever met.

About OutputFormatters

In ASP.NET Core you are able to have many different formatters. The best known built-in formatter is the JsonOutputFormatter, which is used as the default OutputFormatter in ASP.NET Core Web API.

By using AddMvcOptions() you are able to add new formatters or to manage the existing formatters:

services.AddMvc()
    .AddMvcOptions(options =>
    {
        options.OutputFormatters.Add(new WordOutputFormatter());
        options.FormatterMappings.SetMediaTypeMappingForFormat(
          "docx", MediaTypeHeaderValue.Parse("application/ms-word"));
    });

As you can see in the snippet above, we add the Word document formatter (called WordOutputFormatter) to provide the Word documents if the requested type is "application/ms-word".

You are able to add whatever formatter you need, mapped to whatever media type you want to support.
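As a side note: with the format mapping registered above, a client can select the Word output not only via the Accept header, but also via a format suffix, if the action opts in with the [FormatFilter] attribute. Here is a minimal sketch (the controller, route and action names are made up for illustration):

[FormatFilter]
public class ReportsController : Controller
{
    // GET /api/reports/report uses regular content negotiation,
    // GET /api/reports/report.docx resolves "docx" via the FormatterMappings shown above
    [HttpGet("api/reports/report.{format?}")]
    public DocumentContentsViewModel GetReport()
    {
        return new DocumentContentsViewModel();
    }
}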

Let's have a look at what an output formatter looks like:

public class MyFormatter : IOutputFormatter
{
  public bool CanWriteResult(OutputFormatterCanWriteContext context)
  {
    // check whether to write or not
    throw new NotImplementedException();
  }

  public async Task WriteAsync(OutputFormatterWriteContext context)
  {
    // write the formatted contents to the response stream.
    throw new NotImplementedException();
  }
}

You have one method to check whether the data can be written in the expected format or not. The other, async method does the job of formatting and writing the data to the response stream, which comes with the context.

This way, you need to do some things manually. A more comfortable way to implement an OutputFormatter is to inherit from the OutputFormatter base class directly:

public class WordOutputFormatter : OutputFormatter
{
  public string ContentType { get; }

  public WordOutputFormatter()
  {
    ContentType = "application/ms-word";
    SupportedMediaTypes.Add(MediaTypeHeaderValue.Parse(ContentType));
  }

  // optional, but makes sense to restrict to a specific condition
  protected override bool CanWriteType(Type type)
  {
    if (type == null)
    {
      throw new ArgumentNullException(nameof(type));
    }

    // only one ViewModel type is allowed
    return type == typeof(DocumentContentsViewModel);
  }

  // this needs to be overridden
  public override Task WriteResponseBodyAsync(OutputFormatterWriteContext context)
  {
    // Format and write the document outputs here
    throw new NotImplementedException();
  }
}

The base class does some things for you, for example writing the correct HTTP headers.

Creating Word documents

To create Word documents you need to add a reference to the Open XML SDK. We used the OpenXMLSDK-MOT in version 2.6.0, which cannot be used with .NET Core. This is why we run that specific ASP.NET Core project on .NET 4.6.

Version 2.7.0 is available as a .NET Standard 1.3 library and can be used in .NET Core. Unfortunately this version isn't yet available in the default NuGet feed. To install the latest version, follow the instructions on GitHub: https://github.com/officedev/open-xml-sd. Currently there is a mess with the NuGet package IDs and versions on NuGet and MyGet. Use the MyGet feed mentioned on the GitHub page to install the latest version. The package ID there is DocumentFormat.OpenXml and the latest stable version is 2.7.1.

In this post, I don't want to go through all the word processing stuff, because it is too specific to our implementation. I'll just show you how it works in general. The Open XML SDK is pretty well documented, so you can use this as an entry point to create your own specific WordOutputFormatter:

public override async Task WriteResponseBodyAsync(OutputFormatterWriteContext context)
{
  var response = context.HttpContext.Response;
  var filePath = Path.GetTempFileName();

  var viewModel = context.Object as DocumentContentsViewModel;
  if (viewModel == null)
  {
    throw new ArgumentNullException(nameof(viewModel));
  }

  using (var wordprocessingDocument = WordprocessingDocument
         .Create(filePath, WordprocessingDocumentType.Document))
  {
    // start creating the documents and the main parts of it
    wordprocessingDocument.AddMainDocumentPart();

    var styleDefinitionPart = wordprocessingDocument.MainDocumentPart
      .AddNewPart<StyleDefinitionsPart>();
    var styles = new Styles();
    styles.Save(styleDefinitionPart);

    wordprocessingDocument.MainDocumentPart.Document = new Document
    {
      Body = new Body()
    };
    var body = wordprocessingDocument.MainDocumentPart.Document.Body;

    // call a helper method to set default styles
    AddStyles(styleDefinitionPart); 
    // call a helper method set the document to landscape mode
    SetLandscape(body); 

    foreach (var institution in viewModel.Items)
    {
      // iterate through some data of the viewmodel
      // and create the elements you need
      
      // ... more word processing stuff here

    }

    await response.SendFileAsync(filePath);
  }
}

The ViewModel with the data to format is in the Object property of the OutputFormatterWriteContext. We do a safe cast and check for null before we continue. The Open XML SDK works based on files. This is why we need to create a temp file name and let the SDK use this file path. Because of that fact, at the end, we send the file out to the response stream using the response.SendFileAsync() method. I personally would prefer to work on the OutputStream directly, to have fewer file operations and to be a little bit faster. The other thing is, we need to clean up the temp files.
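A simple way to do that cleanup (a minimal sketch, not part of the original implementation) is to wrap the document creation in a try/finally and delete the temp file after it has been sent:

var filePath = Path.GetTempFileName();
try
{
    // ... create the document and send it, as shown above
    await response.SendFileAsync(filePath);
}
finally
{
    // remove the temp file once the response has been written
    File.Delete(filePath);
}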

After the file is created, we work on it and create the document, custom styles and layouts, and the document body, which will contain the formatted data. Inside the loop we are only working on that Body object. We created helper methods to add formatted values, tables and so on...
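Such a helper can be quite small. As a sketch, a method like the following (AddTextParagraph is a made-up name; Paragraph, Run and Text come from DocumentFormat.OpenXml.Wordprocessing) appends a plain text paragraph to the body:

private static void AddTextParagraph(Body body, string text)
{
    // a paragraph contains runs, and a run contains the actual text
    var paragraph = new Paragraph(new Run(new Text(text)));
    body.AppendChild(paragraph);
}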

Conclusion

OutputFormatters are pretty useful to create almost any kind of content out of any kind of data. Instead of hacking around in specific Web API actions, you should use OutputFormatters to get reusable components.

The OutputFormatter we built is not really reusable or even generic, because it was created for a specific kind of report. But with this starting point, we are able to make it generic: we could pass a template document to the formatter which knows the properties of the ViewModel. This way it would be possible to create almost any kind of Word document.

Create NuGet packages with Cake

14.02.2017 01:45:00 |

This blogpost is a follow-up to these Cake (C# Make) related blogposts:

Scenario


Let's say we have this project structure. The "Config", "Result" and "Engine" projects each contain a corresponding .nuspec file, like this:

<?xml version="1.0"?>
<package>
  <metadata>
    <id>Sloader.Config</id>
    <version>0.1.0</version>
    <title>Sloader.Config</title>
    <authors>Code Inside Team</authors>
    <owners>Code Inside Team</owners>
    <licenseUrl>https://github.com/Code-Inside/Sloader/blob/master/LICENSE</licenseUrl>
    <projectUrl>https://github.com/Code-Inside/Sloader</projectUrl>
    <requireLicenseAcceptance>false</requireLicenseAcceptance>
    <description>Sloader Config</description>
    <releaseNotes>
      ## Version 0.1 ##
      Init
    </releaseNotes>
    <copyright>Copyright 2017</copyright>
    <tags>Sloader</tags>
    <dependencies />
  </metadata>
</package>

Nothing fancy - pretty normal NuGet stuff.

Goal

The goal is to create a NuGet package for each target project with Cake.

build.cake

The usage in Cake is pretty much the same as with the normal nuget.exe pack command. The sample only shows the actual Cake target - see the older blogposts for a more complete example:

Task("BuildPackages")
    .IsDependentOn("Restore-NuGet-Packages")
	.IsDependentOn("RunTests")
    .Does(() =>
{
    var nuGetPackSettings = new NuGetPackSettings
	{
		OutputDirectory = rootAbsoluteDir + @"\artifacts\",
		IncludeReferencedProjects = true,
		Properties = new Dictionary<string, string>
		{
			{ "Configuration", "Release" }
		}
	};

    MSBuild("./src/Sloader.Config/Sloader.Config.csproj", new MSBuildSettings().SetConfiguration("Release"));
    NuGetPack("./src/Sloader.Config/Sloader.Config.csproj", nuGetPackSettings);
    MSBuild("./src/Sloader.Result/Sloader.Result.csproj", new MSBuildSettings().SetConfiguration("Release"));
    NuGetPack("./src/Sloader.Result/Sloader.Result.csproj", nuGetPackSettings);
    MSBuild("./src/Sloader.Engine/Sloader.Engine.csproj", new MSBuildSettings().SetConfiguration("Release"));
    NuGetPack("./src/Sloader.Engine/Sloader.Engine.csproj", nuGetPackSettings);
});

Easy, right? The most interesting part here is the NuGetPack command. Before we invoke this command we need to make sure that we build the most recent version - to enforce that, we just rebuild each project in Release mode.
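Since the three MSBuild/NuGetPack pairs only differ in the project path, the same target body could also be written as a loop - a small sketch with the same projects, just deduplicated:

var projects = new[] {
    "./src/Sloader.Config/Sloader.Config.csproj",
    "./src/Sloader.Result/Sloader.Result.csproj",
    "./src/Sloader.Engine/Sloader.Engine.csproj"
};

foreach(var project in projects)
{
    // rebuild in Release mode, then pack with the shared settings
    MSBuild(project, new MSBuildSettings().SetConfiguration("Release"));
    NuGetPack(project, nuGetPackSettings);
}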

Result


The console output should make the flow pretty clear:

PS C:\Users\Robert\Documents\GitHub\Sloader> .\build.ps1 -t BuildPackages
Preparing to run build script...
Running build script...
Analyzing build script...
Processing build script...
Installing tools...
Compiling build script...

========================================
Clean
========================================
Executing task: Clean
Cleaning directory C:/Users/Robert/Documents/GitHub/Sloader/artifacts
Finished executing task: Clean

========================================
Restore-NuGet-Packages
========================================
Executing task: Restore-NuGet-Packages
MSBuild auto-detection: using msbuild version '14.0' from 'C:\Program Files (x86)\MSBuild\14.0\bin'.
All packages listed in packages.config are already installed.
Finished executing task: Restore-NuGet-Packages

========================================
BuildTests
========================================
Executing task: BuildTests
Start Building Test: Sloader.Config.Tests
Microsoft (R) Build Engine version 14.0.25420.1
Copyright (C) Microsoft Corporation. All rights reserved.

  Sloader.Config -> C:\Users\Robert\Documents\GitHub\Sloader\artifacts\_tests\Sloader.Config.Tests\Sloader.Config.dll
  Sloader.Config.Tests -> C:\Users\Robert\Documents\GitHub\Sloader\artifacts\_tests\Sloader.Config.Tests\Sloader.Config
  .Tests.dll
Start Building Test: Sloader.Result.Tests
Microsoft (R) Build Engine version 14.0.25420.1
Copyright (C) Microsoft Corporation. All rights reserved.

  Sloader.Result -> C:\Users\Robert\Documents\GitHub\Sloader\artifacts\_tests\Sloader.Result.Tests\Sloader.Result.dll
  Sloader.Result.Tests -> C:\Users\Robert\Documents\GitHub\Sloader\artifacts\_tests\Sloader.Result.Tests\Sloader.Result
  .Tests.dll
Start Building Test: Sloader.Engine.Tests
Microsoft (R) Build Engine version 14.0.25420.1
Copyright (C) Microsoft Corporation. All rights reserved.

  Sloader.Config -> C:\Users\Robert\Documents\GitHub\Sloader\artifacts\_tests\Sloader.Engine.Tests\Sloader.Config.dll
  Sloader.Result -> C:\Users\Robert\Documents\GitHub\Sloader\artifacts\_tests\Sloader.Engine.Tests\Sloader.Result.dll
  Sloader.Engine -> C:\Users\Robert\Documents\GitHub\Sloader\artifacts\_tests\Sloader.Engine.Tests\Sloader.Engine.dll
  Sloader.Engine.Tests -> C:\Users\Robert\Documents\GitHub\Sloader\artifacts\_tests\Sloader.Engine.Tests\Sloader.Engine
  .Tests.dll
Finished executing task: BuildTests

========================================
RunTests
========================================
Executing task: RunTests
Start Running Tests
xUnit.net Console Runner (64-bit .NET 4.0.30319.42000)
  Discovering: Sloader.Config.Tests
  Discovered:  Sloader.Config.Tests
  Starting:    Sloader.Config.Tests
  Finished:    Sloader.Config.Tests
  Discovering: Sloader.Engine.Tests
  Discovered:  Sloader.Engine.Tests
  Starting:    Sloader.Engine.Tests
  Finished:    Sloader.Engine.Tests
  Discovering: Sloader.Result.Tests
  Discovered:  Sloader.Result.Tests
  Starting:    Sloader.Result.Tests
  Finished:    Sloader.Result.Tests
=== TEST EXECUTION SUMMARY ===
   Sloader.Config.Tests  Total: 23, Errors: 0, Failed: 0, Skipped: 0, Time: 0.554s
   Sloader.Engine.Tests  Total: 17, Errors: 0, Failed: 0, Skipped: 0, Time: 1.070s
   Sloader.Result.Tests  Total:  4, Errors: 0, Failed: 0, Skipped: 0, Time: 1.061s
                                --          -          -           -        ------
                   GRAND TOTAL: 44          0          0           0        2.684s (5.697s)
Finished executing task: RunTests

========================================
BuildPackages
========================================
Executing task: BuildPackages
Microsoft (R) Build Engine version 14.0.25420.1
Copyright (C) Microsoft Corporation. All rights reserved.

Build started 2017-02-19 22:00:09.
The target "_ConvertPdbFiles" listed in a BeforeTargets attribute at "C:\Program Files (x86)\MSBuild\14.0\Microsoft.Com
mon.targets\ImportAfter\Xamarin.Common.targets (34,37)" does not exist in the project, and will be ignored.
The target "_CollectPdbFiles" listed in an AfterTargets attribute at "C:\Program Files (x86)\MSBuild\14.0\Microsoft.Com
mon.targets\ImportAfter\Xamarin.Common.targets (34,70)" does not exist in the project, and will be ignored.
The target "_CollectMdbFiles" listed in a BeforeTargets attribute at "C:\Program Files (x86)\MSBuild\14.0\Microsoft.Com
mon.targets\ImportAfter\Xamarin.Common.targets (41,38)" does not exist in the project, and will be ignored.
The target "_CopyMdbFiles" listed in an AfterTargets attribute at "C:\Program Files (x86)\MSBuild\14.0\Microsoft.Common
.targets\ImportAfter\Xamarin.Common.targets (41,71)" does not exist in the project, and will be ignored.
Project "C:\Users\Robert\Documents\GitHub\Sloader\src\Sloader.Config\Sloader.Config.csproj" on node 1 (Build target(s))
.
GenerateTargetFrameworkMonikerAttribute:
Skipping target "GenerateTargetFrameworkMonikerAttribute" because all output files are up-to-date with respect to the i
nput files.
CoreCompile:
Skipping target "CoreCompile" because all output files are up-to-date with respect to the input files.
CopyFilesToOutputDirectory:
  Sloader.Config -> C:\Users\Robert\Documents\GitHub\Sloader\src\Sloader.Config\bin\Release\Sloader.Config.dll
Done Building Project "C:\Users\Robert\Documents\GitHub\Sloader\src\Sloader.Config\Sloader.Config.csproj" (Build target
(s)).


Build succeeded.
    0 Warning(s)
    0 Error(s)

Time Elapsed 00:00:00.22
Attempting to build package from 'Sloader.Config.csproj'.
MSBuild auto-detection: using msbuild version '14.0' from 'C:\Program Files (x86)\MSBuild\14.0\bin'.
Packing files from 'C:\Users\Robert\Documents\GitHub\Sloader\src\Sloader.Config\bin\Release'.
Using 'Sloader.Config.nuspec' for metadata.
Found packages.config. Using packages listed as dependencies
Successfully created package 'C:\Users\Robert\Documents\GitHub\Sloader\artifacts\Sloader.Config.0.2.1.nupkg'.
Microsoft (R) Build Engine version 14.0.25420.1
Copyright (C) Microsoft Corporation. All rights reserved.

Build started 2017-02-19 22:00:10.
The target "_ConvertPdbFiles" listed in a BeforeTargets attribute at "C:\Program Files (x86)\MSBuild\14.0\Microsoft.Com
mon.targets\ImportAfter\Xamarin.Common.targets (34,37)" does not exist in the project, and will be ignored.
The target "_CollectPdbFiles" listed in an AfterTargets attribute at "C:\Program Files (x86)\MSBuild\14.0\Microsoft.Com
mon.targets\ImportAfter\Xamarin.Common.targets (34,70)" does not exist in the project, and will be ignored.
The target "_CollectMdbFiles" listed in a BeforeTargets attribute at "C:\Program Files (x86)\MSBuild\14.0\Microsoft.Com
mon.targets\ImportAfter\Xamarin.Common.targets (41,38)" does not exist in the project, and will be ignored.
The target "_CopyMdbFiles" listed in an AfterTargets attribute at "C:\Program Files (x86)\MSBuild\14.0\Microsoft.Common
.targets\ImportAfter\Xamarin.Common.targets (41,71)" does not exist in the project, and will be ignored.
Project "C:\Users\Robert\Documents\GitHub\Sloader\src\Sloader.Result\Sloader.Result.csproj" on node 1 (Build target(s))
.
GenerateTargetFrameworkMonikerAttribute:
Skipping target "GenerateTargetFrameworkMonikerAttribute" because all output files are up-to-date with respect to the i
nput files.
CoreCompile:
Skipping target "CoreCompile" because all output files are up-to-date with respect to the input files.
CopyFilesToOutputDirectory:
  Sloader.Result -> C:\Users\Robert\Documents\GitHub\Sloader\src\Sloader.Result\bin\Release\Sloader.Result.dll
Done Building Project "C:\Users\Robert\Documents\GitHub\Sloader\src\Sloader.Result\Sloader.Result.csproj" (Build target
(s)).


Build succeeded.
    0 Warning(s)
    0 Error(s)

Time Elapsed 00:00:00.24
Attempting to build package from 'Sloader.Result.csproj'.
MSBuild auto-detection: using msbuild version '14.0' from 'C:\Program Files (x86)\MSBuild\14.0\bin'.
Packing files from 'C:\Users\Robert\Documents\GitHub\Sloader\src\Sloader.Result\bin\Release'.
Using 'Sloader.Result.nuspec' for metadata.
Found packages.config. Using packages listed as dependencies
Successfully created package 'C:\Users\Robert\Documents\GitHub\Sloader\artifacts\Sloader.Result.0.2.1.nupkg'.
Microsoft (R) Build Engine version 14.0.25420.1
Copyright (C) Microsoft Corporation. All rights reserved.

Build started 2017-02-19 22:00:12.
The target "_ConvertPdbFiles" listed in a BeforeTargets attribute at "C:\Program Files (x86)\MSBuild\14.0\Microsoft.Com
mon.targets\ImportAfter\Xamarin.Common.targets (34,37)" does not exist in the project, and will be ignored.
The target "_CollectPdbFiles" listed in an AfterTargets attribute at "C:\Program Files (x86)\MSBuild\14.0\Microsoft.Com
mon.targets\ImportAfter\Xamarin.Common.targets (34,70)" does not exist in the project, and will be ignored.
The target "_CollectMdbFiles" listed in a BeforeTargets attribute at "C:\Program Files (x86)\MSBuild\14.0\Microsoft.Com
mon.targets\ImportAfter\Xamarin.Common.targets (41,38)" does not exist in the project, and will be ignored.
The target "_CopyMdbFiles" listed in an AfterTargets attribute at "C:\Program Files (x86)\MSBuild\14.0\Microsoft.Common
.targets\ImportAfter\Xamarin.Common.targets (41,71)" does not exist in the project, and will be ignored.
The target "_ConvertPdbFiles" listed in a BeforeTargets attribute at "C:\Program Files (x86)\MSBuild\14.0\Microsoft.Com
mon.targets\ImportAfter\Xamarin.Common.targets (34,37)" does not exist in the project, and will be ignored.
The target "_CollectPdbFiles" listed in an AfterTargets attribute at "C:\Program Files (x86)\MSBuild\14.0\Microsoft.Com
mon.targets\ImportAfter\Xamarin.Common.targets (34,70)" does not exist in the project, and will be ignored.
The target "_CollectMdbFiles" listed in a BeforeTargets attribute at "C:\Program Files (x86)\MSBuild\14.0\Microsoft.Com
mon.targets\ImportAfter\Xamarin.Common.targets (41,38)" does not exist in the project, and will be ignored.
The target "_CopyMdbFiles" listed in an AfterTargets attribute at "C:\Program Files (x86)\MSBuild\14.0\Microsoft.Common
.targets\ImportAfter\Xamarin.Common.targets (41,71)" does not exist in the project, and will be ignored.
Project "C:\Users\Robert\Documents\GitHub\Sloader\src\Sloader.Engine\Sloader.Engine.csproj" on node 1 (Build target(s))
.
Project "C:\Users\Robert\Documents\GitHub\Sloader\src\Sloader.Engine\Sloader.Engine.csproj" (1) is building "C:\Users\R
obert\Documents\GitHub\Sloader\src\Sloader.Config\Sloader.Config.csproj" (2) on node 1 (default targets).
GenerateTargetFrameworkMonikerAttribute:
Skipping target "GenerateTargetFrameworkMonikerAttribute" because all output files are up-to-date with respect to the i
nput files.
CoreCompile:
Skipping target "CoreCompile" because all output files are up-to-date with respect to the input files.
CopyFilesToOutputDirectory:
  Sloader.Config -> C:\Users\Robert\Documents\GitHub\Sloader\src\Sloader.Config\bin\Release\Sloader.Config.dll
Done Building Project "C:\Users\Robert\Documents\GitHub\Sloader\src\Sloader.Config\Sloader.Config.csproj" (default targ
ets).

The target "_ConvertPdbFiles" listed in a BeforeTargets attribute at "C:\Program Files (x86)\MSBuild\14.0\Microsoft.Com
mon.targets\ImportAfter\Xamarin.Common.targets (34,37)" does not exist in the project, and will be ignored.
The target "_CollectPdbFiles" listed in an AfterTargets attribute at "C:\Program Files (x86)\MSBuild\14.0\Microsoft.Com
mon.targets\ImportAfter\Xamarin.Common.targets (34,70)" does not exist in the project, and will be ignored.
The target "_CollectMdbFiles" listed in a BeforeTargets attribute at "C:\Program Files (x86)\MSBuild\14.0\Microsoft.Com
mon.targets\ImportAfter\Xamarin.Common.targets (41,38)" does not exist in the project, and will be ignored.
The target "_CopyMdbFiles" listed in an AfterTargets attribute at "C:\Program Files (x86)\MSBuild\14.0\Microsoft.Common
.targets\ImportAfter\Xamarin.Common.targets (41,71)" does not exist in the project, and will be ignored.
Project "C:\Users\Robert\Documents\GitHub\Sloader\src\Sloader.Engine\Sloader.Engine.csproj" (1) is building "C:\Users\R
obert\Documents\GitHub\Sloader\src\Sloader.Result\Sloader.Result.csproj" (3) on node 1 (default targets).
GenerateTargetFrameworkMonikerAttribute:
Skipping target "GenerateTargetFrameworkMonikerAttribute" because all output files are up-to-date with respect to the i
nput files.
CoreCompile:
Skipping target "CoreCompile" because all output files are up-to-date with respect to the input files.
CopyFilesToOutputDirectory:
  Sloader.Result -> C:\Users\Robert\Documents\GitHub\Sloader\src\Sloader.Result\bin\Release\Sloader.Result.dll
Done Building Project "C:\Users\Robert\Documents\GitHub\Sloader\src\Sloader.Result\Sloader.Result.csproj" (default targ
ets).

BclBuildEnsureBindingRedirects:
Skipping target "BclBuildEnsureBindingRedirects" because all output files are up-to-date with respect to the input file
s.
GenerateTargetFrameworkMonikerAttribute:
Skipping target "GenerateTargetFrameworkMonikerAttribute" because all output files are up-to-date with respect to the i
nput files.
CoreCompile:
Skipping target "CoreCompile" because all output files are up-to-date with respect to the input files.
_CopyAppConfigFile:
Skipping target "_CopyAppConfigFile" because all output files are up-to-date with respect to the input files.
CopyFilesToOutputDirectory:
  Sloader.Engine -> C:\Users\Robert\Documents\GitHub\Sloader\src\Sloader.Engine\bin\Release\Sloader.Engine.dll
Done Building Project "C:\Users\Robert\Documents\GitHub\Sloader\src\Sloader.Engine\Sloader.Engine.csproj" (Build target
(s)).


Build succeeded.
    0 Warning(s)
    0 Error(s)

Time Elapsed 00:00:00.54
Attempting to build package from 'Sloader.Engine.csproj'.
MSBuild auto-detection: using msbuild version '14.0' from 'C:\Program Files (x86)\MSBuild\14.0\bin'.
Packing files from 'C:\Users\Robert\Documents\GitHub\Sloader\src\Sloader.Engine\bin\Release'.
Using 'Sloader.Engine.nuspec' for metadata.
Found packages.config. Using packages listed as dependencies
Successfully created package 'C:\Users\Robert\Documents\GitHub\Sloader\artifacts\Sloader.Engine.0.2.1.nupkg'.
Finished executing task: BuildPackages

Task                          Duration
--------------------------------------------------
Clean                         00:00:00.1083837
Restore-NuGet-Packages        00:00:00.7808530
BuildTests                    00:00:02.6296445
RunTests                      00:00:05.9397822
BuildPackages                 00:00:05.2679058
--------------------------------------------------
Total:                        00:00:14.7265692

BrowserLink causes an error in a new ASP.NET Core web

09.02.2017 18:00:00 | Jürgen Gutsch

Using the latest bits of Visual Studio 2017 and the latest SDK of .NET Core, you will possibly get an error while starting your ASP.NET Core web app in Visual Studio 2017 or via dotnet run.

In Visual Studio 2017 the browser opens on pressing F5, but with weird HTML code in the address bar. Debugging starts and stops a few seconds later. Using the console you'll get a more meaningful error message:

This error means that something is missing the System.Runtime assembly ("System.Runtime, Version=4.2.0.0"), which is not referenced or used directly anywhere. I had a deeper look into the NuGet references and couldn't find it.

Because I have often had problems with BrowserLink in the past, I removed the NuGet reference from the project and everything worked fine. I added it again, to be sure that the removal didn't clean up anything else. The error happened again.

It seems that the current version of BrowserLink references a library which is not supported by the app. Remove it, if you get the same or a similar error.

BrowserLink in general is a pretty cool feature: it refreshes the browser magically if anything changes on the server. With this tool, you are able to edit your CSS files and preview the changes directly in the browser without doing a manual refresh. It is a Visual Studio add-in and uses NuGet packages to extend your app to support it.
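For reference, this is roughly how the default ASP.NET Core 1.x templates wire BrowserLink up (a sketch based on the template code; the package was called Microsoft.VisualStudio.Web.BrowserLink.Loader at that time). Removing the package reference and this call removes the feature:

public void Configure(IApplicationBuilder app, IHostingEnvironment env)
{
    if (env.IsDevelopment())
    {
        // provided by the BrowserLink NuGet package
        app.UseBrowserLink();
    }

    app.UseMvc();
}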

Build & run xUnit tests with Cake

08.02.2017 01:45:00 |

Last year I already covered the basic usage of Cake, which stands for “C# Make”. This time we want to build and run xUnit tests with Cake.

Scenario


Let's say we have this project structure. Be aware that all our test projects have the suffix "Tests" in the project name.

The files are organized like this, so all "Tests" are in a "tests" folder and the actual code is under "src":

src/Sloader.Config
src/Sloader.Engine
src/Sloader.Hosts.Console
src/Sloader.Result
tests/Sloader.Config.Tests
tests/Sloader.Engine.Tests
tests/Sloader.Result.Tests
.gitignore
build.cake
build.ps1
LICENSE
Sloader.sln

Goal

Now we want to build all test projects and run them with the xUnit console runner. Be aware that there are multiple ways of doing this, but I found this approach quite good.

build.cake

#tool "nuget:?package=xunit.runner.console"
//////////////////////////////////////////////////////////////////////
// ARGUMENTS
//////////////////////////////////////////////////////////////////////

var target = Argument("target", "Default");
var configuration = Argument("configuration", "Release");

//////////////////////////////////////////////////////////////////////
// PREPARATION
//////////////////////////////////////////////////////////////////////

// Define directories.
var artifactsDir  = Directory("./artifacts/");
var rootAbsoluteDir = MakeAbsolute(Directory("./")).FullPath;

//////////////////////////////////////////////////////////////////////
// TASKS
//////////////////////////////////////////////////////////////////////

Task("Clean")
    .Does(() =>
{
    CleanDirectory(artifactsDir);
});

Task("Restore-NuGet-Packages")
    .IsDependentOn("Clean")
    .Does(() =>
{
    NuGetRestore("./Sloader.sln");
});

Task("Build")
    .IsDependentOn("Restore-NuGet-Packages")
    .Does(() =>
{

     
});

Task("BuildTests")
    .IsDependentOn("Restore-NuGet-Packages")
    .Does(() =>
{
	var parsedSolution = ParseSolution("./Sloader.sln");

	foreach(var project in parsedSolution.Projects)
	{
	
	if(project.Name.EndsWith(".Tests"))
		{
        Information("Start Building Test: " + project.Name);

        MSBuild(project.Path, new MSBuildSettings()
                .SetConfiguration("Debug")
                .SetMSBuildPlatform(MSBuildPlatform.Automatic)
                .SetVerbosity(Verbosity.Minimal)
                .WithProperty("SolutionDir", @".\")
                .WithProperty("OutDir", rootAbsoluteDir + @"\artifacts\_tests\" + project.Name + @"\"));
		}
	
	}    

});

Task("RunTests")
    .IsDependentOn("BuildTests")
    .Does(() =>
{
    Information("Start Running Tests");
    XUnit2("./artifacts/_tests/**/*.Tests.dll");
});

//////////////////////////////////////////////////////////////////////
// TASK TARGETS
//////////////////////////////////////////////////////////////////////

Task("Default")
    .IsDependentOn("RunTests");

//////////////////////////////////////////////////////////////////////
// EXECUTION
//////////////////////////////////////////////////////////////////////

RunTarget(target);

Explanation: BuildTests?

The default target "Default" will trigger "RunTests", which depends on "BuildTests".

Inside the "BuildTests" target we use a handy helper from Cake: we parse the .sln file and search for all "Tests" projects. With that information we can build each test project individually and don't have to worry about "overlapping" files. The output of this build is saved to "artifacts/_tests".

Running xUnit

To run xUnit we have to include the runner at the top of the cake file:

#tool "nuget:?package=xunit.runner.console"

Now we can just invoke XUnit2, scan for all .Tests.dll files, and we are done:

XUnit2("./artifacts/_tests/**/*.Tests.dll");
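If you need more control, the XUnit2 alias also accepts an XUnit2Settings object - a small sketch (the settings values are just examples):

XUnit2("./artifacts/_tests/**/*.Tests.dll", new XUnit2Settings {
    Parallelism = ParallelismOption.All,
    HtmlReport = true,
    OutputDirectory = "./artifacts"
});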

Result

The console output should make the flow pretty clear:

PS C:\Users\Robert\Documents\GitHub\Sloader> .\build.ps1
Preparing to run build script...
Running build script...
Analyzing build script...
Processing build script...
Installing tools...
Compiling build script...

========================================
Clean
========================================
Executing task: Clean
Creating directory C:/Users/Robert/Documents/GitHub/Sloader/artifacts
Finished executing task: Clean

========================================
Restore-NuGet-Packages
========================================
Executing task: Restore-NuGet-Packages
MSBuild auto-detection: using msbuild version '14.0' from 'C:\Program Files (x86)\MSBuild\14.0\bin'.
All packages listed in packages.config are already installed.
Finished executing task: Restore-NuGet-Packages

========================================
BuildTests
========================================
Executing task: BuildTests
Start Building Test: Sloader.Config.Tests
Microsoft (R) Build Engine version 14.0.25420.1
Copyright (C) Microsoft Corporation. All rights reserved.

  Sloader.Config -> C:\Users\Robert\Documents\GitHub\Sloader\artifacts\_tests\Sloader.Config.Tests\Sloader.Config.dll
  Sloader.Config.Tests -> C:\Users\Robert\Documents\GitHub\Sloader\artifacts\_tests\Sloader.Config.Tests\Sloader.Config
  .Tests.dll
Start Building Test: Sloader.Result.Tests
Microsoft (R) Build Engine version 14.0.25420.1
Copyright (C) Microsoft Corporation. All rights reserved.

  Sloader.Result -> C:\Users\Robert\Documents\GitHub\Sloader\artifacts\_tests\Sloader.Result.Tests\Sloader.Result.dll
  Sloader.Result.Tests -> C:\Users\Robert\Documents\GitHub\Sloader\artifacts\_tests\Sloader.Result.Tests\Sloader.Result
  .Tests.dll
Start Building Test: Sloader.Engine.Tests
Microsoft (R) Build Engine version 14.0.25420.1
Copyright (C) Microsoft Corporation. All rights reserved.

  Sloader.Config -> C:\Users\Robert\Documents\GitHub\Sloader\artifacts\_tests\Sloader.Engine.Tests\Sloader.Config.dll
  Sloader.Result -> C:\Users\Robert\Documents\GitHub\Sloader\artifacts\_tests\Sloader.Engine.Tests\Sloader.Result.dll
  Sloader.Engine -> C:\Users\Robert\Documents\GitHub\Sloader\artifacts\_tests\Sloader.Engine.Tests\Sloader.Engine.dll
  Sloader.Engine.Tests -> C:\Users\Robert\Documents\GitHub\Sloader\artifacts\_tests\Sloader.Engine.Tests\Sloader.Engine
  .Tests.dll
Finished executing task: BuildTests

========================================
RunTests
========================================
Executing task: RunTests
Start Running Tests
xUnit.net Console Runner (64-bit .NET 4.0.30319.42000)
  Discovering: Sloader.Config.Tests
  Discovered:  Sloader.Config.Tests
  Starting:    Sloader.Config.Tests
  Finished:    Sloader.Config.Tests
  Discovering: Sloader.Engine.Tests
  Discovered:  Sloader.Engine.Tests
  Starting:    Sloader.Engine.Tests
  Finished:    Sloader.Engine.Tests
  Discovering: Sloader.Result.Tests
  Discovered:  Sloader.Result.Tests
  Starting:    Sloader.Result.Tests
  Finished:    Sloader.Result.Tests
=== TEST EXECUTION SUMMARY ===
   Sloader.Config.Tests  Total: 22, Errors: 0, Failed: 0, Skipped: 0, Time: 0.342s
   Sloader.Engine.Tests  Total:  9, Errors: 0, Failed: 0, Skipped: 0, Time: 0.752s
   Sloader.Result.Tests  Total:  4, Errors: 0, Failed: 0, Skipped: 0, Time: 0.475s
                                --          -          -           -        ------
                   GRAND TOTAL: 35          0          0           0        1.569s (3.115s)
Finished executing task: RunTests

========================================
Default
========================================
Executing task: Default
Finished executing task: Default

Task                          Duration
--------------------------------------------------
Clean                         00:00:00.0155255
Restore-NuGet-Packages        00:00:00.5065704
BuildTests                    00:00:02.1590662
RunTests                      00:00:03.2443534
Default                       00:00:00.0061325
--------------------------------------------------
Total:                        00:00:05.9316480

Using Dependency Injection in .NET Core Console Apps

07.02.2017 18:00:00 | Jürgen Gutsch

The Dependency Injection (DI) container used in ASP.NET Core is not limited to ASP.NET Core. You are able to use it in any kind of .NET project. This post shows how to use it in a .NET Core console application.

Create a console application using the dotnet CLI or Visual Studio 2017. The DI container is not available by default, but the IServiceProvider interface is. If you want to use a custom or third-party DI container, you should provide an implementation of IServiceProvider as an encapsulation of that DI container.

In this post I want to use the DI container used in the ASP.NET Core projects. This needs an additional NuGet package "Microsoft.Extensions.DependencyInjection" (currently it is version 1.1.0).

Since this library is a .NET Standard library, it should also work in a .NET 4.6 application. You just need to add a reference to "Microsoft.Extensions.DependencyInjection".

After adding that package we can start to use it. I created two simple classes which are dependent on each other, to show how it works in a simple way:

public class Service1 : IDisposable
{
  private readonly Service2 _child;
  public Service1(Service2 child)
  {
    Console.WriteLine("Constructor Service1");
    _child = child;
  }

  public void Dispose()
  {
    Console.WriteLine("Dispose Service1");
    _child.Dispose();
  }
}

public class Service2 : IDisposable
{
  public Service2()
  {
    Console.WriteLine("Constructor Service2");
  }

  public void Dispose()
  {
    Console.WriteLine("Dispose Service2");
  }
}

Usually you would also use interfaces and create the relationship between these two classes based on them, instead of on the concrete implementations. Anyway, we just want to test if it works.

In the static void Main of the console app, we create a new ServiceCollection and register the classes with a transient lifetime:

var services = new ServiceCollection();
services.AddTransient<Service2>();
services.AddTransient<Service1>();

This ServiceCollection comes from the added NuGet package. Your favorite DI container possibly uses another way to register the services. You could now pass the ServiceCollection on to additional components which want to register some more services, in the same way ASP.NET Core does it with the AddSomething (e.g. AddMvc()) extension methods.

Now we need to create the ServiceProvider out of that collection:

var provider = services.BuildServiceProvider();

We could also share the ServiceProvider within our application to retrieve the services, but the proper way is to use it only at a single entry point:

using (var service1 = provider.GetService<Service1>())
{
  // do something with the class
}

Now, let's start the console app and look at the console output:
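Based on the registrations above, the output should look like this:

Constructor Service2
Constructor Service1
Dispose Service1
Dispose Service2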

As you can see, this DI container is working in any .NET Core app.

New server, new luck - the blog move is complete

06.02.2017 18:08:54 | Hans-Peter Schelian

After it had been "very" quiet around my blog for a long time, I want to use the just completed move of my blog to a new server as an occasion to start blogging regularly again. With the move to the new server I also took care of the certificate topic. From now on this blog uses a certificate from Letsencrypt.

Visual Studio: NDepend v2017 released

05.02.2017 13:23:43 | Steffen Steinbrecher

NDepend has been released in version v2017 and offers great new features: Smart Technical Debt Estimation - with this function you can see at a glance how much time it would cost to fix critical code spots/functions or to perform a refactoring. Suppose a developer implements a new function and the NDepend analysis would now show that this […]

C#: Calling OData operations with parameters

05.02.2017 11:56:19 | Steffen Steinbrecher

OData supports user-defined operations, so-called actions and functions. A function must return data and normally has no side effects, i.e. a function should not modify data and should only read it (GET). Actions, on the other hand, can perform CRUD operations on entities, i.e. you can define custom actions which execute CREATE, UPDATE or DELETE operations on entities, if […]

C#: OData, SAP NW Gateway and CSRF tokens

01.02.2017 15:53:09 | Steffen Steinbrecher

For all modifying requests (PUT, POST and DELETE) from a client against an SAP Netweaver OData service, the client has to pass along a corresponding CSRF (Cross Site Request Forgery) token. This post shows how to request such a token and then pass it along with the OData requests. CSRF - Cross Site Request Forgery First of all, a few […]

My technology radar for 2017

25.01.2017 07:30:47 | Johnny Graber

With the new year it is time again for an update of my technology radar. After the editions for 2014 and 2015 I skipped an edition for 2016. The big announcements from Microsoft (like ASP.Net vNext) were delayed, and just updating the year number was not enough for me. My new technology radar As before, the … Continue reading Mein Technologieradar für 2017

Building a home robot: Part 5 - arms and hands

13.01.2017 19:00:00 | Daniel Springwald

(see all parts of "building a home robot")

I wanted Roobert to get two identical hands with separately movable fingers.

Because of this (and the small size) I decided to use a commercial construction kit instead of designing and constructing the hands on my own.

Although it was a construction kit, assembling the hands was fun for hours:

Each arm is constructed from the hand construction kit, 3 servos, 3d printed servo brackets and an I2C servo controller. Because the servo controller seemed to be unable to shut down the servo power, I attached a relay for each arm to turn the servo power on/off.

The servo holders for the upper arm parts are 3d printed:

The complete arms:

The right arm:

A roobert-hand-assembling-workplace :-)

GitHub API: Create or update files

03.01.2017 01:45:00 |

This blogpost covers a pretty basic GitHub topic: creating and updating content on GitHub. Of course, there are many ways to do it - e.g. you could do the full Git ceremony and it would work with all Git hosts, but in my case I just wanted to target the official GitHub API.

Prerequisite: A GitHub User, Repo and Token

To use this code you will need write access to a GitHub repository and you should have a valid GitHub token.

Code

The simplest way to communicate with the GitHub API is to use the Octokit SDK (from GitHub).

Description: Inside the try block we try to get the target file; if it is already committed in the repo, the API will return the last commit SHA.

With this SHA it is possible to create a new commit to do the actual update.

If the file was not found, we create the file. I'm not a huge fan of this try/catch block, but I didn't find any other way to check whether the file is committed or not (please give me a hint if this is wrong ;))

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using Octokit;

namespace CreateOrUpdateGitHubFile
{
    class Program
    {
        static void Main(string[] args)
        {
            Task.Run(async () =>
            {
                var ghClient = new GitHubClient(new ProductHeaderValue("Octokit-Test"));
                ghClient.Credentials = new Credentials("ACCESS-TOKEN");

                // github variables
                var owner = "OWNER";
                var repo = "REPO";
                var branch = "BRANCH";

                var targetFile = "_data/test.txt";

                try
                {
                    // try to get the file (and with the file the last commit sha)
                    var existingFile = await ghClient.Repository.Content.GetAllContentsByRef(owner, repo, targetFile, branch);

                    // update the file
                    var updateChangeSet = await ghClient.Repository.Content.UpdateFile(owner, repo, targetFile,
                       new UpdateFileRequest("API File update", "Hello Universe! " + DateTime.UtcNow, existingFile.First().Sha, branch));
                }
                catch (Octokit.NotFoundException)
                {
                    // if file is not found, create it
                    var createChangeSet = await ghClient.Repository.Content.CreateFile(owner,repo, targetFile, new CreateFileRequest("API File creation", "Hello Universe! " + DateTime.UtcNow, branch));
                }

                
                
            }).Wait();
        }
    }
}

The demo code is also available on GitHub.

Hope this helps.

DbProviderFactories: Write database agnostic ADO.NET code

31.12.2016 16:00:00 |

Recently I needed to write a module that needs to connect to a wide range of SQL databases, e.g. MySQL, MS SQL, Oracle etc.

Problem: Most providers will use their concrete classes

If you look at the C# example on the MySQL dev page you will see the MySql namespace and classes:

MySql.Data.MySqlClient.MySqlConnection conn;
string myConnectionString;

myConnectionString = "server=127.0.0.1;uid=root;" +
    "pwd=12345;database=test;";

try
{
    conn = new MySql.Data.MySqlClient.MySqlConnection();
    conn.ConnectionString = myConnectionString;
    conn.Open();
}
catch (MySql.Data.MySqlClient.MySqlException ex)
{
    MessageBox.Show(ex.Message);
}

The same classes will probably not work for an MS SQL database.

“Solution”: Use the DbProviderFactories

For example, if you install the MySql NuGet package you will also get this little enhancement to your app.config:

<system.data>
  <DbProviderFactories>
    <remove invariant="MySql.Data.MySqlClient" />
    <add name="MySQL Data Provider" invariant="MySql.Data.MySqlClient" description=".Net Framework Data Provider for MySQL" type="MySql.Data.MySqlClient.MySqlClientFactory, MySql.Data, Version=6.9.9.0, Culture=neutral, PublicKeyToken=c5687fc88969c44d" />
  </DbProviderFactories>
</system.data>

Now we can get a reference to the MySql client via the DbProviderFactories:

using System;
using System.Data;
using System.Data.Common;

namespace DbProviderFactoryStuff
{
    class Program
    {
        static void Main(string[] args)
        {
            try
            {
                Console.WriteLine("All registered DbProviderFactories:");
                var allFactoryClasses = DbProviderFactories.GetFactoryClasses();

                foreach (DataRow row in allFactoryClasses.Rows)
                {
                    Console.WriteLine(row[0] + ": " + row[2]);
                }

                Console.WriteLine();
                Console.WriteLine("Try to access a MySql DB:");

                DbProviderFactory dbf = DbProviderFactories.GetFactory("MySql.Data.MySqlClient");
                using (DbConnection dbcn = dbf.CreateConnection())
                {
                    dbcn.ConnectionString = "Server=localhost;Database=testdb;Uid=root;Pwd=Pass1word;";
                    dbcn.Open();
                    using (DbCommand dbcmd = dbcn.CreateCommand())
                    {
                        dbcmd.CommandType = CommandType.Text;
                        dbcmd.CommandText = "SHOW TABLES;";

                        // parameter...
                        //var foo = dbcmd.CreateParameter();
                        //foo.ParameterName = "...";
                        //foo.Value = "...";

                        using (DbDataReader dbrdr = dbcmd.ExecuteReader())
                        {
                            while (dbrdr.Read())
                            {
                                Console.WriteLine(dbrdr[0]);
                            }
                        }
                    }
                }
            }
            catch (Exception exc)
            {
                Console.WriteLine(exc.Message);
            }

            Console.ReadLine();

        }
    }
}

The most important line is this one:

DbProviderFactory dbf = DbProviderFactories.GetFactory("MySql.Data.MySqlClient");

Now with the DbProviderFactory from the MySql client we can access the MySql database without using any MySql-specific classes.

There are a couple of "built-in" DB providers registered, like the MS SQL provider or the ODBC stuff.

The above code will output something like this:

All registered DbProviderFactories:
Odbc Data Provider: System.Data.Odbc
OleDb Data Provider: System.Data.OleDb
OracleClient Data Provider: System.Data.OracleClient
SqlClient Data Provider: System.Data.SqlClient
Microsoft SQL Server Compact Data Provider 4.0: System.Data.SqlServerCe.4.0
MySQL Data Provider: MySql.Data.MySqlClient

Other solutions

Of course there are other solutions - some OR mappers like Entity Framework have a provider model which might also work, but this one here is a pretty basic approach.

SQL Commands

The tricky bit here is that you need to make sure that your SQL commands work on your target database - this is not a silver bullet, it just lets you connect to and execute SQL commands against any 'registered' database.
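The same applies to parameters, which the demo code above only hints at in comments. A provider-agnostic parameter can be created like this (a sketch reusing the dbcn connection from the sample above; the users table is made up, and the parameter marker, @ here, can also differ between providers):

using (DbCommand dbcmd = dbcn.CreateCommand())
{
    dbcmd.CommandText = "SELECT name FROM users WHERE id = @id";

    // CreateParameter() returns the provider-specific DbParameter implementation
    DbParameter idParam = dbcmd.CreateParameter();
    idParam.ParameterName = "@id";
    idParam.Value = 42;
    dbcmd.Parameters.Add(idParam);

    using (DbDataReader dbrdr = dbcmd.ExecuteReader())
    {
        while (dbrdr.Read())
        {
            Console.WriteLine(dbrdr[0]);
        }
    }
}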

The full demo code is also available on GitHub.

Hope this helps.

Building a home robot: Part 4 - moving the head up and down

26.12.2016 11:00:00 | Daniel Springwald

(see all parts of "building a home robot")

Because the monitor and the Raspberry Pi are placed in the front of the head, most of the weight is there. So the motors for the up/down movement have to be powerful enough to handle this.

I decided to use two stepper motors with built-in gearboxes. They also serve as the axes for the head movement. I mounted them back to back into a 3d printed box, so they move in opposite directions.

Putting together neck and front head:

Assembling including cables, Raspberry Pi and electronics:

The two stepper motors have a lot of power - but still not enough to lift the head. To fix this problem I placed a counterbalance made from lead in the back of the head.

The first test with provisional counterbalance:

After some hours of testing another problem occurred: the mechanical connection between the two motors and the 3d printed part was very small.

Only 4mm of PLA to connect to the whole head's weight and movement:

And the PLA material was not strong enough to handle this for a long time - so the play in the movement increased.

To solve this I used two small parts of aluminum with holes matching the axes of both stepper motors.

Goodbye Techblog

20.12.2016 17:12:00 | Martin Hey

I have always enjoyed blogging; it was a way for me to share my knowledge. And often enough, while searching for solutions to problems, I found my own posts and realized that I had run into the same problem before - so there was a bit of self-interest involved, and I also used the blog as a private knowledge collection indexed by Google.

But in the last two years less and less time has been left for blogging (6 posts in 2016 is not much), and the topics I deal with are changing as well - I am still close to developer topics, but often enough it is rather business topics that keep me busy.

I will stay active in the developer community and will certainly keep blogging elsewhere. Only this rather privately run blog will be deleted at the end of 2016.

In the spirit of the very first post from 2007: goodbye world

A small library to support the CQS pattern.

11.12.2016 18:00:00 | Jürgen Gutsch

For the last few years, I have loved using the Command and Query Separation (CQS) pattern. Using this pattern in every new project requires having the same infrastructure classes in all of them. This is why I started to create a small and reusable library, which now supports ASP.NET Core and is written to match .NET Standard 1.6.

About that CQS

The idea behind CQS is to separate the query part (the read part / fetching-the-data part) from the command part (the write part / doing-things-with-the-data part). This enables you to optimize both parts in different ways. You are able to split the data flow into differently optimized pipes.

From my perspective, the other most important benefit is that this approach forces you to split your business logic into pretty small pieces of code. This is because each command and each query only does one single thing:

  • fetching a specific set of data
  • executing a specific command

E.g. if you press a button, you probably want to save some data. You will create a SaveDataCommand with the data to save in it. You'll pass that command to the CommandDispatcher, which will delegate the command to the right CommandHandler, which in turn is responsible only for saving that specific data to the database, or whatever you want to do with that data. You think you'll also add a log entry with the same command? No problem: just create another CommandHandler using the same command. With this approach you'll have two small components, one to save the data and another one to add a log entry, which are completely independent and can be tested separately.
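A minimal sketch of such a second handler, written against the ICommandHandler interface shown later in this post (SaveDataCommand and its Data property are made up for this example):

public class LogSaveDataCommandHandler : ICommandHandler<SaveDataCommand>
{
    public void Handle(SaveDataCommand command)
    {
        // only writes the log entry; saving the data is done by another handler
        Console.WriteLine($"Data save requested at {DateTime.UtcNow}: {command.Data}");
    }
}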

What about fetching the data? Just create a "Query" with the data used as filter criteria. Pass the query to the QueryProcessor, which delegates it to the right QueryHandler. In this QueryHandler, you are able to select the data from the data source, map it to the expected result and return it.

Sounds easy? It really is that easy.

Each handler, both the QueryHandlers and the CommandHandlers, is an isolated piece of code, if you use dependency injection in it. This means unit tests are as easy as the implementation itself.

What is inside the library?

This library contains a CommandDispatcher and a QueryProcessor to delegate commands and queries to the right handlers. The library helps you to write your own commands and queries, as well as your own command handlers and query handlers. There are two main namespaces inside the library: Command and Query.

The Command part contains the CommandDispatcher, an ICommand interface and two more interfaces to define command handlers (ICommandHandler<in TCommand>) and async command handlers (IAsyncCommandHandler<in TCommand>).

The CommandDispatcher interface looks like this:

public interface ICommandDispatcher
{
    void DispatchCommand<TCommand>(TCommand command) where TCommand : ICommand;
    Task DispatchCommandAsync<TCommand>(TCommand command) where TCommand : ICommand;
}

The Query part contains the QueryProcessor and a generic IQuery<TResult> interface, which defines the result type in its generic argument. It also contains two more interfaces to define query handlers (IHandleQuery<in TQuery, TResult>) and async query handlers (IHandleQueryAsync<in TQuery, TResult>). The IQueryProcessor interface looks like this:

public interface IQueryProcessor
{
    TResult Process<TResult>(IQuery<TResult> query);
    Task<TResult> ProcessAsync<TResult>(IQuery<TResult> query);
}

Using the library

For the following examples, I'll reuse the speaker database I already used in previous blog posts.

After you have installed the library using NuGet, you need to register the QueryProcessor and the CommandDispatcher with the dependency injection. You can do it manually in the ConfigureServices method or just by using AddCqsEngine():

public void ConfigureServices(IServiceCollection services)
{
    services.AddMvc();

    services.AddCqsEngine();

    services.AddQueryHandlers();
    services.AddCommandHandlers();
}

The methods AddQueryHandlers and AddCommandHandlers are just methods to encapsulate the registration of your handlers, and are probably written by you as a user of this library. Such a method could look like this:

public static IServiceCollection AddQueryHandlers(this IServiceCollection services)
{
    services.AddTransient<IHandleQueryAsync<AllSpeakersQuery, IEnumerable<Speaker>>, SpeakerQueryHandler>();
    services.AddTransient<IHandleQueryAsync<SpeakerByIdQuery, Speaker>, SpeakerQueryHandler>();

    services.AddTransient<IHandleQueryAsync<AllEventsQuery, IEnumerable<Event>>, EventQueryHandler>();
    services.AddTransient<IHandleQueryAsync<SingleEventByIdQuery, Event>, EventQueryHandler>();

    services.AddTransient<IHandleQueryAsync<AllUsergroupsQuery, IEnumerable<Usergroup>>, UsergroupQueryHandler>();
    services.AddTransient<IHandleQueryAsync<SingleUsergroupByIdQuery, Usergroup>, UsergroupQueryHandler>();

    services.AddTransient<IHandleQueryAsync<AllNewslettersQuery, IEnumerable<Newsletter>>, NewsletterQueryHandler>();
    services.AddTransient<IHandleQueryAsync<SingleNewsletterByIdQuery, Newsletter>, NewsletterQueryHandler>();

    return services;
}

Usually you will place this method near your handlers.

The method AddCqsEngine is overloaded to add your QueryHandlers and your CommandHandlers to the dependency injection. There is no real magic behind that method. It is just there to group the additional registrations:

services.AddCqsEngine(s =>
{
    s.AddQueryHandlers();
    s.AddCommandHandlers();
});

The parameter s is the same ServiceCollection as the one in the ConfigureServices method.

This library makes heavy use of dependency injection and relies on the IServiceProvider, which is used and provided in ASP.NET Core. If you replace the built-in DI container with a different one, you should ensure that the IServiceProvider is implemented and registered with that container.

Query the data

Getting all the speakers out of the storage is a pretty small example. I just need to create a small class called AllSpeakersQuery and implement the generic IQuery interface:

public class AllSpeakersQuery : IQuery<IEnumerable<Speaker>>
{
}

The generic argument of the IQuery interface defines the value we want to retrieve from the storage. In this case it is an IEnumerable of speakers.

Querying a single speaker looks like this:

public class SpeakerByIdQuery : IQuery<Speaker>
{
    public SpeakerByIdQuery(Guid id)
    {
        Id = id;
    }

    public Guid Id { get; private set; }
}

The query contains the speaker's Id and defines the return value as a single Speaker.

Once you have the QueryProcessor from the dependency injection, you just need to pass the queries to it and retrieve the data:

// sync
var speaker = _queryProcessor.Process(new SpeakerByIdQuery(speakerId));
// async
var speakers = await _queryProcessor.ProcessAsync(new AllSpeakersQuery());

Now let's have a look at the QueryHandlers, which are called by the QueryProcessor. These handlers contain your business logic. They are small classes implementing the IHandleQuery<in TQuery, TResult> interface or the IHandleQueryAsync<in TQuery, TResult> interface, where TQuery is an IQuery<TResult>. Such a class usually retrieves a data source via dependency injection and has an Execute or ExecuteAsync method with the specific query as argument:

public class AllSpeakersQueryHandler :
    IHandleQuery<AllSpeakersQuery, IEnumerable<Speaker>>
{
    private readonly ITableClient _tableClient;

    public AllSpeakersQueryHandler(ITableClient tableClient)
    {
        _tableClient = tableClient;
    }

    public Task<IEnumerable<Speaker>> Execute(AllSpeakersQuery query)
    {
        var result = _tableClient.GetItemsOf<Speaker>();
        return result;
    }
}

public class SpeakerByIdQueryHandler :
    IHandleQueryAsync<SpeakerByIdQuery, Speaker>
{
    private readonly ITableClient _tableClient;

    public SpeakerByIdQueryHandler(ITableClient tableClient)
    {
        _tableClient = tableClient;
    }
    
    public async Task<Speaker> ExecuteAsync(SpeakerByIdQuery query)
    {
        var result = await _tableClient.GetItemOf<Speaker>(query.Id);
        return result;
    }
}

Sometimes I handle multiple queries in a single class. This is possible by simply implementing multiple IHandleQuery interfaces. I would recommend doing this only if the Execute methods are really small.
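
Such a combined handler, matching the SpeakerQueryHandler registrations in the AddQueryHandlers method above, could look like this sketch:

public class SpeakerQueryHandler :
    IHandleQueryAsync<AllSpeakersQuery, IEnumerable<Speaker>>,
    IHandleQueryAsync<SpeakerByIdQuery, Speaker>
{
    private readonly ITableClient _tableClient;

    public SpeakerQueryHandler(ITableClient tableClient)
    {
        _tableClient = tableClient;
    }

    public Task<IEnumerable<Speaker>> ExecuteAsync(AllSpeakersQuery query)
    {
        return _tableClient.GetItemsOf<Speaker>();
    }

    public async Task<Speaker> ExecuteAsync(SpeakerByIdQuery query)
    {
        return await _tableClient.GetItemOf<Speaker>(query.Id);
    }
}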

Executing Commands

Let's have a quick look into the commands too.

Let's assume we need to create a new speaker and to update a speaker's email address. To do this we need to define two specific commands:

public class AddSpeakerCommand : ICommand
{
    public AddSpeakerCommand(Speaker speaker)
    {
        Speaker = speaker;
    }

    public Speaker Speaker { get; private set; }
}

public class UpdateSpeakersEmailCommand : ICommand
{
    public UpdateSpeakersEmailCommand(int speakerId, string email)
    {
        SpeakerId = speakerId;
        Email = email;
    }

    public int SpeakerId { get; private set; }

    public string Email { get; private set; }
}

Just like the queries, the commands need to be passed to the CommandDispatcher, which is also registered in the DI container.

// sync
_commandDispatcher.DispatchCommand(new AddSpeakerCommand(myNewSpeaker));
// async
await _commandDispatcher.DispatchCommandAsync(new UpdateSpeakersEmailCommand(speakerId, newEmail));

The CommandHandlers are small classes implementing the ICommandHandler<in TCommand> or the IAsyncCommandHandler<in TCommand> interface, where TCommand is an ICommand. These handlers contain a Handle or a HandleAsync method that takes the specific command as argument. Just like on the query side, you will usually also get a data source from the dependency injection:

public class AddSpeakerCommandHandler : ICommandHandler<AddSpeakerCommand>
{
	private readonly ITableClient _tableClient;

	public AddSpeakerCommandHandler(ITableClient tableClient)
	{
		_tableClient = tableClient;
	}

	public void Handle(AddSpeakerCommand command)
	{
		_tableClient.SaveItemOf<Speaker>(command.Speaker);
	}
}
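
The async counterpart is analogous. Here is a sketch for the UpdateSpeakersEmailCommand, assuming the ITableClient offers the same methods used in the query handlers above:

public class UpdateSpeakersEmailCommandHandler : IAsyncCommandHandler<UpdateSpeakersEmailCommand>
{
	private readonly ITableClient _tableClient;

	public UpdateSpeakersEmailCommandHandler(ITableClient tableClient)
	{
		_tableClient = tableClient;
	}

	public async Task HandleAsync(UpdateSpeakersEmailCommand command)
	{
		// load the speaker, change the email address and save it back
		// (Email is assumed to be a property of the Speaker model)
		var speaker = await _tableClient.GetItemOf<Speaker>(command.SpeakerId);
		speaker.Email = command.Email;
		_tableClient.SaveItemOf<Speaker>(speaker);
	}
}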

Command validation

What about validating the commands? Sometimes it is necessary to check authorization or to validate the command values before executing the commands. You can do these checks inside the handlers, but this is not always a good idea: it increases the size and complexity of the handlers, and the validation logic is not reusable that way.

This is why the CommandDispatcher supports precondition checks. Just like with the command handlers, you only need to write command preconditions (ICommandPrecondition<in TCommand>) or async command preconditions (IAsyncCommandPrecondition<in TCommand>). These interfaces contain a Check or CheckAsync method which will be executed before the command handlers are executed. You can have as many preconditions as you want for a single command. If you register the preconditions with the DI container, the command dispatcher will find and execute them:

public class ValidateChangeUsersNameCommandPrecondition : ICommandPrecondition<ChangeUsersNameCommand>
{
    public void Check(ChangeUsersNameCommand command)
    {
        if (command.UserId == Guid.Empty)
        {
            throw new ArgumentException("UserId cannot be empty");
        }
        if (String.IsNullOrWhiteSpace(command.Name))
        {
            throw new ArgumentNullException("Name cannot be null");
        }
    }
}
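
The registration works like the handler registrations above; a minimal sketch:

services.AddTransient<ICommandPrecondition<ChangeUsersNameCommand>, ValidateChangeUsersNameCommandPrecondition>();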

In case of errors, the command dispatcher will throw an AggregateException containing all the exceptions thrown by the preconditions.

Conclusion

The whole speaker database application is built like this: handlers form small components, which either handle queries to fetch data or execute commands to change it.

What do you think? Does it make sense to you? Would it be useful for your projects? Please drop some lines and tell me about your opinion :)

This library is hosted on GitHub in the "develop" branch. I would be happy about any type of contribution on GitHub. Feel free to try it out and let me know about issues, tips and improvements :)

Contributing to OSS projects on GitHub using fork and upstream

06.12.2016 18:00:00 | Jürgen Gutsch

Intro

Some days ago, Damien Bowden wrote a pretty cool post about how to contribute to an open source software project hosted on GitHub, like the AspLabs. He uses Git Extensions in his great and pretty detailed post. A nice fact about this post is that he uses AspLabs as the demo project, because we both worked on it at the hackathon at the MVP Summit 2016, together with Glen Condron and Andrew Stanton-Nurse from the ASP.NET team.

At that hackathon we worked on the HealthChecks for ASP.NET Core. The HealthChecks can be used to check the health state of dependent subsystems, e.g. in a microservice environment, or in any other environment where you need to know the health of the systems you depend on. A dependent system could be a SQL Server, an Azure Storage service, the hard drive, a web or REST service, or anything else your application needs to run. Using the HealthChecks, you are able to react if a service is not available or unhealthy.

BTW: The HealthChecks are mentioned by Damian Edwards in this ASP.NET Community Standup: https://youtu.be/hjwT0av9gzU?list=PL0M0zPgJ3HSftTAAHttA3JQU4vOjXFquF

Because Damien Bowden also worked on that project, my idea was to do the same kind of post. So I asked him whether I could "fork" the original post, but use the Git CLI in the console instead of Git Extensions. Because this is a fork, some original words are reused in this post ;)

Why use the console? Because I have been a console junkie for a few years, and from my perspective no Git UI is as good as the simple and clean Git CLI :) Anyway, feel free to use the tool that fits your needs. Maybe someone will write the same post using SourceTree or the Visual Studio Git integration. ;)

As a result, this post is also a simple guideline on how you could contribute to OSS projects hosted on GitHub using fork and upstream. It is not the only way to do it, though. In this demo I'm going to use the console and the basic git commands. Like Damien, I'll use the aspnet/AspLabs project from Microsoft as the target repository.

True words by Damien: So you have something to contribute, cool, that’s the hard part.

Setup your fork

Before you can make your contribution, you need to create a fork of the repository where you want to make your contribution. Open the project on GitHub, and click the "Fork" button in the top right corner.

Now clone your forked repository. Click the "Clone or download" button and copy the clone URL to the clipboard.

Open a console and cd to the location where you want to place your projects. In my case it is c:\git\. Type git clone followed by the URL of the repository and press enter.
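
For example, assuming the fork lives under my own GitHub account, it looks like this:

cd c:\git
git clone https://github.com/JuergenGutsch/AspLabs.git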

Now you have a local master branch and also a remote master branch of your forked repository. The next step is to configure the original repository as the remote upstream. This is required to synchronize with the parent repository, as you might not be the only person contributing to it. It is done by adding another remote to the git repository. On GitHub, copy the clone URL of the original repository aspnet/AspLabs. Go back to the console and type git remote add upstream followed by the URL of the original repository:
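
For the AspLabs repository this is:

git remote add upstream https://github.com/aspnet/AspLabs.git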

To check whether everything is set up correctly, type git remote -v to see all existing remotes. The output should look like this:
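
With the fork from the example above, it would be roughly:

origin    https://github.com/JuergenGutsch/AspLabs.git (fetch)
origin    https://github.com/JuergenGutsch/AspLabs.git (push)
upstream  https://github.com/aspnet/AspLabs.git (fetch)
upstream  https://github.com/aspnet/AspLabs.git (push)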

Now you can pull from the upstream repository: you fetch the latest changes from the upstream/master branch into your local master branch. Because of this, you should NEVER work on your master branch. If you prefer, you can also configure git to rebase the local master onto the upstream master:
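
A typical synchronization looks like this:

git checkout master
git pull upstream master
git push origin master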

Start working on the code

Once you have pulled from the upstream, you can push to your remote master, i.e. the forked master. Just to mention it again: NEVER WORK ON YOUR LOCAL FORKED MASTER, and you will save yourself a lot of hassle.

Now you’re ready to work. Create a new branch. A good recommendation is to use the following pattern for naming:

<gitHub username>/<reason-for-the-branch>

Here’s an example:

JuergenGutsch/add-healthcheck-groups

Using your GitHub username makes it easier for the person reviewing the pull request to see who the branch belongs to.

To create that branch in the console, use the git checkout -b command followed by the branch name. This creates the branch and checks it out immediately:
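
Using the example branch name from above:

git checkout -b JuergenGutsch/add-healthcheck-groups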

Creating pull requests

When your work on the branch is finished, push the branch to your remote repository by calling git push. Now you are ready to create a pull request: go to your repository on GitHub, select your branch, and click the "Compare & pull request" button:
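
For a new branch you need to set the remote tracking branch once; the -u option does that for you:

git push -u origin JuergenGutsch/add-healthcheck-groups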

Check if the working branch and the target branch are fine. The target branch is usually the master of the upstream repo.

NOTE: If your branch was created from an older master commit than the current master on the parent, you need to pull from the upstream and rebase your branch onto the latest commit. This is easy, as you do not work on the local master. Alternatively, update your local master with the latest changes from the upstream, push it to your remote, and merge your local master into your feature branch.
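
The rebase variant could look like this (run on your feature branch; --force-with-lease is the safer way to push the rewritten branch to your fork):

git fetch upstream
git rebase upstream/master
git push --force-with-lease origin JuergenGutsch/add-healthcheck-groups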

If you are contributing to any Microsoft repository, you will need to sign an electronic contribution license agreement before you can contribute. This is pretty easy and done in a few minutes.

If you are working together with a maintainer of the repository, or your pull request is the result of an issue, you could add a comment with the GitHub name of the person that will review and merge it, so that he or she is notified that you are ready. They will receive a notification on GitHub as soon as you save the pull request.

Add a meaningful description. Tell the reviewer what they need to know about your changes, and save the pull request.

Now just wait and fix the issues as required. Once the pull request is merged, you need to pull from the upstream on your local forked repository and rebase if necessary, to continue with your next pull request.

And who knows, you might even get a coin from Microsoft. ;)

The console I use

I often get the question what type of console I use. I have four consoles installed on my machine, in addition to cmd.exe and PowerShell. I also installed the Bash for Windows. But my favorite console is Cmder, which is a pretty nice ConEmu implementation. I like this console because it is easy to use, easy to customize, and it has a nice color theme too.

Thanks

Thanks to Andrew Stanton-Nurse for his tips. Thanks to Glen Condron for the reviews. Thanks to Damien Bowden for the original blog post ;)

I'd also be happy about tips from anyone on how to improve this guideline.

VSTO: Using images in ribbons

01.12.2016 15:40:01 | Hendrik Loesch

I currently have the honor of writing an Excel plugin. It comes with its own ribbon, which should look as appealing as possible. For me, as someone hopeless with colors, it is impossible to design appealing icons. On top of that, the Google image search is a rather questionable source, if only because of the legal situation. So, in order to stay as close as possible to the […]

XMAS meeting of the SharePoint UG München at CGI

17.11.2016 11:46:07 | Sebastian Gerling

This year we are again hosting an XMAS meeting of the SharePoint UG München at CGI, Spixstr. 59. The date is 21.12. at 18:30. Over mulled wine from Max and Corinna we will get into the Christmas spirit with SharePoint, and we have the following talks: Title: Hybrid Search – Everything YOU need to know Abstract: In […]

Enable SSL with custom domains on GitHub Pages via Cloudflare

15.11.2016 01:15:00 |

Two weeks ago I decided (finally!) that I should enable SSL on this blog.

Problem: GitHub Pages with a custom domain

This blog is hosted on GitHub Pages with a custom domain, which currently doesn't support SSL out of the box. If you stick with a github.io domain, SSL is not a problem.

Cloudflare to the rescue

I decided to take a deeper look at Cloudflare, which provides DNS, CDN and other network-related services. For the main service, Cloudflare acts as the DNS for your domain and works like a proxy in front of it.

With this setup you have some nice benefits:

  • A free SSL certificate (AFAIK you can also use your own cert if you need to)
  • A CDN cache
  • DDoS protection
  • “Analytics”

Be aware: This is just the free plan.

And everything is pretty easy to manage via the web interface.

Setup

The first step is to register at Cloudflare and set up your domain. After that, you need to change the name servers for your domain to Cloudflare's name servers.

Everything belonging to your domain can now be managed inside Cloudflare.


Setting up some rules

When your DNS changes are done (which can take a couple of hours), you might want to introduce some basic rules. I use settings which enforce HTTPS and the Cloudflare cache:

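Roughly, the two page rules look like this (a sketch; example.com stands for your own domain, and the setting names are taken from Cloudflare's page rules screen):

http://*example.com/*   ->  Always Use HTTPS
example.com/*           ->  Cache Level: Cache Everything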

Done… or nearly done.

Now the Cloudflare part is done. The next step is to make sure that everything on your page uses HTTPS instead of HTTP, to avoid mixed-content errors.

Some notes from my own “migration”:

  • If you have Google Analytics, make sure you change the property settings to the HTTPS URL
  • If you use Disqus, you need to migrate your comments from the HTTP URL to the HTTPS URL. There is a migration tool available which uses a CSV file.

Other solutions…

As far as I know there are other, similar providers out there, and of course you can host the page yourself.

Cloudflare is an easy solution if you are willing to hand off the DNS settings of your domain.

Hope this helps!

CGI has interviewed 1000 executives and that's the outcome in a nutshell…

11.11.2016 14:59:57 | Sebastian Gerling

CGI's Global 1000 is one of CGI's long-standing traditions, in which we interview top executives from 10 different industries and 20 different countries about business and IT priorities. The following graphic shows the top 5 priorities for the upcoming years: More details can be found here.

Recommendation: SharePoint search tips by Wolfgang Miedl

09.11.2016 10:03:17 | Sebastian Gerling

On his blog (SharePoint 360), Wolfgang has published a great article about search tips in SharePoint. It is really worth reading and can be of great value, especially for end users in companies, helping them save time when searching for the right information. Here is also the link for the direct download of the PDF:

Slides from EKON

09.11.2016 09:10:00 | Jörg Neumann

Here are the slides & samples of my talks at EKON:


