
.NET Developer Blogs

Mobile into old age: Microsoft HoloLens supports thyssenkrupp in producing individually manufactured stairlifts

24.04.2017 15:11:53 | Mathias Gronau

Practical examples of HoloLens use usually come from manufacturing, where it supports the assembly and repair of complex machines. At CeBIT, Microsoft and thyssenkrupp are showing an example of HoloLens use in which end consumers benefit directly from this technology. To deliver tailor-made home mobility solutions, thyssenkrupp relies on Microsoft HoloLens. The mixed-reality headset lets customers visualize a stairlift product in their own home in real time. By speeding up several process steps, it also enables delivery times that are up to four times faster. Microsoft HoloLens thus becomes an essential tool at thyssenkrupp for delivering individual mobility solutions in people's own homes.

Demand for such solutions will rise rapidly: besides the fast-growing urban population, better healthcare and a higher standard of living are contributing to longer life expectancy. The global population, currently 7.4 billion, is expected to grow to more than 11 billion by 2100, and to age. In 2016, one in eight people worldwide was already over 60 years old. For Germany, the Federal Statistical Office calculates that 69 percent of the population will be of retirement age in 2060. These figures make clear that the mobility of older people in cities is a growing challenge. Mobility solutions such as stairlifts already contribute to a higher quality of life for older people, who can live longer and more independently in their own homes.

However, several aspects make designing and delivering home mobility solutions highly complex: every home is different, so every stairlift is essentially a one-off, tailored to individual requirements. In addition, most customers only buy such a product once their mobility is already severely restricted. This turns the purchase into an emotional experience in which they have to come to terms with their new limitations. Customers in this situation often want their purchase delivered immediately; in other words, speed matters.

Microsoft HoloLens will significantly help customers design their mobility solutions individually. Stairs can now be measured immediately, taking into account ergonomics and obstacles such as radiators, light and electrical fittings, or the proximity to the wall. Customers can conveniently see on a tablet how the stairlift will look on their stairs and make decisions about upholstery, the color of the chair and rails, and custom details. They receive a product that fits the look of their home. "New realities require new solutions," comments Andreas Schierenbeck, CEO of thyssenkrupp Elevator. "We see Microsoft HoloLens as an enabler that helps us reinvent the customer experience for home solutions. In doing so, we help provide lasting quality of life for the aging population, regardless of their mobility restrictions." thyssenkrupp has already used Microsoft HoloLens in the home segment with more than 100 customers in the Netherlands, Spain and Germany, with excellent results and positive customer feedback. The next step is the rollout across Germany.

.NET CultureInfo in Windows 10

24.04.2017 03:45:00 |

Did you know that the CultureInfo behavior with “unknown” cultures has changed with Windows 10?

I stumbled over this “problem” twice - reason enough to write a short blog post about it.

Demo Code

Let's use this demo code:

    try
    {


        // ok on Win10, but not on pre-Win10 if the culture is not registered
        CultureInfo culture1 = new CultureInfo("foo");
        CultureInfo culture2 = new CultureInfo("xyz");
        CultureInfo culture3 = new CultureInfo("en-xy");

        // not ok even on Win 10 - exception
        CultureInfo culture4 = new CultureInfo("foox");

    }
    catch (Exception exc)
    {

    }

Windows 10 Case

If you run this code under Windows 10, it should only fail for the “foox” culture, because that doesn't seem to be a valid culture name at all.

“culture1”, “culture2” and “culture3” are all valid cultures in the Windows 10 world, but they are resolved as Unknown Locale with LCID 4096.

I guess Windows looks for a two- or three-letter ISO-style language code, and “foox” doesn't match that pattern.

Pre Windows 10 - e.g. running on Win Server 2012R2

If you run the code under Windows Server 2012 R2, it already fails on the first culture, because there is no “foo” culture registered.

“Problem”

The main “problem” is that this behavior can lead to production issues if you develop on Windows 10 while the software runs on a Windows Server 2012 machine.

If you are managing “language” content in your application, be aware of this “limitation” on older Windows versions.

I discovered this problem while debugging our backend admin application. With this ASP.NET frontend it is possible to add or manage “localized” content, and the dropdown for the possible languages listed a whole bunch of very special, but “Unknown Locale” cultures. So we needed to filter out all LCID 4096 cultures to ensure it would run under all Windows versions.
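Such a filter can be a one-liner. This is just a sketch (not from the original post) and assumes that dropping every culture reporting LCID 4096 is acceptable for your scenario:

    using System.Globalization;
    using System.Linq;

    // Sketch: keep only cultures that are fully registered on the OS and
    // skip the "Unknown Locale" placeholders that report LCID 4096.
    var knownCultures = CultureInfo.GetCultures(CultureTypes.AllCultures)
                                   .Where(c => c.LCID != 4096)
                                   .ToList();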

MSDN

This behavior is also documented on MSDN

The “Unknown Locale” LCID 4096 was introduced with Windows Vista, but only with Windows 10 has it become “easily” usable within the .NET Framework.

12 open source UI frameworks for WPF applications

21.04.2017 23:48:11 | Kazim Bahar

GitHub, by now the most popular source code hosting platform, offers quite a few freely available UI frameworks for .NET applications. The WPF frameworks listed here are...

Thorsten Herrmann takes over Microsoft's enterprise customer business

20.04.2017 12:09:06 | Mathias Gronau

Thorsten Herrmann (49) joined the management board of Microsoft Deutschland on 1 April and will be responsible for the enterprise customer and partner business in Germany from 1 July. Thorsten Herrmann reports directly to Sabine Bendiek, Managing Director of Microsoft Deutschland. He takes over from Alexander Stüger, who had held the position on an interim basis since Alastair Bruce left in July 2016. Alexander Stüger's new role within the Microsoft organization will be announced at a later date.

Thorsten Herrmann joins Microsoft from Hewlett Packard Enterprise (HPE), where he spent the last three years as Vice President responsible for the global business relationship between HP (later HPE) and SAP. A graduate in business informatics, he started his career at IBM in 1989. In 1997 he moved to Compaq Computer GmbH into sales for SAP R/3 infrastructure solutions and later became a regional sales manager. After the merger with HP, Thorsten Herrmann held various senior sales positions before being appointed to the management board of Hewlett Packard GmbH in April 2009, where, as Vice President, he was responsible for enterprise sales at HP Germany.

"At the Hannover Messe next week we will see how fast the digitalization of traditional industries is progressing. From what I have observed, Microsoft makes a substantial contribution to effectively supporting industrial companies in their digital transformation. I am very happy to bring my experience and expertise to that," Thorsten Herrmann told employees. "With Thorsten Herrmann we have won a top manager with extensive sales expertise and market knowledge, and I am very much looking forward to working with him personally," Sabine Bendiek commented on the appointment of her new management board colleague.

A first glimpse into CAKE

19.04.2017 21:00:00 | Jürgen Gutsch

For a couple of years I have been using FAKE (F# Make) to configure my builds. We also use FAKE in some projects at the YooApps; one of the projects has been using it for more than two years. FAKE is really great and I love to use it, but there is one problem with it: most C# developers don't really like to use new things. The worst case for most C# developers, it seems, is a new tool that uses an exotic language like F#.

This is why I have had to maintain the FAKE build scripts ever since I introduced FAKE to the team.

It is that new tool and the F# language that must be scary for them, even if they don't really need to learn F# for the most common scenarios. That's why I proposed to my fellow developers to use CAKE (C# Make) instead:

  • It is C# make instead of F# make
  • It looks pretty similar
  • It works the same way
  • It is a scripting language
  • It works almost everywhere

They really liked the idea of using CAKE. Why? Just because of C#? It seems so...

It doesn't really make sense to me, but anyway, it absolutely makes sense that the developers use and maintain their own build configurations.

How does CAKE work?

CAKE is built around a C# scripting language. It uses the Roslyn compiler to compile the scripts. Instead of using batch files, as FAKE does, it uses a PowerShell script (build.ps1) to bootstrap itself and to run the build script. The bootstrapping step loads CAKE and some dependencies using NuGet. The last thing the PowerShell script does is call cake.exe to execute the build script.

The bootstrapping needs network access to load all of this. It also downloads nuget.exe if it's not available. If you don't like this, you can also commit the loaded dependencies to the source code repository.

The documentation is great. Just follow the getting-started guide to get a working example. There's also a nice documentation about setting up a new project available.

Configuring the build

If you know FAKE or even MSBuild it will look pretty familiar to you. Let's have a quick look into the first simple example of the getting started guide:

var target = Argument("target", "Default");

Task("Default")
  .Does(() =>
        {
          Information("Hello World!");
        });

RunTarget(target);

The first line retrieves the build target to execute from the command line arguments. Starting from line 3 we see the definition of a build target. This target just prints "Hello World!" as an information message.

The last line starts the initial target by its name.

A more concrete code sample is the build script from the CAKE example (I removed some lines in this listing to get a shorter example):

#tool nuget:?package=NUnit.ConsoleRunner&version=3.4.0

var target = Argument("target", "Default");
var configuration = Argument("configuration", "Release");

// Define the build directory.
var buildDir = Directory("./src/Example/bin") + Directory(configuration);

Task("Clean")
  .Does(() =>
        {
          CleanDirectory(buildDir);
        });

Task("Restore-NuGet-Packages")
  .IsDependentOn("Clean")
  .Does(() =>
        {
          NuGetRestore("./src/Example.sln");
        });

Task("Build")
  .IsDependentOn("Restore-NuGet-Packages")
  .Does(() =>
        {
          MSBuild("./src/Example.sln", settings =>
                  settings.SetConfiguration(configuration));
        });

Task("Run-Unit-Tests")
  .IsDependentOn("Build")
  .Does(() =>
        {
          NUnit3("./src/**/bin/" + configuration + "/*.Tests.dll", new NUnit3Settings {
            NoResults = true
          });
        });

Task("Default")
  .IsDependentOn("Run-Unit-Tests");

RunTarget(target);

This script references an additional NuGet package to run the NUnit3 tests. A nice feature is that this NuGet dependency is configured right at the beginning of the script.

This build script contains five targets. The method IsDependentOn("...") wires the targets together in the right execution order. This is a bit different from FAKE and maybe a little confusing, because it requires you to write the targets in the right execution order. If the script isn't written like this, you need to find the initial target and follow its dependencies back to the very first target; you end up reading the execution order from the last to the first target.

FAKE makes this a little easier and wires the targets up in a single statement at the end of the file:

// Dependencies
"Clean"
  ==> "Restore-NuGet-Packages"
  ==> "Build"
  ==> "Run-Unit-Tests"
  ==> "Default"
 
RunTargetOrDefault "Default"

In CAKE, this could look something like the following dummy code:

// Dependencies
WireUp("Clean")
  .Calls("Restore-NuGet-Packages")
  .Calls("Build")
  .Calls("Run-Unit-Tests")
  .Calls("Default");

RunTarget(target);

Running the build

To run the build, just call .\build.ps1 in a PowerShell console:
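By default this runs the "Default" target; a specific target can be passed as well (assuming the -Target parameter of the standard bootstrapper script from the getting-started guide):

# run the default target
.\build.ps1
# run a specific target
.\build.ps1 -Target Run-Unit-Tests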

If you know FAKE, the results look pretty familiar:

Conclusion

Anyway, I think CAKE will be accepted by the fellow developers at the YooApps much faster than FAKE was. Some things work a little more easily in CAKE than in FAKE and some a little differently, but most things work the same way. So it seems to make sense to switch to CAKE at the YooApps. Let's use it. :)

I'm sure I will write up a comparison of FAKE and CAKE later, once I have used it for a few months.

Please use a password manager

17.04.2017 21:00:00 | Jürgen Gutsch

For years I have been using a password manager to store and manage the credentials for all the accounts I use. Using such a tool is completely normal for me, and my usual flow is to create a new entry in the password manager before I even create the actual account.

I'm always amazed when I meet people who don't use any tool to store and manage their credentials. They use the same password or the same small set of passwords everywhere. This is dangerous, and most of them already know it. Maybe they are too lazy to spend one or two hours thinking about the benefits and searching for the right tool, or they simply don't know that such tools exist.

For me the key benefits are pretty clear:

  • I don't need to remember all the passwords.
  • I don't need to think about secure passwords while creating new ones
  • I just need to remember one single passphrase

Longer passwords are more secure than short ones, even if the short ones include special characters, numbers and upper-case letters. But longer passwords are pretty hard to remember. This is why Edward Snowden proposed using passphrases instead of passwords. Passphrases like this are easy to remember. But would it really make sense to create a different passphrase for every credential you use? I don't think so.

I just created a single passphrase, which is used to unlock the password manager, and I let the password manager generate the long passwords for me. Most of the generated passwords look like this:

I4:rs/pQO8Ir/tSk*`1_KSG"~to-xMI/Gf;bNP7Qi3s8RuqIzl0r=-JL727!cgwq

The tool I use is KeePass 2, which is available on almost all devices and operating systems I use: as a small desktop app on Windows, Mac and Linux, and as an app on Windows Phone and Android. (Probably on iOS too, but I don't use such a device.) KeePass stores the passwords in an encrypted file. This file can be stored on a USB drive or in a shared folder in the cloud. I use the cloud to share the file to every device. The cloud storage itself is secured with a password I don't even know, because it is stored in that KeePass file.

What the F***, you really put the key that locks your house into a safe that is inside your house? Yes!

The KeePass file is synced to all devices and is accessible offline. If the file changes, it is synced to all devices again, and I'm still able to access all credentials.

With most password managers, you are able to copy the username and the password into login forms using shortcuts. Some of them can even fill in the credentials automatically. KeePass uses the clipboard, but deletes the values from the clipboard after a couple of seconds, so no one else can access or reuse the credentials from the clipboard.

There are many more password managers out there, but KeePass - with its encrypted file - works best for me. I'm responsible for storing the file in a safe location, and I can do that wherever I want.

Summary

Use a password manager! Spend the time to try the tools and choose the one that fits you best. It is important! Once you use a password manager, you will never want to work without one. It will make your life easier and hopefully more secure.

Blogging with Git Flow

02.04.2017 21:00:00 | Jürgen Gutsch

For a while now we have been using Git Flow in our projects at the YooApps. Git Flow is an add-on for Git that helps us follow the feature-branch process in a standard way. We usually use our Jira ticket numbers as names for features, bugs or hotfixes, and we use the Jira version as the release name. This makes absolute sense to us and makes the flow pretty transparent. We also use Bitbucket to host our Git repositories, which is directly linked to Jira. That means every pull request has a direct link to the Jira ticket.

Using Git Flow to release new posts

My idea is to also use Git Flow to release new blog posts.

My blog is generated by Pretzel, which is a Jekyll clone written in C#. I host it on an Azure Web Site. I write my posts in GitHub style Markdown files and push them to GitHub. Every time I push a new post, the Azure Web Site starts a new build and runs Pretzel to generate the static blog.

I already wrote some blog posts about using Pretzel

Dear hackers and bots: Because it is a static web site, it doesn't make sense to search for WordPress login pages or WordPress APIs on my blog ;)

For almost one and a half years now I have been using Git on GitHub to publish my posts, but so far I only use the master branch and I don't commit the drafts. I sometimes write two or three posts in parallel, and I usually have around ten drafts in Pretzel's _drafts folder. This feels a bit messy, and I would probably lose my drafted posts if the machine crashed.

Using Git Flow, I don't need the _drafts folder anymore. Every unfinished feature branch is a draft. When I finish a post, I just need to finish the feature branch. When I want to publish the posts, I start a new release.

Maybe the release step is a bit too much. But I currently need to push explicitly to publish a new post anyway, so it shouldn't be a big deal to also create a new release that merges all finished posts from develop to master. If it doesn't fit, it would be easy to switch to simple feature branches only.

Let's see how it works :-)

This blog post feature branch is created by using the following commands in the console:

git flow feature start blogging-with-git-flow

Now I'm in draft mode and create the blog post file: writing, adding images, linking to existing posts and so on. Once this is done, I do a first commit and publish the feature branch to GitHub:

git add _posts/*
git add img/*
git commit -m "new post about blogging with pretzel"
git flow feature publish

At this stage I can change and commit as much as I want. When the blog post is done, I finish the feature branch and push the current develop branch to GitHub:

git flow feature finish
git push

The last step is publishing the posts. I'm currently not sure, but I could probably use the number of posts as the minor version number, which will also be used as the tag for the release:

git flow release start 1.48.0
git flow release finish
git push --all
git push --tags

(I should probably create a batch command that executes these four lines to release new posts; a sketch of what that could look like follows below.)
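Such a script could look like the following sketch. The file name publish.cmd and the version argument are my own invention for illustration; they are not part of the original setup:

:: publish.cmd <version> - hypothetical helper to release new posts in one go
git flow release start %1
git flow release finish %1
git push --all
git push --tags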

After that push, Pretzel will start "baking" the blog including the new blog post.

If I want to see the current drafts, I just need to display the existing branches:
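A plain git branch is enough for that, and the git-flow extension can list the feature branches as well (shown here as a sketch; the original post had a screenshot at this point):

git branch
git flow feature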

I'm sure this will be a clean way to publish and to handle the drafts and finished posts.

Versioning

While publishing the latest post like this, I realized that GitHub actually displays a release in the GitHub repository, which is quite nice. This is not really needed, but it is a fun fact and a reason to think a little more about the version numbers I use when I release a new article.

My idea is to change the major version when I make a big change to the layout of the blog or add a new feature. Because the layout is still the first version and I have only made some really small changes, I'll keep it at version 1. For the minor version I will use the number of published articles.

This doesn't really make sense from a SemVer perspective, but blogging should also be fun, and this numbering scheme is fun to me.

This means that with the latest post I published release 1.47.0, and this post will be in release 1.48.0 ;-)

Authoring

In the last post about writing blog posts using Pretzel, I wrote that I use MarkdownPad 2 to write my posts. Unfortunately, it seems that this editor is no longer maintained; no new version has been published for a while. I paid for it to get some of the extended features. Anyway, a few months ago it started crashing every time I opened it, and there was no way to get an updated or fixed version. (I'm pretty sure it was related to the installation of CefSharp.) This is why I now use Typora, which is pretty minimalistic and lightweight. It works completely differently, but in a pretty cool way. It doesn't use a split screen to edit and preview the content. With Typora you write Markdown directly, and it immediately renders the preview in the same editor view. It also supports custom CSS and different Markdown styles. It is real WYSIWYG.

Any thoughts about it? Feel free to drop a comment and let me know :-)

Using xUnit, MSTest or NUnit to test .NET Core libraries

30.03.2017 21:00:00 | Jürgen Gutsch

MSTest was just announced to be open-sourced, having already been moved to .NET Core some months ago. So it seems to make sense to write another blog post about unit testing .NET Core applications and .NET Standard libraries using the .NET Core tools.

In this post I'm going to use the dotnet CLI and Visual Studio Code exclusively. Feel free to use Visual Studio 2017 instead if you prefer it and don't like to use the console. Visual Studio 2017 uses the same dotnet CLI and almost the same commands in the background.

Setup the system under test

Our SUT is a pretty complex class that helps us a lot with some basic math operations. This class will be part of a super awesome library:

namespace SuperAwesomeLibrary
{
  public class MathTools
  {
    public decimal Add(decimal a, decimal b) =>  a + b;
    public decimal Substr(decimal a, decimal b) => a - b;
    public decimal Multiply(decimal a, decimal b) => a * b;
    public decimal Divide(decimal a, decimal b) => a / b;
  }
}

I'm going to add this class to the "SuperAwesomeLibrary" project, inside a solution I created like this:

mkdir unit-tests & cd unit-tests
dotnet new sln -n SuperAwesomeLibrary
mkdir SuperAwesomeLibrary & cd SuperAwesomeLibrary
dotnet new classlib
cd ..
dotnet sln add SuperAwesomeLibrary\SuperAwesomeLibrary.csproj

The cool thing about the dotnet CLI is that you are really able to create Visual Studio solutions (line 2). This wasn't possible with the previous versions. The result is a Visual Studio and MSBuild compatible solution, and you can use and build it like any other solution in your continuous integration environment. Line 4 creates a new library, which is added to the solution in line 6.

After this is done, the following commands will complete the setup, by restoring the NuGet packages and building the solution:

dotnet restore
dotnet build

Adding xUnit tests

The dotnet CLI directly supports creating xUnit test projects:

mkdir SuperAwesomeLibrary.Xunit & cd SuperAwesomeLibrary.Xunit
dotnet new xunit
dotnet add reference ..\SuperAwesomeLibrary\SuperAwesomeLibrary.csproj
cd ..
dotnet sln add SuperAwesomeLibrary.Xunit\SuperAwesomeLibrary.Xunit.csproj

These commands create the new xUnit test project, add a reference to the SuperAwesomeLibrary and add the test project to the solution.

Once this was done, I created the xUnit tests for our MathTools using VS Code:

public class MathToolsTests
{
  [Fact]
  public void AddTest() 
  {
    var sut = new MathTools();
    var result = sut.Add(1M, 2M);
    Assert.True(3M == result);
  }
  [Fact]
  public void SubstrTest() 
  {
    var sut = new MathTools();
    var result = sut.Substr(2M, 1M);
    Assert.True(1M == result);
  }
  [Fact]
  public void MultiplyTest() 
  {
    var sut = new MathTools();
    var result = sut.Multiply(2M, 1M);
    Assert.True(2M == result);
  }
  [Fact]
  public void DivideTest() 
  {
    var sut = new MathTools();
    var result = sut.Divide(2M, 2M);
    Assert.True(1M == result);
  }
}

This should work and you need to call your very best dotnet CLI friends again:

dotnet restore
dotnet build

The cool thing about these commands is that they work on the solution directory: they restore the packages for the whole solution and build all the projects in the solution. You don't need to go through all of your projects separately.

But if you want to run your tests and you are not in the test project folder, you need to point to the project directly:

dotnet test SuperAwesomeLibrary.Xunit\SuperAwesomeLibrary.Xunit.csproj

If you are in the test project folder just call dotnet test without the project file.

This command will run all your unit tests in your project.

Adding MSTest tests

Adding a test library for MSTest works the same way:

mkdir SuperAwesomeLibrary.MsTest & cd SuperAwesomeLibrary.MsTest
dotnet new mstest
dotnet add reference ..\SuperAwesomeLibrary\SuperAwesomeLibrary.csproj
cd ..
dotnet sln add SuperAwesomeLibrary.MsTest\SuperAwesomeLibrary.MsTest.csproj

Even the test class looks almost the same:

[TestClass]
public class MathToolsTests
{
  [TestMethod]
  public void AddTest()
  {
    var sut = new MathTools();
    var result = sut.Add(1M, 2M);
    Assert.IsTrue(3M == result);
  }
  [TestMethod]
  public void SubstrTest()
  {
    var sut = new MathTools();
    var result = sut.Substr(2M, 1M);
    Assert.IsTrue(1M == result);
  }
  [TestMethod]
  public void MultiplyTest()
  {
    var sut = new MathTools();
    var result = sut.Multiply(2M, 1M);
    Assert.IsTrue(2M == result);
  }
  [TestMethod]
  public void DivideTest()
  {
    var sut = new MathTools();
    var result = sut.Divide(2M, 2M);
    Assert.IsTrue(1M == result);
  }
}

And again our favorite commands:

dotnet restore
dotnet build

The dotnet restore command will fail in offline mode, because MSTest is not delivered with the runtime and the default NuGet packages, while xUnit is. This means it needs to fetch the latest packages from NuGet.org. Kind of weird, isn't it?

The last task to do, is to run the unit tests:

dotnet test SuperAwesomeLibrary.MsTest\SuperAwesomeLibrary.MsTest.csproj

This doesn't really look hard.

What about NUnit?

Unfortunately, there is no default template for NUnit test projects. I really like NUnit and have used it for years. It is still possible to use NUnit with .NET Core, but you need to do some things manually. The first steps are pretty similar to the other examples, except that we create a console application and add the NUnit dependencies manually:

mkdir SuperAwesomeLibrary.Nunit & cd SuperAwesomeLibrary.Nunit
dotnet new console
dotnet add package NUnit
dotnet add package NUnitLite
dotnet add reference ..\SuperAwesomeLibrary\SuperAwesomeLibrary.csproj
cd ..

The reason we need to create a console application is that there is no Visual Studio test runner for NUnit on .NET Core yet. This also means dotnet test will not work. The NUnit devs are working on it, but this seems to need some more time. Anyway, there is the option to use NUnitLite to create a self-executing test library.

We need to use NUnitLite and change the static void Main to a static int Main:

// requires: using NUnitLite; (for AutoRun) and using System.Reflection; (for GetTypeInfo)
static int Main(string[] args)
{
  var typeInfo = typeof(Program).GetTypeInfo();
  return new AutoRun(typeInfo.Assembly).Execute(args);
}

These lines automatically execute all test classes in the current assembly. They also pass the command-line arguments to NUnitLite, e.g. to set up the output log file, etc.

Let's add a NUnit test class:

[TestFixture]
public class MathToolsTests
{
  [Test]
  public void AddTest() 
  {
    var sut = new MathTools();
    var result = sut.Add(1M, 2M);
    Assert.That(result, Is.EqualTo(3M));
  }
  [Test]
  public void SubstrTest() 
  {
    var sut = new MathTools();
    var result = sut.Substr(2M, 1M);
    Assert.That(result, Is.EqualTo(1M));
  }
  [Test]
  public void MultiplyTest() 
  {
    var sut = new MathTools();
    var result = sut.Multiply(2M, 1M);
    Assert.That(result, Is.EqualTo(2M));
  }
  [Test]
  public void DivideTest() 
  {
    var sut = new MathTools();
    var result = sut.Divide(2M, 2M);
    Assert.That(result, Is.EqualTo(1M));
  }
}

Finally we need to run the tests. But this time we cannot use dotnet test.

dotnet restore
dotnet build
dotnet run -p SuperAwesomeLibrary.Nunit\SuperAwesomeLibrary.Nunit.csproj

Because it is a console application, we need to use dotnet run to execute the app and the NUnitLite test runner.

What about mocking?

Creating mocking frameworks on .NET Standard is currently a little bit difficult, because they need a lot of reflection, which is not yet completely implemented in .NET Core or even .NET Standard.

My favorite tool, Moq, is available for .NET Standard 1.3 anyway, which means it should work here. Let's see how it goes.

Let's assume we have a PersonService in the SuperAwesomeLibrary that uses an IPersonRepository to fetch Persons from a data store:

using System;
using System.Collections.Generic;

public class PersonService
{
  private readonly IPersonRepository _personRepository;

  public PersonService(IPersonRepository personRepository)
  {
    _personRepository = personRepository;
  }

  public IEnumerable<Person> GetAllPersons()
  {
    return _personRepository.FetchAllPersons();
  }
}

public interface IPersonRepository
{
  IEnumerable<Person> FetchAllPersons();
}

public class Person
{
  public string Firstname { get; set; }
  public string Lastname { get; set; }
  public DateTime DateOfBirth { get; set; }
}

After building the library, I move to the NUnit test project to add Moq and GenFu.

cd SuperAwesomeLibrary.Nunit
dotnet add package moq
dotnet add package genfu
dotnet restore

GenFu is a really great library to create the test data for unit tests or demos. I really like this library and use it a lot.

Now we need to write the actual test using these tools. This test doesn't really make sense, but it shows how Moq works:

using System;
using System.Linq;
using NUnit.Framework;
using SuperAwesomeLibrary;
using GenFu;
using Moq;

namespace SuperAwesomeLibrary.Nunit
{
  [TestFixture]
  public class PersonServiceTest
  {
    [Test]
    public void GetAllPersons() 
    {
      var persons = A.ListOf<Person>(10); // generating test data using GenFu
      
      var repo = new Mock<IPersonRepository>();
      repo.Setup(x => x.FetchAllPersons()).Returns(persons);

      var sut = new PersonService(repo.Object);
      var actual = sut.GetAllPersons();

      Assert.That(actual.Count(), Is.EqualTo(10));
    }
  }
}

As you can see, Moq works the same way in .NET Core as in the full .NET Framework.

Now let's start the NUnit tests again:

dotnet build
dotnet run

Et voilà:

Conclusion

Running unit tests within .NET Core isn't really a big deal, and it is really a good thing that it works with different unit testing frameworks. You have the choice to use your favorite tools. It would be nice to have the same choice with the mocking frameworks as well.

In one of the next posts I'll write about unit testing an ASP.NET Core application, which includes testing middlewares, controllers, filters and view components.

You can play around with the code samples on GitHub: https://github.com/juergengutsch/dotnetcore-unit-test-samples/

How we moved to Visual Studio 2017

28.03.2017 03:45:00 |

Visual Studio 2017 has arrived, and because of .NET Core and other goodies we wanted to switch quickly to the newest release with our product OneOffixx.

Company & Product Environment

In our solution we use some VC++ projects (just for Office Development & building a .NET shim), Windows Installer XML & many C# projects for desktop or ASP.NET stuff.

Our builds are scripted with CAKE (see here for some more blog posts about [CAKE](https://blog.codeinside.eu/2017/02/13/create-nuget-packages-with-cake/)), and we use TFS vNext Build to orchestrate everything.

Step 1: Update the Development Workstations

The first step was to update my local dev environment and install Visual Studio 2017.

After the installation I started VS and opened our solution, and because we have some WiX projects, we needed the most recent WiX 3.11 toolset & the VS 2017 extension.

Step 2: VC++ update

We wanted a clean VS 2017 environment, so we decided to use the most recent VC++ 2017 runtime for our VC++ projects.

Step 3: Project update

In the past we had some issues where MSBuild picked the wrong MSBuild version. Maybe this step is not needed, but we pointed all .csproj files to the newest MSBuild ToolsVersion 15.0.
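For a classic (non-SDK) project file this is just the ToolsVersion attribute on the root element; a minimal sketch of what such a change looks like (the rest of the project file stays untouched):

<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="15.0" DefaultTargets="Build" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <!-- existing PropertyGroups and ItemGroups stay as they are -->
</Project>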

Step 4: CAKE update

The last step was to update the CAKE.exe (which is controlled by us and not automatically downloaded via a build script) to 0.18.

Step 5: Minor build script changes

We needed to adjust some paths (e.g. to the Windows SDK for signtool.exe) and ensure that we are using the most recent MSBuild.exe.

Step 6: Create a new Build-Agent

We decided to create a new TFS build agent: we did the usual build agent installation, imported the code-signing certificate and did some extra work because of some C++/COM magic (don't ask... COM sucks).

Recap

Besides the C++/COM magic issue (see above), the migration was pretty easy, and now our team works with Visual Studio 2017.

Visual Studio 2017 launch at the .NET User Group Koblenz

22.03.2017 14:39:39 | Andre Kraemer

Update 23.03.2017: The meeting takes place on 23.03.2017, not on 24.03. as originally written.

On 7 March 2017, right on time for Visual Studio's 20th birthday, Microsoft released the new version 2017.

To mark the occasion, the .NET User Group Koblenz will hold a meeting on 23 March 2017 at 7:00 pm, at which Eric Berres and I will present the most important new features that make developers even more productive.

Among other things, we will cover the following:

  • Visual Studio 2017 installer / workloads
  • Faster startup of Visual Studio 2017 thanks to Lightweight Solution Load
  • New refactorings in Visual Studio
  • ASP.NET Core and Docker
  • Live Unit Testing in Visual Studio 2017
  • Xamarin in Visual Studio 2017

Visual Studio 2017 Launch Event T-Shirts

By the way, there is also a small gift from Microsoft for all MSDN subscribers: a Visual Studio 2017 birthday T-shirt. If we don't have enough shirts for all attendees, they will be handed out on a first-come, first-served basis.

Further details are available on the website of the .NET User Group Koblenz.

Talk at Embedded meets Agile in Munich

17.03.2017 14:23:39 | Sebastian Gerling

I have been invited to speak at Embedded meets Agile on 26 April in Munich about the topic "Design Thinking vs Business Canvas Modell". The goal is to show where the differences lie, what needs special attention in the embedded environment, and where a combination of a fuzzy and a structured approach can add value. More […]

Interview on app design and storyboarding

16.03.2017 09:53:00 | Jörg Neumann

As part of the MobileTechCon, S&S interviewed me on the topic of app design and storyboarding. The full interview is available here.

Material from the MTC 2017

16.03.2017 09:47:00 | Jörg Neumann

Integrate Meetup events on your website

12.03.2017 19:00:00 | Jürgen Gutsch

This is a guest post, written by Olivier Giss, about integrating Meetup events on your website. Olivier is working as a web developer at algacom AG in Basel and is also one of the leads of the .NET User Group Nordwest-Schweiz.


For two years I have been leading the .NET User Group Nordwest-Schweiz together with Jürgen Gutsch, who owns this nice blog. After a year, we decided to also use Meetup to get more participants.

Understanding the problem

But with each additional platform where we post our events, the workload to keep everything up to date increased. Jürgen had the great idea of reading the Meetup events and listing them on our own website to reduce that work.

This is exactly what I want to show you.

The Meetup API

Before we start coding, we should understand how the Meetup API works and what it offers. The Meetup API is well documented and exposes a lot of data. What we want is to get a list of upcoming events for our Meetup group and display it on the website without being authenticated on meetup.com.

For our goal, we need the following meetup API method:

GET https://api.meetup.com/:urlname/events

The parameter “:urlname” is the Meetup group name. In the request we could also sort, filter and control paging, which we don't need here. If we executed that query as it is, we would get an authorization error.

However, we don’t want that the user must be authenticated to get the events. To get it to work we need to use a JSONP request.

Let’s getting it done

The simplest way to do a JSONP request is using jQuery:

$.ajax({
  url: "https://api.meetup.com/Basel-NET-User-Group/events",
  jsonp: "callback",
  dataType: "jsonp",
  data: {
    format: "json"
  },
  success: function(response) {
    var events = response.data;
  }
});

Be aware: JSONP has some security implications. As JSONP is really JavaScript, it can do everything that is possible in the context. You need to trust the provider of the JSONP data!

After that call, we get the data from the Meetup API, which can be displayed on our website with simple data binding. You can choose any kind of MV* JS framework to do that. I used AngularJS.

<div class="row" ng-repeat="model in vm.Events track by model.Id" ng-cloak>
  <a href="" target="_blank" title="Öffnen auf meetup.com"><h3></h3></a>
  <label>Datum und Uhrzeit</label>
  <p></p>
  <label>Description</label>
  <div ng-bind-html="model.Description"></div>
  <label>Ort</label>
  <p></p>
</div>

As you can see, everything is one-way bound because the data is never changed. The ng-bind-html directive binds the HTML content from the Meetup event description.

The Angular controller is simple: it uses the $sce service to ensure that the HTML content provided by the Meetup API is marked as safe. Because we change the model outside of Angular, we must notify Angular of our changes with vm.scope.$apply().

(function () {
  var module = angular.module('app', []);

  module.controller('MeetupEventsController', ['$scope', '$sce', MeetupEventsController]);

  MeetupEventsController.$inject = ['$scope', '$sce'];

  function MeetupEventsController($scope, $sce) {

    var vm = this;
    vm.Events = [];
    vm.scope = $scope;
    vm.loaded = false;

    vm.Refresh = function() {
      $.ajax({
        url: "https://api.meetup.com/Basel-NET-User-Group/events",
        jsonp: "callback",
        dataType: "jsonp",
        data: {
          format: "json"
        },
        success: function(response) {
          var events = response.data;

          for (var i = 0; i < events.length; i++) {
            var item = events[i];

            var eventItem = {
              Id: i,
              DisplayName: item.name,
              Description: $sce.trustAsHtml(item.description),
              Location: item.venue.name + " " + item.venue.address_1 + " " + item.venue.city,
              Time: new Date(item.time).toLocaleString(),
              Link :item.link,
            };
            vm.Events.push(eventItem)
          }
          vm.loaded = true;
          vm.scope.$apply();
        }
      });
    };
    function activate() {
      vm.Refresh();
    };
    activate();
  };
})();

Finally, we are finished. Not that complicated, right? Feel free to ask questions or share your experience.


Just visit the website of the .NET User Group Nordwest-Schweiz to see the Meetup integration in action.

Improving the user experience with artificial intelligence

07.03.2017 22:00:19 | Kazim Bahar

… that's the title of my first blog post on AISOMA Analytics: AISOMA Analytics Blog

Using dependency injection in multiple .NET Core projects

05.03.2017 18:00:00 | Jürgen Gutsch

One of my last posts was about Dependency Injection (DI) in .NET Core console applications. A few days after that post was published, I got a question about how to use the IServiceCollection across multiple projects. In this post I'm going to try to explain how to use the IServiceCollection in a solution with multiple projects.

Setup

To demonstrate this, I created a solution with two .NET Core console apps and two .NET Standard libraries. One of the console apps uses both libraries and the other one uses just one. Each library provides some services which need to be registered with the DI container. The console apps also provide some services to add.

We now have four projects like this:

  • DiDemo.SmallClient
    • .NET Core Console app
    • includes a WriteSimpleDataService
    • references DiDemo.CsvFileConnector
  • DiDemo.BigClient
    • .NET Core Console app
    • includes a WriteExtendedDataService
    • includes a NormalizedDataService
    • references DiDemo.SqlDatabaseConnector
    • references DiDemo.CsvFileConnector
  • DiDemo.SqlDatabaseConnector
    • .NET Standard library
    • includes a SqlDataService
    • includes a SqlDataProvider used by the service
  • DiDemo.CsvFileConnector
    • .NET Standard library
    • includes a CsvDataService

BTW: since one of the latest updates, the "Class Library (.NET Standard)" template disappeared from the ".NET Core" node in the "Add New Project" dialogue, and the "Class Library (.NET Core)" template is back again. The "Class Library (.NET Standard)" template is now in the ".NET Standard" node under the "Visual C#" node.

In most cases it doesn't really make sense to create a .NET Core class library. The difference is that the Class Library (.NET Core) has some .NET Core related references: it targets netcoreapp1.x instead of netstandard1.x. This means it carries a lot of references which are not needed in a class library in most cases, e.g. Libuv and the .NET Core runtime.
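For comparison, a new SDK-style .NET Standard class library project file is tiny; the exact netstandard version below is only an example:

<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <!-- a .NET Standard library targets netstandard1.x instead of netcoreapp1.x -->
    <TargetFramework>netstandard1.6</TargetFramework>
  </PropertyGroup>
</Project>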

The WriteExtendedDataService uses an INormalizedDataService to get the data and writes it to the console. The NormalizedDataService fetches the data from the CsvDataService and from the SqlDataService and normalizes it to make it usable in the WriteExtendedDataService.

The WriteSimpleDataService uses only the ICsvDataService and writes the data out to the console.

Setup the DI container

Let's set up the DI container for the SmallClient app. Currently it looks like this:

var services = new ServiceCollection();
services.AddTransient<IWriteSimpleDataService, WriteSimpleDataService>();
services.AddTransient<ICsvDataService, CsvDataService>();

var provider = services.BuildServiceProvider();

var writer = provider.GetService<IWriteSimpleDataService>();
writer.write();

That doesn't really look wrong, but what happens when the app grows and gets a lot more services to add to the DI container? The CsvDataService is not in the app itself, but in a separate library. Usually I don't want to map all the services of an external library in the app. I just want to use the library, and I don't want to know anything about its internals. This is why we should also set up the DI container mappings inside the external library.

Let's plug things together

The .NET Standard libraries should reference the Microsoft.Extensions.DependencyInjection.Abstractions package to get the IServiceCollection interface. Now we can create a public static class called IServiceCollectionExtension that contains an extension method working on the IServiceCollection:

public static class IServiceCollectionExtension
{
  public static IServiceCollection AddCsvFileConnector(this IServiceCollection services)
  {
    services.AddTransient<ICsvDataService, CsvDataService>();
    return services;
  }
}

Inside this method we do all the mappings from the interfaces to the concrete classes, as well as any other registrations to the DI container. Let's do the same to encapsulate all the services inside the SmallClient app and to keep the Program.cs as small as possible:

public static class IServiceCollectionExtension
{
  public static IServiceCollection AddInternalServices(this IServiceCollection services)
  {
    services.AddTransient<IWriteSimpleDataService, WriteSimpleDataService>();
    return services;
  }
}

We can now use these methods in the Program.cs of the SmallClient app to plug everything together:

var services = new ServiceCollection();
services.AddInternalServices();
services.AddCsvFileConnector();

var provider = services.BuildServiceProvider();

var writer = provider.GetService<IWriteSimpleDataService>();
writer.write();

It looks much cleaner now. Maybe you remember the AddSomething() methods? Exactly: this is the same way it is done in ASP.NET Core, e.g. with the services.AddMvc() method.

We now need to do the same thing for the BigClient app and the SqlDatabaseConnector library. First, let's create the mapping for the SqlDatabaseConnector:

public static class IServiceCollectionExtension
{
  public static IServiceCollection AddSqlDatabaseConnector(this IServiceCollection services)
  {
    services.AddTransient<ISqlDataService, SqlDataService>();
    services.AddTransient<ISqlDataProvider, SqlDataProvider>();
    return services;
  }
}

We also need to create an extension method for the internal services:

public static class IServiceCollectionExtension
{
  public static IServiceCollection AddInternalServices(this IServiceCollection services)
  {
    services.AddTransient<IWriteExtendedDataService, WriteExtendedDataService>();
    services.AddTransient<INormalizedDataService, NormalizedDataService>();
    return services;
  }
}

Now let's plug everything together in the BigClient app:

var services = new ServiceCollection();
services.AddInternalServices();
services.AddCsvFileConnector();
services.AddSqlDatabaseConnector();

var provider = services.BuildServiceProvider();

var writer = provider.GetService<IWriteExtendedDataService>();
writer.write();

As you can see, the BigClient app uses the already existing services.AddCsvFileConnector() method.

Does it really work?

It does. Start the BigClient app in Visual Studio to see that it will work as expected:

To see the full sources and to try it out by yourself, please visit the GitHub repository: https://github.com/JuergenGutsch/di-core-multi-demo

What do you think? Do you have questions or some ideas to add? Feel free to drop a comment :)

HowTo: Get User Information & Group Memberships from Active Directory via C#

03.03.2017 01:45:00 |

I had to find a way to access all group memberships of a given Active Directory user. The problem here is that groups may contain other groups, and I needed a list of "all" applied group memberships - direct or indirect.

The “fastest” solution (without querying each group) is to use the Token-Groups attribute, which already does this magic for us. This list should contain all applied groups.

The code would also allow you to read any other AD property, e.g. the UPN or names, etc.

Code

using System;
using System.Collections.Generic;
using System.DirectoryServices;
using System.DirectoryServices.AccountManagement;
using System.Security.Principal;

class Program
{
    static void Main(string[] args)
    {
        Console.WriteLine("ListAllGroupsViaTokenGroups:");

        List<string> result = new List<string>();

        try
        {
            result = ListAllGroupsViaTokenGroups("USERNAME", "DOMAIN");

            foreach (var group in result)
            {
                Console.WriteLine(group);
            }
        }
        catch (Exception exc)
        {
            Console.WriteLine(exc.Message);
        }

        Console.Read();
    }

  
    private static List<string> ListAllGroupsViaTokenGroups(string username, string domainName)
    {
        List<string> result = new List<string>();

        using (PrincipalContext domainContext = new PrincipalContext(ContextType.Domain, domainName))
        using (var searcher = new DirectorySearcher(new DirectoryEntry("LDAP://" + domainContext.Name)))
        {
            searcher.Filter = String.Format("(&(objectClass=user)(sAMAccountName={0}))", username);
            SearchResult sr = searcher.FindOne();

            DirectoryEntry user = sr.GetDirectoryEntry();

            // access to other user properties, via user.Properties["..."]

            user.RefreshCache(new string[] { "tokenGroups" });

            for (int i = 0; i < user.Properties["tokenGroups"].Count; i++)
            {
                SecurityIdentifier sid = new SecurityIdentifier((byte[])user.Properties["tokenGroups"][i], 0);
                NTAccount nt = (NTAccount)sid.Translate(typeof(NTAccount));

                result.Add(nt.Translate(typeof(NTAccount)).ToString() + " (" + sid + ")");
            }
        }

        return result;
    }

}

Hope this will help someone in the future.

Code @ GitHub

ActiveRoute TagHelper

02.03.2017 18:00:00 | Jürgen Gutsch

I recently read the pretty cool blog post by Ben Cull about the IsActiveRoute TagHelper: http://benjii.me/2017/01/is-active-route-tag-helper-asp-net-mvc-core/. This TagHelper adds a CSS class to an element if the specified route or route parts are part of the currently active route. This is pretty useful if you want to highlight the active item in a menu.

Inspired by this idea, I created a different TagHelper, which shows or hides content if the specified route or route parts are part of the current route. This could be useful, e.g. if you don't want to have a link inside an active menu item.

From the perspective of the semantic web, it doesn't make sense to link to the current page. That means the menu item that points to the current page should not be a link.

The usage of this TagHelper will look like this:

<ul class="nav navbar-nav">
  <li>
    <a asp-active-route asp-action="Index" asp-controller="Home" asp-hide-if-active="true">
      <span>Home</span>
    </a>
    <span asp-active-route asp-action="Index" asp-controller="Home">Home</span>
  </li>
  <li>
    <a asp-active-route asp-action="About" asp-controller="Home" asp-hide-if-active="true">
      <span>About</span>
    </a>
    <span asp-active-route asp-action="About" asp-controller="Home">About</span>
  </li>
  <li>
    <a asp-active-route asp-action="Contact" asp-controller="Home" asp-hide-if-active="true">
      <span>Contact</span>
    </a>
    <span asp-active-route asp-action="Contact" asp-controller="Home">Contact</span>
  </li>
</ul>

As you can see on the a-tag, multiple TagHelpers can work on a single tag. In this case the built-in AnchorTagHelper and the ActiveRouteTagHelper are both manipulating the tag. The a-tag will be hidden if the specified route is active, and the span-tag is shown in that case.

If you now navigate to the About page, the a-Tag is removed from the specific menu item and the span-Tag is shown. The HTML result of the menu now looks pretty clean:

<ul class="nav navbar-nav">
  <li>
    <a href="/">
      <span>Home</span>
    </a>
  </li>
  <li>
    <span>About</span>
  </li>
  <li>
    <a href="/Home/Contact">
      <span>Contact</span>
    </a>
  </li>
</ul>

Using this approach for the menu, we don't need Ben Cull's TagHelper here to add a special CSS class. The style for the active item can be set by selecting the list item that contains just the span:

.nav.navbar-nav li > a { ... }
.nav.navbar-nav li > a > span { ... }
.nav.navbar-nav li > span { ... } /* this is the active item*/

This CSS is based on the default Bootstrap based template in a new ASP.NET Core project. If you use another template, just replace the CSS class which identifies the menu with your specific identifier.

That means, to get the active menu item looking nice, you may just add CSS like this:

.navbar-nav li > span {
    padding: 15px;
    display: block;
    color: white;
}

This results in the following view:

To get this working, we need to implement the TagHelper. I just created a new class in the project and called it ActiveRouteTagHelper and added the needed properties:

[HtmlTargetElement(Attributes = "asp-active-route")]
public class ActiveRouteTagHelper : TagHelper
{
  [HtmlAttributeName("asp-controller")]
  public string Controller { get; set; }

  [HtmlAttributeName("asp-action")]
  public string Action { get; set; }

  [HtmlAttributeName("asp-hide-if-active")]
  public bool HideIfActive { get; set; }
  
  
}

This class inherits from the TagHelper base class. To use it on any HTML tag, I defined an attribute name which needs to be present on the HTML element we want to manipulate. I used the name "asp-active-route". The properties also get specific attribute names. I could use the default names, without the leading "asp-" prefix, but I thought it would make sense to share the Controller and Action attributes with the built-in AnchorTagHelper. And to be consistent, I use the prefix in all cases.

Now we need to override the Process method to actually manipulate the specific HTML tag:

public override void Process(TagHelperContext context, TagHelperOutput output)
{
  if (!CanShow())
  {
    output.SuppressOutput();
  }

  var attribute = output.Attributes.First(x => x.Name == "asp-active-route");
  output.Attributes.Remove(attribute);
}

If I cannot show the tag because of the conditions in the CanShow() method, I completely suppress the output. Nothing is generated in that case: neither the contents nor the HTML tag itself.

At the end of the method, I remove the identifying attribute that is used to activate this TagHelper, because otherwise it would remain in the generated output.

To get the RouteData of the current route, we can't use the TagHelperContext or the TagHelperOutput. We need to inject the ViewContext instead:

[HtmlAttributeNotBound]
[ViewContext]
public ViewContext ViewContext { get; set; }

Now we are able to access the route data and get the needed information about the current route:

private bool CanShow()
{
  var currentController = ViewContext.RouteData.Values["Controller"].ToString();
  var currentAction = ViewContext.RouteData.Values["Action"].ToString();

  var show = false;
  if (!String.IsNullOrWhiteSpace(Controller) &&
      Controller.Equals(currentController, StringComparison.CurrentCultureIgnoreCase))
  {
    show = true;
  }
  if (show &&
      !String.IsNullOrWhiteSpace(Action) &&
      Action.Equals(currentAction, StringComparison.CurrentCultureIgnoreCase))
  {
    show = true;
  }
  else
  {
    show = false;
  }

  if (HideIfActive)
  {
    show = !show;
  }

  return show;
}

One last step you need to do, is to register your own TagHelpers. In Visual Studio open the _ViewImports.cshtml and add the following line of code:

@addTagHelper *, CoreWebApplication

Where CoreWebApplication is the assembly name of your project. * means use all TagHelpers in that library

Conclusion

I hope this makes sense to you and helps you a little more to get into the TagHelpers.

I always have fun, creating a new TagHelper. With less code, I'm able to extend the View engine the way I need.

I always focus on semantic HTML, if possible. Because it makes the Web a little more accessible to other devices and engines than we usually use. This could be screen readers for blind people, as well as search engines. Maybe I can do some more posts about accessibility in ASP.NET Core applications.

ActiveRoute TagHelper

02.03.2017 18:00:00 | Jürgen Gutsch

I recently read the pretty cool blog post by Ben Cull about the IsActiveRoute TagHelper: http://benjii.me/2017/01/is-active-route-tag-helper-asp-net-mvc-core/. This TagHelper adds a CSS class to an element if the specified route or route parts match the currently active route. This is pretty useful if you want to highlight the active item in a menu.

Inspired by this idea, I created a different TagHelper, which shows or hides content if the specified route or route parts match the current route. This could be useful, e.g. if you don't want to have a link inside the active menu item.

From the perspective of the semantic web, it doesn't make sense to link to the current page. That means the menu item that points to the current page should not be a link.

The usage of this TagHelper will look like this:

<ul class="nav navbar-nav">
  <li>
    <a asp-active-route asp-action="Index" asp-controller="Home" asp-hide-if-active="true">
      <span>Home</span>
    </a>
    <span asp-active-route asp-action="Index" asp-controller="Home">Home</span>
  </li>
  <li>
    <a asp-active-route asp-action="About" asp-controller="Home" asp-hide-if-active="true">
      <span>About</span>
    </a>
    <span asp-active-route asp-action="About" asp-controller="Home">About</span>
  </li>
  <li>
    <a asp-active-route asp-action="Contact" asp-controller="Home" asp-hide-if-active="true">
      <span>Contact</span>
    </a>
    <span asp-active-route asp-action="Contact" asp-controller="Home">Contact</span>
  </li>
</ul>

As you can see on the a-tag, multiple TagHelpers can work on a single tag. In this case the built-in AnchorTagHelper and the ActiveRouteTagHelper both manipulate the tag. The a-tag is hidden if the specified route is active, and the span-tag is shown in that case.

If you now navigate to the About page, the a-tag of that menu item is removed and the span-tag is shown instead. The HTML result of the menu now looks pretty clean:

<ul class="nav navbar-nav">
  <li>
    <a href="/">
      <span>Home</span>
    </a>
  </li>
  <li>
    <span>About</span>
  </li>
  <li>
    <a href="/Home/Contact">
      <span>Contact</span>
    </a>
  </li>
</ul>

Using this approach for the menu, we don't need Ben Cull's TagHelper to add a special CSS class. The style for the active item can be set by selecting the list item that contains just the span:

.nav.navbar-nav li > a { ... }
.nav.navbar-nav li > a > span { ... }
.nav.navbar-nav li > span { ... } /* this is the active item*/

This CSS is based on the default Bootstrap-based template of a new ASP.NET Core project. If you use another template, just replace the CSS class that identifies the menu with your specific identifier.

That means, to get the active menu item looking nice, you may just add CSS like this:

.navbar-nav li > span {
    padding: 15px;
    display: block;
    color: white;
}

This results in the following view:

To get this working, we need to implement the TagHelper. I just created a new class in the project and called it ActiveRouteTagHelper and added the needed properties:

[HtmlTargetElement(Attributes = "asp-active-route")]
public class ActiveRouteTagHelper : TagHelper
{
  [HtmlAttributeName("asp-controller")]
  public string Controller { get; set; }

  [HtmlAttributeName("asp-action")]
  public string Action { get; set; }

  [HtmlAttributeName("asp-hide-if-active")]
  public bool HideIfActive { get; set; }
}

This class inherits from the TagHelper base class. To use it on any HTML tag, I defined an attribute name that needs to be set on the tag we want to manipulate. I used the name "asp-active-route". The properties also get specific attribute names. I could use the default names, without the leading "asp-" prefix, but I thought it would make sense to share the Controller and Action attributes with the built-in AnchorTagHelper. And to be consistent, I use the prefix in all cases.

Now we need to override the Process method to actually manipulate the specific HTML tag:

public override void Process(TagHelperContext context, TagHelperOutput output)
{
  if (!CanShow())
  {
    output.SuppressOutput();
  }

  var attribute = output.Attributes.First(x => x.Name == "asp-active-route");
  output.Attributes.Remove(attribute);
}

If the tag can't be shown, based on the conditions in the CanShow() method, I completely suppress the output. Nothing is generated in that case, neither the contents nor the HTML tag itself.

At the end of the method, I remove the identifying attribute that is used to activate this TagHelper, because otherwise it would be kept in the rendered output.

To get the RouteData of the current route, we can't use the TagHelperContext or the TagHelperOutput. Instead, we need to inject the ViewContext:

[HtmlAttributeNotBound]
[ViewContext]
public ViewContext ViewContext { get; set; }

Now we are able to access the route data and get the needed information about the current route:

private bool CanShow()
{
  var currentController = ViewContext.RouteData.Values["Controller"].ToString();
  var currentAction = ViewContext.RouteData.Values["Action"].ToString();

  // the route counts as active if both the specified controller
  // and the specified action match the current route values
  var show = !String.IsNullOrWhiteSpace(Controller) &&
             Controller.Equals(currentController, StringComparison.CurrentCultureIgnoreCase) &&
             !String.IsNullOrWhiteSpace(Action) &&
             Action.Equals(currentAction, StringComparison.CurrentCultureIgnoreCase);

  if (HideIfActive)
  {
    show = !show;
  }

  return show;
}

One last step: you need to register your own TagHelpers. In Visual Studio, open the _ViewImports.cshtml and add the following line of code:

@addTagHelper *, CoreWebApplication

Here CoreWebApplication is the assembly name of your project, and the * means "use all TagHelpers in that assembly".

Conclusion

I hope this makes sense to you and helps you get a little deeper into TagHelpers.

I always have fun creating a new TagHelper. With very little code, I'm able to extend the view engine the way I need.

I always focus on semantic HTML if possible, because it makes the Web a little more accessible to devices and engines other than the ones we usually use, such as screen readers for blind people or search engines. Maybe I will write some more posts about accessibility in ASP.NET Core applications.

Xamarin video training published

27.02.2017 10:55:00 | Jörg Neumann

Starting March 1, 2017, my video training "Cross-Plattform-App-Development mit Xamarin" is available at entwickler.tutorials. There is a 25% discount until April 17. Check it out!


Material from BASTA! Spring

26.02.2017 08:44:00 | Jörg Neumann

Here is the material from my sessions at BASTA! Spring 2017:



The materials for my workshop "Cross-Plattform-App-Development mit Xamarin" are available on request.

Tools: NDepend v2017.1 Smart Technical Debt Estimation

25.02.2017 16:36:02 | Steffen Steinbrecher

Every developer probably knows the following everyday scenario: a new feature or requirement has to be integrated into the existing software, and (as almost always) there are several ways to do it. One of them is the quick & dirty solution: it can be done quickly, although you are well aware that sooner or later you will stumble over this piece of code again. […]

Creating a Word document OutputFormatter in ASP.NET Core

22.02.2017 18:00:00 | Jürgen Gutsch

In one of the ASP.NET Core projects we did in the last year, we created an OutputFormatter to provide Word documents as printable reports via ASP.NET Core Web API. Well, this formatter wasn't written by me, but by a fellow software developer, Jakob Wolf, at yooapps.com. I told him to write about it, but he hasn't had enough time to do it yet, so I'm going to do it for him. Maybe you know him from Twitter. Maybe not, but he is one of the best ASP.NET and Angular developers I have ever met.

About OutputFormatters

In ASP.NET Core you can have many different output formatters. The best known built-in formatter is the JsonOutputFormatter, which is used as the default OutputFormatter in ASP.NET Core Web API.

By using AddMvcOptions() you can add new formatters or manage the existing ones:

services.AddMvc()
    .AddMvcOptions(options =>
    {
        options.OutputFormatters.Add(new WordOutputFormatter());
        options.FormatterMappings.SetMediaTypeMappingForFormat(
          "docx", MediaTypeHeaderValue.Parse("application/ms-word"));
    });

As you can see in the snippet above, we add the Word document formatter (called WordOutputFormatter) to provide Word documents if the requested type is "application/ms-word".

You can add whatever formatter you need, mapped to whatever media type you want to support.
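
To make that concrete, here is a minimal, hypothetical Web API action. The controller, route and action names are assumptions; only the media type and the "docx" mapping come from the registration above. A client that sends the Accept header application/ms-word, or that requests the docx format via the [FormatFilter] mechanism, gets the response written by the WordOutputFormatter:

using Microsoft.AspNetCore.Mvc;

// hypothetical controller, just to illustrate how the formatter gets selected
[Route("api/[controller]")]
public class ReportsController : Controller
{
  // GET api/reports/monthly       with "Accept: application/ms-word" -> Word output
  // GET api/reports/monthly.docx  uses the "docx" format mapping via [FormatFilter]
  [HttpGet("monthly.{format?}")]
  [FormatFilter]
  public DocumentContentsViewModel GetMonthlyReport()
  {
    // build and return the ViewModel the WordOutputFormatter knows how to render
    return new DocumentContentsViewModel();
  }
}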

Let's have a look at what an output formatter looks like:

public class MyFormatter : IOutputFormatter
{
  public bool CanWriteResult(OutputFormatterCanWriteContext context)
  {
    // check whether to write or not
    throw new NotImplementedException();
  }

  public async Task WriteAsync(OutputFormatterWriteContext context)
  {
    // write the formatted contents to the response stream.
    throw new NotImplementedException();
  }
}

You have one method to check whether the data can be written in the expected format or not. The other, async, method does the job of formatting and writing the data to the response stream, which comes with the context.
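
For this interface-based variant, CanWriteResult() typically just checks the type of the object to be written. A minimal sketch (assuming the same DocumentContentsViewModel that is used in the formatter shown later) could look like this:

public bool CanWriteResult(OutputFormatterCanWriteContext context)
{
  // only handle the one ViewModel type this formatter knows how to render
  return context.Object is DocumentContentsViewModel;
}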

This way you need to do some things manually. A more comfortable way to implement an OutputFormatter is to inherit from the OutputFormatter base class directly:

public class WordOutputFormatter : OutputFormatter
{
  public string ContentType { get; }

  public WordOutputFormatter()
  {
    ContentType = "application/ms-word";
    SupportedMediaTypes.Add(MediaTypeHeaderValue.Parse(ContentType));
  }

  // optional, but makes sense to restrict to a specific condition
  protected override bool CanWriteType(Type type)
  {
    if (type == null)
    {
      throw new ArgumentNullException(nameof(type));
    }

    // only one ViewModel type is allowed
    return type == typeof(DocumentContentsViewModel);
  }

  // this method needs to be overridden
  public override Task WriteResponseBodyAsync(OutputFormatterWriteContext context)
  {
    // Format and write the document outputs here
    throw new NotImplementedException();
  }
}

The base class does some things for you, for example writing the correct HTTP headers.

Creating Word documents

To create Word documents you need to add a reference to the Open XML SDK. We used the OpenXMLSDK-MOT in version 2.6.0, which cannot be used with .NET Core. This is why we run that specific ASP.NET Core project on .NET 4.6.

Version 2.7.0 is available as a .NET Standard 1.3 library and can be used in .NET Core. Unfortunately this version isn't yet available on the default NuGet feed. To install the latest version, follow the instructions on GitHub: https://github.com/officedev/open-xml-sd. Currently there is a mess with the NuGet package IDs and versions on NuGet and MyGet. Use the MyGet feed mentioned on the GitHub page to install the latest version. The package ID there is DocumentFormat.OpenXml and the latest stable version is 2.7.1.

In this post, I don't want to go through all the word processing stuff, because it is too specific to our implementation. I'll just show you how it works in general. The Open XML SDK is pretty well documented, so you can use it as an entry point to create your own specific WordOutputFormatter:

public override async Task WriteResponseBodyAsync(OutputFormatterWriteContext context)
{
  var response = context.HttpContext.Response;
  var filePath = Path.GetTempFileName();

  var viewModel = context.Object as DocumentContentsViewModel;
  if (viewModel == null)
  {
    throw new ArgumentNullException(nameof(viewModel));
  }

  using (var wordprocessingDocument = WordprocessingDocument
         .Create(filePath, WordprocessingDocumentType.Document))
  {
    // start creating the documents and the main parts of it
    wordprocessingDocument.AddMainDocumentPart();

    var styleDefinitionPart = wordprocessingDocument.MainDocumentPart
      .AddNewPart<StyleDefinitionsPart>();
    var styles = new Styles();
    styles.Save(styleDefinitionPart);

    wordprocessingDocument.MainDocumentPart.Document = new Document
    {
      Body = new Body()
    };
    var body = wordprocessingDocument.MainDocumentPart.Document.Body;

    // call a helper method to set default styles
    AddStyles(styleDefinitionPart); 
    // call a helper method to set the document to landscape mode
    SetLandscape(body); 

    foreach (var institution in viewModel.Items)
    {
      // iterate through some data of the viewmodel
      // and create the elements you need
      
      // ... more word processing stuff here

    }

    await response.SendFileAsync(filePath);
  }
}

The ViewModel with the data to format is in the Object property of the OutputFormatterWriteContext. We do a safe cast and check for null before we continue. The Open XML SDK works based on files. This is why we need to create a temp file name and let the SDK use this file path. Because of that fact - at the end - we send the file out to the response stream using the response.SendFileAsync() method. I personally prefer to work on the OutputStream directly, to have fewer file operations and to be a little bit faster. The other thing is, we need to clean up the temp files.
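
As a side note: because the formatter writes to a temp file first, it is worth making sure that file is removed again. The cleanup below is an assumption and not part of the original formatter, just a minimal sketch of how the send could be wrapped:

try
{
  await response.SendFileAsync(filePath);
}
finally
{
  // remove the temp file again, so the server doesn't collect orphaned files
  if (File.Exists(filePath))
  {
    File.Delete(filePath);
  }
}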

After the file is created, we work on this file and create the document, custom styles and layouts and the document body, which will contain the formatted data. Inside the loop we are only working on that Body object. We created helper methods to add formatted values, tables and so on...

Conclusion

OutputFormatters are pretty useful for creating almost any kind of content out of any kind of data. Instead of hacking around in the specific Web API actions, you should always use OutputFormatters to have reusable components.

The OutputFormatter we built is not really reusable or even generic, because it was created for a specific kind of report. But with this starting point, we are able to make it generic: we could pass a template document to the formatter, which knows the properties of the ViewModel; this way it would be possible to create almost any kind of Word document.

Create NuGet packages with Cake

14.02.2017 01:45:00 |

This blogpost is a follow-up to these earlier Cake (C# Make) related blogposts:

Scenario

Let’s say we have this project structure. The “Config”, “Result” and “Engine” projects each contain a corresponding .nuspec file, like this:

<?xml version="1.0"?>
<package >
  <metadata>
    <id>Sloader.Config</id>
    <version>$version$</version>
    <title>Sloader.Config</title>
    <authors>Code Inside Team</authors>
    <owners>Code Inside Team</owners>
    <licenseUrl>https://github.com/Code-Inside/Sloader/blob/master/LICENSE</licenseUrl>
    <projectUrl>https://github.com/Code-Inside/Sloader</projectUrl>
    <requireLicenseAcceptance>false</requireLicenseAcceptance>
    <description>Sloader Config</description>
    <releaseNotes>
      ## Version 0.1 ##
      Init
    </releaseNotes>
    <copyright>Copyright 2017</copyright>
    <tags>Sloader</tags>
    <dependencies />
  </metadata>
</package>

Nothing fancy - pretty normal NuGet stuff, but be aware of the “$version$” variable. This variable is called a “replacement token”. When the NuGet package is created and NuGet detects such a replacement token, it will fill it from the AssemblyVersion (or other replacement-token sources).

Versioning in NuGet:

I’m not a NuGet expert, but you should also version your assembly info, otherwise some systems may have trouble updating your dll. The version inside the package can be different from the actual assembly version, but you should manage both or use this replacement-token mechanism.
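
To illustrate the replacement-token mechanic: nuget.exe pack fills the $version$ token from the version attributes of the built assembly. A typical AssemblyInfo.cs could look like this (the concrete numbers are just placeholders):

using System.Reflection;

// nuget.exe pack uses the AssemblyInformationalVersion if present,
// otherwise the AssemblyVersion, to replace the $version$ token in the .nuspec
[assembly: AssemblyVersion("0.2.1.0")]
[assembly: AssemblyFileVersion("0.2.1.0")]
[assembly: AssemblyInformationalVersion("0.2.1")]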

Goal

The goal is to create a NuGet package for each target project with Cake.

build.cake

The usage in Cake is pretty much the same as with the normal nuget.exe pack command. The sample only shows the actual Cake target; see the older blogposts for a more complete example:

Task("BuildPackages")
    .IsDependentOn("Restore-NuGet-Packages")
	.IsDependentOn("RunTests")
    .Does(() =>
{
    var nuGetPackSettings = new NuGetPackSettings
	{
		OutputDirectory = rootAbsoluteDir + @"\artifacts\",
		IncludeReferencedProjects = true,
		Properties = new Dictionary<string, string>
		{
			{ "Configuration", "Release" }
		}
	};

    MSBuild("./src/Sloader.Config/Sloader.Config.csproj", new MSBuildSettings().SetConfiguration("Release"));
    NuGetPack("./src/Sloader.Config/Sloader.Config.csproj", nuGetPackSettings);
    MSBuild("./src/Sloader.Result/Sloader.Result.csproj", new MSBuildSettings().SetConfiguration("Release"));
    NuGetPack("./src/Sloader.Result/Sloader.Result.csproj", nuGetPackSettings);
    MSBuild("./src/Sloader.Engine/Sloader.Engine.csproj", new MSBuildSettings().SetConfiguration("Release"));
    NuGetPack("./src/Sloader.Engine/Sloader.Engine.csproj", nuGetPackSettings);
});

Easy, right? The most interesting part here is the NuGetPack command. Before we invoke this command we need to make sure that we build the most recent version; to enforce that, we just rebuild each project in Release mode.

Result

The console output should make the flow pretty clear:

PS C:\Users\Robert\Documents\GitHub\Sloader> .\build.ps1 -t BuildPackages
Preparing to run build script...
Running build script...
Analyzing build script...
Processing build script...
Installing tools...
Compiling build script...

========================================
Clean
========================================
Executing task: Clean
Cleaning directory C:/Users/Robert/Documents/GitHub/Sloader/artifacts
Finished executing task: Clean

========================================
Restore-NuGet-Packages
========================================
Executing task: Restore-NuGet-Packages
MSBuild auto-detection: using msbuild version '14.0' from 'C:\Program Files (x86)\MSBuild\14.0\bin'.
All packages listed in packages.config are already installed.
Finished executing task: Restore-NuGet-Packages

========================================
BuildTests
========================================
Executing task: BuildTests
Start Building Test: Sloader.Config.Tests
Microsoft (R) Build Engine version 14.0.25420.1
Copyright (C) Microsoft Corporation. All rights reserved.

  Sloader.Config -> C:\Users\Robert\Documents\GitHub\Sloader\artifacts\_tests\Sloader.Config.Tests\Sloader.Config.dll
  Sloader.Config.Tests -> C:\Users\Robert\Documents\GitHub\Sloader\artifacts\_tests\Sloader.Config.Tests\Sloader.Config
  .Tests.dll
Start Building Test: Sloader.Result.Tests
Microsoft (R) Build Engine version 14.0.25420.1
Copyright (C) Microsoft Corporation. All rights reserved.

  Sloader.Result -> C:\Users\Robert\Documents\GitHub\Sloader\artifacts\_tests\Sloader.Result.Tests\Sloader.Result.dll
  Sloader.Result.Tests -> C:\Users\Robert\Documents\GitHub\Sloader\artifacts\_tests\Sloader.Result.Tests\Sloader.Result
  .Tests.dll
Start Building Test: Sloader.Engine.Tests
Microsoft (R) Build Engine version 14.0.25420.1
Copyright (C) Microsoft Corporation. All rights reserved.

  Sloader.Config -> C:\Users\Robert\Documents\GitHub\Sloader\artifacts\_tests\Sloader.Engine.Tests\Sloader.Config.dll
  Sloader.Result -> C:\Users\Robert\Documents\GitHub\Sloader\artifacts\_tests\Sloader.Engine.Tests\Sloader.Result.dll
  Sloader.Engine -> C:\Users\Robert\Documents\GitHub\Sloader\artifacts\_tests\Sloader.Engine.Tests\Sloader.Engine.dll
  Sloader.Engine.Tests -> C:\Users\Robert\Documents\GitHub\Sloader\artifacts\_tests\Sloader.Engine.Tests\Sloader.Engine
  .Tests.dll
Finished executing task: BuildTests

========================================
RunTests
========================================
Executing task: RunTests
Start Running Tests
xUnit.net Console Runner (64-bit .NET 4.0.30319.42000)
  Discovering: Sloader.Config.Tests
  Discovered:  Sloader.Config.Tests
  Starting:    Sloader.Config.Tests
  Finished:    Sloader.Config.Tests
  Discovering: Sloader.Engine.Tests
  Discovered:  Sloader.Engine.Tests
  Starting:    Sloader.Engine.Tests
  Finished:    Sloader.Engine.Tests
  Discovering: Sloader.Result.Tests
  Discovered:  Sloader.Result.Tests
  Starting:    Sloader.Result.Tests
  Finished:    Sloader.Result.Tests
=== TEST EXECUTION SUMMARY ===
   Sloader.Config.Tests  Total: 23, Errors: 0, Failed: 0, Skipped: 0, Time: 0.554s
   Sloader.Engine.Tests  Total: 17, Errors: 0, Failed: 0, Skipped: 0, Time: 1.070s
   Sloader.Result.Tests  Total:  4, Errors: 0, Failed: 0, Skipped: 0, Time: 1.061s
                                --          -          -           -        ------
                   GRAND TOTAL: 44          0          0           0        2.684s (5.697s)
Finished executing task: RunTests

========================================
BuildPackages
========================================
Executing task: BuildPackages
Microsoft (R) Build Engine version 14.0.25420.1
Copyright (C) Microsoft Corporation. All rights reserved.

Build started 2017-02-19 22:00:09.
The target "_ConvertPdbFiles" listed in a BeforeTargets attribute at "C:\Program Files (x86)\MSBuild\14.0\Microsoft.Com
mon.targets\ImportAfter\Xamarin.Common.targets (34,37)" does not exist in the project, and will be ignored.
The target "_CollectPdbFiles" listed in an AfterTargets attribute at "C:\Program Files (x86)\MSBuild\14.0\Microsoft.Com
mon.targets\ImportAfter\Xamarin.Common.targets (34,70)" does not exist in the project, and will be ignored.
The target "_CollectMdbFiles" listed in a BeforeTargets attribute at "C:\Program Files (x86)\MSBuild\14.0\Microsoft.Com
mon.targets\ImportAfter\Xamarin.Common.targets (41,38)" does not exist in the project, and will be ignored.
The target "_CopyMdbFiles" listed in an AfterTargets attribute at "C:\Program Files (x86)\MSBuild\14.0\Microsoft.Common
.targets\ImportAfter\Xamarin.Common.targets (41,71)" does not exist in the project, and will be ignored.
Project "C:\Users\Robert\Documents\GitHub\Sloader\src\Sloader.Config\Sloader.Config.csproj" on node 1 (Build target(s))
.
GenerateTargetFrameworkMonikerAttribute:
Skipping target "GenerateTargetFrameworkMonikerAttribute" because all output files are up-to-date with respect to the i
nput files.
CoreCompile:
Skipping target "CoreCompile" because all output files are up-to-date with respect to the input files.
CopyFilesToOutputDirectory:
  Sloader.Config -> C:\Users\Robert\Documents\GitHub\Sloader\src\Sloader.Config\bin\Release\Sloader.Config.dll
Done Building Project "C:\Users\Robert\Documents\GitHub\Sloader\src\Sloader.Config\Sloader.Config.csproj" (Build target
(s)).


Build succeeded.
    0 Warning(s)
    0 Error(s)

Time Elapsed 00:00:00.22
Attempting to build package from 'Sloader.Config.csproj'.
MSBuild auto-detection: using msbuild version '14.0' from 'C:\Program Files (x86)\MSBuild\14.0\bin'.
Packing files from 'C:\Users\Robert\Documents\GitHub\Sloader\src\Sloader.Config\bin\Release'.
Using 'Sloader.Config.nuspec' for metadata.
Found packages.config. Using packages listed as dependencies
Successfully created package 'C:\Users\Robert\Documents\GitHub\Sloader\artifacts\Sloader.Config.0.2.1.nupkg'.
Microsoft (R) Build Engine version 14.0.25420.1
Copyright (C) Microsoft Corporation. All rights reserved.

Build started 2017-02-19 22:00:10.
The target "_ConvertPdbFiles" listed in a BeforeTargets attribute at "C:\Program Files (x86)\MSBuild\14.0\Microsoft.Com
mon.targets\ImportAfter\Xamarin.Common.targets (34,37)" does not exist in the project, and will be ignored.
The target "_CollectPdbFiles" listed in an AfterTargets attribute at "C:\Program Files (x86)\MSBuild\14.0\Microsoft.Com
mon.targets\ImportAfter\Xamarin.Common.targets (34,70)" does not exist in the project, and will be ignored.
The target "_CollectMdbFiles" listed in a BeforeTargets attribute at "C:\Program Files (x86)\MSBuild\14.0\Microsoft.Com
mon.targets\ImportAfter\Xamarin.Common.targets (41,38)" does not exist in the project, and will be ignored.
The target "_CopyMdbFiles" listed in an AfterTargets attribute at "C:\Program Files (x86)\MSBuild\14.0\Microsoft.Common
.targets\ImportAfter\Xamarin.Common.targets (41,71)" does not exist in the project, and will be ignored.
Project "C:\Users\Robert\Documents\GitHub\Sloader\src\Sloader.Result\Sloader.Result.csproj" on node 1 (Build target(s))
.
GenerateTargetFrameworkMonikerAttribute:
Skipping target "GenerateTargetFrameworkMonikerAttribute" because all output files are up-to-date with respect to the i
nput files.
CoreCompile:
Skipping target "CoreCompile" because all output files are up-to-date with respect to the input files.
CopyFilesToOutputDirectory:
  Sloader.Result -> C:\Users\Robert\Documents\GitHub\Sloader\src\Sloader.Result\bin\Release\Sloader.Result.dll
Done Building Project "C:\Users\Robert\Documents\GitHub\Sloader\src\Sloader.Result\Sloader.Result.csproj" (Build target
(s)).


Build succeeded.
    0 Warning(s)
    0 Error(s)

Time Elapsed 00:00:00.24
Attempting to build package from 'Sloader.Result.csproj'.
MSBuild auto-detection: using msbuild version '14.0' from 'C:\Program Files (x86)\MSBuild\14.0\bin'.
Packing files from 'C:\Users\Robert\Documents\GitHub\Sloader\src\Sloader.Result\bin\Release'.
Using 'Sloader.Result.nuspec' for metadata.
Found packages.config. Using packages listed as dependencies
Successfully created package 'C:\Users\Robert\Documents\GitHub\Sloader\artifacts\Sloader.Result.0.2.1.nupkg'.
Microsoft (R) Build Engine version 14.0.25420.1
Copyright (C) Microsoft Corporation. All rights reserved.

Build started 2017-02-19 22:00:12.
The target "_ConvertPdbFiles" listed in a BeforeTargets attribute at "C:\Program Files (x86)\MSBuild\14.0\Microsoft.Com
mon.targets\ImportAfter\Xamarin.Common.targets (34,37)" does not exist in the project, and will be ignored.
The target "_CollectPdbFiles" listed in an AfterTargets attribute at "C:\Program Files (x86)\MSBuild\14.0\Microsoft.Com
mon.targets\ImportAfter\Xamarin.Common.targets (34,70)" does not exist in the project, and will be ignored.
The target "_CollectMdbFiles" listed in a BeforeTargets attribute at "C:\Program Files (x86)\MSBuild\14.0\Microsoft.Com
mon.targets\ImportAfter\Xamarin.Common.targets (41,38)" does not exist in the project, and will be ignored.
The target "_CopyMdbFiles" listed in an AfterTargets attribute at "C:\Program Files (x86)\MSBuild\14.0\Microsoft.Common
.targets\ImportAfter\Xamarin.Common.targets (41,71)" does not exist in the project, and will be ignored.
The target "_ConvertPdbFiles" listed in a BeforeTargets attribute at "C:\Program Files (x86)\MSBuild\14.0\Microsoft.Com
mon.targets\ImportAfter\Xamarin.Common.targets (34,37)" does not exist in the project, and will be ignored.
The target "_CollectPdbFiles" listed in an AfterTargets attribute at "C:\Program Files (x86)\MSBuild\14.0\Microsoft.Com
mon.targets\ImportAfter\Xamarin.Common.targets (34,70)" does not exist in the project, and will be ignored.
The target "_CollectMdbFiles" listed in a BeforeTargets attribute at "C:\Program Files (x86)\MSBuild\14.0\Microsoft.Com
mon.targets\ImportAfter\Xamarin.Common.targets (41,38)" does not exist in the project, and will be ignored.
The target "_CopyMdbFiles" listed in an AfterTargets attribute at "C:\Program Files (x86)\MSBuild\14.0\Microsoft.Common
.targets\ImportAfter\Xamarin.Common.targets (41,71)" does not exist in the project, and will be ignored.
Project "C:\Users\Robert\Documents\GitHub\Sloader\src\Sloader.Engine\Sloader.Engine.csproj" on node 1 (Build target(s))
.
Project "C:\Users\Robert\Documents\GitHub\Sloader\src\Sloader.Engine\Sloader.Engine.csproj" (1) is building "C:\Users\R
obert\Documents\GitHub\Sloader\src\Sloader.Config\Sloader.Config.csproj" (2) on node 1 (default targets).
GenerateTargetFrameworkMonikerAttribute:
Skipping target "GenerateTargetFrameworkMonikerAttribute" because all output files are up-to-date with respect to the i
nput files.
CoreCompile:
Skipping target "CoreCompile" because all output files are up-to-date with respect to the input files.
CopyFilesToOutputDirectory:
  Sloader.Config -> C:\Users\Robert\Documents\GitHub\Sloader\src\Sloader.Config\bin\Release\Sloader.Config.dll
Done Building Project "C:\Users\Robert\Documents\GitHub\Sloader\src\Sloader.Config\Sloader.Config.csproj" (default targ
ets).

The target "_ConvertPdbFiles" listed in a BeforeTargets attribute at "C:\Program Files (x86)\MSBuild\14.0\Microsoft.Com
mon.targets\ImportAfter\Xamarin.Common.targets (34,37)" does not exist in the project, and will be ignored.
The target "_CollectPdbFiles" listed in an AfterTargets attribute at "C:\Program Files (x86)\MSBuild\14.0\Microsoft.Com
mon.targets\ImportAfter\Xamarin.Common.targets (34,70)" does not exist in the project, and will be ignored.
The target "_CollectMdbFiles" listed in a BeforeTargets attribute at "C:\Program Files (x86)\MSBuild\14.0\Microsoft.Com
mon.targets\ImportAfter\Xamarin.Common.targets (41,38)" does not exist in the project, and will be ignored.
The target "_CopyMdbFiles" listed in an AfterTargets attribute at "C:\Program Files (x86)\MSBuild\14.0\Microsoft.Common
.targets\ImportAfter\Xamarin.Common.targets (41,71)" does not exist in the project, and will be ignored.
Project "C:\Users\Robert\Documents\GitHub\Sloader\src\Sloader.Engine\Sloader.Engine.csproj" (1) is building "C:\Users\R
obert\Documents\GitHub\Sloader\src\Sloader.Result\Sloader.Result.csproj" (3) on node 1 (default targets).
GenerateTargetFrameworkMonikerAttribute:
Skipping target "GenerateTargetFrameworkMonikerAttribute" because all output files are up-to-date with respect to the i
nput files.
CoreCompile:
Skipping target "CoreCompile" because all output files are up-to-date with respect to the input files.
CopyFilesToOutputDirectory:
  Sloader.Result -> C:\Users\Robert\Documents\GitHub\Sloader\src\Sloader.Result\bin\Release\Sloader.Result.dll
Done Building Project "C:\Users\Robert\Documents\GitHub\Sloader\src\Sloader.Result\Sloader.Result.csproj" (default targ
ets).

BclBuildEnsureBindingRedirects:
Skipping target "BclBuildEnsureBindingRedirects" because all output files are up-to-date with respect to the input file
s.
GenerateTargetFrameworkMonikerAttribute:
Skipping target "GenerateTargetFrameworkMonikerAttribute" because all output files are up-to-date with respect to the i
nput files.
CoreCompile:
Skipping target "CoreCompile" because all output files are up-to-date with respect to the input files.
_CopyAppConfigFile:
Skipping target "_CopyAppConfigFile" because all output files are up-to-date with respect to the input files.
CopyFilesToOutputDirectory:
  Sloader.Engine -> C:\Users\Robert\Documents\GitHub\Sloader\src\Sloader.Engine\bin\Release\Sloader.Engine.dll
Done Building Project "C:\Users\Robert\Documents\GitHub\Sloader\src\Sloader.Engine\Sloader.Engine.csproj" (Build target
(s)).


Build succeeded.
    0 Warning(s)
    0 Error(s)

Time Elapsed 00:00:00.54
Attempting to build package from 'Sloader.Engine.csproj'.
MSBuild auto-detection: using msbuild version '14.0' from 'C:\Program Files (x86)\MSBuild\14.0\bin'.
Packing files from 'C:\Users\Robert\Documents\GitHub\Sloader\src\Sloader.Engine\bin\Release'.
Using 'Sloader.Engine.nuspec' for metadata.
Found packages.config. Using packages listed as dependencies
Successfully created package 'C:\Users\Robert\Documents\GitHub\Sloader\artifacts\Sloader.Engine.0.2.1.nupkg'.
Finished executing task: BuildPackages

Task                          Duration
--------------------------------------------------
Clean                         00:00:00.1083837
Restore-NuGet-Packages        00:00:00.7808530
BuildTests                    00:00:02.6296445
RunTests                      00:00:05.9397822
BuildPackages                 00:00:05.2679058
--------------------------------------------------
Total:                        00:00:14.7265692
