
.NET Developer Blogs

5 kostenlose eBooks über Machine Learning

23.05.2017 13:15:03 | Kazim Bahar

Articles on artificial intelligence and machine learning appear every day. But what does that have to do with UI design and...

Scientific studies on code quality

23.05.2017 12:10:43 | Hendrik Loesch

It is obvious to anyone who has ever had to read code without meaningful variable names that the way we write code directly influences how well it can be understood. These relationships have meanwhile become the subject of scientific studies. I would like to link two papers here that deal with this topic. Shorter Identifier Names [...]

Providing files for automated tests

22.05.2017 18:48:20 | Hendrik Loesch

The well-known unit testing frameworks are used not only for unit tests, but for automated tests in general, and thus also for integration tests. In this context it quickly becomes necessary to provide a set of files as well. Not infrequently, this ends with those files not being found during the test [...]

Material from the Seacon

15.05.2017 13:36:00 | Jörg Neumann

Material from the JAX

15.05.2017 13:35:00 | Jörg Neumann

ASP.NET Core in trouble

09.05.2017 21:00:00 | Jürgen Gutsch

ASP.NET Core today

Currently, ASP.NET Core - Microsoft's new web framework - can be used on top of .NET Core and on top of the .NET Framework. This is pretty nice, because you are able to use all the new features of ASP.NET Core with the power of the huge but well-known .NET Framework. On the other hand, the new cross-platform .NET Core is also nice, but comes with a smaller set of features. Today you have the choice between being cross-platform and using the full .NET Framework. This isn't really bad.

Actually it could be better. Let's see why:

What is the issue?

Microsoft removed support for the full .NET Framework from ASP.NET Core 2.0, and some developers are not really happy about that. See this GitHub issue thread. ASP.NET Core 2.0 will only run on .NET Core 2.0. This fact resulted in a hot discussion within that GitHub issue.

It also resulted in some misleading and confusing headlines and articles from some German IT news publishers:

While the discussion was running, David Fowler said on Twitter that it's best to think of ASP.NET Core 2.0 and .NET Core 2.0 as the same product.

Does this make sense?

I followed the discussion and thought a lot about it. And yes, it starts to make sense to me.

.NET Standard

What many people don't recognize, or simply forget, is the .NET Standard. The .NET Standard is an API definition that tries to unify the APIs of .NET Core, .NET Framework, and Xamarin. But it actually does a little more: it provides the API as a set of assemblies which forward the types to the right framework.

Does it make sense to you? (Read more about the .NET Standard in this documentation)

Currently, ASP.NET Core runs on top of .NET Core and the .NET Framework, but actually uses a framework based on .NET Standard 1.4 and higher. All the libraries referenced by ASP.NET Core are based on .NET Standard 1.4 or higher. Let's call them ".NET Standard libraries" ;) These libraries contain all the needed features, but don't reference a specific platform; they reference the .NET Standard API.

You are also able to create this kind of library with Visual Studio 2017.

By creating such libraries you provide your functionality to multiple platforms like Xamarin, the .NET Framework, and .NET Core (depending on the .NET Standard version you choose). Isn't that good?

And in .NET Framework apps you are able to reference .NET Standard based libraries.
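For illustration, such a class library in Visual Studio 2017 boils down to a small SDK-style project file. This is only a sketch; it targets .NET Standard 1.4 here, but the TargetFramework value depends on the standard version you choose:

```xml
<!-- Sketch of a .NET Standard class library project file (VS 2017 SDK-style csproj) -->
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>netstandard1.4</TargetFramework>
  </PropertyGroup>
</Project>
```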

About runtimes

.NET Core is just a runtime to run apps on Linux, Mac, and Windows. Let's see the full .NET Framework as a runtime to run WPF apps, WinForms apps, and classic ASP.NET apps on Windows. Let's also see Xamarin as a runtime to run apps on iOS and Android.

Let's also assume that .NET Standard 2.0, once it is finished, will provide almost the full API of the .NET Framework to your application.

Do we really need the full .NET Framework for ASP.NET Core in this case? No, we don't.

What if ...

  • ... the .NET Framework, .NET Core, and Xamarin are just runtimes?
  • ... .NET Standard 2.0 is as complete as the .NET Framework?
  • ... .NET Standard 2.0 libraries have the same features as the .NET Framework?
  • ... ASP.NET Core 2.0 uses the .NET Standard 2.0 libraries?

Do we really need the full .NET Framework as a runtime for ASP.NET Core?

I think, no!

Does it also make sense to use the full .NET Framework as a runtime for Xamarin apps?

I also think, no.

Conclusion

ASP.NET Core and .NET Core shouldn't really be shipped as one product, because ASP.NET Core sits on top of .NET Core, and another technology could also sit on top of .NET Core in the future. But maybe it makes sense to ship them as one product, as David said, to tell people that ASP.NET Core 2.0 is based on .NET Core 2.0 and needs the .NET Core runtime. (Classic ASP.NET is also shipped with the full .NET Framework.)

  • .NET Core, Xamarin, and the full .NET Framework are just runtimes.
  • .NET Standard 2.0 will have almost the same API as the .NET Framework.
  • .NET Standard 2.0 libraries will have the same set of features as the .NET Framework.
  • ASP.NET Core 2.0 uses .NET Standard 2.0 libraries as its framework.

Given these facts, Microsoft's decision to run ASP.NET Core 2.0 on .NET Core 2.0 only doesn't sound that evil anymore.

From my perspective, ASP.NET Core is not in trouble; it's all fine, and it makes absolute sense. The trouble only exists in the discussion about it on GitHub and Twitter :)

What do you think? Do you agree?

Let's see what Microsoft will tell us about it at the Microsoft Build conference in the coming days. Follow the live stream: https://channel9.msdn.com/Events/Build/2017

[Update 2017-05-11]

Yesterday, in the official blog post announcing ASP.NET Core 2.0 Preview 1 (Announcing ASP.NET 2.0.0-Preview1 and Updates for .NET Web Developers), Jeff Fritz wrote that Preview 1 is limited to .NET Core 2.0 only. The overall goal is to put ASP.NET Core 2.0 on top of .NET Standard 2.0. This means it will be possible to run ASP.NET Core 2.0 apps on .NET Core, Mono, and the full .NET Framework.

Material from the .NET Summit 2017

08.05.2017 11:08:00 | Jörg Neumann

The .NET Summit was a real highlight again this year!
Here is the material from my sessions:


The material for my workshop "Architectures for XAML-based Apps" is available on request.

Adding a custom dependency injection container in ASP.NET Core

07.05.2017 21:00:00 | Jürgen Gutsch

ASP.NET Core is pretty flexible, customizable and extendable. You are able to change almost everything. Even the built-in dependency injection container can be replaced. This blog post will show you how to replace the existing DI container with another one. I'm going to use Autofac as a replacement.

Why should I do this?

There are not many reasons to replace the built-in dependency injection container, because it works pretty well in most cases.

However, if you prefer a different dependency injection container, you are able to use it. Maybe you know a faster container, or you like the nice features of Ninject, such as loading dependencies dynamically from the assemblies in a specific folder, by file patterns, and so on. I really miss these features in the built-in container. It is possible to use another solution to load dependencies from other libraries, but it is not as dynamic as the Ninject way.

Setup the Startup.cs

In ASP.NET Core, the IServiceProvider is the component that resolves and creates dependencies out of an IServiceCollection. If you want to add dependencies to the IServiceProvider, you need to manipulate the IServiceCollection in the ConfigureServices method of the Startup.cs.

The solution is to read the contents of the IServiceCollection into your own container and to provide your own implementation of an IServiceProvider to the application. Reading the IServiceCollection into a different container isn't trivial, because you need to translate the different mapping types, which are probably not all available in all containers. E.g., the scoped registration (a per-request singleton) is a special one that is only needed in web applications and not implemented in all containers.
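As a rough pseudocode sketch of that translation (CustomContainer and its Register* methods are invented here for illustration and are not part of any real library; factory and instance registrations are ignored for brevity), the loop could look like this:

```csharp
// Sketch: translating ServiceDescriptor lifetimes into a hypothetical container.
foreach (var descriptor in services)
{
    switch (descriptor.Lifetime)
    {
        case ServiceLifetime.Singleton:
            container.RegisterSingleton(descriptor.ServiceType, descriptor.ImplementationType);
            break;
        case ServiceLifetime.Scoped:
            // the "per request singleton"; not every container supports this out of the box
            container.RegisterScoped(descriptor.ServiceType, descriptor.ImplementationType);
            break;
        case ServiceLifetime.Transient:
            container.RegisterTransient(descriptor.ServiceType, descriptor.ImplementationType);
            break;
    }
}
```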

Providing a custom IServiceProvider is possible by changing the ConfigureServices method a little bit:

public IServiceProvider ConfigureServices(IServiceCollection services)
{
  // Add framework services.
  services.AddMvc();

  return services.BuildServiceProvider();
}

The method now returns an IServiceProvider, which is created in the last line out of the IServiceCollection. It is necessary to add the contents of the service collection to the container you want to use, because ASP.NET Core actually adds around 40 dependencies before this method is called:

1: Singleton - Microsoft.AspNetCore.Hosting.IHostingEnvironment => Microsoft.AspNetCore.Hosting.Internal.HostingEnvironment
2: Singleton - Microsoft.Extensions.Logging.ILogger`1 => Microsoft.Extensions.Logging.Logger`1
3: Transient - Microsoft.AspNetCore.Hosting.Builder.IApplicationBuilderFactory => Microsoft.AspNetCore.Hosting.Builder.ApplicationBuilderFactory
4: Transient - Microsoft.AspNetCore.Http.IHttpContextFactory => Microsoft.AspNetCore.Http.HttpContextFactory
5: Singleton - Microsoft.Extensions.Options.IOptions`1 => Microsoft.Extensions.Options.OptionsManager`1
6: Singleton - Microsoft.Extensions.Options.IOptionsMonitor`1 => Microsoft.Extensions.Options.OptionsMonitor`1
7: Scoped - Microsoft.Extensions.Options.IOptionsSnapshot`1 => Microsoft.Extensions.Options.OptionsSnapshot`1
8: Transient - Microsoft.AspNetCore.Hosting.IStartupFilter => Microsoft.AspNetCore.Hosting.Internal.AutoRequestServicesStartupFilter
9: Transient - Microsoft.Extensions.DependencyInjection.IServiceProviderFactory`1[[Microsoft.Extensions.DependencyInjection.IServiceCollection, Microsoft.Extensions.DependencyInjection.Abstractions, Version=1.1.0.0, Culture=neutral, PublicKeyToken=adb9793829ddae60]] => Microsoft.Extensions.DependencyInjection.DefaultServiceProviderFactory
10: Singleton - Microsoft.Extensions.ObjectPool.ObjectPoolProvider => Microsoft.Extensions.ObjectPool.DefaultObjectPoolProvider
11: Transient - Microsoft.Extensions.Options.IConfigureOptions`1[[Microsoft.AspNetCore.Server.Kestrel.KestrelServerOptions, Microsoft.AspNetCore.Server.Kestrel, Version=1.1.1.0, Culture=neutral, PublicKeyToken=adb9793829ddae60]] => Microsoft.AspNetCore.Server.Kestrel.Internal.KestrelServerOptionsSetup
12: Singleton - Microsoft.AspNetCore.Hosting.Server.IServer => Microsoft.AspNetCore.Server.Kestrel.KestrelServer
13: Singleton - Microsoft.AspNetCore.Hosting.IStartup => Microsoft.AspNetCore.Hosting.ConventionBasedStartup
14: Singleton - Microsoft.AspNetCore.Http.IHttpContextAccessor => Microsoft.AspNetCore.Http.HttpContextAccessor
15: Singleton - Microsoft.ApplicationInsights.Extensibility.ITelemetryInitializer => Microsoft.ApplicationInsights.AspNetCore.TelemetryInitializers.AzureWebAppRoleEnvironmentTelemetryInitializer
16: Singleton - Microsoft.ApplicationInsights.Extensibility.ITelemetryInitializer => Microsoft.ApplicationInsights.AspNetCore.TelemetryInitializers.DomainNameRoleInstanceTelemetryInitializer
17: Singleton - Microsoft.ApplicationInsights.Extensibility.ITelemetryInitializer => Microsoft.ApplicationInsights.AspNetCore.TelemetryInitializers.ComponentVersionTelemetryInitializer
18: Singleton - Microsoft.ApplicationInsights.Extensibility.ITelemetryInitializer => Microsoft.ApplicationInsights.AspNetCore.TelemetryInitializers.ClientIpHeaderTelemetryInitializer
19: Singleton - Microsoft.ApplicationInsights.Extensibility.ITelemetryInitializer => Microsoft.ApplicationInsights.AspNetCore.TelemetryInitializers.OperationIdTelemetryInitializer
20: Singleton - Microsoft.ApplicationInsights.Extensibility.ITelemetryInitializer => Microsoft.ApplicationInsights.AspNetCore.TelemetryInitializers.OperationNameTelemetryInitializer
21: Singleton - Microsoft.ApplicationInsights.Extensibility.ITelemetryInitializer => Microsoft.ApplicationInsights.AspNetCore.TelemetryInitializers.SyntheticTelemetryInitializer
22: Singleton - Microsoft.ApplicationInsights.Extensibility.ITelemetryInitializer => Microsoft.ApplicationInsights.AspNetCore.TelemetryInitializers.WebSessionTelemetryInitializer
23: Singleton - Microsoft.ApplicationInsights.Extensibility.ITelemetryInitializer => Microsoft.ApplicationInsights.AspNetCore.TelemetryInitializers.WebUserTelemetryInitializer
24: Singleton - Microsoft.ApplicationInsights.Extensibility.ITelemetryInitializer => Microsoft.ApplicationInsights.AspNetCore.TelemetryInitializers.AspNetCoreEnvironmentTelemetryInitializer
25: Singleton - Microsoft.ApplicationInsights.Extensibility.TelemetryConfiguration => Microsoft.ApplicationInsights.Extensibility.TelemetryConfiguration
26: Singleton - Microsoft.ApplicationInsights.TelemetryClient => Microsoft.ApplicationInsights.TelemetryClient
27: Singleton - Microsoft.ApplicationInsights.AspNetCore.ApplicationInsightsInitializer => Microsoft.ApplicationInsights.AspNetCore.ApplicationInsightsInitializer
28: Singleton - Microsoft.ApplicationInsights.AspNetCore.DiagnosticListeners.IApplicationInsightDiagnosticListener => Microsoft.ApplicationInsights.AspNetCore.DiagnosticListeners.HostingDiagnosticListener
29: Singleton - Microsoft.ApplicationInsights.AspNetCore.DiagnosticListeners.IApplicationInsightDiagnosticListener => Microsoft.ApplicationInsights.AspNetCore.DiagnosticListeners.MvcDiagnosticsListener
30: Singleton - Microsoft.AspNetCore.Hosting.IStartupFilter => Microsoft.ApplicationInsights.AspNetCore.ApplicationInsightsStartupFilter
31: Singleton - Microsoft.ApplicationInsights.AspNetCore.JavaScriptSnippet => Microsoft.ApplicationInsights.AspNetCore.JavaScriptSnippet
32: Singleton - Microsoft.ApplicationInsights.AspNetCore.Logging.DebugLoggerControl => Microsoft.ApplicationInsights.AspNetCore.Logging.DebugLoggerControl
33: Singleton - Microsoft.Extensions.Options.IOptions`1[[Microsoft.ApplicationInsights.Extensibility.TelemetryConfiguration, Microsoft.ApplicationInsights, Version=2.2.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35]] => Microsoft.Extensions.DependencyInjection.TelemetryConfigurationOptions
34: Singleton - Microsoft.Extensions.Options.IConfigureOptions`1[[Microsoft.ApplicationInsights.Extensibility.TelemetryConfiguration, Microsoft.ApplicationInsights, Version=2.2.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35]] => Microsoft.Extensions.DependencyInjection.TelemetryConfigurationOptionsSetup
35: Singleton - Microsoft.Extensions.Options.IConfigureOptions`1[[Microsoft.ApplicationInsights.AspNetCore.Extensions.ApplicationInsightsServiceOptions, Microsoft.ApplicationInsights.AspNetCore, Version=2.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35]] => Microsoft.AspNetCore.Hosting.DefaultApplicationInsightsServiceConfigureOptions
36: Singleton - Microsoft.Extensions.Logging.ILoggerFactory => Microsoft.Extensions.Logging.LoggerFactory
37: Singleton - System.Diagnostics.DiagnosticListener => System.Diagnostics.DiagnosticListener
38: Singleton - System.Diagnostics.DiagnosticSource => System.Diagnostics.DiagnosticListener
39: Singleton - Microsoft.AspNetCore.Hosting.IApplicationLifetime => Microsoft.AspNetCore.Hosting.Internal.ApplicationLifetime

About 140 more services get added by the AddMvc() method, and even more if you use additional components and frameworks, like Identity and Entity Framework Core.

Because of that, you should add the framework services to the IServiceCollection in the common way and read the added services into the other container afterwards.

The next lines of dummy code show what the implementation could look like:

public IServiceProvider ConfigureServices(IServiceCollection services)
{
  // Add framework services.  
  services.AddDbContext<ApplicationDbContext>(options =>
    options.UseSqlite(Configuration.GetConnectionString("DefaultConnection")));

  services.AddIdentity<ApplicationUser, IdentityRole>()
    .AddEntityFrameworkStores<ApplicationDbContext>()
    .AddDefaultTokenProviders();

  services.AddMvc();
  services.AddOtherStuff();

  // create custom container
  var container = new CustomContainer();
  
  // read service collection to the custom container
  container.RegisterFromServiceCollection(services);

  // use and configure the custom container
  container.RegisterSingleton<IProvider, MyProvider>();

  // creating the IServiceProvider out of the custom container
  return container.BuildServiceProvider();
}

The details of the implementation depend on how the container works. E.g., if I'm right, Laurent Bugnion's SimpleIoc already is an IServiceProvider and could be returned directly. Let's see how this works with Autofac:

Replacing with Autofac

Autofac provides an extension library to support this container in ASP.NET Core projects. I added both the container and the extension library packages from NuGet:

Autofac, 4.5.0
Autofac.Extensions.DependencyInjection, 4.1.0

I also added the related usings to the Startup.cs:

using Autofac;
using Autofac.Extensions.DependencyInjection;

Now I'm able to create the Autofac container in the ConfigureServices method:

public IServiceProvider ConfigureServices(IServiceCollection services)
{
  // Add framework services.  
  services.AddDbContext<ApplicationDbContext>(options =>
    options.UseSqlite(Configuration.GetConnectionString("DefaultConnection")));

  services.AddIdentity<ApplicationUser, IdentityRole>()
    .AddEntityFrameworkStores<ApplicationDbContext>()
    .AddDefaultTokenProviders();

  services.AddMvc();
  services.AddOtherStuff();
  
  // create an Autofac container builder
  var builder = new ContainerBuilder();

  // read service collection to Autofac
  builder.Populate(services);

  // use and configure Autofac
  builder.RegisterType<MyProvider>().As<IProvider>();

  // build the Autofac container
  ApplicationContainer = builder.Build();
  
  // creating the IServiceProvider out of the Autofac container
  return new AutofacServiceProvider(ApplicationContainer);
}

// IContainer instance in the Startup class 
public IContainer ApplicationContainer { get; private set; }

With this implementation, Autofac is used as the dependency injection container in this ASP.NET Core application.

If you also want to resolve the controllers from the container, you should add them to the container too. Otherwise the framework will resolve the controllers, and some special DI cases are not possible. A small call adds the controllers to the IServiceCollection:

services.AddMvc().AddControllersAsServices();

That's it.

More about Autofac: http://docs.autofac.org/en/latest/integration/aspnetcore.html

Conclusion

Fortunately, Autofac supports .NET Standard 1.6, and there is this nice extension library to get it working in ASP.NET Core too. Some other containers don't, and it takes some more effort to get them running.

How to add custom logging in ASP.NET Core

04.05.2017 21:00:00 | Jürgen Gutsch

ASP.NET Core is pretty flexible, customizable, and extendable. You are able to change almost everything, even the logging. If you don't like the built-in logging, you are able to plug in your own logger or an existing logger like log4net, NLog, or ELMAH. In this post I'm going to show you how to add a custom logger.

The logger I'm going to show you just writes to the console, and only for one single log level. The feature is that you can configure a different font color per LogLevel. Hence this logger is called ColoredConsoleLogger.

General

To add a custom logger, you need to add an ILoggerProvider to the ILoggerFactory that is provided in the Configure method of the Startup.cs:

loggerFactory.AddProvider(new CustomLoggerProvider(new CustomLoggerConfiguration()));

The ILoggerProvider creates one or more ILogger instances, which are used by the framework to log information.

The Configuration

The idea is to create differently colored console entries per log level and event ID. To configure this, we need a configuration type like this:

public class ColoredConsoleLoggerConfiguration
{
  public LogLevel LogLevel { get; set; } = LogLevel.Warning;
  public int EventId { get; set; } = 0;
  public ConsoleColor Color { get; set; } = ConsoleColor.Yellow;
}

This sets the default level to Warning and the color to Yellow. If the EventId is set to 0, we will log all events.

The Logger

The logger gets a name and the configuration passed in via the constructor. The name is the category name, which usually is the logging source, e.g. the type the logger is created in:

public class ColoredConsoleLogger : ILogger
{
  private readonly string _name;
  private readonly ColoredConsoleLoggerConfiguration _config;

  public ColoredConsoleLogger(string name, ColoredConsoleLoggerConfiguration config)
  {
    _name = name;
    _config = config;
  }

  public IDisposable BeginScope<TState>(TState state)
  {
    return null;
  }

  public bool IsEnabled(LogLevel logLevel)
  {
    return logLevel == _config.LogLevel;
  }

  public void Log<TState>(LogLevel logLevel, EventId eventId, TState state, Exception exception, Func<TState, Exception, string> formatter)
  {
    if (!IsEnabled(logLevel))
    {
      return;
    }

    if (_config.EventId == 0 || _config.EventId == eventId.Id)
    {
      var color = Console.ForegroundColor;
      Console.ForegroundColor = _config.Color;
      Console.WriteLine($"{logLevel.ToString()} - {eventId.Id} - {_name} - {formatter(state, exception)}");
      Console.ForegroundColor = color;
    }
  }
}

We are going to create a logger instance per category name with the provider.

The LoggerProvider

The LoggerProvider is the component that creates the logger instances. Maybe it is not necessary to create a logger instance per category, but this makes sense for some loggers, like NLog or log4net. This way you are also able to choose different logging output targets per category if needed:

  public class ColoredConsoleLoggerProvider : ILoggerProvider
  {
    private readonly ColoredConsoleLoggerConfiguration _config;
    private readonly ConcurrentDictionary<string, ColoredConsoleLogger> _loggers = new ConcurrentDictionary<string, ColoredConsoleLogger>();

    public ColoredConsoleLoggerProvider(ColoredConsoleLoggerConfiguration config)
    {
      _config = config;
    }

    public ILogger CreateLogger(string categoryName)
    {
      return _loggers.GetOrAdd(categoryName, name => new ColoredConsoleLogger(name, _config));
    }

    public void Dispose()
    {
      _loggers.Clear();
    }
  }

There's no magic here. The method CreateLogger creates a single instance of the ColoredConsoleLogger per category name and stores it in the dictionary.

Usage

Now we are able to use the logger in the Startup.cs:

public void Configure(IApplicationBuilder app, IHostingEnvironment env, ILoggerFactory loggerFactory)
{
  loggerFactory.AddConsole(Configuration.GetSection("Logging"));
  loggerFactory.AddDebug();
  // here is our CustomLogger
  loggerFactory.AddProvider(new ColoredConsoleLoggerProvider(new ColoredConsoleLoggerConfiguration
  {
    LogLevel = LogLevel.Information,
    Color = ConsoleColor.Blue
  }));
  loggerFactory.AddProvider(new ColoredConsoleLoggerProvider(new ColoredConsoleLoggerConfiguration
  {
    LogLevel = LogLevel.Debug,
    Color = ConsoleColor.Gray
  }));
}

But this doesn't really look nice from my point of view. I want to use something like this:

loggerFactory.AddColoredConsoleLogger(c =>
{
  c.LogLevel = LogLevel.Information;
  c.Color = ConsoleColor.Blue;
});
loggerFactory.AddColoredConsoleLogger(c =>
{
  c.LogLevel = LogLevel.Debug;
  c.Color = ConsoleColor.Gray;
});

This means we need to write at least one extension method for the ILoggerFactory:

public static class ColoredConsoleLoggerExtensions
{
  public static ILoggerFactory AddColoredConsoleLogger(this ILoggerFactory loggerFactory, ColoredConsoleLoggerConfiguration config)
  {
    loggerFactory.AddProvider(new ColoredConsoleLoggerProvider(config));
    return loggerFactory;
  }
  public static ILoggerFactory AddColoredConsoleLogger(this ILoggerFactory loggerFactory)
  {
    var config = new ColoredConsoleLoggerConfiguration();
    return loggerFactory.AddColoredConsoleLogger(config);
  }
  public static ILoggerFactory AddColoredConsoleLogger(this ILoggerFactory loggerFactory, Action<ColoredConsoleLoggerConfiguration> configure)
  {
    var config = new ColoredConsoleLoggerConfiguration();
    configure(config);
    return loggerFactory.AddColoredConsoleLogger(config);
  }
}

With these extension methods we are able to pass in an already defined configuration object, use the default configuration, or use the configure action as shown in the previous example:

loggerFactory.AddColoredConsoleLogger();
loggerFactory.AddColoredConsoleLogger(new ColoredConsoleLoggerConfiguration
{
  LogLevel = LogLevel.Debug,
  Color = ConsoleColor.Gray
});
loggerFactory.AddColoredConsoleLogger(c =>
{
  c.LogLevel = LogLevel.Information;
  c.Color = ConsoleColor.Blue;
});

Conclusion

This is how the output of that nonsense logger looks:

Now it's up to you to create a logger that writes the entries to a database, a log file, or wherever else, or to just add an existing logger to your ASP.NET Core application.

Error while starting Docker for Windows

27.04.2017 21:00:00 | Jürgen Gutsch

Over the last couple of months I wanted to play around with Docker for Windows. It worked just twice: once at the first try, for just one or two weeks; then I got an error whenever Docker tried to initialize right after Windows started. After I reinstalled Docker for Windows, it ran a second time for two or three weeks. I tried to reinstall all that stuff, but I didn't get it running again on my machine.

The error shown in this dialog is not really meaningful:

Object reference not set to an instance of an object...

Even the log didn't really help:

Version: 17.03.1-ce-win5 (10743)
Channel: stable
Sha1: b18e2a50ccf296bcd637b330c0ca9faaab9d790c
Started on: 2017/04/28 21:49:37.965
Resources: C:\Program Files\Docker\Docker\Resources
OS: Windows 10 Pro
Edition: Professional
Id: 1607
Build: 14393
BuildLabName: 14393.1066.amd64fre.rs1_release_sec.170327-1835
File: C:\Users\Juergen\AppData\Local\Docker\log.txt
CommandLine: "Docker for Windows.exe"
You can send feedback, including this log file, at https://github.com/docker/for-win/issues
[21:49:38.182][GUI            ][Info   ] Starting...
[21:49:38.669][GUI            ][Error  ] Object reference not set to an instance of an object.
[21:49:45.081][ErrorReportWindow][Info   ] Open logs

Today I found some time to search for a solution, and fortunately I'm not the only one who has faced this error. I found an issue on GitHub which describes exactly this behavior: https://github.com/docker/for-win/issues/464#issuecomment-277700901

The solution is to delete all the files inside of that folder:

C:\Users\<UserName>\AppData\Roaming\Docker\

Then I just needed to restart Docker for Windows by calling the Docker for Windows.exe in C:\Program Files\Docker\Docker\

Finally I can continue playing around with Docker :)

.NET: Visual Studio Code and .NET Core

25.04.2017 20:34:08 | Steffen Steinbrecher

Visual Studio Code is a free and open-source text editor that is available cross-platform for Windows, macOS, and Linux. Apart from the name and some features such as debugging, IntelliSense, and version control, Visual Studio Code has nothing in common with Visual Studio. In contrast to Visual Studio, Visual Studio Code does not work [...]

Mobile into old age: Microsoft HoloLens supports thyssenkrupp in the production of individually manufactured stairlifts

24.04.2017 15:11:53 | Mathias Gronau

Practical examples for the use of the HoloLens mostly come from the production environment, supporting the assembly and repair of complex machines. At CeBIT, Microsoft and thyssenkrupp are showing an example of using the HoloLens in which end consumers benefit directly from the technology. To deliver tailor-made home mobility solutions, thyssenkrupp relies on Microsoft HoloLens. The mixed-reality headset lets customers visualize a stairlift product in their own home in real time. By speeding up several process steps, delivery times up to four times faster become possible. Microsoft HoloLens thus becomes an essential tool at thyssenkrupp for providing individual mobility solutions in people's own homes. The demand for such solutions will rise rapidly: besides the fast-growing urban population, better healthcare and a higher standard of living also contribute to a longer life expectancy. The global population is expected to grow from currently 7.4 billion to over 11 billion by 2100 - and to age. In 2016, one in eight people worldwide was already over 60 years old. For Germany, the Federal Statistical Office calculates that 69 percent of the population will be of retirement age in 2060. These figures illustrate that the mobility of older people in cities is a growing challenge. Mobility solutions such as stairlifts already contribute to a higher quality of life for older people, who can thus live longer and more independently in their own homes. However, several aspects of designing and delivering home mobility solutions lead to high complexity: every home situation has its peculiarities, so every stairlift is in principle a one-off, tailored to individual requirements.
In addition, most customers only buy such a product when their mobility is already severely restricted. This makes the purchase an emotional experience in which they have to come to terms with their new limitations. In this situation, customers often want their purchases delivered immediately; in other words, time is of the essence. Microsoft HoloLens will significantly help customers design mobility solutions individually. Stairs can now be measured immediately, taking ergonomics and obstacles such as radiators, light and electrical fittings, or the proximity to the wall into account. Customers can conveniently see on a tablet how the stairlift will look on their stairs and make decisions about upholstery, the color of the chair and rails, and custom features. This way they receive a product that fits the look of their home. "New realities require new solutions," comments Andreas Schierenbeck, CEO of thyssenkrupp Elevator. "We see Microsoft HoloLens as an enabler that helps us reinvent the customer experience for home solutions. In doing so, we contribute to providing continuous quality of life for the aging population - regardless of their mobility restrictions." thyssenkrupp has already used Microsoft HoloLens in the home sector with more than 100 customers in the Netherlands, Spain, and Germany, with excellent results and positive customer feedback. The next step is the Germany-wide rollout.

.NET CultureInfo in Windows 10

24.04.2017 03:45:00 |

Did you know that the CultureInfo behavior with “unknown” cultures has changed with Windows 10?

I have stumbled over this “problem” twice now - so that's enough reason to write a short blog post about it.

Demo Code

Let's use this demo code:

    try
    {
        // ok on Win10, but not on pre-Win10 if the culture is not registered
        CultureInfo culture1 = new CultureInfo("foo");
        CultureInfo culture2 = new CultureInfo("xyz");
        CultureInfo culture3 = new CultureInfo("en-xy");

        // not ok even on Win10 - exception
        CultureInfo culture4 = new CultureInfo("foox");
    }
    catch (Exception exc)
    {

    }

Windows 10 Case

If you run this code on Windows 10, it only fails for the “foox” culture, because that is not a valid culture name at all.

“culture1”, “culture2” and “culture3” are all valid cultures in the Windows 10 world, but they are resolved as “Unknown Locale” with LCID 4096.

I guess Windows looks for a two- or three-letter ISO-style language code, and “foox” doesn’t match this pattern.

Pre-Windows 10 - e.g. running on Windows Server 2012 R2

If you ran the same code under Windows Server 2012 R2, it would already fail on the first culture, because there is no “foo” culture registered.

“Problem”

The main “problem” is that this behavior could lead to some production issues if you develop with Windows 10 and the software is running on a Win 2012 server.

If you are managing “language” content in your application, be aware of this “limitation” on older Windows versions.

I discovered this problem while debugging our backend admin application. With this ASP.NET frontend it is possible to add or manage “localized” content, and the dropdown of possible languages listed a whole bunch of very special, but “Unknown locale” cultures. So we needed to filter out all LCID 4096 cultures to make sure it would run under all Windows versions.
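A filter like the one described above can be sketched as follows (the class and method names are made up for illustration; this is not our actual admin code):

```csharp
using System;
using System.Globalization;
using System.Linq;

public static class CultureFilter
{
    // LCID 4096 marks cultures that Windows 10 resolves as "Unknown Locale".
    private const int UnknownLocaleLcid = 4096;

    // Returns only cultures that are also known on pre-Windows 10 systems.
    public static CultureInfo[] GetPortableCultures()
    {
        return CultureInfo.GetCultures(CultureTypes.AllCultures)
            .Where(c => c.LCID != UnknownLocaleLcid)
            .OrderBy(c => c.DisplayName)
            .ToArray();
    }

    public static void Main()
    {
        foreach (var culture in GetPortableCultures().Take(10))
        {
            Console.WriteLine($"{culture.Name} ({culture.LCID}): {culture.DisplayName}");
        }
    }
}
```

Feeding the dropdown from `GetPortableCultures()` keeps the list consistent across Windows versions.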

MSDN

This behavior is also documented on MSDN.

The “Unknown culture” LCID 4096 was introduced with Windows Vista, but only since Windows 10 is it “easily” usable within the .NET Framework.

12 Open Source UI Frameworks for WPF Applications

21.04.2017 23:48:11 | Kazim Bahar

On GitHub, now the most popular source code hosting platform, there are quite a few freely available UI frameworks for .NET applications. The frameworks for WPF listed here are...

Thorsten Herrmann Takes Over Microsoft's Enterprise Business

20.04.2017 12:09:06 | Mathias Gronau

Thorsten Herrmann (49) joined the management board of Microsoft Germany on April 1 and will be responsible for the enterprise and partner business in Germany as of July 1. He reports directly to Sabine Bendiek, Managing Director of Microsoft Germany. Thorsten Herrmann takes over from Alexander Stüger, who had held the position on an interim basis since Alastair Bruce's departure in July 2016. Alexander Stüger's new role within the Microsoft organization will be announced at a later date. Thorsten Herrmann joins Microsoft from Hewlett Packard Enterprise (HPE), where he spent the last three years as Vice President responsible for the global business relationship between HP (later HPE) and SAP. A graduate in business informatics, he started his career at IBM in 1989. In 1997 he moved to Compaq Computer GmbH into sales for SAP R/3 infrastructure solutions and later became a regional sales manager. After the merger with HP, Thorsten Herrmann held various senior sales positions before being appointed to the management board of Hewlett Packard GmbH in April 2009, where, as Vice President, he took on responsibility for enterprise sales at HP Germany. "At the Hanover Fair next week we will see how fast the digitalization of traditional industries is progressing. From what I have observed, Microsoft makes a substantial contribution to effectively supporting industrial companies in their digital transformation. I am very glad to contribute my experience and expertise," Thorsten Herrmann told employees.
"With Thorsten Herrmann we have won a top manager with extensive sales expertise and market knowledge, and I am very much looking forward to working with him personally," said Sabine Bendiek, commenting on the appointment of her new management board colleague.

A first glimpse into CAKE

19.04.2017 21:00:00 | Jürgen Gutsch

For a couple of years I have used FAKE (F# Make) to configure my builds. We also use FAKE in some projects at the YooApps; one of them has used it for more than two years. FAKE is really great and I love to use it, but there is one problem with it: most C# developers don't really like to use new things. The worst case for most C# developers - it seems - is a new tool that uses an exotic language like F#.

This is why I have had to maintain the FAKE build scripts ever since I introduced FAKE to the team.

It is the new tool and the F# language that must seem scary to them, even though they don't really need to learn F# for the most common scenarios. That's why I asked my fellow developers to try CAKE (C# Make).

  • It is C# make instead of F# make
  • It looks pretty similar
  • It works the same way
  • It is a scripting language
  • It works almost everywhere

They really liked the idea of using CAKE. Why? Just because of C#? It seems so...

It doesn't really make sense to me, but it absolutely makes sense that the developers use and maintain their own build configurations.

How does CAKE work?

CAKE is built on a C# scripting language and uses the Roslyn compiler to compile the scripts. Instead of using batch files, as FAKE does, it uses a PowerShell script (build.ps1) to bootstrap itself and to run the build script. The bootstrapping step loads CAKE and some dependencies using NuGet. The last step the PowerShell script performs is calling cake.exe to execute the build script.

The bootstrapping step needs network access to download all of this. It also downloads nuget.exe if it's not available. If you don't like this, you can also commit the downloaded dependencies to the source code repository.

The documentation is great. Just follow the getting started guide to get a working example. There is also nice documentation available about setting up a new project.

Configuring the build

If you know FAKE or even MSBuild, this will look pretty familiar to you. Let's have a quick look at the first simple example from the getting started guide:

var target = Argument("target", "Default");

Task("Default")
  .Does(() =>
        {
          Information("Hello World!");
        });

RunTarget(target);

The first line retrieves the build target to execute from the command line arguments. Starting at line 3, we see the definition of a build target. This target just prints "Hello World!" as an information message.

The last line starts the initial target by its name.

A more concrete code sample is the build script from the CAKE example (I removed some lines in this listing to get a shorter example):

#tool nuget:?package=NUnit.ConsoleRunner&version=3.4.0

var target = Argument("target", "Default");
var configuration = Argument("configuration", "Release");

// Define the build directory.
var buildDir = Directory("./src/Example/bin") + Directory(configuration);

Task("Clean")
  .Does(() =>
        {
          CleanDirectory(buildDir);
        });

Task("Restore-NuGet-Packages")
  .IsDependentOn("Clean")
  .Does(() =>
        {
          NuGetRestore("./src/Example.sln");
        });

Task("Build")
  .IsDependentOn("Restore-NuGet-Packages")
  .Does(() =>
        {
          MSBuild("./src/Example.sln", settings =>
                  settings.SetConfiguration(configuration));
        });

Task("Run-Unit-Tests")
  .IsDependentOn("Build")
  .Does(() =>
        {
          NUnit3("./src/**/bin/" + configuration + "/*.Tests.dll", new NUnit3Settings {
            NoResults = true
          });
        });

Task("Default")
  .IsDependentOn("Run-Unit-Tests");

RunTarget(target);

This script references another NuGet package to run the NUnit 3 tests. Being able to declare the NuGet dependency at the beginning of the script is a nice feature.

This build script contains five targets. The IsDependentOn("...") method wires the targets together in the right execution order. This approach is a bit different from FAKE's and maybe a little confusing, because it requires you to write the targets in their execution order. If you don't write the script like this, you need to find the initial target and follow the chain back to the very first target - you read the execution order from the last target to the first.

FAKE does this a little easier and wires the targets up in a single statement at the end of the file:

// Dependencies
"Clean"
  ==> "Restore-NuGet-Packages"
  ==> "Build"
  ==> "Run-Unit-Tests"
  ==> "Default"
 
RunTargetOrDefault "Default"

In CAKE, this could possibly look like the following dummy code:

// Dependencies
WireUp("Clean")
  .Calls("Restore-NuGet-Packages")
  .Calls("Build")
  .Calls("Run-Unit-Tests")
  .Calls("Default");

RunTarget(target);

Running the build

To run the build, just call .\build.ps1 in a PowerShell console:

If you know FAKE, the results look pretty familiar:

Conclusion

Anyway, I think CAKE will be accepted much faster by the fellow developers at the YooApps than FAKE was. Some things work a little more easily in CAKE than in FAKE and some a little differently, but most things work the same way. So it seems to make sense to switch to CAKE at the YooApps. Let's use it. :)

I'm sure I will write a comparison of FAKE and CAKE once I have used it for a few months.

Please use a password manager

17.04.2017 21:00:00 | Jürgen Gutsch

For years I have been using a password manager to store and manage the credentials for all my accounts. Using such a tool is second nature to me, and my normal flow is to create a new entry in the password manager before I even create the actual account.

I'm always amazed when I meet people who don't use any tool to store and manage their credentials. They use the same password, or the same small set of passwords, everywhere. This is dangerous, and most of them already know it. Maybe they are too lazy to spend an hour or two thinking about the benefits and searching for the right tool, or they simply don't know such tools exist.

For me the key benefits are pretty clear:

  • I don't need to remember all the passwords.
  • I don't need to think up secure passwords when creating new ones
  • I just need to remember one single passphrase

Longer passwords are more secure than short ones, even if the short ones include special characters, numbers and upper-case characters. But longer passwords are pretty hard to remember. This is why Edward Snowden proposed using passphrases instead of passwords. Passphrases like this are easy to remember. But would it really make sense to create a different passphrase for every credential you use? I don't think so.

I created a single passphrase for the password manager itself and let the password manager generate the long passwords for me. Most of the generated passwords look like this:

I4:rs/pQO8Ir/tSk*`1_KSG"~to-xMI/Gf;bNP7Qi3s8RuqIzl0r=-JL727!cgwq
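Just to illustrate what such a generator does internally, here is a simplified sketch (not KeePass's actual algorithm - among other simplifications, it doesn't compensate for modulo bias):

```csharp
using System;
using System.Security.Cryptography;
using System.Text;

public static class PasswordGenerator
{
    // A character set similar to what password managers offer:
    // lower case, upper case, digits and special characters.
    private const string Alphabet =
        "abcdefghijklmnopqrstuvwxyz" +
        "ABCDEFGHIJKLMNOPQRSTUVWXYZ" +
        "0123456789" +
        "!\"#$%&'()*+,-./:;<=>?@[]^_`{|}~";

    public static string Generate(int length)
    {
        // Use a cryptographic RNG, not System.Random.
        var bytes = new byte[length];
        using (var rng = RandomNumberGenerator.Create())
        {
            rng.GetBytes(bytes);
        }

        var result = new StringBuilder(length);
        foreach (var b in bytes)
        {
            // Note: the modulo introduces a slight bias; a real
            // password manager compensates for this.
            result.Append(Alphabet[b % Alphabet.Length]);
        }
        return result.ToString();
    }

    public static void Main()
    {
        Console.WriteLine(Generate(64));
    }
}
```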

The tool I use is KeePass 2, which is available on almost all devices and operating systems I use: as a small desktop app on Windows, Mac and Linux, and as an app on Windows Phone and Android. (Probably on iOS too, but I don't use such a device.) KeePass stores the passwords in an encrypted file. This file can be stored on a USB drive or in a shared folder in the cloud. I use the cloud to sync the file to every device. The cloud storage itself is secured with a password I don't know, which is stored in that KeePass file.

What the F***, you really put the key that locks your house into a safe that is inside your house? Yes!

The KeePass file is synced to all devices and accessible offline. If the file changes, it gets synced to all devices, and even without a connection I'm still able to access all my credentials.

With most password managers you can copy the username and password into login forms using shortcuts. Some of them can even fill in the credentials automatically. KeePass uses the clipboard, but clears the values from the clipboard after a couple of seconds, so no one else can access or reuse the credentials from the clipboard.

There are many more password managers out there, but KeePass - with its encrypted file - works best for me. I'm responsible for storing the file in a safe location, and I can do that wherever I want.

Summary

Use a password manager! Spend the time to try the tools and choose the one that fits you best. It is important! Once you use a password manager, you'll never want to work without one. It will make your life easier and hopefully more secure.

Blogging with Git Flow

02.04.2017 21:00:00 | Jürgen Gutsch

For a while now we have been using Git Flow in our projects at the YooApps. Git Flow is an add-on for Git that helps us follow the feature-branch process in a standardized way. We usually use our Jira ticket numbers as names for features, bugs or hotfixes, and the Jira version as the release name. This makes absolute sense to us and makes the flow pretty transparent. We also use Bitbucket to host our Git repositories, which is directly linked to Jira. That means every pull request has a direct link to the Jira ticket.

Using Git Flow to release new posts

My idea is to also use Git Flow to release new blog posts.

My blog is generated by Pretzel, which is a Jekyll clone written in C#. I host it on an Azure Web Site. I write my posts in GitHub style Markdown files and push them to GitHub. Every time I push a new post, the Azure Web Site starts a new build and runs Pretzel to generate the static blog.

I already wrote some blog posts about using Pretzel.

Dear hackers and bots: Because it is a static web site, it doesn't make sense to search for WordPress login pages or WordPress APIs on my blog ;)

For almost one and a half years now, I have used Git on GitHub to publish my posts, but so far I have only used the master branch and I don't commit the drafts. I sometimes write two or three posts in parallel, and I have around ten drafts in Pretzel's _drafts folder anyway. This feels a bit messy, and I would probably lose my drafted posts if the machine crashed.

Using Git Flow, I don't need the _drafts folder anymore: every unfinished feature branch is a draft. When I finish a post, I just finish the feature branch. When I want to publish the posts, I start a new release.

Maybe the release step is a bit much. But I currently need to push explicitly to publish a new post anyway, so it shouldn't be a big deal to also create a release that merges all finished posts from develop to master. If it doesn't fit, it would be easy to switch to simple feature branches.

Let's see how it works :-)

This blog post feature branch is created by using the following commands in the console:

git flow feature start blogging-with-git-flow

Now I'm in draft mode and can create the blog post file: writing, adding images, linking to existing posts and so on. Once that is done, I make a first commit and publish the feature branch to GitHub:

git add _posts/*
git add img/*
git commit -m "new post about blogging with pretzel"
git flow feature publish

In this state I can change and commit as much as I want. When the blog post is done, I finish the feature and push the current develop branch to GitHub:

git flow feature finish
git push

The last step is publishing the posts. I'm not entirely sure yet, but I could probably use the number of posts as the minor version number, which will also be used as the tag for the release:

git flow release start 1.48.0
git flow release finish
git push --all
git push --tags

(I should probably create a small script to execute these four lines when releasing new posts.)
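Such a helper could look like this shell function (a sketch: the function name is made up, and git flow needs to be installed):

```shell
# Hypothetical helper that bundles the four release steps into one command.
# Usage: release_posts 1.49.0
release_posts() {
  version="$1"
  if [ -z "$version" ]; then
    echo "usage: release_posts <version>" >&2
    return 1
  fi
  git flow release start "$version" &&
  git flow release finish "$version" &&
  git push --all &&
  git push --tags
}
```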

After that push, Pretzel will start "baking" the blog including the new blog post.

If I want to see the current drafts, I just need to display the existing branches:

I'm sure this will be a clean way to publish posts and to handle the drafts and finished ones.

Versioning

While publishing the latest post this way, I realized that GitHub actually displays a release in the repository, which is quite nice. It is not really needed, but it's a fun fact and a reason to think a little more about version numbers when I release a new article.

My idea is to change the major version when I make a big change to the blog's layout or add a new feature. Because the layout is still the first version and I have only made some really small changes, I'll keep it at version 1. For the minor version I will use the number of published articles.

This doesn't really make sense from the semver perspective, but blogging should also be fun - and this is fun to me.

This means, with the latest post I published release 1.47.0 and this post will be in release 1.48.0 ;-)

Authoring

In the last post about writing blog posts with Pretzel, I wrote that I use MarkdownPad 2 to write my posts. Unfortunately, it seems this editor is no longer maintained; no new version has been published for a while. I paid for it to get some extended features. Anyway, a few months ago it started crashing every time I opened it, and there was no way to get an updated or fixed version. (I'm pretty sure it was related to the installation of CefSharp.) This is why I now use Typora, which is pretty minimalistic and lightweight. It works completely differently, but in a pretty cool way: it doesn't use a split screen to edit and preview the content. With Typora you write Markdown code directly, and it immediately renders the preview in the same editor view. It also supports custom CSS and different Markdown styles. It is real WYSIWYG.

Any thoughts about it? Feel free to drop a comment and let me know :-)

Using xUnit, MSTest or NUnit to test .NET Core libraries

30.03.2017 21:00:00 | Jürgen Gutsch

MSTest was just announced to be open-sourced and was already moved to .NET Core some months ago. So it seems to make sense to write another blog post about unit testing .NET Core applications and .NET Standard libraries using the .NET Core tools.

In this post I'm going to use the dotnet CLI and Visual Studio Code completely. Feel free to use Visual Studio 2017 instead, if you want to and if you don't like to use the console. Visual Studio 2017 is using the same dotnet CLI and almost the same commands in the background.

Setup the system under test

Our SUT is a pretty complex class that helps us a lot with some basic math operations. This class will be part of a super awesome library:

namespace SuperAwesomeLibrary
{
  public class MathTools
  {
    public decimal Add(decimal a, decimal b) =>  a + b;
    public decimal Substr(decimal a, decimal b) => a - b;
    public decimal Multiply(decimal a, decimal b) => a * b;
    public decimal Divide(decimal a, decimal b) => a / b;
  }
}

I'm going to add this class to the "SuperAwesomeLibrary" project inside a solution I created like this:

mkdir unit-tests & cd unit-tests
dotnet new sln -n SuperAwesomeLibrary
mkdir SuperAwesomeLibrary & cd SuperAwesomeLibrary
dotnet new classlib
cd ..
dotnet sln add SuperAwesomeLibrary\SuperAwesomeLibrary.csproj

The cool thing about the dotnet CLI is that you are really able to create Visual Studio solutions (second line). This wasn't possible with previous versions. The result is a Visual Studio and MSBuild compatible solution, and you can use and build it like any other solution in your continuous integration environment. The fourth line creates a new library, which is added to the solution in the last line.

After this is done, the following commands will complete the setup, by restoring the NuGet packages and building the solution:

dotnet restore
dotnet build

Adding xUnit tests

The dotnet CLI directly supports creating xUnit test projects:

mkdir SuperAwesomeLibrary.Xunit & cd SuperAwesomeLibrary.Xunit
dotnet new xunit
dotnet add reference ..\SuperAwesomeLibrary\SuperAwesomeLibrary.csproj
cd ..
dotnet sln add SuperAwesomeLibrary.Xunit\SuperAwesomeLibrary.Xunit.csproj

These commands create the new xUnit test project, add a reference to the SuperAwesomeLibrary and add the test project to the solution.

Once this was done, I created the xUnit tests for our MathTools using VS Code:

public class MathToolsTests
{
  [Fact]
  public void AddTest() 
  {
    var sut = new MathTools();
    var result = sut.Add(1M, 2M);
    Assert.True(3M == result);
  }
  [Fact]
  public void SubstrTest() 
  {
    var sut = new MathTools();
    var result = sut.Substr(2M, 1M);
    Assert.True(1M == result);
  }
  [Fact]
  public void MultiplyTest() 
  {
    var sut = new MathTools();
    var result = sut.Multiply(2M, 1M);
    Assert.True(2M == result);
  }
  [Fact]
  public void DivideTest() 
  {
    var sut = new MathTools();
    var result = sut.Divide(2M, 2M);
    Assert.True(1M == result);
  }
}

This should work and you need to call your very best dotnet CLI friends again:

dotnet restore
dotnet build

The cool thing about these commands is that they work at the solution level: they restore the packages for the whole solution and build all the projects in it. You don't need to go through all of your projects separately.

But if you want to run your tests, you need to point at the test project directly if you are not in the project folder:

dotnet test SuperAwesomeLibrary.Xunit\SuperAwesomeLibrary.Xunit.csproj

If you are in the test project folder just call dotnet test without the project file.

This command will run all your unit tests in your project.

Adding MSTest tests

Adding a test library for MSTest works the same way:

mkdir SuperAwesomeLibrary.MsTest & cd SuperAwesomeLibrary.MsTest
dotnet new mstest
dotnet add reference ..\SuperAwesomeLibrary\SuperAwesomeLibrary.csproj
cd ..
dotnet sln add SuperAwesomeLibrary.MsTest\SuperAwesomeLibrary.MsTest.csproj

Even the test class looks almost the same:

[TestClass]
public class MathToolsTests
{
  [TestMethod]
  public void AddTest()
  {
    var sut = new MathTools();
    var result = sut.Add(1M, 2M);
    Assert.IsTrue(3M == result);
  }
  [TestMethod]
  public void SubstrTest()
  {
    var sut = new MathTools();
    var result = sut.Substr(2M, 1M);
    Assert.IsTrue(1M == result);
  }
  [TestMethod]
  public void MultiplyTest()
  {
    var sut = new MathTools();
    var result = sut.Multiply(2M, 1M);
    Assert.IsTrue(2M == result);
  }
  [TestMethod]
  public void DivideTest()
  {
    var sut = new MathTools();
    var result = sut.Divide(2M, 2M);
    Assert.IsTrue(1M == result);
  }
}

And again our favorite commands:

dotnet restore
dotnet build

The dotnet restore command will fail in offline mode, because MSTest - unlike xUnit - is not delivered with the runtime and the default NuGet packages. This means it needs to fetch the latest packages from NuGet.org. Kind of weird, isn't it?

The last task to do, is to run the unit tests:

dotnet test SuperAwesomeLibrary.MsTest\SuperAwesomeLibrary.MsTest.csproj

This doesn't really look hard.

What about Nunit?

Unfortunately there is no default template for NUnit test projects. I really like NUnit and have used it for years. It is still possible to use NUnit with .NET Core, but you need to do some things manually. The first steps are pretty similar to the other examples, except that we create a console application and add the NUnit dependencies manually:

mkdir SuperAwesomeLibrary.Nunit & cd SuperAwesomeLibrary.Nunit
dotnet new console
dotnet add package Nunit
dotnet add package NUnitLite
dotnet add reference ..\SuperAwesomeLibrary\SuperAwesomeLibrary.csproj
cd ..

The reason we need to create a console application is that there is no Visual Studio test runner for NUnit on .NET Core yet. This also means dotnet test will not work. The NUnit devs are working on it, but it seems to need some more time. In the meantime, we can use NUnitLite to create a self-executing test library.

We need to use NUnitLite and change the static void Main to a static int Main:

using System.Reflection;
using NUnitLite;

// ...

static int Main(string[] args)
{
  var typeInfo = typeof(Program).GetTypeInfo();
  return new AutoRun(typeInfo.Assembly).Execute(args);
}

These lines automatically execute all test classes in the current assembly. They also pass the command line arguments on to NUnitLite, e.g. to set up the output log file.

Let's add a NUnit test class:

[TestFixture]
public class MathToolsTests
{
  [Test]
  public void AddTest() 
  {
    var sut = new MathTools();
    var result = sut.Add(1M, 2M);
    Assert.That(result, Is.EqualTo(3M));
  }
  [Test]
  public void SubstrTest() 
  {
    var sut = new MathTools();
    var result = sut.Substr(2M, 1M);
    Assert.That(result, Is.EqualTo(1M));
  }
  [Test]
  public void MultiplyTest() 
  {
    var sut = new MathTools();
    var result = sut.Multiply(2M, 1M);
    Assert.That(result, Is.EqualTo(2M));
  }
  [Test]
  public void DivideTest() 
  {
    var sut = new MathTools();
    var result = sut.Divide(2M, 2M);
    Assert.That(result, Is.EqualTo(1M));
  }
}

Finally we need to run the tests. But this time we cannot use dotnet test.

dotnet restore
dotnet build
dotnet run -p SuperAwesomeLibrary.Nunit\SuperAwesomeLibrary.Nunit.csproj

Because it is a console application, we need to use dotnet run to execute the app and the NUnitLite test runner.

What about mocking?

Creating mocking frameworks for .NET Standard is currently a bit difficult, because they need a lot of reflection, which is not completely implemented in .NET Core or even .NET Standard.

My favorite tool, Moq, is nevertheless available for .NET Standard 1.3, which means it should work here. Let's see how it works.

Let's assume we have a PersonService in the SuperAwesomeLibrary that uses an IPersonRepository to fetch persons from a data store:

using System;
using System.Collections.Generic;

public class PersonService
{
  private readonly IPersonRepository _personRepository;

  public PersonService(IPersonRepository personRepository)
  {
    _personRepository = personRepository;
  }

  public IEnumerable<Person> GetAllPersons()
  {
    return _personRepository.FetchAllPersons();
  }
}

public interface IPersonRepository
{
  IEnumerable<Person> FetchAllPersons();
}

public class Person
{
  public string Firstname { get; set; }
  public string Lastname { get; set; }
  public DateTime DateOfBirth { get; set; }
}

After building the library, I move to the NUnit test project to add Moq and GenFu.

cd SuperAwesomeLibrary.Nunit
dotnet add package moq
dotnet add package genfu
dotnet restore

GenFu is a really great library for creating test data for unit tests or demos. I really like it and use it a lot.

Now we need to write the actual test using these tools. The test doesn't really make sense, but it shows how Moq works:

using System;
using System.Linq;
using NUnit.Framework;
using SuperAwesomeLibrary;
using GenFu;
using Moq;

namespace SuperAwesomeLibrary.Nunit
{
  [TestFixture]
  public class PersonServiceTest
  {
    [Test]
    public void GetAllPersons() 
    {
      var persons = A.ListOf<Person>(10); // generating test data using GenFu
      
      var repo = new Mock<IPersonRepository>();
      repo.Setup(x => x.FetchAllPersons()).Returns(persons);

      var sut = new PersonService(repo.Object);
      var actual = sut.GetAllPersons();

      Assert.That(actual.Count(), Is.EqualTo(10));
    }
  }
}

As you can see, Moq works the same way in .NET Core as in the full .NET Framework.

Now let's start the NUnit tests again:

dotnet build
dotnet run

Et voilà:

Conclusion

Running unit tests on .NET Core isn't really a big deal, and it is a good thing that it works with different unit testing frameworks: you can choose your favorite tools. It would be nice to have the same choice among mocking frameworks.

In one of the next posts I'll write about unit testing an ASP.NET Core application, which includes testing middlewares, controllers, filters and view components.

You can play around with the code samples on GitHub: https://github.com/juergengutsch/dotnetcore-unit-test-samples/

How we moved to Visual Studio 2017

28.03.2017 03:45:00 |


Visual Studio 2017 has arrived, and because of .NET Core and other goodies we wanted to switch quickly to the newest release with our product OneOffixx.

Company & Product Environment

In our solution we use some VC++ projects (just for Office Development & building a .NET shim), Windows Installer XML & many C# projects for desktop or ASP.NET stuff.

Our builds are scripted with CAKE (see here for some more blog posts about [CAKE](https://blog.codeinside.eu/2017/02/13/create-nuget-packages-with-cake/)), and we use the TFS vNext build to orchestrate everything.

Step 1: Update the Development Workstations

The first step was to update my local dev environment and install Visual Studio 2017.

After the installation I started VS and opened our solution. Because we have some WiX projects, we needed the most recent WiX 3.11 toolset & the VS 2017 extension.

Step 2: VC++ update

We wanted a clean VS 2017 environment, so we decided to use the most recent VC++ 2017 runtime for our VC++ projects.

Step 3: Project update

In the past we had some issues with MSBuild picking the wrong MSBuild version. This step may not be needed, but we pointed all .csproj files to the newest MSBuild ToolsVersion 15.0.

Step 4: CAKE update

The last step was to update the CAKE.exe (which is controlled by us and not automatically downloaded via a build script) to 0.18.

Step 5: Minor build script changes

We needed to adjust some paths (e.g. to the Windows SDK for signtool.exe) and ensure that we are using the most recent MSBuild.exe.

Step 6: Create a new Build-Agent

We decided to create a new TFS build agent: we did the usual build agent installation, imported the code-signing certificate and performed some magic because of some C++/COM issues (don’t ask… COM sucks).

Recap

Besides the C++/COM/magic issue (see above) the migration was pretty easy and now our team works with Visual Studio 2017.

Visual Studio 2017 Launch bei der .NET User Group Koblenz

22.03.2017 14:39:39 | Andre Kraemer

Update 23.03.2017: The meeting takes place on March 23, 2017, not on March 24 as originally written.

On March 7, 2017, Microsoft released the new 2017 version right on time for Visual Studio's 20th birthday.

To mark the occasion, the .NET User Group Koblenz will hold a meeting on March 23, 2017 at 7:00 p.m., at which Eric Berres and I will present the most important new features that help developers become even more productive.

Among other things, we will cover the following:

  • Visual Studio 2017 Installer / Workloads
  • Faster startup of Visual Studio 2017 thanks to Lightweight Solution Load
  • New refactorings in Visual Studio
  • ASP.NET Core und Docker
  • Live Unit Testing in Visual Studio 2017
  • Xamarin in Visual Studio 2017

Visual Studio 2017 Launch Event T-Shirts

Incidentally, there is also a small gift from Microsoft for all Microsoft MSDN subscribers: a Visual Studio 2017 birthday t-shirt. If we don't have enough shirts for all attendees, they will be handed out on a first-come, first-served basis.

Further details can be found on the website of the .NET User Group Koblenz.

Talk at Embedded meets Agile in Munich

17.03.2017 14:23:39 | Sebastian Gerling

I have been invited to speak at Embedded meets Agile on April 26 in Munich about "Design Thinking vs. the Business Model Canvas". The goal is to show where the differences lie, what needs particular attention in the embedded space, and where a combination of a fuzzy and a structured approach can add value. More […]

Interview on App Design and Storyboarding

16.03.2017 09:53:00 | Jörg Neumann

As part of the MobileTechCon, S&S interviewed me on app design and storyboarding. The full interview is available here.

Material from MTC 2017

16.03.2017 09:47:00 | Jörg Neumann

Integrate Meetup events on your website

12.03.2017 19:00:00 | Jürgen Gutsch

This is a guest post, written by Olivier Giss, about integrating Meetup events on your website. Olivier works as a web developer at algacom AG in Basel and is also one of the leads of the .NET User Group Nordwest-Schweiz.


For two years I have been leading the .NET User Group Nordwest-Schweiz together with Jürgen Gutsch, who owns this nice blog. After a year, we decided to also use Meetup to get more participants.

Understanding the problem

But with each platform we post our events on, the workload to keep everything up to date increases. Jürgen had the great idea of reading the events from Meetup and listing them on our own website to reduce that work.

This is exactly what I want to show you.

The Meetup API

Before we start coding, we should understand how the Meetup API works and what it offers. The API is well documented and exposes a lot of data. What we want is to get the list of upcoming events for our Meetup group and display it on our website without being authenticated on meetup.com.

For our goal, we need the following meetup API method:

GET https://api.meetup.com/:urlname/events

The ":urlname" parameter is the Meetup group name. Via additional query parameters we could sort, filter, and control paging, which we don't need here. If we executed that request as it is, we would get an authorization error.

However, we don't want the user to have to be authenticated to see the events. To get this working, we need to use a JSONP request.

Let's get it done

The simplest way to do a JSONP request is using jQuery:

$.ajax({
  url: "https://api.meetup.com/Basel-NET-User-Group/events",
  jsonp: "callback",
  dataType: "jsonp",
  data: {
    format: "json"
  },
  success: function(response) {
    var events = response.data;
  }
});

Be aware: JSONP has security implications. Since JSONP is really just JavaScript, it can do everything that is possible in that context. You need to trust the provider of the JSONP data!
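To make the mechanism less magical, what jQuery does under the hood can be sketched in a few lines of plain JavaScript. This is an illustrative sketch only (the names `buildJsonpUrl` and `jsonp` are made up here, not part of any library): it appends a `callback` query parameter naming a global function and injects a `<script>` tag, so the server's response executes as JavaScript and calls that function with the data.

```javascript
// Build the JSONP URL: regular query parameters plus a "callback" parameter
// naming the global function the server-side response will invoke.
function buildJsonpUrl(baseUrl, callbackName, params) {
  var query = Object.keys(params || {}).map(function (key) {
    return encodeURIComponent(key) + "=" + encodeURIComponent(params[key]);
  });
  query.push("callback=" + encodeURIComponent(callbackName));
  return baseUrl + "?" + query.join("&");
}

// Perform the JSONP request by injecting a <script> tag. The injected script
// is executed by the browser, which is exactly why the trust warning above matters.
function jsonp(url, params, callback) {
  var name = "jsonp_cb_" + Date.now();
  var script = document.createElement("script");
  window[name] = function (response) {
    delete window[name]; // clean up the temporary global
    script.parentNode.removeChild(script);
    callback(response);
  };
  script.src = buildJsonpUrl(url, name, params);
  document.head.appendChild(script);
}
```

Calling `jsonp("https://api.meetup.com/Basel-NET-User-Group/events", { format: "json" }, handler)` then behaves much like the `$.ajax` call above.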

After that call we get the data from the Meetup API, which can be displayed on our website with simple data binding. You can use any kind of MV* JS framework for that; I used AngularJS.

<div class="row" ng-repeat="model in vm.Events track by model.Id" ng-cloak>
  <a ng-href="{{model.Link}}" target="_blank" title="Öffnen auf meetup.com"><h3>{{model.DisplayName}}</h3></a>
  <label>Datum und Uhrzeit</label>
  <p>{{model.Time}}</p>
  <label>Description</label>
  <div ng-bind-html="model.Description"></div>
  <label>Ort</label>
  <p>{{model.Location}}</p>
</div>

As you can see, everything is one-way bound because the data never changes. "ng-bind-html" binds the HTML content from the Meetup event description.

The Angular controller is simple: it uses the "$sce" service to mark the HTML content provided by the Meetup API as trusted. Because we change the model outside of Angular (in the jQuery success callback), we must notify Angular of our changes with "vm.scope.$apply()".

(function () {
  var module = angular.module('app', []);

  module.controller('MeetupEventsController', ['$scope', '$sce', MeetupEventsController]);

  MeetupEventsController.$inject = ['$scope', '$sce'];

  function MeetupEventsController($scope, $sce) {

    var vm = this;
    vm.Events = [];
    vm.scope = $scope;
    vm.loaded = false;

    vm.Refresh = function() {
      $.ajax({
        url: "https://api.meetup.com/Basel-NET-User-Group/events",
        jsonp: "callback",
        dataType: "jsonp",
        data: {
          format: "json"
        },
        success: function(response) {
          var events = response.data;

          for (var i = 0; i < events.length; i++) {
            var item = events[i];

            var eventItem = {
              Id: i,
              DisplayName: item.name,
              Description: $sce.trustAsHtml(item.description),
              Location: item.venue.name + " " + item.venue.address_1 + " " + item.venue.city,
              Time: new Date(item.time).toLocaleString(),
              Link: item.link
            };
            vm.Events.push(eventItem);
          }
          vm.loaded = true;
          vm.scope.$apply();
        }
      });
    };
    function activate() {
      vm.Refresh();
    };
    activate();
  };
})();
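One robustness note on the loop above: it assumes every event carries a `venue`, but Meetup events can be created without one, in which case `item.venue.name` would throw. A small, pure mapping helper (the name `toEventItem` is hypothetical, not from the original code) guards against that and is easy to unit test in isolation:

```javascript
// Maps a raw Meetup API event to the view model used in the template above.
// "venue" may be missing on the raw event, so fall back to an empty object.
function toEventItem(item, id) {
  var venue = item.venue || {};
  return {
    Id: id,
    DisplayName: item.name,
    // Note: in the controller the description is additionally passed
    // through $sce.trustAsHtml before binding it with ng-bind-html.
    Description: item.description,
    Location: [venue.name, venue.address_1, venue.city]
      .filter(Boolean) // drop missing venue fields
      .join(" "),
    Time: new Date(item.time).toLocaleString(),
    Link: item.link
  };
}
```

With a helper like this, the loop body in the success callback shrinks to a single `vm.Events.push(...)` call per event.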

Finally, we are finished. Not that complicated, right? Feel free to ask questions or share your experience.


Just visit the website of the .NET User Group Nordwest-Schweiz to see the Meetup integration in action.
