
.NET Developer Blogs

Visual Studio 2017 Launch at the .NET User Group Koblenz

22.03.2017 14:39:39 | Andre Kraemer

Update 23.03.2017: The meeting is on 23.03.2017 and not, as originally written, on 24.03.

On March 7, 2017, right on time for Visual Studio's 20th birthday, Microsoft released the new version 2017.

To mark the occasion, the .NET User Group Koblenz will hold a meeting on March 23, 2017 at 7:00 pm, where Eric Berres and I will present the most important new features that make developers even more productive.

Among other things, we will cover the following:

  • Visual Studio 2017 Installer / Workloads
  • Faster startup of Visual Studio 2017 thanks to Lightweight Solution Load
  • New refactorings in Visual Studio
  • ASP.NET Core und Docker
  • Live Unit Testing in Visual Studio 2017
  • Xamarin in Visual Studio 2017

Visual Studio 2017 Launch Event T-Shirts

By the way, there is also a small gift from Microsoft for all MSDN subscribers: a Visual Studio 2017 birthday T-shirt. Should we not have enough shirts for all attendees, they will be handed out on a first-come, first-served basis.

Further details are available on the website of the .NET User Group Koblenz.

Talk at Embedded meets Agile in Munich

17.03.2017 14:23:39 | Sebastian Gerling

I have been invited to speak at Embedded meets Agile on 26.04 in Munich about "Design Thinking vs Business Canvas Model". The goal is to show where the differences lie, what needs particular attention in an embedded context, and where a combination of a fuzzy and a structured approach can add value. More […]

Interview on App Design and Storyboarding

16.03.2017 09:53:00 | Jörg Neumann

As part of the MobileTechCon, S&S interviewed me about app design and storyboarding. The full interview is available here.

Material from the MTC 2017

16.03.2017 09:47:00 | Jörg Neumann

Integrate Meetup events on your website

12.03.2017 19:00:00 | Jürgen Gutsch

This is a guest post, written by Olivier Giss, about integrating Meetup events on your website. Olivier works as a web developer at algacom AG in Basel and is also one of the leads of the .NET User Group Nordwest-Schweiz.


For two years I have been leading the .NET User Group Nordwest-Schweiz together with Jürgen Gutsch, who owns this nice blog. After a year, we also decided to use Meetup to get more participants.

Understanding the problem

But with each additional platform where we post our events, the workload to keep everything up to date increased. Jürgen had the great idea of reading the Meetup events and listing them on our own website to reduce that work.

This is exactly what I want to show you.

The Meetup API

Before we start coding, we should understand how the Meetup API works and what it offers. The API is well documented and exposes a lot of data. What we want is to get the list of upcoming events for our Meetup group and display it on the website without being authenticated on meetup.com.

For our goal, we need the following Meetup API method:

GET https://api.meetup.com/:urlname/events

The parameter ":urlname" is the Meetup group name. In the request we could also sort, filter and control paging, which we don't need here. If we executed that query as-is, we would get an authorization error.

However, we don't want the user to have to be authenticated to get the events. To make it work, we need to use a JSONP request.

Let's get it done

The simplest way to do a JSONP request is using jQuery:

$.ajax({
  url: "https://api.meetup.com/Basel-NET-User-Group/events",
  jsonp: "callback",
  dataType: "jsonp",
  data: {
    format: "json"
  },
  success: function(response) {
    var events = response.data;
  }
});

Be aware: JSONP has some security implications. As JSONP is really JavaScript, it can do everything that is possible in the context. You need to trust the provider of the JSONP data!

After that call, we get the data from the Meetup API, which can be displayed on our website with simple data binding. You can use any kind of MV* JS framework for that. I used AngularJS.

<div class="row" ng-repeat="model in vm.Events track by model.Id" ng-cloak>
  <ahttp://feedproxy.google.com href="" target="_blank" title="Öffnen auf meetup.com"><h3></h3></a>
  <label>Datum und Uhrzeit</label>
  <p></p>
  <label>Description</label>
  <div ng-bind-html="model.Description"></div>
  <label>Ort</label>
  <p></p>
</div>

As you can see, everything is one-way bound because the data never changes. The "ng-bind-html" directive binds the HTML content of the Meetup event description.

The Angular controller is simple: it uses the "$sce" service to mark the HTML content provided by the Meetup API as trusted. Because we change the model outside of Angular, we have to notify Angular of our changes with "vm.scope.$apply()".

(function () {
  var module = angular.module('app', []);

  module.controller('MeetupEventsController', ['$scope', '$sce', MeetupEventsController]);

  MeetupEventsController.$inject = ['$scope', '$sce'];

  function MeetupEventsController($scope, $sce) {

    var vm = this;
    vm.Events = [];
    vm.scope = $scope;
    vm.loaded = false;

    vm.Refresh = function() {
      $.ajax({
        url: "https://api.meetup.com/Basel-NET-User-Group/events",
        jsonp: "callback",
        dataType: "jsonp",
        data: {
          format: "json"
        },
        success: function(response) {
          var events = response.data;

          for (var i = 0; i < events.length; i++) {
            var item = events[i];

            var eventItem = {
              Id: i,
              DisplayName: item.name,
              Description: $sce.trustAsHtml(item.description),
              Location: item.venue.name + " " + item.venue.address_1 + " " + item.venue.city,
              Time: new Date(item.time).toLocaleString(),
              Link: item.link
            };
            vm.Events.push(eventItem);
          }
          vm.loaded = true;
          vm.scope.$apply();
        }
      });
    };
    function activate() {
      vm.Refresh();
    };
    activate();
  };
})();

Finally, we are finished. Not that complicated, right? Feel free to ask questions or share your experience.


Just visit the website of the .NET User Group Nordwest-Schweiz to see the Meetup integration in action.

Improving the User Experience with Artificial Intelligence

07.03.2017 22:00:19 | Kazim Bahar

… that is the title of my first blog post on AISOMA Analytics: AISOMA Analytics Blog

Using dependency injection in multiple .NET Core projects

05.03.2017 18:00:00 | Jürgen Gutsch

One of my last posts was about Dependency Injection (DI) in .NET Core console applications. Some days after that post was published, I got a question about how to use the IServiceCollection in multiple projects. In this post I'm going to explain how to use the IServiceCollection in a solution with more than one project.

Setup

To demonstrate that, I created a solution with two .NET Core console apps and two .NET Standard libraries. One of the console apps uses two of the libraries, the other one uses just one. Each library provides some services which need to be registered with the DI container. The console apps also provide some services to add.

We now have four projects like this:

  • DiDemo.SmallClient
    • .NET Core Console app
    • includes a WriteSimpleDataService
    • references DiDemo.CsvFileConnector
  • DiDemo.BigClient
    • .NET Core Console app
    • includes a WriteExtendedDataService
    • includes a NormalizedDataService
    • references DiDemo.SqlDatabaseConnector
    • references DiDemo.CsvFileConnector
  • DiDemo.SqlDatabaseConnector
    • .NET Standard library
    • includes a SqlDataService
    • includes a SqlDataProvider used by the service
  • DiDemo.CsvFileConnector
    • .NET Standard library
    • includes a CsvDataService

BTW: Since one of the latest updates, the "Class Library (.NET Standard)" project template disappeared from the ".NET Core" node in the "Add New Project" dialog and the "Class Library (.NET Core)" is back again. The "Class Library (.NET Standard)" template is now in the ".NET Standard" node under the "Visual C#" node.

In most cases it doesn't really make sense to create a .NET Core class library. The difference is that the Class Library (.NET Core) has some .NET Core related references: it targets netcoreapp1.x instead of netstandard1.x. This means it carries a lot of references which are not needed in a class library in most cases, e.g. Libuv and the .NET Core runtime.

The WriteExtendedDataService uses an INormalizedDataService to get the data and writes it to the console. The NormalizedDataService fetches the data from the CsvDataService and from the SqlDataService and normalizes it to make it usable in the WriteExtendedDataService.

The WriteSimpleDataService uses only the ICsvDataService and writes the data out to the console.
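To make the wiring a bit more tangible, here is a minimal sketch of what such a consuming service could look like with constructor injection. The names IWriteSimpleDataService and ICsvDataService come from the project structure above; the interface shapes and the GetData() member are just assumptions for illustration, not the actual code of the demo solution.

using System;
using System.Collections.Generic;

// Simplified, hypothetical interface shapes for illustration only.
public interface ICsvDataService
{
    IEnumerable<string> GetData();
}

public interface IWriteSimpleDataService
{
    void write();
}

public class WriteSimpleDataService : IWriteSimpleDataService
{
    private readonly ICsvDataService _csvDataService;

    // The DI container injects the ICsvDataService that was registered
    // by the CsvFileConnector library.
    public WriteSimpleDataService(ICsvDataService csvDataService)
    {
        _csvDataService = csvDataService;
    }

    public void write()
    {
        // Read the data from the CSV connector and write it to the console.
        foreach (var line in _csvDataService.GetData())
        {
            Console.WriteLine(line);
        }
    }
}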

Setup the DI container

Let's set up the DI container for the SmallClient app. Currently it looks like this:

var services = new ServiceCollection();
services.AddTransient<IWriteSimpleDataService, WriteSimpleDataService>();
services.AddTransient<ICsvDataService, CsvDataService>();

var provider = services.BuildServiceProvider();

var writer = provider.GetService<IWriteSimpleDataService>();
writer.write();

That doesn't really look wrong, but what happens if the app grows and gets a lot more services to add to the DI container? The CsvDataService is not in the app directly, but in a separate library. Usually I don't want to map all the services of an external library myself. I just want to use the library without knowing anything about its internals. This is why we should set up the DI container mappings in the external library as well.

Let's plug things together

The .NET Standard libraries should reference Microsoft.Extensions.DependencyInjection.Abstractions to get the IServiceCollection interface. Now we can create a public static class called IServiceCollectionExtension with an extension method that works on the IServiceCollection:

public static class IServiceCollectionExtension
{
  public static IServiceCollection AddCsvFileConnector(this IServiceCollection services)
  {
    services.AddTransient<ICsvDataService, CsvDataService>();
    return services;
  }
}

Inside this method we do all the mappings from the interfaces to the concrete classes, or any other registrations to the DI container. Let's do the same to encapsulate all the services inside the SmallClient app and keep the Program.cs as small as possible:

public static class IServiceCollectionExtension
{
  public static IServiceCollection AddInternalServices(this IServiceCollection services)
  {
    services.AddTransient<IWriteSimpleDataService, WriteSimpleDataService>();
    return services;
  }
}

We can now use these methods in the Program.cs of the SmallClient app to plug all that stuff together:

var services = new ServiceCollection();
services.AddInternalServices();
services.AddCsvFileConnector();

var provider = services.BuildServiceProvider();

var writer = provider.GetService<IWriteSimpleDataService>();
writer.write();

It looks much cleaner now. Maybe you remember the AddSomething methods? Exactly, this is the same way it is done in ASP.NET Core, e.g. with the services.AddMvc() method.

We now need to do the same thing for the BigClient app and the SqlDatabaseConnector library. First, let's create the mapping for the SqlDatabaseConnector:

public static class IServiceCollectionExtension
{
  public static IServiceCollection AddSqlDatabaseConnector(this IServiceCollection services)
  {
    services.AddTransient<ISqlDataService, SqlDataService>();
    services.AddTransient<ISqlDataProvider, SqlDataProvider>();
    return services;
  }
}

We also need to create an extension method for the internal services:

public static class IServiceCollectionExtension
{
  public static IServiceCollection AddInternalServices(this IServiceCollection services)
  {
    services.AddTransient<IWriteExtendedDataService, WriteExtendedDataService>();
    services.AddTransient<INormalizedDataService, NormalizedDataService>();
    return services;
  }
}

Now let's plug that stuff together in the BigClient App:

var services = new ServiceCollection();
services.AddInternalServices();
services.AddCsvFileConnector();
services.AddSqlDatabaseConnector();

var provider = services.BuildServiceProvider();

var writer = provider.GetService<IWriteExtendedDataService>();
writer.write();

As you can see, the BigClient app uses the already existing services.AddCsvFileConnector() method.

Does it really work?

It does. Start the BigClient app in Visual Studio to see that it will work as expected:

To see the full sources and to try it out by yourself, please visit the GitHub repository: https://github.com/JuergenGutsch/di-core-multi-demo

What do you think? Do you have questions or some ideas to add? Feel free to drop a comment :)

HowTo: Get User Information & Group Memberships from Active Directory via C#

03.03.2017 01:45:00 |

I had to find a way to access all group memberships of a given Active Directory user. The problem is that groups may contain other groups, and I needed a list of "all" applied group memberships - direct or indirect.

The “fastest” solution (without querying each group) is to use the Token-Groups attribute, which already does this magic for us. This list should contain all applied groups.

The code would also allow reading any other AD property, e.g. the UPN or names.

Code

class Program
{
    static void Main(string[] args)
    {
        Console.WriteLine("ListAllGroupsViaTokenGroups:");

        List<string> result = new List<string>();

        try
        {
            result = ListAllGroupsViaTokenGroups("USERNAME", "DOMAIN");

            foreach (var group in result)
            {
                Console.WriteLine(group);
            }
        }
        catch (Exception exc)
        {
            Console.WriteLine(exc.Message);
        }

        Console.Read();
    }

  
    private static List<string> ListAllGroupsViaTokenGroups(string username, string domainName)
    {
        List<string> result = new List<string>();

        using (PrincipalContext domainContext = new PrincipalContext(ContextType.Domain, domainName))
        using (var searcher = new DirectorySearcher(new DirectoryEntry("LDAP://" + domainContext.Name)))
        {
            searcher.Filter = String.Format("(&(objectClass=user)(sAMAccountName={0}))", username);
            SearchResult sr = searcher.FindOne();

            DirectoryEntry user = sr.GetDirectoryEntry();

            // access to other user properties, via user.Properties["..."]

            user.RefreshCache(new string[] { "tokenGroups" });

            for (int i = 0; i < user.Properties["tokenGroups"].Count; i++)
            {
                SecurityIdentifier sid = new SecurityIdentifier((byte[])user.Properties["tokenGroups"][i], 0);
                NTAccount nt = (NTAccount)sid.Translate(typeof(NTAccount));

                result.Add(nt.Translate(typeof(NTAccount)).ToString() + " (" + sid + ")");
            }
        }

        return result;
    }

}
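As mentioned above, the same DirectoryEntry also gives access to other attributes. A small sketch of such a lookup could look like the following; the helper name and the sample attribute (userPrincipalName) are only illustrative and not part of the original code:

    // Hypothetical helper: reads a single attribute for a user,
    // reusing the same DirectorySearcher pattern as ListAllGroupsViaTokenGroups.
    private static string GetUserProperty(string username, string domainName, string propertyName)
    {
        using (PrincipalContext domainContext = new PrincipalContext(ContextType.Domain, domainName))
        using (var searcher = new DirectorySearcher(new DirectoryEntry("LDAP://" + domainContext.Name)))
        {
            searcher.Filter = String.Format("(&(objectClass=user)(sAMAccountName={0}))", username);
            SearchResult sr = searcher.FindOne();

            DirectoryEntry user = sr.GetDirectoryEntry();

            // e.g. propertyName = "userPrincipalName" or "displayName"
            return user.Properties[propertyName].Value as string;
        }
    }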

Hope this will help someone in the future.

Code @ GitHub

ActiveRoute TagHelper

02.03.2017 18:00:00 | Jürgen Gutsch

I recently read the pretty cool blog post by Ben Cull about the IsActiveRoute TagHelper: http://benjii.me/2017/01/is-active-route-tag-helper-asp-net-mvc-core/. This TagHelper adds a CSS class to an element if the specified route or route parts are part of the currently active route. This is pretty useful if you want to highlight the active item in a menu.

Inspired by this idea, I created a different TagHelper, which shows or hides content if the specified route or route parts are part of the current route. This could be useful, e.g., if you don't want to have a link in an active menu item.

From the perspective of the semantic web, it doesn't make sense to link to the current page. That means the menu item that points to the current page should not be a link.

The usage of this TagHelper will look like this:

<ul class="nav navbar-nav">
  <li>
    <a asp-active-route asp-action="Index" asp-controller="Home" asp-hide-if-active="true">
      <span>Home</span>
    </a>
    <span asp-active-route asp-action="Index" asp-controller="Home">Home</span>
  </li>
  <li>
    <a asp-active-route asp-action="About" asp-controller="Home" asp-hide-if-active="true">
      <span>About</span>
    </a>
    <span asp-active-route asp-action="About" asp-controller="Home">About</span>
  </li>
  <li>
    <a asp-active-route asp-action="Contact" asp-controller="Home" asp-hide-if-active="true">
      <span>Contact</span>
    </a>
    <span asp-active-route asp-action="Contact" asp-controller="Home">Contact</span>
  </li>
</ul>

As you can see on the a-tag, multiple TagHelpers can work on a single tag. In this case the built-in AnchorTagHelper and the ActiveRouteTagHelper both manipulate the tag. The a-tag will be hidden if the specified route is active, and the span-tag is shown in that case.

If you now navigate to the About page, the a-tag is removed from that menu item and the span-tag is shown. The HTML output of the menu now looks pretty clean:

<ul class="nav navbar-nav">
  <li>
    <a href="/">
      <span>Home</span>
    </a>
  </li>
  <li>
    <span>About</span>
  </li>
  <li>
    <a href="/Home/Contact">
      <span>Contact</span>
    </a>
  </li>
</ul>

Using this approach for the menu, we don't need Ben Cull's TagHelper to add a special CSS class. The style for the active item can be set by selecting the list item that contains just the span:

.nav.navbar-nav li > a { ... }
.nav.navbar-nav li > a > span { ... }
.nav.navbar-nav li > span { ... } /* this is the active item*/

This CSS is based on the default Bootstrap-based template of a new ASP.NET Core project. If you use another template, just replace the CSS class that identifies the menu with your specific identifier.

That means, to make the active menu item look nice, you may just add CSS like this:

.navbar-nav li > span {
    padding: 15px;
    display: block;
    color: white;
}

This results in the following view:

To get this working, we need to implement the TagHelper. I just created a new class in the project and called it ActiveRouteTagHelper and added the needed properties:

[HtmlTargetElement(Attributes = "asp-active-route")]
public class ActiveRouteTagHelper : TagHelper
{
  [HtmlAttributeName("asp-controller")]
  public string Controller { get; set; }

  [HtmlAttributeName("asp-action")]
  public string Action { get; set; }

  [HtmlAttributeName("asp-hide-if-active")]
  public bool HideIfActive { get; set; }
  
  
}

The class inherits from the TagHelper base class. To use it on any HTML tag, I defined an attribute name which is required on the HTML tag we want to manipulate. I used the name "asp-active-route". The other attributes also get specific names. I could have used the default names without the leading "asp" prefix, but I thought it would make sense to share the Controller and Action properties with the built-in AnchorTagHelper. And to be consistent, I use the prefix in all cases.

Now we need to override the Process method to actually manipulate the specific HTML tag:

public override void Process(TagHelperContext context, TagHelperOutput output)
{
  if (!CanShow())
  {
    output.SuppressOutput();
  }

  var attribute = output.Attributes.First(x => x.Name == "asp-active-route");
  output.Attributes.Remove(attribute);
}

If I cannot show the tag because of the conditions in the CanShow() method, I completely suppress the output. Nothing is generated in that case: neither the contents nor the HTML tag itself.

At the end of the method, I remove the identifying attribute, which is only used to activate this TagHelper, because otherwise it would be kept in the output.

To get the RouteData of the current route, we can't use the TagHelperContext or the TagHelperOutput. We need to inject the ViewContext:

[HtmlAttributeNotBound]
[ViewContext]
public ViewContext ViewContext { get; set; }

Now we are able to access the route data and get the needed information about the current route:

private bool CanShow()
{
  var currentController = ViewContext.RouteData.Values["Controller"].ToString();
  var currentAction = ViewContext.RouteData.Values["Action"].ToString();

  var show = false;
  if (!String.IsNullOrWhiteSpace(Controller) &&
      Controller.Equals(currentController, StringComparison.CurrentCultureIgnoreCase))
  {
    show = true;
  }
  if (show &&
      !String.IsNullOrWhiteSpace(Action) &&
      Action.Equals(currentAction, StringComparison.CurrentCultureIgnoreCase))
  {
    show = true;
  }
  else
  {
    show = false;
  }

  if (HideIfActive)
  {
    show = !show;
  }

  return show;
}

One last step you need to do is to register your own TagHelpers. In Visual Studio, open the _ViewImports.cshtml and add the following line of code:

@addTagHelper *, CoreWebApplication

Where CoreWebApplication is the assembly name of your project. The * means: use all TagHelpers in that library.

Conclusion

I hope this makes sense to you and helps you a little more to get into the TagHelpers.

I always have fun creating a new TagHelper. With only a little code, I'm able to extend the view engine the way I need.

I always focus on semantic HTML if possible, because it makes the web a little more accessible to devices and engines other than the ones we usually use, be it screen readers for blind people or search engines. Maybe I'll do some more posts about accessibility in ASP.NET Core applications.

Xamarin Video Training Published

27.02.2017 10:55:00 | Jörg Neumann

Starting 1.3.2017, my video training "Cross-Plattform-App-Development mit Xamarin" is available at entwickler.tutorials. Until 17.4 there is a 25% discount. Go check it out!


Material from BASTA! Spring

26.02.2017 08:44:00 | Jörg Neumann

Here is the material from my sessions at BASTA! Spring 2017:



The material of my workshop "Cross-Plattform-App-Development mit Xamarin" is available on request.

Tools: NDepend v2017.1 Smart Technical Debt Estimation

25.02.2017 16:36:02 | Steffen Steinbrecher

Every developer probably knows the following everyday scenario: a new feature or requirement has to be integrated into the existing software, and (as almost always) there are several ways to do it: the quick & dirty solution, which can be done quickly, even though you are well aware that sooner or later you will "stumble" over this piece of code again. […]

Creating a Word document OutputFormatter in ASP.NET Core

22.02.2017 18:00:00 | Jürgen Gutsch

In one of the ASP.NET Core projects we did last year, we created an OutputFormatter to provide Word documents as printable reports via ASP.NET Core Web API. Well, this formatter wasn't written by me, but by a fellow software developer, Jakob Wolf, at yooapps.com. I asked him to write about it, but he hasn't had enough time to do it yet, so I'm going to do it for him. Maybe you know him from Twitter. Maybe not, but he is one of the best ASP.NET and Angular developers I have ever met.

About OutputFormatters

In ASP.NET Core you are able to have many different formatters. The best known built-in formatter is the JsonOutputFormatter, which is used as the default OutputFormatter in ASP.NET Core Web API.

By using AddMvcOptions() you are able to add new formatters or to manage the existing ones:

services.AddMvc()
    .AddMvcOptions(options =>
    {
        options.OutputFormatters.Add(new WordOutputFormatter());
        options.FormatterMappings.SetMediaTypeMappingForFormat(
          "docx", MediaTypeHeaderValue.Parse("application/ms-word"));
    });

As you can see in the snippet above, we add the Word document formatter (called WordOutputFormatter) to provide Word documents if the requested type is "application/ms-word".

You are able to add whatever formatter you need, mapped to whatever media type you want to support.

Let's have a look at what an output formatter looks like:

public class MyFormatter : IOutputFormatter
{
  public bool CanWriteResult(OutputFormatterCanWriteContext context)
  {
    // check whether to write or not
    throw new NotImplementedException();
  }

  public async Task WriteAsync(OutputFormatterWriteContext context)
  {
    // write the formatted contents to the response stream.
    throw new NotImplementedException();
  }
}

You have one method to check whether the data can be written in the expected format or not. The other, async method does the job of formatting and writing the data to the response stream, which comes with the context.

This way, you need to do some things manually. A more comfortable way to implement an OutputFormatter is to inherit from the OutputFormatter base class directly:

public class WordOutputFormatter : OutputFormatter
{
  public string ContentType { get; }

  public WordOutputFormatter()
  {
    ContentType = "application/ms-word";
    SupportedMediaTypes.Add(MediaTypeHeaderValue.Parse(ContentType));
  }

  // optional, but makes sense to restrict to a specific condition
  protected override bool CanWriteType(Type type)
  {
    if (type == null)
    {
      throw new ArgumentNullException(nameof(type));
    }

    // only one ViewModel type is allowed
    return type == typeof(DocumentContentsViewModel);
  }

  // this needs to be overwritten
  public override Task WriteResponseBodyAsync(OutputFormatterWriteContext context)
  {
    // Format and write the document outputs here
    throw new NotImplementedException();
  }
}

The base class does some things for you, for example writing the correct HTTP headers.

Creating Word documents

To create Word documents, you need to add a reference to the Open XML SDK. We used the OpenXMLSDK-MOT in version 2.6.0, which cannot be used with .NET Core. This is why we run that specific ASP.NET Core project on .NET 4.6.

Version 2.7.0 is available as a .NET Standard 1.3 library and can be used in .NET Core. Unfortunately, this version isn't yet available in the default NuGet feed. To install the latest version, follow the instructions on GitHub: https://github.com/officedev/open-xml-sd Currently there is a mess with the NuGet package IDs and versions on NuGet and MyGet. Use the MyGet feed mentioned on the GitHub page to install the latest version. The package ID there is DocumentFormat.OpenXml and the latest stable version is 2.7.1.

In this post I don't want to go through all the word processing stuff, because it is too specific to our implementation. I'll just show you how it works in general. The Open XML SDK is pretty well documented, so you can use it as an entry point to create your own specific WordOutputFormatter:

public override async Task WriteResponseBodyAsync(OutputFormatterWriteContext context)
{
  var response = context.HttpContext.Response;
  var filePath = Path.GetTempFileName();

  var viewModel = context.Object as DocumentContentsViewModel;
  if (viewModel == null)
  {
    throw new ArgumentNullException(nameof(viewModel));
  }

  using (var wordprocessingDocument = WordprocessingDocument
         .Create(filePath, WordprocessingDocumentType.Document))
  {
    // start creating the documents and the main parts of it
    wordprocessingDocument.AddMainDocumentPart();

    var styleDefinitionPart = wordprocessingDocument.MainDocumentPart
      .AddNewPart<StyleDefinitionsPart>();
    var styles = new Styles();
    styles.Save(styleDefinitionPart);

    wordprocessingDocument.MainDocumentPart.Document = new Document
    {
      Body = new Body()
    };
    var body = wordprocessingDocument.MainDocumentPart.Document.Body;

    // call a helper method to set default styles
    AddStyles(styleDefinitionPart); 
    // call a helper method set the document to landscape mode
    SetLandscape(body); 

    foreach (var institution in viewModel.Items)
    {
      // iterate threw some data of the viewmodel 
      // and create the elements you need
      
      // ... more word processing stuff here

    }

    await response.SendFileAsync(filePath);
  }
}

The ViewModel with the data to format is in the Object property of the OutputFormatterWriteContext. We do a safe cast and check for null before we continue. The Open XML SDK works based on files. This is why we need to create a temp file name and let the SDK use this file path. Because of that, at the end we send the file out to the response stream using the response.SendFileAsync() method. I personally prefer to work on the OutputStream directly, to have fewer file operations and to be a little bit faster. The other thing is that we need to clean up the temp files.

After the file is created, we work on it and create the document, custom styles and layouts, and the document body, which will contain the formatted data. Inside the loop we are only working on that Body object. We created helper methods to add formatted values, tables and so on.
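Regarding the temp file cleanup mentioned above: one simple option is to wrap the processing in a try/finally and delete the file after it has been sent. This is just a sketch of the idea, not the exact code of the project:

public override async Task WriteResponseBodyAsync(OutputFormatterWriteContext context)
{
  var response = context.HttpContext.Response;
  var filePath = Path.GetTempFileName();

  try
  {
    // ... create the WordprocessingDocument on filePath as shown above ...

    await response.SendFileAsync(filePath);
  }
  finally
  {
    // make sure the temp files do not pile up on the server
    if (File.Exists(filePath))
    {
      File.Delete(filePath);
    }
  }
}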

Conclusion

OutputFormatters are pretty useful for creating almost any kind of content out of any kind of data. Instead of hacking around in specific Web API actions, you should always use OutputFormatters to get reusable components.

The OutputFormatter we built is not really reusable or generic, because it was created for a specific kind of report. But with this starting point we are able to make it generic: we could pass a template document to the formatter which knows the properties of the ViewModel. This way it would be possible to create almost any kind of Word document.

Create NuGet packages with Cake

14.02.2017 01:45:00 |

This blog post is a follow-up to these Cake (C# Make) related blog posts:

Scenario


Let's say we have this project structure. The "Config", "Result" and "Engine" projects each contain a corresponding .nuspec, like this:

<?xml version="1.0"?>
<package >
  <metadata>
    <id>Sloader.Config</id>
    <version>$version$</version>
    <title>Sloader.Config</title>
    <authors>Code Inside Team</authors>
    <owners>Code Inside Team</owners>
    <licenseUrl>https://github.com/Code-Inside/Sloader/blob/master/LICENSE</licenseUrl>
    <projectUrl>https://github.com/Code-Inside/Sloader</projectUrl>
    <requireLicenseAcceptance>false</requireLicenseAcceptance>
    <description>Sloader Config</description>
    <releaseNotes>
      ## Version 0.1 ##
      Init
    </releaseNotes>
    <copyright>Copyright 2017</copyright>
    <tags>Sloader</tags>
    <dependencies />
  </metadata>
</package>

Nothing fancy - pretty normal NuGet stuff, but be aware of the "$version$" variable. This variable is called a "replacement token". When the NuGet package is created and NuGet detects such a replacement token, it will look up the AssemblyVersion (or other replacement-token sources).

Versioning in NuGet:

I'm not a NuGet expert, but you should also version your assembly info, otherwise some systems may have trouble updating your DLL. The version inside the package can be different from the actual assembly version, but you should manage both, or use this replacement-token mechanism.
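For illustration, the AssemblyVersion that feeds the $version$ token typically lives in the AssemblyInfo.cs of each project; the concrete numbers below are just an example, not the actual values of the Sloader projects:

using System.Reflection;

// NuGet resolves the $version$ replacement token from the packed assembly:
// it uses AssemblyInformationalVersion if present, otherwise AssemblyVersion.
[assembly: AssemblyVersion("0.2.1.0")]
[assembly: AssemblyFileVersion("0.2.1.0")]
[assembly: AssemblyInformationalVersion("0.2.1")]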

Goal

The goal is to create a NuGet package for each target project with Cake.

build.cake

The usage in Cake is pretty much the same as with the normal nuget.exe pack command. The sample only shows the actual Cake target - see the older blog posts for a more complete example:

Task("BuildPackages")
    .IsDependentOn("Restore-NuGet-Packages")
	.IsDependentOn("RunTests")
    .Does(() =>
{
    var nuGetPackSettings = new NuGetPackSettings
	{
		OutputDirectory = rootAbsoluteDir + @"\artifacts\",
		IncludeReferencedProjects = true,
		Properties = new Dictionary<string, string>
		{
			{ "Configuration", "Release" }
		}
	};

    MSBuild("./src/Sloader.Config/Sloader.Config.csproj", new MSBuildSettings().SetConfiguration("Release"));
    NuGetPack("./src/Sloader.Config/Sloader.Config.csproj", nuGetPackSettings);
    MSBuild("./src/Sloader.Result/Sloader.Result.csproj", new MSBuildSettings().SetConfiguration("Release"));
    NuGetPack("./src/Sloader.Result/Sloader.Result.csproj", nuGetPackSettings);
    MSBuild("./src/Sloader.Engine/Sloader.Engine.csproj", new MSBuildSettings().SetConfiguration("Release"));
    NuGetPack("./src/Sloader.Engine/Sloader.Engine.csproj", nuGetPackSettings);
});

Easy, right? The most interesting part here is the NuGetPack command. Before we invoke this command, we need to make sure that we have built the most recent version - to enforce that, we just rebuild each project in Release mode.

Result


The console output should make the flow pretty clear:

PS C:\Users\Robert\Documents\GitHub\Sloader> .\build.ps1 -t BuildPackages
Preparing to run build script...
Running build script...
Analyzing build script...
Processing build script...
Installing tools...
Compiling build script...

========================================
Clean
========================================
Executing task: Clean
Cleaning directory C:/Users/Robert/Documents/GitHub/Sloader/artifacts
Finished executing task: Clean

========================================
Restore-NuGet-Packages
========================================
Executing task: Restore-NuGet-Packages
MSBuild auto-detection: using msbuild version '14.0' from 'C:\Program Files (x86)\MSBuild\14.0\bin'.
All packages listed in packages.config are already installed.
Finished executing task: Restore-NuGet-Packages

========================================
BuildTests
========================================
Executing task: BuildTests
Start Building Test: Sloader.Config.Tests
Microsoft (R) Build Engine version 14.0.25420.1
Copyright (C) Microsoft Corporation. All rights reserved.

  Sloader.Config -> C:\Users\Robert\Documents\GitHub\Sloader\artifacts\_tests\Sloader.Config.Tests\Sloader.Config.dll
  Sloader.Config.Tests -> C:\Users\Robert\Documents\GitHub\Sloader\artifacts\_tests\Sloader.Config.Tests\Sloader.Config
  .Tests.dll
Start Building Test: Sloader.Result.Tests
Microsoft (R) Build Engine version 14.0.25420.1
Copyright (C) Microsoft Corporation. All rights reserved.

  Sloader.Result -> C:\Users\Robert\Documents\GitHub\Sloader\artifacts\_tests\Sloader.Result.Tests\Sloader.Result.dll
  Sloader.Result.Tests -> C:\Users\Robert\Documents\GitHub\Sloader\artifacts\_tests\Sloader.Result.Tests\Sloader.Result
  .Tests.dll
Start Building Test: Sloader.Engine.Tests
Microsoft (R) Build Engine version 14.0.25420.1
Copyright (C) Microsoft Corporation. All rights reserved.

  Sloader.Config -> C:\Users\Robert\Documents\GitHub\Sloader\artifacts\_tests\Sloader.Engine.Tests\Sloader.Config.dll
  Sloader.Result -> C:\Users\Robert\Documents\GitHub\Sloader\artifacts\_tests\Sloader.Engine.Tests\Sloader.Result.dll
  Sloader.Engine -> C:\Users\Robert\Documents\GitHub\Sloader\artifacts\_tests\Sloader.Engine.Tests\Sloader.Engine.dll
  Sloader.Engine.Tests -> C:\Users\Robert\Documents\GitHub\Sloader\artifacts\_tests\Sloader.Engine.Tests\Sloader.Engine
  .Tests.dll
Finished executing task: BuildTests

========================================
RunTests
========================================
Executing task: RunTests
Start Running Tests
xUnit.net Console Runner (64-bit .NET 4.0.30319.42000)
  Discovering: Sloader.Config.Tests
  Discovered:  Sloader.Config.Tests
  Starting:    Sloader.Config.Tests
  Finished:    Sloader.Config.Tests
  Discovering: Sloader.Engine.Tests
  Discovered:  Sloader.Engine.Tests
  Starting:    Sloader.Engine.Tests
  Finished:    Sloader.Engine.Tests
  Discovering: Sloader.Result.Tests
  Discovered:  Sloader.Result.Tests
  Starting:    Sloader.Result.Tests
  Finished:    Sloader.Result.Tests
=== TEST EXECUTION SUMMARY ===
   Sloader.Config.Tests  Total: 23, Errors: 0, Failed: 0, Skipped: 0, Time: 0.554s
   Sloader.Engine.Tests  Total: 17, Errors: 0, Failed: 0, Skipped: 0, Time: 1.070s
   Sloader.Result.Tests  Total:  4, Errors: 0, Failed: 0, Skipped: 0, Time: 1.061s
                                --          -          -           -        ------
                   GRAND TOTAL: 44          0          0           0        2.684s (5.697s)
Finished executing task: RunTests

========================================
BuildPackages
========================================
Executing task: BuildPackages
Microsoft (R) Build Engine version 14.0.25420.1
Copyright (C) Microsoft Corporation. All rights reserved.

Build started 2017-02-19 22:00:09.
The target "_ConvertPdbFiles" listed in a BeforeTargets attribute at "C:\Program Files (x86)\MSBuild\14.0\Microsoft.Com
mon.targets\ImportAfter\Xamarin.Common.targets (34,37)" does not exist in the project, and will be ignored.
The target "_CollectPdbFiles" listed in an AfterTargets attribute at "C:\Program Files (x86)\MSBuild\14.0\Microsoft.Com
mon.targets\ImportAfter\Xamarin.Common.targets (34,70)" does not exist in the project, and will be ignored.
The target "_CollectMdbFiles" listed in a BeforeTargets attribute at "C:\Program Files (x86)\MSBuild\14.0\Microsoft.Com
mon.targets\ImportAfter\Xamarin.Common.targets (41,38)" does not exist in the project, and will be ignored.
The target "_CopyMdbFiles" listed in an AfterTargets attribute at "C:\Program Files (x86)\MSBuild\14.0\Microsoft.Common
.targets\ImportAfter\Xamarin.Common.targets (41,71)" does not exist in the project, and will be ignored.
Project "C:\Users\Robert\Documents\GitHub\Sloader\src\Sloader.Config\Sloader.Config.csproj" on node 1 (Build target(s))
.
GenerateTargetFrameworkMonikerAttribute:
Skipping target "GenerateTargetFrameworkMonikerAttribute" because all output files are up-to-date with respect to the i
nput files.
CoreCompile:
Skipping target "CoreCompile" because all output files are up-to-date with respect to the input files.
CopyFilesToOutputDirectory:
  Sloader.Config -> C:\Users\Robert\Documents\GitHub\Sloader\src\Sloader.Config\bin\Release\Sloader.Config.dll
Done Building Project "C:\Users\Robert\Documents\GitHub\Sloader\src\Sloader.Config\Sloader.Config.csproj" (Build target
(s)).


Build succeeded.
    0 Warning(s)
    0 Error(s)

Time Elapsed 00:00:00.22
Attempting to build package from 'Sloader.Config.csproj'.
MSBuild auto-detection: using msbuild version '14.0' from 'C:\Program Files (x86)\MSBuild\14.0\bin'.
Packing files from 'C:\Users\Robert\Documents\GitHub\Sloader\src\Sloader.Config\bin\Release'.
Using 'Sloader.Config.nuspec' for metadata.
Found packages.config. Using packages listed as dependencies
Successfully created package 'C:\Users\Robert\Documents\GitHub\Sloader\artifacts\Sloader.Config.0.2.1.nupkg'.
Microsoft (R) Build Engine version 14.0.25420.1
Copyright (C) Microsoft Corporation. All rights reserved.

Build started 2017-02-19 22:00:10.
The target "_ConvertPdbFiles" listed in a BeforeTargets attribute at "C:\Program Files (x86)\MSBuild\14.0\Microsoft.Com
mon.targets\ImportAfter\Xamarin.Common.targets (34,37)" does not exist in the project, and will be ignored.
The target "_CollectPdbFiles" listed in an AfterTargets attribute at "C:\Program Files (x86)\MSBuild\14.0\Microsoft.Com
mon.targets\ImportAfter\Xamarin.Common.targets (34,70)" does not exist in the project, and will be ignored.
The target "_CollectMdbFiles" listed in a BeforeTargets attribute at "C:\Program Files (x86)\MSBuild\14.0\Microsoft.Com
mon.targets\ImportAfter\Xamarin.Common.targets (41,38)" does not exist in the project, and will be ignored.
The target "_CopyMdbFiles" listed in an AfterTargets attribute at "C:\Program Files (x86)\MSBuild\14.0\Microsoft.Common
.targets\ImportAfter\Xamarin.Common.targets (41,71)" does not exist in the project, and will be ignored.
Project "C:\Users\Robert\Documents\GitHub\Sloader\src\Sloader.Result\Sloader.Result.csproj" on node 1 (Build target(s))
.
GenerateTargetFrameworkMonikerAttribute:
Skipping target "GenerateTargetFrameworkMonikerAttribute" because all output files are up-to-date with respect to the i
nput files.
CoreCompile:
Skipping target "CoreCompile" because all output files are up-to-date with respect to the input files.
CopyFilesToOutputDirectory:
  Sloader.Result -> C:\Users\Robert\Documents\GitHub\Sloader\src\Sloader.Result\bin\Release\Sloader.Result.dll
Done Building Project "C:\Users\Robert\Documents\GitHub\Sloader\src\Sloader.Result\Sloader.Result.csproj" (Build target
(s)).


Build succeeded.
    0 Warning(s)
    0 Error(s)

Time Elapsed 00:00:00.24
Attempting to build package from 'Sloader.Result.csproj'.
MSBuild auto-detection: using msbuild version '14.0' from 'C:\Program Files (x86)\MSBuild\14.0\bin'.
Packing files from 'C:\Users\Robert\Documents\GitHub\Sloader\src\Sloader.Result\bin\Release'.
Using 'Sloader.Result.nuspec' for metadata.
Found packages.config. Using packages listed as dependencies
Successfully created package 'C:\Users\Robert\Documents\GitHub\Sloader\artifacts\Sloader.Result.0.2.1.nupkg'.
Microsoft (R) Build Engine version 14.0.25420.1
Copyright (C) Microsoft Corporation. All rights reserved.

Build started 2017-02-19 22:00:12.
The target "_ConvertPdbFiles" listed in a BeforeTargets attribute at "C:\Program Files (x86)\MSBuild\14.0\Microsoft.Com
mon.targets\ImportAfter\Xamarin.Common.targets (34,37)" does not exist in the project, and will be ignored.
The target "_CollectPdbFiles" listed in an AfterTargets attribute at "C:\Program Files (x86)\MSBuild\14.0\Microsoft.Com
mon.targets\ImportAfter\Xamarin.Common.targets (34,70)" does not exist in the project, and will be ignored.
The target "_CollectMdbFiles" listed in a BeforeTargets attribute at "C:\Program Files (x86)\MSBuild\14.0\Microsoft.Com
mon.targets\ImportAfter\Xamarin.Common.targets (41,38)" does not exist in the project, and will be ignored.
The target "_CopyMdbFiles" listed in an AfterTargets attribute at "C:\Program Files (x86)\MSBuild\14.0\Microsoft.Common
.targets\ImportAfter\Xamarin.Common.targets (41,71)" does not exist in the project, and will be ignored.
The target "_ConvertPdbFiles" listed in a BeforeTargets attribute at "C:\Program Files (x86)\MSBuild\14.0\Microsoft.Com
mon.targets\ImportAfter\Xamarin.Common.targets (34,37)" does not exist in the project, and will be ignored.
The target "_CollectPdbFiles" listed in an AfterTargets attribute at "C:\Program Files (x86)\MSBuild\14.0\Microsoft.Com
mon.targets\ImportAfter\Xamarin.Common.targets (34,70)" does not exist in the project, and will be ignored.
The target "_CollectMdbFiles" listed in a BeforeTargets attribute at "C:\Program Files (x86)\MSBuild\14.0\Microsoft.Com
mon.targets\ImportAfter\Xamarin.Common.targets (41,38)" does not exist in the project, and will be ignored.
The target "_CopyMdbFiles" listed in an AfterTargets attribute at "C:\Program Files (x86)\MSBuild\14.0\Microsoft.Common
.targets\ImportAfter\Xamarin.Common.targets (41,71)" does not exist in the project, and will be ignored.
Project "C:\Users\Robert\Documents\GitHub\Sloader\src\Sloader.Engine\Sloader.Engine.csproj" on node 1 (Build target(s))
.
Project "C:\Users\Robert\Documents\GitHub\Sloader\src\Sloader.Engine\Sloader.Engine.csproj" (1) is building "C:\Users\R
obert\Documents\GitHub\Sloader\src\Sloader.Config\Sloader.Config.csproj" (2) on node 1 (default targets).
GenerateTargetFrameworkMonikerAttribute:
Skipping target "GenerateTargetFrameworkMonikerAttribute" because all output files are up-to-date with respect to the i
nput files.
CoreCompile:
Skipping target "CoreCompile" because all output files are up-to-date with respect to the input files.
CopyFilesToOutputDirectory:
  Sloader.Config -> C:\Users\Robert\Documents\GitHub\Sloader\src\Sloader.Config\bin\Release\Sloader.Config.dll
Done Building Project "C:\Users\Robert\Documents\GitHub\Sloader\src\Sloader.Config\Sloader.Config.csproj" (default targ
ets).

The target "_ConvertPdbFiles" listed in a BeforeTargets attribute at "C:\Program Files (x86)\MSBuild\14.0\Microsoft.Com
mon.targets\ImportAfter\Xamarin.Common.targets (34,37)" does not exist in the project, and will be ignored.
The target "_CollectPdbFiles" listed in an AfterTargets attribute at "C:\Program Files (x86)\MSBuild\14.0\Microsoft.Com
mon.targets\ImportAfter\Xamarin.Common.targets (34,70)" does not exist in the project, and will be ignored.
The target "_CollectMdbFiles" listed in a BeforeTargets attribute at "C:\Program Files (x86)\MSBuild\14.0\Microsoft.Com
mon.targets\ImportAfter\Xamarin.Common.targets (41,38)" does not exist in the project, and will be ignored.
The target "_CopyMdbFiles" listed in an AfterTargets attribute at "C:\Program Files (x86)\MSBuild\14.0\Microsoft.Common
.targets\ImportAfter\Xamarin.Common.targets (41,71)" does not exist in the project, and will be ignored.
Project "C:\Users\Robert\Documents\GitHub\Sloader\src\Sloader.Engine\Sloader.Engine.csproj" (1) is building "C:\Users\R
obert\Documents\GitHub\Sloader\src\Sloader.Result\Sloader.Result.csproj" (3) on node 1 (default targets).
GenerateTargetFrameworkMonikerAttribute:
Skipping target "GenerateTargetFrameworkMonikerAttribute" because all output files are up-to-date with respect to the i
nput files.
CoreCompile:
Skipping target "CoreCompile" because all output files are up-to-date with respect to the input files.
CopyFilesToOutputDirectory:
  Sloader.Result -> C:\Users\Robert\Documents\GitHub\Sloader\src\Sloader.Result\bin\Release\Sloader.Result.dll
Done Building Project "C:\Users\Robert\Documents\GitHub\Sloader\src\Sloader.Result\Sloader.Result.csproj" (default targ
ets).

BclBuildEnsureBindingRedirects:
Skipping target "BclBuildEnsureBindingRedirects" because all output files are up-to-date with respect to the input file
s.
GenerateTargetFrameworkMonikerAttribute:
Skipping target "GenerateTargetFrameworkMonikerAttribute" because all output files are up-to-date with respect to the i
nput files.
CoreCompile:
Skipping target "CoreCompile" because all output files are up-to-date with respect to the input files.
_CopyAppConfigFile:
Skipping target "_CopyAppConfigFile" because all output files are up-to-date with respect to the input files.
CopyFilesToOutputDirectory:
  Sloader.Engine -> C:\Users\Robert\Documents\GitHub\Sloader\src\Sloader.Engine\bin\Release\Sloader.Engine.dll
Done Building Project "C:\Users\Robert\Documents\GitHub\Sloader\src\Sloader.Engine\Sloader.Engine.csproj" (Build target
(s)).


Build succeeded.
    0 Warning(s)
    0 Error(s)

Time Elapsed 00:00:00.54
Attempting to build package from 'Sloader.Engine.csproj'.
MSBuild auto-detection: using msbuild version '14.0' from 'C:\Program Files (x86)\MSBuild\14.0\bin'.
Packing files from 'C:\Users\Robert\Documents\GitHub\Sloader\src\Sloader.Engine\bin\Release'.
Using 'Sloader.Engine.nuspec' for metadata.
Found packages.config. Using packages listed as dependencies
Successfully created package 'C:\Users\Robert\Documents\GitHub\Sloader\artifacts\Sloader.Engine.0.2.1.nupkg'.
Finished executing task: BuildPackages

Task                          Duration
--------------------------------------------------
Clean                         00:00:00.1083837
Restore-NuGet-Packages        00:00:00.7808530
BuildTests                    00:00:02.6296445
RunTests                      00:00:05.9397822
BuildPackages                 00:00:05.2679058
--------------------------------------------------
Total:                        00:00:14.7265692

BrowserLink causes an error in a new ASP.NET Core web

09.02.2017 18:00:00 | Jürgen Gutsch

Using the latest bits of Visual Studio 2017 and the latest .NET Core SDK, you will possibly get an error while starting your ASP.NET Core web app in Visual Studio 2017 or via dotnet run.

In Visual Studio 2017 the browser opens on pressing F5, but with weird HTML code in the address bar. Debugging starts and stops a few seconds later. Using the console you'll get a more meaningful error message:

This error means that something is missing the System.Runtime.dll ("System.Runtime, Version=4.2.0.0"), which is not referenced or used directly anywhere. I had a deeper look into the NuGet references and couldn't find it.

Because I had often had problems with BrowserLink in the past, I removed the NuGet reference from the project and everything worked fine. I added it again, to be sure that the fix wasn't just a side effect of the removal cleaning something up. The error happened again.

It seems that the current version of BrowserLink references a library which is not supported by the app. Remove it if you get the same or a similar error.

BrowserLink in general is a pretty cool feature: it magically refreshes the browser if anything changes on the server. With this tool, you are able to edit your CSS files and preview the changes directly in the browser without a manual refresh. It is a Visual Studio add-in and uses NuGet packages to extend your app to support it.
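For reference, in the default ASP.NET Core templates BrowserLink comes in via a NuGet package (Microsoft.VisualStudio.Web.BrowserLink in recent versions) and a middleware call in Startup. Removing it means dropping both. The following is only an illustrative sketch of a typical Configure method, not the code of a specific project:

public void Configure(IApplicationBuilder app, IHostingEnvironment env)
{
    if (env.IsDevelopment())
    {
        app.UseDeveloperExceptionPage();

        // remove this call together with the BrowserLink NuGet reference
        // app.UseBrowserLink();
    }

    app.UseStaticFiles();
    app.UseMvcWithDefaultRoute();
}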

Build & run xUnit tests with Cake

08.02.2017 01:45:00 |

Last year I already covered the basic usage of Cake, which stands for “C# Make”. This time we want to build and run xUnit tests with Cake.

Scenario


Let’s say we have this project structure. Be aware that all our tests have the suffix “Tests” in the project name.

The files are organized like this, so we have all “Tests” in a “tests” folder and the actual code under “src”:

src/Sloader.Config
src/Sloader.Engine
src/Sloader.Hosts.Console
src/Sloader.Result
tests/Sloader.Config.Tests
tests/Sloader.Engine.Tests
tests/Sloader.Result.Tests
.gitignore
build.cake
build.ps1
LICENSE
Sloader.sln

Goal

Now we want to build all test projects and run them with the xUnit console runner. Be aware that there are multiple ways of doing this, but I found this one quite good.

build.cake

#tool "nuget:?package=xunit.runner.console"
//////////////////////////////////////////////////////////////////////
// ARGUMENTS
//////////////////////////////////////////////////////////////////////

var target = Argument("target", "Default");
var configuration = Argument("configuration", "Release");

//////////////////////////////////////////////////////////////////////
// PREPARATION
//////////////////////////////////////////////////////////////////////

// Define directories.
var artifactsDir  = Directory("./artifacts/");
var rootAbsoluteDir = MakeAbsolute(Directory("./")).FullPath;

//////////////////////////////////////////////////////////////////////
// TASKS
//////////////////////////////////////////////////////////////////////

Task("Clean")
    .Does(() =>
{
    CleanDirectory(artifactsDir);
});

Task("Restore-NuGet-Packages")
    .IsDependentOn("Clean")
    .Does(() =>
{
    NuGetRestore("./Sloader.sln");
});

Task("Build")
    .IsDependentOn("Restore-NuGet-Packages")
    .Does(() =>
{

     
});

Task("BuildTests")
    .IsDependentOn("Restore-NuGet-Packages")
    .Does(() =>
{
	var parsedSolution = ParseSolution("./Sloader.sln");

	foreach(var project in parsedSolution.Projects)
	{
	
	if(project.Name.EndsWith(".Tests"))
		{
        Information("Start Building Test: " + project.Name);

        MSBuild(project.Path, new MSBuildSettings()
                .SetConfiguration("Debug")
                .SetMSBuildPlatform(MSBuildPlatform.Automatic)
                .SetVerbosity(Verbosity.Minimal)
                .WithProperty("SolutionDir", @".\")
                .WithProperty("OutDir", rootAbsoluteDir + @"\artifacts\_tests\" + project.Name + @"\"));
		}
	
	}    

});

Task("RunTests")
    .IsDependentOn("BuildTests")
    .Does(() =>
{
    Information("Start Running Tests");
    XUnit2("./artifacts/_tests/**/*.Tests.dll");
});

//////////////////////////////////////////////////////////////////////
// TASK TARGETS
//////////////////////////////////////////////////////////////////////

Task("Default")
    .IsDependentOn("RunTests");

//////////////////////////////////////////////////////////////////////
// EXECUTION
//////////////////////////////////////////////////////////////////////

RunTarget(target);

Explanation: BuildTests?

The default target “Default” will trigger “RunTests”, which depends on “BuildTests”.

Inside the “BuildTests” target we use a handy helper from Cake: we parse the .sln file and search for all “Tests” projects. With that information we can build each test project individually and don’t have to worry about “overlapping” files. The output of this build is saved under “artifacts/_tests”.

Running xUnit

To run xUnit we have to include the runner at the top of the cake file:

#tool "nuget:?package=xunit.runner.console"

Now we can just invoke XUnit2, let it scan for all *.Tests.dll files, and we are done:

XUnit2("./artifacts/_tests/**/*.Tests.dll");

Result

The console output should make the flow pretty clear:

PS C:\Users\Robert\Documents\GitHub\Sloader> .\build.ps1
Preparing to run build script...
Running build script...
Analyzing build script...
Processing build script...
Installing tools...
Compiling build script...

========================================
Clean
========================================
Executing task: Clean
Creating directory C:/Users/Robert/Documents/GitHub/Sloader/artifacts
Finished executing task: Clean

========================================
Restore-NuGet-Packages
========================================
Executing task: Restore-NuGet-Packages
MSBuild auto-detection: using msbuild version '14.0' from 'C:\Program Files (x86)\MSBuild\14.0\bin'.
All packages listed in packages.config are already installed.
Finished executing task: Restore-NuGet-Packages

========================================
BuildTests
========================================
Executing task: BuildTests
Start Building Test: Sloader.Config.Tests
Microsoft (R) Build Engine version 14.0.25420.1
Copyright (C) Microsoft Corporation. All rights reserved.

  Sloader.Config -> C:\Users\Robert\Documents\GitHub\Sloader\artifacts\_tests\Sloader.Config.Tests\Sloader.Config.dll
  Sloader.Config.Tests -> C:\Users\Robert\Documents\GitHub\Sloader\artifacts\_tests\Sloader.Config.Tests\Sloader.Config
  .Tests.dll
Start Building Test: Sloader.Result.Tests
Microsoft (R) Build Engine version 14.0.25420.1
Copyright (C) Microsoft Corporation. All rights reserved.

  Sloader.Result -> C:\Users\Robert\Documents\GitHub\Sloader\artifacts\_tests\Sloader.Result.Tests\Sloader.Result.dll
  Sloader.Result.Tests -> C:\Users\Robert\Documents\GitHub\Sloader\artifacts\_tests\Sloader.Result.Tests\Sloader.Result
  .Tests.dll
Start Building Test: Sloader.Engine.Tests
Microsoft (R) Build Engine version 14.0.25420.1
Copyright (C) Microsoft Corporation. All rights reserved.

  Sloader.Config -> C:\Users\Robert\Documents\GitHub\Sloader\artifacts\_tests\Sloader.Engine.Tests\Sloader.Config.dll
  Sloader.Result -> C:\Users\Robert\Documents\GitHub\Sloader\artifacts\_tests\Sloader.Engine.Tests\Sloader.Result.dll
  Sloader.Engine -> C:\Users\Robert\Documents\GitHub\Sloader\artifacts\_tests\Sloader.Engine.Tests\Sloader.Engine.dll
  Sloader.Engine.Tests -> C:\Users\Robert\Documents\GitHub\Sloader\artifacts\_tests\Sloader.Engine.Tests\Sloader.Engine
  .Tests.dll
Finished executing task: BuildTests

========================================
RunTests
========================================
Executing task: RunTests
Start Running Tests
xUnit.net Console Runner (64-bit .NET 4.0.30319.42000)
  Discovering: Sloader.Config.Tests
  Discovered:  Sloader.Config.Tests
  Starting:    Sloader.Config.Tests
  Finished:    Sloader.Config.Tests
  Discovering: Sloader.Engine.Tests
  Discovered:  Sloader.Engine.Tests
  Starting:    Sloader.Engine.Tests
  Finished:    Sloader.Engine.Tests
  Discovering: Sloader.Result.Tests
  Discovered:  Sloader.Result.Tests
  Starting:    Sloader.Result.Tests
  Finished:    Sloader.Result.Tests
=== TEST EXECUTION SUMMARY ===
   Sloader.Config.Tests  Total: 22, Errors: 0, Failed: 0, Skipped: 0, Time: 0.342s
   Sloader.Engine.Tests  Total:  9, Errors: 0, Failed: 0, Skipped: 0, Time: 0.752s
   Sloader.Result.Tests  Total:  4, Errors: 0, Failed: 0, Skipped: 0, Time: 0.475s
                                --          -          -           -        ------
                   GRAND TOTAL: 35          0          0           0        1.569s (3.115s)
Finished executing task: RunTests

========================================
Default
========================================
Executing task: Default
Finished executing task: Default

Task                          Duration
--------------------------------------------------
Clean                         00:00:00.0155255
Restore-NuGet-Packages        00:00:00.5065704
BuildTests                    00:00:02.1590662
RunTests                      00:00:03.2443534
Default                       00:00:00.0061325
--------------------------------------------------
Total:                        00:00:05.9316480

Using Dependency Injection in .NET Core Console Apps

07.02.2017 18:00:00 | Jürgen Gutsch

The Dependency Injection (DI) container used in ASP.NET Core is not limited to ASP.NET Core. You are able to use it in any kind of .NET project. This post shows how to use it in a .NET Core console application.

Create a console application using the dotnet CLI or Visual Studio 2017. The DI container is not available by default, but the IServiceProvider is. If you want to use a custom or third-party DI container, you should provide an implementation of IServiceProvider as an encapsulation of that container.

In this post I want to use the DI container used in ASP.NET Core projects. This needs the additional NuGet package "Microsoft.Extensions.DependencyInjection" (currently version 1.1.0).

Since this library is a .NET Standard library, it should also work in a .NET 4.6 application. You just need to add a reference to "Microsoft.Extensions.DependencyInjection".

After adding that package we can start to use it. I created two simple classes, one depending on the other, to show how it works in a simple way:

public class Service1 : IDisposable
{
  private readonly Service2 _child;
  public Service1(Service2 child)
  {
    Console.WriteLine("Constructor Service1");
    _child = child;
  }

  public void Dispose()
  {
    Console.WriteLine("Dispose Service1");
    _child.Dispose();
  }
}

public class Service2 : IDisposable
{
  public Service2()
  {
    Console.WriteLine("Constructor Service2");
  }

  public void Dispose()
  {
    Console.WriteLine("Dispose Service2");
  }
}

Usually you would also use interfaces and register the relationship between these two classes against them, instead of the concrete implementations (see the sketch right after the registration code below). Anyway, we just want to test whether it works.

In the static void Main of the console app, we create a new ServiceCollection and register the classes in a transient scope:

var services = new ServiceCollection();
services.AddTransient<Service2>();
services.AddTransient<Service1>();
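
If you prefer the interface-based approach mentioned above, the registration would look roughly like this (a sketch assuming hypothetical IService1/IService2 interfaces implemented by the two classes):

// assumes Service1 : IService1 and Service2 : IService2,
// with Service1 taking an IService2 in its constructor
services.AddTransient<IService2, Service2>();
services.AddTransient<IService1, Service1>();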

The ServiceCollection comes from the added NuGet package. Your favorite DI container possibly uses another way to register the services. You could now pass the ServiceCollection to additional components that want to register more services, in the same way ASP.NET Core does it with the AddSomething (e.g. AddMvc()) extension methods.
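
Such an extension method could look roughly like this (a minimal sketch; AddDemoServices is a hypothetical name):

using Microsoft.Extensions.DependencyInjection;

public static class ServiceCollectionExtensions
{
    // registers the demo services on whatever IServiceCollection is passed in
    public static IServiceCollection AddDemoServices(this IServiceCollection services)
    {
        services.AddTransient<Service2>();
        services.AddTransient<Service1>();
        return services;
    }
}

In Main you would then simply call services.AddDemoServices();.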

Now we need to create the ServiceProvider out of that collection:

var provider = services.BuildServiceProvider();

We could also pass the ServiceProvider around in our application to retrieve the services, but the proper way is to use it only at a single entry point:

using (var service1 = provider.GetService<Service1>())
{
  // do something with the class
}

Now, let's start the console app and look at the console output:
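
Based on the constructors and Dispose methods defined above, the output should look roughly like this:

Constructor Service2
Constructor Service1
Dispose Service1
Dispose Service2

Service2 is written first because it is resolved as a dependency of Service1; the using block then disposes Service1, which in turn disposes its child.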

As you can see, this DI container works in any .NET Core app.

New server, new luck – the blog move is complete

06.02.2017 18:08:54 | Hans-Peter Schelian

After it had been "very" quiet around my blog for a long time, I want to use the just-completed move of my blog to a new server as an opportunity to blog regularly again. With the move to the new server I also updated the certificate setup: from now on this blog uses a certificate from Letsencrypt.

Visual Studio: NDepend v2017 released

05.02.2017 13:23:43 | Steffen Steinbrecher

NDepend has been released in version v2017 and offers great new features: Smart Technical Debt Estimation – with this feature you can see at a glance how much time it would cost to fix critical code locations/functions or to perform a refactoring. Suppose a developer implements a new feature and the NDepend analysis would now show that this […]

C#: Calling OData operations with parameters

05.02.2017 11:56:19 | Steffen Steinbrecher

OData supports user-defined operations, so-called actions and functions. A function must return data and normally has no side effects, i.e. a function should not modify data and should only read it (GET). Actions, on the other hand, can perform CRUD operations on entities, i.e. you can define custom actions that execute CREATE, UPDATE or DELETE operations on entities when […]

C#: OData, SAP NW Gateway and CSRF tokens

01.02.2017 15:53:09 | Steffen Steinbrecher

For all modifying requests (PUT, POST and DELETE) from a client against an SAP Netweaver OData service, the client has to provide a corresponding CSRF (Cross Site Request Forgery) token. This post shows how to request such a token and then pass it along with the OData requests. CSRF – Cross Site Request Forgery: first of all, a few […]

My technology radar for 2017

25.01.2017 07:30:47 | Johnny Graber

With the new year it is time again for an update of my technology radar. After the editions for 2014 and 2015, I skipped an edition for 2016. The big announcements from Microsoft (such as ASP.Net vNext) were delayed, and merely updating the year number was not enough for me. My new technology radar: as before, the … Continue reading Mein Technologieradar für 2017

Building a home robot: Part 5 - arms and hands

13.01.2017 19:00:00 | Daniel Springwald

(see all parts of "building a home robot")

I wanted Roobert to get two identical hands with separately movable fingers.

Because of this (and the small size) I decided to use a commercial construction kit instead of designing and constructing the hands on my own.

Although it was a construction kit, assembling the hands was hours of fun:

Each arm is constructed from the hand construction kit, 3 servos, 3D printed servo brackets and an I2C servo controller. Because the servo controller seemed to be unable to shut down the servo power, I attached a relay for each arm to turn the servo power on/off.

The servo holders for the upper arm parts are 3D printed:

The complete arms:

The right arm:

A roobert-hand-assembling-workplace :-)

GitHub API: Create or update files

03.01.2017 01:45:00 |

This blog post covers a pretty basic GitHub topic: creating and updating content on GitHub. Of course, there are many ways to do it - e.g. you could do the full Git ceremony and it would work with all Git hosts, but in my case I just wanted to target the official GitHub API.

Prerequisite: A GitHub User, Repo and Token

To use this code you will need write access to a GitHub repository and you should have a valid GitHub token.

Code

The simplest way to communicate with the GitHub API is to use the Octokit SDK (from GitHub).
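
The SDK is available on NuGet; assuming the standard package id, it can be added via the Package Manager Console:

Install-Package Octokit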

Description: Inside the try block we try to get the target file. If it is already committed in the repo, the API will return the last commit SHA.

With this SHA it is possible to create a new commit to do the actual update.

If the file was not found, we create it. I’m not a huge fan of this try/catch block, but I didn’t find any other way to check whether the file is committed or not (please give me a hint if this is wrong ;))

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using Octokit;

namespace CreateOrUpdateGitHubFile
{
    class Program
    {
        static void Main(string[] args)
        {
            Task.Run(async () =>
            {
                var ghClient = new GitHubClient(new ProductHeaderValue("Octokit-Test"));
                ghClient.Credentials = new Credentials("ACCESS-TOKEN");

                // github variables
                var owner = "OWNER";
                var repo = "REPO";
                var branch = "BRANCH";

                var targetFile = "_data/test.txt";

                try
                {
                    // try to get the file (and with the file the last commit sha)
                    var existingFile = await ghClient.Repository.Content.GetAllContentsByRef(owner, repo, targetFile, branch);

                    // update the file
                    var updateChangeSet = await ghClient.Repository.Content.UpdateFile(owner, repo, targetFile,
                       new UpdateFileRequest("API File update", "Hello Universe! " + DateTime.UtcNow, existingFile.First().Sha, branch));
                }
                catch (Octokit.NotFoundException)
                {
                    // if file is not found, create it
                    var createChangeSet = await ghClient.Repository.Content.CreateFile(owner,repo, targetFile, new CreateFileRequest("API File creation", "Hello Universe! " + DateTime.UtcNow, branch));
                }
            }).Wait();
        }
    }
}

The demo code is also available on GitHub.

Hope this helps.
