Channel: The art of simplicity
Viewing all 2694 articles

Turning your .NET Core 3 Worker Service into a Windows Service


Until the introduction of the Worker Service in .NET Core 3, I always used TopShelf to turn my console application into a Windows service.

Let’s see how we can do this using the built-in packages…

  • Start by creating a new worker project from the command line (or open Visual Studio and search for ‘worker’ in the available templates):

dotnet new worker

  • Now add the following NuGet package to our project:
Install-Package Microsoft.Extensions.Hosting.WindowsServices
  • Next we have to modify our Program.cs file and add a call to UseWindowsService().
  • That’s it!
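In code, the change boils down to one extra line on the host builder. A minimal sketch, based on the default worker template:

```csharp
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;

public class Program
{
    public static void Main(string[] args) => CreateHostBuilder(args).Build().Run();

    public static IHostBuilder CreateHostBuilder(string[] args) =>
        Host.CreateDefaultBuilder(args)
            .UseWindowsService() // run as a Windows Service when installed as one
            .ConfigureServices(services =>
            {
                // Worker is the background service class generated by the template
                services.AddHostedService<Worker>();
            });
}
```

When running from the console (e.g. during debugging), UseWindowsService() is a no-op, so the same binary works in both modes.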

Of course, you don’t have to believe me just like that, so let’s try to install our newly created Windows service. This can be done through the standard sc tool:

sc create ExampleService binPath= "C:\Projects\test\ExampleService\bin\Debug\netcoreapp3.1\ExampleService.exe"

Remark: sc expects a space after binPath=.

Now open up your services window and have a look:


C# 6–Using static


So far I’ve never used the ‘using static’ directive introduced in C# 6. To simplify the assertions of my tests I created a static TestHelper:
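The original code is missing here; a helper along these lines (names and signature are my assumption):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public static class TestHelper
{
    // Throws with the offending items listed when any item violates the predicate
    public static void AssertAll<T>(IEnumerable<T> items, Func<T, bool> predicate, string message)
    {
        var failures = items.Where(item => !predicate(item)).ToList();
        if (failures.Any())
            throw new InvalidOperationException($"{message}: {string.Join(", ", failures)}");
    }
}
```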

I’m using this TestHelper to simplify my NetArchTests (but more about that in another blog post).

Inside my tests I can now do the following:
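A hypothetical test using such a helper; thanks to the ‘using static’ directive the TestHelper. prefix disappears:

```csharp
using Xunit;
using static TestHelper; // brings the static members into scope without the class prefix

public class ArchitectureTests
{
    [Fact]
    public void Controllers_FollowNamingConvention()
    {
        var typeNames = new[] { "OrderController", "CustomerController" };

        // No 'TestHelper.' prefix needed thanks to 'using static'
        AssertAll(typeNames, name => name.EndsWith("Controller"), "Type name should end with Controller");
    }
}
```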

Neat!

XUnit–Could not load file or assembly ‘Microsoft.VisualStudio.CodeCoverage.Shim’


When executing my XUnit tests on the build server, it failed with the following message:

System.IO.FileNotFoundException : Could not load file or assembly 'Microsoft.VisualStudio.CodeCoverage.Shim, Version=15.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a'. The system cannot find the file specified.

Inside my csproj file the following packages were referenced:

<PackageReference Include="xunit" Version="2.4.0" />

<PackageReference Include="xunit.runner.visualstudio" Version="2.4.0" />

<PackageReference Include="coverlet.collector" Version="1.0.1" />

The ‘xunit.runner.visualstudio’ package implicitly has a dependency on Microsoft.NET.Test.Sdk (at minimum version 15.0), which explains why it tried to load the assembly mentioned above.

To get rid of this error, I had to explicitly include a reference to the ‘Microsoft.NET.Test.Sdk’:

<PackageReference Include="Microsoft.NET.Test.Sdk" Version="16.4.0" />

Virtual Azure Community Day–March 2020


In case you missed the last (virtual) Azure Community Day in March, all content is available online:

There are 4 tracks, each with 8 hours of content! A must-see for every Azure addict…

Serilog - IDiagnosticContext


The ‘classic’ way I used to attach extra properties to a log message in Serilog was through the LogContext.

From the documentation:

Properties can be added and removed from the context using LogContext.PushProperty():
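A small sketch of what that looks like (property name and values invented):

```csharp
using Serilog;
using Serilog.Context;

using (LogContext.PushProperty("OrderId", 42))
{
    // Both events carry the OrderId property
    Log.Information("Validating order");
    Log.Information("Order validated");
}

// An event logged here no longer has the OrderId property
Log.Information("Outside the context");
```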

The disadvantage of using the LogContext is that the additional information is only available inside the scope of that specific LogContext (or deeper nested contexts). This typically leads to a large number of log messages, which doesn’t always help to find out what is going on.

Today I try to follow a different approach where I only log a single message at the end of an operation. The idea is that the log message is enriched during the lifetime of the operation, so that we end up with a single, information-rich log entry.

This is easy to achieve in Serilog thanks to the IDiagnosticContext interface. The diagnostic context provides an execution context (similar to LogContext) with the advantage that it can be enriched throughout its lifetime. The request logging middleware then uses this to enrich the final “log completion event”.
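A sketch of how this could look in a controller (DTO and property names are invented; this assumes app.UseSerilogRequestLogging() is wired up in Startup.cs):

```csharp
using Microsoft.AspNetCore.Mvc;
using Serilog;

[ApiController]
[Route("api/orders")]
public class OrderController : ControllerBase
{
    private readonly IDiagnosticContext _diagnosticContext;

    public OrderController(IDiagnosticContext diagnosticContext)
    {
        _diagnosticContext = diagnosticContext;
    }

    [HttpPost]
    public IActionResult Create(OrderDto order)
    {
        // Attached to the single request completion event,
        // not written as a separate log message
        _diagnosticContext.Set("OrderId", order.Id);
        return Ok();
    }
}
```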

More info: https://nblumhardt.com/2019/10/serilog-mvc-logging/

Azure Pipelines error - NuGet.CommandLine.CommandLineException: Error parsing solution file


After installing the latest Visual Studio version on our build servers, some of our builds started to fail. The error only happened on the build servers running MSBuild version 16.5.0.12403:

One or more errors occurred. ---> NuGet.CommandLine.CommandLineException: Error parsing solution file at D:\b\4\agent\_work\153\s\VLM.MELO.sln: Exception has been thrown by the target of an invocation.
   at NuGet.CommandLine.MsBuildUtility.GetAllProjectFileNamesWithMsBuild(String solutionFile, String msbuildPath)
   at NuGet.CommandLine.RestoreCommand.ProcessSolutionFile(String solutionFileFullPath, PackageRestoreInputs restoreInputs)

This turns out to be a bug in the NuGet client, where older versions have trouble with this new version of MSBuild.

To resolve this issue in Azure Pipelines, add a NuGet Tool Installer task to your pipeline before any tasks that use NuGet, and set the version field to include the latest version.
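In a YAML pipeline this could look as follows (the exact version spec is up to you):

```yaml
steps:
- task: NuGetToolInstaller@1
  displayName: 'Use latest NuGet 5.x'
  inputs:
    versionSpec: '5.x'
    checkLatest: true

- task: NuGetCommand@2
  displayName: 'NuGet restore'
  inputs:
    command: 'restore'
    restoreSolution: '**/*.sln'
```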

Azure Charts–Help! Azure is evolving too fast…


Like most cloud platforms, Azure evolves quite fast. This makes it hard to stay up-to-date and to know where to focus your energy. Azure Charts can help. It is a web-based application that lets you see what Azure consists of and how it evolves. I would specifically recommend taking a look at the Learning section to see what new learning content got published.

More information: https://techcommunity.microsoft.com/t5/educator-developer-blog/azure-charts-visualize-your-cloud-learning-journey/ba-p/1353228

Azure Pipelines–DotNet restore error


After configuring a new build pipeline, the build failed with the following error when trying to execute the dotnet restore build task:

NuGet.targets(124,5): error :  Unable to load the service index for source http://tfs:8080/DefaultCollection/_packaging/Feed/nuget/v3/index.json

NuGet.targets(124,5): error :  No credentials are available in the security package

The strange thing was that the same task worked without a problem for other builds. Only for newly created builds did it fail with the error message above.

A workaround that seemed to work was to switch the dotnet build task to the ‘custom’ command. By using the custom command I can add an extra ‘--force’ argument to the ‘dotnet restore’ command. With this extra argument the error message disappeared.
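Sketched as a YAML task (in the classic editor you set the same fields in the UI):

```yaml
- task: DotNetCoreCLI@2
  displayName: 'dotnet restore (forced)'
  inputs:
    command: 'custom'
    custom: 'restore'
    arguments: '--force'
```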


Lens–The Kubernetes IDE


If you are working with Kubernetes, I can recommend Lens, an open-source and free IDE to take control of your Kubernetes clusters.

ASP.NET Core–The magic appearance of IMemoryCache


I created a small security library in .NET Core that simplifies the rather complex security setup we have at one of my clients. Inside this library I’m using the IMemoryCache to cache some non-volatile data.

When a colleague tried to use this library he told me that he had to add the following line
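The line in question registers the memory cache explicitly in his Startup.cs (a minimal sketch):

```csharp
using Microsoft.Extensions.DependencyInjection;

public void ConfigureServices(IServiceCollection services)
{
    services.AddMemoryCache(); // registers a default IMemoryCache implementation

    services.AddControllers();
}
```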

This doesn’t seem unexpected, but the strange thing was that nowhere in my example project had I added this!? Time to investigate…

A walk through the ASP.NET Core source code (always a fun way to discover and learn something new about the framework) taught me the following: when you call AddMvc() or AddResponseCaching(), the framework registers an IMemoryCache for you behind the scenes.

If you are using a lower-level method like AddControllers(), this is not the case.

Learned something? Check!

Azure Pipelines- Error executing dotnet restore task


When trying to execute dotnet restore during a build it failed with the following error message:

2020-05-12T18:14:36.8332220Z C:\Program Files\dotnet\sdk\3.1.201\NuGet.targets(536,5): error :   The '@' character, hexadecimal value 0x40, cannot be included in a name. Line 6, position 35. [D:\b\4\agent\_work\200\s\IAM.Core\IAM.Core.csproj]

2020-05-12T18:14:36.8820520Z      2>Done Building Project "D:\b\4\agent\_work\200\s\IAM.Core\IAM.Core.csproj" (_GenerateRestoreGraphProjectEntry target(s)) -- FAILED.

2020-05-12T18:14:36.9152564Z      1>Project "D:\b\4\agent\_work\200\s\IAM.Core.Tests\VLM.IAM.Core.Tests.csproj" (1) is building "D:\b\4\agent\_work\200\s\IAM.Core.Tests\IAM.Core.Tests.csproj" (1:5) on node 1 (_GenerateRestoreGraphProjectEntry target(s)).

2020-05-12T18:14:36.9162330Z      1>C:\Program Files\dotnet\sdk\3.1.201\NuGet.targets(536,5): error : NuGet.Config is not valid XML. Path: 'D:\b\4\agent\_work\200\Nuget\tempNuGet_60617.config'. [D:\b\4\agent\_work\200\s\IAM.Core.Tests\IAM.Core.Tests.csproj]

2020-05-12T18:14:36.9162330Z C:\Program Files\dotnet\sdk\3.1.201\NuGet.targets(536,5): error :   The '@' character, hexadecimal value 0x40, cannot be included in a name. Line 6, position 35. [D:\b\4\agent\_work\200\s\IAM.Core.Tests\IAM.Core.Tests.csproj]

2020-05-12T18:14:36.9162330Z      1>Done Building Project "D:\b\4\agent\_work\200\s\IAM.Core.Tests\IAM.Core.Tests.csproj" (_GenerateRestoreGraphProjectEntry target(s)) -- FAILED.

2020-05-12T18:14:36.9230692Z      1>Done Building Project "D:\b\4\agent\_work\200\s\IAM.Core.Tests\IAM.Core.Tests.csproj" (Restore target(s)) -- FAILED.


Let’s have a look at our nuget.config file to see what is going wrong:
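The file looked something along these lines (feed name and URL are placeholders):

```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <!-- The '@' in the feed name is what NuGet chokes on -->
    <add key="@Company-Feed" value="http://tfs:8080/DefaultCollection/_packaging/Feed/nuget/v3/index.json" />
  </packageSources>
</configuration>
```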

It turns out that NuGet doesn’t like it when you use an ‘@’ sign in the name of a feed.

Renaming solved the problem…

Git sparse checkout


With the growing usage of mono-repositories, standard commands like git checkout or git status no longer scale and become frustratingly slow. A solution could be Git LFS (Large File Storage), but not all repositories have this extension installed.

An alternative solution can be provided through the (new) git sparse-checkout command.

To restrict your working directory to a set of directories, run the following commands:

  1. git sparse-checkout init
  2. git sparse-checkout set <dir1> <dir2> ...

If you get stuck, run git sparse-checkout disable to return to a full working directory.

Remark: this feature is part of Git 2.25, so if the command is not recognized, check your Git version and update first.

More information: https://github.blog/2020-01-17-bring-your-monorepo-down-to-size-with-sparse-checkout/

.NET Core–Generate documentation


Although I try to make my APIs as descriptive as possible, sometimes good documentation can still make a difference.

One way to enable documentation generation is through Visual Studio:

  • Right click on your project and select Properties.
  • On the Properties window go to the Build tab.
  • Check the XML documentation file checkbox
  • Don’t forget to save these changes.

As a result the following is added to your csproj file:
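It looks roughly like this (the path will match your own project location):

```xml
<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Debug|AnyCPU'">
  <DocumentationFile>C:\Projects\ExampleApi\ExampleApi.xml</DocumentationFile>
</PropertyGroup>
```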

There are a few things I don’t like about this:

  • First, a condition is applied to the PropertyGroup, which doesn’t seem necessary
  • Second, an absolute path is used to define where to generate the documentation XML

So I would no longer recommend this approach. What you can do instead is edit the csproj file directly and add the following line to a PropertyGroup:
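That line is the GenerateDocumentationFile property, which generates the XML file next to your build output without any hard-coded path:

```xml
<PropertyGroup>
  <GenerateDocumentationFile>true</GenerateDocumentationFile>
</PropertyGroup>
```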

Hands-on-labs: App modernization


A colleague shared the following hands-on-lab with me: https://github.com/microsoft/MCW-App-modernization

It’s a great starting point to learn about the cloud and take your first steps towards it. It combines a whiteboard design session and a hands-on-lab.

This is what you will design and build:

GraphQL Inspector


While versioning is a hot topic in traditional REST APIs, GraphQL takes a strong opinion on avoiding versioning and instead provides the tools for the continuous evolution of a GraphQL schema. As GraphQL only returns the data that is explicitly requested, it becomes easier to introduce new functionality by adding new types and fields without introducing breaking changes. And because you know which fields are used by which clients, you have a lot more knowledge in your hands to prevent breaking those clients.

For small schemas it can be feasible to inspect your schema for changes manually, but for larger or federated schemas good tooling becomes a necessity.

A tool that can help you to achieve this is GraphQL Inspector.

It offers the following (free) features:

  • Compares schemas
  • Detects breaking or dangerous changes
  • Sends schema change notifications
  • Uses serverless functions to validate changes
  • Validates operations and fragments against a schema
  • Finds similar / duplicated types
  • Reports schema coverage based on operations and fragments
  • Serves a GraphQL server with faked data and GraphiQL
  • Provides a Docker image

Getting started

To get started you have multiple options: you can use it as a GitHub application, as a GitHub Action, but also as a command-line tool.

Let’s see how to use the commandline tool:

npm install --global @graphql-inspector/cli graphql

Now we can compare two schemas:

graphql-inspector diff old.graphql new.graphql

Detected the following changes (2) between schemas:

  Description was removed from field Post.createdAt
  Field Post.createdAt changed type from String to String!
success No breaking changes detected

It is a must have for every GraphQL developer!


Using YARP to create a reverse proxy server


So far I’ve always used ProxyKit to create a reverse proxy in ASP.NET Core. But with the announcement of YARP, it is time to try this alternative…

  • I created a new ASP.NET Core “empty” project:

dotnet new web -n ProxyTest -f netcoreapp3.1
The template "ASP.NET Core Empty" was created successfully.

Processing post-creation actions...
Running 'dotnet restore' on ProxyTest\ProxyTest.csproj...
  Restore completed in 278,54 ms for C:\Projects\test\yarptest\ProxyTest\ProxyTest.csproj.

Restore succeeded.

  • Next step is to reference the Microsoft.ReverseProxy preview NuGet package:
<ItemGroup> 
<PackageReference Include="Microsoft.ReverseProxy" Version="1.0.0-preview.1.*" />
</ItemGroup>
  • Now it is time to update our Startup.cs. This is what I had when using ProxyKit:
  • And here is the updated Startup.cs after switching to YARP:
  • In YARP everything is handled through configuration right now, so the real magic is there:
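The YARP side of that Startup.cs looks roughly like this, based on the preview 1 samples (the API was still in flux at that time, so details may differ per preview):

```csharp
using Microsoft.AspNetCore.Builder;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;

public class Startup
{
    private readonly IConfiguration _configuration;

    public Startup(IConfiguration configuration)
    {
        _configuration = configuration;
    }

    public void ConfigureServices(IServiceCollection services)
    {
        // All routes and backends are defined in the "ReverseProxy" config section
        services.AddReverseProxy()
            .LoadFromConfig(_configuration.GetSection("ReverseProxy"));
    }

    public void Configure(IApplicationBuilder app)
    {
        app.UseRouting();
        app.UseEndpoints(endpoints => endpoints.MapReverseProxy());
    }
}
```

The matching "ReverseProxy" section in appsettings.json describes the routes and the backends they forward to; the exact configuration schema changed between previews, so check the docs for the version you are using.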

I'm curious how this will evolve in the future...

ASP.NET Core–Set environment through the commandline


ASP.NET Core has built-in support for multiple environments. This makes it easy to load different configuration and apply different middleware depending on the environment.

The typical way to control the environment is through the ASPNETCORE_ENVIRONMENT environment variable.

It is also possible to set the environment by passing it to the dotnet run command as an argument.

To set this up, we have to modify the Program.cs:

The AddCommandLine method allows us to read configuration values from the command line.
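A minimal sketch of that Program.cs (the Startup class name is assumed; AddCommandLine is called explicitly on the host configuration so arguments like --environment reach the host):

```csharp
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.Hosting;

public class Program
{
    public static void Main(string[] args) => CreateHostBuilder(args).Build().Run();

    public static IHostBuilder CreateHostBuilder(string[] args) =>
        Host.CreateDefaultBuilder(args)
            // Makes arguments like --environment available as host configuration
            .ConfigureHostConfiguration(config => config.AddCommandLine(args))
            .ConfigureWebHostDefaults(webBuilder => webBuilder.UseStartup<Startup>());
}
```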

Now we can start the app with dotnet run --environment Development.

Sharing authentication ticket between .NET Core and ASP.NET (Owin)


By default authentication tickets cannot be shared between .NET Core and OWIN. The good news is that it is possible, but we have to take some extra steps:

.NET Core App

On the .NET Core side we have to change the cookie authentication middleware:

  • The cookie name should match the name used by the OWIN Cookie Authentication Middleware (.AspNet.SharedCookie for example).
  • An instance of a DataProtectionProvider should be initialized to the common data protection key storage location.
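In code this looks roughly as follows (cookie name and key location are examples):

```csharp
using System.IO;
using Microsoft.AspNetCore.Authentication.Cookies;
using Microsoft.AspNetCore.DataProtection;
using Microsoft.Extensions.DependencyInjection;

public void ConfigureServices(IServiceCollection services)
{
    services.AddAuthentication(CookieAuthenticationDefaults.AuthenticationScheme)
        .AddCookie(options =>
        {
            // Must match the cookie name on the OWIN side
            options.Cookie.Name = ".AspNet.SharedCookie";

            // Both apps must point to the same key ring location
            options.DataProtectionProvider =
                DataProtectionProvider.Create(new DirectoryInfo(@"C:\shared-keys"));
        });
}
```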

ASP.NET (OWIN) App

On the ASP.NET (OWIN) side we have to install the Microsoft.Owin.Security.Interop package first.

Then we can change the cookie authentication middleware:

  • The cookie name should match the name used by the ASP.NET Core Cookie Authentication Middleware (.AspNet.SharedCookie in the example).
  • An instance of a DataProtectionProvider should be initialized to the common data protection key storage location.
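And roughly the mirror image on the OWIN side (same cookie name, same key location; the protector purposes must match what the ASP.NET Core cookie middleware uses):

```csharp
using System.IO;
using Microsoft.AspNetCore.DataProtection;
using Microsoft.Owin.Security.Cookies;
using Microsoft.Owin.Security.Interop;
using Owin;

public void Configuration(IAppBuilder app)
{
    var protectionProvider = DataProtectionProvider.Create(
        new DirectoryInfo(@"C:\shared-keys"));

    var dataProtector = protectionProvider.CreateProtector(
        "Microsoft.AspNetCore.Authentication.Cookies.CookieAuthenticationMiddleware",
        "Cookies",
        "v2");

    app.UseCookieAuthentication(new CookieAuthenticationOptions
    {
        AuthenticationType = "Cookies",
        CookieName = ".AspNet.SharedCookie",
        TicketDataFormat = new AspNetTicketDataFormat(new DataProtectorShim(dataProtector)),
    });
}
```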

Application Insights - Stop tracking 404 errors


By default Application Insights will log every 404 error in your web app as an error. I think this is a good default, but what if you don’t want to see these 404 errors?

There are 2 options to solve this:

Telemetry Processor

A telemetry processor gives you direct control over what is included or excluded from the telemetry stream.

We can register our new TelemetryProcessor by using the AddApplicationInsightsTelemetryProcessor extension method on IServiceCollection, as shown below:
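A processor that drops 404 requests, together with its registration, could look like this (class name invented):

```csharp
using Microsoft.ApplicationInsights.Channel;
using Microsoft.ApplicationInsights.DataContracts;
using Microsoft.ApplicationInsights.Extensibility;
using Microsoft.Extensions.DependencyInjection;

public class Skip404Processor : ITelemetryProcessor
{
    private readonly ITelemetryProcessor _next;

    public Skip404Processor(ITelemetryProcessor next)
    {
        _next = next;
    }

    public void Process(ITelemetry item)
    {
        // Drop 404 requests from the telemetry stream completely
        if (item is RequestTelemetry request && request.ResponseCode == "404")
            return;

        _next.Process(item);
    }
}

public class Startup
{
    public void ConfigureServices(IServiceCollection services)
    {
        services.AddApplicationInsightsTelemetry();
        services.AddApplicationInsightsTelemetryProcessor<Skip404Processor>();
    }
}
```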

Telemetry Initializer

Telemetry initializers allow you to enrich telemetry with additional information and/or to override telemetry properties set by the standard telemetry modules. By default, any request with a response code >= 400 is flagged as failed. But if we want to treat a 404 as a success, we can provide a telemetry initializer that sets the Success property:

We can register the TelemetryInitializer in our Startup.cs:
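A minimal initializer along these lines (class name invented), plus its registration:

```csharp
using Microsoft.ApplicationInsights.Channel;
using Microsoft.ApplicationInsights.DataContracts;
using Microsoft.ApplicationInsights.Extensibility;
using Microsoft.Extensions.DependencyInjection;

public class Treat404AsSuccessInitializer : ITelemetryInitializer
{
    public void Initialize(ITelemetry telemetry)
    {
        // Keep the 404 event, but mark the request as successful
        if (telemetry is RequestTelemetry request && request.ResponseCode == "404")
            request.Success = true;
    }
}

public class Startup
{
    public void ConfigureServices(IServiceCollection services)
    {
        services.AddApplicationInsightsTelemetry();
        services.AddSingleton<ITelemetryInitializer, Treat404AsSuccessInitializer>();
    }
}
```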

The advantage of the Telemetry Initializer is that we still log the 404 event, but no longer as an error.

More information: https://docs.microsoft.com/en-us/azure/azure-monitor/app/api-filtering-sampling

Azure Service Bus Explorer in Azure Portal


Until recently I used the Service Bus Explorer to debug and manage Azure Service Bus. But last week I noticed the following new menu item in Azure Service Bus:

To use the Azure Service Bus Explorer, you need to navigate to the Service Bus namespace on which you want to perform send, peek, and receive operations. Then select either ‘Queues’ or ‘Topics’ from the navigation menu. After doing that you should see the ‘Service Bus Explorer’ option.

The following operations are supported:

  • Queues
    • 'Send' to a queue
    • 'Receive' from a queue
    • 'Peek' from a queue
    • 'Receive' from the DeadLetterQueue
    • 'Peek' from the DeadLetterQueue
  • Topics
    • 'Send' to a topic
  • Subscriptions
    • 'Peek' from a subscription on a topic
    • 'Receive' from a subscription
    • 'Peek' from the DeadLetter subscription
    • 'Receive' from the DeadLetter subscription

To learn more about the Service Bus Explorer tool, please read the documentation.
